The science behind the insights that Pricing & Marketing professionals need to make more informed, robust decisions – its virtues, its risks, and what its sophistication actually buys you.
your demand environment doesn't stop with your immediate customers
How would you describe your demand environment? How would you define its boundaries?
Managers in business-to-business (B2B) enterprises typically respond to these kinds of questions by referencing a host of familiar business practice models articulated and implemented over the last 20-30 years. According to these practices “demand” involves everything that takes place between you and the customers who buy things from you, while “supply” refers in turn to the mechanics of everything you procure from upstream vendors. Demand is the world of sales & marketing, of pricing and running promotional campaigns and figuring out what products to bundle with others to win a bigger share of the customer’s total market basket. Supply is operations and logistics, inventory cost management and procurement processes. The concept of a “supply chain” has taken firm root over the last decade or so, while demand is seen less as a sequential chain than as a loose collection of activities organized around a point of contact between the enterprise and the collective needs and preferences of the customers who directly buy its products and services. The “art of the sale” is thought to be a less quantifiable notion than the “science of logistics”, with a greater X-factor that does not easily lend itself to data-driven analysis. This is the traditional view – but managers today are faced with a new set of realities that require a different way of looking at the enterprise’s demand environment. Continue reading →
what matters is understanding the concept, not the formula
If you have ever tuned into the financial markets report on the evening news (or switched on CNBC just about any time on any given day) you are no doubt familiar with the formula by which the day’s action is delivered, with the anchor saying something along the lines of “stocks fell today on news of higher home foreclosures”. To drive the point home the broadcast will trot out the stock footage of suburban homes with foreclosure signs in the front yard. These kinds of reports are snappy and speak to our very human need to supply a causation narrative to the events we encounter in our lives. Small wonder that the question of causation has beguiled and tormented philosophers since the beginning of human civilization. Continue reading →
SKU proliferation has been a fact of life in foodservice much as it has been in other industries in recent years. Proliferation creates considerable pressure throughout the value chain to make tough decisions about SKU assortment across numerous product categories. In foodservice the problem is not shelf space as it is in retail; rather, it is the limited amount of product information that a sales representative can manage in his or her head in order to match the right products with the right customers on a daily basis in real time. As managing assortment has grown more complex, manufacturers and their downstream partners have looked to SKU rationalization to streamline product offerings and manage inventory costs for improved category performance. While SKU rationalization can address these challenges to some extent, it does not get to the core of the problem. The most effective way to improve category performance is to increase demand for products in that category. In turn, the best way to grow demand is to seamlessly match unique customers with the products whose attributes they most highly value. This requires a holistic category management approach, supported by robust data analytics that can take into account the key levers of demand – assortment, promotions, pricing and purchase timing. Continue reading →
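As a rough illustration of the matching idea above, here is a minimal sketch that scores products against a customer's attribute preferences. Every SKU name, attribute and weight below is hypothetical; a real system would infer the preference weights from transaction history rather than hard-code them.

```python
# Hypothetical sketch: rank SKUs for one customer by scoring each
# product's attributes against the customer's attribute preferences.
# All names and numbers are illustrative, not real data.

products = {
    "SKU-101": {"organic": 1.0, "bulk_pack": 0.0, "premium": 1.0},
    "SKU-102": {"organic": 0.0, "bulk_pack": 1.0, "premium": 0.0},
    "SKU-103": {"organic": 1.0, "bulk_pack": 1.0, "premium": 0.0},
}

# Preference weights this customer has revealed through past purchases.
customer_prefs = {"organic": 0.7, "bulk_pack": 0.2, "premium": 0.1}

def match_score(attrs, prefs):
    """Score a product as the preference-weighted sum of its attributes."""
    return sum(prefs.get(a, 0.0) * value for a, value in attrs.items())

# Rank the assortment for this customer, best match first.
ranked = sorted(products,
                key=lambda sku: match_score(products[sku], customer_prefs),
                reverse=True)
```

A sales representative (or a recommendation engine behind one) would then lead with the top-ranked SKUs rather than trying to hold the whole catalog in their head.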
Collaboration between distributors and manufacturers is the cornerstone of category management in foodservice. For a given product category a manufacturer is selected to be category captain, with responsibility for improving category performance. This post addresses some key data and analytical issues with which manufacturers should expect to deal as category captains.
So you have been asked by your most important foodservice distribution partner to be a category captain. What happens next? As captain you are tasked with managing the assigned category for optimal performance. That entails the following:
• Analyze all products across the category (not just your own brands)
• Augment the data provided by the distribution partner with your own internally generated insights
• Provide structured, actionable recommendations based on intelligence obtained from the data
These recommendations relate to product assortment, pricing policies, promotional activities and other important demand levers for driving profitability. At the same time you need to educate your distribution partners, both at corporate headquarters and in the field, about the product characteristics that can help increase demand. This requires an intelligent approach to data analytics.
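As a sketch of what "structured, actionable recommendations" might look like in code, a category scorecard could summarize every SKU in the category, own brands and competitors' alike, and flag rationalization candidates. The data, field names and threshold below are purely illustrative, not an actual category-captain methodology.

```python
# Hypothetical category scorecard: summarize all SKUs in the category
# (not just the captain's own brands) and flag low-velocity candidates
# for assortment review. Data and threshold are illustrative only.

category_sales = [
    # (sku, brand, units_sold, revenue, is_own_brand)
    ("SKU-A", "BrandX", 1200, 9600.0, True),
    ("SKU-B", "BrandY",  300, 3300.0, False),
    ("SKU-C", "BrandX",   40,  360.0, True),
]

def scorecard(rows, min_units=100):
    """Build one recommendation record per SKU across the whole category."""
    report = []
    for sku, brand, units, revenue, own in rows:
        avg_price = revenue / units  # realized average selling price
        action = "review for rationalization" if units < min_units else "retain"
        report.append({"sku": sku, "brand": brand, "own_brand": own,
                       "units": units, "avg_price": round(avg_price, 2),
                       "action": action})
    return report
```

The point of the structure is that the captain's output is the same shape for every brand in the category, which keeps the analysis credibly neutral for the distribution partner.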
What insights about products can help drive category sales?
What might a good analytics model for category management look like? Let’s consider the key tasks we identified in the previous paragraph. Continue reading →
intelligence on competitors' prices may be close at hand
If only you knew what your competitors are charging. How many times on any given day does that phrase get uttered in a corporate boardroom, on a sales call or in a marketing strategy meeting? Knowing what your competition is charging would take so much of the guesswork out of your daily pricing and marketing decisions. It may come as a surprise, then, that critical information capable of revealing competitors’ prices may be very close at hand – in your own purchase history.
Is there a “Market Price” for COGS?
Since you do not have direct access to your customers’ prices, the challenge is to model likely competitor activity based on incomplete information. A good place to start is with costs – specifically cost of goods sold (COGS). This is typically a key input in pricing models. Knowing a competitor’s COGS would provide critical intelligence in determining what prices they are offering in the market. The trick is to accurately infer the “market” price a competitor pays for their inputs (i.e. their COGS) from the information contained in your own transaction data. Continue reading →
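One simple way to sketch that inference (purely illustrative; it is not the modeling approach the post goes on to describe) is to treat a trimmed, volume-weighted average of your own purchase prices for an input as a proxy for the "market" price a competitor likely pays for the same input.

```python
# Illustrative sketch: estimate a "market" price for an input from your
# own purchase history, using a volume-weighted average after trimming
# outlier transactions (spot buys, data errors). Numbers are made up.

def market_price_estimate(purchases, trim=0.1):
    """purchases: list of (unit_price, quantity) tuples from our own history."""
    ordered = sorted(purchases)                 # sort by unit price
    k = int(len(ordered) * trim)                # transactions to drop per tail
    kept = ordered[k:len(ordered) - k] if k else ordered
    total_qty = sum(q for _, q in kept)
    # Volume-weighted mean of the remaining transactions.
    return sum(p * q for p, q in kept) / total_qty

history = [(2.10, 500), (2.25, 300), (1.95, 400), (3.80, 10), (2.05, 600)]
estimate = market_price_estimate(history, trim=0.2)
```

With the 20% trim, the small spot purchase at 3.80 drops out and the estimate reflects what the bulk of the volume actually cost – a plausible floor for reasoning about a competitor's COGS.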
Quantitative modeling is a creative process. There is as much art to modeling as there is science – choices about what relationships you want to express and how to express them. And just as with anything creative, the authors of quantitative models can take pride in the beauty of their creations. In the words of my colleague Ali Mahani, Sentrana’s senior quantitative modeler, models can be truly elegant – they can be things of beauty. But he adds that they can also be irrelevant – irrelevant to the particular business goals they are intended to serve. That presents a problem for enterprises seeking to elevate the role of quantitative insights in their decision making processes. Data and analytical methods are important tools in the arsenal of a modern enterprise. But decision makers would be wise to heed my colleague Ali’s advice: in using these tools, make sure to avoid the trap of “irrelevant elegance”.
Elegance does not always lead to the best outcomes
Elegance in modeling is expressed in the appearance of simplicity – rendering sprawlingly complex interrelationships in the real world into the clarity of precise mathematical formulae. Simplicity and elegance are all well and good, unless in the quest for this holy grail you wind up dramatically misrepresenting how things actually work in the environment you are trying to model. This can result in not only failing to solve the business problem at hand, but actually making matters worse than status quo ante by facilitating decisions based on incorrect assumptions. We have a real world example of just how much worse this can be in the financial markets debacle of 2008, when the elegant models crafted by the best and brightest quantitative experts Wall Street had to offer proved to be fatally flawed in the assumptions and heuristics they used to express the variables affecting housing prices, interest rates and mortgage payment trends. Perhaps modelers need to live by something like the Hippocratic oath taken by medical doctors: first of all, do no harm. Continue reading →
Everywhere you look, it seems, people are talking about “physics envy”. This derisive term mocks the attempt of economists and other social sciences practitioners to imbue their disciplines with the equations and mathematical rigor of physics – a rigor that many believe fails when applied to the messy environments of disciplines like sociology or economics. It’s not a new term – economist Philip Mirowski contributed to the Finnish Economic Papers series way back in 1992 with a piece entitled “Do Economists Suffer from Physics Envy?”
kinetic energy, not supply & demand
Eighteen years later the answer from many observation posts along the byways of public discourse appears to be: yes, they most certainly do, and so do their fellow travelers, business and financial markets experts. After all, we just barely survived the most devastating economic event of our times, deeper and more far-reaching than any downturn since the Great Depression, and all the high priests of the field can do is shake their heads and say “wow, I sure didn’t see that coming.” Distrust of fancy math is rampant in all walks of business life. That presents a real problem for enterprise decision-makers at a time when they need smart quantitative tools – yes, fancy math and all – more than ever. Markets are more complex than at any time in human history. Giant waves of transactional data inundate marketing managers with new information every day. Managers need science to help them gain valuable insights into the markets for their products and services – but how do they know that the growing number and variety of scientific marketing tools out there aren’t infected with the nasty symptoms of physics envy? Continue reading →
In a previous posting (“Quantitative Intuition: It’s Not Counterintuitive”) I described some of the advancements that have been made in bringing together the disparate worlds of quantitative methods and human intuition. I ended on the rather happy note that advanced scientific micromarketing models today are capable of introducing qualitative human judgment and experience into quantitative models, such that the models are able to “learn” from humans about important factors such as competitive threats, nuanced negotiation strategies and even meteorological vagaries – factors that traditionally have been difficult to crunch into the binary 1s and 0s of machine language. The human brain works in a hierarchical manner, embedding propositions within propositions to think a potentially infinite number of thoughts. In the example I used in the last posting, a sales rep who reads about a national wholesaler coming to town to open a discount distribution center can nearly instantaneously form a series of mental propositions to evaluate the importance of that news and the probability of potential outcomes that may (or may not) require decisive competitive action from the sales rep’s firm. Continue reading →
Think of the best salesperson you know: if you’re fortunate, perhaps someone in your company or, less happily, in a competitor’s firm. What are the qualities that make this person excel at the job of sales? In a classic Harvard Business Review article “What Makes a Good Salesman” (July-August 1964) David Mayer and Herbert Greenberg likened a star salesperson to a heat-seeking missile: “Sensing what customers are feeling, they [the sales stars] are able to change pace, double back on the track, and make whatever creative modifications might be necessary to home in on the target and close the sale.” Whereas most of us have intuitive abilities to a greater or lesser extent, excellent salespeople leverage this intuition with strong empathy skills (sensing what the customer’s needs are) and the relentless personal drive necessary to cross the finish line. If they could, managers would bottle this elusive elixir of talents and have all their salespeople drink it, every morning of every day. Continue reading →
“Burn the mathematics” wrote economist Alfred Marshall in a letter to a friend, musing about the proper role of mathematics and scientific inquiry in the field of economics. That 19th century cogitation would seem to be a prêt-à-porter soundbite for these latter days of the 21st century’s first decade – a time in which the mathematical infrastructure that underpins longstanding economic and financial theories stands accused of all manner of malfeasance, particularly given its presumed role in the decade’s signature economic event – the financial market meltdown of 2008. The logic behind the accusation goes roughly thus: More complex (but not necessarily more “accurate”) models allow for more complex instruments to be created. Increased complexity means it takes more time to process and then fully comprehend what the numbers may be telling you. At the same time, though, technology allows buy and sell orders to be executed almost instantaneously through electronic trading systems. Time is of the essence, and ponderously complex computations simply won’t do. A seemingly elegant (and fast, and commercially viable) shortcut is discovered and becomes the currency of the day. The models’ outputs come to be trusted blindly simply because there is no time to question them (and too much money to be made by using them). The impenetrable Greek letters obfuscate the sensitivity of the models to changes in important assumptions – which is fine for a few years because those assumptions (e.g. rising housing prices) don’t change – but then all of a sudden they do. The models start losing more money than they make. Then the chasm widens further as the high levels of leverage in the system make themselves felt. The losses accelerate dramatically, wiping out years of profits in just a few months. Burn the mathematics, indeed.
But let’s take a different look at this apparent tight coupling of mathematics and dire outcomes. An author who has been widely published on the subject of Wall Street’s use of mathematical models recently offered us an interesting opinion in correspondence. His point was that the problem with the models was not so much their complexity, but rather that they were models in the first place. His argument was that you can never perfectly hedge model risk. Now, I agree with that observation: a model by definition selects some aspects of reality to represent and omits others, and the choice of what to include and what to omit is subject to human error, therefore fallible and not perfectly hedgeable. But I take issue with the idea that the fault lies in the existence of the models themselves. Models can be misused – I think that much is clear. But the notion that models are all doomed to failure obscures a deeper truth about the goals of predictive modeling; namely that you can seek either to reduce the world or truly explain it. By trying to elegantly reduce the world to as few predictor variables as possible, you are more likely to be sowing the seeds of future failure, because complexity and actual drivers of outcomes are taken out of the equations to make them more solvable (or perhaps sellable, as in the case of the Gaussian copula function that was behind Wall Street’s demise, as we discussed in a previous posting “You Can’t Punt Away the Dimensionality Curse”). Predictive modelers don’t have to go down that road, however: they can also set out with the goal not of reducing an entire system to a single neat, tractable equation, but to quantify and explain all of the relationships that dictate outcomes to the absolute fullest extent possible. Tractability and computability are things to address later in the process, through technological means, but they should not dictate the fundamental mathematical approach at the outset. Continue reading →
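To make that omitted-variable point concrete, here is a small simulated illustration with entirely hypothetical numbers: an outcome that truly depends on two drivers is fit with a one-variable "reduced" model while the second driver happens to be constant. The fit looks perfect – until the omitted driver shifts.

```python
# Illustrative simulation of the reduced-model trap: reality depends on
# two drivers, but a model calibrated while the second driver was stable
# quietly drops it -- and is then systematically wrong after the regime
# shifts. All numbers are invented for illustration.

def true_outcome(x, rate):
    return 3.0 * x - 50.0 * rate        # reality depends on both drivers

def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

# Calibration era: the second driver sits at 0.02, so it "drops out"
# and the one-variable model fits the history essentially perfectly.
calib = [(x, true_outcome(x, 0.02)) for x in range(1, 21)]
a, b = fit_line(calib)

# Regime shift: the omitted driver moves to 0.10, and the reduced
# model is now wrong by a constant amount at every x.
error = abs((a * 10 + b) - true_outcome(10, 0.10))
```

Nothing about the calibration-era fit warns you of the failure; the danger was built in the moment the second driver was removed to make the model neat.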