Solving Three Key Challenges to Profitable Category Management
Managing product categories for optimal performance in foodservice presents three key challenges that category partners need to solve: managing data reporting and analysis, conducting effective selling logistics, and closing the sale. This post examines these three problems and identifies practicable solutions for manufacturers working in collaboration with their distribution partners.
Data Reporting, Management and Analysis
Manufacturers often lack regular, dependable access to sales data. Transaction information typically resides downstream, so the manufacturer must negotiate with its distribution partners to establish a mechanism for information sharing. Even when such an agreement is reached, the process can give rise to a variety of data problems, with data integrity issues prominent among them. The manufacturer is unlikely to receive specially prepared sales reports – information will more probably arrive as raw data, untreated for accuracy or clarity. Readers of these reports will find it hard to extract timely, actionable insights from them.
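The kind of hygiene pass that raw distributor data typically needs before anyone can act on it can be sketched as follows. The schema and validation rules here are hypothetical, purely for illustration:

```python
# A minimal sketch (schema and checks are hypothetical, not from the post)
# of cleaning raw sales records before analysis.

raw_rows = [
    {"sku": "A100", "units": "12", "price": "3.50"},
    {"sku": "",     "units": "5",  "price": "2.00"},   # missing SKU
    {"sku": "A100", "units": "-3", "price": "3.50"},   # suspect quantity
]

def clean(rows):
    """Keep only rows with a SKU and a non-negative unit count,
    converting string fields to proper numeric types."""
    good = []
    for row in rows:
        if not row["sku"]:
            continue  # drop rows with no product identifier
        units = int(row["units"])
        if units < 0:
            continue  # drop implausible quantities
        good.append({"sku": row["sku"],
                     "units": units,
                     "price": float(row["price"])})
    return good

print(clean(raw_rows))  # only the first row survives
```

Even a simple filter like this surfaces how much of a raw feed is unusable, which is itself useful evidence when negotiating data-sharing terms with a distribution partner.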
In a previous posting (“Quantitative Intuition: It’s Not Counterintuitive”) I described some of the advances in bringing together the disparate worlds of quantitative methods and human intuition. I ended on the rather happy note that advanced scientific micromarketing models today are capable of incorporating qualitative human judgment and experience, such that the models can “learn” from humans about important factors such as competitive threats, nuanced negotiation strategies and even meteorological vagaries – factors that traditionally have been difficult to crunch into the binary 1s and 0s of machine language. The human brain works hierarchically, embedding propositions within propositions to think a potentially infinite number of thoughts. In the example from that posting, a sales rep who reads about a national wholesaler coming to town to open a discount distribution center can almost instantaneously form a series of mental propositions to evaluate the importance of that news and the probability of outcomes that may (or may not) require decisive competitive action from the rep’s firm.
“Burn the mathematics” wrote economist Alfred Marshall in a letter to a friend, musing about the proper role of mathematics and scientific inquiry in the field of economics. That 19th century cogitation would seem to be a prêt-à-porter soundbite for these latter days of the 21st century’s first decade – a time in which the mathematical infrastructure that underpins longstanding economic and financial theories stands accused of all manner of malfeasance, particularly given its presumed role in the decade’s signature economic event – the financial market meltdown of 2008.

The logic behind the accusation goes roughly thus: More complex (but not necessarily more “accurate”) models allow for more complex instruments to be created. Increased complexity means it takes more time to process and then fully comprehend what the numbers may be telling you. At the same time, though, technology allows buy and sell orders to be executed almost instantaneously through electronic trading systems. Time is of the essence, and ponderously complex computations simply won’t do. A seemingly elegant (and fast, and commercially viable) shortcut is discovered and becomes the currency of the day. The models’ outputs come to be trusted blindly simply because there is no time to question them (and too much money to be made by using them). The impenetrable Greek letters obfuscate the sensitivity of the models to changes in important assumptions – which is fine for a few years because those assumptions (e.g., rising housing prices) don’t change – but then all of a sudden they do. The models start losing more money than they make. Then the chasm widens further as the high levels of leverage in the system make themselves felt. The losses accelerate dramatically, wiping out years of profits in just a few months. Burn the mathematics, indeed.
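The leverage arithmetic behind that final acceleration is simple enough to sketch. The figures below are hypothetical, chosen only to illustrate the mechanism:

```python
# Illustrative sketch (figures are hypothetical, not from the post):
# how leverage turns a modest adverse move into an outsized loss.

def leveraged_return(asset_return: float, leverage: float) -> float:
    """Return on equity for a position financed at the given leverage.

    A position sized at `leverage` times equity gains or loses
    `leverage * asset_return` relative to that equity
    (financing costs ignored for simplicity).
    """
    return leverage * asset_return

# A model assuming ever-rising prices might shrug off a 2% adverse move...
modest = leveraged_return(-0.02, leverage=1.0)         # -2% of equity

# ...but at 30x leverage the same 2% move destroys most of the equity.
catastrophic = leveraged_return(-0.02, leverage=30.0)  # about -60% of equity

print(modest, catastrophic)
```

The model's error need not grow at all; leverage alone scales a survivable miss into a wipeout, which is why years of profits can vanish in months.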
But let’s take a different look at this apparent tight coupling of mathematics and dire outcomes. In recent correspondence, an author who has been widely published on the subject of Wall Street’s use of mathematical models offered us an interesting opinion. His point was that the problem with the models was not so much their complexity, but that they were models in the first place; his argument was that you can never perfectly hedge model risk. I agree with that observation: a model by definition selects some aspects of reality to represent and omits others, and the choice of what to include and what to omit is subject to human error – therefore fallible and not perfectly hedgeable. But I take issue with the idea that the fault lies in the existence of the models themselves. Models can be misused – that much is clear. But the notion that all models are doomed to failure obscures a deeper truth about the goals of predictive modeling: you can seek either to reduce the world or to truly explain it. By trying to elegantly reduce the world to as few predictor variables as possible, you are more likely to be sowing the seeds of future failure, because complexity and the actual drivers of outcomes are taken out of the equations to make them more solvable (or perhaps more sellable, as in the case of the Gaussian copula function behind Wall Street’s demise, discussed in a previous posting, “You Can’t Punt Away the Dimensionality Curse”). Predictive modelers don’t have to go down that road, however: they can instead set out with the goal not of reducing an entire system to a single neat, tractable equation, but of quantifying and explaining all of the relationships that dictate outcomes to the fullest extent possible. Tractability and computability can be addressed later in the process, through technological means; they should not dictate the fundamental mathematical approach at the outset.
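The failure mode of a reduced model can be made concrete with a toy example. The data-generating process and numbers below are my own illustrative assumptions, not anything from the post: a "reduced" one-variable model looks perfect while a hidden driver holds still, then fails the moment it moves.

```python
# Toy sketch (assumptions are hypothetical): omitted drivers hide in a
# reduced model's constant term until the regime changes.

# Hypothetical true process: the outcome depends on two drivers.
def true_outcome(x1: float, x2: float) -> float:
    return 2.0 * x1 + 5.0 * x2

# Calibration period: the hidden driver never moves (think "housing
# prices only rise"), so a one-variable fit absorbs its effect into
# the intercept and looks flawless in-sample.
X2_CALIBRATION = 1.0

def reduced_model(x1: float) -> float:
    # Slope recovered from calibration data; x2's whole contribution
    # is frozen into the constant term.
    return 2.0 * x1 + 5.0 * X2_CALIBRATION

# In-sample, the reduced model is exact:
assert reduced_model(3.0) == true_outcome(3.0, X2_CALIBRATION)

# Regime change: the "constant" driver drops from 1.0 to -1.0, and the
# reduced model is suddenly off by 5 * (1.0 - (-1.0)) = 10.
error = reduced_model(3.0) - true_outcome(3.0, -1.0)
print(error)  # 10.0
```

Nothing about the reduced model's fit statistics warned of this; the variable that mattered was simply not in the equation, which is the sense in which elegant reduction sows the seeds of future failure.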