An old Russian/German guy by the name of Wassily Leontief once said of economists, "We move from more or less plausible but really arbitrary assumptions, to elegantly demonstrated but irrelevant conclusions." Just a fancy way of saying Garbage In – Garbage Out, really. He irritated the economics community with a paper he presented in 1971 called "Theoretical Assumptions and Nonobserved Facts" in which he took them to task for creating ever more complicated mathematical models based on bad data. Economic theories are then laid out in papers that explain the complicated math at great length, but "By the time it comes to interpretation of the substantive conclusions, the assumptions upon which the model is based are easily forgotten." He bemoaned the fact that conjuring up a more elegant model to manipulate data was deemed a greater achievement than finding new and better data to feed into basic models.
Economists are not alone in their guilt. They are just easy targets because their theories are so detached from reality, and their efforts to defend those theories in the face of overwhelming evidence of their absurdity only make them easier targets. In fact, manufacturers are just as guilty. Consider a couple of examples:
Manufacturing has been consumed with an effort to determine the correct batch size since a guy named Ford Harris conjured up Economic Order Quantity theory back in 1913 (R.H. Wilson popularized it later, which is why it is often called the Wilson formula). A 1934 tome called the Cost and Production Handbook said, "William E. Camp was the first to present (1922) a general formula to determine the production order quantity, such that the total cost per unit for setting up plus interest on stores investment would be minimum." The idea that batch sizes are a trade-off between inventory costs and setup costs has been stuck in manufacturing's craw for almost a century.
Toyota basically said (1) we don't know what the cost of inventory and setups are, but they are a lot higher than everyone thinks they are; and (2) whatever they are, they are unnecessary. In effect they said, 'set inventory costs to infinity and setup time to zero and see how that cranks through your model.' The idea that better data is needed to feed the model – not a more elegant model to process bad data – has fallen on quite a few deaf ears. Take a gander at this comically depressing, mind-numbing, silly but well-intentioned patent application for "Automated replenishment using an economic profit quantity" from just a couple of years ago. Same garbage in – a bunch of inane cost assumptions – run through incredibly complex mathematical manipulations that can only lead to the same garbage out.
The 1913 math was perfectly good – it was simply based, and continues to be based, on terribly inaccurate input data derived from inadequate accounting.
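For the curious, here is a minimal sketch of that 1913 square-root formula and of Toyota's thought experiment – the demand, setup, and carrying-cost numbers are made up purely for illustration, not taken from any real accounting system:

```python
from math import sqrt

def eoq(annual_demand, setup_cost, holding_cost_per_unit):
    """Classic 1913 square-root formula: the batch size that balances
    setup cost against inventory carrying cost."""
    return sqrt(2 * annual_demand * setup_cost / holding_cost_per_unit)

# Typical accounting-department guesses (illustrative numbers only):
# 10,000 units/year, $500 per setup, $2/unit/year to carry inventory.
print(round(eoq(10_000, 500, 2)))   # ~2236 units per batch

# Toyota's challenge: setups driven toward zero, carrying cost admitted
# to be far higher than the books say.
print(round(eoq(10_000, 5, 20)))    # ~71 units - the "optimal" batch collapses
```

The point is not the formula; it is that the "optimal" answer swings by more than an order of magnitude depending on cost inputs nobody actually knows.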
And here we have some economists performing what is described as "the first research that illustrates the effect of choice, quality and willingness to pay." They are wrestling with the fact that people do not make buying decisions based on price alone. Go figure! The problem they have is that price and volume have been assumed to be in direct mathematical relation to each other since a thinker of great thoughts in jolly old England named Alfred Marshall cooked up Price Elasticity of Demand back in 1890. Since then the idea that lower prices automatically mean more sales has spread throughout economics, into the business schools and into conventional wisdom.
The folks who did the study were shocked – shocked I tell you – to learn that when faced with more competing products people do not automatically pay lower prices. In fact, more choices give people more data points upon which to make a decision that is not purely price based – it is price relative to value. Why do you suppose people pay a lot for Starbucks coffee when they can get other coffee a lot cheaper? It took Starbucks to demonstrate that the coffee people had been drinking for years was crap. When faced with the choice between fifty cents for a cup'o crap and $2 for a cup of Starbucks, many people opted for the more expensive Starbucks because, while Starbucks was 4X the price, a lot of folks determined the value of the Starbucks coffee to be more than 4X the value of the crap.
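A back-of-the-envelope version of that arithmetic, using the fifty-cent and $2 prices from above plus an invented "perceived value" figure standing in for the thing the models cannot measure:

```python
# Hypothetical sketch of price-relative-to-value vs. price alone.
# The perceived_value numbers are invented for illustration only.
cheap     = {"price": 0.50, "perceived_value": 0.40}  # the cup o' crap
starbucks = {"price": 2.00, "perceived_value": 2.50}  # 4X the price, >4X the value

def value_per_dollar(coffee):
    return coffee["perceived_value"] / coffee["price"]

# Price-alone thinking says the fifty-cent cup always wins on volume.
# Price relative to value says otherwise:
print(value_per_dollar(cheap))      # 0.8
print(value_per_dollar(starbucks))  # 1.25
```

Crude as it is, the moment a value term enters the comparison, "lower price means more sales" stops being automatic.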
Value, however, is pretty hard to quantify – maybe even impossible. So the whole price-volume thing was based on an assumption of "all things being equal". It is all based on the idea of "commodities" – things that cannot be differentiated. Only all things are never equal. Commodities only exist in theory. In reality, whatever you are making is not identical to anything else, especially when you broaden the definition of what you are making to include the whole package of support and delivery that goes with it.
The idea that you are going to sell more because you reduced the price, because you reduced the cost, because you went to China, collapses like the proverbial house of cards. That was only a good idea if 'all things were equal' and they weren't. Most probably you just made them even less equal.
Lean is tough because it is a change in the basic inputs to manufacturing and economic system thinking. Companies struggle with it because those basic inputs are fuzzy and vague – the waste of inventory, the worth of people, the value of outstanding customer service – and they cannot be quantified to feed into our elegant and logical systems. We keep trying to devise more complicated systems to solve the dilemma – Activity Based Costing, increasingly complex ERP and management dashboards. They won't work for the simple reason that old Wassily explained – they are "elegantly demonstrated" but based on "arbitrary assumptions" that can only lead to "irrelevant conclusions".
Only wisdom, experience, common sense and judgment can lead to good decisions with fuzzy inputs. In other words, excellence requires management and leadership and cannot be relegated to conventional wisdom, yesterday's theories and mathematical models.
david foster says
An extreme example of elegant economic modeling based on rather dubious assumptions…an option value problem from Seinfeld:
http://www.princeton.edu/~dixitak/home/Elaine-Final-Web.pdf
Lou English says
Excellent article that profoundly summarizes the difficulties faced in communicating the benefits of lean in a traditional EOQ world.
Another related quote, author unknown:
“He dives into the deep sea of complex mathematical formula and makes swift sure strokes to the white cliffs of the obvious.”
Paul Todd says
When the FAA shut down all air traffic on 9/11, the challenge for air traffic controllers was to safely land 4500 planes immediately at unfamiliar airports, which they did. Later the FAA set up a committee to document the process so it could be repeated if the need ever arose, but in a rare moment of clarity, they decided not to try. They realized the process worked only because of the skill, judgement, and experience of the controllers.
Lean is sometimes battling decades of management training built on the belief that any business can be managed by the numbers – that the right decision is just a matter of more and better data.
John Fetzik says
The “all things being equal” part makes me think of the good old “Spherical Cow” joke about theoretical physicists.
http://en.wikipedia.org/wiki/Spherical_cow
The model works great as long as you make a bunch of (usually unrealistic) assumptions. Great for purely theoretical speculation, but not so useful when applied to the real world.
Bill Waddell says
Thanks John
I’ll save everyone the trouble of clicking the link to the joke. Here it is:
Milk production at a dairy farm was low, so the farmer wrote to the local university, asking for help from academia. A multidisciplinary team of professors was assembled, headed by a theoretical physicist, and two weeks of intensive on-site investigation took place. The scholars then returned to the university, notebooks crammed with data, where the task of writing the report was left to the team leader. Shortly thereafter the physicist returned to the farm, saying to the farmer “I have the solution, but it only works in the case of spherical cows in a vacuum.”