The Next Big Thing
What about all that small data?
January 04, 2015
As we turn the page on another year it seems to be an appropriate time for all of us who are involved in process improvement to reflect on the big picture of what we are trying to accomplish, and the smaller picture of how we can be more effective. The excerpt below helps put things in perspective:
“When the Lord created the world and people to live in it—an enterprise which, according to modern science, took a very long time—I could well imagine that He reasoned with Himself as follows: ‘If I make everything predictable, these human beings, whom I have endowed with pretty good brains, will undoubtedly learn to predict everything, and they will thereupon have no motive to do anything at all, because they will recognize that the future is totally determined and cannot be influenced by any human action. On the other hand, if I make everything unpredictable, they will gradually discover that there is no rational basis for any decision whatsoever and, as in the first case, they will thereupon have no motive to do anything at all. Neither scheme would make sense. I must therefore create a mixture of the two. Let some things be predictable and let others be unpredictable. They will then, amongst many other things, have the very important tasks of finding out which is which.’”
—From E. F. Schumacher’s Small is Beautiful
That pretty much captures the essence of Lean Six Sigma, doesn't it?
Enterprises everywhere are taking this sentiment to heart, and leaders are enamored with the promise of big data and the analytic models it can yield. Big data certainly offers enormous potential, and on the surface the message seems simple: more data means more insight. But it's not quite that simple.
There's irony in the source of the quote above, a book called “Small is Beautiful,” because the current infatuation with big data may neglect a larger and more immediate source of value: the small data. It's alluring to build predictive models of customer behavior, but if we can't use the operational data to effectively manage the processes that fulfill that customer demand, are predictive models as useful as they could be? And if management structures and practices don't take advantage of the myriad meaningful signals available from the small data that surrounds us, are the disciplines really in place to take advantage of big data?
Second, big data efforts can resemble the Analyze phase of a DMAIC project conducted in isolation, without the Define, Measure, Improve, and Control activities that turn statistical conclusions into new practical realities. You've probably run into people who have been enchanted by statistical tools but can't translate the number-crunching into sustained positive change. Analytical insights from big data analysis are like potential energy: necessary but insufficient. The energy must be released in a controlled fashion and turned into useful action. Analyze without DMI & C is an exercise, not a process.
So in the coming year it may be useful to re-examine how we use the available small data to actually manage our core processes, while linking our wider “analytics” exercises to a realization process that is proven to yield results: DMAIC.
My 2 cents.
Comments (2)
Bill, you are very right that being able to gather and access big data must go hand in hand with actually applying that data to make us more effective in the specific aspects of the business. We must definitely make use of this data, but it should not compromise our attention to the finer details that bring success. —Juan Amaya, Tulane
February 19, 2015
This is a subject that is near and dear to my heart. Maybe that is a leading indicator of being a geek. I have had issues with this big data movement since it began, which leaves a person wondering whether they have legitimate concerns or are just being resistant to change.

There is a logical sequence of thought in a Six Sigma process. One of those tools, MSA, normally sits in the Define/Measure phase simply to assure you have good measurement systems producing good data. Everyone has heard the saying "garbage in, garbage out," and the whole point of MSA is to assure the quality of that data. It is rare to pull a large data set and not find, with a tool as simple as a dot plot, data that defies the laws of physics, e.g., weights with negative values. Without that measurement system rigor, the entire data set and the analysis are suspect. We could go into more detail, but that is a different discussion.

As you stated in the blog, the discipline in understanding a process that you gain from the Analyze phase is gone. Analyze is really a sorting process: you are looking for the leverage variables to help you understand the process. Which variables are leverage variables, which affect the mean, which affect the variance, and which don't do anything significant? It is more than doing a data dump and seeing what shows up.

We worked on a complicated process that had in excess of 20,000 control loops and was underproducing by approximately 25%. We built a regression model, but we did it by understanding the problem and understanding which control loops actually affected the output. The problem was fixed and the performance met expectations. That should have been the end of the project. There was also a factor held at a fixed level, which masked the effect of an input variable. Fortunately, when we took a strike, it changed the level of that input variable. When we reran the model with the new data, we understood the effect and got another 15% increase.
That was a function of process knowledge gained as we went through the analysis. Here is the issue: can big data help you? Of course. Anytime you start to analyze data, you begin to gain knowledge. I am also an advocate of not allowing perfection to get in the way of better. Run that shotgun approach and see if you can get a step-function improvement. That doesn't mean you can replace the detail work with one data dump. The concept of profound knowledge that Dr. Deming gave us is true, and remember that in his obstacles to improvement he cautioned against leaving improvement to a computer. Just my opinion.
January 22, 2015
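The commenter's MSA point, screening a data pull for values that defy physics before trusting any analysis, can be sketched in a few lines. This is a minimal illustration, not anyone's actual tooling: the function name, the example weights, and the physical limits are all hypothetical.

```python
# Hypothetical sanity screen in the spirit of the comment above:
# flag physically impossible measurements (e.g., negative weights)
# before any modeling begins, rather than trusting a raw data dump.

def screen_weights(weights, lower=0.0, upper=None):
    """Split values into (clean, suspect) lists against physical limits."""
    clean, suspect = [], []
    for w in weights:
        # A weight below zero (or above a known physical ceiling) is suspect.
        if w < lower or (upper is not None and w > upper):
            suspect.append(w)
        else:
            clean.append(w)
    return clean, suspect

# Example pull with two impossible negative weights mixed in.
clean, suspect = screen_weights([12.1, 11.8, -3.0, 12.4, -0.5])
print(suspect)  # [-3.0, -0.5]
```

A screen like this is no substitute for a full MSA study, but it catches the "garbage in" cases a dot plot would reveal, which is the commenter's point: without that rigor, the whole data set is suspect.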