Does a Black Belt need to know how to hand-calculate a Chi-Square test statistic? When was the last time a team member had to figure out the sample size for a z-test without an app? "No one ever in their real life anymore needs to — and in most cases never does — do the calculations themselves," states Stanford's Keith Devlin (known as the "Math Guy" on NPR's Weekend Edition Saturday).
Today, everyone is plugging in numbers via software on their laptop or punching them into a programmable graphing calculator. It seems that the only time folks have put their pencils to use and pulled values from printed statistical tables is for an introductory college stats class. Even the advanced statistics and engineering courses in most universities no longer require longhand calculations, because that's not the way it's done in the "real" world where "real" employees work.
Old School vs. Visual Math
Has old school math been replaced with technology in the quest for speed? Absolutely. After all, time is money. However, there's another, more fundamental reason, one offered by Bret Victor, an ex-Apple developer. Victor wants to kill math. Well, actually he wants to kill math's interface. "The power to understand and predict the quantities of the world should not be restricted to those with a freakish knack for manipulating abstract symbols," he asserts.
Victor explains that what we think of as "math" — equations, numerals, operators, and variables — isn't math at all: it's merely the interface. So he really doesn't want to kill "math" per se, he wants to kill its symbolic representations and instead institute the use of visual math. He advocates that people need to see math in order to understand it, to communicate it, and to create with it.
Speaking of indecipherable symbols: why do movies and television shows use Roman numerals to denote the year the program was made? Not many people can interpret the symbols as quickly as the end credits scroll by. (Maybe producers are banking on this and use Roman numerals to help mask how long a program has been shelved before distribution.) However, if those Roman numerals can't be translated into meaningful information, then what good are they?
Same goes for statistical equations. For example, try to visualize Y = a + b1X1 + b2X2 + … + e and your mind is likely to go blank. Now look at a scatterplot, estimate a regression line, and you can guess a value for Pearson's correlation coefficient. Or what about histograms: how well do you understand the impact of bin size on the shape of your data's histogram? Often we opt for whatever setting renders the graph most readable, without grasping how that choice shapes the display. If we instead tried several bin sizes, we would see how the distribution's shape morphs and could draw better conclusions.
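The bin-size point is easy to try for yourself. The sketch below (a hypothetical example; the sample and bin counts are illustrative, not from the article) summarizes the same bimodal data at three bin counts — the counts per bin, and hence the apparent shape, change with the binning.

```python
import numpy as np

# Illustrative bimodal sample: two normal clusters centered at -2 and +2.
rng = np.random.default_rng(42)
data = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])

# Summarize the same data with coarse, medium, and fine bins.
# Coarse binning can smooth away detail; very fine binning looks ragged.
for bins in (4, 10, 40):
    counts, edges = np.histogram(data, bins=bins)
    width = edges[1] - edges[0]
    print(f"{bins:>2} bins -> bin width {width:.2f}, "
          f"tallest bin holds {counts.max()} of {counts.sum()} points")
```

Plotting each result (for example with matplotlib's `plt.hist(data, bins=bins)`) makes the morphing shape visible at a glance, which is exactly Victor's point about seeing the math rather than reading symbols.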
I didn't have the benefit of learning statistics visually. I went to the "old school" and learned formulas. I learned how to tab the dog-eared pages of the reference tables in the back of my textbooks. Years later, I'm finally starting to understand statistics. Too bad it wasn't when I was sitting in class.
What's your take on this issue? Does your organization teach the formulas behind the statistical tools? When was the last time you hand-calculated a statistic from a formula? Drop us a line — we'd love to share examples in an upcoming issue of MoreNews.