The truth about tech statistics

“Researchers and analysts estimate that such and so many of this or that will increase or decrease by some percentage by sometime in the near or far future.”

We hear this all the time and, for the most part, we believe it. When a news story quotes some research firm stating that iPads hold a 53% share of the tablet market, we take it at face value. When some other research firm says that by 2017, 85% of devices will feature gesture recognition, we might raise an eyebrow and say to ourselves, ‘85% sounds like a lot, but these guys are supposed to know what they’re talking about, right?’

Tech journalists are very fond of quoting statistics from research firms. Statistics tend to lend credibility to arguments because that’s what they are designed to do. 

I tend to be a bit skeptical of research statistics (some might say downright cynical). I believe that most report results are driven by the needs of the people who buy the research. It’s a bit like saying, ‘I need some proof that what my company is planning to do is correct. Go find me some statistics that validate our plans.’

I know that’s not exactly how it works, but there is a kind of built-in conflict of interest in the research industry. Large corporations pay for most of the research, so researchers tend to focus on topics of interest to large corporations, and a report that predicts doom in a particular market segment isn’t going to go over well with big customers heavily invested in that segment.

Now, I’m not saying research companies fudge their statistics or manipulate results to suit their clients’ desires. In fact, I’m certain that if research firms had a way to get truly accurate figures, they would be in heaven. But few companies will divulge their true sales numbers to anyone, and most will exaggerate how well they are doing.

For a short time I actually worked for a research firm, trying to determine the size of the video capture board market. One company told me they had sold over 65,000 fairly expensive, high-end video capture boards to television stations, colleges, and universities in the U.S. Not bad when you consider there are only about 2,000 television stations and 4,000 colleges and universities. I guess each and every one of them bought ten.
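A quick back-of-the-envelope check (using only the round numbers from this anecdote, nothing from an actual dataset) shows why that claim didn’t pass the smell test:

    # Sanity-checking the vendor's claim with the rough figures above
    claimed_boards_sold = 65_000        # what the company said it had sold
    tv_stations = 2_000                 # approximate U.S. television stations
    colleges_and_universities = 4_000   # approximate U.S. colleges and universities

    potential_buyers = tv_stations + colleges_and_universities
    boards_per_buyer = claimed_boards_sold / potential_buyers

    print(f"Each station and campus would need roughly {boards_per_buyer:.1f} boards")
    # prints: Each station and campus would need roughly 10.8 boards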

But I digress. Really, it’s a game of sifting through figures from component suppliers, manufacturers, and OEMs (figures that may or may not be accurate) and making the best guess you can. Then you drag out the Ouija board, toss a few darts, and consult the Magic 8-Ball to make predictions about the future. So we should take all statistics with at least a few grains of salt, and predictions based on those statistics should come with high-sodium warning labels.

As noted above, for many years I covered the video industry, both high-end professional production and low-end consumer video. Since the dawn of television, video production had been an extremely expensive proposition, but in the early to mid-’90s personal computers got much more powerful, the prices of video cameras and capture hardware dropped significantly (a decent MPEG capture card that once cost over $10,000 could be had for a few hundred bucks by the late ’90s), and video editing software got cheaper and easier to use.

Analysts at the time estimated that by 1998 roughly 2.5% of consumers with video cameras would be editing their home videos on their personal computers, and that the number would rise dramatically, by something like 3% to 5% per year.

Okay, I guess, maybe. I was a bit skeptical. For one thing, I didn’t think they were accounting for the fact that most people who bought a video camera used it like crazy for a few months and then lost interest, eventually relegating their new toy to the closet, only to be brought out for vacations, holidays, and birthdays (and eventually not even then). But home video editing was a statistic I tracked over the years.

The year 2000 rolled around and new statistics came out. Computers were even more powerful, video camera sales were going through the roof, and hardware and software prices had dropped even lower. Yet the estimates of people editing videos of little Becky’s fifth birthday party were still only about… 2.5%. Not to worry, however, because the predictions called for an even steeper rise in home video editing over the coming years, with guesstimates saying that by 2005 nearly 10% of all homes would be editing up a storm.

2005 came and went. There were still plenty of video camera sales and even cheaper and easier ways to edit video (you didn’t even need a capture card anymore), and still the analysts said that approximately… wait for it… 2.5% of people were editing their home videos, and (you guessed it) those numbers were expected to rise dramatically to… something (it doesn’t really matter) by 2010.

Today, capturing video is as easy as holding up your phone and clicking a button. Posting it on Facebook or YouTube is as easy as clicking another button. Two buttons! And yet how many videos on Facebook are actually edited?

My guess? Roughly 2.5%.

Industry statistics are, at best, rough estimates, and market research predictions from industry analysts are even more suspect. I place analysts in the same category as economists. If economists actually knew what they were doing, wouldn’t they all be billionaires?