One of my goals for this blog is to get down and dirty with cost justification for A/R and V/R technology. Pull no punches – see the cartoon on the previous post. But first I’m going to go big picture, sky high, and talk about cost justification from an economic, and yes, even patriotic, perspective.
There was an interesting article in the Wall Street Journal recently, Robots Aren’t Destroying Enough Jobs. Now, I put myself in the camp that has been saying, for a very long time, that technology is going to create a bleak jobs future for the average worker, the blue-collar worker, in this country. Some of that thinking was simply that not everyone can be trained in analytical reasoning or creative thinking, or even has any desire to do such work.
But this article has completely changed my mind – at least for the near to mid-term. The problem/challenge/opportunity is to use technology to “upskill” those workers so that they are effective and productive. I love that term, upskill, and if you’re not aware of it, it is embodied in the name of an A/R company, found at www.upskill.io, that does exactly that – raises the skill level of workers, in their case through the use of A/R.
We’ll come back to that in a minute, but let’s return to the WSJ article. It refers to a study, False Alarmism: Technological Disruption and the U.S. Labor Market, 1850–2015, from the Information Technology and Innovation Foundation, which paints a clear picture that technology is at a 150-year low point in disrupting jobs – yes, a low point.
There’s a lot of economic theory and evidence behind this, which I won’t go into, but the WSJ article goes on to point out that jobs have mostly been created in sectors with low productivity growth – that is, low technology-driven growth – such as education, healthcare, social assistance, and leisure and hospitality. Indeed, this effect was predicted by economist William Baumol in his theory of “cost disease.” Essentially, he stated that industries with low productivity growth are forced to continuously raise prices in order to retain labor, as a substitute for increasing productivity, since they have to compete for the same labor force. Has anyone noticed anything like increasing prices in education and healthcare? Perhaps there is some low productivity growth – i.e., lagging technology innovation and adoption – in those sectors?
On the other hand, in a sector like manufacturing, since 1984 the U.S. has seen output double while employment has decreased by a third – read that as productivity gains driven by technology innovation and adoption.
So what does all this have to do with A/R and V/R? I believe there is a major opportunity for technology to drive productivity gains by “upskilling” workers in the lagging sectors, such as healthcare, education and other service work.
A recent article in the Harvard Business Review titled, Augmented Reality Is Already Improving Worker Performance, discusses the following video of a worker wiring a wind turbine’s control box, first using a typical printed manual and then using an A/R headset. Using the headset, the worker’s performance improved by 34% – without any training whatsoever in using the headset. That is a phenomenal gain for such an immediate payoff.
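To give a feel for how a figure like that 34% gain could feed a cost-justification estimate, here is a back-of-envelope sketch. Only the 34% comes from the HBR example; every other number – the hourly wage, the hours per week, the headset price – is a made-up placeholder, not data from the article.

```python
# Back-of-envelope payback estimate for an A/R headset deployment.
# Only the 34% productivity gain comes from the HBR example above;
# the wage, hours, and headset cost below are hypothetical placeholders.

def payback_weeks(hourly_wage, hours_per_week, gain, headset_cost):
    """Weeks until the value of the extra output covers the headset cost."""
    # Value the productivity gain at the worker's loaded hourly rate.
    weekly_savings = hourly_wage * hours_per_week * gain
    return headset_cost / weekly_savings

# Hypothetical inputs: $25/hr technician, 40-hour week, $2,000 headset.
weeks = payback_weeks(hourly_wage=25, hours_per_week=40,
                      gain=0.34, headset_cost=2000)
print(f"Payback in about {weeks:.1f} weeks")  # prints "Payback in about 5.9 weeks"
```

Even with these rough placeholder numbers, the point stands: when the gain arrives with zero training cost, the payback period is measured in weeks, not years.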
The HBR article goes on to show the following chart depicting the growing skills gap in manufacturing. This is a slightly different flavor of the business case for upskilling technology than the one for sectors with lagging innovation.
All this points to a tremendous opportunity, if not an outright call to arms, to deploy A/R and V/R technology, especially in sectors with lagging productivity growth and lagging technology innovation and adoption. But even in sectors with strong innovation and adoption, such as manufacturing, the chart above shows plenty of opportunity as well. Surely adoption that requires no training and delivers immediate productivity gains, as in the video above, can justify itself.