In listening to a recent presentation on the trends affecting the workplace and HR, there was mention of how organizations are using more cognitive technology, AI, and the like, and how this is changing jobs. There were two additional notes. First, these efforts aren’t (largely) leading to job losses, as the affected folks are being reskilled. Second, HR wasn’t involved in 65% of these efforts. That’s a concern. But one of the things I wondered was whether all the new, smart technology really would help as much as was intended or needed.
So here’s some context (I may have heard this in conjunction with an early experiment in using mobile devices to support drug trials). Pharmaceutical companies are continually developing new drugs. One claim is that if people would follow their medication regimens, many of these new drugs wouldn’t be necessary. That is, the new drugs often try to require fewer doses, with simpler instructions, to make up for inappropriate use of the existing ones.
Likewise with the origin of performance support. The question is where the locus of responsibility belongs. Interface design people were upset about performance support systems, arguing (correctly) that performance support was being used to make up for bad system design in the first place. In fact, Don Norman’s book The Invisible Computer was about how interface design wasn’t being brought in early enough. The point being that properly designed interfaces would incorporate support for our cognitive limitations inherently, not externally.
So, many of the things we’re doing are driven by bad implementation. And that’s what I started wondering: are we using smart technology to enhance an optimized workforce, or to make up for a lack of adequate preparation? We could be putting in technology to compensate for what we’ve been unsuccessful at doing through training and elearning (because we’re not doing that well).
To put it another way, would we get better returns from applying what’s known about how we think, work, and learn than from bringing in technology? Would adequate preparation be a more effective approach than throwing technology at the problem, at least in some cases? There are strong reasons to use technology to do things we struggle to do well, and in particular to augment us. But perhaps a better investment, at least in some cases, would be to appropriately distribute tasks between what our brains do well and what technology does better.
Let me be clear: there are technologies that will do things more reliably than humans, and do things humans would prefer not to do. I’m all for the latter, at least ;). And we should optimize both technology and people. I’m a fan of technology that augments us in ways we want to be augmented. So my point is more to consider whether we’re doing enough to prepare people and support them in working together with technology. Your thoughts?
Michael Hernandez says
I think it is useful to recognize that whatever we do, we do it to benefit a person or people, though there may be unintended consequences that do not. I believe that at the beginning and end of every transaction or interaction is a person. However we innovate, I think we should focus on how it affects people, and endeavor to benefit as many people as possible. And our desire should be to include rather than exclude.
William Ryan says
Need people to evaluate process first, adding tech to bad process just makes bad faster. We need people in the mix!