All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines
Psychologists have found that when we work with computers, we often fall victim to two cognitive ailments—complacency and bias—that can undercut our performance and lead to mistakes. Automation complacency occurs when a computer lulls us into a false sense of security. Confident that the machine will work flawlessly and handle any problem that crops up, we allow our attention to drift. We become disengaged from our work, and our awareness of what’s going on around us fades. Automation bias occurs when we place too much faith in the accuracy of the information coming through our monitors. Our trust in the software becomes so strong that we ignore or discount other information sources, including our own eyes and ears. When a computer provides incorrect or insufficient data, we remain oblivious to the error.
“All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines” – Nicholas Carr
Far too often, people think that using technology to improve our work means having technology do the work for us. Yet what makes some of the best work out there so incredible is the human element; algorithms can only go so far. The biggest risk, though, lies in the passage from Nicholas Carr's piece that I excerpted above. Computers are still so new, and tend to work so well, that we are far too easily lulled into assuming the machine will get it right. That means that when the machine doesn't get it right, we end up in deep trouble very quickly, as the example that opens Carr's piece makes clear.