It keeps happening, doesn’t it? Right now we have fake news, but this is just yet another example of how well-meant applications of technology can plunge us into a fresh state of chaos — à la “we are all citizen journalists now” becoming “Oh my goodness, who clicks on this rubbish?!” Fake news is only the latest in a series, from internet trolling and cyberbullying to overbearing surveillance and data misuse.
Such situations raise the question: when will we learn? Learning, we are told, is something technology is very good at. Indeed, right now we are in the machine-learning age, another step along the way to the singularity, when we can let our digitally enhanced selves merge with the greater artificial intelligence we are in the process of creating. Techno-nirvana is only a decade or so away.
Meanwhile, our enterprises can’t get enough of the outputs of such algorithmic capability. Big data begets even bigger insight, offering decision makers at all levels a comprehensive dashboard rather than a rear-view-mirror, report-based view. We are being empowered, enabled, our capabilities enhanced by the wealth of information at our fingertips. I paraphrase, but that’s the marketing pitch, pretty much.
Call me an old-school cynic, but I’m having a job reconciling what I actually see happening with these rosy-glow visions. In my struggle to marry the two, I have realised that one element is lacking, a quality whose very notion derives from aeons of thinking on the subject. The word we use to describe such deep consideration and its consequences is ‘wisdom’.
Marc Prensky, who coined the term ‘digital wisdom’ in a 2009 essay, was quick to acknowledge that wisdom is not an easy term to define:
“The Oxford English Dictionary suggests that wisdom's main component is judgment, referring to the ‘Capacity of judging rightly in matters relating to life and conduct, soundness of judgment in the choice of means and ends.’ Philosopher Robert Nozick suggests that wisdom lies in knowing what is important; other definitions see wisdom as the ability to solve problems — what Aristotle called ‘practical wisdom’. Some definitions — although not all — attribute to wisdom a moral component, locating wisdom in the ability to discern the ‘right’ or ‘healthy’ thing to do.”
While we can debate what wisdom means until the cows come home, less controversial is the notion that our continued use of technology lacks its input. We can cite specific examples, like Donald Trump’s shoot-from-the-hip use of Twitter, or broader themes such as the apparent inability of most new developments to take cybersecurity into account, but the fact remains that ‘soundness of judgment’, ‘knowing what is important’ and ‘the ability to discern’ are notably uncommon.
This is, perhaps, fair enough. After all, our ability to communicate across a distance, to capture and store large quantities of information, to then manipulate, analyse and report on it is little more than a century old. The lessons we are learning — perhaps repeatedly — are still bedding in, each major crisis of techno-stupidity offering only one element of a much broader, still-emerging picture.
So, we still use social tools even though we wonder about the implications for privacy or psychology. We continue to expect music for free even as we abhor celebrity culture. We leave our virtual doors and windows open, coping with identity theft and fraud, but keep our back doors locked due to media-hyped and irrational fears. We do so because we lack the mechanisms, hard-wired into our DNA and reinforced through stories, to tell us to do otherwise.
Just as we lack ancient wisdom, so we lack the ancient wise. Technology corporations may have technical fellows, distinguished engineers and chief scientists, and analyst and consulting firms their principals and partners, but no group of seasoned, gravitas-bearing people professes to keep the keys of how technology can be harnessed for good; nor, if they did, would kings or knaves be likely to listen.
One day such a group may evolve; one day we will know how to listen to our digital consciences, the quiet voices within all of us, retuned to our newly data-driven lifestyles. But for now we remain in the eye of the brainstorm, a tumult in which we confirm digital right from wrong only through hindsight. If we just wait for evolution to take its course, however, we condemn ourselves to repeating the same cycle, in which the fittest, not the best or most moral, survive.
Jon Collins’ in-depth look at tech and society