When computers chip away at our memories
by Ivan Briscoe
If the architects of technology's next great leap forward are to be believed, all knowledge may soon be shrunk to vanishing point. Nanotechnology, or computing carried out at the scale of atoms, is their byword for the future. Scientists at IBM have recently argued that, given its awesome potential, around 11 million 400-page volumes (the entire contents, say, of France's National Library) could be stored and primed for instant viewing on a device the size of a human palm.
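A quick back-of-envelope calculation suggests the scale of raw text involved. The sketch below assumes roughly 2,000 characters per page, stored as plain text at one byte per character; both figures are our assumptions, not IBM's:

```python
# Rough arithmetic for the palm-sized library claim.

volumes = 11_000_000        # the article's figure
pages_per_volume = 400      # the article's figure
bytes_per_page = 2_000      # assumed: ~2,000 plain-text characters per page

total_bytes = volumes * pages_per_volume * bytes_per_page
print(f"~{total_bytes / 1e12:.1f} terabytes")  # ~8.8 terabytes of plain text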
The ink may still be fresh on these blueprints, but the elixir of portable omniscience no longer seems so far away. Seemingly cast-iron laws of ever-increasing computer power, along with the rise of powerful new technologies, appear to point to a horizon where all that can be known and remembered can be transferred to machines with which human beings then interact at will. And it is a future that for some is already spelling big trouble for the brain.
Surveys point to yawning gaps in general knowledge
"Computers not only distract us from contemplation of deeper values; they discourage us from contemplation itself," declares Stephen Bertman, a classics professor at Canada's University of Windsor and author of the recent book Cultural Amnesia. In his opinion, society's love affair with fast and far-reaching machines (online computers, palm-tops and mobile phones, all just for starters) leads inexorably to memory loss rather than gain.
As surveys repeatedly show, knowledge of history, literature, geography and even current affairs seems to be in steep decline: 60 percent of adult Americans cannot recall the name of the president who ordered the dropping of the first atomic bomb, just as 77 percent of young Britons are perplexed by the words Magna Carta. The day of the nano-shrunk library could soon come, but will any of its users be able to remember a single line of poetry?
The connection between these yawning gaps in general knowledge and information technology is by no means established, but a host of thinkers in different fields are sure the issue will shortly become all too pertinent. "External support for our memory has a direct effect on our memory," argues Jean-Gabriel Ganascia, a leading neuroscientist based at Paris's Pierre et Marie Curie University. "At the same time as it helps us and extends our physical capabilities, it diminishes our individual faculties. This is a vital question, one which has been around for a long time. Even Plato speaks in the Phaedrus of writing being both a good and an evil for our memory."
Good or evil, writing has nevertheless formed one of the main tools in the evolution of human memory. Indeed, it is civilization's unrelenting hunger for placing memory in external stores (cave-paintings, then manuscripts, libraries, printed works and finally computers) that has supported the entire march of the species. As the Canadian neuropsychologist Merlin Donald has observed, each of these new technologies has helped humans "off-load" their memories. Pre-literate societies, for instance, depended on oral tradition for their expertise, a practice undermined by the flaws of overworked brains, though fertile ground for epic poetry. Through the written word, memories were freed from the head: knowledge could be stored for retrieval in books, and then recrafted into the sort of novel and complex codes on which modern society is founded. "Examples might include the servicing manuals for a rocket engine, the equations proving the Pythagorean theorem, a corporate income tax handbook, or the libretto and score for Eugene Onegin," states Donald.
Becoming good memory managers
The benefits of storing memory outside the brain are unquestionable, but the invention of printing over 500 years ago, followed by the post-war onset of computing, has added a new note to the process: that of thundering acceleration. One simple equation has come to embody this. Known as Moore's Law after Gordon Moore, the co-founder of Intel who formulated it, the rule stipulates that computing power (defined in terms of capacity and speed per unit cost) doubles every two years. The trend has held for the last 40 years. Should it continue as expected to around 2020, a personal computer by that year will have roughly the same processing power as a single human brain. Add the promised marvels of nanotechnology, optical and quantum computing, and machines might reach utterly daunting proportions. "One penny's worth of computing circa 2099 will have a billion times greater computing capacity than all humans on Earth," breezily announces Ray Kurzweil, an American supremo of artificial intelligence, in his book The Age of Spiritual Machines.
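The doubling arithmetic behind that 40-year trend is easy to check. Below is a minimal sketch of the compounding, assuming the article's two-year doubling period (the function name and Python framing are ours):

```python
# Cumulative growth under Moore's Law: computing power per unit cost
# doubles every two years (the article's stated period).

def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """Total multiplication of computing power after `years`."""
    return 2 ** (years / doubling_period)

# Forty years of sustained doubling comes to 2^20,
# roughly a million-fold increase:
print(f"{growth_factor(40):,.0f}x")  # 1,048,576x
```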
Kurzweil may well be too confident in his predictions, but the quandary remains: if computers become so quick, so mighty, so cheap, then where will the relatively impoverished human mind fit in? In 1997, IBM's Deep Blue computer beat the world's finest flesh-and-blood chess player, Garry Kasparov, over the course of a six-game match. If human functions ultimately resemble moves of chess, then must the brain and its stores submit to the superior wisdom of the microchip?
For many cognitive scientists, relations between mind and machine are already undergoing a drastic reconfiguration. "Distributed intelligence" is the new maxim, encapsulating all systems in which individuals and computers mesh to carry out a collective task, whether it be landing an aircraft or tracking share prices. The Internet is so far the crowning glory: a system that in principle might combine individual users into a potent "group mind." For Norman Johnson of the Symbiotic Intelligence Project at Los Alamos National Laboratory in New Mexico, the collective power unleashed by such a system could solve problems far beyond any individual's capacity.
All of this may sound abstract, but the effects on memory are being felt now. Facts and figures no longer take pride of place in school curricula. Within the past two years, South Korea, Singapore and Hong Kong (havens of rote learning) have debated plans to axe huge swathes of standard classroom study. Experts in education stress that students must learn to be adaptable, skilled in manipulating symbols, able to respond to new situations; in short, ready to deal with the new economy, a realm where the computer is king.
"We will need a lot of new skills," declares the neuro-psychologist Donald. "We have to become good memory managers-we've moved away from managing a lot in our heads to managing memory devices. We have to devote more space to this executive control and less to rote memory storage."
Nurturing imaginative thinking at school
As he acknowledges, the result is an inevitable reduction in "individual presence." It is a form of mental life that has, unsurprisingly, earned bitter recriminations. Earlier this year the Alliance for Childhood joined the fray by publishing a report entitled Fool's Gold, which attacks the numbing effects of computers at school, above all primary school: "A heavy diet of ready-made computer images and programmed toys appears to stunt imaginative thinking. Teachers report that children in our electronic society are becoming alarmingly deficient in generating their own ideas and images."
While proponents of the electronic future insist on the liberating, elevating potential of machines (Kurzweil even suggests that we could "port" our minds onto super-powered computers for an intellectually and sensually richer life), suspicion continues to fester. As Ganascia observes, human memory is much more than simple information processing. There are, for instance, at least five systems of human memory, making up an inordinately rich web of self-reflexive, interweaving recollection that no computer has even come close to imitating. But if memory is increasingly stored in machines that we then manage for our learning, work and leisure, how will these systems in the brain fare? And how will imagination, intelligence and understanding, all of which depend on an efficiently functioning memory, be affected? The simple answer is: we still do not know.
Yet one image stalks the debate. It is not the old science-fiction fear of a malevolent computer (the HAL of 2001: A Space Odyssey, perhaps), but that of a citizen without a personal memory to speak of. Bertman, for one, is convinced that boundless electronic information may be the deadliest enemy of human knowledge. "It's not just enough to remember where we live, what our birthday is, and the name of our wife; there's more to human personality and identity than just the details we can find inside our wallet."
Ivan Briscoe is a journalist at the UNESCO Courier.
(This article was first published by the UNESCO Courier, December 2001)
Texts published in 'Points of View' may not reflect UNESCO's position.