I’m often struck by the lack of a sense of historical continuity, of curiosity about the ideas behind the development of our current technology, in the community of its producers and users, whether in academia, gaming, the corporate world, or scientific research. This may be one reason its use has become so automatic that very little attention is paid to the costs (internal and external) of our increasingly automatic reliance on a pervasive level of obligate technology in our lives. “What were you thinking?” is the semi-sarcastic question posed when someone does something … well, stupid. What if we really asked it?
First, let me be clear about two things. One) “us” and “our” in this context mean a limited slice of the global human population: First-World, college-educated, professional, often majority “white” (ethnically European, but mainly meaning European-oriented), as well as a similar (“elite”) tranche in parts of Asia, the Middle East, Africa, and Latin America. For many people in the so-called Global South or “developing” countries, technology plays a very different, sometimes crucial role in their economic and social lives which, at least for the time being, moots the kinds of effects I am concerned with here.
Two) I am not a Luddite (even the Luddites were not actually anti-technology*). I have used computers since the mid-1970s, and I lived and worked in Silicon Valley for 18 years. I use the internet daily, I have a very active Facebook presence, I disseminate my writing partly through a Substack newsletter, I read books on a device via Kindle or my library’s Hoopla system, I message with family members; my life as it is would crash without e-mail. In fact, I assert that the Search Engine is one of the stunning achievements of humankind, on a par with The Library.
Some early developers and designers believed that the internet, the personal computer, and the mobile phone would infinitely improve our lives. That very uneven and asymmetrical “improvement” has been weaponized and overwhelmingly monetized by the forces of militarism, commerce and, sometimes, sheer propaganda. Digital technology has come to dominate almost all communication, with frequently damaging results. To name three massively consequential examples: 1) the international rush into an ill-conceived War on Terror [and sadly the US in particular tends to make war on things like ‘Drugs’ and Poverty as a way of avoiding addressing their causes]; 2) the 2016 US presidential election; 3) the Brexit referendum in the UK.
As I looked at the way we relate to the devices and apps that make possible so much that is wonderful, I isolated three aspects of human behavior and perception that shape our current relation to technology like the mobile phone and the apps that make it work. I feel it’s important to recall that an app is an application, a pragma: a set of instructions intended to do one thing, or a narrow set of specific things, as efficiently as possible. Currently, being unable or unwilling to download certain apps effectively cuts one off from a range of possibilities and sometimes from crucial courses of action, which can include registering to vote, getting a passport, applying for Social Security benefits, accessing public information, making medical appointments, etc.
The three things are: Habit, Convenience, and Cognitive Overload (or rather, our management of and adaptation to it). They are implicated in a kind of entrapment, and they generate considerable frustration via the very technology that was supposed to have set us free.
HABIT is a basic human practice, necessary in daily life, by which we automate somatic and kinaesthetic subroutines for things like driving a vehicle, athletic activities, skilled actions, or socially useful ones like saying excuse me when we bump into someone. It is the underlying mechanism of the brain’s default mode network, the neuro-psychological construct that oversees how we manage a life that might be onerous without it in advanced civilization. But habit can sometimes take over and prevent positive change, learning, and progress, or verge into compulsion and addiction, in both individuals and societies.
CONVENIENCE is a complex modern conundrum. What started by saving people from having to haul water from a well or chop wood to keep warm, or by providing a telephone that could summon help in an emergency, has more recently devolved into saving us from the effort of turning on a light switch, checking the refrigerator to see what groceries we might need, setting a thermostat, remembering how much we actually moved our own bodies today, pushing a button on the remote control (“asking” Siri or Alexa to do it instead), or finding a place to order pizza! In a January 2024 interview in The Economist, Microsoft head Satya Nadella touts AI as a way of “getting more things done with less drudgery.” As ordinary tasks are reconceptualized as “drudgery,” the ideology of “the more technology the better” dominates public discourse and brushes aside important questions.
Where is the space to ask what are essentially ethical questions? How difficult is a task in the first place, and does that difficulty justify consuming non-renewable resources to eliminate it? What will be its civilizational (social and cognitive) costs: what skills are being deconditioned by any particular convenience, especially as millions of people adopt it? This encompasses 1) the well-documented loss in many young people of the ability to read maps or navigate spaces and landscapes without GPS or some other AI; 2) the loss of the ability to spell one’s own language, demonstrable across social media and now spilling into traditional media; 3) the loss of the capacity to read cursive handwriting, which could preclude future people from reading original historical documents, a consequential loss of access to the historical record, or leave only an elite ‘priesthood’ able to read them while most cannot. Back to the Middle Ages, or to ancient Egypt or China.
Convenience is a primary MARKETING TOOL. Tim Wu, in reflections on technological evolution for the New Yorker’s science and technology blog, argues that while technological evolution is a metaphor based on the processes of biological evolution, “technological evolution in a market economy is determined by what companies decide to sell based on what they believe we, as consumers, will pay for.” [Supermarket self-checkout machines, for instance, are presented as “convenience” when in fact they are not so much convenient for the consumer (except insofar as there are so few checkers that one would otherwise have to wait in a long line to pay) as they are a way of saving the company LABOR expenses.]
Convenience is used as an excuse for the massive waste that global industrial civilization inflicts upon the natural world. It is convenient for an individual to use a disposable plastic fork to eat lunch instead of having to carry a reusable one; it is convenient for a business not to have to wash used forks (again, labor costs). In all of this, where is the place to ask about the impact, cost, and consequence of millions of plastic forks being dumped into wildlife habitats, on the quality of air and water, on future generations? The reply of Sam Altman (head of OpenAI) to such questions (in the same Economist interview) is: “The world will have a two-week freakout and then people will go on with their lives.” An arrogant dismissal of any possible legitimate concerns that “the world” might have.
In 1985 Neil Postman published Amusing Ourselves to Death, in which he examined the effects of the general shift from print to television. We need a similar investigation today into the physical, psychological, and political effects of the media through which we communicate with one another and get information; we need to interrogate the assumptions and projections from which our current consumer technology emerged, especially the ‘smartphone’ into which everything has converged.
In the early 1990s at Xerox PARC (Palo Alto Research Center), Mark Weiser’s vision was that bits of technology would be embedded in many parts of our lives, and that this “internet of things” would enable us to interact with the world and with the technology more intuitively. What we now have instead is a narrow channeling into one portable device that enables a great deal but also cuts off or blocks out perhaps more. My experience is that anything with the word “smart” in its name is going to be intrusive and will require that I modify and adapt my behavior to it. The tool will instruct me behavioristically in how I should use it, and enforce that in ingenious ways.
The corollary to the assumption that convenience is automatically to be desired is the presumed inevitability of any new technology, which is therefore held to deserve prioritized and rewarded development. The recent “Techno-Optimist Manifesto” by one-time software engineer and now billionaire investor Marc Andreessen takes the fact that technology, on the whole, has dramatically improved human life and, as Adrienne LaFrance argues in The Atlantic, proceeds to “inflate it into the absurd conclusion that any attempt to restrain technological development under any circumstances is despicable.”
COGNITIVE OVERLOAD: In 1945 Vannevar Bush, the chief science advisor to Franklin D. Roosevelt, warned that “a growing mountain of research” (what we now call data) promised to overwhelm any individual scientist’s processing ability. As Gertrude Stein (perhaps more charmingly) once put it, “Everybody gets so much information all day long that they lose their common sense.” As Kevin Munger recently wrote in Mother Jones, “We are living with technology moving at an inhuman speed, operating at scales simultaneously smaller than we can detect and larger than anyone can comprehend.”
Cognitive overload is a pervasive condition that contributes to problems ranging from lack of empathy (“compassion fatigue”), to accidents caused by texting while driving, to a degradation of language use. It creates stress and is the opposite of mindfulness. Mindfulness is a pragmatic extraction from Buddhist philosophy; there is now a whole industry devoted to bringing “mindfulness” training into various environments to help people adapt to stress.
What are we doing?
We are living in a world of massive plenty and opportunity (for some of us), which suggests that we ought to be making conscious choices, not acquiescing to the easiest way. And so for every bite of food, piece of clothing, or activity we are confronted with a panoply of ethical distinctions — questions we really should be asking: is it Fair Trade? Are the chickens free-range? Was it produced with slave labor? What environmental impact does this product have? And, for most of us: can I afford it, or am I willing to pay more for it if it is better for the environment or for the workers who made it? That kind of minute-by-minute exercise can give you a headache, let alone cognitive overload. Yet how does one justify NOT doing that analysis? Often we acquiesce to technology’s demands because we simply do not have the bandwidth to grapple with reading a multi-page legal document we must agree to before carrying out some necessary task, or to choose yet another password before we can access or register for something needed right now, all while earning a living, caring for our families and our own health; in short, living our complex lives.
What were they thinking?
Above is the cover of my late husband Rich Gold’s book, which I edited from his writings. He joined Xerox PARC in early 1991 to work in Mark Weiser’s group. Weiser thought that “the most profound technologies are those that disappear.” He was drawing on his studies in phenomenology, envisioning a world where “ubiquitous computing” would be an intuitive interface between users (“everyone”) and the world. And that would presumably free them up to be more aware of the world, connected to the wider ‘sensorium’ with less obstruction. John Tinnell’s The Philosopher of Palo Alto looks at the ideas underlying Weiser and his team’s initial development of the things that are now in every hand.
What happened to that utopian vision? The market forces that enabled some people to make BILLIONS of dollars are a big part of the answer. Once you are in possession of that kind of wealth, your fundamental view of the world is radically changed. I have seen it happen. The prevalent position in Silicon Valley is a basic ‘libertarian’ one: “Sure, marry whom you want, take the substances you like; if you don’t want to have a child you should have access to birth control and to termination of pregnancy; yes, of course we should respect nature. BUT if you want government to step in to protect people from harm by regulating MY industry … wait a minute: you will stifle innovation, I am a job and wealth creator, you can’t tell me what to do.” Liberal on social issues, conservative on labor, industrial, and actual environmental issues.
Early utopian visions of the web have turned into what Shoshana Zuboff calls “a rogue mutation of capitalism marked by concentrations of wealth, knowledge and power unprecedented in human history.” While industrial capitalism exploited and controlled nature, with devastating consequences, surveillance capitalism exploits and controls human nature. She articulates in detail how, because we are always looking back at the last big threat (Big Brother, modeled on Stalin and Hitler), we have missed a massive, unprecedented threat at the very heart of our lives. The effluvia of our living our lives has become free fuel for those who would engineer our futures without our conscious consent.
In Palo Alto: A History of California, Capitalism, and the World, Malcolm Harris examines in detail how exploitative labor practices — making products without pesky unions advocating for workers’ rights, temp work, piecework, shares and stock options instead of benefits and pensions — allowed the tech industry to roar ahead to astronomical growth, and profits for the few. He situates this within the specific history of how government contracts, military-funded research, low-wage immigrant workers, targeted real estate practices, and other factors laid the foundation for the explosive growth that has typified the tech industry. Harris chronicles how, for example, Larry Page and Sergey Brin, after publicly stating “we believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm,” went on to found Google, whose parent company Alphabet is now one of the five most powerful and profitable companies in the world, alongside Amazon, Apple, Meta, and Microsoft. The Search Engine, that stunning achievement of humanity, is much less so if it is privatized and fenced off, and if you can be identified, targeted, categorized, and manipulated based on what you search for with it.
While he does a thorough job on Leland Stanford, Herbert Hoover, and our current titans of industry, the unique strength of Harris’s analysis is that he focuses less on individual actors as carriers of meaning than on the wider historical forces that shape the conditions in which certain individuals can revolutionize economic and social life. I think that to understand how technology shapes knowledge and society we must do the same.
I end with some words of Hannah Arendt, who warned that thoughtlessness creates the conditions for evil, and whose proposal in The Human Condition was “nothing more than to think what we are doing.”
Bibliography:
The Philosopher of Palo Alto: Mark Weiser, Xerox PARC, and the Original Internet of Things, by John Tinnell (2023, University of Chicago Press)
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, by Shoshana Zuboff (2019, Profile Books)
Palo Alto: A History of California, Capitalism, and the World, by Malcolm Harris (2023, Little, Brown)
The Plenitude: Creativity, Innovation, and Making Stuff, by Rich Gold (2007, MIT Press)
[see also] Technics & Civilization by Lewis Mumford
Notes:
“evolution through collapse”: Ian Bogost, “We’ve Forgotten How to Use Computers,” The Atlantic, January 2024
Kevin Munger, “The Algorithm Does Not Exist,” Mother Jones
Neil Postman: “What Orwell feared [in 1984] were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture … As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny "failed to take into account man's almost infinite appetite for distractions." In 1984, Huxley added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us. “
In Fahrenheit 451, Ray Bradbury shows a society where books are not just banned but burned because they “make people unhappy.” And to Remember (memorize) books is Resistance.
*The first Luddites were artisans and cloth workers in England who, at the onset of the Industrial Revolution, protested the way factory owners used machinery to undercut their status and wages. Contrary to popular belief, they did not dislike technology; most were skilled technicians. At the time, some entrepreneurs had started to deploy automated machines that unskilled workers—many of them children—could use to churn out cheap, low-quality goods. And while the price of garments fell and the industrial economy boomed, hundreds of thousands of working people fell into poverty. When petitioning Parliament and appealing to the industrialists for minimum wages and basic protections failed, many organized under the banner of a Robin Hood–like figure, Ned Ludd, and took up hammers to smash the industrialists’ machines. They became the Luddites.
In “Science Fiction is a Luddite Literature” Cory Doctorow writes that “Luddism and science fiction concern themselves with the same questions: not merely what the technology does, but who it does it for and who it does it to.”