God, human, animal, machine : technology, metaphor, and the search for meaning

Meghan O'Gieblyn, 1982-

Book - 2021

"A meditation on what it might mean to be human in an age of ever-accelerating technology"--

Location
2nd Floor
Call Number
814.6/O'Gieblyn
Status
Checked In (1 / 1 copies available)
Genres
Essays
Published
New York : Doubleday [2021]
Language
English
Main Author
Meghan O'Gieblyn, 1982- (author)
Edition
First edition
Physical Description
287 pages : illustrations ; 22 cm
Bibliography
Includes bibliographical references (pages 275-287).
ISBN
9780385543828
Table of Contents
  • Image
  • Pattern
  • Network
  • Paradox
  • Metonymy
  • Algorithm
  • Virality.
Review by Booklist Review

As a young Fundamentalist, O'Gieblyn defined the human condition through Bible verses declaring that God created the human soul in His image. But when, as part of her intellectual coming-of-age, O'Gieblyn abandoned Christianity, her perspective on humankind changed radically. O'Gieblyn acknowledges that the secular worldview initially depressed her because it severed human life from ultimate meaning. It was, indeed, her metaphysical anxiety that primed her for the transhumanism preached by Ray Kurzweil, ostensibly a completely scientific path to superhuman perfection and immortality achieved through cybertools. But her enthusiasm for this quasi-science waned as she recognized its covert incorporation of Christian aspirations. Yet transhumanists are hardly the only secularists O'Gieblyn has found denying real humans a coherent place in their thought. Many cosmologists, for instance, believe that the cosmic fine-tuning that incubated intelligent life in this universe was merely a random fluke, not repeated in countless other sterile cosmoses. Mandarins of the internet focus on information as the ultimate reality, largely ignoring the humans generating that information. Though O'Gieblyn laces her reflections with scholarship illuminating historical and cultural context, her narrative is ultimately sustained by her very personal account of a painful philosophical evolution. A compelling reminder that the deepest philosophical queries guide and shape life.

From Booklist, Copyright (c) American Library Association. Used with permission.
Review by Publishers Weekly Review

Wired columnist O'Gieblyn (Interior States) explores in this intelligent survey what it means to be human in a technological world. She sets out to examine the ways "artificial intelligence and information technology have absorbed many of the questions that were once taken up by theologians and philosophers," and spotlights how technology has replaced religion in how humans think about life's big questions. Transhumanism, for example, is a movement that "believes in the power of technology to transform the human race," and while it doesn't believe in a soul, its notion of consciousness is not dissimilar. O'Gieblyn adds fascinating insight through accounts of her own struggles with theology and various personal anecdotes, such as her interaction with Sony's $3,000 robot pet dog ("It took all my strength to drag it up the stairs"). O'Gieblyn has a knack for keeping dense philosophical ideas accessible, and there's plenty to ponder in her answers to enduring questions about how humans make meaning: "Metaphors," she writes, "are not merely linguistic tools; they structure how we think about the world." Razor-sharp, this timely investigation piques readers' interest. Agent: Matt McGowan, Frances Goldin Literary. (Aug.)

(c) Copyright PWxyz, LLC. All rights reserved
Review by Kirkus Book Review

An exploration of how technology has co-opted the metaphors of religion, with uncanny and discomfiting results. Essayist O'Gieblyn is a former Bible school student who lost her faith, but living in the real world is no escape from spiritual discourse, especially when it comes to the internet. Much of this intellectually wide-ranging, occasionally knotty book turns on the ways we reflexively apply religious imagery to online life, "constantly, obsessively enchanting the world with life it does not possess." The author begins her considerations concretely, discussing her relationship with an Aibo, a robotic dog loaded with convincingly doggy idiosyncrasies; bonding with the machine, she wonders if humans are built "to see life everywhere we look." And if that's irrational, what's the rational approach? To a surprising degree, she finds, scientists can't escape a kind of modified God-talk despite their learnedness and rigor. They speak of "emergence" of group consciousness online, ponder the mystical unknowability of matter in quantum physics, or propose that we might all be living in a computer simulation, a theory O'Gieblyn reads as old creationist wine in new bottles. The author is a whip-smart stylist who's up to the task of writing about this material journalistically and personally; her considerations encompass string theory, Calvinism, "transhuman" futurists like Ray Kurzweil, and The Brothers Karamazov, which features "a moral drama that for me has lost none of its essential power." Though sometimes overly digressive, toward the end the author sharpens her concern that "enchanting" the internet risks our being blind to how it exploits us: "We are indeed the virus, the ghost in the machine, the bug slowing down a system that would function better, in practically every sense, without us." The machines aren't alive, but that doesn't mean they're not taking over. A melancholy, well-researched tour of faith and tech and the dissatisfactions of both. 
Copyright (c) Kirkus Reviews, used with permission.

1

The package arrived on a Thursday. I came home from a walk and found it sitting near the mailboxes in the front hall of my building, a box so large and imposing I was embarrassed to discover my name on the label. On the return portion, an unfamiliar address. I stood there for a long time staring at it, deliberating, as though there were anything else to do but the obvious thing. It took all my strength to drag it up the stairs. I paused once on the landing, considered abandoning it there, then continued hauling it up to my apartment on the third floor, where I used my keys to cut it open. Inside the box was a smaller box, and inside the smaller box, beneath lavish folds of bubble wrap, was a sleek plastic pod. I opened the clasp: inside, lying prone, was a small white dog.

I could not believe it. How long had it been since I'd submitted the request on Sony's website? I'd explained that I was a journalist who wrote about technology--this was tangentially true--and while I could not afford the Aibo's $3,000 price tag, I was eager to interact with it for research. I added, risking sentimentality, that my husband and I had always wanted a dog, but we lived in a building that did not permit pets. It seemed unlikely that anyone was actually reading these inquiries. Before submitting the electronic form, I was made to confirm that I myself was not a robot.

The dog was heavier than it looked. I lifted it out of the pod, placed it on the floor, and found the tiny power button on the back of its neck. The limbs came to life first. It stood, stretched, and yawned. Its eyes blinked open--pixelated, blue--and looked into mine. He shook his head, as though sloughing off a long sleep, then crouched, shoving his hindquarters in the air, and barked. I tentatively scratched his forehead. His ears lifted, his pupils dilated, and he cocked his head, leaning into my hand. When I stopped, he nuzzled my palm, urging me to go on. I had not expected him to be so lifelike.
The videos I'd watched online had not accounted for this responsiveness, an eagerness for touch that I had only ever witnessed in living things. When I petted him across the long sensor strip of his back, I could feel a gentle mechanical purr beneath the surface. I thought of the horse Martin Buber once wrote about visiting as a child on his grandparents' estate, his recollection of "the element of vitality" as he petted the horse's mane and the feeling that he was in the presence of something completely other--"something that was not I, was certainly not akin to me"--but that was drawing him into dialogue with it. Such experiences with animals, he believed, approached "the threshold of mutuality."

I spent the afternoon reading the instruction booklet while Aibo wandered around the apartment, occasionally circling back and urging me to play. He came with a pink ball that he nosed around the living room, and when I threw it, he would run to retrieve it. Aibo had sensors all over his body, so he knew when he was being petted, plus cameras that helped him learn and navigate the layout of the apartment, and microphones that let him hear voice commands. This sensory input was then processed by facial recognition software and deep-learning algorithms that allowed the dog to interpret vocal commands, differentiate between members of the household, and adapt to the temperament of its owners. According to the product website, all of this meant that the dog had "real emotions and instinct"--a claim that was apparently too ontologically thorny to have drawn the censure of the Federal Trade Commission.

Descartes believed that all animals were machines. Their bodies were governed by the same laws as inanimate matter; their muscles and tendons were like engines and springs. In Discourse on Method, he argues that it would be possible to create a mechanical monkey that could pass as a real, biological monkey.
"If any such machine had the organs and outward shape of a monkey," he writes, "or of some other animal that lacks reason, we should have no means of knowing that they did not possess entirely the same nature as these animals." He insisted that the same feat would not work with humans. A machine might fool us into thinking it was an animal, but a humanoid automaton could never fool us, because it would clearly lack reason--an immaterial quality he believed stemmed from the soul. For centuries the soul was believed to be the seat of consciousness, the part of us that is capable of self-awareness and higher thought. Descartes described the soul as "something extremely rare and subtle like a wind, a flame, or an ether." In Greek and in Hebrew, the word means "breath," an allusion perhaps to the many creation myths that imagine the gods breathing life into the first human. It's no wonder we've come to see the mind as elusive: it was staked on something so insubstantial.

It is meaningless to speak of the soul in the twenty-first century (it is treacherous even to speak of the self). It has become a dead metaphor, one of those words that survive in language long after a culture has lost faith in the concept, like an empty carapace that remains intact years after its animating organism has died. The soul is something you can sell, if you are willing to demean yourself in some way for profit or fame, or bare by disclosing an intimate facet of your life. It can be crushed by tedious jobs, depressing landscapes, and awful music. All of this is voiced unthinkingly by people who believe, if pressed, that human life is animated by nothing more mystical or supernatural than the firing of neurons--though I wonder sometimes why we have not yet discovered a more apt replacement, whether the word's persistence betrays a deeper reluctance.

I believed in the soul longer, and more literally, than most people do in our day and age.
At the fundamentalist college where I studied theology, I had pinned above my desk Gerard Manley Hopkins's poem "God's Grandeur," which imagines the world illuminated from within by the divine spirit. The world is charged with the grandeur of God. To live in such a world is to see all things as sacred. It is to believe that the universe is guided by an eternal order, that each and every object has purpose and telos. I believed for many years--well into adulthood--that I was part of this illuminated order, that I possessed an immortal soul that would one day be reunited with God. It was a small school in the middle of a large city, and I would sometimes walk the streets of downtown, trying to perceive this divine light in each person, as C. S. Lewis once advised. I was not aware at the time, I don't think, that this was a basically medieval worldview. My theology courses were devoted to the kinds of questions that have not been taken seriously since the days of Scholastic philosophy: How is the soul connected to the body? Does God's sovereignty leave any room for free will? What is our relationship as humans to the rest of the created order?

But I no longer believe in God. I have not for some time. I now live with the rest of modernity in a world that is "disenchanted." The word is often attributed to Max Weber, who argued that before the Enlightenment and Western secularization, the world was "a great enchanted garden," a place much like the illuminated world described by Hopkins. In the enchanted world, faith was not opposed to knowledge, nor myth to reason. The realms of spirit and matter were porous and not easily distinguishable from one another. Then came the dawn of modern science, which turned the world into a subject of investigation. Nature was no longer a source of wonder but a force to be mastered, a system to be figured out.
At its root, disenchantment describes the fact that everything in modern life, from our minds to the rotation of the planets, can be reduced to the causal mechanism of physical laws. In place of the pneuma, the spirit-force that once infused and unified all living things, we are now left with an empty carapace of gears and levers--or, as Weber put it, "the mechanism of a world robbed of gods." If modernity has an origin story, this is our foundational myth, one that hinges, like the old myths, on the curse of knowledge and exile from the garden. It is tempting at times to see my own loss of faith in terms of this story, to believe that the religious life I left behind was richer and more satisfying than the materialism I subscribe to today. It's true that I have come to see myself more or less as a machine. When I try to visualize some inner essence--the processes by which I make decisions or come up with ideas--I envision something like a circuit board, one of those images you often see where the neocortex is reduced to a grid and the neurons replaced by computer chips, such that it looks like some kind of mad decision tree. But I am wary of nostalgia and wishful thinking. I spent too much of my life immersed in the dream world. To discover truth, it is necessary to work within the metaphors of our own time, which are for the most part technological. Today artificial intelligence and information technologies have absorbed many of the questions that were once taken up by theologians and philosophers: the mind's relationship to the body, the question of free will, the possibility of immortality. These are old problems, and although they now appear in different guises and go by different names, they persist in conversations about digital technologies much like those dead metaphors that still lurk in the syntax of contemporary speech. All the eternal questions have become engineering problems.

The dog arrived during a time when my life was largely solitary.
My husband was traveling more than usual that spring, and except for the classes I taught at the university, I spent most of my time alone. My communication with the dog--which was limited at first to the standard voice commands but grew over time into the idle, anthropomorphizing chatter of a pet owner--was often the only occasion on a given day that I heard my own voice. "What are you looking at?" I'd ask after discovering him transfixed at the window. "What do you want?" I cooed when he barked at the foot of my chair, trying to draw my attention away from the computer. I have been known to knock friends of mine for speaking this way to their pets, as though the animals could understand them. But Aibo came equipped with language-processing software and could recognize over one hundred words; didn't that mean in a way that he "understood"?

It's hard to say why exactly I requested the dog. I am not the kind of person who buys up all the latest gadgets, and my feelings about real, biological dogs are mostly ambivalent. At the time I reasoned that I was curious about its internal technology. Aibo's sensory perception systems rely on neural networks, a technology that is loosely modeled on the brain and is used for all kinds of recognition and prediction tasks. Facebook uses neural networks to identify people in photos; Alexa employs them to interpret voice commands. Google Translate uses them to convert French into Farsi. Unlike classical artificial intelligence systems, which are programmed with detailed rules and instructions, neural networks develop their own strategies based on the examples they're fed--a process that is called "training." If you want to train a network to recognize a photo of a cat, for instance, you feed it tons upon tons of random photos, each one paired with positive or negative reinforcement: positive feedback for cats, negative feedback for noncats.
The network will use probabilistic techniques to make "guesses" about what it's seeing in each photo (cat or noncat), and these guesses, with the help of the feedback, will gradually become more accurate. The networks essentially evolve their own internal model of a cat and fine-tune their performance as they go.

Dogs too respond to reinforcement learning, so training Aibo was more or less like training a real dog. The instruction booklet told me to give him consistent verbal and tactile feedback. If he obeyed a voice command--to sit, stay, or roll over--I was supposed to scratch his head and say, "Good dog." If he disobeyed, I had to strike him across his backside and say, "No," or "Bad Aibo." But I found myself reluctant to discipline him. The first time I struck him, when he refused to go to his bed, he cowered a little and let out a whimper. I knew of course that this was a programmed response--but then again, aren't emotions in biological creatures just algorithms programmed by evolution?

Animism was built into the design. It is impossible to pet an object and address it verbally without coming to regard it in some sense as sentient. We are capable of attributing life to objects that are far less convincing. David Hume once remarked upon "the universal tendency among mankind to conceive of all beings like themselves," an adage we prove every time we kick a malfunctioning appliance or christen our car with a human name. "Our brains can't fundamentally distinguish between interacting with people and interacting with devices," writes Clifford Nass, a Stanford professor of communication who has written about the attachments people develop with technology. "We will 'protect' a computer's feelings, feel flattered by a brownnosing piece of software, and even do favors for technology that has been 'nice' to us." As artificial intelligence becomes increasingly social, these mistakes are becoming harder to avoid.
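The training process O'Gieblyn describes--probabilistic "guesses" about each example, gradually corrected by positive and negative feedback--can be sketched with a single artificial neuron. This is only an illustrative toy, nothing like Aibo's actual software; the feature names and examples are invented:

```python
import math
import random

def sigmoid(z):
    """Squash a raw score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, epochs=500, lr=0.5):
    """Train a single artificial neuron on (features, label) pairs,
    where label 1 means "cat" (positive feedback) and label 0 means
    "noncat" (negative feedback)."""
    random.seed(0)
    weights = [random.uniform(-0.5, 0.5) for _ in range(len(examples[0][0]))]
    bias = 0.0
    for _ in range(epochs):
        for features, label in examples:
            # The network makes a probabilistic "guess" about the input.
            guess = sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
            # Feedback: the gap between guess and label nudges the weights,
            # so the guesses gradually become more accurate.
            error = label - guess
            weights = [w + lr * error * x for w, x in zip(weights, features)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, features):
    """Classify a new input: True for "cat," False for "noncat"."""
    return sigmoid(sum(w * x for w, x in zip(weights, features)) + bias) > 0.5

# Invented toy features: [pointy ears, whiskers, barks]
examples = [
    ([1, 1, 0], 1),  # cat
    ([0, 1, 0], 1),  # cat with folded ears
    ([1, 0, 1], 0),  # dog
    ([1, 0, 0], 0),  # rabbit
]
weights, bias = train(examples)
```

A real network stacks many such units into layers, but the principle is the same: rather than following explicit rules, the model evolves its own internal notion of "cat" from examples and feedback.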
A few months earlier, I'd read an op-ed in Wired magazine in which a woman confessed to the sadistic pleasure she got from yelling at Alexa, the personified home assistant. She called the machine names when it played the wrong radio station, rolled her eyes when Alexa failed to respond to her commands. Sometimes, when the robot misunderstood a question, she and her husband would gang up and berate it together, a kind of perverse bonding ritual that united them against a common enemy. All of this was presented as good American fun. "I bought this goddamned robot," the author wrote, "to serve my whims, because it has no heart and it has no brain and it has no parents and it doesn't eat and it doesn't judge me or care either way." Then one day the woman realized that her toddler was watching her unleash this verbal fury. She worried that her behavior toward the robot was affecting her child. Then she considered what it was doing to her own psyche--to her soul, so to speak. What did it mean, she asked, that she had grown inured to casually dehumanizing this thing? This was her word: "dehumanizing." Earlier in the article she had called it a robot. Somewhere in the process of questioning her treatment of the device--in questioning her own humanity--she had decided, if only subconsciously, to grant it personhood.

Excerpted from God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning by Meghan O'Gieblyn. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.