You Are Not a Gadget: A Manifesto

Jaron Lanier

Book - 2010

Location
2nd Floor
Call Number
303.4833/Lanier
Status
Checked In (1 of 1 copies available)
Published
New York : Alfred A. Knopf, 2010.
Language
English
Main Author
Jaron Lanier
Edition
1st ed.
Item Description
"Portions of this work also originally appeared in Discover, Think Magazine, and on www.edge.org"--T.p. verso.
Physical Description
ix, 209 p. ; 22 cm
ISBN
9780307389978
9780307269645
  • Introduction to the Paperback Edition
  • Preface
  • Part 1. What Is a Person?
  • Chapter 1. Missing Persons
  • Chapter 2. An Apocalypse of Self-Abdication
  • Chapter 3. The Noosphere Is Just Another Name for Everyone's Inner Troll
  • Part 2. What Will Money Be?
  • Chapter 4. Digital Peasant Chic
  • Chapter 5. The City Is Built to Music
  • Chapter 6. The Lords of the Clouds Renounce Free Will in Order to Become Infinitely Lucky
  • Chapter 7. The Prospects for Humanistic Cloud Economics
  • Chapter 8. Three Possible Future Directions
  • Part 3. The Unbearable Thinness of Flatness
  • Chapter 9. Retropolis
  • Chapter 10. Digital Creativity Eludes Flat Places
  • Chapter 11. All Hail the Membrane
  • Part 4. Making the Best of Bits
  • Chapter 12. I Am a Contrarian Loop
  • Chapter 13. One Story of How Semantics Might Have Evolved
  • Part 5. Future Humors
  • Chapter 14. Home at Last (My Love Affair with Bachelardian Neoteny)
  • Afterword to the Paperback Edition
  • Acknowledgments
  • Index
Review by Choice Review

Many hail Web 2.0 as an empowering phenomenon that has brought about a democratization of information. But here, in his first book, Silicon Valley insider Lanier offers a radically different perspective. Widely regarded as the "father of virtual reality," Lanier argues that the structure of Web 2.0 violates the integrity of the individual by discouraging reasoned discourse in favor of intellectually flawed groupthink. Lanier skillfully constructs his argument by tracing the historical antecedents of social software to explain how design limitations constrain human behavior. Describing Web 2.0 as seductively dangerous in its ubiquity, he challenges the reader to consider how concepts like group consensus, mashups, blogs, and a glut of off-the-cuff communication affect the evaluation of information, human interaction, economics, and social class. He urges the reader to consider technology as a tool that should serve humanity, rather than one that unconsciously controls it. Lanier's message, though impassioned, is optimistic and persuasive. His thesis goes against the grain, and for this reason alone, this work offers a valuable alternative to the predominance of popular discourse favoring social software. Summing Up: Highly recommended. All levels of readership. S. M. Frey, Indiana State University

Copyright American Library Association, used with permission.
Review by Booklist Review

*Starred Review* Lanier is the digital pioneer who coined the term virtual reality, but for all his computer expertise and zeal, he now says, not so fast. A composer, musician, and artist as well as a computer scientist, Lanier is concerned that the digital hive is growing at the expense of individuality. As he advocates for human concerns over digital imperatives in a book as invigorating for its excellent prose as for its striking disclosures and cogent arguments, Lanier describes the phenomenon he calls lock-in, which leaves us stuck with flawed computer programs and skewed search engines. Moving into the social arena, Lanier dismantles such cyberfantasies as the Singularity, draws the connection between cloud computing and financial irresponsibility, ponders gadget fetishism and cybercrime, and, most electrifyingly, critiques online culture's rampant reductiveness and disdain for quality and originality. Lanier is particularly incisive in his assessment of the Web's role in eradicating paying jobs and undermining entire careers while simultaneously bombarding the now-imperiled middle class with advertising. Beware, Lanier says, of cybernetic totalism. Don't be bamboozled and devalued. The Web can be a better place. Lanier's bold and brilliant protest against cyberhype and exploitation is a tonic and necessary call for humanism.--Seaman, Donna Copyright 2009 Booklist

From Booklist, Copyright (c) American Library Association. Used with permission.
Review by Publishers Weekly Review

Computer scientist and Internet guru Lanier's fascinating and provocative full-length exploration of the Internet's problems and potential is destined to become a must-read for both critics and advocates of online-based technology and culture. Lanier is best known for creating and pioneering the use of the revolutionary computer technology that he named virtual reality. Yet in his first book, Lanier takes a step back and critiques the current digital technology, more deeply exploring the ideas from his famous 2000 Wired magazine article, "One-Half of a Manifesto," which argued against more wildly optimistic views of what computers and the Internet could accomplish. His main target here is Web 2.0, the current dominant digital design concept commonly referred to as "open culture." Lanier forcefully argues that Web 2.0 sites such as Wikipedia "undervalue humans" in favor of "anonymity and crowd identity." He brilliantly shows how large Web 2.0-based information aggregators such as Amazon.com--as well as proponents of free music file sharing--have created a "hive mind" mentality emphasizing quantity over quality. But he concludes with a passionate and hopeful argument for a "new digital humanism" in which radical technologies do not deny "the specialness of personhood." (Jan.)

(c) Copyright PWxyz, LLC. All rights reserved
Review by Library Journal Review

Popularly known for his ruminations on the social pathology of information technology, computer scientist Lanier is immensely concerned that the design patterns of today's omnipresent 2.0 web services are about to be locked in. He argues that technology prophets from many disciplines have left us blissfully ignorant of the sacrifices we make when submerging our individual identities into online collectives like Facebook. In addition, the web's early promise in terms of innovation, democracy, and interpersonal communication has not come to be; instead, an online culture has emerged that undermines the foundation of the knowledge economy. Flows of information, Lanier notes, are more important than what is being shared, whole expressions of creativity and arguments are replaced by fragments, and authors are successful by simply reusing the past instead of producing genuinely new works. Still, Lanier is optimistic that it's not too late to move away from cybernetic totalism by taking the "red pill" his book offers--for the web does not design itself; we design it. Verdict If you can't imagine a world without today's social technologies, this is a must-read for 2010. [100,000-copy first printing; see Prepub Alert, LJ 9/1/09.]--James A. Buczynski, Seneca Coll. of Applied Arts & Technology, Toronto

(c) Copyright Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.
Review by Kirkus Book Review

Quirky but sprawling indictment of our Internet-dominated society. Lanier, an iconoclastic speaker, columnist, computer scientist, musician and innovator of virtual-reality experiments in the 1980s, skewers the degeneration of the modern digital world. The author convincingly argues that changes in digital and software design affect human behavior, just as small changes in virtual-reality simulations modify the player's experience. One of the problems with the modern Internet culture, he writes, is that people get locked in, or confined, in their responses by the software they use, and hence lose their sense of individuality. They must conform to pre-defined categories in Facebook, and get automatically directed by Google search engines to Wikipedia entries that are bland and uninspiring. Lanier is particularly incensed by the "hive mind" mentality, which posits that group-think articles in Wikipedia are better than a creative, inspired article by someone who is a true expert on the subject. The flat structure of the Internet not only results in mediocre content, but allows for trolls, or anonymous users, who use their anonymity to behave badly and to trash others. It was also a blind belief in technology, Lanier asserts, that led to the financial debacle, as rogue traders relied on sophisticated computer algorithms without understanding what they were doing. The author is less convincing when he moves to a larger systemic argument about how an advertising-focused capitalist system directs money where the most clicks go, instead of toward individual talent. It's a difficult argument to prove, and his example of how pop music is less innovative now compared to music in the 20th century, while intriguing, seems a bit removed from the wider claims he makes about the creativity-stifling effects of big business. The last section, in which Lanier describes some inspiring potentials of modern computing, is a disjointed attempt to put a positive spin on a pessimistic view of modern technological culture. A well-intended and insightful but messy treatise.

Copyright (c) Kirkus Reviews, used with permission.

Excerpt

An Apocalypse of Self-Abdication

THE IDEAS THAT I hope will not be locked in rest on a philosophical foundation that I sometimes call cybernetic totalism. It applies metaphors from certain strains of computer science to people and the rest of reality. Pragmatic objections to this philosophy are presented.

What Do You Do When the Techies Are Crazier Than the Luddites?

The Singularity is an apocalyptic idea originally proposed by John von Neumann, one of the inventors of digital computation, and elucidated by figures such as Vernor Vinge and Ray Kurzweil.

There are many versions of the fantasy of the Singularity. Here's the one Marvin Minsky used to tell over the dinner table in the early 1980s: One day soon, maybe twenty or thirty years into the twenty-first century, computers and robots will be able to construct copies of themselves, and these copies will be a little better than the originals because of intelligent software. The second generation of robots will then make a third, but it will take less time, because of the improvements over the first generation. The process will repeat. Successive generations will be ever smarter and will appear ever faster. People might think they're in control, until one fine day the rate of robot improvement ramps up so quickly that superintelligent robots will suddenly rule the Earth.

In some versions of the story, the robots are imagined to be microscopic, forming a "gray goo" that eats the Earth; or else the internet itself comes alive and rallies all the net-connected machines into an army to control the affairs of the planet. Humans might then enjoy immortality within virtual reality, because the global brain would be so huge that it would be absolutely easy--a no-brainer, if you will--for it to host all our consciousnesses for eternity.

The coming Singularity is a popular belief in the society of technologists. Singularity books are as common in a computer science department as Rapture images are in an evangelical bookstore. (Just in case you are not familiar with the Rapture, it is a colorful belief in American evangelical culture about the Christian apocalypse. When I was growing up in rural New Mexico, Rapture paintings would often be found in places like gas stations or hardware stores. They would usually include cars crashing into each other because the virtuous drivers had suddenly disappeared, having been called to heaven just before the onset of hell on Earth. The immensely popular Left Behind novels also describe this scenario.)

There might be some truth to the ideas associated with the Singularity at the very largest scale of reality. It might be true that on some vast cosmic basis, higher and higher forms of consciousness inevitably arise, until the whole universe becomes a brain, or something along those lines. Even at much smaller scales of millions or even thousands of years, it is more exciting to imagine humanity evolving into a more wonderful state than we can presently articulate. The only alternatives would be extinction or stodgy stasis, which would be a little disappointing and sad, so let us hope for transcendence of the human condition, as we now understand it.

The difference between sanity and fanaticism is found in how well the believer can avoid confusing consequential differences in timing. If you believe the Rapture is imminent, fixing the problems of this life might not be your greatest priority.
You might even be eager to embrace wars and tolerate poverty and disease in others to bring about the conditions that could prod the Rapture into being. In the same way, if you believe the Singularity is coming soon, you might cease to design technology to serve humans, and prepare instead for the grand events it will bring.

But in either case, the rest of us would never know if you had been right. Technology working well to improve the human condition is detectable, and you can see that possibility portrayed in optimistic science fiction like Star Trek. The Singularity, however, would involve people dying in the flesh and being uploaded into a computer and remaining conscious, or people simply being annihilated in an imperceptible instant before a new superconsciousness takes over the Earth. The Rapture and the Singularity share one thing in common: they can never be verified by the living.

You Need Culture to Even Perceive Information Technology

Ever more extreme claims are routinely promoted in the new digital climate. Bits are presented as if they were alive, while humans are transient fragments. Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead? The digital hive is growing at the expense of individuality.

Kevin Kelly says that we don't need authors anymore, that all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book. Wired editor Chris Anderson proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand them better anyway.*

Antihuman rhetoric is fascinating in the same way that self-destruction is fascinating: it offends us, but we cannot look away.

The antihuman approach to computation is one of the most baseless ideas in human history. A computer isn't even there unless a person experiences it. There will be a warm mass of patterned silicon with electricity coursing through it, but the bits don't mean anything without a cultured person to interpret them.

This is not solipsism. You can believe that your mind makes up the world, but a bullet will still kill you. A virtual bullet, however, doesn't even exist unless there is a person to recognize it as a representation of a bullet. Guns are real in a way that computers are not.

Making People Obsolete So That Computers Seem More Advanced

Many of today's Silicon Valley intellectuals seem to have embraced what used to be speculations as certainties, without the spirit of unbounded curiosity that originally gave rise to them. Ideas that were once tucked away in the obscure world of artificial intelligence labs have gone mainstream in tech culture.

The first tenet of this new culture is that all of reality, including humans, is one big information system. That doesn't mean we are condemned to a meaningless existence. Instead there is a new kind of manifest destiny that provides us with a mission to accomplish. The meaning of life, in this view, is making the digital system we call reality function at ever-higher "levels of description."

People pretend to know what "levels of description" means, but I doubt anyone really does. A web page is thought to represent a higher level of description than a single letter, while a brain is a higher level than a web page. An increasingly common extension of this notion is that the net as a whole is or soon will be a higher level than a brain.
There's nothing special about the place of humans in this scheme. Computers will soon get so big and fast and the net so rich with information that people will be obsolete, either left behind like the characters in Rapture novels or subsumed into some cyber-superhuman something.

Silicon Valley culture has taken to enshrining this vague idea and spreading it in the way that only technologists can. Since implementation speaks louder than words, ideas can be spread in the designs of software. If you believe the distinction between the roles of people and computers is starting to dissolve, you might express that--as some friends of mine at Microsoft once did--by designing features for a word processor that are supposed to know what you want, such as when you want to start an outline within your document. You might have had the experience of having Microsoft Word suddenly determine, at the wrong moment, that you are creating an indented outline. While I am all for the automation of petty tasks, this is different.

From my point of view, this type of design feature is nonsense, since you end up having to work more than you would otherwise in order to manipulate the software's expectations of you. The real function of the feature isn't to make life easier for people. Instead, it promotes a new philosophy: that the computer is evolving into a life-form that can understand people better than people can understand themselves.

Another example is what I call the "race to be most meta." If a design like Facebook or Twitter depersonalizes people a little bit, then another service like Friendfeed--which may not even exist by the time this book is published--might soon come along to aggregate the previous layers of aggregation, making individual people even more abstract, and the illusion of high-level metaness more celebrated.

Information Doesn't Deserve to Be Free

"Information wants to be free." So goes the saying. Stewart Brand, the founder of the Whole Earth Catalog, seems to have said it first.

I say that information doesn't deserve to be free.

Cybernetic totalists love to think of the stuff as if it were alive and had its own ideas and ambitions. But what if information is inanimate? What if it's even less than inanimate, a mere artifact of human thought? What if only humans are real, and information is not?

Of course, there is a technical use of the term "information" that refers to something entirely real. This is the kind of information that's related to entropy. But that fundamental kind of information, which exists independently of the culture of an observer, is not the same as the kind we can put in computers, the kind that supposedly wants to be free.

Information is alienated experience.

You can think of culturally decodable information as a potential form of experience, very much as you can think of a brick resting on a ledge as storing potential energy. When the brick is prodded to fall, the energy is revealed. That is only possible because it was lifted into place at some point in the past.

In the same way, stored information might cause experience to be revealed if it is prodded in the right way. A file on a hard disk does indeed contain information of the kind that objectively exists. The fact that the bits are discernible instead of being scrambled into mush--the way heat scrambles things--is what makes them bits. But if the bits can potentially mean something to someone, they can only do so if they are experienced.
When that happens, a commonality of culture is enacted between the storer and the retriever of the bits. Experience is the only process that can de-alienate information.

Information of the kind that purportedly wants to be free is nothing but a shadow of our own minds, and wants nothing on its own. It will not suffer if it doesn't get what it wants.

But if you want to make the transition from the old religion, where you hope God will give you an afterlife, to the new religion, where you hope to become immortal by getting uploaded into a computer, then you have to believe information is real and alive. So for you, it will be important to redesign human institutions like art, the economy, and the law to reinforce the perception that information is alive. You demand that the rest of us live in your new conception of a state religion. You need us to deify information to reinforce your faith.

*Chris Anderson, "The End of Theory," Wired, June 23, 2008 (www.wired.com/science/discoveries/magazine/16-07/pb_theory).

Excerpted from You Are Not a Gadget: A Manifesto by Jaron Lanier. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.