Review by Choice Review
Isaacson (CEO, Aspen Institute) follows his Jobs biography, Steve Jobs (CH, Apr'12, 49-4500), with an exceptional history of the innovations that drove the digital revolution. Besides revealing the technologies involved, he integrates succinct profiles of important individuals and corporations, emphasizing the management styles that either encouraged innovation or foiled success. The collaboration between Ada Lovelace and Charles Babbage in the 1840s launched the digital revolution. Babbage's Analytical Engine and Lovelace's accompanying commentary and algorithms were inspirational for later generations. The author discusses the transformation of the 19th-century world of human calculators into today's digital world of the web, and explains that ubiquitous computers, smart appliances, and virtual social spaces required many significant innovations. Switching circuits, transistors, microchips, microprocessors, the mouse, and memory storage were prerequisites; the conceptual shift away from single-use computers, e.g., the ENIAC for hydrogen bomb calculations, to multipurpose programmable computers was critical. The journey of innovation continued with the birth of time-sharing and ARPANET, which evolved into the Internet; the successful launch of personal computers by Gates and Jobs; e-mail, Usenet groups, and bulletin boards creating community; and operating systems like Linux becoming open and free. Isaacson concludes his engaging history with recent innovations that are building the web. Summing Up: Highly recommended. All readership levels. --Mark Mounts, Dartmouth College
Copyright American Library Association, used with permission.
Review by New York Times Review
During the H-bomb testing frenzy of the 1950s, a RAND Corporation researcher named Paul Baran became concerned about the fragility of America's communications networks. The era's telephone systems required users to connect to a handful of major hubs, which the Soviets would doubtless target in the early hours of World War III. So Baran dreamed up a less vulnerable alternative: a decentralized network that resembled a vast fishnet, with an array of small nodes that were each linked to a few others. These nodes could not only receive signals but also route them along to their neighbors, thereby creating countless possible paths for data to keep flowing should part of the network be destroyed. This data would travel across the structure in tiny chunks, called "packets," that would self-assemble into coherent wholes upon reaching their destinations. When Baran pitched his concept to AT&T, he was confident the company would grasp the wisdom of building a network that could withstand a nuclear attack. But as Walter Isaacson recounts in "The Innovators," his sweeping and surprisingly tenderhearted history of the digital age, AT&T's executives reacted as if Baran had asked them to get into the unicorn-breeding business. They explained at length why his "packet-switching" network was a physical impossibility, at one point calling in 94 separate technicians to lecture Baran on the limits of the company's hardware. "When it was over, the AT&T executives asked Baran, 'Now do you see why packet switching wouldn't work?'" Isaacson writes. "To their great disappointment, Baran simply replied, 'No.'" AT&T thus blew its chance to loom large in technological lore, for packet switching went on to become a keystone of the Internet. But the company can take solace in the fact that it was hardly alone in letting knee-jerk negativity blind it to a tremendous digital opportunity: Time and again in "The Innovators," powerful entities shrug their shoulders when presented with zillion-dollar ideas. Fortunately for those of us who now feel adrift when our iPads and 4G phones are beyond arm's reach, the Paul Barans of the world are not easily discouraged. Stubbornness is just one of the personality traits ubiquitous among the brilliant subjects of "The Innovators." Isaacson identifies several other virtues that were essential to his geeky heroes' success, none of which will surprise those familiar with Silicon Valley's canon of self-help literature: The digital pioneers all loathed authority, embraced collaboration and prized art as much as science. Though its lessons may be prosaic, the book is still absorbing and valuable, and Isaacson's outsize narrative talents are on full display. Few authors are more adept at translating technical jargon into graceful prose, or at illustrating how hubris and greed can cause geniuses to lose their way. Having chosen such an ambitious project to follow his 2011 biography of the Apple co-founder Steve Jobs, Isaacson is wise to employ a linear structure that gives "The Innovators" a natural sense of momentum. The book begins in the 1830s with the prescient Ada Lovelace, Lord Byron's mathematically gifted daughter, who envisioned a machine that could perform varied tasks in response to different algorithmic instructions. (Isaacson takes pains throughout to salute the unheralded contributions of female programmers.) The story then skips ahead to the eve of World War II, when engineers scrambled to build machines capable of calculating the trajectories of missiles and shells.
One of these inventors was John Mauchly, a driven young professor at Ursinus College. In June 1941, he paid a visit to Ames, Iowa, where an electrical engineer named John Atanasoff had cobbled together an electronic calculator "that could process and store data at a cost of only $2 per digit" - a seemingly magical feat. Against the advice of his wife, who suspected that Mauchly was a snake, Atanasoff proudly showed off his ragtag creation. Soon thereafter, Mauchly incorporated some of Atanasoff's ideas into Eniac, the 27-ton machine widely hailed as the world's first true computer. The bitter patent fight that ensued would last until 1973, with Atanasoff emerging victorious. Mauchly is often demonized for stealing from that most romantic of tech archetypes, the "lone tinkerer in a basement" who sketched out brainstorms on cocktail napkins. But Isaacson contends that men like Atanasoff receive too much adulation, for an ingenious idea is worthless unless it can be executed on a massive scale. If Mauchly hadn't come to Iowa to "borrow" his work, Atanasoff would have been "a forgotten historical footnote" rather than a venerated father of modern computing. Isaacson is not nearly as sympathetic in discussing the sins of William Shockley, who shared a 1956 Nobel Prize in Physics for co-inventing the transistor. Shockley is the book's arch-villain, a glory hog whose paranoid tendencies destroyed the company that bore his name. (He once forced all his employees to take lie-detector tests to determine if someone had sabotaged the office.) His eight best researchers quit and went on to found Fairchild Semiconductor, arguably the most seminal company in digital history; Shockley, meanwhile, devolved into a raving proponent of odious theories about race and intelligence. The digital revolution germinated not only at button-down Silicon Valley firms like Fairchild, but also in the hippie enclaves up the road in San Francisco. The intellectually curious denizens of these communities "shared a resistance to power elites and a desire to control their own access to information." Their freewheeling culture would give rise to the personal computer, the laptop and the concept of the Internet as a tool for the Everyman rather than scientists. Though Isaacson is clearly fond of these unconventional souls, his description of their world suffers from a certain anthropological detachment. Perhaps because he's accustomed to writing biographies of men who operated inside the corridors of power - Benjamin Franklin, Henry Kissinger, Jobs - Isaacson seems a bit baffled by committed outsiders like Stewart Brand, an LSD-inspired futurist who predicted the democratization of computing. He also does himself no favors by frequently citing the work of John Markoff and Tom Wolfe, two writers who have produced far more intimate portraits of '60s counterculture. Yet this minor shortcoming is quickly forgiven when "The Innovators" segues into its rollicking last act, in which hardware becomes commoditized and software goes on the ascent. The star here is Bill Gates, whom Isaacson depicts as something close to a punk - a spoiled brat and compulsive gambler who "was rebellious just for the hell of it." Like Paul Baran before him, Gates encountered an appalling lack of vision in the corporate realm - in his case at IBM, which failed to realize that its flagship personal computer would be cloned into oblivion if the company permitted Microsoft to license the machine's MS-DOS operating system at will. 
Gates pounced on this mistake with a feral zeal that belies his current image as a sweater-clad humanitarian. "The Innovators" cannot really be faulted for the hastiness of its final pages, in which Isaacson provides brief and largely unilluminating glimpses of Twitter, Wikipedia and Google. There is no organic terminus for the book's narrative, since digital technology did not cease to evolve the moment Isaacson handed in his manuscript. As a result, any ending was doomed to feel dated. (There is, for example, but a single passing mention of the digital currency Bitcoin.) But even at its most rushed, the book evinces a genuine affection for its subjects that makes it tough to resist. Isaacson confesses early on that he was once "an electronics geek who loved Heathkits and ham radios," and that background seems to have given him keen insight into how youthful passion transforms into professional obsession. His book is thus most memorable not for its intricate accounts of astounding breakthroughs and the business dramas that followed, but rather for the quieter moments in which we realize that the most primal drive for innovators is a need to feel childlike joy. BRENDAN I. KOERNER is a contributing editor at Wired and the author, most recently, of "The Skies Belong to Us: Love and Terror in the Golden Age of Hijacking."
Copyright (c) The New York Times Company [September 28, 2014]
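The packet-switching idea at the heart of the Baran anecdote above is simple enough to sketch in a few lines of code. The following Python fragment is purely illustrative (the names, the packet size, and the message are invented for this sketch, and none of it comes from the book or from Baran's RAND papers): a message is broken into numbered packets, the packets arrive out of order as if they had taken different routes through the network, and the destination reassembles them into a coherent whole.

import random

PACKET_SIZE = 8  # bytes per packet; an arbitrary size chosen for the demo

def to_packets(message):
    # Split a message into (sequence_number, chunk) pairs so the
    # receiver can put the pieces back in order.
    return [(seq, message[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(message), PACKET_SIZE))]

def reassemble(packets):
    # Sort by sequence number and rejoin; arrival order is irrelevant.
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"a decentralized network resembling a vast fishnet"
packets = to_packets(message)
random.shuffle(packets)  # simulate packets taking different paths and arriving out of order
assert reassemble(packets) == message

A real network adds addressing, error detection, and retransmission on top of this, but the core trick Baran was defending is just numbered chunks plus reassembly, so that no single route or hub is indispensable.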
Review by Booklist Review
*Starred Review* In 1843, Ada Lovelace, the daughter of Lord Byron, wrote in a letter to Charles Babbage that mathematical calculating machines would one day become general-purpose devices that link the operations of matter and the abstract mental processes, correctly predicting the rise of modern computers. Thus begins a remarkable overview of the history of computers from the man who brought us biographies of Steve Jobs, Benjamin Franklin, Albert Einstein, and Henry Kissinger. The story is above all one of collaboration and incremental progress, which lies in contrast to our fascination with the lone inventor. Here we find that in a world dominated by men with their propensity for hardware, the first contributions to software were made by women. While we have those storied partnerships of the digital age (Noyce and Moore, Hewlett and Packard, Allen and Gates, and Jobs and Wozniak), all of their contributions were built upon the advances of lesser-known pioneers, who are heralded in these pages. Although full biographies of the individuals profiled here have been written in spades, Isaacson manages to bring together the entire universe of computing, from the first digitized loom to the web, presented in a very accessible manner that often reads like a thriller.--David Siegfried Copyright 2014 Booklist
From Booklist, Copyright (c) American Library Association. Used with permission.
Review by Publishers Weekly Review
Starred Review. The history of the computer as told through this fascinating book is not the story of great leaps forward but rather one of halting progress. Journalist and Aspen Institute CEO Isaacson (Steve Jobs) presents an episodic survey of advances in computing and the people who made them, from 19th-century digital prophet Ada Lovelace to Google founders Larry Page and Sergey Brin. His entertaining biographical sketches cover headline personalities (such as a manic Bill Gates in his salad days), unsung toilers like WWII's pioneering female programmers, and outright failures whose breakthroughs fizzled unnoticed, such as John Atanasoff, who was close to completing a full-scale model computer in 1942 when he was drafted into the Navy. Isaacson examines these figures in lucid, detailed narratives, recreating marathon sessions of lab research, garage tinkering, and all-night coding in which they struggled to translate concepts into working machinery. His account is an antidote to his 2011 Great Man hagiography of Steve Jobs; for every visionary--or three (vicious fights over who invented what are ubiquitous)--there is a dogged engineer; a meticulous project manager; an indulgent funder; an institutional hothouse like ARPA, Stanford, or Bell Labs; and hordes of technical experts. Isaacson's absorbing study shows that technological progress is a team sport, and that there's no I in computer. Photos. Agent: Amanda Urban, ICM. (Oct.)
(c) Copyright PWxyz, LLC. All rights reserved
Review by Library Journal Review
Starred Review. Taking a chronological, people-oriented approach, rather than a scientific or technical one, to the history of computers, the Internet, and digital technology, Isaacson (Steve Jobs) illuminates the ways teamwork, collaboration, and creativity have led to the current tech-driven world. As much biography as computer history, the work discusses such people, companies, and developments as 1840s computer programming pioneer Ada Lovelace, Vannevar Bush, Alan Turing, Doug Engelbart, Bill Gates, Steve Wozniak, Steve Jobs, IBM, ENIAC, Microsoft, and Apple. Dennis Boutsikaris's moderately paced, low-key reading makes the book an easy, thoroughly engrossing listen. This fascinating and unique work would be a nice companion to Erik Brynjolfsson and Andrew McAfee's The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. VERDICT This program will appeal to general listeners interested in the history of the computer age and entrepreneurs looking for nontraditional new industries, business models, and marketing concepts. ["Anyone who uses a computer in any of its contemporary shapes or who has an interest in modern history will enjoy this book," read the review of the S. & S. hc, LJ 9/15/14.]-Laurie Selwyn, formerly with Grayson Cty. Law Lib., Sherman, TX
(c) Copyright Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.
Review by Kirkus Book Review
A panoramic history of technological revolution. "Innovation occurs when ripe seeds fall on fertile ground," Aspen Institute CEO Isaacson (Steve Jobs, 2011, etc.) writes in this sweeping, thrilling tale of three radical innovations that gave rise to the digital age. First was the evolution of the computer, which Isaacson traces from its 19th-century beginnings in Ada Lovelace's "poetical" mathematics and Charles Babbage's dream of an "Analytical Engine" to the creation of silicon chips with circuits printed on them. The second was "the invention of a corporate culture and management style that was the antithesis of the hierarchical organization of East Coast companies." In the rarefied neighborhood dubbed Silicon Valley, new businesses aimed for a cooperative, nonauthoritarian model that nurtured cross-fertilization of ideas. The third innovation was the creation of demand for personal devices: the pocket radio; the calculator, a marketing brainchild of Texas Instruments; video games; and finally, the holy grail of inventions: the personal computer. Throughout his action-packed story, Isaacson reiterates one theme: Innovation results from both "creative inventors" and "an evolutionary process that occurs when ideas, concepts, technologies, and engineering methods ripen together." Who invented the microchip? Or the Internet? Mostly, Isaacson writes, these emerged from "a loosely knit cohort of academics and hackers who worked as peers and freely shared their creative ideas.... Innovation is not a loner's endeavor." Isaacson offers vivid portraits, many based on firsthand interviews, of mathematicians, scientists, technicians and hackers (a term that used to mean anyone who fooled around with computers), including the elegant, "intellectually intimidating" Hungarian-born John von Neumann; the impatient, egotistical William Shockley; Grace Hopper, who joined the Navy to pursue a career in mathematics; the "laconic yet oddly charming" J.C.R. Licklider, a father of the Internet; and Bill Gates, Steve Jobs, and scores of others. Isaacson weaves prodigious research and deftly crafted anecdotes into a vigorous, gripping narrative about the visionaries whose imaginations and zeal continue to transform our lives.
Copyright (c) Kirkus Reviews, used with permission.