History of innovation: Ada Lovelace and Charles Babbage

On July 8, 1835, an English nobleman, the 8th Baron King, was married in what was, by all accounts, a pleasant wedding, though it was a much smaller affair than one might have expected, given the groom's title and family wealth. The intimacy was due to the general public's fascination with his 19-year-old bride, the beautiful and brilliant Augusta Byron, now commonly known by her middle name of Ada, daughter of the notorious English Romantic poet, Lord Byron (1788-1824).

More than a decade after his death, Byron's reputation for creative brilliance and moral dissolution continued to reverberate through European culture. Ada's fame meant a certain measure of discretion was required at her wedding.

By the conventional standards of Victorian society, Ada's married life seemed a dream: a title, a loving husband and three children. But as she settled into motherhood and running a landed estate, she found herself fraying at the edges, drawn to paths that were unheard of for Victorian women. She had a passion for numbers.

When Ada was a teenager, her mother, Annabella Byron, had encouraged her study of mathematics, hiring tutors to instruct her in algebra and trigonometry, a radical course in an age when women were excluded from scientific institutions such as the Royal Society, and were assumed to be incapable of rigorous scientific thinking. But Annabella had an ulterior motive in encouraging her daughter's maths skills, hoping that the methodical, practical nature of her studies would override the dangerous influence of her dead father. A world of numbers, Annabella hoped, would save her daughter from the debauchery of art.

Ada's letters from the period display a mix of Romantic ambition - the sense of a soul larger than the ordinary reality it has found itself trapped in - combined with intense belief in the power of mathematical reason. She wrote about differential calculus with the same passion and exuberance (and self-confidence) that her father wrote about forbidden love: "Owing to some peculiarity in my nervous system, I have perceptions of some things, which no one else has . . . an intuitive perception of hidden things; - that is of things hidden away from eyes, ears, and the ordinary senses. This alone would advantage me little, in the discovery line, but there is, secondly, my immense reasoning faculties, and my concentrative faculty."

At 25, Ada Lovelace (her husband had been given the title Earl of Lovelace in 1838) found herself at a crossroads, confronting two very different ways of being. She could live within the boundaries of conventional decorum. Or she could embrace those "peculiarities of [her] nervous system" and seek out some original path for herself and her distinctive gifts. In choosing between domestic stability and some unknown break from convention, she was, in a sense, choosing between her mother and her father. To stay settled was the easier path; the forces of society propelled her toward it. And yet she was still Byron's daughter.

Lovelace found a way around the impasse. She charted a path that allowed her to push the barriers of Victorian society without succumbing to the creative chaos that had enveloped her father. She became a software programmer.

. . .

Writing code in the 19th century may seem like a vocation that would be possible only with time travel but, as chance would have it, Lovelace already knew the one man who was capable of giving her such a project: Charles Babbage, a brilliant and eclectic inventor.

Babbage had spent the previous two decades concocting state-of-the-art calculators and, since the mid-1830s, he had been working on a project that would last the rest of his life: designing a truly programmable computer, capable of executing complex calculations that went far beyond the capabilities of any contemporary machines. Babbage's Analytical Engine was doomed to practical failure - he was trying to build a digital-age computer with industrial-age mechanical parts - but conceptually it was a brilliant leap forward. The design anticipated the major components of modern computers: the notion of a central processing unit (which Babbage dubbed "the mill"), of random-access memory and of software that would control the machine, etched on the very same punch cards that would program computers more than a century later.
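
To make the analogy concrete, here is a toy sketch in Python - purely illustrative, with an invented instruction format and invented names, resembling nothing in Babbage's actual gear-driven, decimal design. A dictionary stands in for the store of variables (the memory), and a loop over a deck of "cards" stands in for the mill (the processor):

    # A toy stored-program machine: "store" is the memory, the loop plays the
    # role of the "mill", and "cards" is the program. Every name and format
    # here is invented for illustration, not taken from Babbage's design.
    def run(cards, store):
        for op, a, b, dest in cards:  # each card: an operation + 3 addresses
            x, y = store[a], store[b]
            if op == "add":
                store[dest] = x + y
            elif op == "sub":
                store[dest] = x - y
            elif op == "mul":
                store[dest] = x * y
            elif op == "div":
                store[dest] = x / y
        return store

    # Compute (3 + 4) * 5, leaving the result in variable 2.
    print(run([("add", 0, 1, 2), ("mul", 2, 3, 2)],
              {0: 3, 1: 4, 2: 0, 3: 5}))

The point of the sketch is the separation Babbage arrived at: the deck of cards is a thing in its own right, distinct both from the numbers it operates on and from the machinery that executes it.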

Lovelace had first met Babbage when she was just 17, in one of his celebrated London salons, and the two had kept up a lively correspondence over the years. When she hit her life's crossroads in the early 1840s, she wrote to Babbage with the offer that, "if ever I could be worth or capable of being used by you, my head will be yours".

Babbage did have a use for Lovelace's remarkable head, and their collaboration would lead to one of the founding documents in the history of computing. An Italian engineer had written an essay about Babbage's machine and Lovelace translated the text into English. With Babbage's encouragement, she added her own aphoristic commentary, stitched together from extended footnotes attached to the Italian paper.

Those footnotes would ultimately prove to be far more influential than the original text they annotated. They contained a series of elemental instruction sets that could be used to direct the calculations of the Analytical Engine. These are considered to be the first examples of working software ever published.
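
The most ambitious of them, the famous Note G, traced the computation of the Bernoulli numbers. A minimal modern sketch of that same calculation - in Python, using the standard recurrence rather than Lovelace's tabular layout of engine operations, with names and conventions that are ours, not hers - looks like this:

    # Bernoulli numbers via the standard recurrence
    #     sum_{j=0}^{m} C(m+1, j) * B_j = 0   (for m >= 1),
    # solved for B_m. Exact rational arithmetic throughout; nothing here
    # mirrors the actual sequence of operations in Lovelace's Note G.
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return B_0 .. B_n as exact fractions."""
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, j) * B[j] for j in range(m))
            B.append(-s / (m + 1))
        return B

    print(bernoulli(6))  # B_1 = -1/2, B_2 = 1/6; odd terms beyond B_1 vanish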

There is some dispute over whether Lovelace was the sole author of these programs, or whether she was refining routines that Babbage himself had worked out. But her greatest contribution lay not in writing instruction sets but, rather, in envisioning a range of utility for the machine that Babbage himself had not considered. "Many persons," she wrote, "imagine that because the business of the engine is to give its results in numerical notation, the nature of its processes must consequently be arithmetical and numerical, rather than algebraical and analytical. This is an error. The engine can arrange and combine its numerical quantities exactly as if they were letters or any other general symbols."

Lovelace recognised that Babbage's machine was not a mere number-cruncher. Its potential uses went far beyond rote calculation. It might even someday be capable of the higher arts: "Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and musical composition were susceptible of such expressions and adaptations, the Engine might compose elaborate and scientific pieces of music of any degree of complexity or extent."

. . .

There is a strong case to be made that this is the most visionary footnote in the history of print. To have made this imaginative leap in the mid-19th century is almost beyond comprehension. It was hard enough to wrap one's mind around the idea of programmable computers - most of Babbage's contemporaries failed to grasp what he had invented - but somehow Lovelace took the concept one step further, to the idea that this machine might also conjure up language and art. Her footnote opened up a conceptual space that would eventually be occupied by early 21st-century culture: Google queries, electronic music, iTunes, hypertext, Pixar. The computer would not just be an unusually flexible calculator; it would be an expressive, representational, even aesthetic machine.

Babbage's idea and Lovelace's footnote proved to be so far ahead of their time that, for a long while, they were lost to history. Most of Babbage's core insights had to be rediscovered 100 years later, when the first working computers were built in the 1940s, running on electricity and vacuum tubes instead of steam power. The notion of computers as aesthetic tools didn't become widespread - even in high-tech hubs such as Silicon Valley - until the 1970s.

. . .

Most important innovations - in modern times at least - arrive in clusters of simultaneous discovery. The conceptual and technological pieces come together to make a certain idea imaginable - artificial refrigeration, say, or the lightbulb - and around the world people work on the problem, and usually approach it with the same fundamental assumptions about how it can be solved.

Thomas Edison and his peers may have disagreed about the importance of the vacuum or the carbon filament in inventing the electric lightbulb, but none of them was working on an LED. As the writer Kevin Kelly, co-founder of Wired magazine, has observed, the predominance of simultaneous, multiple invention in the historical record has interesting implications for the philosophy of history and science: to what extent is the sequence of invention set in stone by the basic laws of physics or information or the biological and chemical constraints of the environment?

If simultaneous invention is the rule, what about the exceptions? What about Babbage and Lovelace, who were a century ahead of just about every other human on the planet? Most innovation happens in the present tense of possibility, working with tools and concepts that are available in that time. But every now and then an individual or group makes a leap that seems almost like time travelling. What allows them to see past the boundaries of the adjacent possible, when their contemporaries fail to do so? That may be the greatest mystery of all.

The conventional explanation is the all-purpose but somewhat circular category of "genius". Da Vinci could imagine (and draw) helicopters in the 15th century because he was a genius; Babbage and Lovelace could imagine programmable computers in the 19th century because they were geniuses. No doubt all three were blessed with great intellectual gifts, but history is replete with high-IQ individuals who don't come up with inventions far ahead of their time.

If there is a common thread to the time travellers, beyond the non-explanation of genius, it is this: they worked at the margins of their official fields, or at the intersection point between very different disciplines. Consider the French inventor Édouard-Léon Scott de Martinville, who patented a sound recording device in 1857, almost a generation before Edison began working on his phonograph. Scott was able to imagine the idea of "writing" sound waves because he had borrowed metaphors from stenography and printing and anatomical studies of the human ear.

Lovelace could see the aesthetic possibilities of Babbage's Analytical Engine because her life had been lived at a collision point between advanced maths and Romantic poetry. The "peculiarities" of her "nervous system" - that Romantic instinct to see beyond the surface appearances of things - allowed her to imagine a machine capable of manipulating symbols or composing music, in a way that even Babbage had failed to do.

Time travellers remind us that working within an established field is both empowering and restricting. Stay within the boundaries of a discipline and you will have an easier time making incremental improvements, opening the doors of the adjacent possible that are available to you given the specifics of the historical moment. But disciplinary boundaries can also serve as blinders, keeping you from the bigger idea, visible only when you cross those borders.

Sometimes those borders are literal ones: the "ice baron" Frederic Tudor (1783-1864) hit upon the commercial potential of cold 100 years before the first air conditioners, when he travelled to the Caribbean as a young man and dreamt of ice in the tropics. Sometimes the boundaries are conceptual: Scott borrowing the metaphors of stenography to invent his recording device.

Time travellers tend to have a lot of hobbies: think of Charles Darwin and his orchids. When Darwin published a book on pollination, three years after On the Origin of Species (1859), he gave it the wonderfully Victorian title of On the Various Contrivances by Which British and Foreign Orchids are Fertilised by Insects, and on the Good Effects of Intercrossing. We now understand the good effects of intercrossing thanks to the science of genetics but the principle applies to intellectual history as well. Time travellers are unusually adept at "intercrossing" fields of expertise. That's the beauty of the hobbyist: it's easier to mix intellectual fields when you have them littering your study or garage.

One of the reasons garages have become an emblem of the innovator's workspace is precisely because they are not office cubicles or university labs; they're places where our peripheral interests have room to evolve. The garage is for the hacker, the tinkerer. It is where intellectual networks converge.

In his famous 2005 Stanford commencement speech, Steve Jobs - the great garage innovator of our time - talked about the creative power of stumbling into new experiences: dropping out of college and sitting in on a calligraphy class that would ultimately shape the graphic interface of the Macintosh; being forced out of Apple aged 30, which enabled him to launch Pixar into animated movies and create the NeXT computer. "The heaviness of being successful," Jobs explained, "was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life."

Yet there is a strange irony at the end of Jobs's speech. After documenting the ways that unlikely collisions and explorations can liberate the mind, he ended with a more sentimental appeal to be "true to yourself": "Don't be trapped by dogma - which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice."

If there's anything we know from the history of innovation, it is that being true to yourself is not enough. Better to challenge your intuitions and make new connections than to remain comfortably settled in the same routine. If you want to improve the world slightly, you need focus and determination; you need to stay within the confines of a field and open the new doors in the adjacent possible one at a time. But if you want to be like Ada, if you want to have an "intuitive perception of hidden things" - well, in that case, you need to get a little lost.

This is an edited extract from Steven Johnson's 'How We Got to Now: Six Innovations that Made the Modern World' (Particular Books/Riverhead)

