This appears in my book A History of the Internet and the Digital Future – see kind words from Cory Doctorow, Marc Benioff, and others here.
On 12 February 1812, Lord Byron, perhaps the most outrageous and disreputable of the English poets, rose in the House of Lords to deliver his maiden speech. A bill had recently been introduced that would impose the death penalty on the Luddites, the textile artisans then rioting against the industrial revolution and wrecking mechanized looms. Byron spoke in defense of the artisans and decried industrialization. It might seem odd, then, that Byron’s daughter should be in the avant garde of the next wave of disruptive technologies – computing. Odder still, considering the stereotype of programmers: Byron himself was a promiscuous bisexual, the most flamboyant figure of the Romantic movement, constantly in debt, and ever surrounded by scandal. Yet his only legitimate daughter was a mathematical genius, and would be remembered as history’s first computer programmer.
Mathematics, it is fair to say, was not among the many passions that convulsed Byron’s short life. In Don Juan, Byron’s famous long poem, he used the character Donna Inez to portray the cold, mathematical bent of his wife, Anne Milbanke, which he had found so insufferable. In his eyes her ‘favourite science was… mathematical; her thoughts were theorems; she was a walking calculation’. When she legally separated from him, Milbanke privately cited a number of sins, including incest with his half-sister, Augusta Leigh, after whom Milbanke and Byron’s daughter, Ada, had been named, and who, confusingly, is speculated to have borne him a third child. As he lay dying of a fever in 1824, Lord Byron’s letter to his estranged wife rested, unsent and unfinished, upon his desk. He had written to thank her for sending him a letter describing Ada, whom he did not know. His daughter, Byron had read, was a budding mathematical genius.
Nine years after the death of her father, Ada first met Charles Babbage, who between 1837 and 1871 worked on his ‘analytical engine’. This was a hypothetical machine that could be used to perform various calculations, and which anticipated features that would appear again, a century later, in the first generation of computers. The analytical engine was also the first machine designed not to perform one specific task, but any task its user should choose. In 1842 an Italian mathematician, General Menabrea, published a paper describing the machine, which Ada translated into English as Sketch of the Analytical Engine Invented by Charles Babbage, Esq. Babbage suggested that she add some notes of her own. The notes she added were longer than Menabrea’s original text, and included a detailed description of how, if the analytical engine were ever built, an operator could use it to calculate the Bernoulli numbers, a sequence of numbers important in number theory. This description has been cited as the world’s first computer program, and Ada, daughter of the defender of the Luddites, as the world’s first programmer.
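For readers curious what that calculation involves, here is a minimal modern sketch in Python of the Bernoulli numbers that Note G set out to produce. It is not Lovelace’s own method – her note uses a recurrence tailored to the engine’s operations, and her numbering of the sequence differs from the modern convention used here – but it assumes the standard textbook recurrence in which the binomial-weighted sum of the first m Bernoulli numbers is zero, starting from B_0 = 1.

    from fractions import Fraction
    from math import comb

    def bernoulli_numbers(n):
        # Exact Bernoulli numbers B_0 .. B_n via the classical recurrence
        #   sum over j from 0 to m of C(m+1, j) * B_j = 0   (m >= 1),  B_0 = 1
        B = [Fraction(0)] * (n + 1)
        B[0] = Fraction(1)
        for m in range(1, n + 1):
            acc = sum(comb(m + 1, j) * B[j] for j in range(m))
            B[m] = -acc / (m + 1)   # exact rational arithmetic, no rounding
        return B

    for i, b in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {b}")

Run as written, this prints B_0 = 1, B_1 = -1/2, B_2 = 1/6, and so on – a few lines of code standing in for the table of engine operations that fills Lovelace’s famous diagram.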