
Meet the man who invented the future

About a decade ago, as my home city of Austin’s big tech transformation really got underway, I started to wonder if I understood where the modern world had come from. I could articulate the core tenets of all the major religions, discuss the history of ideas from Plato to Derrida, and even describe, in detail, the many line-ups of prog rock giants. Yet, besides a basic overview of Newton, the Industrial Revolution and the periodic table acquired in school, I was rather vague on the history of science and technology. I was one half of C.P. Snow’s “two cultures” prophecy writ large, and in this I was not unusual: most people I knew with an arts education knew little about science, and vice versa.

What did the people coding away in the glass towers know that I did not? What did they actually do, these data scientists and developers — and why did it pay so much better than what I did? Initially, I agreed with the argument, sometimes aired in The Guardian and other organs of correct thinking, that if our tech overlords were going to have such an outsize impact on society, they had a duty to study literature, history and philosophy. But then I realised that I knew nothing of their area of expertise, and was hardly in a position to criticise.  

And so I began to read books about technology and the history of computing. Much to my surprise, they were very interesting. I learned about Charles Babbage and his Difference Engine; Alan Turing and his imaginary machines; and ENIAC, one of the first general-purpose electronic computers, which had to be literally “debugged” because the heat and light emitted by its 17,468 vacuum tubes attracted moths. I learned, too, about the 1947 invention of the transistor and “Moore’s Law”, which held that the number of transistors on a microchip, and with it its processing power, would double roughly every two years while costs remained relatively stable, a prediction that proved accurate for decades.

The biggest surprise, however, was discovering that many of the core concepts underpinning the information age were the work of one man: Claude Shannon. Every time you send a text, make a phone call, search the internet or stream music or video, you rely on ideas he was the first to develop. In terms of impact, he appeared to me to be on a par with Einstein (at least). Yet while Einstein’s name is synonymous with genius, Shannon’s remains largely unknown to the general public. His first biography wasn’t published until 2017, more than a decade and a half after his death and 80 years after he published the first of the two major works that would revolutionise how we do, well, almost everything.

The childhoods of people who change the world always seem mundane in contrast to their later achievements, and in this, Shannon was typical. Born in 1916, he grew up in Gaylord, Michigan, a small town surrounded by farmland. His father was a furniture maker and probate judge, his mother a teacher. As a child, he enjoyed tinkering with radios, and went on to study electrical engineering and mathematics at the University of Michigan. In 1936, he took a position as a research assistant at MIT, where he learned to operate a “Differential Analyser”, a mechanical contraption the size of a room that solved differential equations with shafts, gears, wheels and discs. The next year, he worked as an intern for the US telephone monopoly AT&T, the successor to the company Alexander Graham Bell had founded 60 years earlier.

Shannon now started to theorise about a different kind of thinking machine. The relay circuits in AT&T’s telephone system were made up of switches that could be flipped on, allowing electrical current to flow, or off, stopping it. Shannon made a conceptual leap, applying a system of logical reasoning developed by the 19th-century English mathematician George Boole to this circuitry. Boole’s binary system was built on true-or-false propositions, and Shannon proposed that if an open circuit was used to represent one and a closed circuit zero, then networks of switches could carry out logical tasks such as making decisions and applying rules. In 1937, he articulated this idea in his master’s thesis, “A Symbolic Analysis of Relay and Switching Circuits”. Though no one realised it at first, he had laid the theoretical groundwork for digital computing. Today, it is often described as the most influential master’s thesis of all time. Shannon was 21 years old.
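Shannon’s insight translates almost directly into modern code. The sketch below is my own illustration in Python, not his notation, and it uses the simpler convention in which “true” means a switch is closed and current flows: switches wired in series behave like Boolean AND, switches wired in parallel like Boolean OR, and from building blocks like these a network of relays can evaluate logical rules.

```python
# A minimal sketch (illustrative only) of the idea in Shannon's thesis:
# a relay circuit behaves like a Boolean expression.
# Here True means a switch is closed (current flows), False means it is open.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed: Boolean AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed: Boolean OR."""
    return a or b

def lamp_lit(master: bool, room_a: bool, room_b: bool) -> bool:
    """A toy decision rule wired from relays: the lamp lights if the master
    switch is closed AND at least one of the two room switches is closed."""
    return series(master, parallel(room_a, room_b))

if __name__ == "__main__":
    for master in (False, True):
        for room_a in (False, True):
            for room_b in (False, True):
                print(master, room_a, room_b, "->", lamp_lit(master, room_a, room_b))
```

The point is not this particular circuit, which is invented for the example, but that wiring and logic obey the same algebra, which is what later allowed engineers to design switching circuits, and eventually processors, on paper.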


In 1941, Shannon took a job at Bell Labs, the research wing of AT&T. During the war, he applied his mathematical skills to anti-aircraft targeting and was part of the team that developed the “X System”, which was used to encrypt communications between Churchill and Roosevelt. This work was significant in itself, but it was after the war that he laid the real foundations of the information age. In “A Mathematical Theory of Communication”, Shannon set out to solve the problem of “reproducing at one point either exactly or approximately a message selected at another point”. The difficulty was that analogue signals, such as the electrical waves used in telecommunications systems, were susceptible to a build-up of noise: static, errors, interference and so on. Boosting the signal also boosted the noise, which could render the original message unintelligible by the time it reached the other end of, say, a long-distance phone call.

Shannon’s solution was to shift the focus away from waves and currents and to rethink the problem: what, exactly, was being conveyed in a message? At first, in a letter, he referred to the “transmission of intelligence” before settling on the word “information”. He adopted the bit (short for “binary digit”, a coinage he credited to his Bell Labs colleague John Tukey) as its basic unit of measure. Shannon showed that all information, whether images, text or sound, could be broken up into bits, transmitted as zeros and ones, and then reconstructed faithfully at the other end, no matter the medium, whether phone lines or radio waves.
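The claim is easy to demonstrate with a few lines of modern Python. This is a sketch of my own, not anything from Shannon’s paper, but it shows a message being reduced to a stream of zeros and ones and then rebuilt, character for character, at the other end.

```python
# Illustrative only: reduce a text message to bits and reconstruct it exactly.

def text_to_bits(message: str) -> str:
    """Encode a message as a string of zeros and ones (UTF-8, 8 bits per byte)."""
    return "".join(f"{byte:08b}" for byte in message.encode("utf-8"))

def bits_to_text(bits: str) -> str:
    """Reassemble the original message from its bits."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

if __name__ == "__main__":
    original = "information is information"
    bits = text_to_bits(original)
    print(bits[:40] + "...")        # what actually travels down the wire
    print(bits_to_text(bits))       # reconstructed exactly at the other end
```

The same trick works for images and sound; only the encoding step differs, which is precisely why the medium no longer matters.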

So ubiquitous now is the idea of measuring information digitally that it is difficult to believe it is so recent. Shannon published his magnum opus in The Bell System Technical Journal in 1948, and his colleagues were quick to see its significance; from the early Sixties, AT&T began converting the US telephone network to digital transmission. Meanwhile, researchers at Bell Labs had invented the transistor in 1947, which would make the binary logic circuits Shannon had theorised about a decade earlier a practical reality. The age of the computer now began in earnest. And as computing and communications technology advanced, ever more information could be created, processed and manipulated, at ever greater speeds. Shannon had paved the way for everything from the moon landings and the precision-guided missiles of Operation Desert Storm to the rise of the internet and social media, and the vast fortunes of Bezos, Gates, Zuckerberg and Musk.
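One reason the digital network prevailed over its analogue predecessor can be sketched in a few lines of Python. This is a toy simulation of my own, not drawn from Shannon’s paper, and the number of stages and the noise level are invented: an analogue link passes on, and amplifies, whatever noise it has already accumulated, while a digital repeater simply decides “zero or one?” at each stage and forwards a clean bit.

```python
import random

# Toy simulation, illustrative only: noise accumulates over an analogue chain,
# while a digital repeater regenerates a clean bit at every stage.

random.seed(0)
STAGES = 25        # assumed number of repeater stages on a long-distance line
NOISE = 0.08       # assumed noise picked up at each stage

def analogue_repeater(level: float) -> float:
    """Forward the signal plus whatever noise this stage adds."""
    return level + random.gauss(0.0, NOISE)

def digital_repeater(level: float) -> float:
    """Threshold the noisy level back to a clean 0 or 1 before forwarding."""
    noisy = level + random.gauss(0.0, NOISE)
    return 1.0 if noisy > 0.5 else 0.0

analogue, digital = 1.0, 1.0
for _ in range(STAGES):
    analogue = analogue_repeater(analogue)
    digital = digital_repeater(digital)

print(f"analogue level after {STAGES} stages: {analogue:.3f} (drifted from 1.0)")
print(f"digital bit after {STAGES} stages:    {digital:.0f} (still exactly 1)")
```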

When Shannon died of Alzheimer’s in 2001, it seemed that his theory had already transformed the world; even DNA could be measured in bits. Yet, in many ways, the information age still resembled its analogue predecessor. Digital information was still stored on physical artefacts such as CDs, DVDs, floppy discs and hard drives, just as analogue information had been stored on paper, tape and vinyl. A shift was taking place, however, as information became increasingly disembodied. Print gave way to websites; music and film moved online and could be downloaded as torrents of bits. Then, as technology advanced, you didn’t even need to download any of it any more. Information was everywhere and nowhere, afloat in the ether, accessible through phones, computers, smart TVs and smart fridges, even sunglasses.

We are, of course, still grappling with the consequences of this shift. Few doubt that putting everybody in touch with everybody else instantaneously has driven us all a bit mad. Other changes are more subtle. In “A Mathematical Theory of Communication”, Shannon had declared that meaning was irrelevant to the “engineering problem”. This was not intended as a value judgement, but as all forms of media are reduced to zeroes and ones and made instantly available, it is hard not to think that a certain flattening has occurred. In the age of digital streaming, everything from The Lord of the Rings to Looney Tunes cartoons to Bach’s Goldberg Variations and the songs of Taylor Swift has been reduced equally to “content”: always on, yet also easy to switch off. But more than that, digital information is easily adapted and modified, so the “content” itself has become unstable, which only adds to its sense of weightlessness. Anyone can alter a digital file, and with the advent of AI, itself fundamentally shaped by Shannon’s theories, we can remix, reshape and transform simply by describing what we want to a model.

And so the revolution set in motion by Shannon continues, though we may already have passed through its most radical stage. That stage came during the pandemic when, locked in our homes, we attempted to become information ourselves, like characters in a William Gibson novel. As we gazed into our webcams, our voices and images were captured, broken down into bits, transmitted and then reconstructed as digital simulacra on screens elsewhere. As society attempted to render itself virtual, everything was digitised, from church services to school classes to wine tastings and even funerals. So excited was Mark Zuckerberg by the prospect that in 2021 he proclaimed we were going to live in the Metaverse, and that “immersive digital worlds” would “become the primary way that we live our lives and spend our time”. But this was also the moment we started to learn the limits of life-as-information. Zoom fatigue set in, and the Metaverse quickly became a punchline: Meta’s Reality Labs division has reportedly lost over $60 billion since 2020. It turned out that we did have bodies after all.

Ironically, the Father of the Information Age himself may have something to teach us here. For all that Shannon dealt in abstractions, he was deeply rooted in the physical world. He loved jazz and played the clarinet; he taught himself to juggle and to ride a unicycle, two skills that demand coordination and rhythm. He liked to build machines: at Bell Labs he created a robot mouse named Theseus that could find its own way through a maze. Later, he lived with his wife and three children in a Victorian mansion with a view of Mystic Lake in Winchester, Massachusetts, and built a chairlift to carry his children down to the lake. His workshop was full of projects that reflected a playful imagination, including a flame-throwing trumpet. A modest man uninterested in fame, he found joy and fulfilment in solving interesting problems and making things. In 1934, T.S. Eliot asked: “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?” Perhaps Shannon knew all along.

