Artificial intelligence and what it owes a man who never sits down

“I last sat down in 2005,” Geoffrey Hinton often says, “and it was a mistake.” In the 17 years since, Hinton has not sat down; his severe back problems prevent him from doing so. He travels only by train or car, so that he can sprawl across the seats, and he cannot fly commercial, since airlines insist that passengers be seated for take-off and landing. He eats “like a monk on the altar”, kneeling at a table on a foam cushion. With his trademark wry British humour, he describes his back as “a long-standing problem”. In those same 17 years, Hinton, working from the University of Toronto, has also transformed artificial intelligence (AI). He rescued neural networks from an AI winter, ‘invented’ deep learning, tutored a bevy of geniuses now at the bleeding edge of AI, and won the fabled Turing Award while he was at it.

I first came across the legend of Hinton in a fabulous book by Cade Metz called Genius Makers, which details the lives of those who shaped AI, foremost among them Hinton. After studying psychology at Cambridge and AI at the University of Edinburgh, Hinton went back to something that had fascinated him even as a child: how the human brain stores memories, and how it works. He was among the first researchers to work on ‘mimicking’ the human brain in computer hardware and software, thus constructing a newer and purer form of AI, which we now call ‘deep learning’. He started doing this in the 1980s, along with an intrepid bunch of students. A landmark 2012 paper he co-authored, Deep Neural Networks for Acoustic Modelling in Speech Recognition, demonstrated how deep neural networks outclassed older machine learning models, such as hidden Markov models and Gaussian mixture models, at identifying speech patterns. And his 1986 paper with David Rumelhart and Ronald Williams popularised ‘backpropagation’, the learning algorithm that remains at the core of how neural networks are trained today.
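For readers who like to see the idea in code, here is a minimal sketch of backpropagation, written in Python with numpy purely for illustration; it is not Hinton’s original formulation, just the core idea of pushing error signals backwards through a tiny network so that its weights improve.

```python
# Backpropagation on a toy problem (XOR), for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the four XOR inputs and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 units, with small random starting weights.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # Forward pass: activations flow through the layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the output error is propagated back through each
    # layer, and every weight is nudged to reduce that error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0, keepdims=True)

# Predictions for the four inputs; they should drift towards 0, 1, 1, 0.
print(out.round(2))
```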

“I get very excited when we discover a way of making neural networks better—and when that’s closely related to how the brain works,” says Hinton. By mimicking the brain, he sought to move beyond traditional machine learning techniques, in which humans had to label pictures, words and objects; his work instead copied the brain’s ability to learn on its own. He and his team built “artificial neurons from interconnected layers of software modelled after the columns of neurons in the brain’s cortex. These neural nets can gather information, react to it, build an understanding of what something looks or sounds like” (bit.ly/3LRJwWo). The AI community did not trust this new approach; Hinton told Sky News that it was “an idea that almost no one on Earth believed in at that point—it was pretty much a dead idea, even among AI researchers”.
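To make those “interconnected layers” concrete, here is a toy Python sketch, again only an illustration and not the systems Hinton actually built, of raw input flowing through stacked layers of artificial neurons, each layer building on the output of the one below it.

```python
# A stack of layers turning raw numbers into class scores, for illustration.
import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, n_out):
    """One layer of artificial neurons: a weighted sum of the inputs
    followed by a non-linearity (here ReLU)."""
    n_in = inputs.shape[1]
    weights = rng.normal(scale=0.1, size=(n_in, n_out))
    return np.maximum(0.0, inputs @ weights)

# A stand-in for a 28x28 image, flattened into 784 numbers.
image = rng.random((1, 784))

# Each layer re-represents the data at a higher level of abstraction;
# training with backpropagation would tune the random weights above.
h1 = layer(image, 256)
h2 = layer(h1, 64)
scores = layer(h2, 10)   # ten raw scores, one per possible class

print(scores.round(2))
```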

Well, that sentiment has changed. Deep learning has been harnessed by Google, Meta, Microsoft, DeepMind, Baidu and almost every other tech firm to build driverless cars, predict protein folding and beat humans at Go. Of Hinton’s students and collaborators, Yann LeCun now leads Meta’s AI efforts, Yoshua Bengio is doing seminal work at the University of Montreal, and Ilya Sutskever co-founded OpenAI, famous for GPT-3. Hinton himself works part-time for Google, the result of a frenzied bidding war between Google, Microsoft and Baidu in which he auctioned his company (and his services) to Google for $44 million—the stuff of legend in itself. Deep learning is now considered one of the most exciting developments in AI, and many regard it as the surest bet that AI will one day achieve artificial general intelligence, or AGI. As Hinton put it: “We ceased to be the lunatic fringe. We’re now the lunatic core.”

Hinton comes from a formidably intellectual and academic family. His mother used to tell him to “be an academic or be a failure”. His great-great-grandfather was George Boole, who invented the Boolean logic and algebra that underpin modern computers. George’s wife Mary was a well-known teacher of algebra and logic, and Mary’s uncle was George Everest, the Surveyor General of India after whom the world’s highest peak is named. Geoffrey’s great-grandfather Charles Howard Hinton, a renowned mathematician, popularised the concept of the ‘fourth dimension’ and coined the word ‘tesseract’; his cousin Joan, a nuclear physicist, was one of the few women to work on the Manhattan Project. His father, Howard Hinton, a formidable entomologist and a fellow of the Royal Society, often told him, “Work really hard and maybe when you’re twice as old as me, you’ll be half as good.” Geoffrey did work hard, becoming the godfather of deep learning, a Turing Award winner and a fellow of the Royal Society himself. And he is not sitting on his laurels.

Jaspreet Bindra is founder of Tech Whisperer Ltd, a digital transformation and technology advisory practice.
