In 1994, as Professor Peter Shor PhD ’85 recalls, internal seminars at AT&T Bell Labs were lively affairs. The audience of physicists was an active and inquisitive bunch, often showering speakers with questions during their presentations. Shor, who was then working at Bell Labs, remembers several occasions when a speaker, fielding a barrage of questions, couldn’t get past the third slide before their time was up.
That year, when it was Shor’s turn to present an algorithm he had recently devised, the physicists paid close attention through Shor’s entire talk – and then some.
“Mine went pretty well,” he told an MIT audience yesterday.
In that 1994 seminar talk, Shor presented a proof showing how a quantum computer could solve a certain problem faster than any known classical method. The problem, computing discrete logarithms, was widely believed to be intractable for classical computers; for that reason, discrete logarithms were then used as the basis for a handful of security systems.
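To make the problem concrete, here is a minimal Python sketch (toy-sized numbers chosen for illustration, not real cryptographic parameters): given a prime modulus p, a generator g, and a value h = g^x mod p, the discrete logarithm problem asks for the exponent x.

```python
def discrete_log_bruteforce(g, h, p):
    """Recover x with g**x % p == h by exhaustive search.

    Trivial for tiny p, as here; for cryptographic sizes no efficient
    classical algorithm is known, which is why the problem could serve
    as the basis of security systems.
    """
    value = 1
    for x in range(p - 1):
        if value == h:
            return x
        value = (value * g) % p
    return None

# Example: 3 generates the multiplicative group mod 17, and 3**5 % 17 == 5.
assert discrete_log_bruteforce(3, 5, 17) == 5
```

The brute-force search takes time proportional to p; Shor's quantum algorithm finds x in time polynomial in the number of digits of p.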
Shor’s work was the first to show that a quantum computer could solve a real, practical problem. His talk caused a commotion in the seminar, and the news spread, mutating as it traveled. Within four days of that first lecture, physicists across the country believed that Shor had solved a related, though considerably harder, problem: prime factorization, the challenge of finding the prime factors of a very large number. Although some security systems use discrete logarithms, most encryption schemes today are based on prime factorization and the assumption that it is computationally infeasible to crack.
“It was like the children’s game of ‘telephone,’ where the rumor spread that I figured out factoring,” says Shor. “And in the four days since [the talk], I had!”
By adapting his approach to the original problem, Shor soon found a similar quantum solution to prime factorization. His solution, now known as Shor’s algorithm, showed how a quantum computer could factor very large numbers efficiently. Quantum computing, once conceived as a thought experiment, suddenly had operating instructions for a very real and potentially disruptive application. His work simultaneously sparked several new lines of research in quantum computing, information science, and cryptography.
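The key structural insight can be sketched in a few lines of Python. Shor's algorithm reduces factoring to period finding; the quantum computer's only job is the period-finding step, which is replaced below by brute-force search so the surrounding classical logic can run on toy numbers (this is an illustrative sketch of the reduction, not the quantum algorithm itself):

```python
import math
import random

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1. This is the step a quantum
    computer does exponentially faster; here it is done classically."""
    value = a % n
    r = 1
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n):
    """Return a nontrivial factor of an odd composite n (not a prime power),
    using the classical scaffolding of Shor's algorithm."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                       # lucky guess already shares a factor
        r = find_period(a, n)
        if r % 2 == 0:
            f = math.gcd(pow(a, r // 2) - 1, n)
            if 1 < f < n:
                return f                   # a**(r/2) - 1 splits n
        # odd period or a trivial gcd: retry with a fresh random a

f = shor_factor(15)
assert f in (3, 5)
```

For n = 15 and a = 2, the powers of 2 mod 15 cycle through 2, 4, 8, 1, so the period is 4, and gcd(2² − 1, 15) = 3 yields a factor.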
The rest is history, the highlights of which Shor narrated to a standing-room audience in MIT’s Huntington Hall, Room 10-250. Shor, the Morss Professor of Applied Mathematics at MIT, spoke as this year’s recipient of the James R. Killian, Jr. Faculty Achievement Award, the highest honor that the Institute’s faculty can bestow on one of its members each academic year.
In introducing Shor’s lecture, Chair of the Faculty Lily Tsai quoted from the award citation:
“Without exception, all the faculty who nominated him commented on his vision, genius, and technical prowess, and commended him for the brilliance of his work,” Tsai said. “Professor Shor’s work shows that quantum computers have the potential to open up new avenues of human thought and aspiration.”
A quantum story
During the hour-long talk, Shor guided the audience through a brief history of quantum computing and peppered the talk with personal memories of his own role. The story, he said, begins in the 1930s with the discovery of quantum mechanics — the physical behavior of matter at the smallest, subatomic scales — and the question that soon followed: why were quanta so strange?
Physicists grappled with the new description of the physical world, so different from the “classical” Newtonian mechanics known for centuries. Shor says physicist Erwin Schrödinger tried to expose the new theory as “absurd” with his now-famous cat-in-a-box thought experiment: how could a cat embody both states, dead and alive, at once? The exercise challenged the idea of superposition, a key property of quantum mechanics which predicts that a quantum system, such as an atom, can hold more than one state at a time.
Even spookier was the prediction of entanglement, which holds that two atoms can be inextricably linked: any change to one should then affect the other, regardless of the distance separating them.
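Both properties can be illustrated numerically with a toy sketch in plain Python (quantum states represented as lists of amplitudes, an illustrative simplification): a Bell state is an equal superposition of the two-qubit states |00⟩ and |11⟩, so measuring one qubit always matches the other.

```python
import math
import random

# Amplitudes for the basis states |00>, |01>, |10>, |11>.
# The Bell state is an equal superposition of |00> and |11>.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

def measure(state, rng=random.Random(0)):
    """Sample a basis state with Born-rule probability |amplitude|**2."""
    r, acc = rng.random(), 0.0
    for i, amp in enumerate(state):
        acc += amp * amp
        if r < acc:
            return i
    return len(state) - 1

for _ in range(100):
    outcome = measure(bell)
    a, b = outcome >> 1, outcome & 1   # the two qubits' measured bits
    assert a == b                      # entanglement: outcomes always agree
```

Each individual outcome is random (|00⟩ or |11⟩ with equal probability), yet the two qubits' results are perfectly correlated, which is the oddity the text describes.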
“Until Wiesner, no one thought of using this oddity to store information,” Shor said.
Wiesner was Stephen Wiesner, a graduate student at Columbia University in the late 1960s who was later credited with formulating some of the fundamental principles of quantum information theory. His most important contribution was a paper that was initially spurned: he had proposed a way to create “quantum money,” or counterfeit-proof currency, by taking advantage of a strange property by which quantum states cannot be perfectly duplicated, a prediction known as the “no-cloning” theorem.
As Shor recalls, Wiesner typed up his idea, sent it to colleagues for consideration, and was flatly rejected. It was only when another physicist, Charles Bennett, found the paper, “pulled it out of a drawer and published it,” that Wiesner’s role in the history of quantum computing was cemented. Going further, Bennett realized that the basic idea behind quantum money could be applied to develop a quantum key distribution scheme, in which the security of a piece of information shared between parties, such as a private key, is guaranteed by another strange quantum property.
Bennett worked out the idea with Gilles Brassard in 1984. The resulting BB84 protocol was the first cryptosystem whose security relied entirely on the strange phenomena of quantum physics. Sometime in the 1980s, Bennett visited Bell Labs to present BB84. It was the first time Shor had heard of quantum computing, and he was hooked.
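The heart of BB84 can be sketched in a few lines of Python. In this toy simulation of the protocol's sifting step (a noiseless channel and no eavesdropper are assumed), Alice encodes random bits in randomly chosen bases, Bob measures in his own random bases, and only the rounds where their bases happen to match contribute to the shared key:

```python
import random

def bb84_sift(n_rounds, rng=random.Random(0)):
    """Toy BB84 sifting: return the key bits Alice and Bob each keep."""
    alice_bits  = [rng.randrange(2) for _ in range(n_rounds)]
    alice_bases = [rng.randrange(2) for _ in range(n_rounds)]  # 0 = +, 1 = x
    bob_bases   = [rng.randrange(2) for _ in range(n_rounds)]
    key_a, key_b = [], []
    for bit, ba, bb in zip(alice_bits, alice_bases, bob_bases):
        if ba == bb:              # matching bases: Bob reads the bit reliably
            key_a.append(bit)
            key_b.append(bit)
        # mismatched bases would give Bob a random result; discard the round

    return key_a, key_b

ka, kb = bb84_sift(32)
assert ka == kb                   # the sifted keys agree
```

The security argument, which this sketch omits, is where the quantum strangeness enters: by the no-cloning theorem, an eavesdropper cannot copy the qubits in transit, and measuring them in the wrong basis disturbs them in a way the parties can detect by comparing a sample of their key.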
Shor first tried to answer a question Bennett had posed to the audience: how could it be mathematically proven that the protocol is indeed secure? The problem proved too difficult, however, and Shor abandoned the question, if not the subject. Following the efforts of his colleagues in the growing field of quantum information science, he eventually landed on a paper by Daniel Simon that suggested something really strange: that a system of quantum computing bits could solve a certain problem exponentially faster than a classical computer.
The problem itself was, as Simon put it, an esoteric one, and his paper, like Wiesner’s, was initially rejected. But Shor saw something in its structure, specifically that the problem was related to the much more concrete problems of discrete logarithms and factorization. He worked from Simon’s starting point to see whether a quantum system could solve discrete logarithms faster than a classical one. His first attempts ended in a draw: the quantum algorithm solved the problem only as quickly as its classical counterpart. But there were indications that it could do better.
“There was still hope to try,” Shor recalls thinking.
Once he had worked it out, he presented his quantum discrete-log algorithm at that 1994 seminar at Bell Labs. In the four days after his lecture, he also managed to develop his eponymous algorithm for prime factorization.
The response was overwhelming, but also skeptical: physicists assumed that a practical quantum computer would break down at the slightest noise, a cascade of errors derailing its factoring calculation.
“I was concerned about this issue,” Shor said.
So he got back to work, looking for a way to correct errors in a quantum system without disturbing the state of the computing qubits. He found an answer in concatenation, a term that generally refers to linking things together in a series. In his case, Shor found a way to link qubits, storing the information of one logical, or computational, qubit across nine highly entangled physical qubits. In this way, any error in the logical qubit can be detected and fixed by measuring the physical qubits, without having to measure (and therefore destroy) the state carrying the actual calculation.
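A classical analogue conveys the idea. The bit-flip part of Shor's nine-qubit code works like a three-bit repetition code, sketched below in Python: pairwise parity checks (the "syndrome") reveal which bit flipped without ever reading the encoded value itself, just as the quantum code measures analogous parities without collapsing the logical qubit. (This is a simplified classical illustration, not Shor's actual quantum code.)

```python
def encode(bit):
    """Protect one logical bit by repeating it across three physical bits."""
    return [bit, bit, bit]

def syndrome(code):
    """Parities of pairs (0,1) and (1,2): locates any single bit flip
    without revealing the encoded value."""
    return (code[0] ^ code[1], code[1] ^ code[2])

def correct(code):
    """Use the syndrome to flip the erroneous bit back, if there is one."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(code))
    if flip is not None:
        code[flip] ^= 1
    return code

def decode(code):
    return correct(code)[0]

word = encode(1)
word[2] ^= 1                  # a single bit-flip error strikes
assert decode(word) == 1      # recovered via the syndrome alone
```

Shor's quantum version layers two such repetitions, three blocks of three qubits, to handle both bit-flip and phase-flip errors, which is why nine physical qubits protect one logical qubit.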
Shor’s scheme was the first quantum error-correcting code, proving that a quantum computer can be made fault-tolerant, and is therefore a very real possibility.
“The world of quantum mechanics is not the world of your intuition,” Shor said in conclusion. “Quantum mechanics is the world as it really is.”
The future of quantum computing
Following his presentation, Shor answered several questions from the audience, including one that is driving tremendous efforts in quantum information science today: when will we see a real, practical quantum computer?
To factor a large number, Shor estimates, a quantum system would require at least 1,000 qubits. Factoring the very large numbers underlying today’s internet and security systems would require millions of qubits.
“It’s going to take a whole bunch of years,” Shor said. “We may never build a quantum computer… but if someone has a great idea, maybe in 10 years we could see one.”
Meanwhile, he noted that as work in quantum computing has increased in recent years, so has work on post-quantum cryptography and efforts to develop alternative cryptosystems secure against quantum-based code-cracking. Shor likens these efforts to the scramble ahead of “Y2K” and the prospect of a digital catastrophe at the turn of the century.
“You probably should have started years ago,” Shor said. “If you wait until the last minute, when it’s clear that quantum computers will be built, you’ll probably be too late.”
Shor received his PhD from MIT in 1985 and then did postdoctoral work at the Mathematical Sciences Research Institute in Berkeley, California. He went on to spend several years at AT&T Bell Labs and then AT&T Shannon Labs before returning to MIT in 2003 as a tenured faculty member.
Shor’s contributions have been recognized with numerous awards, most recently the 2023 Breakthrough Prize in Fundamental Physics, which he shared with Bennett, Brassard, and physicist David Deutsch. His other awards include the MacArthur Fellowship, the Nevanlinna Prize (now IMU Abacus Medal), the Dirac Medal, the King Faisal International Prize in Science, and the BBVA Foundation Frontiers of Knowledge Award. Shor is a member of the National Academy of Sciences and the American Academy of Arts and Sciences. He is also a Fellow of the American Mathematical Society and the Association for Computing Machinery.