When Liu He, a Chinese economist, politician, and “chip tsar,” was tapped to lead the charge in a chip-making arms race with the United States, his message hung in the air, leaving a palpable tension: “In our country, technology isn’t just for growth…it’s a matter of survival.”
The United States’ early technological prowess once enabled the nation to outperform foreign competitors and create a competitive advantage for domestic companies. But 30 years later, America’s lead in advanced computing continues to dwindle. What happened?
A new report by an MIT researcher and two colleagues sheds light on the decline of US leadership. The researchers examined the decline across key metrics: overall computing performance, supercomputers, applied algorithms, and semiconductor manufacturing. Through their analysis, they found not only that China has closed the computational gap with the US, but that nearly 80 percent of American market leaders in this field believe their Chinese competitors are improving their capabilities faster, which, the team says, indicates a “broad threat” to US competitiveness.
To dig deeper, the researchers conducted the Advanced Computing Users Survey, polling 120 top-tier organizations including universities, national laboratories, federal agencies, and industry. The team estimates that this group comprises between one-third and one-half of all major computing users in the United States.
“Advanced computing is critical to scientific advancement, economic growth, and the competitiveness of US companies,” says Neil Thompson, director of the FutureTech research project at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), who helped lead the study.
Thompson, who is also a senior researcher at MIT’s Digital Economy Initiative, co-authored the paper with Chad Evans, executive vice president and secretary and treasurer of the board of the Council on Competitiveness, and Daniel Armbrust, co-founder, initial CEO, and board member at Silicon Catalyst and past president of SEMATECH, the semiconductor consortium that developed industry roadmaps.
The gold mine for semiconductors, supercomputers and algorithms
Supercomputers – the room-sized “giant computers” of the hardware world – are an industry no longer dominated by the United States. As recently as 2015, about half of the top-performing computers were firmly based in the US, and China was slowly growing from a very low base. But in the last six years, China has caught up quickly, nearly drawing level with America.
This shrinking lead matters: 84 percent of US respondents said they are computationally constrained when running critical programs. “This result is revealing when you consider who our respondents are: the vanguard of American research firms and academic institutions with privileged access to advanced national supercomputing resources,” says Thompson.
In terms of advanced algorithms, the US has historically led the way, with two-thirds of all significant improvements coming from US-based inventors. But in recent decades, US dominance in algorithms has relied on attracting foreign talent to work in the US, which the researchers say is now in jeopardy. China has surpassed the US and many other countries in the production of graduate students in STEM subjects since 2007, with one report projecting that by 2025 China will host nearly twice as many STEM graduate students as the US. China’s rise in algorithms can also be seen in the Gordon Bell Prize, an award for outstanding work in harnessing the power of supercomputers for diverse applications. In the past, US winners dominated the award, but China has matched or surpassed the Americans’ performance over the past five years.
While the researchers note that the CHIPS and Science Act of 2022 is a critical step toward restoring the foundation for success in advanced computing, they also propose policy recommendations to the White House Office of Science and Technology Policy.
First, they propose democratizing access to US supercomputing by building more mid-tier systems that push the boundaries for many users, as well as building tools so that users scaling up their computations need less upfront investment in resources. They also recommend expanding the pool of innovators by training many more electrical engineers and computer scientists, with long-term residency incentives and scholarships to keep them in the US. Finally, in addition to this new framework, the scientists urge leveraging what already exists by giving the private sector access to experimentation with high-performance computing through supercomputing sites in academia and at national laboratories.
All that and a bag of chips
Computing improvements depend on continued advances in transistor density and performance, but developing robust new chips requires a harmonious mix of design and manufacturing.
Historically, China was not known for designing notable chips; in fact, the US designed most of them over the past five decades. That all changed in the last six years, however, as China developed the HiSilicon Kirin 9000 and pushed its way to the international frontier. This success was achieved primarily through partnerships with leading global chip designers that began in the 2000s. Today, 14 companies in China are among the world’s top 50 fabless designers; a decade ago, there was only one.
The picture in semiconductor manufacturing is more mixed: US-led export policies and internal execution issues have slowed China’s rise, but as of July 2022, Semiconductor Manufacturing International Corporation (SMIC) showed evidence of 7-nanometer logic that was not expected until much later. Given export restrictions on extreme ultraviolet lithography, however, progress below 7 nm would require expensive domestically developed technology. Currently, China is equal or better in only two of 12 segments of the semiconductor supply chain. Still, given government policies and investments, the team expects that figure to rise markedly, to seven segments within 10 years. So for now, the US retains the lead in hardware manufacturing, but by a smaller margin.
The authors recommend that the White House Office of Science and Technology Policy work with key national agencies, such as the US Department of Defense, the US Department of Energy, and the National Science Foundation, to define initiatives to build the hardware and software systems needed for computing paradigms and workloads critical to economic and security goals. “It’s critical that American companies can take advantage of faster computers,” says Thompson. “As Moore’s Law slows down, the best way to do that is to create a portfolio of specialized chips (or ‘accelerators’) tailored to our needs.”
The scientists also believe that four areas need to be addressed to spearhead the next generation of computing. First, issuing grand challenges through the CHIPS Act’s National Semiconductor Technology Center would motivate researchers and start-ups to invest in R&D and seek seed capital for new technologies in areas such as spintronics, neuromorphic computing, and optical and quantum computing. Second, helping allies pass similar legislation would increase overall investment in these technologies and make supply chains more aligned and secure. Third, establishing testbeds where researchers can try algorithms on new computer architectures and hardware would provide an essential platform for innovation and discovery. Finally, planning for post-exascale systems that achieve higher levels of performance through next-generation advances would ensure that current commercial technologies do not limit future computing systems.
“The advanced computing landscape is rapidly changing – technologically, economically, and politically, with new opportunities for innovation and increasing global rivalries,” says Daniel Reed, Presidential Professor of Computer Science and Electrical and Computer Engineering at the University of Utah. “The transformative insights from both deep learning and computational modeling depend on continued semiconductor advances and their instantiation in cutting-edge, large-scale computing systems – hyperscale clouds and high-performance computing systems. Although the US has historically been a world leader in both advanced semiconductors and high-performance computing, other nations have recognized that these capabilities are an essential part of 21st-century economic competitiveness and national security, and they are investing heavily.”
The research was funded in part by Thompson’s grant from Good Ventures, which supports his FutureTech Research Group. The paper is published by the Georgetown Public Policy Review.