How the US nuclear testing moratorium launched a supercomputing revolution

Thirty years ago, on September 23, 1992, the United States conducted its 1,054th nuclear weapons test.

When this test, code-named Divider, was detonated underground in the Nevada desert in the early hours of the morning, no one knew it would be the last US test for at least the next three decades. But by 1992, the Soviet Union had officially dissolved, and the United States government enacted what was then considered a short-term moratorium on testing, one that continues to this day.

This moratorium had an unexpected benefit: the end of nuclear weapons testing ushered in a revolution in high-performance computing that is having far-reaching implications for national and global security, implications few are aware of. The need to maintain our nuclear weapons without testing them created an unprecedented demand for scientific computing power.

At Los Alamos National Laboratory in New Mexico, where the first atomic bomb was built, our primary responsibility is to maintain and verify the safety and reliability of the nuclear stockpile. For this we use non-nuclear and subcritical experiments coupled with advanced computer modeling and simulation to assess the health of America's nuclear weapons and extend their lives.

But as we all know, the geopolitical landscape has changed in recent years, and while nuclear threats still loom, a host of other emerging crises threaten our national security.

Pandemics, rising sea levels and eroding coastlines, natural disasters, cyberattacks, the spread of disinformation, energy shortages: we have seen firsthand how these events can destabilize nations, regions, and the globe. At Los Alamos, we now use the high-performance computers we developed over decades to simulate nuclear weapon explosions with exceptionally high fidelity to counter these threats as well.


When the Covid pandemic first took hold in 2020, our supercomputers were used to forecast the spread of the disease, model vaccine rollouts, assess the impact of variants and their spread, identify counties at high risk of vaccination delays, and evaluate various vaccine distribution scenarios. They also helped assess the impact of public health orders, such as face mask requirements, in stopping or slowing the spread.

The same computing power is used to better understand DNA and the human body at a fundamental level. Researchers at Los Alamos created the largest-ever simulation of an entire DNA gene, a feat that required modeling a billion atoms and will help researchers better understand and design cures for diseases like cancer.

What are supercomputers used for at Los Alamos?

The lab also harnesses the power of secure, classified supercomputers to study the impact of climate change on national security. For years, our climate models have been used to predict the Earth’s responses to change with increasing resolution and accuracy. But the usefulness of our climate models to the national security community has been limited. That is changing with recent advances in modeling, increased resolution and computational power, and the coupling of climate models with infrastructure and impact models.

Now we are able to use our computing power to look at climate change in areas of interest at extraordinarily high resolution. Because the work takes place on secure computers, we don't reveal to potential adversaries exactly where (and why) we're looking. In addition, by using these supercomputers, we can integrate classified data into the models, which can further increase their accuracy.


Supercomputers at Los Alamos are also used for earthquake forecasting, assessing the impact of eroding coastlines, wildfire modeling, and a host of other national security-related challenges. We also use supercomputers and data analytics to optimize our non-proliferation threat detection efforts.

Of course, our laboratory is not alone in these efforts. The Department of Energy's other labs are using their supercomputing power to address similar and additional challenges. Likewise, private companies pushing the boundaries of computing are helping to advance the national security-focused computing effort, as is the work of our nation's top universities. As the saying goes, a rising tide lifts all boats.

And we owe this, at least in part, to the moratorium on nuclear weapons testing. Little did we know 30 years ago how much we would benefit from the supercomputing revolution that would follow. As a nation, continued investment in supercomputing ensures not only the safety and effectiveness of our nuclear stockpile, but also the advanced scientific exploration and discovery that benefits us all. Our national security depends on it.

Bob Webster is the Assistant Director of Weapons at Los Alamos National Laboratory. Nancy Jo Nicholas is Associate Laboratory Director for Global Security, also based in Los Alamos.

Have an opinion?

This article is an op-ed and the opinions expressed are those of the author. If you would like to respond or submit an editorial of your own, please email C4ISRNET Senior Managing Editor Cary O’Reilly.