Supercomputers, Climate Models and 40 Years of the World Climate Research Programme
CHEYENNE, Wyoming — On the western rim of the Great Plains, a futuristic building looks out of place on the treeless range. The NCAR-Wyoming Supercomputing Center is indeed remote, two continents away from the weather forecasts it produces each day for Antarctica and millions of miles from the space weather it tracks.
And yet, it is surprisingly connected to the rapidly evolving story of Earth’s changing climate.
The supercomputer inside the concrete and steel building, dubbed Cheyenne, is one in a global network of high-capacity computers that run vast sets of calculations to simulate the workings of our planet.
Since the 1970s, when early computer models of the climate were emerging, they’ve been remarkably successful at projecting global temperature changes as greenhouse gas emissions rise. Over the years, the computers have become faster and more powerful, enabling the models to incorporate more of the influences on the climate and zoom in with finer-scale resolution to depict local climate changes.
Accurately gauging the risks posed by climate change at a local level is among the holy grails of climate modeling, and world-changing advances likely await the next generation of ultrafast supercomputers. At only three years old, Cheyenne is already set to be replaced by a successor with triple its speed or better.
Antonio Busalacchi is president of the University Corporation for Atmospheric Research, the consortium that oversees NCAR, which operates the Wyoming supercomputer in partnership with the National Science Foundation. He is leading the team working to procure the next Cheyenne.
Their goal for the new supercomputer is to understand our changing climate in ways that will help the world prepare for and adapt to the effects of rising global temperatures before it’s too late. That could mean knowing how much sea level will rise along each mile of coastline in 10, 20 or 50 years; how changes in precipitation patterns will affect water availability and drought in every community in the West; or the precise path and intensity of hurricanes before they strike. It’s what Busalacchi describes as “actionable science.”
“In order to deliver on that promise, we need to have better predictive tools,” he said. “We’re really at sort of a juncture in our future of being able to predict the Earth as a coupled system to help answer these pressing questions that society is asking of us.”
40 Years of the World Climate Research Programme
Busalacchi is the former science steering committee leader at the World Climate Research Programme, which helps set future climate research priorities and coordinates international research among thousands of scientists to target the most pressing climate science problems at hand.
This weekend, he will join 150 of the world’s leading climate experts at a symposium in San Francisco to celebrate the program’s 40th anniversary and to plan its research strategy for the next decade. It coincides with the centennial of the American Geophysical Union (AGU), which hosts the largest gathering of Earth, space and climate scientists. Like the supercomputer upgrade in Wyoming, the meeting is taking place at a critical juncture in the history of climate science, with a growing sense of looming crisis around the world.
The World Climate Research Programme has laid a path for innovation in climate modeling since it was founded in 1980 by the World Meteorological Organization and the International Council for Science. It formed with a long-range objective to seek “a better understanding of the climate system and the causes of climate variability and change.”
The program has since helped spawn some of the world’s most important climate research, from efforts to understand monsoons in Africa and project future sea ice cover in the Arctic to work on adapting food systems to cope with global change. It targets information gaps that no single country is likely to fill on its own by bringing together scientists and computing power from around the world.
Computer modeling is at the heart of much of that work.
Climate models are based on the physics, chemistry and biology of the natural world and on assumptions about factors that have affected and will affect the climate, such as levels of greenhouse gas emissions. It turns out they have been remarkably accurate in projecting global temperature rise dating back to the 1970s, when computing power was a fraction of what scientists have to work with today.
Earlier this week, a group of scientists published a peer-reviewed paper comparing the early climate models published between 1970 and the mid-2000s with what actually happened. They found that 14 of the 17 early models’ projections about temperature change as emissions rise were almost indistinguishable from the observed record.
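The method behind such comparisons is straightforward: fit a trend to each model’s projected temperatures and to the observed record over the same window, then compare. Here is a minimal sketch in Python, using made-up placeholder series rather than the study’s actual data:

import numpy as np

# Hypothetical annual global mean temperature anomalies (degrees C) over a
# shared 30-year window -- placeholders, not the published study's data.
years = np.arange(1975, 2005)
model_anomaly = 0.018 * (years - 1975) + 0.05     # one model's projection
observed_anomaly = 0.017 * (years - 1975) + 0.04  # the observed record

# Fit a linear trend (degrees C per year) to each series and compare.
model_trend = np.polyfit(years, model_anomaly, 1)[0]
obs_trend = np.polyfit(years, observed_anomaly, 1)[0]

print(f"model trend:    {model_trend:.4f} C/yr")
print(f"observed trend: {obs_trend:.4f} C/yr")
print(f"ratio: {model_trend / obs_trend:.2f}")  # near 1.0 means the projection held up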
Today’s computer models are far more complex, requiring supercomputers to account for everything from the forces melting Antarctica’s ice to the impact of vegetation on temperature and moisture. But uncertainties remain, such as how aerosols affect cloud formation, which in turn influences temperature, and how and when tipping points such as the loss of sea ice or the thawing of permafrost will trigger faster global changes.
The next-generation models—running on even more powerful supercomputers—are being designed to incorporate more detail to help answer increasingly difficult questions.
From Billions of Computations to Quintillions
Busalacchi started his career as an oceanography graduate student, and in the late 1970s his research took him to NCAR’s Boulder campus to use the first non-classified supercomputer.
The supercomputer he worked on as a graduate student performed about 160 million computations a second. Considered revolutionary at the time, it could model oceans, land, vegetation or the atmosphere—but not all at the same time.
In contrast, the world’s soon-to-be fastest computer, coming online at the Oak Ridge National Laboratory in 2021, will perform 1.5 quintillion computations per second. Not only will computers of the future be capable of processing information on everything from the upper atmosphere to ocean currents and all the details in between—such as sea spray, dust, ice sheets and biogeochemical cycles—but they will be able to do so while capturing the ways humans influence the climate and how climate change influences humans.
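The size of that jump is easy to work out. A quick sketch of the arithmetic:

# From ~160 million operations per second in the late 1970s to
# 1.5 quintillion (1.5 x 10^18) per second at exascale.
early_ops = 160e6        # ~160 megaflops
exascale_ops = 1.5e18    # 1.5 exaflops

print(f"speedup: ~{exascale_ops / early_ops:.1e}x")  # ~9.4e9, nearly ten billion times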
Busalacchi uses the first report of the Intergovernmental Panel on Climate Change as an example of how earlier climate science offered only fuzzy portrayals of complex climate systems. He recalls how models, based on the data computers were able to process at the time, generated maps with grid squares so big they represented much of Western Europe and lacked sufficient detail to show the British Isles or the Alps.
“Then, over the succeeding decades, the resolution got smaller and smaller and smaller,” he said, which enabled scientists to distinguish the region’s mountain ranges and river valleys.
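That shrinking resolution is expensive. Halving the horizontal grid spacing quadruples the number of grid columns, and numerical stability then demands a shorter time step, so compute cost grows roughly with the cube of resolution. A back-of-the-envelope sketch, with illustrative spacings rather than NCAR’s actual grids:

import math

EARTH_SURFACE_KM2 = 510e6  # Earth's surface area, roughly 510 million km^2

def grid_columns(spacing_km: float) -> int:
    # Approximate number of horizontal grid columns at a given spacing.
    return math.ceil(EARTH_SURFACE_KM2 / spacing_km ** 2)

for spacing in (500, 100, 25):  # km; illustrative, not NCAR's actual grids
    print(f"{spacing:>4} km grid: ~{grid_columns(spacing):,} columns")

# Halving the spacing also roughly halves the stable time step (the CFL
# condition), so total cost scales close to (1/spacing)^3, not (1/spacing)^2.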
Global Race for High-Performance Computing
While many programs focused on the environment have been targeted for funding cuts under the Trump administration, that hasn’t been true for high-performance computers needed for climate research.
Busalacchi said the National Science Foundation, the largest source of NCAR’s funding, has provided around $99 million annually over the past few years to cover its research, 1,300 employees and the NCAR-Wyoming Supercomputing Center. And there’s not much concern about funding for the $30-40 million next-generation Cheyenne.
One factor driving the momentum behind high-performance computing is fierce international competition. According to the latest TOP500 list of supercomputers, U.S. national laboratories host the world’s two most powerful supercomputers, with China holding the next two spots. The contest is so intense that NCAR isn’t really trying to keep up in Wyoming. Cheyenne started in 20th place when it came online three years ago; last month it was 44th.
John Holdren, a Harvard University professor and science advisor to former President Barack Obama, says there’s another important reason supercomputing gets funding support: It’s not only scientists who prize it but business and government leaders, too, who want to use it for studying genomics, energy and other complex scientific problems.
“The reason we need even better computing than we already have, is that—for the purposes of adaptation, for taking steps that reduce the damage from the changes in climate that we can no longer avoid—we need more localized information,” he said.
“We’ve only recently gotten models good enough to reliably say how what happens in South Florida is going to differ from what happens in North Florida, how what happens in Iowa is going to differ from what happens in Nebraska,” Holdren said. “So, just in the climate change domain, there’s a very strong argument for continuing to increase the computing power.”
NCAR will be looking for a replacement for Cheyenne this spring, once major vendors have new technologies in the pipeline.
“We want to make sure that when that procurement goes out, we can take advantage of the latest and greatest technology with respect to high-performance computing,” Busalacchi explained.
The fact that even the next Wyoming supercomputer won’t have a spot at the top of the TOP500 list doesn’t trouble Anke Kamrath, director of NCAR’s Computational and Information Systems Laboratory.
She calls the supercomputer contest a fight over “macho-flops” that doesn’t really capture all of the qualities that make a supercomputer valuable. Avoiding data bottlenecks is important too, as is data storage.
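One standard way to see why raw FLOPS mislead is the roofline model: a machine only approaches its peak when each byte fetched from memory is reused in enough arithmetic, and otherwise memory bandwidth caps throughput. A minimal sketch, with hypothetical hardware numbers:

# Roofline-style estimate: attainable throughput is the lesser of peak
# compute and what memory bandwidth can feed. Numbers are hypothetical.
peak_flops = 5.0e15      # flops/second, a ~5 petaflop system
mem_bandwidth = 1.0e15   # bytes/second, aggregate memory bandwidth

def attainable_flops(intensity: float) -> float:
    # `intensity` is arithmetic intensity: flops per byte moved from memory.
    return min(peak_flops, intensity * mem_bandwidth)

for intensity in (0.25, 1.0, 10.0):
    print(f"intensity {intensity:>5}: {attainable_flops(intensity):.2e} flops/s")

# Low-intensity codes, common in climate modeling, sit far below peak,
# which is why bandwidth and storage matter as much as "macho-flops".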
‘Trying to Minimize What We Don’t Know’
At the NCAR-Wyoming Supercomputing Center this fall, visitors peered through a viewing window to see the wiry guts of Cheyenne’s 1.5 petaflop predecessor, Yellowstone. Before being dismantled for sale as surplus government property, Yellowstone was a third as powerful as Cheyenne while taking up more than twice as much space.
During its short life, Cheyenne’s processors have whirred dutifully through nearly 20 million jobs. Some 850 projects are currently underway, moving data along wires hidden under the floor to storage stacks that scientists around the world are accessing via the web.
Busalacchi said the modeling, observations and high-performance computing are all essential tools that climate scientists need to address the urgent challenges ahead.
“We’re trying to minimize what we don’t know,” he said.
Correction: This story has been updated to correct the description of the 1970s-era computer to 160 million computations per second.