Simulated Shake-Up Reveals Some Answers
The magnitude 8.0 simulation could guide emergency planning for Southern California.
Seismologists have long asked not if, but when ‘The Big One’ will strike Southern California. Just how big will it be, and how will the amount of shaking vary throughout the region?
Now researchers are closer to answering the second part of that question, and to helping the Golden State’s emergency response teams better prepare for such a disaster.
Researchers at San Diego State University and the San Diego Supercomputer Center (SDSC) at the University of California San Diego have created the largest-ever simulation of a magnitude 8.0 (M8) earthquake. M8 earthquakes are capable of tremendous damage; the 1994 Northridge earthquake was a magnitude 6.7 and caused billions of dollars in property damage.
The simulation focuses primarily on the southern section of the San Andreas Fault. About 25 million people reside in that area, which extends as far south as Yuma, Ariz., and Ensenada, Mexico, and runs up through Southern California as far north as Fresno.
New insight into San Andreas Fault
“The scientific results of this massive simulation are very interesting, and its level of detail has allowed us to observe things that we were not able to see in the past,” said Kim Olsen, professor of geological sciences at SDSU and the study's lead seismologist.
“For example, the simulation has allowed us to gain more accurate insight into the nature of the shaking expected from a large earthquake on the San Andreas Fault.”
SDSC provided the high-performance computing and scientific visualization expertise for the simulation, while the Southern California Earthquake Center at the University of Southern California was the lead coordinator in the project. The scientific details of the earthquake source were handled by researchers at SDSU, and the Ohio State University was also part of the collaborative effort.
Gordon Bell Prize finalist
The research was selected as a finalist for the Gordon Bell Prize, awarded each year at the Supercomputing Conference for outstanding achievement in high-performance computing applications. This year’s conference, SC10 (for Supercomputing 2010), will be held Nov. 13-19 in New Orleans, La.
“This M8 simulation represents a milestone calculation, a breakthrough in seismology both in terms of computational size and scalability,” said Yifeng Cui, a computational scientist at SDSC and lead author of “Scalable Earthquake Simulation on Petascale Supercomputers.”
“It’s also the largest and most detailed simulation of a major earthquake ever performed in terms of floating point operations, and opens up new territory for earthquake science and engineering with the goal of reducing the potential for loss of life and property.”
The simulation, funded through several National Science Foundation grants, represents the state of the art in seismic science on several levels, including computation at the petascale: supercomputers capable of more than one quadrillion floating-point operations, or calculations, per second.
Olsen, who cautioned that this massive simulation is just one of many scenarios that could actually occur, also noted that high-rise buildings are more susceptible to low-frequency, roller-coaster-like motion, while smaller structures usually suffer more damage from higher-frequency shaking, which feels like a series of sudden jolts.
As a follow-up to the record-setting simulation, Olsen said the research team will analyze potential damage to buildings, including Los Angeles high rises, due to the simulated ground motions.
Record-setting on several fronts
“We have come a long way in just six years, doubling the maximum seismic frequencies for our simulations every two to three years, from 0.5 hertz—or cycles per second—in the TeraShake simulations, to 1.0 hertz in the ShakeOut simulations, and now to 2.0 hertz in this latest project,” said Phil Maechling, associate director for information technology at the earthquake center.
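The doublings Maechling describes come at a steep computational price: in a 3-D finite-difference wave solver, resolving twice the frequency requires halving the grid spacing in all three dimensions (eight times the points) and halving the time step as well, so each doubling costs roughly 16 times more computation. A minimal sketch of that scaling, assuming standard fourth-power frequency scaling (the helper function is illustrative, not from the project’s code):

```python
# Illustrative scaling for 3-D finite-difference wave propagation:
# doubling the maximum resolved frequency halves the grid spacing in
# each of 3 dimensions (8x points) and halves the time step (2x steps),
# for a combined cost factor of roughly 2**4 = 16 per doubling.
def relative_cost(f_max_hz, f_ref_hz=0.5):
    """Approximate compute cost relative to a reference max frequency."""
    return (f_max_hz / f_ref_hz) ** 4

for name, f in [("TeraShake", 0.5), ("ShakeOut", 1.0), ("M8", 2.0)]:
    print(f"{name}: {f} Hz -> ~{relative_cost(f):.0f}x the 0.5 Hz cost")
```

By this rough estimate, the 2.0-hertz M8 run is on the order of 256 times more expensive than a comparable 0.5-hertz TeraShake-class run, which is why each doubling has required a new generation of supercomputer.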
Specifically, the latest simulation is the largest in terms of the duration of the temblor (six minutes) and the geographical region covered: a rectangular volume approximately 500 miles (810 km) long by 250 miles (405 km) wide by 50 miles (85 km) deep.
The team’s latest research also set a new record in the number of computer processor cores used, with more than 223,000 cores running within a single 24-hour period on the Jaguar Cray XT5 supercomputer at the Oak Ridge National Laboratory in Tennessee. By comparison, a previous TeraShake simulation in 2004 used only 240 cores over a four-day period.
Additionally, the new simulation used a record 436 billion mesh points, or grid points, to calculate the potential effects of such an earthquake, versus only 1.8 billion mesh points in the TeraShake simulations of 2004.
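The 436-billion figure can be sanity-checked against the simulation volume quoted above. Assuming a uniform 40-meter grid spacing (an assumption on our part, chosen because it reproduces the reported count; the article does not state the spacing), the arithmetic works out:

```python
# Back-of-the-envelope check of the reported 436-billion-point mesh:
# divide the 810 km x 405 km x 85 km volume on a uniform grid.
dx = 40.0              # grid spacing in meters (assumed, not from the article)
nx = int(810e3 / dx)   # points along strike:  20,250
ny = int(405e3 / dx)   # points across:        10,125
nz = int(85e3 / dx)    # points in depth:       2,125
total = nx * ny * nz
print(f"{total:,} grid points (~{total / 1e9:.0f} billion)")
```

The product is about 436 billion points, consistent with the record cited in the article.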
Earthquake simulations can be used to evaluate earthquake early-warning planning systems, and help engineers, emergency response teams and geophysicists better understand seismic hazards not just in California, but around the world.
“Petascale simulations such as this one are needed to understand the rupture and wave dynamics of the largest earthquakes at shaking frequencies required to engineer safe structures,” said Thomas Jordan, director of the earthquake center and principal investigator for the project. “Frankly, we were at the very limits of these new capabilities for research of this type.”
In addition to Cui, Olsen, Jordan and Maechling, other researchers on the Scalable Earthquake Simulation on Petascale Supercomputers project include:
- Amit Chourasia, Kwangyoon Lee and Jun Zhou from SDSC
- Daniel Roten and Steven M. Day from SDSU
- Geoffrey Ely and Patrick Small from USC
- D.K. Panda from OSU
- John Levesque from Cray Inc.