In 1906, a magnitude 7.8 earthquake hit San Francisco, killing some 700 people and causing roughly $400 million in property damage. Since a disaster of this scale is thought to strike the region at roughly 150-year intervals, today's officials have a strong incentive to start planning for a similar scenario, and the Titan supercomputer might be able to lend a hand.
One of the biggest challenges facing emergency planners and engineers is a lack of data on large-scale earthquakes. Smaller quakes have been thoroughly documented, but they won't necessarily reveal the information needed to put the right countermeasures in place ahead of a potentially devastating event.
To remedy this problem, a team led by the Southern California Earthquake Center's Thomas Jordan is using the Titan supercomputer at Oak Ridge National Laboratory to develop physics-based earthquake simulations. Their work will produce a seismic hazard map for the state of California, according to a report from Science Daily.
This modern approach to earthquake preparation owes a significant debt to the men and women who logged detailed earthquake records many years ago. That data is now being used to test the accuracy of the simulations the team is building.
The team builds the necessary 3D models and physics codes, then runs a simulation of a well-documented historical earthquake. If the calculated ground motions match those on record, the researchers can be confident that their simulations are accurate and that the information being gleaned is usable.
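To make that comparison step concrete, here is a minimal sketch in Python of how a simulated ground-motion trace might be scored against a recorded one. The arrays, sampling assumptions, and misfit measure are illustrative choices, not the SCEC team's actual validation code.

```python
import numpy as np

def normalized_rms_misfit(simulated: np.ndarray, recorded: np.ndarray) -> float:
    """RMS difference between two equally sampled traces, normalized
    by the RMS amplitude of the recorded trace."""
    residual = simulated - recorded
    return float(np.sqrt(np.mean(residual**2)) / np.sqrt(np.mean(recorded**2)))

# Toy data standing in for velocity seismograms sampled at the same rate;
# real validation would use station records from the historical event.
rng = np.random.default_rng(0)
t = np.linspace(0, 20, 2000)
recorded = np.sin(t) * np.exp(-t / 5)
simulated = recorded + 0.05 * rng.standard_normal(recorded.size)

print(f"normalized RMS misfit: {normalized_rms_misfit(simulated, recorded):.3f}")
# A small misfit suggests the simulation reproduces the recorded motion;
# in practice, richer goodness-of-fit measures are computed across many stations.
```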
The vast complexity of these simulations makes a supercomputer like Titan essential. The team has said the project would be impossible without Titan, given the number of variables involved and the sheer amount of node hours the tests require as a result.