The AI shaking up earthquake research

An AI is now capable of accurately simulating earthquakes to help us understand the true impact of quakes around the world.

Its creation means that, for the first time, scientists can analyse all of the numerical data produced by an earthquake at once, saving researchers huge amounts of time. Previously, earthquake analysis had to be conducted in parts, splitting observations into the effects on the ground and then on the urban area above it.

The AI was developed by a team of scientists from the University of Tokyo to help improve earthquake analysis. Scientists from the university’s Civil Engineering and Information Technology Center coded and taught an AI to recognise different earthquake data patterns to make the construction of simulations much more efficient.

This is because, sadly, earthquake analysis isn’t as straightforward as typing in a number, pressing ‘go’ and watching a simulation predict what’s going to happen. Earthquakes vary in power, and ground density changes their impact on an area, too. Factor in the intricate, densely packed structures of a city and, once an earthquake hits, researchers are left with a cocktail of numbers to untangle.

This is where the Japanese team, led by Professor Tsuyoshi Ichimura, has stepped in. They’ve coded an AI to help break the information down and identify trends in the data.

Essentially, the AI is tasked with sorting the data into different categories, spotting mathematical structures rather than being asked to interpret them. In his paper, Ichimura states that “the AI is only used to efficiently detect parts of the problem.” The main interpretation is then carried out using robust preconditioners, mathematical techniques that make the underlying equations quicker for the solver to work through. Ichimura specifies that this method doesn’t improve the quality of the results; it simply shortens the time taken to reach them.
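To give a flavour of the idea, here is a minimal sketch of that division of labour: a toy classifier tags each region of the problem, and the tag decides whether a cheap or a more robust preconditioning step is applied there. This is an illustration only, not the team’s actual code; the function names, labels and the two preconditioner choices are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch: a stand-in for the trained network tags each region of
# the simulation domain, and the tag decides whether a cheap or a heavier
# preconditioner is used there. Names and choices are illustrative assumptions,
# not the published solver's API.

def classify_region(features: np.ndarray) -> str:
    """Toy stand-in for the trained network: flag regions whose features
    suggest the solver will struggle, so they get the robust treatment."""
    return "robust" if features.mean() > 0.5 else "cheap"

def apply_preconditioner(label: str, A: np.ndarray, r: np.ndarray) -> np.ndarray:
    if label == "cheap":
        return r / np.diag(A)        # Jacobi (diagonal) scaling: fast, simple
    return np.linalg.solve(A, r)     # placeholder for a costlier, more robust method

# One tiny example region: a well-conditioned 3x3 system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
r = np.array([1.0, 2.0, 3.0])

label = classify_region(np.random.rand(8))
print(label, apply_preconditioner(label, A, r))
```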

Because of this dual system, the preconditioners no longer need to be as complex as before. Equally, since the AI only flags which parts of the dataset need the heavier treatment, the system is put under less computational strain and is therefore less prone to errors.

The team found success by exposing the AI to smaller datasets from less powerful earthquakes and teaching it to recognise particular patterns within them. Once it had learned those patterns, it was used to review larger portions of bigger earthquakes. When the scientists then ran their program on the IBM Summit supercomputer, the fastest supercomputer on the planet, they were able to interpret earthquake data four times faster than any previous model.
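A loose sketch of that train-small, apply-large workflow might look like the following. The features, labels and dataset sizes are invented for illustration, and scikit-learn’s logistic regression stands in for whatever network the team actually trained.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented training set standing in for "smaller, less powerful earthquakes":
# each row describes one chunk of the simulation, each label says whether that
# chunk needed the heavier numerical treatment.
X_small = rng.normal(size=(200, 4))
y_small = (X_small.sum(axis=1) > 0).astype(int)

model = LogisticRegression()
model.fit(X_small, y_small)

# Reuse the learned patterns on a synthetic "larger" event: many more chunks,
# the same kind of features, no retraining needed.
X_large = rng.normal(size=(5000, 4))
flags = model.predict(X_large)
print(f"{flags.mean():.0%} of regions flagged for the robust treatment")
```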

Ichimura hopes his new code will “find its way into a new generation of physical simulators”, meaning the AI could be used to help interpret the impact of other natural disasters on urban environments.

His confidence is understandable: simulating an earthquake in its entirety is vastly more computationally demanding than most other urban modelling problems. Where these developments in AI might lead remains to be seen, but with 70% of the world’s population expected to live in cities by 2050, they are certainly welcome.
