Research papers come out far too rapidly for anyone to read them all, especially in the field of machine learning, which now affects (and produces papers in) practically every industry and company. The purpose of this column is to collect the most relevant recent discoveries and papers – particularly in, but not limited to, artificial intelligence – and explain why they matter.
This week has slightly more “basic research” than consumer applications. Machine learning can be applied to benefit users in many ways, but it is also transformative in fields like seismology and biology, where enormous backlogs of data can be used to train AI models or serve as raw material to be mined for insight.
Understanding natural phenomena
We are surrounded by natural phenomena that we don’t really understand – obviously we know where earthquakes and hurricanes come from, but how exactly do they propagate? What secondary effects emerge if you cross-reference different measurements? How far ahead can these things be predicted?
Machine learning has been used to attempt to better understand or predict these phenomena in a number of recently published research projects. With decades of data available, there are insights to be gained across the board this way – if the seismologists, meteorologists, and geologists interested in doing so can get the funding and expertise to pursue it.
The most recent discovery, made by researchers at Los Alamos National Labs, uses a new source of data along with ML to document previously unobserved behavior along faults during “slow quakes.” Using synthetic aperture radar captured from orbit, which can see through cloud cover and at night to give accurate, regular imaging of the shape of the ground, the team was able to directly observe “rupture propagation” for the first time, along the North Anatolian Fault in Turkey.
“The deep-learning approach we developed makes it possible to automatically detect the small and transient deformation that occurs on faults with unprecedented resolution, paving the way for a systematic study of the interplay between slow and regular earthquakes, at a global scale,” said Los Alamos geophysicist Bertrand Rouet-Leduc.
Another effort, which has been underway at Stanford for a few years, helps Earth scientist Mostafa Mousavi deal with the signal-to-noise problem in seismic data. Poring over readings that old software had churned through for the billionth time, he felt there had to be a better way and spent years working on different methods. The most recent is a way of teasing out evidence of tiny earthquakes that went unnoticed but still left a record in the data.
The “Earthquake Transformer” (named after the machine-learning technique, not the robots) was trained on years of hand-labeled seismographic data. When tested on readings collected during Japan’s magnitude 6.6 Tottori earthquake, it isolated 21,092 separate events – more than twice what people found in their original inspection – and it did so using data from less than half of the stations that recorded the earthquake.
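For context, the classical baseline that learned detectors like this aim to improve on is the STA/LTA (short-term average over long-term average) trigger, a standard seismology technique that flags a candidate event when the ratio of recent signal energy to background energy crosses a threshold – small quakes buried in noise often never cross it, which is exactly the gap ML-based detection targets. Below is a minimal illustrative sketch on synthetic data; the function, parameter values, and waveform are my own illustration, not anything from the paper:

```python
import numpy as np

def sta_lta(signal, sta_len, lta_len):
    """Classic short-term / long-term average energy ratio used for event triggering."""
    energy = signal ** 2
    # running means of signal energy over a short and a long window
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    lta[lta == 0] = 1e-12  # avoid division by zero in perfectly quiet stretches
    return sta / lta

rng = np.random.default_rng(0)
trace = rng.normal(0, 0.1, 5000)             # background noise
trace[3000:3100] += rng.normal(0, 1.0, 100)  # injected small "earthquake" burst

ratio = sta_lta(trace, sta_len=50, lta_len=500)
trigger = int(np.argmax(ratio > 4.0))  # first sample where the ratio crosses threshold
print(trigger)  # fires within the injected burst, near sample 3000
```

Shrink the burst amplitude toward the noise floor and the ratio never crosses the threshold – the kind of event a trained model can still pick out of the waveform.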
The tool itself won’t predict earthquakes, but a better understanding of the true and complete nature of these events may let us do so by other means. “By improving our ability to detect and locate these very small earthquakes, we can get a clearer view of how earthquakes interact or spread out along the fault, how they get started, even how they stop,” said co-author Gregory Beroza.