Originally posted by jadedANDcynical
Let's dig into this and see if we can shake anything out.
...The computer model was generated under NASA's QuakeSim project, a computational framework for modeling and understanding earthquake and tectonic processes. QuakeSim focuses on deformation of Earth's crust, which can be measured using airborne and spaceborne technologies. The models and data can be used to better understand earthquake hazard, stress transfer between faults, and ground disturbance following earthquakes.
QuakeSim, a collaboration of JPL-Caltech, USC, UC Davis, UC Irvine, Indiana University, and NASA Ames, is sponsored by NASA's Advanced Information Systems Technology Program through the Earth Science Technology Office.
source
So we know for a fact that JPL works with quite a few groups studying earthquakes.
QuakeSim
We are forming a community-led InSAR Working Group dedicated to the advancement of radar remote sensing research. The potential of a robust InSAR observational capability has generated strong interest amongst the research and applications communities. The role of InSAR spans a broad spectrum of end uses including crustal deformation science related to earthquakes, volcanoes, hydrologic processes, ice sheet and glacier variability, vegetation structure, and disaster management. Long-term access to InSAR data will greatly advance our understanding of how these basic processes affect life on Earth. Consequently, the US scientific community should devise a long-term strategy for US InSAR activities, including the funding of dedicated US InSAR satellites, access to foreign SAR data, and continued education and advocacy for InSAR science.
More JPL stuff of interest
This report summarizes the major findings of a symposium attended by 260 scientists and engineers in an effort to guide U.S. efforts in Interferometric Synthetic Aperture Radar (InSAR), a critical tool for studying dynamic changes of the Earth’s surface and natural hazards associated with these changes. InSAR observations provide critical and otherwise unavailable data enabling comprehensive, global measurements to better understand and predict changes in the Earth system. The InSAR Workshop was funded jointly by NASA’s Earth Science program, the Geosciences Directorate of the National Science Foundation, and the U.S. Geological Survey. We hope that these and other agencies heed the call for a coordinated InSAR program to address these important research questions
From a report dated 2004
More:
Natural hazards. SAR interferometry has demonstrated valuable information for monitoring and predicting or forecasting a variety of hazards, from air, water, and earth. Large-scale hazards generated in the Earth include earthquakes and volcanic eruptions; each is driven by tectonic forces within the Earth’s crust. Observation of deformation from subsurface flow of magma and of the accumulation of strain within the crust is needed to be able to understand these great forces of nature. More localized, but often intense, hazards include landslides, mud flows, and land subsidence or collapse due to natural or human removal of subsurface material or fluids and permafrost melting. Flooding is the most damaging hazard in most areas, from rainfall, snow, and ice melting, and natural or human-made dam collapse. In coastal regions, hurricanes, intense local wind events, shore erosion, and oil spills are major hazards. Finally, fire in forests and other vegetation is a major hazard in many areas. For each of these hazards, InSAR has proven a help in assessing damage after the events and evaluating the risk of future events by understanding and monitoring the processes involved
Source
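A quick aside on the mechanics, since the excerpts above take it for granted: repeat-pass InSAR measures a phase difference between two radar acquisitions, and the unwrapped phase maps to line-of-sight ground displacement roughly as d = -λΔφ/(4π) (the 4π reflects the two-way travel path; sign conventions vary by processing chain). A minimal Python sketch of that conversion, using made-up wavelength and phase numbers rather than real data:

```python
import numpy as np

# Repeat-pass InSAR: unwrapped interferometric phase (radians) maps to
# line-of-sight (LOS) displacement.  The factor of 4*pi reflects the two-way
# radar travel path; the sign convention varies between processing chains.
# The wavelength and phase values below are illustrative, not real data.

WAVELENGTH_M = 0.056  # assumed C-band radar wavelength (~5.6 cm)

def phase_to_los_displacement(unwrapped_phase_rad):
    """Convert unwrapped interferometric phase (radians) to LOS displacement (metres)."""
    return -WAVELENGTH_M * np.asarray(unwrapped_phase_rad) / (4.0 * np.pi)

# Example: a small patch of unwrapped phase from a hypothetical interferogram
phase = np.array([[0.0, 1.5, 3.1],
                  [0.5, 2.0, 4.2]])
print(phase_to_los_displacement(phase) * 100.0)  # displacement in centimetres
```

Everything else InSAR does for the hazards listed above (coseismic slip maps, volcano inflation, subsidence monitoring) is built on maps of this quantity.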
Originally posted by Juggalette
Regarding M8... Out of curiosity, I searched this term in Google scholar: "Keilis-Borok's M8 prediction methodology" and came up with this paper Earthquake Prediction: State of the Art and Emerging Possibilities written by Vladimir Keilis-Borok. Looks like the paper is dated 2002.
An excerpt from the paper reveals this regarding the M8 earthquake prediction algorithm:
ALGORITHM M8
"This algorithm was designed by retrospective analysis of seismicity preceding the greatest (M 8) earthquakes worldwide, hence its name. We describe it using as an example the recent prediction of major earthquake in Southern Sumatera, Indonesia, on June 4, 2000, M D 8.0. That prediction was part of the Russian-American experiment in advance prediction of strongest earthquakes worldwide."
It goes on to give the specifics of the algorithm and how it "works" (sorry, not a math person), and how several other prediction methodologies that were used have worked. The paper also states that "Algorithm M8 successfully predicted all six strong earthquakes that occurred from 1992–2000", making it the most successful of the algorithms used.
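For anyone curious what an intermediate-term "TIP" diagnosis even looks like in code, here is a toy sketch of the general idea only: slide windows over a catalog and raise an alarm when an activity measure runs anomalously high relative to the region's own history. It is emphatically not the real M8 algorithm, which combines seven catalog functions and a voting rule; the window lengths, percentile, and catalog below are all invented for illustration.

```python
from datetime import datetime, timedelta

# Toy illustration of a TIP-style detector: NOT the real M8 algorithm, which
# combines seven catalog functions and a voting rule.  Here a single function
# (event count in a sliding window) is used purely to show the idea.

def sliding_counts(event_times, window_years=6.0, step_years=0.5):
    """Count catalog events in sliding time windows; return (window_start, count) pairs."""
    if not event_times:
        return []
    t0, t1 = min(event_times), max(event_times)
    window = timedelta(days=365.25 * window_years)
    step = timedelta(days=365.25 * step_years)
    out, start = [], t0
    while start + window <= t1:
        count = sum(1 for t in event_times if start <= t < start + window)
        out.append((start, count))
        start += step
    return out

def declare_tips(counts, percentile=0.9):
    """Flag windows whose count sits in the top decile of the region's own history."""
    values = sorted(c for _, c in counts)
    threshold = values[int(percentile * (len(values) - 1))]
    return [(start, count) for start, count in counts if count >= threshold]

# Hypothetical catalog of moderate events (datetimes only, magnitudes omitted)
catalog = [datetime(1980, 1, 1) + timedelta(days=30 * i) for i in range(0, 400, 3)]
for start, count in declare_tips(sliding_counts(catalog)):
    print(f"TIP candidate starting {start:%Y-%m}: {count} events in window")
```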
Invite NASA to give a presentation on that agency's support of earthquake prediction research.
Invite Tom Jordan to give a presentation on the Collaboratory for the Study of Earthquake Predictability.
USGS to provide NEPEC members with an appropriate selection of prediction literature.
emphasis mine
2. What stress transfer processes are important in triggering seismic activity? Are long-range interactions important?
... Current research is very actively elucidating the nature of the earthquake/earthquake interactions, rigorously quantifying the statistical likelihood of linkages, and beginning to shed light on time-dependent processes (e.g., post-seismic relaxation, state/rate fault friction) that influence triggered activity. However, emerging clues suggest longer-range interactions that are not mechanically understood. Any linkages should have deformation signatures, and synoptic InSAR imaging offers possibly the best means of detecting and elucidating the deformation causes and effects that may link regional earthquake events.
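For reference, the stress transfer the report is talking about is usually quantified with the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn: the shear stress change resolved in the slip direction of the receiving fault, plus an effective friction coefficient times the normal stress change (unclamping positive). Positive ΔCFS brings the receiver fault closer to failure. A back-of-the-envelope sketch with purely illustrative numbers, not tied to any event discussed in this thread:

```python
# Coulomb failure stress change on a receiver fault:
#   dCFS = d_tau + mu_eff * d_sigma_n
# d_tau     : shear stress change resolved in the fault's slip direction (MPa)
# d_sigma_n : normal stress change, tension (unclamping) positive (MPa)
# mu_eff    : effective friction coefficient (values around 0.4 are common)
# Positive dCFS brings the receiver fault closer to failure.

def coulomb_stress_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

# Illustrative values only (hypothetical, not from any study cited in the thread)
examples = [
    ("loaded lobe",   0.15,  0.05),   # increased shear + unclamping -> failure promoted
    ("stress shadow", -0.10, -0.08),  # decreased shear + clamping   -> failure inhibited
]
for label, d_tau, d_sigma_n in examples:
    print(f"{label}: dCFS = {coulomb_stress_change(d_tau, d_sigma_n):+.3f} MPa")
```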
3. Are there precursory deformation phenomena for either earthquakes or volcanoes and can they be detected with InSAR observations?
This is the Holy Grail for solid-Earth natural hazards research. Current earthquake hazard maps are at a coarse resolution in both time and geography. Such maps depict probability of exceeding a certain amount of shaking (generally that at which damage occurs) over the next 30 to 100 years, depending on the map. The spatial resolution is typically on the order of tens to hundreds of kilometers. These maps are based on information about past earthquakes observed in the geological or historical record. Measurement of crustal deformation, usually acquired using GPS, now provides information on strain rates; generally we find that earthquake rates are higher where strain rates are higher. The number of GPS stations that can be deployed on the ground limits the resolution of strain, and these stations can be expensive to install and maintain
Furthermore, future science studies of crustal deformation will yield insights into earthquake behavior, whether high strain rates indicate the initiation of failure on a fault or quiet release of stress, and how stress is transferred to other faults. These studies will lead to science findings for improvement of earthquake hazard maps both spatially and temporally.
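As an aside on how the GPS measurements mentioned above turn into strain rates: with station velocities at known positions, one can fit a locally uniform velocity gradient by least squares, and the symmetric part of that gradient is the strain-rate tensor. A rough sketch under simple assumptions (flat local coordinates, uniform gradient, hypothetical stations):

```python
import numpy as np

# Estimate a 2-D strain-rate tensor from GPS station velocities by fitting a
# uniform velocity gradient L in  v = v0 + L @ x  (least squares over stations).
# The symmetric part of L is the strain rate.  With positions in km and
# velocities in mm/yr, L comes out in units of 1e-6 per year (microstrain/yr).
# The station data below are hypothetical.

def strain_rate_tensor(positions_km, velocities_mm_yr):
    x = np.asarray(positions_km, dtype=float)      # (n, 2): east, north in km
    v = np.asarray(velocities_mm_yr, dtype=float)  # (n, 2): east, north in mm/yr
    n = len(x)
    A = np.zeros((2 * n, 6))   # unknowns: [v0_e, v0_n, L_ee, L_en, L_ne, L_nn]
    b = np.zeros(2 * n)
    for i, ((pe, pn), (ve, vn)) in enumerate(zip(x, v)):
        A[2 * i]     = [1.0, 0.0, pe, pn, 0.0, 0.0]
        A[2 * i + 1] = [0.0, 1.0, 0.0, 0.0, pe, pn]
        b[2 * i], b[2 * i + 1] = ve, vn
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    L = params[2:].reshape(2, 2)
    return 0.5 * (L + L.T)     # strain-rate tensor, microstrain per year

# Four hypothetical stations straddling a shear zone (km east/north, mm/yr east/north)
positions = [(0.0, 0.0), (50.0, 0.0), (0.0, 50.0), (50.0, 50.0)]
velocities = [(0.0, 0.0), (0.0, 2.0), (0.0, 0.0), (0.0, 2.0)]
print(strain_rate_tensor(positions, velocities))
```

The report's point about station density follows directly: the finer you want to resolve this tensor in space, the more stations (or the more InSAR pixels) you need.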
Similar studies employing InSAR to map deformation on volcanic terrain can reveal subsurface transport of magma, an important factor affecting eruption probabilities. Detailed maps of the shape of the magma trail give clues as to where pressure may accumulate and also may help constrain the explosiveness of the potential eruption
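On the volcano side, the simplest forward model linking a subsurface pressure (or volume) change to the surface deformation InSAR sees is the classical Mogi point source in an elastic half-space. A hedged sketch with invented source parameters; real studies invert InSAR or GPS data for these numbers:

```python
import numpy as np

# Classical Mogi (1958) point-source approximation: surface displacement above a
# small spherical pressure source at depth d in an elastic half-space, written in
# terms of the source volume change dV.  The depth and volume change below are
# invented for illustration; Poisson's ratio is taken as 0.25.

POISSON = 0.25

def mogi_displacement(r_m, depth_m, dvolume_m3, nu=POISSON):
    """Radial and vertical surface displacement (m) at horizontal distance r (m)."""
    R3 = (r_m ** 2 + depth_m ** 2) ** 1.5
    u_r = (1.0 - nu) * dvolume_m3 * r_m / (np.pi * R3)
    u_z = (1.0 - nu) * dvolume_m3 * depth_m / (np.pi * R3)
    return u_r, u_z

# Hypothetical inflation: +2e6 m^3 volume change at 4 km depth
r = np.linspace(0.0, 20e3, 5)                 # profile out to 20 km
u_r, u_z = mogi_displacement(r, 4000.0, 2e6)
for ri, uzi in zip(r, u_z):
    print(f"r = {ri / 1000:5.1f} km   uplift = {uzi * 100:6.2f} cm")
```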
Originally posted by megabogie
I found this looking up the Keilis-Borok connection; it's an abstract from a scientific paper:
Premonitory activation of earthquake flow: algorithm M8
Thirty-nine out of the 44 strongest earthquakes which have recently occurred in different regions of the world are preceded by specific activation of the earthquake flow in the lower magnitude range. This activation is depicted by the algorithm M8, which was designed for diagnosis of times of increased probability (TIPs) of strong earthquakes.
The paper is authored by V. I. Keilis-Borok and V. G. Kossobokov and it is dated 1990.
LINK:
www.sciencedirect.com...
The “M8” algorithm, originally developed for intermediate-term prediction of large events, uses a catalog of mainshocks to identify large-scale seismicity patterns before large earthquakes in a given region (e.g. Gabrielov et al., 1986; Keilis-Borok et al., 1988; Keilis-Borok et al., 1990; Updyke et al., 1989; Healy et al., 1992; Kossobokov et al., 1992; Keilis-Borok and Rotwain, 1994; Kossobokov and Mazhkenov, 1994). We have developed an approach to measure the stability of the results and to test specific hypotheses (e.g. Minster and Williams, 1992, 1994, 1995, 1996), and used it to assess the performance of the M8 intermediate-term earthquake prediction algorithm.
The tests of predictions, performed on a global scale, allowed a first statistical assessment of the predictive capability of M8 and CN algorithms (Kossobokov et al., 1999; Rotwain and Novikova, 1999). Specifically, for the M8 algorithm the results obtained in real-time prediction mode since 1992 have already demonstrated the high confidence level (above 99%) of the prediction of the world’s largest earthquakes, in the magnitude range 8.0 – 8.5 (Keilis-Borok and Soloviev, 2003; Kossobokov et al., 1999). For the algorithm CN a preliminary estimate of the significance of the achieved prediction results, obtained for the period 1983-1998 in 22 regions of the world, gives a confidence level around 95% (Rotwain and Novikova, 1999).
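The "confidence level above 99%" language comes from comparing the achieved hit rate with what random guessing would manage, given the fraction of monitored space-time covered by alarms. A rough illustration of that style of test (the numbers below are placeholders, not the published Global Test statistics, and the real assessments weight space-time by seismicity rather than using one uniform coverage fraction):

```python
from math import comb

# Significance of an alarm-based prediction experiment: if alarms occupy a
# fraction tau of the monitored space-time, a "random guess" strategy would
# catch each target event with probability tau.  The p-value is the binomial
# probability of doing at least as well by chance; confidence = 1 - p.
# This mirrors the logic of such tests in spirit only; the published M8
# assessments use seismicity-weighted space-time rather than a single tau.

def alarm_significance(n_events, n_hits, tau):
    """Binomial p-value for catching n_hits of n_events inside alarms covering fraction tau."""
    return sum(comb(n_events, k) * tau**k * (1 - tau)**(n_events - k)
               for k in range(n_hits, n_events + 1))

# Placeholder numbers: 10 target earthquakes, 9 inside alarms covering 30% of space-time
p = alarm_significance(n_events=10, n_hits=9, tau=0.30)
print(f"p-value = {p:.5f}   confidence = {(1 - p) * 100:.2f}%")
```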
Several experiments have been dedicated to assessing the robustness of the methodology against the unavoidable uncertainties in the data (Peresan et al., 2000; 2002). With these results acquired, an experiment was launched in July 2003, aimed at the real-time test of M8S and CN predictions for earthquakes with magnitude larger than 5.4 in the Italian region. The results of the intermediate-term middle-range predictions in Italy are routinely updated and made accessible to a number of scientists. The goal of the experiment was to accumulate a collection of correct and wrong predictions (the latter including the false alarms and/or the failures to predict encountered in the test), permitting verification and assessment of the predictive capability of the considered methodology.
A cluster of "false alarm" TIPs in Japan lasted from the middle of 2001 to the end of 2010, gradually migrating from the southwestern to the northern regions: it started with CI #64, expanded in 2006 to CIs #82 and then #80; in 2007 #64 was called off and ##80-82 formed a new area of alarm, which shrank to a single #81 in January 2010. The alarm can be associated with the failures to predict the great 25/09/2003 M8.3, 15/11/2006 M8.3, and 13/01/2007 M8.2 earthquakes, each of which was linked by the RTP chains of correlated quakes (Keilis-Borok et al., 2004; Shebalin et al., 2006) to the M8-MSc prediction areas, as well as with a series of earthquakes that started with the 2002/06/28 M7.3 deep event (depth 566 km) near the Priamurye-Northeastern China border, in back of and outside the alerted section of the subduction zone, was followed by twelve shallow magnitude 7.0 or larger earthquakes in the area alerted in 2002-2010, and ended with the 09/08/2009 M7.1 deep earthquake (depth 292 km) beneath the Izu Islands. The Tokai silent earthquake, which initiated in 2001 and lasted for many years in the middle of this cluster of "false alarms", could also be a physically related phenomenon.
Finally, the mega-thrust on 11 March 2011 completed the peculiar history of the "false alarm" in Japan on the 70th day after it was formally called off (see figure above). Its first aftershocks (white dots) spread along the entire patch of the M8-MSc prediction outline. The M7.3 earthquake on 9 March 2011 preceded the mega-shock by 51 hours and, according to Takeshi Kudo (personal communication), was a suspected foreshock of the "M7.5 class Miyagi-Ken-Oki earthquake" expected by Japanese seismologists.
The premature termination of the TIP in the 2011a Update of the Global Test of M8-MSc predictions happened to be due to a change in the anomaly threshold of the function Z1 (an inverse of the Zhurkov criterion, i.e. the linear concentration of ruptures) from 2407 to 2440, bringing the voting scores from the required (and factual in the 2010b Update) 4:6/4:6 down to 4:6/4:5. A reasonable person would not even notice this change in one of the seven graphs involved in TIP diagnosis, but the "black box" version of the M8 algorithm, fixed in 1992, does. It is hard to disagree with Prof. Keilis-Borok: "The alarm, its premature termination notwithstanding, could have been used for damage reduction."
The magnitude of the 2011 off the Pacific coast of Tohoku Earthquake might be larger than 8.9 (the preliminary Global CMT estimate is 9.1, pending longer-period records). On the other hand, unlike the 26 December 2004 mega-thrust, the rupture in the Sendai earthquake is surprisingly limited to about 400 km. This compactness may explain why its precursory patterns were not recognized by M8 aimed at M9.0+, but were diagnosed in advance of the 2010 Chile mega-thrust, which ruptured about 600 km. The current alarm for M8.5+ in Japan was not terminated in January; however, the M9.0+ and M8.5+ ranges are outside the scope of the Global Test of the M8-MSc predictions and cannot be considered as documented in advance of the 11 March 2011 mega-thrust.
Abbreviation: TIP, time of increased probability of a strong earthquake (an alarm).
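To unpack the "4:6/4:6" voting-score language in the quote: a TIP is declared only when enough of the seven catalog functions are anomalously large at two consecutive diagnosis times, which is why nudging a single threshold (the Z1 change from 2407 to 2440) was enough to call the alarm off. Here is a toy version of such a voting rule; the thresholds, values, and the "6 of 7" requirement are my own simplifications, not the fixed 1992 black-box algorithm:

```python
# Toy voting rule in the spirit of the passage above: a TIP is declared only if,
# at two consecutive diagnosis times, enough of the seven catalog functions
# exceed their "anomalously large" thresholds.  The "6 of 7" requirement and all
# numbers below are assumptions made for illustration, not the real M8 scoring.

REQUIRED_VOTES = 6   # assumed: how many of the 7 functions must be anomalous

def votes(function_values, thresholds):
    """Count how many functions exceed their thresholds at one diagnosis time."""
    return sum(v >= t for v, t in zip(function_values, thresholds))

def tip_declared(values_t1, values_t2, thresholds, required=REQUIRED_VOTES):
    """Declare a TIP only if both consecutive diagnosis times reach the required score."""
    return (votes(values_t1, thresholds) >= required and
            votes(values_t2, thresholds) >= required)

# Hypothetical function values and thresholds (7 functions, arbitrary units).
# The last column loosely echoes the Z1 threshold change described above.
thresholds = [10, 10, 10, 10, 10, 10, 2440]
july_2010  = [12, 11, 13,  9, 14, 12, 2450]   # 6 of 7 anomalous -> alarm holds
jan_2011   = [12, 11, 13,  9, 14, 12, 2430]   # Z1 slips below the new threshold

print("TIP declared:", tip_declared(july_2010, jan_2011, thresholds))
```

Run as written, the second diagnosis time falls one vote short, so the alarm lapses: the same kind of knife-edge behaviour the quote describes for the 2011a Update.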
Although the M8-MSc predictions are intermediate-term and middle-range and by no means imply any "red alert", some colleagues have expressed a legitimate concern about maintaining necessary confidentiality. Therefore, the up-to-date predictions are not easily accessed, although they are available on restricted-access web pages provided to about 150 members of the Mailing List.