As mentioned earlier, several public policy documents have been published on earthquake prediction in terms of probabilities. For example, the Working Group on California Earthquake Probabilities estimated the long-term (up to 30-year) probability for each segment of the San Andreas and San Jacinto faults. These probabilities were revised for segments in the San Francisco Bay area after the 1989 Loma Prieta earthquake by the Working Group on California Earthquake Probabilities. Short-term (up to 72-hour) probabilities have also been estimated for the Parkfield and other characteristic earthquakes based on precursory phenomena such as foreshocks [Bakun et al., 1987].
In order to arrive at a probability estimate for public use, consensus must be reached on the method of calculating the probability as well as on the input data. This is a difficult, time-consuming process, and some scientists view it as a waste of time and manpower. It is certainly not a curiosity-driven creative process, but it is important for users of earthquake information, such as government officials responsible for emergency preparedness and land use, and it may be essential for the survival of the community of earthquake researchers. Several years ago, NSF recognized the need to return the products of science to society and initiated the Science and Technology Center program. SCEC is one of these centers, and transmitting earth science information on seismic hazard in southern California to the public is one of its major tasks.
SCEC officially started on February 1, 1991, and the Landers earthquake (M7.3) occurred during its second year. Two weeks after the earthquake, SCEC organized a one-day workshop to discuss the implications of the event and SCEC's response. The workshop concluded with a unanimous decision to produce two documents. The first document (Phase I) was to address (1) recent seismicity in southern California, (2) effects of the Landers-Big Bear sequence on nearby faults, and (3) the potential for future ground shaking in southern California. The second document (Phase II) would address the long-term seismic hazard broadly over the whole of southern California, revising the 1988 Working Group report by improving the methodology and updating the data. The Phase I report was published in November 1992 [Ad-hoc Working Group on the Probabilities of Future Large Earthquakes in Southern California, 1992]. The Phase II report has been reviewed by the National Earthquake Prediction Evaluation Council and the California Earthquake Prediction Evaluation Council. It will be published in the April 1995 issue of the Bulletin of the Seismological Society of America. It was also presented to the larger scientific community at the Fall 1993 AGU meeting [Jackson et al., 1993].
The Phase II report goes beyond the 1988 Working Group report in several respects. It addresses the whole of southern California by dividing it into 65 source zones. The earthquake potential for each zone is estimated not only from paleoseismological data on ruptures of fault segments, but also from historic earthquake data spatially smoothed by the method of Kagan and Jackson, and from GPS data, by translating the observed strain rate into earthquake frequency with the procedure of Ward. In addition to characteristic earthquakes, the Phase II report considers the contribution of distributed earthquakes. The model for the characteristic earthquake now allows for ruptures over multiple neighboring segments [Wesnousky, 1986] and is called ``Cascade'' [Jackson et al., 1993]. The consensus geological parameters used to characterize each source zone were developed by a group of geologists under the leadership of D. Schwartz [Jackson et al., 1993]. The resultant earthquake source model can be used to estimate probabilities of ground shaking at any point in southern California; as an example, the Phase II report will show a map of the probability that any point will experience peak ground acceleration greater than 20% of gravitational acceleration in the next 30 years. The report will also document the uncertainty in the input data as well as the range of alternative models.
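The conversion from an earthquake rate to a 30-year exceedance probability is commonly done with a time-independent Poisson model. The sketch below illustrates that standard conversion only; the 250-year recurrence interval is a hypothetical value, not a figure from the Phase II report.

```python
import math

def prob_at_least_one(annual_rate, years=30.0):
    """Probability of at least one event in the given time window,
    assuming occurrences follow a time-independent Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

# Hypothetical source zone: one characteristic earthquake per 250 years
# gives roughly an 11% chance of occurrence in 30 years.
p30 = prob_at_least_one(1.0 / 250.0)
```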
The probabilistic estimate of seismic hazard can be used in a cost-benefit analysis of mitigation efforts. The benefit is the reduction of damage or loss expected from future earthquakes, and may be estimated from the probabilities of seismic hazard. An objective decision can be made by comparing the expected benefit with the mitigation cost.
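A minimal sketch of such a comparison follows; all probabilities and dollar figures are hypothetical, chosen only to illustrate the arithmetic.

```python
def expected_benefit(p_event, loss_without, loss_with):
    """Expected loss reduction over the planning horizon:
    probability of the damaging event times the damage avoided."""
    return p_event * (loss_without - loss_with)

# Hypothetical retrofit decision: a 12% chance of the damaging event
# in 30 years; retrofitting reduces the loss from $10M to $2M and
# costs $500,000.
benefit = expected_benefit(0.12, 10e6, 2e6)
worthwhile = benefit > 0.5e6
```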
Probabilities have also been used for several official short-term earthquake predictions. Here again the identification of a characteristic earthquake plays an important role. Let the occurrence of such a characteristic earthquake be called event C, and consider a precursory event F such as a foreshock. According to Jones, half of the strike-slip earthquakes in California have been preceded by immediate foreshocks (defined as earthquakes within 3 days and 10 km of the mainshock); in other words, the conditional probability P(F|C) of event F, given event C, is 0.5. By Bayes' theorem, the conditional probability of the occurrence of the characteristic earthquake (event C) after the occurrence of a foreshock (event F) is given by

P(C|F) = P(F|C) P(C) / P(F),

where P(C) and P(F) are the unconditional, or long-term, rates of occurrence of events C and F, respectively.
The above formula again shows the importance of identifying the details of the characteristic earthquake: if the nucleation point of the rupture of a given fault segment is known (Middle Mountain for the Parkfield segment), event F can be restricted to a small volume of the earth, making P(F) small. Otherwise, P(C|F) would be too low to be of any practical use.
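A short numerical sketch of this effect follows. The value P(F|C) = 0.5 is from Jones as quoted above; the other rates are hypothetical, chosen only to show how shrinking P(F) raises P(C|F).

```python
def p_c_given_f(p_f_given_c, p_c, p_f):
    """Bayes' theorem: P(C|F) = P(F|C) * P(C) / P(F)."""
    return p_f_given_c * p_c / p_f

P_F_GIVEN_C = 0.5   # half of mainshocks preceded by foreshocks (Jones)
P_C = 0.001         # hypothetical long-term rate of event C per time window

# Candidate foreshocks counted over a wide region: P(F) is large.
broad = p_c_given_f(P_F_GIVEN_C, P_C, 0.05)     # 0.01 -- too low to act on

# Event F restricted to the nucleation volume: P(F) is small.
narrow = p_c_given_f(P_F_GIVEN_C, P_C, 0.002)   # 0.25 -- practically useful
```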
In addition to foreshocks, fault creep and continuous strain are monitored at Parkfield with extensive instrumentation [Bakun et al., 1987]. Five alert levels (A, B, C, D, and E) were introduced for different ranges of the conditional probability. For example, alert level A (USGS issues a geologic hazard warning) corresponds to a 72-hour probability greater than 37%, and alert level B (the USGS director and the California State Geologist are alerted) corresponds to a 72-hour probability in the range of 11 to 37%. Alert levels C, D, and E correspond to probabilities lower than 11% and are directed to those involved in the experiment.
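The mapping from a 72-hour probability to an alert level can be sketched as follows. The A and B thresholds are the ones quoted above; the subdivision among levels C, D, and E is not specified here, so those levels are left collapsed.

```python
def parkfield_alert_level(p72):
    """Map a 72-hour conditional probability to a Parkfield alert level.
    Thresholds for A and B follow the ranges quoted in the text; the
    thresholds separating C, D, and E (all below 11%) are not given,
    so they are reported as a single group."""
    if p72 > 0.37:
        return "A"      # USGS issues a geologic hazard warning
    elif p72 > 0.11:
        return "B"      # USGS director and State Geologist alerted
    else:
        return "C/D/E"  # lower levels, internal to the experiment
```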
According to a recent review of the Parkfield experiment by a working group (B. Hager, Chair) of the National Earthquake Prediction Evaluation Council, the experiment has brought scientists together with state and local officials, emergency managers, and the news media in a productive, mutually beneficial relationship. The State established the first scientifically based State emergency management protocol for a specific predicted earthquake. The first A-level alert was issued on October 22, 1992, when the USGS notified the California Office of Emergency Services (OES) of an A-level alert triggered by the M=4.7 earthquake at Middle Mountain. Eight minutes later, OES broadcast the alert to State agencies and local governments over the California Warning System. Kern County was the first county to activate its Emergency Operation Center, 47 minutes after the OES alert. OES completed its alert of local government and response officials in less than one hour following the earthquake. This was a complete success in transmitting earth science information to the public. The working group concluded that Parkfield remains the best identified location to trap an earthquake, and that the experiment should be continued both for its geophysical and its public response benefits.