Our successful application for Marsden funding

Statistical Models and Observational Evidence for the Approach to Criticality in Earthquake Occurrence

Principal Investigator(s):

Associate Investigator(s):

Summary

The period prior to a large earthquake is believed to be characterised by increasing levels of regional stress, which in turn influence observed patterns of earthquake occurrence in time, space and magnitude. This proposal seeks to develop statistical models for these changes, and to assess them against observational data for earthquakes in the Circum-Pacific Region. The proposal links earlier work of the statisticians in the group with recent work of seismologists in New Zealand and California. A proper development of the statistical ideas involved in modelling and fitting such data should represent a major advance in this field.

1. Background

Recent discussions of earthquake genesis have stressed the analogy with the approach to a phase transition; see for example [44], [41], [21], [31], [38], [39], [17] and [5]. Typical features are power-law distributions for frequencies and energies, long-range correlations in spatial occurrence patterns, and sensitivity to external stimuli. There is, however, a substantial gap between generalised physical models of this kind and a model that can be applied to catalogue data from a particular fault system. Such a model has to take into account both the physical-mechanical environment of the fault system and the irregular (statistical) departures in the data from the generalised relationships suggested by the model. The controlling parameters of the statistical models should reflect the tectonic character and large-scale geometry of the geophysical environment, and the changing stages of the physical cycle. These features have to be distinguished from random (i.e. unexplained) variability due to physical effects (small-scale variations in rock strength, the distribution of pre-existing cracks and other weaknesses, variations in stress input and stress transfer processes at depth, etc.) which are not accessible to direct observation. The statistical model postulates distributions for these effects and allows appropriate estimation procedures to be developed for the parameters of physical importance. Both aspects need to be included if the modelling exercise is to be useful, but existing models tend to fail on one or the other, or even on both, counts.

So far, one of the most successful ideas heralding the approach to a critical state is the accelerating moment release (AMR) pattern, where prior to a large event, one observes an accelerating increase in seismic moment release in an extended region around the earthquake rupture zone. Studies of the Californian and Aleutian Arc regions in particular (see among other papers [45], [8], [7], [15], [23], [11], [16]) have reported this effect, and have promoted the view that convergence towards some kind of critical state is a common feature in regions about to suffer a major earthquake. Recent work [17] has investigated the form of this increase, finding a departure from the power law magnitude distribution in the later part of the acceleration, perhaps representable by a Kagan distribution [22].
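In the notation commonly used in these studies (the functional forms follow [8] and [22]; the specific symbols are our choice), the AMR pattern and the Kagan roll-off can be written as:

```latex
% Accelerating moment release: cumulative Benioff strain (or moment
% release) before a failure time t_f, in power-law time-to-failure form:
\Sigma(t) = A + B\,(t_f - t)^{m}, \qquad B < 0,\quad 0 < m < 1.

% Kagan (tapered Pareto) distribution for seismic moment M, with
% threshold moment M_t and corner moment M_c controlling the
% exponential roll-off of the power-law tail:
\Pr(M > x) = \left(\frac{M_t}{x}\right)^{\beta}
             \exp\!\left(\frac{M_t - x}{M_c}\right), \qquad x \ge M_t.
```

As $M_c \to \infty$ the pure power law is recovered; in the branching-process interpretation of [46], the roll-off parameter measures the distance of the system from criticality, which motivates letting $M_c$ grow as the critical point is approached.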

The germ of the present proposal was the realisation that the statistical models developed in earlier work by members of the research team, especially the branching process model (see e.g. [46], [19], [20], [2], [47]), and the stress release model (SRM) (see [50], [51], [42], [24]) embody ideas relevant to the discussion concerning the approach to criticality. In particular, it seems that the earlier work could be used to suggest better methods of handling the statistical aspects of existing AMR models, and to improve and refine the models by embedding them into longer time scales. Some preliminary ideas are set out in [29], [48], [50] and the present proposal would build on these.

Further papers [1], [30], [31], [32] by members of the group have developed physical and computer models of major fault systems, particularly around New Zealand. Additional such models (see, for example, [33], [3], [4], [49]) have also appeared in the literature. Such numerical models allow long histories of events to be generated, which is vital to the systematic investigation of seismicity [35], but it can be hard to relate their output to earthquake catalogue data (see [36], [10]). Stochastic models may help to overcome this difficulty, by better discriminating between features of the real and synthetic catalogues, as in [25]. Moreover, the insights gained from such comparisons may allow stochastic models to be matched to the physical systems and used for hazard forecasting and other purposes (see, for example, [24]).

The research team includes one senior researcher from each major field (Vere-Jones, Robinson) and three younger staff (Bebbington, Harte, Kozuch). Its members run a regular weekly seminar, and in April 1999 and April 2000, organised a Workshop on Non-Linear Aspects of Fracture with assistance from an earlier Marsden project (96-VUW-MIS-0031). They also work together on the earthquake forecasting component of the IGNS Natural Hazards Programme. Through the Workshops, contacts have been established with some of the seismologists in California currently studying the AMR model.

2. Overall aim of the research

The overall aim of this project is to develop improved models for the characteristics heralding the approach to a critical state before a major earthquake, and to embed these into longer-term models for the seismic cycle on major fault systems. Research in a previous project has provided a number of tractable models usable as tools to study and compare earthquake sequences. The next step requires merging the stochastic models with mechanical models of the fault system. This requires a synthesis of expertise from the fields of stochastic modelling and seismology.

The proposed research can be separated into two parts:

  1. to develop stochastic models for the approach to criticality in complex fault structures; and
  2. to develop longer term models for such systems by combining these with deterministic computer models.

The first part comprises a critical examination of existing ideas and models for the approach to criticality in complex systems, and a linkage of these ideas to earlier work of the group on phase change, branching process and stress release models. The goal here is to develop testable statistical models that can be checked against data from real and synthetic catalogues. In this vein, the AMR model will be the major focus of attention, since it is backed by empirical studies and promises the greatest scope for future development. Better statistical model formulation should lead to better methods for estimating parameters, and for testing the ideas on which the models are based. Moreover, the existing techniques for the stochastic point-process models to be used here themselves require further development, particularly in relation to the study of sensitivity to individual data points (so-called leverage problems), and to estimates of precision.

It is also important to examine the use of the models on longer time scales, covering several seismic cycles, to see how well they match historical and geological data, and whether they can shed light on such important issues as whether natural fault systems really show long-term steady-state behaviour. This is the theme proposed for the second part. To this end, some kind of synthesis between statistical models, which can be fitted to observed data, and physical models, utilising known geometrical and mechanical features, will be needed. This will also enable the consequences of particular physical or statistical assumptions embodied in these models to be studied in detail. To achieve this aim, the group hopes to link its earlier work on stochastic models to earlier and current work by Robinson and Benites on major fault structures in NZ. Synthetic catalogues derived from such models may help to indicate appropriate parameters for the stochastic models and so compensate for the paucity of historical data. The group's overseas links will be used to check these ideas against other fault systems in the Circum-Pacific Region. Besides these synthetic catalogues, other sources of data that may be examined include data from laboratory fracture experiments and from observations in mines, which preserve some, but not all, of the tectonic properties of earthquakes. In the long run, the aim is to combine features from both parts to better understand and forecast behaviour on major fault systems, with practical implications for earthquake forecasting and hazard reduction.

3. Proposed research

3.1 Statistical Models for the Approach to Criticality

The similarities frequently noted between earthquake occurrence (or rock fracture) and phase change [5, 14, 31, 37] suggest a range of characteristics that might herald the approach to a critical state before a major fracture: convergence of frequency and moment distributions towards power-law forms, long-range correlations, and sensitivity to external stimuli.

Our proposal is to develop testable statistical models and check them against data from real and synthetic catalogues.

A particular focus of attention would be the accelerating moment release (AMR) model, for which some preliminary ideas were developed in our previous project [29, 48], and links were established with scientists in California who have been involved in developing the model. The following are the main steps proposed.

  1. The existing models do not clearly indicate how the increase in moment release is to be partitioned between an increase in the frequency of events and an increase in their relative sizes. The framework suggested in [48] for accommodating such changes combines a non-stationary Poisson process in time for the frequency, with a time-varying version of the Kagan distribution (embodying a power-law tail with a variable exponential roll-off term, see [22], [46], [19], [20], [6], [18]) to model changes of the magnitude distribution with changes in the stress environment. It was shown in [46] that in the branching process model at least, the parameter governing the exponential term can be related to the distance of the model from criticality. It is therefore the natural parameter to vary in this context. Moreover the fact that real catalogues will contain mixtures of events from earthquake sequences at different stages in the approach to criticality could help to explain other features of observed magnitude distributions. Recent work ([6], [18]) indicates some support for these ideas. The immediate goal here would be to implement these suggestions and examine their effectiveness on the data sets used in the earlier studies of Californian and New Zealand data.
  2. It might be hypothesised that accelerating moment release is associated with changes in the temporal and spatial clustering patterns of seismicity. Such patterns are reflected in the fractal dimensions of earthquake hypocentres, and in the parameters of fitted clustering models such as the ETAS model (see, for example, [26]) and the SELC model [40]. Methods of estimating such quantities have been developed under the previous project [12], [13], and would be used here to examine the extent to which the development of an AMR pattern was reflected in changes in estimable parameters.
  3. Several problems of inference arise in applying stochastic models to real data, on account of the paucity and generally low quality of longer-term historical and geological records. To avoid the obvious danger of drawing wrong inferences from such data, methods are needed to identify the features of the data most critically influencing the conclusions, for example hazard forecasts. Here we propose to examine the possibility of extending to point-process models, in particular the SRM and AMR models, the diagnostic methods of identifying points of high influence (or leverage) which have already been developed for regression models [9]. A related problem is that of controlling for the effect of secondary parameters in the model, which may be poorly constrained by the available data, and can destabilise model-fitting procedures. The approach we propose to investigate here, which has already been partially implemented in [13], is the introduction of Bayesian penalty priors which steer the estimates away from physically implausible solutions. A final important aspect is adjusting for known defects in the catalogue, such as changing magnitude thresholds. All of these effects can influence the precision and/or the reliability of the fitted solutions, and the forecasts arising from them, and our aim would be to document the effects which can arise, their causes, and where possible their remedies.
  4. Another possible indicator of changes in occurrence patterns is the approximate entropy (see [27]). Adaptation of approximate entropy ideas to earthquake data is currently being developed as part of a separate project by Kozuch. We propose here first to incorporate the approximate entropy procedures within SSLib [13], and then to link this work to the development of AMR patterns much as with the fractal dimension estimates. In both cases, careful attention would be given to the problem of establishing the significance of any apparent correlations. Although analytical results might be hard to obtain, we propose to obtain rough measures of significance by testing the results against those derived from various types of random catalogues.
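As a concrete starting point for the proposed SSLib incorporation, the approximate entropy of [27] can be sketched in a few lines. This is an illustrative implementation only; the function name and default parameters are our own choices, not part of the existing library.

```python
import math

def apen(u, m=2, r=0.2):
    """Approximate entropy (Pincus [27]) of a scalar series u:
    ApEn = Phi_m - Phi_{m+1}, where Phi_dim averages the log of the
    proportion of dim-length templates recurring within tolerance r
    (Chebyshev distance, self-matches included)."""
    def phi(dim):
        n = len(u) - dim + 1
        templates = [u[i:i + dim] for i in range(n)]
        logsum = 0.0
        for xi in templates:
            # proportion of templates within distance r of xi
            c = sum(1 for xj in templates
                    if max(abs(a - b) for a, b in zip(xi, xj)) <= r) / n
            logsum += math.log(c)
        return logsum / n
    return phi(m) - phi(m + 1)
```

A perfectly regular series gives a value near zero, while irregular series score higher; the brute-force double loop is adequate for catalogue-sized series, and the significance of observed values would be assessed against random catalogues as described above.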

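To make step 1 above concrete, the combination of a non-stationary Poisson process with a time-varying Kagan distribution can be sketched as follows. Everything here is an illustrative assumption: the rate form, the corner-moment growth law, and all parameter values stand in for quantities that would be fitted to data. The construction of the tapered Pareto as the minimum of a Pareto variable and a shifted exponential variable is, however, exact, since the survivor functions multiply.

```python
import math
import random

def sample_tapered_pareto(m_t, m_c, beta, rng):
    """Draw a moment from the Kagan (tapered Pareto) distribution with
    survivor function (m_t/x)**beta * exp((m_t - x)/m_c), realised as
    the minimum of a Pareto variable and a shifted exponential."""
    u, v = 1.0 - rng.random(), 1.0 - rng.random()   # both in (0, 1]
    pareto = m_t * u ** (-1.0 / beta)
    shifted_exp = m_t - m_c * math.log(v)
    return min(pareto, shifted_exp)

def simulate_amr_catalogue(t_f=100.0, horizon=90.0, k=2.0, p=0.6,
                           m_t=1.0, beta=0.66, seed=1):
    """Non-stationary Poisson process with rate k*(t_f - t)**(-p)
    accelerating towards a failure time t_f, simulated by thinning a
    constant dominating rate, with moments drawn from a tapered Pareto
    whose corner moment m_c grows as t -> t_f.  All functional forms
    and parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    # the rate is increasing, so its maximum on [0, horizon] is at horizon
    lam_max = k * (t_f - horizon) ** (-p)
    events, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)      # candidate event time
        if t >= horizon:
            return events
        if rng.random() < k * (t_f - t) ** (-p) / lam_max:   # thinning
            m_c = 10.0 * t_f / (t_f - t)   # corner moment diverging at t_f
            events.append((t, sample_tapered_pareto(m_t, m_c, beta, rng)))
```

Fitting the real partition between frequency increase and size increase, rather than assuming one, is precisely the estimation problem posed in step 1.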
3.2 Developing Long-term Models for Major Fault Systems by Combining Stochastic and Deterministic Computer Models

This work will require the closest collaboration between the geophysical and statistical members of the research team. In earlier and current studies [30], [31], [32], Robinson and co-workers have developed computer models for the occurrence of large events on major faults in the Wellington region, New Zealand. The underlying question posed here is whether it is possible to link studies of this type to statistical models such as the SRM and AMR in such a way that the parameters in the latter models are adapted to specific characteristics of the fault system.

The first approach we propose bringing to this problem is to combine the experience gained in fitting limited historical data to large regional earthquakes (see e.g. [24]) with the results of directly fitting the AMR models to current catalogue data. Our aim is to develop a hybrid technique for estimating parameter combinations which are broadly tailored to the fault system under consideration and also fit current data. Introducing the Kagan distribution for magnitudes into the SRM models should permit the development of models in which the size distribution is affected by stress build-up and release during the seismic cycle (i.e., Reid's elastic rebound theory, [28]). The linked SRM model developed earlier allows for control over the parameters governing stress input and stress transfer between adjacent regions [24]. Knowledge of the mechanical properties of these links could be used to severely constrain the relevant model parameters, and the constrained model could then be fitted to current data. The resulting model could be assessed at least qualitatively by comparing long-run features of its output to known features of the geological history (see, for example, [25]).
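For concreteness, a hedged sketch of the model class involved (notation broadly following the linked SRM of [24]; details would be adapted during the project):

```latex
% Conditional intensity for region i in a linked stress release model:
\lambda_i(t) = \exp\{\alpha_i + \beta_i X_i(t)\},

% Regional stress evolves by steady tectonic input at rate rho_i,
% less the stress transferred from events in region j (coupling theta_ij):
X_i(t) = X_i(0) + \rho_i t - \sum_j \theta_{ij} S_j(t),

% Accumulated stress release in region j from events of magnitude M_k:
S_j(t) = \sum_{k:\; t_k^{(j)} < t} 10^{\,0.75\,(M_k^{(j)} - M_0)}.
```

Replacing a fixed magnitude distribution by the Kagan form, with the corner moment tied to $X_i(t)$, is one way the size distribution could be made to respond to stress build-up and release over the cycle.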

The shortage of relevant data again appears as a major difficulty in directly investigating the behaviour of real fault systems. It is at this point that data from computer models for such systems can play an important role. Despite their complexity, such computer models have the advantage that statistical features observed in their output can be more easily related to features of the model than is the case with the real-earth environment. The question then is the extent to which statistical models that capture the broad features of the evolution of complex systems, but replace detailed knowledge of their small-scale behaviour by appropriate statistical assumptions, can be fitted to such synthetic catalogues and can predict their future behaviour. A related use of such stochastic models is to show up possible discrepancies between real and synthetic catalogues. If the stochastic models can be tuned to fit the output from such synthetic catalogues, there is a greater chance that they will successfully capture the key features characterising real data sequences.

In addition to earthquake catalogues, related fracture data can be obtained from sources such as laboratory experiments and mine acoustics, which preserve some, but not all, features of earthquake data.

Using the above spectrum of data, our approach is to use the statistical models developed in §3.1 to identify which spatio-temporal-magnitude patterns emerge in the data, and under what conditions they are stable features of the process. Further studies would also be made of the statistical properties of the seismic cycles produced by the computer model(s), and of the ability of the extended SRM/AMR models (to be developed under §3.1) to model such data.

The eventual aim is to gain insights into the extent to which different models can mimic the behaviour of specific fault systems, and hence to forecast their future behaviour.

Each subsection of §3.1 is likely to require the attention of at least one of the three statistical members of the team, and could provide the basis of a Master's thesis. Judging from our previous experience, the assistance of either a graduate student or research assistant is likely to be crucial to completing the computational and simulation work within the stipulated time frame. The work outlined in §3.2 will require the participation of all members of the team. The travel grants are requested to allow the group to participate in both statistical and seismological conferences, and to invite at least one person to New Zealand each year to maintain our existing international links. We anticipate that the international fracture model workshops, initiated in the previous grant, will be continued if the present application is successful, in which case the visitor funding may be used in part to support these.

References

  1. Bebbington, M.S. (1997). A hierarchical stress release model for synthetic seismicity. J. Geophys. Res. 102, 11677-11687.
  2. Bebbington, M.; Vere-Jones, D. and Zheng, X.-G. (1990). Percolation theory: a model for rock fracture? Geophys. J. Int. 100, 215-220.
  3. Ben-Zion, Y. (1996). Stress, slip and earthquakes in models of complex single-fault systems incorporating brittle and creep deformations. J. Geophys. Res. 101, 5677-5706.
  4. Ben-Zion, Y. and Rice, J.R. (1997). Dynamic simulations of slip on a smooth fault in an elastic solid. J. Geophys. Res. 102, 17771-17784.
  5. Ben-Zion, Y.; Sammis, C. and Henyey, T. (1999). Perspectives on the field of physics of earthquakes. Seismol. Res. Lett. 70, 428-431.
  6. Bowman, D.D.; Ouillon, O.; Sammis, C.G.; Sornette, A. and Sornette, D. (1998). An observational test of the critical earthquake concept. J. Geophys. Res. 103, 24359-24372.
  7. Bufe, C.G.; Nishenko, S.P. and Varnes, D.J. (1994). Seismicity trends and potential for large earthquakes in the Alaska-Aleutian region. PAGEOPH 142, 83-99.
  8. Bufe, C.G. and Varnes, D.J. (1993). Predictive modeling of the seismic cycle in the Greater San Francisco Bay area. J. Geophys. Res. 98, 9871-9883.
  9. Cook, R.D. and Weisberg, S. (1982). Residuals and influence in regression. Chapman and Hall, New York.
  10. Eneva, M. and Ben-Zion, Y. (1997). Application of pattern recognition techniques to earthquake catalogs generated by models of segmented fault systems in three-dimensional elastic solids. J. Geophys. Res. 102, 24513-24528.
  11. Gross, S. and Rundle, J. (1998). A systematic test of time-to-failure analysis. Geophys. J. Int. 133, 57-64.
  12. Harte, D. (1998). Dimension estimates of earthquake epicentres and hypocentres. J. Nonlinear Sci. 8, 581-618.
  13. Harte, D. (1998). Documentation for the Statistical Seismology Library. School of Mathematical and Computing Sciences Research Report No. 98-10. Victoria University of Wellington.
  14. Huang, Y.; Saleur, H.; Sammis, C. and Sornette, D. (1998). Precursors, aftershocks, criticality and self-organized criticality. Europhys. Lett. 41, 43-48.
  15. Jaumé, S.C. and Estabrook, C.H. (1992). Accelerating seismic moment release and outer-rise compression: possible precursors to the next great earthquake in the Alaska Peninsula region. Geophysical Research Letters 19(4), 345-348.
  16. Jaumé, S.C. and Sykes, L.R. (1996). Evolution of moderate seismicity in the San Francisco region, 1850 to 1993: seismicity changes related to the occurrence of large and great earthquakes. J. Geophys. Res. 101(B1), 765-789.
  17. Jaumé, S.C. and Sykes, L.R. (1999). Evolving towards a critical point; a review of accelerating moment/energy release prior to large great earthquakes. PAGEOPH 155, 279-305.
  18. Jaumé, S.C. (2000). Changes in earthquake size-frequency distributions underlying accelerating seismic moment/energy release. In (AGU Monograph) Physics of Earthquakes, in press.
  19. Kagan, Y.Y. (1987). Random stress and earthquake statistics: time dependence. Geophys. J. R. Astron. Soc. 88, 723-731.
  20. Kagan, Y.Y. (1991). Seismic moment distribution. Geophys. J. Int. 106, 123-134.
  21. Kagan, Y.Y. (1994). Observational evidence for earthquakes as a nonlinear dynamic process. Physica D 77, 160-192.
  22. Kagan, Y.Y. (1997). Seismic moment-frequency relationship for shallow earthquakes; regional comparisons. J. Geophys. Res. 102, 2835-2852.
  23. Knopoff, L.; Levshina, T.; Keilis-Borok, V.I.; Mattoni, C. (1996). Increased long-range intermediate-magnitude earthquake activity prior to strong earthquakes in California. J. Geophys. Res. 101(B3), 5779-5796.
  24. Lu, C.; Harte, D.S. and Bebbington, M.S. (1999). A linked stress release model for historical earthquakes from Japan. Earth Planets Space 51, 907-916.
  25. Lu, C. and Vere-Jones, D. (2000). Statistical analysis of synthetic earthquake catalogs generated by models with various levels of fault zone disorder. Submitted to J. Geophys. Res.
  26. Ogata, Y. (1992). Detection of precursory relative quiescence before great earthquakes through a statistical model. J. Geophys. Res. 97(B13), 19845-19871.
  27. Pincus, S.M. (1991). Approximate entropy as a measure of complexity. Proc. Nat. Acad. Sci. USA 88, 2297-2301.
  28. Reid, H.F. (1910). The mechanism of the earthquake, In The California Earthquake of April 18, 1906, Report of the State Earthquake Investigation Commission, Vol 2, pp. 16-28, Carnegie Institute of Washington, Washington D.C.
  29. Robinson, R. (1997). A test of the precursory accelerating moment release model on some recent New Zealand earthquakes. Geophys. J. Int. 140, 568-576.
  30. Robinson, R. and Benites, R. (1995). Synthetic seismicity models of multiple interacting faults. J. Geophys. Res. 100, 18229-18238.
  31. Robinson, R. and Benites, R. (1996). Synthetic seismicity models for the Wellington region, New Zealand: Implications for the temporal distribution of large events. J. Geophys. Res. 101, 27833-27844.
  32. Robinson, R.; Benites, R. and Van Dissen, R. (1998). Evidence for temporal clustering of large earthquakes in the Wellington region from computer models of seismicity. Bull. N.Z. Nat. Soc. Earthq. Engin. 31, 24-32.
  33. Rundle, J.B. (1988). A physical model for earthquakes, application to Southern California. J. Geophys. Res. 93, 6255-6274.
  34. Rundle, J.B.; Gross, S.; Klein, W.; Ferguson, C. and Turcotte, D.L. (1997). The statistical mechanics of earthquakes. Tectonophysics 277, 147-164.
  35. Rundle, J.B. and Klein, W. (1995). New ideas about the physics of earthquakes. Rev. Geophys. Space Phys. suppl., 283-286.
  36. Rundle, J.B., Klein, W., Tiampo, K. and Gross, S. (1998). Linear pattern dynamics in nonlinear threshold systems. Phys. Rev. E. 61, 2418-2431.
  37. Rundle, J.B., Klein, W. and Gross, S. (1999). Physical basis for statistical patterns in complex earthquake populations; models, predictions and tests. PAGEOPH 155, 575-607.
  38. Saleur, H.; Sammis, C.G. and Sornette, D. (1996). Discrete scale invariance, complex fractal dimensions, and log-periodic fluctuations in seismicity. J. Geophys. Res. 101, 17661-17677.
  39. Sammis, C.G. and Smith, S.W. (1999). Seismic cycles and the evolution of stress correlation in cellular automaton models of finite fault networks. PAGEOPH 155, 307-334.
  40. Schoenberg, F. and Bolt, B. (2000). Short-term exciting, long-term correcting models for earthquakes. Submitted to Bull. Seismol. Soc. Amer.
  41. Shaw, B.E.; Carlson, J.M.; and Langer, J.S. (1992). Patterns of seismic activity preceding large earthquakes. J. Geophys. Res. 97(B1), 479-488.
  42. Shi, Y.; Liu, J.; Vere-Jones, D.; Zhuang, J. and Ma, L. (1998). Application of mechanical and statistical models to the study of seismicity of synthetic earthquakes and the prediction of natural ones. Acta Seismologica Sinica 11, 421-430.
  43. Sornette, D. and Sammis, C.G. (1995). Complex critical exponents from renormalization group theory of earthquakes: implications for earthquake predictions. J. Phys. I. France 5, 607-619.
  44. Sornette, A. and Sornette, D. (1990). Earthquake rupture as a critical point: consequences for telluric precursors. Tectonophysics 179, 327-334.
  45. Sykes, L.R. and Jaumé, S.C. (1990). Seismic activity on neighbouring faults as a long term precursor to large earthquakes in the San Francisco Bay region. Nature 348, 595-599.
  46. Vere-Jones, D. (1977). Statistical theories of crack propagation. Math. Geol. 9, 455-481.
  47. Vere-Jones, D. and Kagan, Y. (1996). Problems in the modelling and statistical analysis of earthquakes. In: Athens Conference on Applied Probability and Time Series, Vol 1: Applied Probability (Eds: Heyde, C.C.; Prohorov, Yu.V.; Pyke, R. and Rachev, S.T.), 398-425.
  48. Vere-Jones, D.; Robinson, R. and Yang, W. (2000). Remarks on the accelerated moment release model, Research Report 00-5, School of Mathematical and Computing Sciences, Victoria University of Wellington.
  49. Yamashita, T. (1995). Simulation of seismicity due to ruptures on noncoplanar interactive faults. J. Geophys. Res. 100, 8339-8350.
  50. Yang, W.Z., Vere-Jones, D. and Ma, L. (2000). The location of epicenter with critical earthquake concept. Submitted to PAGEOPH.
  51. Zheng, X.-G. and Vere-Jones, D. (1991). Application of stress release models to historical earthquakes from North China. PAGEOPH 135(4), 559-576.
  52. Zheng, X.-G. and Vere-Jones, D. (1994). Further applications of the stochastic stress release model to historical earthquake data. Tectonophysics 229(1/2), 101-121.

Statistics Research Associates Limited, PO Box 12-649, Thorndon, Wellington 6144, New Zealand
phone: +64 4 475 3346; fax: +64 4 475 4206; www: http://www.statsresearch.co.nz