Our successful application for Marsden funding
Principal Investigator(s):
Associate Investigator(s):
The period prior to a large earthquake is believed to be characterised by increasing levels of regional stress, which in turn influence observed patterns of earthquake occurrence in time, space and magnitude. This proposal seeks to develop statistical models for these changes, and to assess them on observational data for earthquakes in the Circum-Pacific Region. The proposal links earlier work of the statisticians in the group with recent work of seismologists in New Zealand and California. A proper development of the statistical ideas involved in modelling and fitting such data should represent a major advance in this field.
Recent discussions of earthquake genesis have stressed the analogy with the approach to a phase transition; see, for example, [44], [41], [21], [31], [38], [39], [17] and [5]. Typical features are power-law distributions for frequencies and energies, long-range correlations in spatial occurrence patterns, and sensitivity to external stimuli. There is, however, a substantial gap between generalised physical models of this kind and a model that can be applied to catalogue data from a particular fault system. Such a model has to take into account both the physical-mechanical environment of the fault system and the irregular (statistical) departures in the data from the generalised relationships suggested by the model. The controlling parameters of the statistical models should reflect the tectonic character and large-scale geometry of the geophysical environment, and the changing stages of the physical cycle. These features have to be distinguished from random (i.e. unexplained) variability due to physical effects (small-scale variations in rock strength, distribution of pre-existing cracks and other weaknesses, variations in stress input and stress transfer processes at depth, etc.) which are not accessible to direct observation. The statistical model postulates distributions for these effects and allows appropriate estimation procedures to be developed for the parameters of physical importance. Both aspects need to be included if the modelling exercise is to be useful, but existing models tend to fail on one count or the other, or even on both.
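(For concreteness, a standard instance of the power-law behaviour referred to here is the Gutenberg-Richter frequency-magnitude relation, log10 N(≥M) = a - bM, which is equivalent to a power-law (Pareto) distribution of seismic moment or energy; this is general seismological background rather than a result of the present proposal.)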
So far, one of the most successful ideas heralding the approach to a critical state is the accelerating moment release (AMR) pattern, in which, prior to a large event, one observes an accelerating increase in seismic moment in an extended region around the earthquake rupture zone. Studies of the Californian and Aleutian Arc regions in particular (see, among other papers, [45], [8], [7], [15], [23], [11], [16]) have reported this effect, and have promoted the view that convergence towards some kind of critical state is a common feature in regions about to suffer a major earthquake. Recent work [17] has investigated the form of this increase, finding a departure from the power-law magnitude distribution in the later part of the acceleration, perhaps representable by a Kagan distribution [22].
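By way of a purely illustrative sketch (the exact functional form and data treatment vary between the studies cited above), AMR is commonly quantified through cumulative Benioff strain or moment release, modelled by a time-to-failure relation of the form C(t) = A - B(t_f - t)^m with 0 < m < 1, where t_f is the predicted failure time. The Python fragment below fits such a curve to hypothetical synthetic data by non-linear least squares; all names and numbers are illustrative and are not taken from this proposal or its references.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch only: fit a Bufe-Varnes style time-to-failure curve
#   C(t) = A - B * (t_f - t)**m ,  0 < m < 1,
# to synthetic "cumulative Benioff strain" data.  All values are hypothetical.

def amr_curve(t, A, B, t_f, m):
    # clip guards against negative bases when t exceeds the trial failure time
    return A - B * np.clip(t_f - t, 1e-6, None) ** m

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 10.0, 200))             # event times (years)
true = dict(A=100.0, B=30.0, t_f=10.5, m=0.3)
strain = amr_curve(t, **true) + rng.normal(0.0, 0.5, t.size)

p0 = [strain[-1], 10.0, 11.0, 0.5]                    # rough starting values
params, _ = curve_fit(amr_curve, t, strain, p0=p0, maxfev=20000)
print("fitted (A, B, t_f, m):", params)
```

Fits of this kind are known to be sensitive to the choice of region, the trial failure time and the largest events, which is part of the motivation for the more careful statistical treatment discussed later in the proposal.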
The germ of the present proposal was the realisation that the statistical models developed in earlier work by members of the research team, especially the branching process model (see e.g. [46], [19], [20], [2], [47]), and the stress release model (SRM) (see [50], [51], [42], [24]) embody ideas relevant to the discussion of the approach to criticality. In particular, it seems that the earlier work could be used to suggest better methods of handling the statistical aspects of existing AMR models, and to improve and refine the models by embedding them into longer time scales. Some preliminary ideas are set out in [29], [48], [50], and the present proposal would build on these.
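For orientation (the notation here is generic rather than an exact reproduction of the cited papers), the simplest stress release model specifies a point-process conditional intensity of the form λ(t) = exp{a + b[X(0) + ρt - S(t)]}, where ρ is the rate of tectonic stress input and S(t) is the total stress released by events before time t, so that the hazard builds steadily between events and drops after each one, in the spirit of Reid's elastic rebound theory.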
Further papers [1], [30], [31], [32] by members of the group have developed physical and computer models of major fault systems, particularly around New Zealand. Additional such models (see, for example, [33], [3], [4], [49]) have also appeared in the literature. Such numerical models allow long histories of events to be generated, which is vital to the systematic investigation of seismicity [35], but it can be hard to relate their output to earthquake catalogue data (see [36], [10]). Stochastic models may help to overcome this difficulty, by better discriminating between features of the real and synthetic catalogues, as in [25]. Moreover, the insights gained from such comparisons may allow stochastic models to be matched to the physical systems and used for hazard forecasting and other purposes (see, for example, [24]).
The research team includes one senior researcher from each major field (Vere-Jones, Robinson) and three younger staff (Bebbington, Harte, Kozuch). Its members run a regular weekly seminar and, in April 1999 and April 2000, organised Workshops on Non-Linear Aspects of Fracture with assistance from an earlier Marsden project (96-VUW-MIS-0031). They also work together on the earthquake forecasting component of the IGNS Natural Hazards Programme. Through the Workshops, contacts have been established with some of the seismologists in California currently studying the AMR model.
The overall aim of this project is to develop improved models for the characteristics heralding the approach to a critical state before a major earthquake, and to embed these into longer-term models for the seismic cycle on major fault systems. Research in a previous project has provided a number of tractable models usable as tools to study and compare earthquake sequences. The next step requires merging the stochastic models with mechanical models of the fault system. This requires a synthesis of expertise from the fields of stochastic modelling and seismology.
The proposed research can be separated into two parts:
The first part comprises a critical examination of existing ideas and models for the approach to criticality in complex systems, and a linkage of these ideas to earlier work of the group on phase change, branching process and stress release models. The goal here is to develop testable statistical models that can be checked against data from real and synthetic catalogues. In this vein, the AMR model will be the major focus of attention, since it is supported by empirical studies and appears to offer the greatest scope for further development. Better statistical model formulation should lead to better methods for estimating parameters, and for testing the ideas on which the models are based. Moreover, the existing estimation techniques for the stochastic point process models to be used here themselves require further development, particularly in relation to sensitivity to individual data points (so-called leverage problems) and to estimates of precision.
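To make the estimation step concrete, the following is a minimal sketch, under stated assumptions, of maximum-likelihood fitting for a point-process model of the simple stress release form described earlier; the log-likelihood is log L = Σ_i log λ(t_i) - ∫_0^T λ(t) dt. The event times, stress drops, starting values and helper names are hypothetical, and the numerical integration is deliberately crude; this is not code from the project.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal illustrative sketch: maximum-likelihood fitting of a simple stress
# release model with conditional intensity
#   lambda(t) = exp(a + b * (rho * t - S(t))),
# where S(t) is the total stress released by events before time t.
# Event times, sizes and starting values below are hypothetical.

def intensity(t, times, sizes, a, b, rho):
    released = np.sum(sizes[times < t])           # stress released before t
    return np.exp(a + b * (rho * t - released))

def neg_log_likelihood(params, times, sizes, T, n_grid=2000):
    a, b, rho = params
    log_lam = np.log([intensity(t, times, sizes, a, b, rho) for t in times])
    grid = np.linspace(0.0, T, n_grid)
    lam_grid = np.array([intensity(t, times, sizes, a, b, rho) for t in grid])
    integral = np.sum(lam_grid) * (T / n_grid)    # crude numerical compensator
    return integral - np.sum(log_lam)

# toy catalogue: 30 events over 100 years with unit-scale stress drops
rng = np.random.default_rng(1)
times = np.sort(rng.uniform(0.0, 100.0, 30))
sizes = rng.exponential(1.0, 30)
T = 100.0

fit = minimize(neg_log_likelihood, x0=[-1.0, 0.1, 0.3],
               args=(times, sizes, T), method="Nelder-Mead")
print("estimated (a, b, rho):", fit.x)
```

Deleting individual events and refitting in this way gives a crude handle on the leverage problems referred to above; more refined diagnostics and precision estimates are among the developments in view.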
It is also important to examine the use of the models on longer time scales, covering several seismic cycles, to see how well they match historical and geological data, and whether they can shed light on such important issues as whether natural fault systems really show long-term steady-state behaviour. This is the theme proposed for the second part. To this end, some kind of synthesis between statistical models, which can be fitted to observed data, and physical models, utilising known geometrical and mechanical features, will be needed. This will also enable the consequences of particular physical or statistical assumptions embodied in these models to be studied in detail. To achieve this aim, the group hopes to link its earlier work on stochastic models to earlier and current work by Robinson and Benites on major fault structures in NZ. Synthetic catalogues derived from such models may help to indicate appropriate parameters for the stochastic models and so compensate for the paucity of historical data. The group's overseas links will be used to check these ideas against other fault systems in the Circum-Pacific Region. Besides these synthetic catalogues, other sources of data that may be examined include data from laboratory fracture experiments and from observations in mines, which preserve some, but not all, of the tectonic properties of earthquakes. In the long run, the aim is to combine features from both parts to better understand and forecast behaviour on major fault systems, with practical implications for earthquake forecasting and hazard reduction.
The similarities frequently noted between earthquakes (or rock fracture) and phase changes [5], [14], [31], [37] suggest a range of characteristics that might herald the approach to a critical state before a major fracture: convergence of frequency and moment distributions towards power-law forms, long-range correlations, and sensitivity to external stimuli.
Our proposal is to develop testable statistical models and check them against data from real and synthetic catalogues.
A particular focus of attention would be the accelerating moment release (AMR) model, for which some preliminary ideas were developed in our previous project [29], [48], and links were established with scientists in California who have been involved in developing the model. The following are the main steps proposed.
This work will require the closest collaboration between the geophysical and statistical members of the research team. In earlier and current studies [30], [31], [32], Robinson and co-workers have developed computer models for the occurrence of large events on major faults in the Wellington region, New Zealand. The underlying question posed here is whether it is possible to link studies of this type to statistical models such as the SRM and AMR in such a way that the parameters in the latter models are adapted to specific characteristics of the fault system.
The first approach we propose to bring to this problem is to combine the experience gained in fitting models to limited historical data on large regional earthquakes (see, e.g., [24]) with the results of fitting the AMR models directly to current catalogue data. Our aim is to develop a hybrid technique for estimating parameter combinations which are broadly tailored to the fault system under consideration and also fit current data. Introducing the Kagan distribution for magnitudes into the SRM models should permit the development of models in which the size distribution is affected by stress build-up and release during the seismic cycle (i.e., Reid's elastic rebound theory, [28]). The linked SRM model developed earlier allows for control over the parameters governing stress input and stress transfer between adjacent regions [24]. Knowledge of the mechanical properties of these links could be used to constrain the relevant model parameters severely, and the constrained model could then be fitted to current data. The resulting model could be assessed, at least qualitatively, by comparing long-run features of its output to known features of the geological history (see, for example, [25]).
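As background to this step (the exact parameterisation to be used is an open question of the proposal, not something fixed here), the Kagan distribution is the tapered Gutenberg-Richter (tapered Pareto) law for seismic moment, with survivor function Pr(moment > m) = (m_t / m)^β exp((m_t - m)/m_c) for m ≥ m_t, where m_t is a threshold moment, β the power-law index and m_c a corner moment controlling the taper. One natural way of coupling the size distribution to the seismic cycle, offered here only as an illustration, would be to let the corner moment m_c depend on the current modelled stress level X(t) in the SRM.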
The shortage of relevant data again appears as a major difficulty in directly investigating the behaviour of real fault systems. It is at this point that data from computer models for such systems can play an important role. Despite their complexity, such computer models have the advantage that statistical features observed in their output can be more easily related to features of the model than is the case with the real-earth environment. The question then is the extent to which statistical models, which capture the broad features of the evolution of complex systems but replace detailed knowledge of their small-scale behaviour with appropriate statistical assumptions, can be fitted to such synthetic catalogues and can predict their future behaviour. A related use of such stochastic models is to highlight possible discrepancies between real and synthetic catalogues. If the stochastic models can be tuned to fit the output from such synthetic catalogues, there is a greater chance that they will successfully capture the key features characterising real data sequences.
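One purely illustrative way of carrying out such comparisons is to simulate long synthetic histories from a fitted stochastic model and compare their summary statistics with those of the physically generated catalogues. The sketch below simulates the simple stress release intensity described earlier by a thinning method; the parameter values and function names are hypothetical and do not come from the proposal.

```python
import numpy as np

# Illustrative sketch: simulate a long synthetic history from a simple
# stress release intensity  lambda(t) = exp(a + b*(rho*t - S(t)))  by
# thinning.  Between events the intensity is non-decreasing, so its value
# at the end of a short window bounds it over that window.  All parameter
# values here are hypothetical.

def simulate_srm(a, b, rho, T, mean_drop=1.0, window=1.0, seed=2):
    rng = np.random.default_rng(seed)
    t, released, events = 0.0, 0.0, []
    while t < T:
        lam_max = np.exp(a + b * (rho * (t + window) - released))
        wait = rng.exponential(1.0 / lam_max)
        if wait > window:
            t += window                      # no candidate in this window
            continue
        t += wait
        if t >= T:
            break
        lam_t = np.exp(a + b * (rho * t - released))
        if rng.uniform() < lam_t / lam_max:  # thinning (acceptance) step
            drop = rng.exponential(mean_drop)
            events.append((t, drop))
            released += drop
    return events

catalogue = simulate_srm(a=-1.0, b=0.5, rho=0.5, T=500.0)
print(len(catalogue), "events; first five:", catalogue[:5])
```

Summary statistics of such simulated catalogues (inter-event time distributions, magnitude-frequency relations, clustering measures) can then be set against those of the physical-model output.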
In addition to earthquake catalogues, related fracture data can be obtained from sources such as laboratory experiments and mine acoustics, which preserve some, but not all, features of earthquake data.
Using the above spectrum of data, our approach is to use the statistical models developed in the other section of the proposal to see whether spatio-temporal-magnitude patterns emerge in the data and, if so, under what conditions they are stable features of the process. Further studies would also be made of the statistical properties of the seismic cycles produced by the computer model(s), and of the ability of the extended SRM/AMR models (to be developed under §3.1) to model such data.
The eventual aim is to gain insight into the extent to which different models can mimic the behaviour of specific fault systems, and hence to forecast their future behaviour.
Each subsection of §3.1 is likely to require the attention of at least one of the three statistical members of the team, and could provide the basis of a Master's thesis. Judging from our previous experience, the assistance of either a graduate student or research assistant is likely to be crucial to completing the computational and simulation work within the stipulated time frame. The work outlined in §3.2 will require the participation of all members of the team. The travel grants are requested to allow the group to participate in both statistical and seismological conferences, and to invite at least one person to New Zealand each year to maintain our existing international links. We anticipate that the international fracture model workshops, initiated in the previous grant, will be continued if the present application is successful, in which case the visitor funding may be used in part to support these.