Seminar series in honour of David Vere-Jones

Abstracts

 

4 May 2011: David Harte, Statistics Research Associates

Modelling NZ Seismicity with the ETAS Model

The "Epidemic Type Aftershock Sequence" (ETAS) model is a point process model used in seismology. It models mainshock-aftershock sequences like an epidemic in which an infected individual passes the disease on to others (direct offspring, or first generation), who in turn pass it on further (second and subsequent generations). Under certain conditions, the aftershock sequence, or epidemic, will eventually die out.
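
For readers unfamiliar with the model, a standard temporal form of its conditional intensity is the following (written in LaTeX notation; the spatial version fitted in the talk, and its exact parameterisation, may differ):

    \lambda(t \mid \mathcal{H}_t) = \mu + \sum_{i \,:\, t_i < t} A \, e^{\alpha (M_i - M_0)} \, (t - t_i + c)^{-p}

Here \mu is the background rate, t_i and M_i are the times and magnitudes of past events, M_0 is the catalogue threshold magnitude, and p is the Omori decay exponent governing how quickly aftershock activity dies away. The epidemic is subcritical, so that sequences eventually die out, when the expected number of direct offspring per event is below one.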

We attempt to fit a sequence of spatial ETAS models, from very simple to more complex, to determine the relative importance of the various model components. Each is fitted to a large region that includes all of the most seismically active areas in New Zealand. It is shown that some of the estimated ETAS parameter values, in particular the Omori decay exponent p, depend on the assumed spatial density function of the background process.

We also evaluate the efficacy of the model by examining how well it describes many of the major mainshock-aftershock sequences occurring since 1965 that are contained within the NZ Catalogue.

 

20 July 2011: Eugene Seneta, School of Mathematics and Statistics, University of Sydney

Canberra Times (1965-1968) and After

The talk, illustrated by some photographs, will interweave reminiscences of David Vere-Jones' time in Canberra in the context of our social and collaborative contact, and show influences on my future work.

Touched on in the first part of the talk will be quasi-stationary distributions, non-negative matrices and regularly varying functions, in the setting of the simple branching process.

A second part will be expository, on autocorrelations, spectral densities and real characteristic functions in the light of the Fourier analysis theory of Young, Kolmogorov and Khinchin. It will be illustrated by an application to the theory behind the circulant embedding technique, originating with Davies and Harte (1987), for the simulation of stationary Gaussian processes.
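
As background, the Davies-Harte construction can be sketched as follows in Python, assuming numpy and an autocovariance sequence whose circulant embedding is nonnegative definite; variable names are illustrative only.

    import numpy as np

    def davies_harte(gamma, seed=None):
        """Simulate a zero-mean stationary Gaussian series of length N + 1
        from its autocovariances gamma[0..N] by circulant embedding."""
        rng = np.random.default_rng(seed)
        N = len(gamma) - 1
        M = 2 * N
        # First row of the circulant embedding: gamma_0..gamma_N, gamma_{N-1}..gamma_1.
        c = np.concatenate([gamma, gamma[N - 1:0:-1]])
        lam = np.fft.fft(c).real  # circulant eigenvalues (real by symmetry)
        if lam.min() < -1e-10 * np.abs(lam).max():
            raise ValueError("circulant embedding is not nonnegative definite")
        lam = np.clip(lam, 0.0, None)
        # Complex Gaussian weights; the real part of the FFT then has covariance gamma.
        Z = rng.standard_normal(M) + 1j * rng.standard_normal(M)
        X = np.fft.fft(np.sqrt(lam / M) * Z)
        return X.real[:N + 1]

    # Example: an AR(1)-type autocovariance gamma_k = 0.7 ** k.
    x = davies_harte(0.7 ** np.arange(101), seed=1)

The appeal of the method is that the FFT makes the cost O(N log N), and the sample has the exact target covariance whenever the embedding's eigenvalues are nonnegative.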

 

27 July 2011: Jiancang Zhuang, Institute of Statistical Mathematics, Tokyo

Testing significance of earthquake precursors: Against complete randomness? Or against earthquake clustering models?

During the last 30 years, a number of algorithms have been developed that search for seismic anomalies in order to predict strong earthquakes. Examples of such algorithms are M8, RTP, RTL, PI, and LURR. In these algorithms, behaviour is usually deemed anomalous if seismicity rates change from the background activity, which is usually assumed to be stationary or, more strongly, a Poisson process. On the other hand, it is well known that seismicity is clustered and can be well described by the Epidemic-Type Aftershock Sequence (ETAS) model. Anomalies detected in these conventional ways may therefore be caused by earthquake clusters or by improper declustering algorithms. This presentation illustrates, through examples, how to modify these algorithms so that seismic anomalies are tested against a clustering model (e.g., the ETAS model) rather than against complete randomness.
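
The general recipe for such a test can be sketched in minimal Python form. Here stat_fn and simulate_catalog are hypothetical placeholders for a chosen anomaly statistic (an RTL score, say) and a simulator for the fitted null model; neither name comes from the talk.

    import numpy as np

    def monte_carlo_pvalue(stat_fn, simulate_catalog, observed_catalog,
                           n_sims=999, seed=None):
        """Monte Carlo significance test of an anomaly statistic against
        catalogues simulated from a null model (e.g. a fitted ETAS model
        rather than a Poisson process)."""
        rng = np.random.default_rng(seed)
        observed = stat_fn(observed_catalog)
        null_stats = [stat_fn(simulate_catalog(rng)) for _ in range(n_sims)]
        # One-sided p-value; the "+1" terms count the observed catalogue itself.
        return (1 + sum(s >= observed for s in null_stats)) / (n_sims + 1)

Swapping a Poisson simulator for an ETAS simulator changes only the null distribution, which is the kind of modification the talk discusses.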

 

7 September 2011: Peter Smith, Department of Electrical and Computer Engineering, University of Canterbury

Random matrices, quadratic forms and networks of antenna arrays in wireless communications

In the 1990s, Hongsheng Gao and Peter Smith from ISOR, together with Martin Clark from AT&T, solved a problem in wireless communications involving quadratic forms in normal random variables. We had not worked further in the area since then, as there were no pressing reasons to consider extensions of the problem. Over the last few years, however, developments in wireless communications have led to the possibility of networking several base stations together into a distributed array of antennas. This leads to fundamental problems in the theory of random matrices that are even more difficult. In this talk I will summarize our work in this area. I will also look back at the research environment at ISOR that made all of this possible.
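
To indicate the flavour of the underlying mathematics (a standard reduction, not necessarily the formulation used in the talk): if \mathbf{x} \sim \mathcal{CN}(\mathbf{0}, \mathbf{I}_n) and \mathbf{A} is Hermitian with eigenvalues \lambda_1, \dots, \lambda_n, then

    Q = \mathbf{x}^{H} \mathbf{A} \mathbf{x} \;\overset{d}{=}\; \sum_{k=1}^{n} \lambda_k \, |u_k|^2, \qquad u_k \ \text{i.i.d.}\ \mathcal{CN}(0, 1),

so Q is a weighted sum of independent chi-squared variables, and performance measures such as outage probabilities reduce to evaluating its distribution function.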

 

12 October 2011: Len Cook (Former Government Statistician)

Putting a value on statistical methods and practice

Statistical offices face many contradictory pressures, of which the tension between protecting confidentiality and releasing highly detailed statistical records is but one. The presentation will outline some of these tensions, and indicate where methodology has been critical to how they are managed. Information based on anecdote, celebrity opinion and bad experiences seems to travel at high velocity in the public domain, often crowding out knowledge that is well founded. We are at risk that politicians come to see policy by slogan as a justified consequence. Some examples and observations will be offered on raising the place of scientists in public life and on reducing impediments that may have emerged.

 

26 October 2011: Yosihiko Ogata, Institute of Statistical Mathematics, Tokyo

The M9 Tohoku-Oki earthquake and statistical seismology

The M9 Tohoku-Oki earthquake ruptured a much wider area than expected, the expected rupture zones having been based largely on reliable historical data since the late 18th century. This event has provided us with serious opportunities to study not only earthquake generation scenarios, tsunami disaster mitigation and nuclear plant policy, but also operational earthquake forecasting. Here I would like to talk about long-term, intermediate-term and short-term operational forecasting, with reference to this giant earthquake.

First, the Earthquake Research Committee has published long-term probability forecasts of large earthquakes on inland active faults and in interplate seismic zones, including the Tohoku-Oki region. I will review the forecasting model and estimation procedure, and suggest statistical improvements.

Second, I have been monitoring the aftershock activity of the M9 earthquake, and have also studied some inland activity that was triggered by it. I will show how these sequences were normal or anomalous compared with the seismicity predicted by the ETAS model.

Third, I studied long-term seismic activity (magnitude 5 or larger) in and around Japan up until the M9 earthquake. Using the hierarchical space-time ETAS model, the data were declustered to measure the background seismicity. In most parts of Japan, background seismicity during the last decade was lower than the 1926-1999 trend. This quiescence can be interpreted as a response to aseismic slip in the southern down-dip portion of the ruptured zone.
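
For background on the declustering step (a standard stochastic-declustering construction; the talk's exact procedure may differ): under a fitted space-time ETAS model, event i is treated as a background event with probability

    \varphi_i = \frac{\mu(x_i, y_i)}{\lambda(t_i, x_i, y_i \mid \mathcal{H}_{t_i})},

the ratio of the background intensity to the total conditional intensity at that event; thinning the catalogue with these probabilities yields an estimate of the background seismicity.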

Finally, the M9 earthquake was preceded by conspicuous foreshocks, and the foreshock series was abnormal in several respects. I will therefore experiment with the foreshock probability forecast based on my 1996 paper with Utsu and Katsura, and evaluate the results.

 

21 December 2011: Sharleen Forbes, Statistics New Zealand and Victoria University

New directions in statistics education - the impact of technology

New advances in computer technology and telecommunications affect many areas of daily life. For users and teachers of statistics this includes how, and what, we teach. Changes in schooling are reflected in the new, and world-leading, New Zealand Mathematics and Statistics curriculum introduced in 2007. David was a major influence on, and a leader of, the development of statistics in our school curriculum, which defines statistics as the exploration and use of patterns and relationships in data. Two new components of this curriculum are informal inference (and the use of randomisation within this) and bootstrapping. These will be taught in classrooms in 2012 using the New Zealand CensusAtSchool database. Both the international and New Zealand CensusAtSchool projects resulted from the first CensusAtSchool, run by the New Zealand Statistical Association in 2000 for ICOTS3.
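
As a small illustration of what such classroom computation can look like, here is a minimal percentile-bootstrap sketch in Python (illustrative only, and not taken from the curriculum materials):

    import numpy as np

    def bootstrap_ci(sample, stat=np.median, n_boot=2000, level=0.95, seed=None):
        """Percentile bootstrap: resample the data with replacement many
        times and read a confidence interval off the quantiles of the
        recomputed statistic."""
        rng = np.random.default_rng(seed)
        sample = np.asarray(sample)
        boots = [stat(rng.choice(sample, size=sample.size, replace=True))
                 for _ in range(n_boot)]
        return tuple(np.quantile(boots, [(1 - level) / 2, (1 + level) / 2]))

    # Example: an interval for the median of a small data set.
    print(bootstrap_ci([2.1, 3.4, 2.8, 5.0, 3.9, 4.2, 3.1], seed=1))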

At the tertiary level, the availability of new, large longitudinal, merged and geo-coded data sets has influenced teaching, as has data visualisation. New technologies affect not only what we teach but also how we teach it, and we can now present visually many of the concepts underlying statistics and official statistics. We are now free of ‘the tyranny of the computable’ (Cobb, 2007) and are able to focus on developing underlying concepts, such as variability and probability. In many applied statistics courses the emphasis has become developing statistical literacy rather than an understanding of the mathematics underlying statistics. New technologies also enable new teaching collaborations and modes of course delivery. An example is the new cross-university honours course in official statistics.

Technology has influenced what we teach, with a growing emphasis on the analysis of longitudinal, geo-coded, administrative and integrated data sets, and raises some fundamental questions for teachers of statistics.

 

18 January 2012: Mark Bebbington, Massey University

The Development of Statistical Volcanology: A Personal Experience

The title is, of course, a tribute to David's 2006 paper on statistical seismology (Tectonophysics 413, 5-12). I will discuss the scientific reasons for my leaving the green pastures of seismology for the blackened lava flows of volcanology. A number of problems in the understanding and forecasting of volcanic eruptions will be outlined, along with solutions using stochastic models for eruption occurrence. The latter will include trend-renewal processes, hidden Markov models, self-exciting point processes, and proportional hazards models.

 

1 February 2012: Daryl Daley, Department of Mathematics and Statistics, University of Melbourne

Regeneration and Markovianity

David Kendall, doctoral supervisor of both David V-J and myself, stressed in his 1951 paper “Some problems in the theory of queues” the work of Connie Palm and the role of regeneration and regeneration points. My early work in Cambridge in 1963–64 was influenced indirectly by David V-J’s solidarity result for the n-step transition probabilities p_{ij}^{(n)} of irreducible Markov chains, namely that the power series \sum_n p_{ij}^{(n)} z^n, necessarily convergent for |z| < 1, have a common radius of convergence for all pairs (i, j). At that time John Kingman’s work on continuous-time transition probability functions had been given its initial airing via the probability functions p(·) of regenerative phenomena in continuous time (Kingman’s “p-functions”), and JFCK in turn suggested I look at whether renewal density functions shared certain probabilistic properties of p-functions. (In general they do not.) More recently, I have wondered what properties pertain to point processes whose sample paths {t_i} contain an embedded renewal process. This is in a sense a stochastic analogue of a sequence containing a skeleton with a more structured property, and asking how the latter may lead to establishing more general assertions about the point process itself.
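
For readers unfamiliar with the solidarity result, it can be restated briefly: for an irreducible chain the power series

    P_{ij}(z) = \sum_{n \ge 1} p_{ij}^{(n)} z^n

share one common radius of convergence R, with 1 \le R < \infty, whatever the pair (i, j); this common R is the convergence parameter central to David V-J's work on geometric ergodicity.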

Clément Dombry has some notes on ‘renewing random measures’ which may be pertinent.

 

15 February 2012: Geoff Eagleson, Australian School of Business in the University of New South Wales and Management Consultant

How can the interactions between Statistics and Business be improved? Reflections on a journey from Pure Mathematics to management consultancy

I shall make some suggestions, arising out of my personal journey from a graduate in Pure Mathematics to a management consultant, for improving the way that statisticians interact with and support businesses.

I shall note how statistics has been used in business in the past and suggest ways in which its application could be improved in the future. I shall use my experience teaching Data Analysis to MBA students to argue for a more targeted approach to the teaching of statistics, taking into account the needs of managers and their cognitive styles. Finally, I shall discuss what is required from a statistician to contribute to strategic decision-making.

The talk is based on my personal experiences that are limited in scope and my personal beliefs that are limited by my imagination. Caveat audiens!

 

11 April 2012: Shirley Pledger, Professor, School of Mathematics, Statistics and Operations Research, Victoria University of Wellington

Capture-recapture, finite mixtures, correspondence analysis and working for David Vere-Jones

In this seminar I will give a background to my late-developing research career, then range over some topics which have interested and excited me over the last 15 years. Included will be the good and bad points of working (i) for DVJ, and (ii) with biologists.

A few likelihood equations may find their way into the talk.