Estimation of entropies and other functionals: Statistics meets information theory

   
Background

The concept of entropy has played a central role in information theory ever since it was introduced in Shannon's seminal 1948 paper. It turns out that many common statistical distributions maximise entropy subject to different constraints. For instance, among densities on the real line with a fixed variance, the normal density maximises the entropy, while the uniform density achieves the maximal entropy among distributions supported on the unit interval. These beautiful properties have led to many important statistical applications of entropy estimation, e.g. in goodness-of-fit testing, independent component analysis, image registration and many other areas. Moreover, the entropy functional has many close cousins, including the family of Rényi entropies, the relative entropy (or Kullback–Leibler divergence) and mutual information, all of which have applications across a diverse range of scientific fields. For example, empirical estimation of mutual information is a commonly used primitive for machine learning applications such as fitting graphical models, as well as for the analysis of spike trains in neuroscience. Thus it is sometimes convenient, from a statistical point of view, to regard entropy as a special case of a wider class of nonlinear functionals.
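
To fix notation (the symbols below are the standard ones from the literature rather than anything taken from the workshop materials), the differential entropy of a density f on R^d, the Kullback–Leibler divergence between densities f and g, and the mutual information of a pair (X, Y) with joint density f_{X,Y} and marginal densities f_X and f_Y may be written as

    H(f) = -\int_{\mathbb{R}^d} f(x) \log f(x) \, dx, \qquad
    D(f \,\|\, g) = \int_{\mathbb{R}^d} f(x) \log \frac{f(x)}{g(x)} \, dx, \qquad
    I(X; Y) = D(f_{X,Y} \,\|\, f_X f_Y).

In this notation, the maximal entropy among densities on the real line with variance \sigma^2 is \frac{1}{2} \log(2\pi e \sigma^2), attained by the normal density, while the maximal entropy among densities supported on the unit interval is 0, attained by the uniform density.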

Many different estimators of entropy have been proposed in the literature over the last 30 years or so, including methods based on sample spacings, histograms, kernel density estimates and nearest neighbours. However, this area has seen a surge of activity in the last couple of years, as researchers have begun to understand the conditions under which one can hope to achieve minimax optimal rates of estimation, or even efficient estimation. These developments are less widely known than they should be, and further research is required to provide a reliable set of methods for practitioners.
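
As a purely illustrative aside (not part of the workshop materials), the following is a minimal sketch of one of the nearest-neighbour methods mentioned above, the unweighted Kozachenko–Leonenko estimator, in one common parametrisation; the function name kl_entropy and the default choice k = 1 are our own.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def kl_entropy(x, k=1):
        """Kozachenko-Leonenko k-nearest-neighbour estimate of the
        differential entropy (in nats) of the density generating the
        (n, d) sample array x; the sample points are assumed distinct."""
        n, d = x.shape
        # Distance from each point to its k-th nearest neighbour among the
        # other points (k + 1 below because the query returns the point itself).
        rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
        # Log-volume of the unit Euclidean ball in R^d.
        log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1)
        return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))

On a large sample from the standard normal distribution in one dimension, for example, the output should be close to the true entropy (1/2) log(2 pi e), approximately 1.42 nats.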


Workshop Description

The aim of this workshop is to bring together researchers from a range of backgrounds to present the latest work on the estimation of entropy and other functionals. Recent progress in the area has been made by researchers in a variety of fields, and we aim to foster a sense of community and collaboration across different disciplines.

All talks will be held in the Centre for Mathematical Sciences in Cambridge, UK, and will take place September 9-11, 2019.


Funding

We are grateful to the Peter Whittle Fund for support with this workshop.



Organisers

  • Tom Berrett, Statistical Laboratory, University of Cambridge
  • Nikolai Leonenko, School of Mathematics, Cardiff University
  • Richard Lockhart, Department of Statistics and Actuarial Science, Simon Fraser University
  • Richard Samworth, Statistical Laboratory, University of Cambridge
  • Yihong Wu, Department of Statistics and Data Science, Yale University

Confirmed Speakers


Registration

    Speakers and other participants should register for the workshop here.


Programme

    Talks will begin on the Monday morning and will finish before lunch on the Wednesday. A draft programme is available here, together with the abstracts for the talks. There will be a workshop dinner in St John's College on the Monday evening for invited speakers and organisers.


Accommodation

    Invited speakers will be staying at Murray Edwards College, which is a five-minute walk from the Centre for Mathematical Sciences. Please note that accommodation for invited speakers will be booked by the organisers. We have had reports of phone calls from somebody claiming to be organising accommodation for the workshop. These calls are not genuine and should be ignored. Other participants may search for accommodation here.


Practical Information

Useful information on how to get to the Centre for Mathematical Sciences can be found here, and information on how to get to Cambridge can be found here.