Probability Workshop 2018

Title: Introduction to Monte Carlo Methods 
Speaker: Aaron Smith, University of Ottawa

ABSTRACT: Monte Carlo (MC) methods, the practice of approximately simulating from a probability distribution in order to understand it, are popular throughout statistics and the sciences. Although there are widely used software packages for many popular MC algorithms, there are no "black box" methods that work well for all realistic problems. Perhaps even worse, many of the most popular MC algorithms do not have quantitative convergence guarantees. That is, although they are generally guaranteed to converge to the truth eventually, they do not guarantee that their outputs will be close to the truth after any specific, reasonable amount of computing time.
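As a minimal illustration of the basic idea (not part of the workshop materials), the following R snippet approximates a tail probability by simulation and compares it to the exact value; the target distribution and sample size are arbitrary choices for this sketch:

```r
# Plain Monte Carlo: approximate a quantity about a distribution
# by averaging over simulated draws from it.
# Here we estimate P(X > 2) for X ~ N(0, 1).
set.seed(1)
draws <- rnorm(1e5)          # 100,000 independent draws
est <- mean(draws > 2)       # fraction of draws in the tail
exact <- 1 - pnorm(2)        # exact tail probability, about 0.0228
```

With this many independent draws, the standard error of `est` is roughly 0.0005, so the estimate is reliably close to `exact`; the harder problems discussed in this workshop arise when independent draws like `rnorm(1e5)` are not available.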


This workshop will give an introduction to various Monte Carlo methods, with a bias towards problems and algorithms that may be of interest to probabilists and mathematical statisticians. With this in mind, we will skim quickly past the details of some of the major software packages (e.g. Stan) and concentrate on specific examples that illustrate important techniques. There will be a strong emphasis on Markov chain Monte Carlo (MCMC), although other methods will be mentioned.


REQUIREMENTS: This workshop assumes a basic understanding of probability and linear algebra, but does not assume any background in Monte Carlo methods. All sections of the workshop will include a small amount of programming. For this reason, please bring a laptop with the R programming environment (or a statistical programming environment with similar flexibility) installed. The provided code samples will be in R, so I suggest that attendees take a short time to learn the very basics of R before the workshop.


OVERVIEW:

  1. Introduction to Monte Carlo methods: implementation, diagnostics, and comparison of rejection sampling, Metropolis-Hastings sampling, and other popular methods.
  2. Special examples: An introduction to a few of the examples that have been important to the development of the field, and some of the techniques that have been used to study and solve them. 
  3. Hard problems: An introduction to a few obstacles that prevent the use of standard MC methods, and a discussion of some tricks that are used to get around them. This may include MC for intractable likelihoods, "big data," and complicated rare events.
  4. Easier-than-expected problems and perfect sampling: An introduction to "perfect sampling" algorithms. These algorithms use some interesting probabilistic ideas to get MC methods that do have convergence guarantees, even for problems where this might be surprising. This may include the Ising model from physics and various combinatorial counting problems in computer science.
  5. Unknown difficulty and open problems: Many MC methods are still poorly understood. We will mention some promising MC methods whose theoretical underpinnings remain weak. Topics subject to change at the last minute!
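To give a flavour of the MCMC methods emphasized above, here is a minimal random-walk Metropolis-Hastings sampler in R; the target (a standard normal) and the proposal scale are arbitrary choices for this sketch, and real applications would of course involve targets we cannot sample directly:

```r
# Random-walk Metropolis-Hastings targeting the standard normal.
# Each step proposes a move from the current state and accepts it
# with probability min(1, target(proposal) / target(current)).
set.seed(42)
n_iter <- 5000
x <- numeric(n_iter)
x[1] <- 0
log_target <- function(z) -z^2 / 2   # log-density, up to a constant

for (i in 2:n_iter) {
  prop <- x[i - 1] + rnorm(1)        # N(current, 1) proposal
  # Accept/reject on the log scale to avoid underflow
  if (log(runif(1)) < log_target(prop) - log_target(x[i - 1])) {
    x[i] <- prop
  } else {
    x[i] <- x[i - 1]                 # rejected: stay put
  }
}

mean(x)   # should be near 0 for a long enough chain
```

Note that only ratios of the target density appear, so the normalizing constant is never needed; this is exactly what makes Metropolis-Hastings usable for the intractable distributions discussed in the workshop.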