## Tutoring and supervising

At Birmingham, I served as a personal tutor for second-year students. This covered a range of subjects from classical mechanics through electromagnetism to quantum mechanics and condensed matter. Tutors are also responsible for teaching various key skills, such as essay writing. To brush up on your writing, a handy reference has been compiled by Trevor Ponman, and international students might be interested in English language support provided by the University. I also co-supervised fourth-year projects for MSci Physics and Astrophysics students, and third-year group studies projects.

At Cambridge, I was a supervisor for second-year Physics A. This included topics from oscillations, waves and optics, through quantum mechanics to condensed matter, with an extra module in experimental methods. I also founded the Churchill Undergraduate Physics Seminars, which gave students a chance to explore topics outside the curriculum and practise their communication skills.

**DATA SCI-401**

Over the Winter and Spring Quarters 2020, I delivered DATA SCI-401: Data-Driven Research in Physics, Geophysics, and Astronomy together with Suzan van der Lee and Adam Miller. The course covers the scientific motivations and data challenges associated with three major projects in astronomy, astrophysics and geophysics: the Vera C. Rubin Observatory, LIGO (my section) and EarthScope. It discusses advanced data management, analysis, and mining for a broad audience of graduate students, and is part of the IDEAS program. My LIGO-related lectures include:

- An introduction to gravitational-wave astronomy, covering the science of our detections so far. The science summaries produced by LIGO and Virgo give a friendly welcome to these topics, and the original discovery paper is extremely readable.
- An overview of gravitational-wave sources, and order-of-magnitude estimates of gravitational-wave amplitudes and frequencies. Schutz (1984), and the GW150914 Basic Physics Paper give good introductions to analysing binary coalescences.
- Analysis with gravitational-wave templates, searches and search significance. For an in-depth look at search pipelines, Allen *et al*. (2012) is a good reference.
- Matched filtering and calculation of signal-to-noise ratios. For an introduction to the noise-weighted inner product, I’d suggest working through Finn (1992).
- Parameter estimation and stochastic sampling. The GW150914 Parameter Estimation Paper gives a good overview of how we extract information from a signal. For more on Markov-chain Monte Carlo, I would recommend watching lectures from David MacKay’s Information Theory, Pattern Recognition and Neural Networks (the second half of Lecture 9 plus Lecture 10 as an introduction to inference, and Lectures 12 and 13 on sampling methods), and for a more in-depth understanding, *Bayesian Data Analysis*.
- Inferring rates and population parameters. Thrane & Talbot (2019) review work in population inference. If you are feeling ambitious, Farr, Gair, Mandel & Cutler (2015) derive how to work out rates for foreground and background events self-consistently.
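To give a feel for the matched-filtering step above, here is a minimal sketch (all numbers are invented for illustration, and this is nothing like a production search pipeline) of recovering a known sinusoidal template from white noise. For white noise, the noise-weighted inner product of Finn (1992) reduces to a simple sum:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "signal": a sinusoid of known frequency buried in white noise.
fs = 1024                         # sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
template = np.sin(2 * np.pi * 50 * t)

sigma = 1.0                       # noise standard deviation
amplitude = 0.3                   # true signal amplitude
data = amplitude * template + rng.normal(0, sigma, t.size)

def inner(a, b):
    """Noise-weighted inner product; for white noise <a|b> = sum(a b) / sigma^2."""
    return np.sum(a * b) / sigma**2

# Matched-filter SNR, rho = <d|h> / sqrt(<h|h>).
snr = inner(data, template) / np.sqrt(inner(template, template))
# Optimal (expected) SNR for this amplitude.
optimal = amplitude * np.sqrt(inner(template, template))
print(f"Recovered SNR: {snr:.1f} (optimal: {optimal:.1f})")
```

The recovered SNR fluctuates around the optimal value by about one unit, which is exactly the behaviour exploited when assessing search significance.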

A useful resource for gravitational-wave data analysis is the set of tutorials provided by the Gravitational Wave Open Science Center. For an introduction to gravitational-wave signal analysis, the LIGO–Virgo Signal Analysis Guide provides a good overview (as discussed in my blog). For excellent background on pretty much the entire course, I would highly recommend *Gravitational Waves: Volume 1*.
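The stochastic sampling mentioned in the lecture list can be sketched with a toy random-walk Metropolis sampler. This is a deliberately simple problem of my own devising (estimating the mean of Gaussian data with a flat prior), not the LIGO parameter-estimation analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 draws from a Gaussian with unknown mean and known sigma = 1.
true_mu = 2.0
data = rng.normal(true_mu, 1.0, size=100)

def log_posterior(mu):
    # Flat prior, so the log posterior is the Gaussian log likelihood
    # up to an additive constant.
    return -0.5 * np.sum((data - mu) ** 2)

n_steps, step = 5000, 0.5
chain = np.empty(n_steps)
mu = 0.0                          # deliberately poor starting point
for i in range(n_steps):
    proposal = mu + step * rng.normal()
    # Metropolis rule: accept with probability min(1, posterior ratio).
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    chain[i] = mu

posterior_mean = chain[1000:].mean()  # discard burn-in
print(f"Posterior mean: {posterior_mean:.2f}")
```

The posterior mean lands close to the sample mean of the data, as it should; MacKay’s lectures explain why the burn-in samples must be discarded and how to diagnose convergence.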

## Information theory for physicists

I have given a short lecture course on information theory aimed at advanced undergraduate and postgraduate students. The course is self-contained and requires no prior knowledge. We covered the areas of information theory most applicable in physics:

**Lecture 1** — Probabilities, inference and information content
**Lecture 2** — Entropy and probability distances
**Lecture 3** — Maximising entropy and thermodynamics
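As a taste of the first two lectures, here is a minimal sketch (with toy distributions of my own choosing) of Shannon entropy and of the Kullback–Leibler divergence, one of the probability distances covered:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, H(p) = -sum p log2 p, with 0 log 0 := 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits. Note it is not symmetric,
    so it is a 'distance' but not a true metric."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

fair = [0.5, 0.5]
biased = [0.9, 0.1]
print(entropy(fair))                # 1.0 bit: a fair coin is maximally uncertain
print(entropy(biased))              # ~0.47 bits: less surprise on average
print(kl_divergence(biased, fair))  # ~0.53 bits
```

The fair coin maximises the entropy over two outcomes, which is the starting point for the maximum-entropy arguments of Lecture 3.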

I made use of David MacKay’s excellent *Information Theory, Inference, and Learning Algorithms* throughout. You can also watch his lectures on Information Theory, Pattern Recognition and Neural Networks online.

The course does not cover coding theory in much depth, but this is discussed (in detail) in the papers of Claude Shannon which founded the field, *A Mathematical Theory of Communication*: Parts I & II and Parts III–V. Sections 6 and 7 of Part I are particularly useful if you want to know more about information entropy.

If you would like to go beyond the course to learn more about information geometry, I recommend *Methods of Information Geometry* by Shun-ichi Amari and Hiroshi Nagaoka.

## Tutorials in statistical inference

In 2015, I helped our graduate students to organise a series of workshops for researchers to learn about inference. We brought together experts from a range of fields (from astrostatistics to epidemiology) and covered a range of topics, from the basics of Bayes’ theorem through to sophisticated techniques such as Hamiltonian Monte Carlo and hierarchical modelling. I learnt something too!

## Estimation

Estimating quantities is a useful skill for physicists. I have compiled a Guide to Estimation which may come in handy.
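To give a flavour of the technique (this example is the classic Fermi problem of piano tuners in Chicago, not taken from my guide, and every input is an assumed round number):

```python
# Fermi estimate: how many piano tuners work in Chicago?
# Every figure below is a guessed round number; the point is that
# multiplying order-of-magnitude guesses still gives a usable answer.
population = 3e6                 # people in Chicago
people_per_household = 2
households_with_piano = 1 / 20   # fraction owning a piano
tunings_per_piano_per_year = 1
tunings_per_tuner_per_day = 4
working_days_per_year = 250

pianos = population / people_per_household * households_with_piano
tunings_needed = pianos * tunings_per_piano_per_year
tuner_capacity = tunings_per_tuner_per_day * working_days_per_year
tuners = tunings_needed / tuner_capacity
print(f"Estimated piano tuners: {tuners:.0f}")  # prints 75
```

An answer of order tens to a hundred is about right; errors in the individual guesses tend to partially cancel when multiplied together.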

While I was at Churchill College, I organised an annual Estimation Evening for first-year Natural Scientists and Computer Scientists. If you are curious, I have copies of my solutions.

To pick up some more useful tools for estimation, I would recommend *Street-Fighting Mathematics* by Sanjoy Mahajan.