I completed my Bachelor of Arts and Master of Engineering in Computer Science and Linguistics at Gonville & Caius College, University of Cambridge, obtaining a Starred First (Class I with Distinction) and a Distinction (the Master's-level equivalent of a Starred First) respectively. Alongside teaching assistant and guest lecturer roles, I now supervise several courses in the Computer Science Tripos.

Computer Science Tripos

CST IB
Course | PDF | Example Sheet | Official Notes
Data Science (Damon Wischik, 2025) | PDF | Sheet | Notes
Formal Models of Language (Paula Buttery, 2025) | PDF | Sheet | Notes
CST II
Course | PDF | Example Sheet | Official Notes
Machine Learning & Bayesian Inference (Sean Holden, 2025) | PDF | Sheet | Notes
Information Theory (Robert Harle, 2025) | PDF | Sheet | Notes

Natural Language Syntax and Parsing

L95, a course taught in Cambridge University’s MPhil in Advanced Computer Science (ACS), covers formal grammar-based parsing methods alongside neural parsing algorithms.
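As a taste of the formal side of such a course, the classic CKY algorithm parses bottom-up under a grammar in Chomsky Normal Form. A minimal recogniser sketch, using a made-up toy grammar of my own (not course material):

```python
from itertools import product

# Toy CNF grammar, purely illustrative:
#   S -> NP VP, VP -> V NP; NP -> "she" | "fish", V -> "eats"
GRAMMAR = {
    ("NP", "VP"): {"S"},
    ("V", "NP"): {"VP"},
}
LEXICON = {
    "she": {"NP"},
    "fish": {"NP"},
    "eats": {"V"},
}

def cky_recognise(words):
    """Return True iff the sentence is derivable from S under the toy grammar."""
    n = len(words)
    # chart[i][j] holds the nonterminals that span words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICON.get(w, ()))
    for span in range(2, n + 1):                 # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):            # try every split point
                for b, c in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= GRAMMAR.get((b, c), set())
    return "S" in chart[0][n]
```

Extending each chart cell to store backpointers turns this recogniser into a parser that recovers trees.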


Computational Semantics

Advanced Topics in Machine Learning

Reinforcement Learning

Geometric Deep Learning

Theory of Deep Learning

Computer Science Tripos (Part IA/IB):

Machine Learning & Real World Data

Linguistics Tripos:

Li18 Computational Linguistics

Li7 Phonological Theory

Li8 Morphology

Li9 Syntax

Li17 Typology

LiX/Schedule C

Miscellaneous Notes:

Week 1: Dependency Parsing (Slides | Notes)
Suggested Readings:
- Chen & Manning (2014), A Fast and Accurate Dependency Parser using Neural Networks
- The RASP Manual
- Osborne & Gerdes (2019), The status of function words in dependency grammar
Week 2: Parsing Algorithms (Slides | Notes | Code)
Suggested Readings:
- GloVe: Global Vectors for Word Representation (original GloVe paper)
- Improving Distributional Similarity with Lessons Learned from Word Embeddings
- Evaluation methods for unsupervised word embeddings
Additional Readings:
- A Latent Variable Model Approach to PMI-based Word Embeddings
- Linear Algebraic Structure of Word Senses, with Applications to Polysemy
- On the Dimensionality of Word Embedding
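The embedding readings above all rest on cosine similarity between word vectors as the basic notion of relatedness. A self-contained sketch with toy vectors (the values are invented for illustration, not taken from GloVe):

```python
import math

# Hypothetical 4-dimensional embeddings, illustrative values only.
EMBEDDINGS = {
    "king":  [0.9, 0.1, 0.8, 0.2],
    "queen": [0.85, 0.15, 0.75, 0.3],
    "cat":   [0.1, 0.9, 0.2, 0.7],
}

def cosine(u, v):
    """Cosine similarity: dot product of u and v divided by their norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def most_similar(word):
    """Nearest neighbour of `word` among the other vocabulary entries."""
    return max((w for w in EMBEDDINGS if w != word),
               key=lambda w: cosine(EMBEDDINGS[word], EMBEDDINGS[w]))
```

With these toy vectors, `most_similar("king")` picks out `"queen"`, since their vectors point in nearly the same direction while `"cat"` does not.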