A high-bias, low-variance introduction to Machine Learning for physicists

Mehta, P., Bukov, M., Wang, C. H., Day, A. G., Richardson, C., Fisher, C. K., & Schwab, D. J. (2019). A high-bias, low-variance introduction to machine learning for physicists. Physics Reports, 810, 1-124.
Assumes:
familiarity with partition functions and statistical mechanics;
fluency in mathematical techniques such as linear algebra, multivariate calculus, variational methods, probability theory, and Monte Carlo methods.
Many of the core concepts and techniques used in ML – such as Monte-Carlo methods, simulated annealing, variational methods – have their origins in physics. Moreover, “energy-based models” inspired by statistical physics are the backbone of many deep learning methods. For these reasons, there is much in modern ML that will be familiar to physicists.
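As a reminder of the physics flavor of these techniques, here is a minimal sketch (not taken from the paper) of Metropolis Monte Carlo sampling from a Boltzmann distribution p(x) ∝ exp(−βE(x)); the function names and the quadratic-well example are illustrative choices, not the review's own code.

```python
import math
import random

def metropolis_sample(energy, x0, n_steps, beta=1.0, step=0.5, seed=0):
    """Metropolis Monte Carlo: draw samples from p(x) ~ exp(-beta * E(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)  # propose a local move
        dE = energy(x_new) - energy(x)        # energy change of the proposal
        # accept with probability min(1, exp(-beta * dE))
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            x = x_new
        samples.append(x)
    return samples

# Illustrative example: a quadratic well E(x) = x^2 / 2, so at beta = 1
# the target distribution is a standard Gaussian.
samples = metropolis_sample(lambda x: 0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(samples) / len(samples)
```

The same accept/reject rule, applied to spin flips instead of continuous moves, underlies the energy-based models (e.g., Boltzmann machines) the paper discusses.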
This review emphasizes connections with statistical physics, physics-inspired Bayesian inference, and computational neuroscience models.
Thus, certain ideas (e.g., gradient descent, expectation–maximization, variational methods, and deep learning with neural networks) are covered extensively, while other important ideas receive less attention or are omitted entirely (e.g., statistical learning theory, support vector machines, kernel methods, Gaussian processes).
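Since gradient descent is one of the ideas the review covers extensively, a minimal one-dimensional sketch may help fix the notation; the function name and the quadratic objective below are illustrative assumptions, not code from the paper.

```python
def gradient_descent(grad, x0, lr=0.1, n_steps=100):
    """Plain gradient descent: repeatedly step against the gradient,
    x <- x - lr * grad(x)."""
    x = x0
    for _ in range(n_steps):
        x = x - lr * grad(x)
    return x

# Illustrative example: minimize f(x) = (x - 3)^2, whose gradient is
# 2 * (x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

The review builds on this basic update to discuss momentum, stochastic minibatch gradients, and adaptive methods used to train deep networks.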