GP
Let $f : \mathcal{X} \to \mathbb{R}$. If $(f(x_1), \ldots, f(x_n))$ follows a joint Gaussian distribution for any set of $n$ points $x_1, \ldots, x_n$, then we say that $f$ is a Gaussian process. A GP is characterized by its mean function $m(x)$ and a covariance function $k(x, x')$, which can be any Mercer kernel:
$$f(x) \sim \mathcal{GP}\big(m(x),\, k(x, x')\big)$$
where $m(x) = \mathbb{E}[f(x)]$ is the expected value of $f(x)$, and $k(x, x') = \mathbb{E}\big[(f(x) - m(x))(f(x') - m(x'))\big]$ is the covariance between $f(x)$ and $f(x')$.
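As a concrete illustration, here is a minimal NumPy sketch of GP regression with a squared-exponential (RBF) kernel, assuming a zero mean function and Gaussian observation noise; the function names, toy data, and hyperparameter values are illustrative, not from the source.

```python
# Minimal sketch: exact GP posterior for noisy 1D regression
# (assumptions: zero mean function, RBF kernel, Gaussian noise).
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel, a standard Mercer kernel k(x, x')."""
    sqdist = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_posterior(X, y, X_star, noise=0.1):
    """Posterior mean and covariance of f(X_star) given observations (X, y)."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))     # K(X, X) + sigma_n^2 I
    K_s = rbf_kernel(X, X_star)                           # K(X, X*)
    K_ss = rbf_kernel(X_star, X_star)                     # K(X*, X*)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # (K + sigma^2 I)^{-1} y
    mean = K_s.T @ alpha                                   # posterior mean
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                                   # posterior covariance
    return mean, cov

# Usage: five noisy observations of sin(x), predictions on a grid
X = np.linspace(0, 5, 5)[:, None]
y = np.sin(X).ravel() + 0.1 * np.random.randn(5)
X_star = np.linspace(0, 5, 100)[:, None]
mu, cov = gp_posterior(X, y, X_star)
```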
Gaussian process models
Visualization
Deep Neural Networks as Gaussian Processes
We show how to make predictions using deep networks, without training deep networks.
https://openreview.net/forum?id=B1EA-M-0Z
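Below is a hedged sketch of the idea: the kernel of an infinitely wide, fully connected ReLU network can be computed layer by layer in closed form (the arc-cosine recursion used in the linked paper) and then plugged into standard GP regression, so predictions require no gradient training. The depth and the variance hyperparameters sigma_w2, sigma_b2 are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the NNGP kernel recursion for a fully connected ReLU network.
# Hyperparameters (depth, sigma_w2, sigma_b2) are illustrative assumptions.
import numpy as np

def nngp_kernel(X1, X2, depth=3, sigma_w2=1.6, sigma_b2=0.1):
    """Covariance between f(X1) and f(X2) for an infinitely wide ReLU network."""
    d_in = X1.shape[1]
    # Layer 0: kernel induced by the affine input layer
    K = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d_in
    K11 = sigma_b2 + sigma_w2 * np.sum(X1 * X1, axis=1) / d_in  # diagonal for X1
    K22 = sigma_b2 + sigma_w2 * np.sum(X2 * X2, axis=1) / d_in  # diagonal for X2
    for _ in range(depth):
        norm = np.sqrt(np.outer(K11, K22))
        cos_theta = np.clip(K / norm, -1.0, 1.0)
        theta = np.arccos(cos_theta)
        # E[relu(u) relu(v)] for jointly Gaussian (u, v): arc-cosine kernel
        F = norm * (np.sin(theta) + (np.pi - theta) * cos_theta) / (2 * np.pi)
        K = sigma_b2 + sigma_w2 * F
        # Diagonal terms: theta = 0, so the expectation reduces to K_diag / 2
        K11 = sigma_b2 + sigma_w2 * K11 / 2
        K22 = sigma_b2 + sigma_w2 * K22 / 2
    return K

# Plugging this kernel into a standard GP regression routine yields predictions
# of the corresponding infinitely wide network without any gradient training.
```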

Lecture
Gaussian processes for fun and profit: Probabilistic machine learning in industry
Seminar by S.T. John, Senior Machine Learning Researcher at Secondmind Labs, given at the UCL Centre for AI. Recorded on 4 November 2020.
Abstract
When companies, whether start-ups or big corporations, talk about "machine learning", they usually mean some kind of neural network model. Not always, though: I will talk about why we instead put a lot of our effort into probabilistic models built using Gaussian processes. When a machine learning course briefly covers Gaussian processes, you might go away thinking that they are just basis-function interpolation, that they only apply when the noise is Gaussian, and that they don't scale to larger datasets. Here I will discuss why these are misconceptions and show why Gaussian processes are both interesting and useful in practical applications.
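As one hedged illustration of two points in the abstract (non-Gaussian likelihoods and scalability via inducing points), here is a rough sparse variational GP classification sketch assuming GPflow 2.x; the toy data, kernel choice, and settings are made up for illustration and are not from the talk.

```python
# Rough sketch (assumes GPflow 2.x): a sparse variational GP classifier,
# showing a non-Gaussian (Bernoulli) likelihood and inducing-point scaling.
import numpy as np
import gpflow

# Toy binary classification data (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(500, 1))
Y = (np.sin(2.0 * X) + 0.3 * rng.standard_normal(X.shape) > 0).astype(float)

Z = np.linspace(-3.0, 3.0, 20)[:, None]  # inducing inputs, M << N

model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Bernoulli(),   # non-Gaussian likelihood
    inducing_variable=Z,
)

# Maximize the ELBO (full-batch here; minibatch training is also possible)
gpflow.optimizers.Scipy().minimize(
    model.training_loss_closure((X, Y)), model.trainable_variables
)

mean, var = model.predict_y(X)  # predictive class probabilities and variances
```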
Bio
Ti is a senior machine learning researcher in the probabilistic modelling team at Secondmind Labs, where they have been working on a broad range of customer and research projects involving Gaussian processes. Ti believes in making research output reusable by integrating it into common toolboxes and is a core maintainer of the GPflow open-source project for Gaussian process modelling.
https://youtu.be/uq8VxqeHPj8?si=mme8hPou0x5hGQni


Seonglae Cho