Abstract. Gaussian processes are an effective model class for learning unknown functions, particularly in settings where accurately representing predictive uncertainty is of key importance. Motivated by applications in the physical sciences, the widely used Matérn class of Gaussian processes has recently been generalized to model functions whose domains are Riemannian manifolds, by re-expressing these processes as solutions of stochastic partial differential equations. In this work, we propose techniques for computing the kernels of these processes on compact Riemannian manifolds via spectral theory of the Laplace–Beltrami operator, allowing them to be trained via standard scalable techniques such as inducing points. This enables Riemannian Matérn Gaussian processes to be used in mini-batch, online, and non-conjugate settings, and makes them more accessible to machine learning practitioners.
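To make the spectral construction concrete, the following is a minimal sketch of a truncated spectral Matérn kernel on the simplest compact manifold, the circle S¹. Here the Laplace–Beltrami eigenvalues are n² with eigenfunctions {1, cos(n·), sin(n·)}, and the spectral weights take the form (2ν/κ² + λₙ)^(−ν − d/2). The function name, parameter defaults, and truncation level are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def matern_kernel_circle(theta1, theta2, nu=1.5, kappa=1.0,
                         sigma2=1.0, n_max=50):
    """Truncated spectral Matern kernel on the circle S^1 (illustrative sketch).

    Sums Laplace-Beltrami eigenpairs of S^1: eigenvalue n^2 for the
    eigenfunctions cos(n*theta) and sin(n*theta), weighted by the
    Matern spectral density (2*nu/kappa^2 + lambda_n)^(-nu - d/2).
    """
    d = 1  # dimension of the circle
    ns = np.arange(0, n_max + 1, dtype=float)
    # Matern spectral weights for each eigenvalue lambda_n = n^2.
    weights = (2.0 * nu / kappa**2 + ns**2) ** (-nu - d / 2)
    # Pair-of-eigenfunctions identity:
    # cos(n*t1)cos(n*t2) + sin(n*t1)sin(n*t2) = cos(n*(t1 - t2)).
    diff = theta1 - theta2
    k = weights[0] + 2.0 * np.sum(weights[1:] * np.cos(ns[1:] * diff))
    # Normalize so that k(x, x) = sigma2.
    k0 = weights[0] + 2.0 * np.sum(weights[1:])
    return sigma2 * k / k0
```

Because the weights depend on the eigenvalues only through the Matérn spectral density, the same recipe extends to any compact manifold whose Laplace–Beltrami eigenpairs are available, with the trigonometric sum replaced by the manifold's eigenfunctions.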