Manifold learning in Wasserstein space

2023 | preprint. A publication with affiliation to the University of Göttingen.


Cite this publication

Hamm, K.; Moosmüller, C.; Schmitzer, B. & Thorpe, M. (2023). Manifold learning in Wasserstein space. DOI:

Documents & Media


GRO License


Hamm, Keaton; Moosmüller, Caroline; Schmitzer, Bernhard; Thorpe, Matthew
This paper aims at building the theoretical foundations for manifold learning algorithms in the space of absolutely continuous probability measures on a compact and convex subset of $\mathbb{R}^d$, metrized with the Wasserstein-2 distance $W$. We begin by introducing a natural construction of submanifolds $\Lambda$ of probability measures equipped with metric $W_\Lambda$, the geodesic restriction of $W$ to $\Lambda$. In contrast to other constructions, these submanifolds are not necessarily flat, but still allow for local linearizations in a similar fashion to Riemannian submanifolds of $\mathbb{R}^d$. We then show how the latent manifold structure of $(\Lambda, W_\Lambda)$ can be learned from samples $\{\lambda_i\}_{i=1}^N$ of $\Lambda$ and pairwise extrinsic Wasserstein distances $W$ only. In particular, we show that the metric space $(\Lambda, W_\Lambda)$ can be asymptotically recovered in the sense of Gromov--Wasserstein from a graph with nodes $\{\lambda_i\}_{i=1}^N$ and edge weights $W(\lambda_i, \lambda_j)$. In addition, we demonstrate how the tangent space at a sample $\lambda$ can be asymptotically recovered via spectral analysis of a suitable "covariance operator" using optimal transport maps from $\lambda$ to sufficiently close and diverse samples $\{\lambda_i\}_{i=1}^N$. The paper closes with some explicit constructions of submanifolds $\Lambda$ and numerical examples on the recovery of tangent spaces through spectral analysis.
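The tangent-space recovery described in the abstract can be illustrated with a minimal sketch, which is not the authors' code: it assumes a hypothetical one-dimensional Gaussian family $\lambda_{a,b} = \mathcal{N}(a, b^2)$, where the monotone optimal transport map from $\mathcal{N}(0,1)$ to $\mathcal{N}(a, b^2)$ is $T(x) = a + bx$, so the tangent vectors $v_i = T_i - \mathrm{id}$ are explicit and the spectral analysis of the empirical covariance operator reduces to a singular value decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo discretization of the base measure lambda = N(0, 1).
K = 400
x = rng.standard_normal(K)

# Nearby samples lambda_i = N(a_i, b_i^2) from a 2-parameter submanifold
# (parameters a_i, b_i are illustrative, not from the paper).
N = 60
a = rng.uniform(-0.2, 0.2, N)
b = 1.0 + rng.uniform(-0.2, 0.2, N)

# In 1D the optimal (monotone) transport map from N(0,1) to N(a_i, b_i^2)
# is T_i(x) = a_i + b_i * x; the tangent vector at lambda is v_i = T_i - id.
V = a[:, None] + (b[:, None] - 1.0) * x[None, :]   # shape (N, K)

# Spectral analysis of the empirical covariance operator
# C = (1/N) sum_i v_i(x) v_i(y): its eigenvalues are s**2 / N,
# where s are the singular values of the matrix V.
s = np.linalg.svd(V, compute_uv=False)
eigvals = s**2 / N

# The family has 2 parameters, so 2 eigenvalues dominate and the
# rest vanish up to floating-point noise.
dim = int(np.sum(eigvals > 1e-8 * eigvals[0]))
print(dim)  # → 2
```

Since every tangent vector here lies in the span of the constant and linear functions, the covariance operator has rank exactly two, matching the intrinsic dimension of the two-parameter family; for general measures on $\mathbb{R}^d$ the transport maps would instead come from a numerical optimal transport solver.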
SFB 1456 | Cluster A | A03: Dimensionality reduction and regression in Wasserstein space for quantitative 3D histology 


