Calculus of Variations and Geometric Measure Theory

T. Roith - L. Bungert

Continuum Limit of Lipschitz Learning on Graphs

created by bungert on 08 Jul 2022
modified on 30 Jan 2024

Published Paper

Journal: Foundations of Computational Mathematics
Volume: 23
Number: 2
Pages: 393-431
Year: 2023
Doi: 10.1007/s10208-022-09557-9

ArXiv: 2012.03772

Abstract:

Tackling semi-supervised learning problems with graph-based methods has become popular in recent years, since graphs can represent many kinds of data and provide a suitable framework for studying continuum limits, for example, of differential operators. A popular strategy here is $p$-Laplacian learning, which poses a smoothness condition on the sought inference function on the set of unlabeled data. For $p<\infty$, continuum limits of this approach were studied using tools from $\Gamma$-convergence. For the case $p=\infty$, which is referred to as Lipschitz learning, continuum limits of the related infinity Laplacian equation were studied using the concept of viscosity solutions. In this work, we prove continuum limits of Lipschitz learning using $\Gamma$-convergence. In particular, we define a sequence of functionals which approximate the largest local Lipschitz constant of a graph function and prove $\Gamma$-convergence in the $L^\infty$-topology to the supremum norm of the gradient as the graph becomes denser. Furthermore, we show compactness of the functionals, which implies convergence of minimizers. In our analysis we allow a varying set of labeled data which converges to a general closed set in the Hausdorff distance. We apply our results to nonlinear ground states, i.e., minimizers with constrained $L^p$-norm, and, as a by-product, prove convergence of graph distance functions to geodesic distance functions.
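To illustrate the kind of discrete functional the abstract describes, the following is a minimal sketch, not the paper's exact construction: on a random geometric graph over points in the unit square, it computes the largest local difference quotient $\max_{i\sim j} |u_i - u_j|/|x_i - x_j|$ of a graph function $u$. For a linear function $u(x) = \langle a, x\rangle$, whose gradient has constant norm $|a|$, this quantity approaches $|a|$ as the graph becomes denser. The radius `eps`, the sample size, and this particular edge scaling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random geometric graph: n points in the unit square, edges between
# pairs closer than eps (parameters are illustrative assumptions).
n, eps = 2000, 0.08
x = rng.random((n, 2))

def graph_lipschitz_constant(u, x, eps):
    """Largest local difference quotient max_{i~j} |u_i - u_j| / |x_i - x_j|
    over edges of the eps-neighborhood graph."""
    best = 0.0
    for i in range(len(x)):
        d = np.linalg.norm(x - x[i], axis=1)
        nbrs = (d > 0) & (d < eps)          # neighbors of vertex i
        if nbrs.any():
            best = max(best, np.max(np.abs(u[nbrs] - u[i]) / d[nbrs]))
    return best

# A linear function u(x) = <a, x> has |grad u(x)| = |a| everywhere.
a = np.array([3.0, 4.0])                    # |a| = 5
u = x @ a
print(graph_lipschitz_constant(u, x, eps))  # close to 5 (= |a|) on a dense graph
```

The discrete value is always bounded above by the true Lipschitz constant $|a| = 5$ and approaches it from below as more edge directions align with $a$, mirroring the convergence of the graph functionals to the supremum norm of the gradient in the dense-graph limit.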