*Accepted Paper*

**Inserted:** 30 January 2024

**Last Updated:** 30 January 2024

**Journal:** Annals of Applied Probability

**Year:** 2024

**Abstract:**

In this paper we prove the first quantitative convergence rates for the graph infinity Laplace equation at length scales at the connectivity threshold. In the graph-based semi-supervised learning community this equation is also known as Lipschitz learning. The graph infinity Laplace equation is characterized by the metric on the underlying space, and convergence rates follow from convergence rates for graph distances. At the connectivity threshold, this problem is related to Euclidean first passage percolation, which concerns the Euclidean distance function $d_h(x,y)$ on a homogeneous Poisson point process on $\mathbb{R}^d$, where admissible paths have step size at most $h>0$. Using a suitable regularization of the distance function and subadditivity, we prove that $d_{h_s}(0,se_1)/s \to \sigma$ as $s\to\infty$ almost surely, where $\sigma \geq 1$ is a dimensional constant and $h_s\gtrsim \log(s)^{1/d}$. A convergence rate is not available due to a lack of approximate superadditivity when $h_s\to \infty$. Instead, we prove convergence rates for the ratio $d_{h}(0,se_1)/d_{h}(0,2se_1)\to \frac{1}{2}$ when $h$ is frozen and does not depend on $s$. Combining this with the techniques we developed in (Bungert, Calder, Roith, IMA Journal of Numerical Analysis, 2022), we show that this notion of ratio convergence is sufficient to establish uniform convergence rates for solutions of the graph infinity Laplace equation at percolation length scales.
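As an illustrative sketch (not part of the paper), the distance function $d_h(0, se_1)$ can be approximated numerically in dimension $d=2$ by sampling a point cloud and running Dijkstra's algorithm on the graph whose edges join points within Euclidean distance $h$. The function name, parameter choices, and the fixed-count approximation of the Poisson process below are all our own assumptions for illustration.

```python
import heapq
import math
import random


def estimate_fpp_distance(s=10.0, h=2.0, lam=10.0, seed=0):
    """Illustrative estimate of d_h(0, s*e_1) in the plane: the length of the
    shortest path through a random point cloud in which every hop has
    Euclidean length at most h."""
    rng = random.Random(seed)
    # Sample points on the slab [-1, s+1] x [-2, 2]; for simplicity we use a
    # fixed count lam * area rather than a Poisson-distributed count.
    n = int(lam * (s + 2.0) * 4.0)
    pts = [(0.0, 0.0), (s, 0.0)]  # node 0 = origin, node 1 = target s*e_1
    pts += [(rng.uniform(-1.0, s + 1.0), rng.uniform(-2.0, 2.0))
            for _ in range(n)]

    # Dijkstra on the implicit graph whose edges join points at Euclidean
    # distance at most h, weighted by that Euclidean length.
    dist = [math.inf] * len(pts)
    dist[0] = 0.0
    pq = [(0.0, 0)]
    while pq:
        d, i = heapq.heappop(pq)
        if d > dist[i]:
            continue
        if i == 1:  # reached the target at s*e_1
            break
        xi, yi = pts[i]
        for j, (xj, yj) in enumerate(pts):
            w = math.hypot(xi - xj, yi - yj)
            if 0.0 < w <= h and d + w < dist[j]:
                dist[j] = d + w
                heapq.heappush(pq, (d + w, j))
    return dist[1]
```

Since any admissible path from $0$ to $se_1$ has total length at least $s$, the estimate is always at least $s$, consistent with $\sigma \geq 1$; at high intensity the ratio to $s$ approaches $1$.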