20 June 2019 - 21 June 2019
Dipartimento di Matematica - Università di Roma "Tor Vergata"
First part of a series of lectures by D. Slepčev (Carnegie Mellon University, Pittsburgh)
The lectures will focus on variational problems that arise in machine learning. Modern data-acquisition techniques produce a wealth of data about our world. Extracting information from these data leads to machine learning tasks such as clustering, classification, regression, dimensionality reduction, and others. These tasks are often modeled via functionals, defined on the available random sample, which specify the desired properties of the object sought.
The lectures will discuss a mathematical framework suitable for studying the asymptotic properties of such variational problems posed on random samples and related random geometries (e.g. proximity graphs). In particular, we will discuss the passage from discrete variational problems on random samples to continuum limits.
The lectures will introduce the basic elements of the background material on the calculus of variations and optimal transportation. They will also explain the motivation for studying the given functionals and their significance for machine learning. Finally, the asymptotic consistency of several important machine learning algorithms will be shown.
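As a purely illustrative sketch (not part of the course materials), the discrete objects mentioned above can be made concrete: one builds a proximity graph on a random sample and evaluates a graph Dirichlet-type energy on it, the kind of functional whose continuum limit such frameworks study. All names, the choice of ε, and the sample size here are hypothetical.

```python
import numpy as np

def proximity_graph(points, eps):
    """Weight matrix of the eps-proximity graph: nodes i, j are
    connected (weight 1) iff |x_i - x_j| <= eps, no self-loops."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    w = (d <= eps).astype(float)
    np.fill_diagonal(w, 0.0)  # remove self-loops
    return w

def graph_dirichlet_energy(w, u):
    """Discrete Dirichlet energy (1/2) * sum_ij w_ij (u_i - u_j)^2."""
    diff = u[:, None] - u[None, :]
    return 0.5 * np.sum(w * diff ** 2)

rng = np.random.default_rng(0)
x = rng.uniform(size=(200, 2))   # random sample in the unit square
w = proximity_graph(x, eps=0.15) # illustrative connectivity radius
u = x[:, 0]                      # test function u(x) = first coordinate
energy = graph_dirichlet_energy(w, u)
```

As the sample size grows and ε shrinks at a suitable rate, energies of this type (after rescaling) are expected to approximate a continuum Dirichlet energy; making such statements precise is one theme of the lectures.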
Check the web page for updated information
This course is part of the MIUR Excellence Department Project awarded to the Department of Mathematics, University of Rome Tor Vergata, CUP E83C18000100006
Organizer: Andrea Braides.
Speaker: Dejan Slepčev.