
Welcome - Laboratoire Jacques-Louis Lions


Key figures

189 people work at the LJLL

90 permanent staff members
82 permanent researchers and teacher-researchers
8 engineers, technicians and administrative staff

99 non-permanent staff members
73 PhD students
14 postdocs and ATER (temporary teaching and research fellows)
12 emeritus members and volunteer collaborators

Figures as of March 2019

 


Leçons Jacques-Louis Lions 2021 (Dejan Slepčev)

16-19 November 2021

 

Click here for the PDF version of the programme of the Leçons Jacques-Louis Lions 2021 (Dejan Slepčev)

 

Given by Dejan Slepčev (Carnegie Mellon University, Pittsburgh) from Tuesday 16 to Friday 19 November 2021, the Leçons Jacques-Louis Lions 2021 will consist of:

— a mini-course
Variational problems and PDE on random structures: analysis and applications to data science
3 sessions, on Tuesday 16, Wednesday 17 and Thursday 18 November 2021, from 12:00 to 13:15,

— and a colloquium
Machine learning meets calculus of variations
on Friday 19 November 2021, from 14:00 to 15:00.

The lectures will also be broadcast live via Zoom.
The venues will be announced later.

 

Abstract of the mini-course
Variational problems and PDE on random structures: analysis and applications to data science
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals (defined using the available random sample) which specify the desired properties of the object sought. While the data are often high dimensional, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. The intrinsic geometric structure is often encoded by a graph created by connecting nearby data points. We will introduce mathematical tools used to study variational problems and PDE-based models posed on random data samples. In particular, we will discuss the passage from discrete optimization problems on random samples to their continuum limits. This will be used to establish the asymptotic consistency of several important machine learning algorithms.
We will cover the basic elements of the background material on the calculus of variations and optimal transportation. Furthermore, we will develop connections to nonlocal functionals, which serve as intermediate objects between the discrete functionals and their continuum limits. We will also consider approaches based on dynamics on graphs and connect these with the evolution equations describing the continuum limits.
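
As an illustration of the kind of discrete functional the mini-course is concerned with, here is a minimal Python sketch (an editorial example, not taken from the course materials; the function name, the cut-off kernel, and the 1/(n^2 eps^(d+2)) normalisation are choices made purely for this demonstration). It builds an epsilon-proximity graph on a random sample and evaluates a graph Dirichlet-type energy on it.

import numpy as np

def graph_dirichlet_energy(points, values, eps):
    """Graph Dirichlet-type energy on an epsilon-proximity graph.

    points : (n, d) array of sample locations x_1, ..., x_n
    values : (n,) array of function values u(x_i)
    eps    : connectivity radius of the graph
    """
    n, d = points.shape
    # Pairwise distances; connect points closer than eps (simple cut-off kernel).
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    weights = (dist < eps).astype(float)
    np.fill_diagonal(weights, 0.0)
    # Weighted squared differences (u(x_i) - u(x_j))^2; the 1/(n^2 eps^(d+2))
    # factor is one common normalisation convention chosen here so that the
    # energy stays of order one as n grows and eps shrinks.
    squared_diffs = (values[:, None] - values[None, :]) ** 2
    return (weights * squared_diffs).sum() / (n ** 2 * eps ** (d + 2))

# Usage: 500 uniform samples in the unit square and a smooth test function.
rng = np.random.default_rng(0)
x = rng.uniform(size=(500, 2))
u = np.sin(np.pi * x[:, 0])
print(graph_dirichlet_energy(x, u, eps=0.1))

When the sample size grows and eps shrinks at a suitable rate, energies of this type are expected to approximate a weighted continuum Dirichlet energy; this discrete-to-continuum passage is precisely what the mini-course analyses.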

 

Abstract of the colloquium
Machine learning meets calculus of variations
Modern data-acquisition technology produces a wealth of data about the world we live in. The goal of machine learning is to extract and interpret the information these data sets contain. This leads to a variety of learning tasks, many of which seek to optimize a functional defined on the available random sample.
The functionals take the available data samples as input, yet we seek to draw conclusions about the true distribution of the data. To compare the outcomes based on finite data with the ideal outcomes one would obtain if full information were available, we study the asymptotic properties of discrete optimization problems based on finite random samples. We will discuss how the calculus of variations and partial differential equations provide tools to compare the discrete and continuum descriptions for many relevant problems. Furthermore, we will discuss how insights from analysis can be used to guide the design of the functionals used in machine learning.
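
As a schematic example of the kind of discrete-to-continuum comparison alluded to above (stated informally here as an illustration; the precise hypotheses, the admissible scaling of the connectivity radius, and the constant depend on results from the literature on consistency of graph-based methods and are not part of this abstract): for i.i.d. samples x_1, ..., x_n drawn from a density \rho and a suitable kernel \eta, graph energies of the form

\[
  E_{n,\varepsilon}(u) \;=\; \frac{1}{n^{2}\,\varepsilon^{d+2}}
  \sum_{i,j=1}^{n} \eta\!\left(\frac{|x_i - x_j|}{\varepsilon}\right)
  \big(u(x_i) - u(x_j)\big)^{2}
\]

are expected, along suitable sequences \varepsilon = \varepsilon_n \to 0, to converge in a variational sense to a weighted continuum energy of the form

\[
  E(u) \;=\; \sigma_\eta \int |\nabla u(x)|^{2}\, \rho(x)^{2}\, dx ,
\]

so that minimizers of the finite-sample problems approximate minimizers of the continuum problem.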