
Welcome - Laboratoire Jacques-Louis Lions

Internships (3e / seconde)
Discovery internships (for pupils in classe de 3e or 2nde): see https://www.math.univ-paris-diderot.fr/diffusion/index

Several positions open for recruitment at the Laboratoire Jacques-Louis Lions

Please note: positions are filled on a rolling basis. Application deadline: Thursday 5 March 2020 at 4 pm.

Link to the positions

Key figures

189 people work at the LJLL

90 permanent members

82 permanent researchers and faculty members

8 engineers, technicians, and administrative staff

99 non-permanent members

73 PhD students

14 postdocs and ATER (temporary teaching and research fellows)

12 emeritus members and volunteer collaborators

Figures as of March 2019

Leçons Jacques-Louis Lions 2020: Dejan Slepčev

2-5 June 2020

Click here for the PDF version of the programme of the Leçons Jacques-Louis Lions 2020 (Dejan Slepčev)

Given by Dejan Slepčev (Carnegie Mellon University, Pittsburgh) from 2 to 5 June 2020, the Leçons Jacques-Louis Lions 2020 will consist of

— a mini-course
Variational problems on random structures: analysis and applications to data science
3 sessions, on Tuesday 2, Wednesday 3, and Thursday 4 June 2020, from 11:30 am to 1 pm
room to be announced,

— and a colloquium
Machine learning meets calculus of variations
Friday 5 June 2020, from 2 pm to 3 pm
room to be announced.

Abstract of the mini-course
Variational problems on random structures: analysis and applications to data science
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals, defined using the available random sample, which specify the desired properties of the object sought.
While the data typically lie in a high-dimensional space, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. This intrinsic geometric structure is often encoded by a graph created by connecting nearby data points.
The lectures will discuss a mathematical framework suitable for studying the asymptotic properties of variational problems posed on random samples and related random geometries (e.g. proximity graphs). In particular, we will discuss the passage from discrete variational problems on random samples to their continuum limits. We will also consider approaches based on dynamics on graphs and connect these with the evolution equations describing the continuum limits.
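As a toy illustration (not material from the lectures), the following minimal Python sketch builds a proximity graph of the kind mentioned above from a random sample and evaluates a rescaled graph Dirichlet energy on it, one of the simplest discrete functionals whose continuum limit is studied in this setting. The sample size, the connectivity radius, and the normalization are assumptions chosen for readability.

```python
# Minimal sketch: an epsilon-proximity graph on uniform samples in the unit
# square (d = 2), with the common 1/(n^2 eps^(d+2)) rescaling of the graph
# Dirichlet energy; all parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, eps, d = 1000, 0.1, 2
X = rng.random((n, d))                        # random sample in [0, 1]^2

# Adjacency of the proximity graph: connect points within distance eps.
dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
W = (dist < eps).astype(float)
np.fill_diagonal(W, 0.0)

def dirichlet_energy(u):
    """E_n(u) = 1/(n^2 eps^(d+2)) * sum_{i,j} w_ij (u_i - u_j)^2."""
    diff = u[:, None] - u[None, :]
    return (W * diff**2).sum() / (n**2 * eps**(d + 2))

u = X[:, 0]                                   # smooth test function u(x) = x_1
print(dirichlet_energy(u))                    # an O(1) value; under suitable
                                              # scalings of eps_n it stabilizes
                                              # as n grows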

The lectures will introduce the basic elements of the background material on the calculus of variations and optimal transportation. They will also explain the motivation for studying the given functionals and their significance for machine learning. The asymptotic consistency of several important machine learning algorithms will be shown.
Finally, the lectures will discuss how insights from the calculus of variations and partial differential equations can be used to improve the design of the functionals used in machine learning.
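For orientation, here is a schematic statement, in notation of my own choosing rather than the lecturer's, of the type of discrete-to-continuum result the course refers to (in the spirit of results of García Trillos and Slepčev on total variation on point clouds): the graph total variation over samples x_1, ..., x_n drawn from a density rho Γ-converges, under suitable scaling of the connectivity radius, to a weighted continuum total variation.

```latex
% Schematic form only; the precise hypotheses (scaling of eps_n, choice of
% kernel eta, regularity of the density rho) are part of the results
% discussed in the course.
\[
  \operatorname{GTV}_{n,\varepsilon}(u)
  = \frac{1}{\varepsilon\, n^{2}} \sum_{i,j=1}^{n}
    \eta_{\varepsilon}(x_i - x_j)\,\lvert u(x_i) - u(x_j)\rvert
  \;\xrightarrow{\ \Gamma\ }\;
  \sigma_{\eta} \int_{\Omega} \lvert \nabla u(x) \rvert\, \rho(x)^{2}\, dx,
  \qquad n \to \infty,\ \varepsilon_n \to 0 \text{ suitably slowly},
\]
where $\eta_{\varepsilon}(z) = \varepsilon^{-d}\,\eta(z/\varepsilon)$ is a
rescaled kernel and $\sigma_{\eta}$ is a kernel-dependent constant.
```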

Abstract of the colloquium
Machine learning meets calculus of variations
Modern data-acquisition techniques produce a wealth of data about the world we live in. Extracting information from the data leads to machine learning tasks such as clustering, classification, regression, dimensionality reduction, and others. These tasks are often described as optimization problems by introducing functionals that specify the desired properties of the object considered.
The functionals take as input the available data samples, yet we seek to draw conclusions about the true distribution of the data.
To compare the outcomes based on finite data with the ideal outcomes one would obtain if full information were available, we study the asymptotic properties of discrete optimization problems based on finite random samples. We will discuss how the calculus of variations and partial differential equations provide tools to compare the discrete and continuum descriptions for many relevant functionals. Furthermore, we will highlight how the connection between the discrete and continuum functionals can be used to improve the modeling of learning tasks on finite data.
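To make "tasks described as optimization problems by introducing functionals" concrete, here is a small self-contained sketch (my illustration, not the speaker's) of one such task, clustering: the relaxed graph-cut functional u -> u^T L u is minimized, among functions orthogonal to constants, by the second eigenvector of the graph Laplacian L, and its sign pattern yields the clusters. The cluster centers, bandwidth, and sample sizes are assumptions made for the demo.

```python
# Sketch: spectral bi-partitioning of a sample from two Gaussian clusters via
# the graph Laplacian; all parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(loc=(-2.0, 0.0), scale=0.5, size=(100, 2))
B = rng.normal(loc=(+2.0, 0.0), scale=0.5, size=(100, 2))
X = np.vstack([A, B])

# Gaussian-weighted similarity graph (bandwidth 0.5, assumed).
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2 / (2 * 0.5**2))
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W               # unnormalized graph Laplacian

# The second-smallest eigenvector (Fiedler vector) minimizes u^T L u subject
# to orthogonality to the constant vector; its sign gives the partition.
vals, vecs = np.linalg.eigh(L)
labels = (vecs[:, 1] > 0).astype(int)
print(labels[:100].mean(), labels[100:].mean())   # clusters mostly separated
```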