RSS feed of the Laboratoire Jacques-Louis Lions
https://www.ljll.math.upmc.fr/index.php
The latest news from the Laboratoire Jacques-Louis Lions, delivered directly via RSS.
Sun, 05 Apr 2020 03:04:36 +0200
divry@ljll.math.upmc.fr (Clément DIVRY)

Cortical inspired models for vision and phenomenology of perception
https://www.ljll.math.upmc.fr/?exec=article&action=redirect&type=article&var_mode=calcul&id=1197
The workshop ["Cortical inspired models for vision and phenomenology of perception" ->https://cincin2020.sciencesconf.org/] will take place on April 23-24, 2020 at the Laboratoire Jacques-Louis Lions (LJLL), Sorbonne Université, 4 Place Jussieu, 75005 Paris, in room 309, corridor 15-16.
The purpose of this event is to bring together experts in the field of vision, with a particular focus on those working on the understanding and modelling of the primary visual cortex via variational and PDE approaches, as well as via more recent models based on (deep) neural networks.
See [here ->https://cincin2020.sciencesconf.org/] for more details and registration.
Thu, 23 Apr 2020 00:00:00 +0200

Leçons J.-L. Lions 2020 - Mini-cours 1 : D. Slepčev
https://www.ljll.math.upmc.fr/?exec=article&action=redirect&type=article&var_mode=calcul&id=1185
Dejan Slepčev (Carnegie Mellon University, Pittsburgh)
Leçons Jacques-Louis Lions 2020 - Mini-cours 1
Variational problems on random structures: analysis and applications to data science
{The room will be announced later.}
Abstract of the mini-course
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals, defined using the available random sample, which specify the desired properties of the object sought.
While the data typically lie in a high-dimensional space, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. This intrinsic geometric structure is often encoded by a graph created by connecting nearby data points.
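As a minimal illustration of the setup described above (not taken from the lectures; the function names, the choice of an ε-neighborhood graph, and the discrete Dirichlet energy are all assumptions for the sketch), one can connect nearby points of a random sample by a graph and evaluate a functional posed on that random structure:

```python
# Hedged sketch: encode the geometry of a random sample by an
# epsilon-neighborhood graph, then evaluate a discrete Dirichlet-type
# energy -- one typical example of a functional on a random structure.
import numpy as np


def epsilon_graph(points, eps):
    """Connect every pair of distinct sample points closer than eps.

    Returns a symmetric 0/1 weight matrix with zero diagonal.
    """
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    weights = (dists < eps).astype(float)
    np.fill_diagonal(weights, 0.0)
    return weights


def graph_dirichlet_energy(weights, f):
    """Discrete Dirichlet energy sum_ij w_ij (f_i - f_j)^2 of a function f
    defined on the sample points; functionals of this kind underlie
    graph-based clustering and regression."""
    diff = f[:, None] - f[None, :]
    return float(np.sum(weights * diff ** 2))


rng = np.random.default_rng(0)
sample = rng.uniform(size=(200, 2))   # random sample in the unit square
W = epsilon_graph(sample, eps=0.15)
f = sample[:, 0]                      # a test function on the sample
energy = graph_dirichlet_energy(W, f)
```

The asymptotic regime studied in the lectures concerns what happens to such graph functionals as the number of sample points grows and ε shrinks.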
The lectures will discuss a mathematical framework suitable for studies of asymptotic properties of variational problems posed on random samples and related random geome ...
Tue, 02 Jun 2020 00:00:00 +0200

Leçons J.-L. Lions 2020 - Mini-cours 2 : D. Slepčev
https://www.ljll.math.upmc.fr/?exec=article&action=redirect&type=article&var_mode=calcul&id=1186
Dejan Slepčev (Carnegie Mellon University, Pittsburgh)
Leçons Jacques-Louis Lions 2020 - Mini-cours 2
Variational problems on random structures: analysis and applications to data science
{The room will be announced later.}
Abstract of the mini-course
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals, defined using the available random sample, which specify the desired properties of the object sought.
While the data typically lie in a high-dimensional space, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. This intrinsic geometric structure is often encoded by a graph created by connecting nearby data points.
The lectures will discuss a mathematical framework suitable for studies of asymptotic properties of variational problems posed on random samples and related random geo ...
Wed, 03 Jun 2020 00:00:00 +0200

Leçons J.-L. Lions 2020 - Mini-cours 3 : D. Slepčev
https://www.ljll.math.upmc.fr/?exec=article&action=redirect&type=article&var_mode=calcul&id=1187
Dejan Slepčev (Carnegie Mellon University, Pittsburgh)
Leçons Jacques-Louis Lions 2020 - Mini-cours 3
Variational problems on random structures: analysis and applications to data science
{The room will be announced later.}
Abstract of the mini-course
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals, defined using the available random sample, which specify the desired properties of the object sought.
While the data typically lie in a high-dimensional space, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. This intrinsic geometric structure is often encoded by a graph created by connecting nearby data points.
The lectures will discuss a mathematical framework suitable for studies of asymptotic properties of variational problems posed on random samples and related random geo ...
Thu, 04 Jun 2020 00:00:00 +0200

Leçons J.-L. Lions 2020 - Colloquium : D. Slepčev
https://www.ljll.math.upmc.fr/?exec=article&action=redirect&type=article&var_mode=calcul&id=1188
Dejan Slepčev (Carnegie Mellon University, Pittsburgh)
Leçons Jacques-Louis Lions 2020 - Colloquium
Machine learning meets calculus of variations
{The room will be announced later.}
Abstract of the colloquium
Modern data-acquisition techniques produce a wealth of data about the world we live in. Extracting information from these data leads to machine learning tasks such as clustering, classification, regression, dimensionality reduction, and others. These tasks are often described as optimization problems by introducing functionals that specify the desired properties of the object considered.
The functionals take as input the available data samples, yet we seek to draw conclusions about the true distribution of the data.
To compare the outcomes based on finite data with the ideal outcomes that one would have if full information were available, we study the asymptotic properties of the discrete optimization problems based on finite random samples. We will discuss how the tools of the c ...
Fri, 05 Jun 2020 00:00:00 +0200
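The gap between finite-data and ideal outcomes mentioned in the colloquium abstract can be illustrated with a deliberately simple, hypothetical example (not the speaker's material; the choice of functional is an assumption): the empirical functional F_n(m) = (1/n) Σ (x_i − m)² is minimized by the sample mean, whose distance to the minimizer of the population functional E[(X − m)²], the true mean, shrinks as the sample grows.

```python
# Hedged toy example: the minimizer of an empirical functional built from
# finite random samples approaches the minimizer of the population
# functional as the sample size grows.
import numpy as np

rng = np.random.default_rng(1)
true_mean = 3.0   # minimizer of the population functional E[(X - m)^2]
errors = {}
for n in (10, 1_000, 100_000):
    xs = rng.normal(loc=true_mean, scale=1.0, size=n)
    # The sample mean minimizes F_n(m) = (1/n) * sum((x_i - m)**2).
    empirical_minimizer = xs.mean()
    errors[n] = abs(empirical_minimizer - true_mean)
```

The colloquium concerns the analogous (and much harder) question for functionals posed on graphs built from random samples, where calculus-of-variations tools describe the limiting continuum problem.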