Interesting talk tomorrow, Wednesday 30/10, 9 am, via Zoom.

by Pablo Muse

The inaugural conference of the International Research Lab "Instituto Franco-Uruguayo de Matemática e Interacciones" of the CNRS, France, is taking place right now:

https://irl-uy-ifumi.apps.math.cnrs.fr/

https://sites.google.com/view/ifumi-inaugural-conference/home

As part of this conference, Prof. Jean-Michel Morel will give the following talk tomorrow, Wednesday 30/10, at 9 am via Zoom, which I think will be of interest to several of you. Details are below. Please share with anyone who might be interested.

Regards,

Pablo


Join Zoom Meeting

Meeting ID: 843 8493 9113

---

9:00 (50'+5) Jean-Michel Morel (City University of Hong Kong) online

Deep learning and information theory (a critical comparison)

In this general-purpose talk, I'll first describe the general machine learning procedure as the association of four objects: a dataset of input-output pairs representing a ground truth, a network structure, a loss function, and an optimizer. The result of the optimization procedure is what we properly call a neural network, namely a computational structure endowed with learned weights; a minimal sketch of these four objects follows the list below. This machine learning procedure raises many issues. Some of the main ones are:

- variability of the result depending on stochastic optimization and on stochastic initialization,
- overfitting (to the dataset),
- domain dependence of the result.
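
To make the four-object description above concrete, here is a minimal, self-contained sketch of the procedure. It uses PyTorch purely for illustration (the abstract names no framework), and all names, sizes, and hyperparameters below are assumptions of mine, not from the talk:

```python
# A minimal sketch of the four objects; every choice here is illustrative.
import torch
from torch import nn

# 1. Dataset: input-output pairs standing in for a ground truth.
x = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
y = torch.sin(3.0 * x) + 0.1 * torch.randn_like(x)

# 2. Network structure: a computational graph with as-yet-unlearned weights.
net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

# 3. Loss function: measures the mismatch with the dataset.
loss_fn = nn.MSELoss()

# 4. Optimizer: the stochastic procedure that adjusts the weights.
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()

# The result of optimization -- the structure endowed with its learned
# weights -- is the neural network proper. Re-running without fixing
# torch.manual_seed yields different weights each time: the stochastic
# variability listed as the first issue above.
```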

To discuss the first issue, I'll compare the role of noise in several applications of neural networks with the role of noise in Shannon's theory of communication. Using two recent general theorems on optimization-based machine learning and several practical examples, I'll give a formalization of overfitting (and domain dependence). I'll deduce that neural networks are fundamentally good at denoising data, and that, according to Shannon's theory, their useful output might not be contained in their proper output, but in the difference between input and output.
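
As a toy illustration of that closing claim (my own sketch, not part of the talk): for a denoiser, the residual input minus output is an estimate of the noise itself, which in Shannon's picture is the channel's perturbation. A simple moving-average filter stands in for a trained network below:

```python
# Toy residual experiment: the "useful output" of a denoiser can be read
# off the difference between its input and its output. The moving-average
# "denoiser" is a deliberately crude stand-in for a trained network.
import numpy as np

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0.0, 4.0 * np.pi, 1000))
noise = 0.3 * rng.standard_normal(clean.shape)
noisy = clean + noise

kernel = np.ones(15) / 15                     # 15-tap moving average
denoised = np.convolve(noisy, kernel, mode="same")

residual = noisy - denoised                   # input minus output
print("corr(residual, noise):", np.corrcoef(residual, noise)[0, 1])
# Prints a correlation close to 1: the residual essentially recovers the
# injected noise, i.e. the informative part sits in input - output.
```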