How Many Dimensions for your Latent Model? A Cross-Domain Perspective - FAYOL / DEMO : Décision en Entreprise : Modélisation, Optimisation
Conference communication, 2023

Abstract

Latent representations are ubiquitous in data analytics and AI tasks. They are used as intermediary hidden models to go from a set of observations to decisions. Victor Charpenay and Rodolphe Le Riche confront the perspectives of their respective domains on these intermediary vector representations. They identify two antagonistic purposes: while the latent variables of statistical models are used to ease computation, the hidden layers of neural networks are meant to capture non-trivial regularities in the observed data. This difference has consequences for the dimension of the latent feature space: looking for regularities implies finding an optimal contraction of the input data into a smaller latent space, in contrast to the infinite-dimensional feature vectors used in kernel-based approaches.
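To make this contrast concrete, here is a small NumPy sketch (illustrative only, not taken from the talk). An RBF kernel corresponds to an implicitly infinite-dimensional feature map, yet only a finite Gram matrix between observations is ever computed; a PCA projection, by contrast, produces an explicit low-dimensional latent code. The data and the latent dimension `d` are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))  # 50 observations, 10 input features

# Kernel view: the RBF kernel k(x, x') = exp(-||x - x'||^2 / 2)
# corresponds to an infinite-dimensional feature map, but we only
# ever compute the 50x50 Gram matrix between observations.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-0.5 * sq_dists)  # Gram matrix, shape (50, 50)

# Contraction view: project the centered data onto d < 10 principal
# directions, yielding an explicit low-dimensional latent code.
d = 3
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:d].T  # latent codes, shape (50, 3)

print(K.shape, Z.shape)
```

In the kernel view the "latent space" is never materialized; in the contraction view its dimension `d` is a modeling decision, which is exactly the question raised by the talk's title.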
latentVarTalkIMT2023.pdf (3.08 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04219130, version 1 (26-09-2023)

Identifiers

  • HAL Id: hal-04219130, version 1

Cite

Victor Charpenay, Rodolphe Le Riche. How Many Dimensions for your Latent Model? A Cross-Domain Perspective. Data&IA@IMT webinar, Sep 2023, Saint-Etienne, France. ⟨hal-04219130⟩