Conference Papers, Year: 2023

How Many Dimensions for your Latent Model? A Cross-Domain Perspective

Abstract

Latent representations are ubiquitous in data analytics and AI tasks. They serve as intermediary hidden models that lead from a set of observations to decisions. Victor Charpenay and Rodolphe Le Riche confront their respective domains' perspectives on these intermediary vector representations. They identify two antagonistic purposes: while the latent variables of statistical models are introduced to ease computation, the hidden layers of neural networks are meant to capture non-trivial regularities in the observed data. This difference has consequences for the dimension of the latent feature space: looking for regularities implies finding an optimal contraction of the input data into a smaller latent space, in contrast to the infinite-dimensional feature vectors used in kernel-based approaches.
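The contrast drawn in the abstract can be made concrete with a minimal sketch (not part of the deposited talk): a principal-component projection stands in for the contraction of the data to a small latent space, while the Gram matrix of an RBF kernel stands in for a kernel method whose implicit feature space is infinite-dimensional and is never materialized. The synthetic data, the latent dimension `d`, and the bandwidth `gamma` are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))  # 100 observations in a 20-dimensional input space

# Contraction: project onto a d-dimensional latent space via PCA (SVD).
# Choosing d < 20 amounts to keeping only the dominant regularities.
d = 3
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:d].T  # latent representation, shape (100, d)

# Kernel view: the RBF kernel is an inner product in an infinite-dimensional
# feature space; only the (100, 100) Gram matrix between observations is built.
gamma = 0.1
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-gamma * sq_dists)
```

In the first case the modeler must pick the latent dimension `d` explicitly; in the second, the feature-space dimension is a property of the kernel and computation happens entirely through pairwise evaluations, which is the trade-off the talk examines.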
latentVarTalkIMT2023.pdf (3.08 MB) Download the file
Origin: Files produced by the author(s)
Licence: CC BY - Attribution

Dates and versions

hal-04219130, version 1 (26-09-2023)

Licence

Attribution

Identifiers

  • HAL Id: hal-04219130, version 1

Cite

Victor Charpenay, Rodolphe Le Riche. How Many Dimensions for your Latent Model? A Cross-Domain Perspective. Data&IA@IMT webinar, Sep 2023, Saint-Etienne, France. ⟨hal-04219130⟩
