User Environment Detection Using Long Short-Term Memory Autoencoder
Conference paper, 2024


Abstract

Mobile networks are rapidly expanding, and there is an increasing demand for seamless connectivity. Detecting whether a user is indoors or outdoors is pivotal in optimizing network performance and enhancing user experience. This paper proposes a semi-supervised learning method using a Long Short-Term Memory Autoencoder (LSTM-AE) to detect the user's environment from mobile network radio signal data of real users. The LSTM Autoencoder learns to capture the underlying structure of the data and to identify patterns that distinguish indoor from outdoor environments. Three key features are used to train the model: Reference Signal Received Power (RSRP), Channel Quality Indicator (CQI), and Timing Advance (TA). Results show that the LSTM-AE model achieves a high accuracy of 84% and an F1 score of 89%. Our approach reduces the amount of labeled data required by 34.14% compared to traditional methods that rely primarily on fully supervised learning. By diminishing the dependence on labor-intensive and time-consuming data labeling, this improvement significantly enhances the overall efficiency of the machine learning process.
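
As a rough illustration of the approach summarized above, the sketch below builds an LSTM autoencoder over fixed-length sequences of the three features (RSRP, CQI, TA) and classifies a sequence by its reconstruction error against a threshold calibrated on a small labeled subset. The sequence length, layer sizes, training data, and threshold rule are illustrative assumptions, not the authors' exact configuration.

# Illustrative LSTM autoencoder for indoor/outdoor detection.
# Hyper-parameters and the thresholding rule are hypothetical.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN = 20      # assumed number of consecutive radio measurements per sample
N_FEATURES = 3    # RSRP, CQI, TA (assumed normalized)

def build_lstm_autoencoder(latent_dim: int = 16) -> tf.keras.Model:
    inputs = layers.Input(shape=(SEQ_LEN, N_FEATURES))
    # Encoder: compress the whole sequence into a single latent vector.
    encoded = layers.LSTM(latent_dim)(inputs)
    # Decoder: repeat the latent vector and reconstruct the input sequence.
    repeated = layers.RepeatVector(SEQ_LEN)(encoded)
    decoded = layers.LSTM(latent_dim, return_sequences=True)(repeated)
    outputs = layers.TimeDistributed(layers.Dense(N_FEATURES))(decoded)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    # Placeholder data standing in for real radio measurement sequences.
    x_train = np.random.rand(1000, SEQ_LEN, N_FEATURES).astype("float32")
    x_test = np.random.rand(200, SEQ_LEN, N_FEATURES).astype("float32")

    ae = build_lstm_autoencoder()
    ae.fit(x_train, x_train, epochs=10, batch_size=32, verbose=0)

    # Semi-supervised use: per-sequence reconstruction error, then a threshold
    # calibrated on a small labeled subset decides indoor vs. outdoor.
    recon_error = np.mean((ae.predict(x_test) - x_test) ** 2, axis=(1, 2))
    threshold = np.percentile(recon_error, 75)  # hypothetical calibration
    is_outdoor = recon_error > threshold        # label assignment is an assumption
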
Main file: isncc2024_Karthika_Version.pdf (362.42 KB)

Dates and versions

hal-04722057, version 1 (07-10-2024)

Identifiers

  • HAL Id: hal-04722057, version 1

Cite

Karthika Satheesh, Kamal Singh, Sid Ali Hamideche, Marie Line Alberi Morel, César Viho. User Environment Detection Using Long Short-Term Memory Autoencoder. The 11th International Symposium on Networks, Computers and Communications (ISNCC'24), Oct 2024, Washington DC, United States. ⟨hal-04722057⟩