LynX: An Event-Based Gesture Dataset for Egocentric Interaction in Extended Reality

Bartoli, Pietro; Zappa, Franco
2025-01-01

Abstract

Hand gesture recognition is a cornerstone of intuitive human-computer interaction (HCI), particularly for wearable and extended reality (XR) systems. Today's gesture recognition solutions are predominantly based on frame-based cameras. However, these systems underperform in challenging lighting conditions and require substantial computational power that is not always available on battery-powered wearable devices, which can significantly hinder real-time processing. A promising alternative is the emerging Dynamic Vision Sensor (DVS), which performs exceptionally well in high-dynamic-range lighting environments. These sensors adjust energy consumption to scene activity, yielding sparse yet semantically rich data that enables efficient battery operation and supports real-time processing. However, current event-based gesture datasets generally include only temporal segmentation, without the spatial annotations essential for hand tracking. To address this gap, we introduce LynX, a novel egocentric gesture dataset collected with custom-designed wearable hardware built around the PROPHESEE GENX320 DVS and GAP9, a low-power multi-core RISC-V processor from GreenWaves Technologies. The dataset includes recordings from 18 subjects performing 13 gesture classes across four diverse scenarios, specifically designed to exploit the advantages of DVS by incorporating dynamic lighting conditions and motion-rich environments. Each event frame is annotated with per-frame hand bounding boxes in YOLO format, and each gesture instance carries precise temporal segmentation. By combining spatial and temporal annotations from a first-person perspective, LynX advances event-based HCI benchmarks, enabling spatio-temporal analysis for low-latency, high-dynamic-range gesture recognition in XR. The dataset is publicly available at: https://huggingface.co/datasets/pietroba/Lynx
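As a quick-start sketch (not part of the published record): the snippet below shows how one might fetch the dataset from the Hugging Face Hub and decode a YOLO-format bounding-box annotation line. Only the repository URL and the YOLO annotation format are confirmed by the abstract; the example label line, file layout, and the 320x320 frame size (the GenX320's native resolution) are illustrative assumptions.

from huggingface_hub import snapshot_download


def parse_yolo_label(line: str, frame_w: int = 320, frame_h: int = 320):
    """Decode one YOLO-format annotation line.

    YOLO lines store: class_id, then box center (cx, cy) and box size
    (w, h), all normalized to [0, 1] relative to the frame dimensions.
    Returns the class id and the pixel-space (x_min, y_min, x_max, y_max).
    """
    class_id, cx, cy, w, h = line.split()
    cx, cy = float(cx) * frame_w, float(cy) * frame_h
    w, h = float(w) * frame_w, float(h) * frame_h
    x_min, y_min = cx - w / 2, cy - h / 2
    return int(class_id), (x_min, y_min, x_min + w, y_min + h)


if __name__ == "__main__":
    # Download a local snapshot of the dataset repository (confirmed URL).
    root = snapshot_download(repo_id="pietroba/Lynx", repo_type="dataset")
    print("Dataset downloaded to:", root)

    # Hypothetical example line: one hand box, centered, quarter-frame size.
    example = "0 0.50 0.50 0.25 0.25"
    cls, box = parse_yolo_label(example)
    print(f"class={cls}, box_px={box}")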
2025 10th International Workshop on Advances in Sensors and Interfaces (IWASI)
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1296015