A method of processing eye-tracking data to study driver’s visual perception in simulated driving scenarios
Y. Shi; A. Ferraioli; B. Piga; L. Mussone; G. Caruso
2020-01-01
Abstract
Short abstract
The paper proposes a method to process eye-tracking data to get a comprehensive understanding of drivers' perception of the straights, curves and intersections of a simulated driving scenario. The method was validated through a testing campaign.

Eye-tracking technology has been successfully applied to human factor studies of driving behavior, gathering not only drivers' behavioral data but also their cognitive data. Gaze position, number of fixations and pupil dilation are the eye measurements commonly used in this research field. Throughout a driving task, gaze positions indicate the driver's eye movements and their coordinates in the driving scenario. The Number of Fixations (NoF), i.e. the number of times the visual gaze remains on a single location, is an effective indicator of which specific areas are more noticeable or more important to the user in a given scenario. Finally, pupil dilation is an important indicator of human physiological status and allows assessing the varying levels of the driver's cognitive load during a driving task. However, the treatment of eye-tracking data is strongly affected by the experimental design and research aims. This paper proposes a method to process eye-tracking data in order to get a comprehensive understanding of drivers' perception of the different elements of simulated driving scenarios.

The study is based on an experimental campaign conducted in 2019, which involved subjects with an average age of 19.25 years. Pupil Labs (https://pupil-labs.com/), a head-mounted eye-tracking system, has been used to collect data. A scene camera on the device records what the subject is looking at, while two eye cameras record both eyes. Before each recording, the eye cameras are calibrated with the scene camera in order to transform x and y coordinates from the eye camera coordinate system to the scene camera coordinate system.

The NoF has been introduced as a parameter measuring how interested in, or attracted to, specific areas of the testing scenario (along straights, curves, and intersections) a user is. A fixation highlights an Area of Interest if it lasts at least a certain threshold (within the range of 200-500 ms according to state-of-the-art references). In order to identify and group similar driving behaviors, the route of the driving scenario has been split into different segments according to geometrical or functional characteristics; this enables the extraction and comparison of the NoF for each specific segment. In addition, an overall comparison of pupil dilation variation allowed the understanding of how mental workload varies among the different segments of the road. After the test, common and differing behaviors in the drivers' eye movements across the different segments of the proposed scenario have been assessed. These findings are also useful for informing further studies, especially when drivers' behavior is compared with real-world behavior.
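To illustrate the segment-based NoF extraction described above, the following Python sketch counts fixations per road segment after discarding those shorter than a minimum duration. This is a minimal sketch under stated assumptions, not the authors' code: the 300 ms threshold is just one value inside the 200-500 ms range cited in the abstract, and the field names, segment labels and boundary times are hypothetical.

```python
from bisect import bisect_right

# Minimal fixation-counting sketch (not the authors' code).
MIN_FIXATION_MS = 300  # chosen within the 200-500 ms range cited in the abstract

# Hypothetical segmentation of the simulated route: (start time in s, label)
SEGMENTS = [(0.0, "straight_1"), (35.0, "curve_1"),
            (58.0, "intersection"), (80.0, "straight_2")]
SEGMENT_STARTS = [start for start, _ in SEGMENTS]

def segment_at(t):
    """Label of the road segment the driver is in at time t (seconds)."""
    idx = bisect_right(SEGMENT_STARTS, t) - 1
    return SEGMENTS[max(idx, 0)][1]

def nof_per_segment(fixations):
    """Number of Fixations (NoF) per segment.

    `fixations` is an iterable of dicts with 'start_timestamp' (s) and
    'duration' (ms), mirroring a typical fixation export; fixations shorter
    than MIN_FIXATION_MS are discarded.
    """
    counts = {label: 0 for _, label in SEGMENTS}
    for fx in fixations:
        if fx["duration"] >= MIN_FIXATION_MS:
            counts[segment_at(fx["start_timestamp"])] += 1
    return counts

# Toy usage with made-up fixation records
example = [{"start_timestamp": 12.4, "duration": 420},
           {"start_timestamp": 40.1, "duration": 180},   # too short, discarded
           {"start_timestamp": 61.7, "duration": 350}]
print(nof_per_segment(example))
# {'straight_1': 1, 'curve_1': 0, 'intersection': 1, 'straight_2': 0}
```

Comparing the resulting counts across drivers then gives, for each straight, curve or intersection, a rough measure of how much visual attention that part of the route attracts.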
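Along the same lines, a minimal sketch of the pupil dilation comparison across segments; the field names are hypothetical, and per-driver z-scoring is one plausible normalisation, not necessarily the one used in the paper.

```python
from statistics import mean, stdev

def mean_dilation_per_segment(samples, segment_of):
    """Mean z-scored pupil diameter per road segment.

    `samples` is an iterable of dicts with 'timestamp' (s) and 'diameter'
    fields (names assumed for this sketch); `segment_of` maps a timestamp to
    a segment label, e.g. segment_at() from the previous sketch.
    """
    diameters = [s["diameter"] for s in samples]
    mu, sigma = mean(diameters), stdev(diameters)

    per_segment = {}
    for s in samples:
        z = (s["diameter"] - mu) / sigma  # normalise against the driver's own baseline
        per_segment.setdefault(segment_of(s["timestamp"]), []).append(z)
    return {label: mean(zs) for label, zs in per_segment.items()}
```

Segments with a higher mean z-score would then be read as imposing a higher mental workload, in line with the workload comparison described in the abstract.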
File | Size | Format
---|---|---
Eyetracker_ETC2020_manuscript_03.pdf (open access, Publisher's version) | 1.02 MB | Adobe PDF