Eye Blink Detection in Sign Language Data Using CNNs and Rule-Based Methods
Chapter
Published version
Date: 2024
Original version: Proceedings of the 11th Workshop on the Representation and Processing of Sign Languages, pages 361–369.

Abstract
Eye blinks are used in a variety of sign languages as prosodic boundary markers. However, no cross-linguistic quantitative research on eye blinks exists. To facilitate such research in the future, we develop and test several methods of automatic eye-blink identification, based on a linguistic definition of blinks, on a dataset of a natural sign language (French Sign Language). We compare two main approaches to eye-openness detection: calculating the Eye Aspect Ratio using MediaPipe, and training CNNs to detect openness directly from frames of the video recordings. For the CNN approach, we train several models, varying the number of signers in the training data, the frame crop, and the number of training epochs. We then combine the openness detection with a separate rule-based component that determines the boundaries of blink events. We demonstrate that both methods perform relatively well, and discuss their practical implications.
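The abstract names two components that can be sketched concretely: the Eye Aspect Ratio (EAR), a standard openness measure computed from six eye landmarks (such as those returned by MediaPipe Face Mesh), and a rule-based pass that turns a per-frame openness signal into blink events. The sketch below is illustrative, not the authors' implementation: the function names, the threshold value, and the minimum-duration rule are assumptions, and in practice the landmark coordinates would come from a face-landmark detector rather than be hand-specified.

```python
import math

def ear(p1, p2, p3, p4, p5, p6):
    """Eye Aspect Ratio from six (x, y) eye landmarks.

    p1/p4 are the horizontal eye corners; (p2, p6) and (p3, p5) are
    vertical landmark pairs. EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|);
    it approaches 0 as the eye closes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def detect_blinks(openness, threshold=0.2, min_frames=2):
    """Hypothetical rule-based boundary detection: a blink is a run of at
    least `min_frames` consecutive frames whose openness falls below
    `threshold`. Returns a list of (start, end) frame indices, inclusive.
    """
    blinks = []
    start = None
    for i, value in enumerate(openness):
        if value < threshold:
            if start is None:
                start = i          # potential blink onset
        else:
            if start is not None and i - start >= min_frames:
                blinks.append((start, i - 1))
            start = None
    # Handle a blink that runs to the end of the sequence.
    if start is not None and len(openness) - start >= min_frames:
        blinks.append((start, len(openness) - 1))
    return blinks
```

For example, a symmetric open eye with corners at (0, 0) and (4, 0) and vertical landmarks one unit above and below yields an EAR of 0.5, while a fully closed eye yields 0; feeding the resulting per-frame EAR series into `detect_blinks` produces candidate blink intervals that a linguistically motivated component could then filter further.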