Hide My Gaze!
Hide My Gaze! investigates closed-eye gaze gesture password input on smart glasses for mobile authentication. It senses closed-eye gaze gestures with cameras and/or electrooculography (EOG) sensors built into the frames of the glasses. The advantage of closed-eye over regular, open-eye gaze gesture passwords is that observing the password, which is composed of the user's eye movements, is significantly more difficult for attackers who can see the user's eyes during password input: with open eyes, attackers can observe pupil movements directly, whereas with closed eyes they can only observe eye movements through the eyelids.
Research questions
Hide My Gaze! asks and answers three core research questions on employing closed-eye gaze gesture passwords for mobile authentication with cameras and EOG sensors in smart glasses:
(1) How well can closed-eye gaze gestures be recognized with cameras and EOG sensors built into smart glasses frames?
(2) How well can closed-eye gaze gesture passwords be attacked compared to their open-eye counterparts, given attackers who are able to observe the user's eyelid movements or pupil movements during password input?
(3) How do users and attackers perceive the usability and attackability of closed-eye and open-eye gaze gesture passwords?
Datasets
The Hide My Gaze! dataset contains sub-datasets for camera and EOG data as well as for data with closed and opened eyes.
Camera dataset
The Hide My Gaze! camera dataset contains a total of 219 closed-eye password samples by 20 participants. Each participant chose a self-created gaze password containing 4-5 gaze gestures.
The recording device was a Pupil Labs Pupil v2 (second generation) eye tracker, which has the form factor of smart glasses. We used a camera resolution of 400×400 pixels at a 120 Hz sampling rate.
Original videos: mp4 video recordings of camera-based closed-eye gaze gesture password samples, alongside a csv metadata file with basic information about the passwords.
Extracted eye movements: csv files with eye movement timeseries extracted from the videos above with optical flow, alongside an extended version of the corresponding metadata csv file.
Extracted segments: csv files with eye movement timeseries segments for gaze gestures, detected in and extracted from the eye movement timeseries above, alongside a csv metadata file with basic information about the gaze gesture segments.
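As an illustration of how gesture segments can be detected in such an eye-movement timeseries, the sketch below segments a 2D trace wherever the per-sample movement magnitude exceeds a threshold. This is a minimal, hypothetical sketch: the column layout, threshold, and minimum-length parameter are assumptions for illustration, not the project's actual detection pipeline.

```python
# Hypothetical sketch: segment gaze gestures from x/y eye-movement
# timeseries by thresholding per-sample movement magnitude.
# Threshold and min_len are illustrative values, not the paper's.

def segment_gestures(xs, ys, threshold=1.0, min_len=3):
    """Return (start, end) index pairs where movement magnitude
    exceeds `threshold` for at least `min_len` consecutive samples."""
    segments, start = [], None
    for i in range(1, len(xs)):
        mag = ((xs[i] - xs[i - 1]) ** 2 + (ys[i] - ys[i - 1]) ** 2) ** 0.5
        if mag > threshold:
            if start is None:
                start = i
        elif start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(xs) - start >= min_len:
        segments.append((start, len(xs)))
    return segments

# Toy trace: idle, a horizontal sweep, idle again.
xs = [0, 0, 0, 2, 4, 6, 8, 8, 8]
ys = [0] * 9
print(segment_gestures(xs, ys))  # [(3, 7)]
```

A real pipeline would additionally smooth the timeseries and classify each extracted segment against the gesture alphabet; this sketch only shows the segmentation step.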
EOG dataset
The Hide My Gaze! EOG dataset contains 81 password samples by 15 participants, comprising a total of 380 gaze gestures.
Each participant chose two self-created gaze passwords, each containing 4-5 gaze gestures as password characters. Each participant performed each of their two passwords 2-5 times, both with closed and with opened eyes. The recording device was a pair of first-generation JINS MEME ES_R smart glasses. We used an EOG sensor sampling rate of 120 Hz with 12-bit quantization.
EOG: closed eyes
Extracted eye movements: contains csv files with eye movement timeseries from the JINS MEME smart glasses, alongside one metadata csv with filenames, subject IDs, and password sequences.
EOG: opened eyes
Extracted eye movements: contains csv files with eye movement timeseries from the JINS MEME smart glasses, alongside one metadata csv with filenames, subject IDs, and password sequences.
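The metadata csv files described above map each recording to its subject and password sequence. A minimal sketch of working with such a file is shown below; the column names (`filename`, `subject_id`, `password`) and sample values are assumptions for illustration, not the dataset's documented schema.

```python
# Hypothetical sketch: parse an EOG metadata csv and group recording
# filenames by subject. Column names and values are illustrative only.
import csv
import io

# Stand-in for reading the actual metadata file from disk.
sample = """filename,subject_id,password
s01_closed_01.csv,s01,LRUD
s01_closed_02.csv,s01,LRUD
"""

rows = list(csv.DictReader(io.StringIO(sample)))
by_subject = {}
for row in rows:
    by_subject.setdefault(row["subject_id"], []).append(row["filename"])

print(by_subject)  # {'s01': ['s01_closed_01.csv', 's01_closed_02.csv']}
```

With a grouping like this, each subject's timeseries files can then be loaded and compared against the password sequence recorded in the same metadata row.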
Publications
Hide my Gaze with EOG! Towards Closed-Eye Gaze Gesture Passwords that Resist Observation-Attacks with Electrooculography in Smart Glasses.
Findling, R. D.; Quddus, T.; and Sigg, S.
In 17th International Conference on Advances in Mobile Computing and Multimedia, 2019.
@InProceedings{Findling_19_HidemyGaze,
author = {Rainhard Dieter Findling and Tahmid Quddus and Stephan Sigg},
booktitle = {17th International Conference on Advances in Mobile Computing and Multimedia},
title = {Hide my Gaze with {EOG}! {T}owards Closed-Eye Gaze Gesture Passwords that Resist Observation-Attacks with Electrooculography in Smart Glasses},
year = {2019},
abstract = {Smart glasses allow for gaze gesture passwords as a hands-free form of mobile authentication. However, pupil movements for password input are easily observed by attackers, who thereby can derive the password. In this paper we investigate closed-eye gaze gesture passwords with EOG sensors in smart glasses. We propose an approach to detect and recognize closed-eye gaze gestures, together with a 7 and 9 character gaze gesture alphabet. Our evaluation indicates good gaze gesture detection rates. However, recognition is challenging specifically for vertical eye movements with 71.2\%-86.5\% accuracy and better results for opened than closed eyes. We further find that closed-eye gaze gesture passwords are difficult to attack from observations with 0\% success rate in our evaluation, while attacks on open eye passwords succeed with 61\%. This indicates that closed-eye gaze gesture passwords protect the authentication secret significantly better than their open eye counterparts.},
url_Paper = {http://ambientintelligence.aalto.fi/paper/findling_closed_eye_eog.pdf},
project = {hidemygaze},
group = {ambience}
}
Closed-Eye Gaze Gestures: Detection and Recognition of Closed-Eye Movements with Cameras in Smart Glasses.
Findling, R. D.; Nguyen, L. N.; and Sigg, S.
In International Work-Conference on Artificial Neural Networks, 2019.
@InProceedings{Rainhard_2019_iwann,
author={Rainhard Dieter Findling and Le Ngu Nguyen and Stephan Sigg},
title={Closed-Eye Gaze Gestures: Detection and Recognition of Closed-Eye Movements with Cameras in Smart Glasses},
booktitle={International Work-Conference on Artificial Neural Networks},
year={2019},
doi = {10.1007/978-3-030-20521-8_27},
abstract ={Gaze gestures bear potential for user input with mobile devices, especially smart glasses, due to being always available and hands-free. So far, gaze gesture recognition approaches have utilized open-eye movements only and disregarded closed-eye movements. This paper is a first investigation of the feasibility of detecting and recognizing closed-eye gaze gestures from close-up optical sources, e.g. eye-facing cameras embedded in smart glasses. We propose four different closed-eye gaze gesture protocols, which extend the alphabet of existing open-eye gaze gesture approaches. We further propose a methodology for detecting and extracting the corresponding closed-eye movements with full optical flow, time series processing, and machine learning. In the evaluation of the four protocols we find closed-eye gaze gestures to be detected 82.8%-91.6% of the time, and extracted gestures to be recognized correctly with an accuracy of 92.9%-99.2%.},
url_Paper = {http://ambientintelligence.aalto.fi/findling/pdfs/publications/Findling_19_ClosedEyeGaze.pdf},
project = {hidemygaze},
group = {ambience}}