Papers due: March 19, 2024
Notifications: April 2, 2024
Camera-ready: April 6, 2024
The COGAIN symposium will be held on June 4, from 1:30 to 5:00 pm, in the Grosvenor Hotel. This is an in-person event, open to all ETRA attendees.
Unlocking Gaze Interaction on Ubiquitous Displays: from Handheld Phones to Situated Displays
Humans can move their eyes quickly and intuitively, using them to naturally express interests and intentions and to regulate conversations. Gaze outpaces pointing and can improve the usability and effectiveness of user interfaces. For these reasons, the use of gaze for input has been explored extensively in the HCI and eye tracking communities. While much of this work has focused on gaze interaction in desktop settings, there are significant opportunities in non-desktop settings, such as swift interaction with situated displays in public areas, or overcoming the limitations of touch input on handheld mobile devices. This talk will present the opportunities that gaze brings to interaction with both public displays and handheld mobile devices, including interaction on the move, such as while walking. The talk will also discuss the key challenges of enabling gaze input in these scenarios, including the unique challenges of gaze interaction during movement, the privacy implications of gaze data collection, and coping with the lack of accurate gaze estimates.
Unsupervised Data Labeling and Incremental Cross-domain Training for Enhanced Hybrid Eye Gaze Estimation, Alejandro Ramos and Javier Rivero (University of the Basque Country), David Lopez (IRISBOND), Unai Elordi and Luis Unzueta (Vicomtech), Arantxa Villanueva (Public University of Navarre) [ACM Digital Library]
Estimating 'Happy' Based on Eye-Behavior Collected from HMD, Mayu Akata and Yoshiki Nishikawa (University of Tsukuba), Toshiya Isomoto (LY Corporation), Buntarou Shizuki (University of Tsukuba) [ACM Digital Library]
The International Society for Clinical Eye Tracking (ISCET) serves as a global platform for promoting international consensus on open standards for clinical eye tracking. Founded in March 2023, ISCET was created to facilitate collaboration, knowledge exchange, and advancements in clinical eye tracking applications, with the ultimate goal of fostering interdisciplinary research and improving clinical outcomes. Through collaborative and interdisciplinary efforts, ISCET's current mission is to provide guidance on conducting eye-tracking tasks in clinical settings, unify clinicians' voices, and maintain reference datasets for normative comparisons. At the Europe/Africa region meeting on January 24, 2024, ISCET established subcommittees to address specific needs. This presentation outlines the rationale behind ISCET's formation, its mission, objectives, ongoing initiatives, and organizational structure. Ongoing work focuses on surveying current international clinical eye tracker usage to inform standards development. [ACM Digital Library]
Attempts on Detecting Alzheimer's Disease by Fine-tuning Pre-trained Model with Gaze Data, Junichi Nagasawa and Yuichi Nakata (Kobe University), Mamoru Hiroe (Osaka Seikei University), Naoki Hojo, Tetsuya Takiguchi, Yuji Maegawa, and Yutaka Kawaguchi (Kobe University), Minoru Nakayama (Tokyo Institute of Technology), Maki Uchimura, Yujia Zheng, Yuma Sonoda, Hisatomo Kowa, and Takashi Nagamatsu (Kobe University) [ACM Digital Library]
A Novel Diagnostic Tool Utilizing Eye Tracking Technology to Allow Objective Assessment of Patients' Cognitive Functions, Pawel Kasprowski (Silesian University of Technology), Grzegorz Żurek (Wroclaw University of Health and Sport Sciences), Roman Olejniczak (Klinika Neurorehabilitacji dr Roman Olejniczak) [ACM Digital Library]
Linking Data from Eye-Tracking and Serious Games to NDD Characteristics: A Bibliometric Study, Are Dæhlen, Ilona Heldal, and Jozsef Katona (Western Norway University of Applied Sciences) [ACM Digital Library]
The Effect of Degraded Eye Tracking Accuracy on Interactions in VR, Ajoy Fernandes and Scott Murdison (Meta Reality Labs Research), Immo Schuetz (Meta Platforms Inc.), Oleg Komogortsev and Michael Proulx (Meta Reality Labs Research) [ACM Digital Library]
Between Wearable and Spatial Computing: Exploring Four Interaction Techniques at the Intersection of Smartwatches and Head-mounted Displays, Nuno Estalagem and Augusto Esteves (University of Lisbon) [ACM Digital Library]
🏆 Best Paper Award
A Functional Usability Analysis of Appearance-Based Gaze Tracking for Accessibility, Youn Soo Park and Roberto Manduchi (UC Santa Cruz) [ACM Digital Library] [Video Presentation]
🏆 Honorable Mention Award
LookToFocus: Image Focus via Eye Tracking, SaiKiran Tedla, Scott MacKenzie, and Michael Brown (York University) [ACM Digital Library] [Video Presentation]
Hand Me This: Exploring the Effects of Gaze-driven Animations and Hand Representations in Users’ Sense of Presence and Embodiment, Noha Mokhtar and Augusto Esteves (University of Lisbon) [ACM Digital Library] [Video Presentation]