Emilie Møllenbach

ORCID: 0000-0003-4463-3734
Research Areas
  • Gaze Tracking and Assistive Technology
  • Tactile and Sensory Interactions
  • Online and Blended Learning
  • Innovative Human-Technology Interaction
  • Glaucoma and Retinal Disorders
  • Teleoperation and Haptic Systems
  • Interactive and Immersive Displays
  • Usability and User Interface Design
  • Human-Automation Interaction and Safety
  • Educational Innovations and Technology
  • Innovative Teaching Methods
  • Virtual Reality Applications and Impacts
  • Fault Detection and Control Systems
  • Personal Information Management and User Behavior
  • Digital Accessibility for Disabilities
  • Design Education and Practice
  • Chaos, Complexity, and Education
  • Complex Systems and Decision Making
  • Innovations in Educational Methods
  • Robotics and Automated Systems
  • Advanced Control Systems Optimization
  • Green IT and Sustainability
  • Innovative Approaches in Technology and Social Development
  • Industrial Automation and Control Systems
  • Online Learning and Analytics

FORCE Technology (Denmark)
2023

Technical University of Denmark
2018

IT University of Copenhagen
2014-2016

University of Copenhagen
2012-2013

Loughborough University
2008-2010

This paper presents a low-cost gaze tracking system based on a webcam mounted close to the user's eye. The performance of the tracker was evaluated in an eye-typing task using two different typing applications. Participants could type between 3.56 and 6.78 words per minute, depending on the typing application used. A pilot study to assess usability was also carried out in the home of a user with severe motor impairments, who successfully typed on a wall-projected interface using his eye movements.

10.1145/1743666.1743685 article EN 2010-01-01
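
The words-per-minute figures above follow the usual typing-speed convention in which one "word" is five characters. A minimal Python sketch of the metric (the function name and example numbers are illustrative, not from the paper):

    def words_per_minute(chars_typed: int, elapsed_seconds: float) -> float:
        """Typing speed with the standard convention: one 'word' = 5 characters."""
        words = chars_typed / 5
        minutes = elapsed_seconds / 60
        return words / minutes

    # Example: 170 characters typed in 5 minutes comes out at 6.8 WPM,
    # close to the upper bound reported in the abstract.
    print(round(words_per_minute(170, 5 * 60), 2))  # 6.8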

This paper presents an experimental investigation of gaze-based control modes for unmanned aerial vehicles (UAVs or "drones"). Ten participants performed a simple flying task. We gathered empirical measures, including task completion time, and examined the user experience in terms of difficulty, reliability, and fun. Four control modes were tested, each applying a combination of x-y gaze movement and manual (keyboard) input to control speed (pitch), altitude, rotation (yaw), and drafting (roll). Participants had similar completion times across all...

10.1145/2578153.2578156 article EN 2014-03-24
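
The abstract does not spell out how each mode assigns gaze and keyboard input to the four axes, so the following Python sketch shows just one hypothetical mode (gaze x to yaw, gaze y to pitch, keys to altitude); the dead zone and gains are assumptions:

    from dataclasses import dataclass

    @dataclass
    class DroneCommand:
        pitch: float     # forward speed
        yaw: float       # rotation
        roll: float      # drafting (sideways)
        altitude: float  # vertical velocity

    def gaze_keyboard_mode(gaze_x: float, gaze_y: float,
                           key_up: bool, key_down: bool) -> DroneCommand:
        """One hypothetical control mode: gaze offsets from the screen centre
        (normalized to [-1, 1]) drive yaw and pitch; keys drive altitude."""
        dead_zone = 0.1  # ignore small gaze offsets near the centre
        yaw = gaze_x if abs(gaze_x) > dead_zone else 0.0
        pitch = -gaze_y if abs(gaze_y) > dead_zone else 0.0  # gaze above centre flies forward
        altitude = (1.0 if key_up else 0.0) - (1.0 if key_down else 0.0)
        return DroneCommand(pitch=pitch, yaw=yaw, roll=0.0, altitude=altitude)

    print(gaze_keyboard_mode(0.5, -0.3, key_up=True, key_down=False))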

This paper presents StarGazer, a new 3D interface for gaze-based interaction and target selection using continuous pan and zoom. Through StarGazer we address the issues of interacting with graph-structured data and applications (i.e. gaze typing systems) using low-resolution eye trackers or small-size displays. We show that it is possible to make selection robust even with a large number of selectable items on screen and noisy trackers. A test with 48 subjects demonstrated that users who have never tried gaze interaction before could rapidly adapt to the navigation...

10.1145/1344471.1344521 article EN 2008-01-01
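
Continuous pan and zoom makes selection robust because the gazed-at region keeps growing until even a noisy gaze estimate falls unambiguously on one target. A toy Python sketch of the idea (the update law and gains are assumptions, not StarGazer's implementation):

    import math

    def pan_zoom_step(cam_x, cam_y, zoom, gaze_x, gaze_y, dt,
                      pan_gain=1.5, zoom_gain=0.4):
        """Drift the virtual camera toward the gazed-at point and magnify
        continuously; a selection fires once zoom passes a threshold."""
        cam_x += pan_gain * (gaze_x - cam_x) * dt
        cam_y += pan_gain * (gaze_y - cam_y) * dt
        zoom *= math.exp(zoom_gain * dt)  # smooth exponential zoom-in
        return cam_x, cam_y, zoom

    # Two seconds of gazing at a target at (10, 5), sampled at 60 Hz.
    cx, cy, z = 0.0, 0.0, 1.0
    for _ in range(120):
        cx, cy, z = pan_zoom_step(cx, cy, z, 10.0, 5.0, dt=1 / 60)
    print(f"camera=({cx:.1f}, {cy:.1f}), zoom={z:.2f}")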

Gaze as a sole input modality must support complex navigation and selection tasks. Gaze interaction combines specific eye movements and graphic display objects (GDOs). This paper suggests a unifying taxonomy of gaze interaction principles. The taxonomy deals with three types of eye movements: fixations, saccades, and smooth pursuits, and with three types of GDOs: static, dynamic, or absent. The taxonomy is qualified through related research and is the first main contribution of this paper. The second part offers an experimental exploration of single stroke gaze gestures (SSGG). The findings...

10.16910/jemr.6.2.1 article EN cc-by Journal of Eye Movement Research 2013-05-09
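
The taxonomy crosses three eye-movement types with three GDO states, giving nine interaction combinations. One way to enumerate the space in Python (the pairing examples in the comment are common readings, not the paper's wording):

    from enum import Enum
    from itertools import product

    class EyeMovement(Enum):
        FIXATION = "fixation"
        SACCADE = "saccade"
        SMOOTH_PURSUIT = "smooth pursuit"

    class GDO(Enum):
        STATIC = "static"
        DYNAMIC = "dynamic"
        ABSENT = "absent"

    # E.g. dwell selection pairs a fixation with a static object, while a
    # gaze gesture can pair a saccade with no display object at all.
    for movement, gdo in product(EyeMovement, GDO):
        print(f"{movement.value} x {gdo.value}")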

We investigate whether the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse pointing, a low-cost webcam eye tracker, and two commercial gaze tracking systems) provide heading and speed control in the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled "hands-free" through gaze. Low precision and image transmission delays had a noticeable effect on performance.

10.1145/1520340.1520671 article EN 2009-04-04
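
Mapping the point of regard in the transmitted camera view to driving commands can be as simple as steering toward the horizontal gaze offset and accelerating with the vertical one. A Python sketch of such a mapping (gains and sign conventions are assumptions):

    def gaze_to_drive(gaze_x: float, gaze_y: float,
                      max_speed: float = 1.0, max_turn: float = 1.0):
        """Map a gaze point in the video frame (normalized to [-1, 1],
        origin at the frame centre, y growing downward) to (speed, turn)."""
        speed = max_speed * max(0.0, -gaze_y)  # looking higher drives faster
        turn = max_turn * gaze_x               # looking left/right steers
        return speed, turn

    print(gaze_to_drive(0.4, -0.8))  # gaze up and to the right: (0.8, 0.4)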

This paper examines gaze gestures and their applicability as a generic selection method for gaze-only controlled interfaces. The gesture explored here is the Single Gaze Gesture (SGG), i.e. a gesture consisting of a single point-to-point eye movement. Horizontal and vertical, long and short SGGs were evaluated on two gaze tracking devices (Tobii/QuickGlance (QG)). The main findings show that there is a significant difference in selection times between long and short SGGs, between vertical and horizontal selections, as well as between the two different systems.

10.1145/1743666.1743710 article EN 2010-01-01
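
Since an SGG is one point-to-point movement, classifying it as horizontal or vertical and long or short reduces to the stroke vector between its start and end points. A Python sketch (the 300 px threshold is an arbitrary assumption):

    import math

    def classify_sgg(start, end, long_threshold=300.0):
        """Label a single point-to-point gaze stroke (pixel coordinates)
        by dominant orientation and by length against a threshold."""
        dx, dy = end[0] - start[0], end[1] - start[1]
        orientation = "horizontal" if abs(dx) >= abs(dy) else "vertical"
        length = "long" if math.hypot(dx, dy) >= long_threshold else "short"
        return f"{length} {orientation}"

    print(classify_sgg((100, 500), (900, 520)))  # long horizontal
    print(classify_sgg((400, 300), (420, 450)))  # short vertical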

This paper introduces and explains the concept of single stroke gaze gestures. Some preliminary results are presented which indicate the potential efficiency of this interaction method. We show how it could be implemented for the benefit of disabled users and, more generally, how it could be integrated with dwell to create a new dimension in gaze controlled interfaces.

10.1145/1520340.1520699 article EN 2009-04-04

Maker communities and hacker spaces engaged in tangible computing are popping up outside the academic setting, driven by curiosity and a desire to learn. This workshop is concerned with how making can be and has been used in an educational setting. Making shifts the focus of education from prescribed tasks towards what people want to know or do.

10.1145/2639189.2654827 article EN 2014-10-21

The experiment described in this paper uses a test environment constructed with two information spaces: one a large space of 2000 nodes ordered in semi-structured groups, in which participants performed search and browse tasks; the other a smaller space designed for precision zooming, in which subjects performed target selection simulation tasks. For both tasks, modes of gaze- and mouse-controlled navigation were compared.

10.1145/1378773.1378833 article EN 2008-01-13

Sustained behavior changes are required to reduce the impact of human society on the environment. Much research on how HCI may help do so focuses on changing behavior by providing information directed at an individual or a microstructure (e.g., a household). We propose societal macrostructures (e.g., municipalities) and their interaction with microstructures as a focus for research aimed at designing for behavior change. We present two ongoing case studies involving municipalities in Denmark and discuss why macrostructures may be a useful basis for design aimed at environmental sustainability.

10.1145/2212776.2223769 article EN 2012-05-05

We demonstrate the potential of adding a gaze tracking unit to a smartwatch, allowing hands-free interaction with the watch itself and control of the environment. Users give commands via gaze gestures, i.e. looking away from and back at the GazeWatch. Rapid presentation of single words on the watch display provides a rich and effective textual interface. Finally, we exemplify how the GazeWatch can be used as a ubiquitous pointer on large displays.

10.1145/2786567.2792899 article EN 2015-08-24
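
Rapid presentation of single words is essentially RSVP (rapid serial visual presentation): the display shows one word at a time at a fixed rate, so even a watch-sized screen can deliver running text. A minimal Python sketch (the rate is an assumption):

    import time

    def rsvp(words, wpm=250):
        """Show one word at a time at a fixed rate, as a stand-in for
        rapid word presentation on a watch-sized display."""
        delay = 60.0 / wpm  # seconds per word
        for word in words:
            print(f"\r{word:^20}", end="", flush=True)
            time.sleep(delay)
        print()

    rsvp("meet me at the station at noon".split())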

We contribute to a project introducing the use of a large single touch-screen as a concept for future airplane cockpits. Human-machine interaction in this new type of cockpit must be optimised to cope with different types of use, in normal conditions as well as during moments of turbulence (which can occur in flights with varying degrees of severity). We propose an original experimental setup for reproducing turbulence (not limited to aviation) based on a touch-screen mounted on a rollercoaster. Participants had to repeatedly solve three basic touch interactions: click, one-finger...

10.1155/2018/2698635 article EN cc-by Advances in Human-Computer Interaction 2018-10-18
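
A basic accuracy measure for the click task is the offset between the touch-down point and the target centre, with a hit/miss decision against the target radius. A Python sketch (names and numbers are illustrative, not the paper's measures):

    import math

    def click_error(touch, target_center, target_radius):
        """Distance from a touch-down point to the target centre plus a
        hit/miss flag -- a simple accuracy measure for touch under motion."""
        dist = math.hypot(touch[0] - target_center[0],
                          touch[1] - target_center[1])
        return dist, dist <= target_radius

    # A touch 12.6 px off the centre of a 40 px-radius button still hits.
    print(click_error((512, 388), (500, 392), target_radius=40))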

In this exploratory case study we map the educational practice of teachers and students in a professional master in Interaction Design. Through a grounded analysis of the context we describe and reflect on: 1) the use of digital learning tools in a blended learning environment, 2) co-presence as an...

10.4108/eai.23-8-2016.151638 article EN cc-by ICST Transactions on Ambient Systems 2016-08-23

The growing competition by cloud service providers to make IoT products affordable and accessible has opened many doors of opportunity for small and medium enterprises with ambitions to digitalize their operations. One such area where most of these companies are investing is cloud-based production optimization. There are several methods of optimization; an important one is to introduce closed-loop control systems that reduce manual involvement in the process. This paper presents an architecture for implementing closed-loop control, relevant...

10.1145/3603166.3632544 article EN 2023-12-04
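
Closed-loop control replaces manual adjustment with a feedback loop that compares a measured process value to a setpoint and computes a correcting command. The paper's contribution is the cloud architecture around such a loop; the sketch below only illustrates the control law itself, as a discrete PI controller in Python (gains and the toy plant are assumptions):

    def pi_controller(setpoint, kp=0.8, ki=0.2, dt=1.0):
        """Generator form of a discrete PI control law: send in the measured
        process value, receive the next actuator command."""
        integral = 0.0
        measurement = yield  # first measurement primes the loop
        while True:
            error = setpoint - measurement
            integral += error * dt
            measurement = yield kp * error + ki * integral

    # Drive a toy first-order process toward a setpoint of 50.
    ctrl = pi_controller(setpoint=50.0)
    next(ctrl)  # prime the generator
    value = 20.0
    for _ in range(20):
        command = ctrl.send(value)
        value += 0.1 * command  # simplistic plant response
    print(round(value, 1))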