- Interactive and Immersive Displays
- Augmented Reality Applications
- Data Visualization and Analytics
- Tactile and Sensory Interactions
- Virtual Reality Applications and Impacts
- Usability and User Interface Design
- Gaze Tracking and Assistive Technology
- Multimedia Communication and Technology
- Computer Graphics and Visualization Techniques
- Software Engineering Research
- Innovative Human-Technology Interaction
- Digital Innovation in Industries
- Semantic Web and Ontologies
- Video Analysis and Summarization
- Advanced Software Engineering Methodologies
- Personal Information Management and User Behavior
- Music Technology and Sound Studies
- Visual Attention and Saliency Detection
- 3D Surveying and Cultural Heritage
- Software Testing and Debugging Techniques
- Corporate Governance and Management
- Music and Audio Processing
- Image Processing and 3D Reconstruction
- Educational Games and Gamification
- 3D Modeling in Geospatial Applications
- Technische Universität Dresden (2016-2025)
- Excellence Cluster Universe (2023)
- Association for Computing Machinery (2004-2021)
- Webb Institute (2019)
- Otto-von-Guericke University Magdeburg (2008-2012)
- University Hospital Magdeburg (2008-2012)
- User Interface Design (Germany) (2011-2012)
- Blekinge Institute of Technology (2009-2010)
In recent years, data-driven stories have found a firm footing in journalism, business, and education. They leverage visualization and storytelling to convey information to broader audiences. Likewise, immersive technologies such as augmented and virtual reality devices provide excellent potential for exploring and explaining data, thus inviting research on how storytelling transfers to immersive environments. To gain a better understanding of this exciting novel area, we conducted a scoping review of the emerging notion of storytelling,...
While eye tracking has a high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on that, we propose gaze-supported interaction as a more natural and effective way of combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical techniques for distant displays. Designed according to the principle "gaze suggests, touch confirms", they include an enhanced gaze-directed cursor, local zoom lenses...
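A minimal sketch of the "gaze suggests, touch confirms" idea described above: gaze coarsely indicates a candidate target on the distant display, and only a touch tap on the handheld turns that suggestion into a selection. The names, data, and the fixed suggestion radius are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass
import math

@dataclass
class Target:
    name: str
    x: float
    y: float

def suggest(gaze_x, gaze_y, targets, radius=80.0):
    """Return the target nearest to the gaze point within a coarse radius."""
    best, best_d = None, radius
    for t in targets:
        d = math.hypot(t.x - gaze_x, t.y - gaze_y)
        if d < best_d:
            best, best_d = t, d
    return best

def confirm(suggested, touch_tap):
    """Only a deliberate touch tap turns the gaze suggestion into a selection."""
    return suggested if touch_tap else None

# Example: gaze hovers near "chart_2"; the tap on the handheld confirms it.
targets = [Target("chart_1", 100, 100), Target("chart_2", 520, 310)]
print(confirm(suggest(500, 300, targets), touch_tap=True))
```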
The elegance of using virtual interactive lenses to provide alternative visual representations for selected regions of interest is highly valued, especially in the realm of visualization. Today, more than 50 lens techniques are known in the closer context of visualization, and far more in related fields. In this paper, we extend our previous survey on interactive lenses. We propose a definition and a conceptual model of lenses as extensions of the classic visualization pipeline. An extensive review of the literature covers lenses for different types of data and user tasks...
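To make the lens-as-pipeline-extension idea concrete, here is a small sketch: items inside a region of interest are routed through an alternative visual mapping, while the rest keeps the standard one. The mappings and data are made-up assumptions for illustration only.

```python
def standard_mapping(item):
    # Default representation used by the regular visualization pipeline.
    return {"pos": item["pos"], "color": "gray", "size": 4}

def lens_mapping(item):
    # Alternative representation inside the lens, e.g. emphasized and labeled.
    return {"pos": item["pos"], "color": "red", "size": 10, "label": item["id"]}

def inside_lens(item, lens_center, lens_radius):
    (x, y), (cx, cy) = item["pos"], lens_center
    return (x - cx) ** 2 + (y - cy) ** 2 <= lens_radius ** 2

def render(items, lens_center=(0, 0), lens_radius=0):
    return [
        lens_mapping(it) if inside_lens(it, lens_center, lens_radius)
        else standard_mapping(it)
        for it in items
    ]

items = [{"id": i, "pos": (i * 10, 0)} for i in range(5)]
print(render(items, lens_center=(20, 0), lens_radius=12))
```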
In this work, we propose the combination of large interactive displays with personal head-mounted Augmented Reality (AR) for information visualization to facilitate data exploration and analysis. Even though large displays provide more display space, they are challenging with regard to perception, effective multi-user support, and managing data density and complexity. To address these issues and illustrate our proposed setup, we contribute an extensive design space comprising, first, the spatial alignment of display, visualizations, and objects...
We present DesignAR, an augmented design workstation for creating 3D models. Our approach seamlessly integrates an interactive surface displaying 2D views with head-mounted, stereoscopic Augmented Reality (AR). This creates a combined output space that expands the screen estate and enables placing objects beyond the display borders. For an effective combination of views, we define different levels of proximity and alignment. Regarding input, multi-touch and pen input mitigate issues of precision and ergonomics commonly found...
We present Marvis, a conceptual framework that combines mobile devices and head-mounted Augmented Reality (AR) for visual data analysis. We propose novel concepts and techniques addressing visualization-specific challenges. By showing additional 2D and 3D information around and above the displays, we extend their limited screen space. AR views between displays as well as linking and brushing are also supported, making relationships between separated visualizations plausible. We introduce the design process and rationale of our...
In this paper, we present MIRIA, a Mixed Reality Interaction Analysis toolkit designed to support the in-situ visual analysis of user interaction in mixed reality and multi-display environments. So far, there are few options to effectively explore and analyze interaction patterns in such novel computing systems. With MIRIA, we address this gap by supporting the analysis of movement, spatial interaction, and event data of multiple, co-located users directly in the original environment. Based on our own experiences and an analysis of typical data, tasks, and visualizations used...
Future offices are likely to be reshaped by Augmented Reality (AR), which extends the display space while maintaining awareness of the surroundings and thus promises to support collaborative tasks such as brainstorming or sensemaking. However, it is unclear how physical surroundings and co-located collaboration influence the spatial organization of virtual content for such tasks. Therefore, we conducted a study (N=28) to investigate the effect of office environments and work styles during a document classification task using AR with regard...
In order to improve the three-dimensional (3D) exploration of virtual spaces above a tabletop, we developed a set of navigation techniques using a handheld magic lens. These techniques allow for an intuitive interaction with two-dimensional and 3D information spaces, for which we contribute a classification into volumetric, layered, zoomable, and temporal spaces. The proposed PaperLens system uses a tracked sheet of paper to navigate these spaces with regard to the Z-dimension (height above the tabletop). A formative user study provided valuable feedback...
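For the layered-space case, a minimal sketch of how a tracked lens height could select a layer of a stacked information space; the layer names, height range, and linear mapping are illustrative assumptions rather than the PaperLens implementation.

```python
LAYERS = ["street map", "utility lines", "geology"]   # stacked bottom to top
MIN_Z, MAX_Z = 5.0, 45.0                              # tracked height in cm

def layer_for_height(z_cm):
    """Clamp the tracked height and map it to one of the stacked layers."""
    z = min(max(z_cm, MIN_Z), MAX_Z)
    t = (z - MIN_Z) / (MAX_Z - MIN_Z)                 # normalized 0..1
    index = min(int(t * len(LAYERS)), len(LAYERS) - 1)
    return LAYERS[index]

for z in (7.0, 25.0, 44.0):
    print(z, "->", layer_for_height(z))
```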
Creating and editing large graphs and node-link diagrams are crucial activities in many application areas. For them, we consider multi-touch and pen input on interactive surfaces as very promising. This fundamental work presents a user study investigating how people edit node-link diagrams on an interactive tabletop. The study covers a set of basic operations, such as creating, moving, and deleting diagram elements. Participants were asked to perform spontaneous gestures for 14 given tasks. They could interact in three different ways: using one hand,...
To provide intuitive ways of interacting with media data, this research work addresses the seamless combination of sensor-enabled mobile phones and large displays. A basic set of tilt gestures is introduced for a stepwise or continuous interaction with both mobile applications and distant user interfaces by utilizing the handheld as a remote control. In addition, we introduce throwing gestures to transfer documents to an application running on the distant display. To improve usability, data can be thrown from the phone screen and also fetched back to achieve mobility...
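A rough sketch of the stepwise versus continuous tilt mapping mentioned above: small angles around a dead zone are ignored, crossing a threshold fires a discrete step, and larger angles can instead drive a continuous rate. The thresholds and rate formula are assumptions chosen for illustration.

```python
DEAD_ZONE_DEG = 5.0      # ignore small, unintentional tilts
STEP_DEG = 20.0          # crossing this angle fires one discrete step

def stepwise(tilt_deg):
    """Return -1, 0, or +1 for a single navigation step (e.g., next item)."""
    if abs(tilt_deg) < STEP_DEG:
        return 0
    return 1 if tilt_deg > 0 else -1

def continuous(tilt_deg, max_deg=60.0):
    """Return a scroll/zoom rate in -1..1 proportional to the tilt angle."""
    if abs(tilt_deg) < DEAD_ZONE_DEG:
        return 0.0
    clamped = max(-max_deg, min(max_deg, tilt_deg))
    return clamped / max_deg

print(stepwise(25.0), continuous(25.0))   # one step forward; moderate rate
```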
In information visualization, interaction is commonly carried out using traditional input devices, and visual feedback is usually given on desktop displays. By contrast, recent advances in interactive surface technology suggest combining display and input functionality in a single device for a more direct interaction. With our work, we contribute to the seamless integration of such devices and introduce new ways of visualizing and directly interacting with information. Rather than restricting interaction to the display alone, we explicitly use...
We investigate how to seamlessly bridge the gap between users and distant displays for basic interaction tasks, such as object selection and manipulation. For this, we take advantage of very fast, implicit, yet imprecise gaze- and head-directed input in combination with ubiquitous smartphones for additional manual touch control. We have carefully elaborated two novel and consistent sets of gaze-supported techniques based on touch-enhanced gaze pointers and local magnification lenses. These conflict-free sets allow fluently...
We explore the combination of smartwatches and a large interactive display to support visual data analysis. These two extremes of interactive surfaces are increasingly popular but feature different characteristics: display and input modalities, personal/public use, performance, and portability. In this paper, we first identify possible roles for both devices and the interplay between them through an example scenario. We then propose a conceptual framework that enables analysts to explore data items, track interaction histories, and alter visualization...
Interactive wall-sized displays benefit data visualization. Due to their sheer display size, they make it possible to show large amounts of data in multiple coordinated views (MCV) and facilitate collaborative analysis. In this work, we propose a set of important design considerations and contribute a fundamental input vocabulary and interaction mapping for MCV functionality. We also developed a fully functional application with more than 45 views visualizing real-world, multivariate crime activities, which we used...
We present VISTILES, a conceptual framework that uses a set of mobile devices to distribute and coordinate visualization views for the exploration of multivariate data. In contrast to desktop-based interfaces for information visualization, mobile devices offer the potential to provide a dynamic and user-defined interface supporting co-located collaborative data analysis with different individual workflows. As part of our framework, we contribute concepts that enable users to interact with coordinated & multiple views (CMV) that are distributed across several...
This paper presents Pearl, a mixed-reality approach for the analysis of human movement data in situ. As the physical environment shapes motion and behavior, such analysis can benefit from the direct inclusion of the environment in the analytical process. We present methods for exploring movement in relation to surrounding regions of interest, such as objects, furniture, and architectural elements. We introduce concepts for selecting and filtering data through interaction with the environment, as well as a suite of visualizations for revealing aggregated and emergent spatial and temporal relations. More...
Going beyond established desktop interfaces, researchers have begun re-thinking visualization approaches to make use of alternative display environments and more natural interaction modalities. In this paper, we investigate how spatially-aware mobile displays and a large wall display can be coupled to support graph interaction. For that purpose, we distribute typical views of classic node-link and matrix representations between both displays. The focus of our work lies in novel interaction techniques that enable users with personal...
We present SleeD, a touch-sensitive Sleeve Display that facilitates interaction with multi-touch display walls. Large vertical displays allow multiple users to interact effectively with complex data but are inherently public. Also, they generally cannot provide an interface adapted to the individual user. The combination with an arm-mounted, interactive display allows for personalized interactions. In contrast to hand-held devices, both hands remain free for interacting with the wall. We discuss different levels of coupling between the wearable...
Rapid prototyping of interactive textiles is still challenging, since manual skills, several processing steps, and expert knowledge are involved. We present Iron-On User Interfaces, a novel fabrication approach for empowering designers and makers to enhance fabrics with interactive functionalities. It builds on heat-activated adhesive materials consisting of smart printed electronics, which can be flexibly ironed onto the fabric to create custom interface functionality. To support rapid prototyping in a sketching-like fashion,...
In this work, we report on two comprehensive user studies investigating the perception of Augmented Reality (AR) visualizations as influenced by real-world backgrounds. Since AR is an emerging technology, it is important to also consider productive use cases, which is why we chose an exemplary and challenging Industry 4.0 environment. Our basic perceptual research focuses on both the visual complexity of backgrounds as well as the influence of a secondary task. In contrast to our expectation, the data of 34 study participants indicate that...
Gaze visualizations represent an effective way of gaining fast insights into eye tracking data. However, current approaches do not adequately support studies in three-dimensional (3D) virtual environments. Hence, we propose a set of advanced gaze visualization techniques supporting gaze behavior analysis in such environments. Similar to techniques commonly used for two-dimensional stimuli (e.g., images and websites), we contribute 3D scan paths and attentional maps. In addition, we introduce a models of interest timeline depicting the viewed models,...
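As a minimal sketch of this kind of aggregation, the snippet below attributes each gaze sample in a 3D scene to the model it hit and accumulates dwell time per model, roughly in the spirit of a models-of-interest summary. The sample data and the 60 Hz sampling rate are assumptions for illustration.

```python
from collections import defaultdict

SAMPLE_DT = 1.0 / 60.0   # assumed eye tracker sampling interval in seconds

def attention_per_model(gaze_samples):
    """gaze_samples: iterable of (timestamp, hit_model_or_None)."""
    dwell = defaultdict(float)
    for _, model in gaze_samples:
        if model is not None:          # None = gaze hit no model (background)
            dwell[model] += SAMPLE_DT
    return {model: round(t, 3) for model, t in dwell.items()}

samples = [(i * SAMPLE_DT, "statue") for i in range(120)] + \
          [(2.0 + i * SAMPLE_DT, "column") for i in range(60)]
print(attention_per_model(samples))    # {'statue': 2.0, 'column': 1.0}
```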