Siyou Pei

ORCID: 0000-0003-3802-8298
Research Areas
  • Gaze Tracking and Assistive Technology
  • Virtual Reality Applications and Impacts
  • Tactile and Sensory Interactions
  • Interactive and Immersive Displays
  • Spinal Cord Injury Research
  • Augmented Reality Applications
  • Indoor and Outdoor Localization Technologies
  • Context-Aware Activity Recognition Systems
  • Teleoperation and Haptic Systems
  • Robot Manipulation and Learning
  • Personal Information Management and User Behavior
  • Advanced Sensor and Energy Harvesting Materials
  • Dielectric Materials and Actuators
  • Ocular and Laser Science Research
  • Green IT and Sustainability
  • Surface Roughness and Optical Measurements
  • Mobile Crowdsensing and Crowdsourcing
  • Human Pose and Action Recognition
  • Inertial Sensor and Navigation
  • Robotics and Automated Systems
  • Stroke Rehabilitation and Recovery

University of California, Los Angeles
2022-2024

Augmented reality (AR) and virtual reality (VR) technologies create exciting new opportunities for people to interact with computing resources and information. Less exciting is the need for holding hand controllers, which limits applications that demand expressive, readily available interactions. Prior research has investigated freehand AR/VR input by transforming the user's body into an interaction medium. In contrast to previous work that has users' hands grasp objects, we propose a technique that lets hands become objects by imitating...

10.1145/3491102.3501898 article EN CHI Conference on Human Factors in Computing Systems 2022-04-28

Extended reality (XR) has the potential for seamless user interface (UI) transitions across people, objects, and environments. However, the design space, applications, and common practices of 3D UI mobility control remain underexplored. To address this gap, we conducted a need-finding study with 11 participants, identifying and distilling a taxonomy based on three types of placements: affixed to static, dynamic, or self entities. We further surveyed 113 commercial applications to understand mobility control in practice, where only 6.2%...

10.1145/3613904.3642220 article EN cc-by-sa 2024-05-11
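
A purely illustrative sketch (Python, not from the paper) of how the three placement types above could be represented as anchor modes that determine which entity's pose a 3D panel follows; the names AnchorType, Pose, and UIPanel are hypothetical.

# Hypothetical sketch: the taxonomy's three placement types as anchor modes.
from dataclasses import dataclass
from enum import Enum, auto

class AnchorType(Enum):
    STATIC = auto()   # affixed to a static entity, e.g., a wall or table
    DYNAMIC = auto()  # affixed to a dynamic entity, e.g., a moving person or object
    SELF = auto()     # affixed to the user, e.g., head- or hand-locked

@dataclass
class Pose:
    x: float
    y: float
    z: float

@dataclass
class UIPanel:
    anchor: AnchorType
    offset: Pose  # offset relative to the anchored entity

    def world_pose(self, entity_pose: Pose) -> Pose:
        # Resolve the panel's world pose from the pose of whatever entity it is anchored to.
        return Pose(entity_pose.x + self.offset.x,
                    entity_pose.y + self.offset.y,
                    entity_pose.z + self.offset.z)

# A self-anchored panel follows the user's head pose each frame.
panel = UIPanel(anchor=AnchorType.SELF, offset=Pose(0.0, -0.1, 0.5))
print(panel.world_pose(Pose(1.2, 1.6, 0.0)))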

Smart ear-worn devices (called earables) are being equipped with various onboard sensors and algorithms, transforming earphones from simple audio transducers to multi-modal interfaces that make rich inferences about human motion and vital signals. However, developing sensory applications using earables is currently quite cumbersome, with several barriers in the way. First, time-series data from earable sensors incorporate information about physical phenomena in complex settings, requiring machine-learning (ML) models learned...

10.1145/3534586 article EN public-domain Proceedings of the ACM on Interactive Mobile Wearable and Ubiquitous Technologies 2022-07-04

Acquiring accessibility information about unfamiliar places in advance is essential for wheelchair users to make better decisions about physical visits. Today's assessment approaches, such as phone calls, photos/videos, or 360° virtual tours, often fall short of providing the specific details needed to accommodate individual differences. For example, they may not reveal crucial details, like whether the legroom underneath a table is spacious enough or whether the spatial configuration of an appliance is convenient for wheelchair users. In response, we present...

10.1145/3597638.3608410 article EN cc-by-nc 2023-10-19

Force sensing has been a key enabling technology for a wide range of interfaces, such as digitally enhanced body and world surfaces for touch interactions. Additionally, force often contains rich contextual information about user activities and can be used to enhance machine perception for improved environment awareness. To sense force, conventional approaches rely on contact sensors made of pressure-sensitive materials such as piezo films/discs or force-sensitive resistors. We present ForceSight, a non-contact...

10.1145/3526113.3545622 article EN 2022-10-28

Existing pose estimation models perform poorly on wheelchair users due to a lack of representation in training data. We present a data synthesis pipeline to address this disparity in data collection and subsequently improve pose estimation performance for wheelchair users. Our configurable pipeline generates synthetic data of wheelchair users using motion capture data and motion generation outputs simulated in the Unity game engine. We validated our pipeline by conducting a human evaluation, investigating perceived realism and diversity, and an AI evaluation on a set of datasets from our pipeline that synthesized different...

10.1145/3613904.3642555 article EN cc-by-nc-sa 2024-05-11
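
The entry above describes a configurable pipeline that turns motion-capture and motion-generation outputs into simulated, labeled training data. Below is a hypothetical Python outline of that kind of pipeline, assuming made-up names (SynthesisConfig, render_frame) and a placeholder where a game-engine render job would run; it is a sketch, not the authors' implementation.

# Hypothetical outline of a configurable synthetic-data pipeline.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SynthesisConfig:
    motion_clips: List[str] = field(default_factory=list)              # motion clips to replay
    camera_angles: List[float] = field(default_factory=lambda: [0.0, 45.0, 90.0])
    backgrounds: List[str] = field(default_factory=lambda: ["indoor", "outdoor"])
    frames_per_clip: int = 100

def render_frame(clip: str, angle: float, background: str, frame_idx: int) -> Dict:
    # Placeholder for a game-engine render call that would return an image
    # path plus ground-truth keypoint annotations exported by the renderer.
    return {"image": f"{clip}_{background}_{angle:.0f}_{frame_idx:04d}.png", "keypoints": []}

def synthesize(config: SynthesisConfig) -> List[Dict]:
    # Enumerate clip x viewpoint x background combinations into a labeled dataset.
    samples = []
    for clip in config.motion_clips:
        for angle in config.camera_angles:
            for bg in config.backgrounds:
                for i in range(config.frames_per_clip):
                    samples.append(render_frame(clip, angle, bg, i))
    return samples

dataset = synthesize(SynthesisConfig(motion_clips=["clip_01", "clip_02"], frames_per_clip=10))
print(len(dataset), "synthetic samples")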

Existing haptic actuators are often rigid and limited in their ability to replicate real-world tactile sensations. We present a wearable haptic artificial muscle skin (HAMS) based on fully soft, millimeter-scale, multilayer dielectric elastomer actuators (DEAs) capable of significant out-of-plane deformation, a capability that typically requires rigid or liquid biasing. The DEAs use a thickness-varying structure to achieve large displacement and force while maintaining comfort and wearability. Experimental results demonstrate that HAMS...

10.1126/sciadv.adr1765 article EN cc-by-nc Science Advances 2024-10-25

Human attention is a scarce resource in modern computing. A multitude of microtasks vie for user attention to crowdsource information, perform momentary assessments, personalize services, and execute actions with a single touch. A lot gets done when these tasks take up the invisible free moments of the day. However, an interruption at an inappropriate time degrades productivity and causes annoyance. Prior works have exploited contextual cues and behavioral data to identify interruptibility with much success. With Quick Question,...

10.48550/arxiv.2007.09515 preprint EN other-oa arXiv (Cornell University) 2020-01-01
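
The entry above builds on using contextual cues and behavioral data to decide when a user can be interrupted with a microtask. The sketch below is illustrative only: a hand-set logistic scorer over made-up features stands in for the learned models such systems actually use.

# Illustrative-only interruptibility scorer; feature names and weights are invented.
import math

def interruptibility_score(features: dict) -> float:
    # Map contextual cues to a 0..1 score; higher means a better moment to interrupt.
    weights = {
        "is_walking": -1.5,                 # moving users are harder to interrupt
        "screen_on": 1.0,                   # user is already engaged with the phone
        "in_meeting": -2.0,                 # calendar says busy
        "minutes_since_last_prompt": 0.05,  # avoid back-to-back prompts
    }
    bias = -0.5
    z = bias + sum(weights[k] * float(v) for k, v in features.items() if k in weights)
    return 1.0 / (1.0 + math.exp(-z))

context = {"is_walking": 0, "screen_on": 1, "in_meeting": 0, "minutes_since_last_prompt": 30}
print("deliver microtask now" if interruptibility_score(context) > 0.6 else "defer microtask")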

Existing pose estimation models perform poorly on wheelchair users due to a lack of representation in training data. We present a data synthesis pipeline to address this disparity in data collection and subsequently improve pose estimation performance for wheelchair users. Our configurable pipeline generates synthetic data of wheelchair users using motion capture data and motion generation outputs simulated in the Unity game engine. We validated our pipeline by conducting a human evaluation, investigating perceived realism and diversity, and an AI evaluation on a set of datasets from our pipeline that synthesized different...

10.48550/arxiv.2404.17063 preprint EN arXiv (Cornell University) 2024-04-25

Embodied interaction has been introduced to human-robot interaction (HRI) as a type of teleoperation, in which users control robot arms with bodily action via handheld controllers or haptic gloves. Embodied teleoperation has made robot control intuitive for non-technical users, but differences between humans' and robots' capabilities, e.g., ranges of motion and response time, remain challenging. In response, we present Arm Robot, an embodied robot arm system that helps tackle these discrepancies. Specifically, Arm Robot (1) includes AR visualization for real-time...

10.48550/arxiv.2411.13851 preprint EN arXiv (Cornell University) 2024-11-21

Auritus is an extendable and open-source optimization toolkit designed to enhance and replicate earable applications. It serves two primary functions. Firstly, it handles data collection, pre-processing, and labeling tasks for creating customized datasets using graphical tools. The system includes a dataset with 2.43 million inertial samples related to head and full-body movements, consisting of 34 poses and 9 activities from 45 volunteers. Secondly, it provides a tightly-integrated hardware-in-the-loop (HIL) optimizer...

10.1145/3544793.3563423 article EN 2022-09-11