- Evacuation and Crowd Dynamics
- Traffic and Road Safety
- Virtual Reality Applications and Impacts
- Advanced Optical Imaging Technologies
- Data Visualization and Analytics
- Anomaly Detection Techniques and Applications
- Urban Design and Spatial Analysis
- Visual perception and processing mechanisms
- Traffic control and management
- Video Surveillance and Tracking Methods
- Augmented Reality Applications
- Complex Network Analysis Techniques
- Safety Warnings and Signage
- Optical Imaging and Spectroscopy Techniques
- Remote Sensing and Land Use
- Distributed Control Multi-Agent Systems
- Balance, Gait, and Falls Prevention
- Climate Change and Health Impacts
- Thermoregulation and physiological responses
- Human Motion and Animation
- Landslides and related hazards
- Evolutionary Game Theory and Cooperation
- Color perception and design
- Urban Transport and Accessibility
- Leaf Properties and Growth Measurement
Brown University
2010-2023
Meta (United States)
2023
META Health
2023
John Brown University
2022
Google (United States)
2018
Hologic (Germany)
2012-2015
Rensselaer Polytechnic Institute
2009-2013
Providence College
2012
It is commonly believed that global patterns of motion in flocks, schools and crowds emerge from local interactions between individuals, through a process of self-organization. The key to explaining such collective behaviour thus lies in deciphering these interactions. We take an experiment-driven approach to modelling human crowds. Previously, we observed that a pedestrian aligns their velocity vector (speed and heading direction) with that of a neighbour. Here we investigate the neighbourhood of interaction in a crowd: which...
Global patterns of collective motion in bird flocks, fish schools, and human crowds are thought to emerge from local interactions within a neighborhood of interaction, the zone in which an individual is influenced by their neighbors. Both metric and topological neighborhoods have been reported in animal groups, but this question has not been addressed for human crowds. The answer has important implications for modeling crowd behavior and predicting crowd disasters such as jams, crushes, and stampedes. In a metric neighborhood, all neighbors...
The optics between the display and the human eye in a typical VR/AR head-mounted display (HMD) can introduce a common visual defect, local pupil swim (also called ripples or the "orange peel" effect), where virtual content distorts locally with head movement. Compact optical designs (such as pancake optics) are increasingly sensitive to manufacturing tolerances with respect to this perceptual effect. This work provides a method to root-cause and quantify the impact based on modeling, simulation, and measurement.
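As an illustration of how pupil swim might be quantified in a simulation, the sketch below compares the apparent position of a field point at two pupil positions in the eyebox and reports the difference in arcminutes. The distortion model and its coefficients are arbitrary placeholders invented for this example, not the paper's optical model.

```python
import numpy as np

def apparent_angle(field_deg, pupil_mm, k1=1e-4, k2=2e-3):
    """Hypothetical distortion model: the displayed field angle is warped by a
    radial term (k1) plus a pupil-position-dependent term (k2). Both
    coefficients are placeholders chosen only for illustration."""
    return field_deg * (1 + k1 * field_deg**2) + k2 * field_deg * pupil_mm

def pupil_swim_arcmin(field_deg, pupil_a_mm, pupil_b_mm):
    """Local pupil swim: change in the apparent position of a field point
    (in arcmin) when the eye pupil moves from position a to position b."""
    return 60.0 * (apparent_angle(field_deg, pupil_b_mm)
                   - apparent_angle(field_deg, pupil_a_mm))

# Example: a point at 20 deg field angle, pupil shifted laterally by 2 mm.
print(pupil_swim_arcmin(20.0, 0.0, 2.0))   # ~4.8 arcmin with these placeholder values
```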
Many models of crowd behavior are based on local interactions between pedestrians, but little is known about the actual mechanisms governing these interactions. In Experiments 1 and 2, a participant walked with three human 'confederates' or a virtual crowd of 30, while the heading direction or speed of a subset of neighbors was manipulated. In Experiment 3, real crowds of 16 to 20 people walked together in a swarming scenario. We find that pedestrians are unidirectionally coupled to those ahead of them, that the influence of multiple neighbors is linearly combined, and that their weights...
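A minimal sketch of this kind of linear combination is given below, assuming each neighbor's heading contributes with a weight that decays exponentially with distance and that only neighbors ahead of the pedestrian contribute. The weighting function and decay constant are illustrative assumptions, not the fitted model from the study.

```python
import numpy as np

def combined_heading(p, heading, neighbor_pos, neighbor_headings, decay=1.3):
    """Illustrative linear combination of neighbor headings.

    Only neighbors ahead of the pedestrian contribute, and each contribution
    is weighted by a factor that decays exponentially with distance
    (decay constant is an arbitrary placeholder)."""
    forward = np.array([np.cos(heading), np.sin(heading)])
    turn, total_w = 0.0, 0.0
    for q, h in zip(neighbor_pos, neighbor_headings):
        offset = q - p
        dist = np.linalg.norm(offset)
        if dist == 0 or np.dot(offset, forward) <= 0:
            continue                      # ignore neighbors behind the pedestrian
        w = np.exp(-dist / decay)         # influence decays with distance
        # signed heading difference, wrapped to (-pi, pi]
        dh = np.arctan2(np.sin(h - heading), np.cos(h - heading))
        turn += w * dh
        total_w += w
    return heading + (turn / total_w if total_w > 0 else 0.0)
```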
When using a see-through augmented reality head-mounted display system (AR HMD), the user's perception of virtual content may be degraded by a variety of perceptual artifacts resulting from the architecture of the rendering and display pipelines. In particular, content that is rendered to appear stationary in the real world (world-locked) can be susceptible to spatial and temporal 3D position errors. A subset of these errors, termed jitter, results from mismatches between the localization, rendering, and display pipelines, and manifests as perceived motion...
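To make the artifact concrete, the sketch below generates a synthetic jitter trace by perturbing the position of a nominally world-locked point with zero-mean per-frame noise. The noise amplitude, frame rate, and Gaussian noise model are assumptions chosen for illustration, not measurements or the pipeline model from this work.

```python
import numpy as np

def simulate_world_locked_jitter(n_frames=90, fps=90, sigma_mm=0.5, seed=1):
    """Illustrative jitter trace: a world-locked point that should stay fixed,
    perturbed each frame by zero-mean Gaussian position noise (sigma in mm).
    Amplitude and frame rate are placeholder values."""
    rng = np.random.default_rng(seed)
    true_pos = np.zeros((n_frames, 3))                   # intended (stationary) position
    jitter = rng.normal(0.0, sigma_mm / 1000.0, size=(n_frames, 3))
    rendered = true_pos + jitter                         # per-frame rendered position
    t = np.arange(n_frames) / fps
    return t, rendered

t, rendered = simulate_world_locked_jitter()
print("peak displacement (mm):", 1000 * np.abs(rendered).max())
```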
Coherent collective behavior emerges from local interactions between individuals that generate group dynamics. An outstanding question is how to quantify the coordination of non-rhythmic behavior, in order to understand the nature of these dynamics at both a local and a global level. We investigate this problem in the context of small groups of four pedestrians walking to a goal, treating their speed and heading as behavioral variables. To measure the level of coherence, we define a dispersion parameter and employ principal components analysis...
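The sketch below shows one way such measures could be computed from speed and heading time series; the specific dispersion definition (mean absolute deviation from the group mean, with circular handling of heading) and the PCA-based coherence score (variance explained by the first component) are illustrative assumptions, not necessarily the definitions used in the paper.

```python
import numpy as np

def dispersion(speeds, headings):
    """Illustrative dispersion measures for a group at each time step.
    speeds, headings: arrays of shape [time, n_pedestrians]."""
    speed_disp = np.mean(np.abs(speeds - speeds.mean(axis=1, keepdims=True)), axis=1)
    # circular deviation of each heading from the group's mean heading
    mean_h = np.arctan2(np.sin(headings).mean(axis=1), np.cos(headings).mean(axis=1))
    dh = np.arctan2(np.sin(headings - mean_h[:, None]), np.cos(headings - mean_h[:, None]))
    heading_disp = np.mean(np.abs(dh), axis=1)
    return speed_disp, heading_disp

def pca_coherence(series):
    """Fraction of variance captured by the first principal component of the
    stacked behavioral time series (shape [time, n_variables]); values near 1
    indicate a single shared mode of variation across the group."""
    centered = series - series.mean(axis=0)
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    var = s ** 2
    return var[0] / var.sum()
```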
Can the collective behavior of human crowds be explained as an emergent property of local pedestrian interactions? More specifically, is it possible that coordinated movement emerges from hierarchical chains of leader-follower pairs (Nagy, Akos, Biro, & Vicsek, 2010)? To address this issue, we collected data from 5 groups of 4 pedestrians steering toward a common goal. Participants began in a square configuration (0.5, 1.0, 1.5, or 2.5 m sides) and walked across a 12 x 14 m room while their head position and orientation were...
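In the spirit of the cited leader-follower analysis (Nagy et al., 2010), a pairwise test can correlate two pedestrians' heading directions at different time lags and take the lag that maximizes the correlation as evidence of who leads whom. The sketch below is a simplified illustration of that idea, not the exact analysis performed in this study; the lag range is an arbitrary placeholder.

```python
import numpy as np

def directional_correlation_delay(head_i, head_j, max_lag=60):
    """Pairwise leader-follower test (illustrative): correlate the unit heading
    vectors of pedestrians i and j at different lags (in frames) and return the
    lag maximizing the correlation. A positive lag suggests i leads j.
    head_i, head_j: arrays of heading angles (radians) over time."""
    ui = np.column_stack([np.cos(head_i), np.sin(head_i)])
    uj = np.column_stack([np.cos(head_j), np.sin(head_j)])
    best_lag, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = ui[: len(ui) - lag], uj[lag:]
        else:
            a, b = ui[-lag:], uj[: len(uj) + lag]
        if len(a) < 2:
            continue
        corr = np.mean(np.sum(a * b, axis=1))   # mean dot product of unit headings
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag, best_corr
```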
The human visual system is highly attuned to differentiating egocentric retinal velocities from those that arise from independently moving objects. Augmented reality (AR) head-mounted displays (HMDs) challenge these mechanisms by tracking the wearer's head motion and projecting world-locked (WL) virtual content. Spatiotemporal artifacts can occur during WL rendering in HMDs, including high-frequency random fluctuations in content position (jitter). Perceptual sensitivity to AR jitter has been quantified...
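One common way perceptual sensitivity of this kind is quantified is by fitting a psychometric function to detection responses across jitter amplitudes and reading off a threshold. The sketch below illustrates that general approach with a cumulative-Gaussian fit; the response proportions are made-up numbers included only so the example runs, and neither the data nor the threshold criterion comes from this work.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(amplitude, mu, sigma):
    """Cumulative-Gaussian psychometric function for jitter detection."""
    return norm.cdf(amplitude, loc=mu, scale=sigma)

# Hypothetical detection data: proportion of 'jitter seen' responses per amplitude (arcmin).
amps = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
p_seen = np.array([0.05, 0.15, 0.45, 0.85, 0.98])

(mu, sigma), _ = curve_fit(psychometric, amps, p_seen, p0=[2.0, 1.0])
threshold_75 = norm.ppf(0.75, loc=mu, scale=sigma)   # amplitude detected on 75% of trials
print(f"75% detection threshold ~ {threshold_75:.2f} arcmin")
```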
Global patterns of collective motion in bird flocks, fish schools, and human crowds are thought to emerge from local interactions within a neighborhood of interaction, the zone in which an individual is influenced by their neighbors. Both topological and metric neighborhoods have been reported in birds, but this question has not been addressed in humans. With a topological neighborhood, an individual is influenced by a fixed number of nearest neighbors, regardless of their physical distance; whereas with a metric neighborhood, they are influenced by all neighbors within a fixed radius. We test these hypotheses...
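The two neighborhood definitions can be contrasted directly in code. The sketch below selects a pedestrian's neighbors under each rule; the radius and neighbor count are arbitrary placeholder values, not parameters from the study.

```python
import numpy as np

def metric_neighbors(positions, i, radius=3.0):
    """Indices of all neighbors within a fixed radius (m) of pedestrian i."""
    d = np.linalg.norm(positions - positions[i], axis=1)
    return [j for j in range(len(positions)) if j != i and d[j] <= radius]

def topological_neighbors(positions, i, k=4):
    """Indices of the k nearest neighbors of pedestrian i, regardless of distance."""
    d = np.linalg.norm(positions - positions[i], axis=1)
    order = np.argsort(d)
    return [j for j in order if j != i][:k]

# Example: in a loose crowd the two rules can pick different neighbor sets.
rng = np.random.default_rng(0)
positions = rng.uniform(0, 10, size=(12, 2))   # 12 pedestrians in a 10 x 10 m area
print(metric_neighbors(positions, 0))
print(topological_neighbors(positions, 0))
```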
Can human crowd behavior be explained as an emergent property of local rules, as in models of flocking (Reynolds, 1987) and fish schooling (Huth & Wissel, 1992)? Here we derive one such possible 'rule': a dynamical model of following another pedestrian. We collected position data from pairs of pedestrians walking in a 12 x 14 m room, using an inertial/ultrasonic tracking system (IS-900, 60 Hz). The 'leader' (a confederate) walked a straight path. After 3 steps at constant speed, the leader would (a) speed up, (b) slow down,...
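A minimal sketch of one such following rule is shown below: the follower accelerates in proportion to the leader-follower speed difference. This is only an illustrative speed-matching form; the gain, time step, and initial speed are placeholder values, not the fitted parameters of the derived model.

```python
import numpy as np

def simulate_following(leader_speed, dt=1/60, gain=2.0, v0=1.2):
    """Speed-matching sketch of one pedestrian following another: the follower's
    acceleration is proportional to the difference between the leader's and the
    follower's speed. Gain and initial speed are placeholders.
    leader_speed: array of the leader's speed at each 60 Hz time step (m/s)."""
    v = np.empty_like(leader_speed)
    v[0] = v0
    for t in range(1, len(leader_speed)):
        accel = gain * (leader_speed[t - 1] - v[t - 1])
        v[t] = v[t - 1] + accel * dt
    return v

# Example: leader walks at 1.2 m/s for 3 s, then speeds up to 1.5 m/s.
leader = np.concatenate([np.full(180, 1.2), np.full(180, 1.5)])
follower = simulate_following(leader)
```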
Pedestrians in a crowd use visual information to coordinate walking speed and heading direction with their neighbors. Previously, we characterized the strategies used to control these behaviors in pairs of pedestrians (Rio, Rhea, & Warren, 2014; Dacher, 2014). Here we investigate how a participant combines the influence of multiple neighbors, providing a bridge from individual behavior to crowd dynamics. In two experiments, a participant (N=10 per experiment) was instructed to "walk together" with a virtual crowd of 12 simulated humans presented within...
Pedestrians in a crowd are visually coupled to nearby neighbors, yielding a common speed and direction of travel (heading). Previously, we derived a visual control law for walking together with one neighbor (Rio & Warren, VSS 2011; Page 2013). Here, we extend this framework to investigate how multiple neighbors combine to influence a pedestrian. We collected data from a participant (N=10) walking with 3 confederates in a diamond configuration across a 12 x 14 m room, while recording head position with an ultrasonic tracking system. On...
A key feature of augmented reality (AR) is the ability to display virtual content that appears stationary as users move throughout the physical world ('world-locked rendering'). Imperfect world-locked rendering gives rise to perceptual artifacts that can negatively impact the user experience. One example is random variation in the position of objects that are intended to be stationary ('jitter'). The human visual system is highly attuned to detecting moving objects, and moreover it can disambiguate between retinal velocities that arise from object motion...