Patiphon Narksri

ORCID: 0000-0003-0474-9185
Research Areas
  • Autonomous Vehicle Technology and Safety
  • Robotics and Sensor-Based Localization
  • Advanced Optical Sensing Technologies
  • Remote Sensing and LiDAR Applications
  • Robotic Path Planning Algorithms
  • Advanced Neural Network Applications
  • Evacuation and Crowd Dynamics
  • Robotics and Automated Systems
  • Advanced Vision and Imaging
  • Traffic Control and Management
  • Video Surveillance and Tracking Methods

Nagoya University
2018-2022

In this work, we present LIBRE: LiDAR Benchmarking and Reference, a first-of-its-kind dataset featuring 10 different LiDAR sensors, covering a range of manufacturers, models, and laser configurations. Data captured independently from each sensor includes three environment configurations: static targets, where objects were placed at known distances and measured from a fixed position within a controlled environment; adverse weather, where obstacles were measured from a moving vehicle, captured in a weather chamber with the LiDARs exposed to different conditions (fog, rain, ...

10.1109/iv47402.2020.9304681 article EN 2020 IEEE Intelligent Vehicles Symposium (IV) 2020-10-19

Automated vehicle technology has recently become reliant on 3D LiDAR sensing for perception tasks such as mapping, localization, and object detection. This has led to rapid growth in the LiDAR manufacturing industry, with several competing makers releasing new sensors regularly. With this increased variety of LiDARs, each with different properties such as the number of laser emitters, resolution, field-of-view, and price tag, a more in-depth comparison of their characteristics and performance is required. This work compares 10 commonly...

10.1109/access.2020.3009680 article EN cc-by IEEE Access 2020-01-01

In this paper, a slope-robust cascaded ground segmentation method for 3D point clouds for autonomous vehicles is presented. On many challenging terrains encountered by autonomous vehicles, where the ground does not have a simple planar shape, such as sloped roads, existing algorithms fail. The proposed algorithm aims to correctly segment ground points in scans where these terrains are present. The method consists of two main steps. First, the majority of non-ground points are filtered out using the geometry of the sensor and the distance between consecutive rings of the scan. In the second step, a multi-region...

10.1109/itsc.2018.8569534 article EN 2018-11-01
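
The first of the two steps described above (filtering non-ground points using the sensor geometry and the distance between consecutive rings) can be sketched roughly as follows. This is a minimal Python illustration with assumed parameters (ring ordering, a ~25 degree slope threshold) and does not include the second, multi-region step; it is not the published implementation.

import numpy as np

SLOPE_THRESH = np.tan(np.radians(25.0))   # assumed limit on ground slope

def coarse_ground_mask(range_r, height_z):
    """
    First-stage filter on an organized scan (rings x azimuth columns),
    ordered from the lowest ring upward.  On ground (even sloped ground),
    consecutive rings are far apart radially and close in height; on
    obstacles the radial step collapses and the apparent slope spikes.
    """
    n_rings, n_cols = range_r.shape
    mask = np.zeros((n_rings, n_cols), dtype=bool)
    mask[0] = True                        # lowest ring: keep as ground candidates
    for i in range(1, n_rings):
        dr = range_r[i] - range_r[i - 1]  # radial step between consecutive rings
        dz = height_z[i] - height_z[i - 1]
        slope = np.abs(dz) / np.maximum(dr, 1e-3)
        mask[i] = (dr > 0) & (slope < SLOPE_THRESH)
    return mask

# Toy check: flat ground 2 m below the sensor, rings at -15..-5 degrees.
angles = np.radians([-15, -13, -11, -9, -7, -5])
r = np.repeat((2.0 / np.tan(-angles))[:, None], 8, axis=1)
z = np.full_like(r, -2.0)
print(coarse_ground_mask(r, z))           # all True: every point kept as ground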

Autonomous mobile robot navigation in real, unmodified outdoor areas frequented by people going about their business, children playing, fast-running bicycles, and even other robots remains a difficult challenge. For eleven years, the Tsukuba Challenge Real World Robot Challenge (RWRC) has brought together researchers, companies, government, and ordinary citizens in the same space to push forward the limits of autonomous robots. For our 2017 participation, our team proposed to study the problem of sensors-to-actuators (also called...

10.20965/jrm.2018.p0563 article EN cc-by-nd Journal of Robotics and Mechatronics 2018-08-19

In this work, we present a detailed comparison of ten different 3D LiDAR sensors for the tasks of mapping and vehicle localization, using as a common reference the Normal Distributions Transform (NDT) algorithm implemented in the open-source self-driving platform Autoware. The data used in this study is a subset of our LiDAR Benchmarking and Reference (LIBRE) dataset, captured independently from each sensor while driving on public urban roads multiple times, at different times of the day. In this study, we analyze the performance and characteristics of each sensor for (1) mapping, including an...

10.1109/ivworkshops54471.2021.9669244 article EN 2021-07-11
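
For readers unfamiliar with NDT, the core idea is to voxelize the reference map, fit a Gaussian (mean and covariance) to the points in each voxel, and score a candidate pose by how well the transformed scan points fall under those Gaussians. The NumPy sketch below shows only that scoring step; the voxel size, regularization, and x-y-z-yaw pose parameterization are illustrative assumptions, and a full localizer such as Autoware's NDT matching additionally optimizes the pose, typically with Newton iterations.

import numpy as np
from collections import defaultdict

def build_ndt_voxels(map_points, voxel_size=2.0, min_pts=6, reg=1e-3):
    """Fit a Gaussian (mean, inverse covariance) to the map points in each voxel."""
    cells = defaultdict(list)
    for p in map_points:
        cells[tuple(np.floor(p / voxel_size).astype(int))].append(p)
    voxels = {}
    for key, pts in cells.items():
        if len(pts) < min_pts:
            continue
        pts = np.asarray(pts)
        cov = np.cov(pts.T) + reg * np.eye(3)    # regularize near-planar voxels
        voxels[key] = (pts.mean(axis=0), np.linalg.inv(cov))
    return voxels

def ndt_score(scan_points, pose_xyzyaw, voxels, voxel_size=2.0):
    """Sum of per-point Gaussian likelihood terms under a candidate pose."""
    x, y, z, yaw = pose_xyzyaw
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    transformed = scan_points @ R.T + np.array([x, y, z])
    score = 0.0
    for p in transformed:
        key = tuple(np.floor(p / voxel_size).astype(int))
        if key in voxels:
            mean, cov_inv = voxels[key]
            d = p - mean
            score += np.exp(-0.5 * d @ cov_inv @ d)
    return score

# Usage sketch: the map scored against itself at the identity pose.
map_pts = np.random.default_rng(1).uniform(0.0, 20.0, size=(5000, 3))
voxels = build_ndt_voxels(map_pts)
print(ndt_score(map_pts[:200], (0.0, 0.0, 0.0, 0.0), voxels))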

As the operational domain of autonomous vehicles expands, encountering occlusions during navigation becomes unavoidable. Most existing research on occlusion-aware motion planning focuses only on the longitudinal motion of the ego vehicle and neglects its lateral motion, resulting in output that can be overly conservative. This paper proposes a planner capable of actively adjusting the vehicle's lateral position to minimize occlusions. The proposed planner is applicable to various scenarios and can function under perception uncertainty. This work also...

10.1109/access.2022.3178729 article EN cc-by IEEE Access 2022-01-01
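
One simple way to realize the "actively adjusting the lateral position" idea is to sample candidate lateral offsets within the lane, estimate the visibility each would yield, and pick the best one after penalizing deviation from the lane center. The sketch below is a hypothetical illustration of that trade-off (the visibility function, lane width, and weights are assumptions), not the planner evaluated in the paper.

import numpy as np

def best_lateral_offset(candidate_offsets_m, visibility_fn,
                        lane_half_width_m=1.5, deviation_weight=0.2):
    """Score each in-lane lateral offset by the visibility it yields minus a
    small penalty for deviating from the lane center; return the best one."""
    best_offset, best_score = 0.0, float("-inf")
    for d in candidate_offsets_m:
        if abs(d) > lane_half_width_m:
            continue                      # would leave the lane
        score = visibility_fn(d) - deviation_weight * abs(d)
        if score > best_score:
            best_offset, best_score = d, score
    return best_offset

# Hypothetical scene: visibility past an obstruction on the right improves
# as the ego vehicle shifts to the left (negative offsets).
offsets = np.linspace(-1.5, 1.5, 13)
print(best_lateral_offset(offsets, visibility_fn=lambda d: 20.0 - 8.0 * d))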

Navigation in social environments, in the absence of traffic rules, is a difficult task at the core of the annual Tsukuba Challenge. In this context, a better understanding of the soft rules that influence pedestrian dynamics is key to improving robot navigation. Prior research attempts to model this behavior through microscopic interactions, but the resulting emergent behavior depends heavily on the initial conditions, in particular the macroscopic setting. As such, data-driven studies of pedestrians in a fixed environment may provide insight into this aspect, with appropriate...

10.20965/jrm.2018.p0598 article EN cc-by-nd Journal of Robotics and Mechatronics 2018-08-19

Due to the complexity of the environment and the occlusions often present in the scene, autonomous driving in an urban area is a challenging task. In some critical locations, e.g., intersections with low visibility, occlusions need to be taken into account, as failing to do so might lead to a severe accident. In this paper, a method for crossing blind intersections with a mandatory stop, using the estimated visibility of possible approaching vehicles, is proposed. Speed profiles generated by the proposed method were compared with those of an expert driver. The results showed that...

10.1109/itsc.2019.8917323 article EN 2019-10-01
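
The safety reasoning behind such a method can be illustrated with a back-of-the-envelope check: assume a hypothetical vehicle is approaching just beyond the currently visible part of the crossing road at a worst-case speed, and only enter if the ego vehicle can clear the conflict zone before that vehicle arrives. The numbers and the helper below are assumptions for illustration, not the speed-profile generation compared against the expert driver in the paper.

def can_cross_safely(visible_dist_m, hidden_speed_mps,
                     ego_speed_mps, dist_to_clear_m, margin_s=1.0):
    """Worst case: a hypothetical vehicle sits just beyond the visible region
    and approaches at hidden_speed_mps.  Crossing is considered safe only if
    the ego vehicle clears the conflict zone first, with a time margin."""
    if ego_speed_mps <= 0.0:
        return False
    time_for_ego = dist_to_clear_m / ego_speed_mps
    time_for_hidden = visible_dist_m / hidden_speed_mps
    return time_for_ego + margin_s < time_for_hidden

# Example: 25 m of visibility down the crossing road, hidden traffic assumed
# at 40 km/h, and 12 m for the ego vehicle to clear the intersection.
print(can_cross_safely(visible_dist_m=25.0, hidden_speed_mps=40 / 3.6,
                       ego_speed_mps=3.0, dist_to_clear_m=12.0))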

A common approach used for planning blind intersection crossings is to assume that hypothetical vehicles are approaching the intersection at a constant speed from the occluded areas. Such an assumption can result in a deadlock problem, causing the ego vehicle to remain stopped indefinitely due to insufficient visibility. To solve this problem and facilitate a safe, deadlock-free crossing, we propose a planner that utilizes the visibility of both the ego vehicle and the hypothetical vehicles. The planner uses a particle filter with our proposed visibility-dependent behavior model for predicting...

10.3390/electronics10040411 article EN Electronics 2021-02-08
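
The role of the particle filter can be sketched generically: particles represent hypothetical vehicles (distance to the intersection and speed) in the occluded lane, they are propagated with a simple motion model, and particles that would have been visible to the sensors but were not observed are discarded, so growing visibility progressively rules out hidden traffic and resolves the deadlock. The parameters and the constant-speed motion model below are assumptions; the paper's visibility-dependent behavior model is more elaborate.

import numpy as np

rng = np.random.default_rng(0)

def init_particles(n, lane_length_m=80.0, max_speed_mps=16.7):
    """Hypothetical vehicles: [distance to the intersection, speed]."""
    return np.column_stack([rng.uniform(0.0, lane_length_m, n),
                            rng.uniform(0.0, max_speed_mps, n)])

def step(particles, dt, visible_up_to_m, speed_noise=0.5):
    """Propagate with a noisy constant-speed model, then drop particles that
    lie inside the visible part of the lane (a real vehicle there would have
    been detected, so those hypotheses are ruled out)."""
    particles = particles.copy()
    noise = rng.normal(0.0, speed_noise, len(particles)) * dt
    particles[:, 1] = np.clip(particles[:, 1] + noise, 0.0, None)
    particles[:, 0] -= particles[:, 1] * dt
    return particles[particles[:, 0] > visible_up_to_m]

particles = init_particles(1000)
for t in range(10):
    visible_up_to = 10.0 + 2.0 * t        # visibility improves as the ego creeps forward
    particles = step(particles, dt=0.1, visible_up_to_m=visible_up_to)
if len(particles):
    eta = particles[:, 0] / np.maximum(particles[:, 1], 1e-3)
    print(f"{len(particles)} hypotheses remain; earliest possible arrival in ~{eta.min():.1f} s")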

In this work, we present a detailed comparison of ten different 3D LiDAR sensors, covering a range of manufacturers, models, and laser configurations, for the tasks of mapping and vehicle localization, using as a common reference the Normal Distributions Transform (NDT) algorithm implemented in the open-source self-driving platform Autoware. The data used in this study is a subset of our LiDAR Benchmarking and Reference (LIBRE) dataset, captured independently from each sensor while driving on public urban roads multiple times, at different times of the day....

10.48550/arxiv.2004.01374 preprint EN other-oa arXiv (Cornell University) 2020-01-01

As the autonomy level of self-driving vehicles increases, they will be expected to operate safely in increasingly complex environments. During real-world driving, occlusions are inevitable. Therefore, the ability to accurately identify visible and occluded regions surrounding an autonomous vehicle is crucial for safe operation. In this paper, a method for estimating visibility using 3D point clouds and road network maps is proposed. The proposed method projects the positions of lanes, obtained from the map, onto the scan of the driving...

10.1109/itsc48978.2021.9565003 article EN 2021-09-19
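
The projection idea can be approximated in a bird's-eye-view polar grid: sample points along each lane from the map, convert both the LiDAR scan and the lane samples to the sensor's polar frame, and mark a lane sample as occluded whenever some scan return in the same bearing bin is closer. The sketch below assumes a 2D simplification and hypothetical inputs; the paper operates on full 3D point clouds and road network maps.

import numpy as np

def lane_sample_visibility(scan_xy, lane_xy, n_bins=720, range_slack_m=0.5):
    """
    scan_xy: (N, 2) LiDAR returns in the sensor frame (bird's-eye view).
    lane_xy: (M, 2) lane sample points taken from the road network map.
    Returns a boolean array, True where a lane sample is visible.
    """
    def to_polar(xy):
        dist = np.hypot(xy[:, 0], xy[:, 1])
        bearing = np.arctan2(xy[:, 1], xy[:, 0])
        bins = ((bearing + np.pi) / (2.0 * np.pi) * n_bins).astype(int) % n_bins
        return dist, bins

    scan_r, scan_b = to_polar(scan_xy)
    nearest = np.full(n_bins, np.inf)          # nearest obstacle per bearing bin
    np.minimum.at(nearest, scan_b, scan_r)

    lane_r, lane_b = to_polar(lane_xy)
    return lane_r <= nearest[lane_b] + range_slack_m

# Example: a wall of returns 10 m ahead occludes the lane sample beyond it.
scan = np.column_stack([np.full(51, 10.0), np.linspace(-5.0, 5.0, 51)])
lane = np.array([[5.0, 0.0], [20.0, 0.0]])
print(lane_sample_visibility(scan, lane))      # [ True False]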

In this work, we present LIBRE: LiDAR Benchmarking and Reference, a first-of-its-kind dataset featuring 10 different LiDAR sensors, covering a range of manufacturers, models, and laser configurations. Data captured independently from each sensor includes three environment configurations: static targets, where objects were placed at known distances and measured from a fixed position within a controlled environment; adverse weather, where obstacles were measured from a moving vehicle, captured in a weather chamber with the LiDARs exposed to different conditions (fog, rain, ...

10.48550/arxiv.2003.06129 preprint EN other-oa arXiv (Cornell University) 2020-01-01