Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations
Sensor Fusion
DOI: 10.4271/2020-01-0093
Publication Date: 2020-04-14
AUTHORS (7)
ABSTRACT
<div class="section abstract"><div class="htmlview paragraph">Autonomous vehicle technology has the potential to improve the safety, efficiency, and cost of our current transportation system by removing human error. With the sensors available today, it is possible to develop these vehicles; however, there are still issues with autonomous operations in adverse weather conditions (e.g., snow-covered roads, heavy rain, fog, etc.) due to degradation of sensor data quality and insufficiently robust software algorithms. Since autonomous vehicles rely entirely on sensor data to perceive their surrounding environment, this becomes a significant issue for the performance of the system. The purpose of this study was to collect sensor data under various weather conditions to understand the effects of weather on the data. The sensors used were one camera and one LiDAR. These sensors were connected to an NVIDIA Drive PX2, which operated in a 2019 Kia Niro. Two custom scenarios (static and dynamic objects) were chosen while operating in four real-world weather conditions: fair, cloudy, rainy, and light snow. An algorithm developed herein was used to provide a method of quantifying the data for comparison against other weather conditions. The results from the algorithm show that sensor data quality degrades on average by 13.88% for static objects and 16.16% for dynamic objects while operating in adverse weather conditions, with rain proving to have the most effect on data degradation. From this study, it is hypothesized that advancements in data processing can improve the usability of degraded sensor data. In future work, we seek to explore fault-tolerant sensor fusion to overcome adverse weather.</div></div>
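The abstract describes quantifying sensor-data degradation by comparing data collected in adverse weather against a fair-weather baseline. The paper's actual algorithm is not reproduced here; the sketch below is a minimal illustration of that kind of comparison, assuming a simple metric (mean valid LiDAR returns per frame) and hypothetical per-frame counts.

```python
# Hypothetical sketch of a degradation comparison against a fair-weather
# baseline. The metric (mean valid LiDAR returns per frame) and the sample
# numbers are illustrative assumptions, not the paper's actual algorithm.

def degradation_pct(baseline_points, condition_points):
    """Percent drop in mean valid returns versus the fair-weather baseline."""
    base = sum(baseline_points) / len(baseline_points)
    cond = sum(condition_points) / len(condition_points)
    return 100.0 * (base - cond) / base

# Hypothetical per-frame counts of valid LiDAR returns for the same scene.
fair = [30000, 29500, 30200]   # fair-weather baseline
rain = [25100, 24800, 25400]   # same scene recorded in rain

print(f"rain degradation: {degradation_pct(fair, rain):.2f}%")
```

A per-condition table of such percentages, averaged over frames and scenarios, would support the kind of cross-weather comparison the study reports.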
REFERENCES (20)
CITATIONS (28)