Vol. 16 No. 1 (2026)
Articles

Adaptive Sensor Fusion System for Low-Visibility Vehicle Navigation with HUD Point Cloud Rendering

Azad Mohammed Shaik
BSWE Platform Engineer, Stellantis, Troy, 48085, USA.

Published 2026-02-18

Keywords

  • Sensor fusion, LiDAR, autonomous vehicles, low-visibility navigation, heads-up display, point cloud rendering, fog detection, ROS 2, DBSCAN clustering, U.S. Patent 12371046.

Abstract

This paper describes how I built and tested a patented adaptive sensor-fusion system for autonomous driving in low-visibility conditions. Based on U.S. Patent 12371046B2, the system continuously estimates how clear the environment is and automatically switches from camera-based to LiDAR-based navigation when visibility drops. It also projects the LiDAR view (point-cloud information) onto a windshield heads-up display (HUD) so the driver or safety operator can clearly see what lies ahead. To judge visibility, I use a hybrid metric that combines image contrast and edge density; it classifies visibility conditions with 94.5% accuracy. For LiDAR perception, the system applies DBSCAN clustering to detect objects from point clouds in real time. In heavy fog, it detects objects reliably up to 50 meters (100% detection rate), whereas a camera-only approach drops sharply (16.7%). Switching between camera and LiDAR completes in 183 ms, so transitions feel smooth. Across 425 test scenarios, the system extends the detection range in heavy fog by 316.7% and keeps end-to-end processing at 73.1 ms, supporting real-time operation at 30 Hz. Overall, it meets and exceeds safety requirements with 99.98% availability, showing the patented approach is practical for real vehicles in bad weather.
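
As a rough illustration of the hybrid visibility metric, the Python sketch below blends RMS image contrast with Canny edge density. The specific operators, normalization constants, and weights (w_contrast, w_edges, and the 0.3/0.1 scale factors) are illustrative assumptions; the patent's calibrated metric and decision threshold are not reproduced on this page.

    import cv2
    import numpy as np

    def visibility_score(frame_bgr, w_contrast=0.5, w_edges=0.5):
        # Hybrid metric: weighted blend of RMS contrast and Canny edge
        # density. Weights and scale factors here are illustrative only.
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
        contrast = float(gray.std())  # RMS contrast of the grayscale frame
        edges = cv2.Canny((gray * 255).astype(np.uint8), 50, 150)
        edge_density = float(np.count_nonzero(edges)) / edges.size
        # Clamp both terms to roughly [0, 1] before blending.
        return (w_contrast * min(contrast / 0.3, 1.0)
                + w_edges * min(edge_density / 0.1, 1.0))

    def is_low_visibility(frame_bgr, threshold=0.4):
        # threshold is a placeholder, not the system's tuned value.
        return visibility_score(frame_bgr) < threshold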
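
The abstract names DBSCAN as the clustering step for real-time LiDAR object detection. A minimal sketch using scikit-learn follows; the eps and min_samples values are placeholders, since the tuned parameters are not given here.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def cluster_lidar_points(points_xyz, eps=0.7, min_samples=10):
        # points_xyz: (N, 3) array of LiDAR returns in meters.
        # Returns a list of (centroid, point_count) per detected object.
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xyz)
        objects = []
        for label in set(labels):
            if label == -1:  # -1 marks DBSCAN noise points
                continue
            cluster = points_xyz[labels == label]
            objects.append((cluster.mean(axis=0), len(cluster)))
        return objects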
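
The reported 183 ms camera-to-LiDAR switch implies a visibility-driven mode controller. The sketch below uses two thresholds (hysteresis) so the mode does not oscillate near a single cutoff; the hysteresis bands are my assumption, as the actual switching logic is not described on this page.

    from enum import Enum

    class NavMode(Enum):
        CAMERA = "camera"
        LIDAR = "lidar"

    class ModeSwitcher:
        # Switches to LiDAR below `low` and back to camera above `high`.
        # The band between the two thresholds prevents rapid toggling.
        def __init__(self, low=0.35, high=0.50):
            self.low, self.high = low, high
            self.mode = NavMode.CAMERA

        def update(self, visibility):
            if self.mode is NavMode.CAMERA and visibility < self.low:
                self.mode = NavMode.LIDAR   # visibility degraded
            elif self.mode is NavMode.LIDAR and visibility > self.high:
                self.mode = NavMode.CAMERA  # visibility recovered
            return self.mode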
