DOI: https://doi.org/10.15368/theses.2021.173
Available at: https://digitalcommons.calpoly.edu/theses/2443
Date of Award
12-2021
Degree Name
MS in Electrical Engineering
Department/Program
Electrical Engineering
College
College of Engineering
Advisor
Andrew Danowitz
Advisor Department
Electrical Engineering
Advisor College
College of Engineering
Abstract
The ability to accurately map and localize relevant objects surrounding a vehicle is an important task for autonomous vehicle systems. Currently, many environmental mapping approaches rely on expensive LiDAR sensors. Researchers have been attempting to transition to cheaper sensors such as cameras, but so far, the mapping accuracy of single-camera and dual-camera systems has not matched the accuracy of LiDAR systems. This thesis examines depth estimation algorithms and camera configurations of a triple-camera system to determine whether sensor data from an additional perspective improves the accuracy of camera-based systems. Using a synthetic dataset, the performance of a selection of stereo depth estimation algorithms is compared to the performance of two triple-camera depth estimation algorithms: disparity fusion and cost fusion. In both a multi-baseline and a multi-axis triple-camera configuration, the cost fusion algorithm outperformed the environmental mapping accuracy of non-CNN algorithms in a two-camera configuration.
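The stereo depth estimation the abstract refers to ultimately rests on triangulating depth from disparity. The sketch below illustrates the standard pinhole relation Z = f·B/d; the focal length and baseline values are illustrative assumptions, not parameters from the thesis.

```python
# Illustrative sketch of stereo disparity-to-depth conversion (Z = f * B / d).
# focal_px and baseline_m are assumed example values, not taken from the thesis.
def disparity_to_depth(disparity_px, focal_px=700.0, baseline_m=0.54):
    """Return depth in meters given a disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Smaller disparity corresponds to a farther object; a longer baseline
# increases disparity at a given depth, one motivation for the
# multi-baseline triple-camera configurations studied in the thesis.
print(disparity_to_depth(37.8))   # distant point
print(disparity_to_depth(378.0))  # nearby point
```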