Perception of vehicle and traffic dynamics using visual-inertial sensors for assistive driving
Abstract
This paper presents an approach to analyzing vehicle and traffic dynamics by fusing measurements from a monocular camera and inertial sensors. In contrast to traditional visual-inertial odometry for ground vehicles, the proposed method estimates both the dynamics of the vehicle and the dynamics of its surrounding environment. Visual features associated with the surrounding environment are identified using the vehicle's nonholonomic constraint and inertial measurements, while visual features associated with moving vehicles are segmented by a part-based vehicle detection model. The dynamics of the vehicle and of the scene are then computed from their respective feature sets. The proposed method is robust to highly dynamic environments, such as scenes with many moving vehicles in downtown areas during rush hour. In addition, it estimates the number of surrounding vehicles as well as the ratio of vehicle regions to the whole image area. Experiments were performed on a challenging dataset collected in downtown areas and on interstate highways during rush hour. The experimental results showed that the proposed method robustly and accurately analyzes the dynamics of the vehicle and its surrounding environment.
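The abstract does not give implementation details, so the sketch below is only a rough illustration of two of the steps it mentions: partitioning tracked image features into vehicle-associated and background sets using detector bounding boxes, and computing the number of detected vehicles and the ratio of vehicle regions to the whole image area. The function name, the box format, and the use of NumPy are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def split_features_and_scene_stats(feature_points, vehicle_boxes, image_shape):
    """
    Partition tracked feature points into vehicle-associated and background
    sets using detector bounding boxes, and compute two scene statistics
    mentioned in the abstract: the number of detected surrounding vehicles
    and the ratio of vehicle regions to the whole image area.

    feature_points : (N, 2) array of pixel coordinates (x, y)
    vehicle_boxes  : list of (x_min, y_min, x_max, y_max) boxes from a
                     vehicle detector (the paper uses a part-based model;
                     here the boxes are simply assumed to be given)
    image_shape    : (height, width) of the image
    """
    points = np.asarray(feature_points, dtype=float)
    on_vehicle = np.zeros(len(points), dtype=bool)

    # Mark every feature that falls inside any detected vehicle box.
    for x_min, y_min, x_max, y_max in vehicle_boxes:
        inside = (
            (points[:, 0] >= x_min) & (points[:, 0] <= x_max) &
            (points[:, 1] >= y_min) & (points[:, 1] <= y_max)
        )
        on_vehicle |= inside

    vehicle_features = points[on_vehicle]      # candidates for traffic dynamics
    background_features = points[~on_vehicle]  # candidates for ego-motion estimation

    # Rasterize the boxes so that overlapping detections are not
    # double-counted when computing the vehicle-area ratio.
    height, width = image_shape
    mask = np.zeros((height, width), dtype=bool)
    for x_min, y_min, x_max, y_max in vehicle_boxes:
        x0, y0 = max(int(x_min), 0), max(int(y_min), 0)
        x1, y1 = min(int(x_max), width), min(int(y_max), height)
        mask[y0:y1, x0:x1] = True

    vehicle_area_ratio = mask.sum() / float(height * width)
    num_vehicles = len(vehicle_boxes)

    return vehicle_features, background_features, num_vehicles, vehicle_area_ratio
```

In the full method, the background features would additionally be filtered with the nonholonomic constraint and inertial measurements before being used for motion estimation; that step is omitted here.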