Project Proposal

Graham Doig


A “typical” sedan was instrumented with pressure, inertial (IMU), video, and GPS sensors to fully characterize on-road conditions during extended, multi-hour tests at the Allan Hancock EVOC track. The static pressure data sampled over the trunk lid was processed along with all other pressure and IMU data gathered at the front and center of the test vehicle to build categorical and continuous models using techniques borrowed from computer science and machine learning. These techniques highlighted both expected and unexpected trends in the aerodynamic data and indicated that it is notionally possible to build a continuous model of rear-vehicle aerodynamic response to on-road conditions. Front-vehicle aerodynamic data proved to be the most important dataset in the categorical models (Bayesian Gaussian mixture model and random forest classifiers), predicting 74% of the variation in rear-vehicle aerodynamics, with only a modest improvement (2%) from the addition of IMU data, for a maximum prediction rate of 76%. When the models were trained only to discern the direction of a corner (IMU data indicated the occurrence of a cornering event), performance improved to 81%. Continuous models (multivariate linear regression) showed significantly greater predictive capability than the categorical models, with average R² values on the order of 0.95 (95% of the variance in rear-vehicle aerodynamics captured by the model). However, these models fall short in predicting asymmetric flow over the trunk lid (R² = 0.40 for this feature). Overall, categorical models predict a more complete breadth of the aerodynamic variation over the trunk lid but suffer from the generalized conclusions that result from data categorization. Continuous models numerically capture more of the variation in rear-vehicle aerodynamics but have a key blind spot relating to asymmetric flow patterns.
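The modeling pipeline described above can be sketched as follows. This is a minimal illustration with synthetic data, not the project's actual dataset or code: the channel counts, feature layout, and target definitions are all hypothetical stand-ins. It shows the two model families named in the abstract, a random forest classifier for a discretized rear-flow state scored by prediction rate, and a multivariate linear regression for a continuous rear-pressure response scored by R².

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression
from sklearn.metrics import accuracy_score, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000  # hypothetical number of sampled time steps

# Hypothetical sensor channels: 8 front-vehicle pressure taps, 3 IMU axes.
front_pressure = rng.normal(size=(n, 8))
imu = rng.normal(size=(n, 3))
X = np.hstack([front_pressure, imu])

# Categorical target: a discretized rear-flow state driven mostly by front
# pressures, mirroring the finding that front-vehicle data dominates.
state = (front_pressure[:, 0] + 0.2 * imu[:, 0] > 0).astype(int)

# Continuous target: mean trunk-lid pressure as a linear mix plus noise.
rear_pressure = front_pressure @ rng.normal(size=8) + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te, s_tr, s_te = train_test_split(
    X, rear_pressure, state, random_state=0
)

# Categorical model: random forest classifier, scored by prediction rate.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, s_tr)
acc = accuracy_score(s_te, clf.predict(X_te))

# Continuous model: multivariate linear regression, scored by R².
reg = LinearRegression().fit(X_tr, y_tr)
r2 = r2_score(y_te, reg.predict(X_te))

print(f"classifier prediction rate: {acc:.2f}")
print(f"regression R²: {r2:.2f}")
```

Because the synthetic continuous target is linear in the inputs, the regression R² here is near 1; on the real asymmetric-flow feature the abstract reports R² = 0.40, which such a sketch would not reproduce.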


Creative Commons Attribution 4.0 License
This work is licensed under a Creative Commons Attribution 4.0 License.