Date Thesis Awarded


Access Type

Honors Thesis -- Access Restricted On-Campus Only

Degree Name

Bachelor of Science (BS)


Department

Computer Science


Advisor

Evgenia Smirni

Committee Members

Chris Shenefiel

Eric Swartz


Abstract

Self-driving technology has advanced rapidly over the past decade, largely due to the development of deep neural networks (DNNs). When DNNs are deployed in a safety-critical application such as autonomous driving, even small deviations from the ideal output can have catastrophic consequences. Analyzing the greatest vulnerabilities of these networks is therefore necessary if the technology is to be widely deployed in safety-critical environments. In this thesis, we evaluate the effects of hardware faults on a DNN embedded in a fully autonomous self-driving system. We use simulations of this system to understand how a vehicle in the real world would be affected by faults, and we employ a strategic fault injection method to identify the most vulnerable parts of the DNN. Additionally, we show how protection techniques can be applied to improve a DNN’s resilience against data corruption.
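Hardware faults of the kind studied in this line of work are commonly modeled as single-bit flips in a network's stored parameters. As a minimal illustrative sketch (not the thesis's actual injection framework), the snippet below simulates flipping one bit in the IEEE-754 representation of a float32 weight; the function name and chosen bit positions are hypothetical. Flipping a high-order exponent bit can change a weight by many orders of magnitude, while a low-order mantissa flip is nearly harmless, which is why some parts of a DNN are far more vulnerable than others.

```python
import numpy as np

def flip_bit(value: np.float32, bit: int) -> np.float32:
    """Flip one bit (0 = least significant) in a float32's binary representation."""
    as_int = np.float32(value).view(np.uint32)      # reinterpret the float's bits
    flipped = as_int ^ np.uint32(1 << bit)          # XOR toggles the chosen bit
    return flipped.view(np.float32)                 # reinterpret back as float32

# Example: a weight of 0.5 corrupted at two different bit positions.
w = np.float32(0.5)
corrupted_exp = flip_bit(w, 30)   # high exponent bit -> huge deviation (~1.7e38)
corrupted_man = flip_bit(w, 0)    # low mantissa bit  -> ~0.50000006
```

Applying `flip_bit` to randomly (or strategically) selected weights of a trained model, then re-running inference, is one simple way to estimate which layers' corruption most degrades driving decisions.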
