Author ORCID Identifier

https://orcid.org/0009-0001-5545-2085

Date Available

7-24-2023

Year of Publication

2023

Degree Name

Doctor of Philosophy (PhD)

Document Type

Doctoral Dissertation

College

Arts and Sciences

Department/School/Program

Statistics

First Advisor

Dr. Arnold J. Stromberg

Second Advisor

Dr. Qiang Cheng

Abstract

Neural networks have seen widespread adoption and become integral to cutting-edge domains such as computer vision, natural language processing, and other contemporary fields. However, characterizing the statistical properties of neural networks has remained a persistent challenge, with few satisfactory results. In my research, I explored statistical intervals for neural networks, specifically confidence intervals and tolerance intervals. I employed variance estimation methods, including direct estimation and resampling, to assess neural networks and their performance in the presence of outliers. Remarkably, when outliers were present, the resampling method with infinitesimal jackknife estimation yielded confidence intervals whose coverage closely aligned with the nominal level. Treating neural networks as nonparametric regression models, I constructed tolerance intervals and observed that their coverage approached the nominal level. Additionally, I conducted a comparative study of neural networks and generalized linear models. The results indicated that neural networks did not outperform linear models in low-dimensional settings; however, in high-dimensional settings or multitask classification, neural networks performed significantly better. Lastly, I proposed further research on advanced neural network techniques and on the statistical attributes of other deep learning methods. These future studies have the potential to expand our understanding of neural networks and to improve their statistical properties.
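
To make the resampling idea concrete, the following is a minimal sketch of a resampling-based confidence interval for a neural network's prediction at a single query point. It uses a simple percentile bootstrap as a stand-in for the dissertation's infinitesimal jackknife procedure (which is not reproduced here), and the data, model size, and interval level are hypothetical choices for illustration only.

# Minimal sketch: percentile-bootstrap confidence interval for a neural
# network prediction. A simplified stand-in for the resampling-based
# variance estimation discussed in the abstract; the infinitesimal
# jackknife estimator from the dissertation is not implemented here.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic regression data (hypothetical; for illustration only).
n = 200
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=n)

x_query = np.array([[1.0]])   # point at which we want an interval
B = 100                       # number of bootstrap resamples
preds = np.empty(B)

for b in range(B):
    idx = rng.integers(0, n, size=n)          # resample rows with replacement
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                       random_state=b).fit(X[idx], y[idx])
    preds[b] = net.predict(x_query)[0]

lo, hi = np.percentile(preds, [2.5, 97.5])    # 95% percentile interval
print(f"95% bootstrap CI at x = 1.0: ({lo:.3f}, {hi:.3f})")

A tolerance interval, by contrast, aims to cover a specified proportion of the response distribution rather than the mean prediction, which is why the abstract evaluates its coverage separately when the regression function is a neural network.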

Digital Object Identifier (DOI)

https://doi.org/10.13023/etd.2023.301
