PhD in Mathematics - Testing and Scoring High-dimensional Covariance Matrices under the Elliptical Distribution and beyond
3:30pm - 5:30pm
Room 2612B (near lifts 31 & 32)
This thesis studies testing and scoring high-dimensional covariance matrices when the data exhibit heteroskedasticity. The observations are modeled as $\mathbf{y}_i = \omega_i \mathbf{z}_i$, where the $\mathbf{z}_i$'s are i.i.d. $p$-dimensional random vectors with mean zero and covariance matrix $\boldsymbol{\Sigma}$, and the $\omega_i$'s are random scalars reflecting heteroskedasticity. The model extends the elliptical distribution and accommodates several stylized facts of real data, including heteroskedasticity, heavy-tailedness, and asymmetry.

Firstly, we aim to test the sphericity hypothesis $H_0\colon \boldsymbol{\Sigma} \propto \mathbf{I}_p$ in the high-dimensional setting where both the dimension $p$ and the sample size $n$ grow to infinity proportionally. We remove the heteroskedasticity by self-normalizing the observations, and establish a CLT for the linear spectral statistic (LSS) of the sample covariance matrix built from the self-normalized observations $\mathbf{y}_i/\|\mathbf{y}_i\|$. The CLT is different from the existing ones for the LSS of the usual sample covariance matrix (Bai and Silverstein (2004); Najim and Yao (2016)). Our tests based on the new CLT neither assume a specific parametric distribution nor involve the fourth moment of $\mathbf{z}_i$. Numerical studies show that our tests work well even when the $\omega_i$'s are heavy-tailed.

Secondly, to evaluate the performance of different covariance matrix predictors, we propose two scoring methods by modifying the entropy and quadratic loss functions for the heteroskedastic setting. Empirically, we use our proposed scores to evaluate different covariance matrix predictors for the returns of 76 stocks in the S&P 500 financials sector. The results show that: 1) self-normalizing the observations can improve the prediction; 2) compared with sparsity, an approximate factor model is more suitable for stock returns. These results can be used to build better minimum-variance portfolios.
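To make the self-normalization step concrete, here is a minimal sketch (not the thesis code) of forming the sample covariance matrix of the self-normalized observations and evaluating an LSS on it. The function names, the scaling factor $p/n$, and the choice of test function $f$ are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def self_normalized_covariance(Y):
    """Sample covariance of the self-normalized rows y_i / ||y_i||.

    Y : (n, p) array whose rows are the observations y_i = omega_i * z_i.
    Self-normalizing removes the random scalars omega_i; the factor p
    keeps the eigenvalues on the same scale as the usual sample
    covariance matrix when Sigma is proportional to the identity.
    """
    n, p = Y.shape
    U = Y / np.linalg.norm(Y, axis=1, keepdims=True)
    return (p / n) * U.T @ U

def linear_spectral_statistic(S, f):
    """LSS: the sum of f over the eigenvalues of S."""
    return np.sum(f(np.linalg.eigvalsh(S)))

# Heavy-tailed omega_i's do not affect the statistic, because each row
# is rescaled to unit length before the covariance matrix is formed.
rng = np.random.default_rng(0)
n, p = 200, 100
Z = rng.standard_normal((n, p))      # Sigma = I_p, i.e. under the null
omega = rng.standard_cauchy(n)       # heavy-tailed heteroskedasticity
Y = omega[:, None] * Z
S = self_normalized_covariance(Y)
print(linear_spectral_statistic(S, np.log1p))  # e.g. f(x) = log(1 + x)
```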
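For the scoring part, the sketch below shows the classical entropy (Stein) and quadratic loss functions that the thesis modifies for the heteroskedastic setting; the modified scores themselves are in the thesis and are not reproduced here. The global minimum-variance weight formula at the end is the standard one, included only to illustrate the portfolio application.

```python
import numpy as np

def entropy_loss(Sigma, Sigma_hat):
    """Stein's entropy loss: tr(A) - log det(A) - p, A = Sigma_hat^{-1} Sigma."""
    p = Sigma.shape[0]
    A = np.linalg.solve(Sigma_hat, Sigma)
    _, logdet = np.linalg.slogdet(A)
    return np.trace(A) - logdet - p

def quadratic_loss(Sigma, Sigma_hat):
    """Quadratic loss: tr[(Sigma_hat^{-1} Sigma - I)^2]."""
    p = Sigma.shape[0]
    A = np.linalg.solve(Sigma_hat, Sigma) - np.eye(p)
    return np.trace(A @ A)

def min_variance_weights(Sigma_hat):
    """Global minimum-variance portfolio: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)."""
    w = np.linalg.solve(Sigma_hat, np.ones(Sigma_hat.shape[0]))
    return w / w.sum()
```

Both losses are zero exactly when the predictor equals the reference covariance matrix, which is what makes them natural starting points for scoring competing predictors.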
Event Format
Thesis Defense
Candidate
YANG Xinxin
Language
English