DIMENSION REDUCTION USING INVERSE SPLINE REGRESSION
dc.contributor.advisor | Smith, Paul J. | en_US |
dc.contributor.advisor | Dolgopyat, Dmitry | en_US |
dc.contributor.author | Nam, Kijoeng | en_US |
dc.contributor.department | Mathematical Statistics | en_US |
dc.contributor.publisher | Digital Repository at the University of Maryland | en_US |
dc.contributor.publisher | University of Maryland (College Park, Md.) | en_US |
dc.date.accessioned | 2014-10-11T05:50:17Z | |
dc.date.available | 2014-10-11T05:50:17Z | |
dc.date.issued | 2014 | en_US |
dc.description.abstract | In high-dimensional data analysis, we often want to reduce the number of predictors without eliminating variables which are related to the response of interest. Inverse regression methods use the response variable when performing dimension reduction so that information regarding the relation between the covariates and the response is not lost. However, it is common to assume that the inverse regression function is linear or to use some other ad hoc approach. Instead, we propose a new dimension reduction method which models the inverse regression function as a spline. We develop asymptotics for our approach and demonstrate its performance through simulations and several data sets commonly found in the machine learning literature. We show that its performance is better than existing inverse regression based methods, especially when the dimension reduction space is a nonlinear manifold such as the Swiss roll example of Roweis and Saul (2000). | en_US |
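The abstract's core idea — estimating the inverse regression function E[X | y] with a spline, then extracting dimension-reduction directions from its covariance — can be sketched as below. This is an illustrative reading only, not the dissertation's actual estimator: the truncated-power basis, knot placement, and generalized-eigenvector step are assumptions in the spirit of sliced/parametric inverse regression.

```python
import numpy as np

def spline_basis(y, knots):
    """Truncated-power cubic spline basis in y: 1, y, y^2, y^3, (y - k)_+^3."""
    cols = [np.ones_like(y), y, y ** 2, y ** 3]
    cols += [np.clip(y - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

def inverse_spline_directions(X, y, knots, d):
    """Estimate d dimension-reduction directions by regressing each
    covariate on a spline basis of the response (inverse regression)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n                 # sample covariance of the predictors
    B = spline_basis(y, knots)
    # Least-squares fit of every centered covariate on the spline basis of y:
    # the fitted values are a nonparametric estimate of E[X | y].
    coef, *_ = np.linalg.lstsq(B, Xc, rcond=None)
    fitted = B @ coef                     # n x p estimate of E[X | y]
    M = fitted.T @ fitted / n             # covariance of the inverse regression curve
    # Directions solve the generalized eigenproblem M v = lambda * Sigma v;
    # whiten by Sigma^{-1/2} so a symmetric eigendecomposition applies.
    w, V = np.linalg.eigh(Sigma)
    S_inv_half = V @ np.diag(w ** -0.5) @ V.T
    evals, U = np.linalg.eigh(S_inv_half @ M @ S_inv_half)
    top = np.argsort(evals)[::-1][:d]
    return S_inv_half @ U[:, top]         # columns span the estimated subspace
```

On a simple single-index model (y depending on X only through one direction beta), the first estimated column should align closely with beta; the spline basis lets E[X | y] bend, which is what distinguishes this from assuming a linear inverse regression function.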
dc.identifier | https://doi.org/10.13016/M2F88G | |
dc.identifier.uri | http://hdl.handle.net/1903/15776 | |
dc.language.iso | en | en_US |
dc.subject.pqcontrolled | Statistics | en_US |
dc.subject.pquncontrolled | Asymptotics | en_US |
dc.subject.pquncontrolled | High-dimensional data | en_US |
dc.subject.pquncontrolled | Inverse regression methods | en_US |
dc.title | DIMENSION REDUCTION USING INVERSE SPLINE REGRESSION | en_US |
dc.type | Dissertation | en_US |