Asymptotic and finite-sample properties of estimators based on stochastic gradients
Genre
Journal Article
Date
2017-08-01
Author
Toulis, P; Airoldi, EM
Subject
Stochastic approximation
implicit updates
asymptotic variance
generalized linear models
Cox proportional hazards
M-estimation
maximum likelihood
exponential family
statistical efficiency
numerical stability
Permanent link to this record
http://hdl.handle.net/20.500.12613/5691
Metadata
DOI
10.1214/16-AOS1506
Abstract
© 2017 Institute of Mathematical Statistics. Stochastic gradient descent procedures have gained popularity for parameter estimation from large data sets. However, their statistical properties are not well understood in theory, and in practice avoiding numerical instability requires careful tuning of key parameters. Here, we introduce implicit stochastic gradient descent procedures, which involve parameter updates that are implicitly defined. Intuitively, implicit updates shrink standard stochastic gradient descent updates. The amount of shrinkage depends on the observed Fisher information matrix, which does not need to be explicitly computed; thus, implicit procedures increase stability without increasing the computational burden. Our theoretical analysis provides the first full characterization of the asymptotic behavior of both standard and implicit stochastic gradient descent-based estimators, including finite-sample error bounds. Importantly, analytical expressions for the variances of these stochastic gradient-based estimators reveal their exact loss of efficiency. We also develop new algorithms to compute implicit stochastic gradient descent-based estimators in practice for generalized linear models, Cox proportional hazards models, and M-estimation problems, and we perform extensive experiments. Our results suggest that implicit stochastic gradient descent procedures are poised to become a workhorse for approximate inference from large data sets.
Citation to related work
Institute of Mathematical Statistics
Has part
Annals of Statistics
ADA compliance
For Americans with Disabilities Act (ADA) accommodation, including help with reading this content, please contact scholarshare@temple.edu
http://dx.doi.org/10.34944/dspace/5673
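As a minimal sketch of the implicit update described in the abstract: for least-squares regression the implicitly defined update can be solved in closed form, making the shrinkage relative to standard (explicit) SGD visible directly. This is an illustrative example only; the function name, learning-rate schedule, and simulated data below are not from the paper.

```python
import numpy as np

def implicit_sgd_least_squares(X, y, lr0=5.0):
    """Implicit SGD for least-squares regression.

    Explicit SGD:  theta_n = theta_{n-1} + a_n * (y_n - x_n @ theta_{n-1}) * x_n
    Implicit SGD:  theta_n = theta_{n-1} + a_n * (y_n - x_n @ theta_n)     * x_n
    For squared loss the implicit equation has a closed-form solution:
    the explicit step is shrunk by the factor 1 / (1 + a_n * ||x_n||^2).
    """
    n, p = X.shape
    theta = np.zeros(p)
    for i in range(n):
        a = lr0 / (i + 2.0)                 # illustrative decaying rate a_n
        x = X[i]
        resid = y[i] - x @ theta            # residual at the current iterate
        shrink = 1.0 / (1.0 + a * (x @ x))  # implicit shrinkage factor
        theta = theta + a * shrink * resid * x
    return theta

# Simulated data (assumed setup, for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + 0.1 * rng.normal(size=5000)
theta_hat = implicit_sgd_least_squares(X, y)
print(np.round(theta_hat, 2))
```

Note that the shrinkage factor depends only on `x @ x` (a scalar), so stability is gained without forming or inverting any matrix, which is the computational point the abstract emphasizes.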