Abstract

Adaptive collection of data is increasingly commonplace in many applications. From the point of view of statistical inference, however, adaptive collection induces memory and correlation in the samples and poses a significant challenge. We consider high-dimensional linear regression, where the samples are collected adaptively and the sample size $n$ can be smaller than $p$, the number of covariates. In this setting, there are two distinct sources of bias: the first due to the regularization imposed for estimation, e.g. using the LASSO, and the second due to the adaptivity in collecting the samples. We propose `online debiasing', a general procedure for estimators such as the LASSO, which addresses both sources of bias. In two concrete contexts, $(i)$ high-dimensional time series analysis and $(ii)$ batched data collection, we demonstrate that online debiasing optimally debiases the LASSO estimate when the underlying parameter has sparsity smaller than $\sqrt{n}/\log p$. In this regime, the debiased estimate can be used to compute p-values and confidence intervals of optimal size. (This is based on joint work with Yash Deshpande and Mohammad Mehrabi)
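To fix ideas, the sketch below illustrates the first source of bias and the classical (offline, i.i.d.) debiased-LASSO correction on a toy design with identity covariance; it is not the online debiasing procedure of the abstract, and all parameter values (sample size, sparsity, noise level, regularization) are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 200, 50, 3  # toy sizes; the abstract's regime allows p > n

# i.i.d. Gaussian design (identity covariance) and a sparse parameter
X = rng.standard_normal((n, p))
theta = np.zeros(p)
theta[:s] = 1.0
y = X @ theta + 0.5 * rng.standard_normal(n)

# LASSO estimate: regularization shrinks the nonzero coordinates (bias #1)
lam = 0.1
theta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

# Offline one-step debiasing: add back a correction built from the
# residuals. M approximates the precision matrix; here the population
# covariance is the identity, so M = I suffices for illustration.
M = np.eye(p)
theta_debiased = theta_hat + M @ X.T @ (y - X @ theta_hat) / n
```

With adaptively collected samples the rows of $X$ are no longer independent, which is exactly why this offline correction breaks down and an online construction of the correction term is needed.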