We consider the computational problem of fitting L1-penalized least squares and log-likelihood models for large data sets. In the least squares case, we provide a practical algorithm to approximately fit the LASSO model. The computation error is shown to be of order O(1/k) after k iterative steps have been carried out. Each step of our algorithm can be expressed explicitly in closed form and is computationally inexpensive. Our algorithm has a time cost of O(pnk) and requires only storage of order max(p, n) for data with n observations and p predictors. We further extend the algorithm to L1-regularized log-likelihood, where similar results on the computation error hold. This provides, for example, a fast algorithm for solving penalized logistic regression and Gaussian graphical models. The proofs in both cases extend results on relaxed greedy algorithms from the approximation theory literature, with relaxation and sampling arguments serving as the key tools for establishing the rate of decay of the error. Simulation and real-data studies show competitive performance of our algorithms compared with existing ones. This is joint work with Andrew Barron.
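To make the closed-form, O(pn)-per-step structure concrete, here is a minimal sketch of a relaxed greedy update for the L1-constrained least squares problem, in the spirit of the algorithm described above. It is not the authors' exact method: the constrained formulation (||beta||_1 <= t), the step size 2/(k+2), and the function name frank_wolfe_lasso are assumptions drawn from the standard relaxed greedy / Frank-Wolfe analysis, which yields the same O(1/k) objective gap after k steps.

```python
import numpy as np

def frank_wolfe_lasso(X, y, t, k_steps):
    """Relaxed greedy (Frank-Wolfe) sketch for
        min_beta ||y - X beta||^2  s.t.  ||beta||_1 <= t.
    Each step touches one column of X, so the cost per step is O(pn)
    (O(p) for the gradient scan plus O(n) for the residual update),
    giving O(pnk) total for k steps, matching the rate quoted above.
    Hypothetical illustration, not the paper's exact algorithm.
    """
    n, p = X.shape
    beta = np.zeros(p)
    residual = y.copy()  # r = y - X beta, maintained incrementally

    for k in range(k_steps):
        grad = -2.0 * X.T @ residual        # gradient of the squared loss
        j = np.argmax(np.abs(grad))         # most correlated predictor
        step = -t * np.sign(grad[j])        # minimizing vertex of the L1 ball
        gamma = 2.0 / (k + 2.0)             # relaxation (convex mixing) weight

        # closed-form convex-combination update: beta <- (1-gamma) beta + gamma * v
        beta *= (1.0 - gamma)
        beta[j] += gamma * step
        # residual updates in closed form as well:
        # r = y - X beta = (1-gamma) r + gamma (y - step * X[:, j])
        residual = (1.0 - gamma) * residual + gamma * (y - step * X[:, j])

    return beta

# Example usage on synthetic data (illustrative only):
# rng = np.random.default_rng(0)
# X = rng.standard_normal((200, 50))
# y = X[:, :3] @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(200)
# beta_hat = frank_wolfe_lasso(X, y, t=5.0, k_steps=100)
```

Note that only the data (n x p) and two length-n / length-p work vectors are held in memory, consistent with the max(p, n) storage claim; the extension to penalized log-likelihoods would replace the squared-loss gradient with the gradient of the negative log-likelihood.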