This paper extends score-guided selection to general sparse regression frameworks, including LASSO, Elastic Net, Group LASSO, and adaptive variants. The proposed method, Score-Guided Sparse Regression (SGSR), integrates score-based dictionary screening with proximal gradient descent, improving feature selection accuracy while remaining computationally efficient. Theoretical analysis establishes oracle properties, convergence guarantees, and conditions under which score guidance improves model selection consistency. Experiments on synthetic and real-world datasets show higher feature selection accuracy, lower estimation error, and reduced runtime relative to baseline methods.
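The paper's own algorithm is not reproduced here, but the two ingredients named in the abstract are standard. Below is a minimal sketch of their combination, assuming the score is the absolute correlation |X^T y| (a stand-in for the paper's projection-based score) and using plain ISTA as the proximal gradient solver; the function name `sgsr_lasso` and the `keep_frac` parameter are illustrative, not the paper's notation.

```python
import numpy as np

def soft_threshold(z, tau):
    """Elementwise soft-thresholding: the proximal operator of tau*||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def sgsr_lasso(X, y, lam, keep_frac=0.5, n_iter=500):
    """Score-guided LASSO sketch: screen columns by a score, then run
    proximal gradient descent (ISTA) on the retained sub-dictionary."""
    n, p = X.shape
    # Screening step: rank features by |X^T y| and keep the top fraction.
    # (The paper's projection-based score would replace this line.)
    scores = np.abs(X.T @ y)
    keep = np.argsort(scores)[::-1][: max(1, int(keep_frac * p))]
    Xs = X[:, keep]
    # Step size 1/L with L = ||Xs||_2^2, the Lipschitz constant of the
    # gradient of the least-squares term.
    L = np.linalg.norm(Xs, ord=2) ** 2
    beta_s = np.zeros(keep.size)
    for _ in range(n_iter):
        grad = Xs.T @ (Xs @ beta_s - y)  # gradient of 0.5*||y - Xs b||^2
        beta_s = soft_threshold(beta_s - grad / L, lam / L)
    # Map back to the full dictionary; screened-out coefficients stay zero.
    beta = np.zeros(p)
    beta[keep] = beta_s
    return beta
```

Mapping the screened coefficients back to the full dictionary keeps downstream evaluation aligned with the original feature indexing, which is what makes screening a drop-in speedup rather than a change to the model.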
Key findings
Proposes a comprehensive extension of score-guided selection to general sparse regression frameworks.
Develops a unified theoretical framework connecting projection-based scores to the optimization landscapes of convex regularized regression problems.
Introduces Score-Guided Sparse Regression (SGSR) that integrates score-based dictionary screening with proximal gradient descent.
Demonstrates the application of score guidance to LASSO, Elastic Net, Group LASSO, and Adaptive LASSO; in the proximal gradient view these differ only in the penalty's proximal operator (see the sketch after this list).
Provides theoretical guarantees and extensive experimental validation showing significant improvements over baseline methods.
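Because all four variants fit the same proximal gradient template, swapping penalties amounts to swapping proximal operators. The closed forms below are the standard ones, not code from the paper; the `groups` index arrays and adaptive weights `w` are illustrative inputs.

```python
import numpy as np

def prox_elastic_net(z, t, lam1, lam2):
    """Prox of t*(lam1*||b||_1 + 0.5*lam2*||b||_2^2): shrink, then rescale."""
    return np.sign(z) * np.maximum(np.abs(z) - t * lam1, 0.0) / (1.0 + t * lam2)

def prox_group_lasso(z, t, lam, groups):
    """Prox of t*lam*sum_g ||b_g||_2: blockwise soft-thresholding."""
    out = np.zeros_like(z)
    for g in groups:  # each g: index array of one group
        norm_g = np.linalg.norm(z[g])
        if norm_g > t * lam:
            out[g] = z[g] * (1.0 - t * lam / norm_g)
    return out

def prox_adaptive_lasso(z, t, lam, w):
    """Prox of t*lam*sum_j w_j*|b_j|, with per-feature weights w
    (classically w_j = 1/|beta_init_j|^gamma from a pilot estimate)."""
    return np.sign(z) * np.maximum(np.abs(z) - t * lam * w, 0.0)
```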
Limitations & open questions
The method is not evaluated on non-convex sparse regression problems (e.g., SCAD- or MCP-penalized regression).
The theoretical analysis assumes a convex regularized objective, so the oracle and convergence guarantees do not directly carry over to non-convex penalties.