I will discuss an optimization framework for solving problems with group sparsity-inducing regularization. Such regularizers include the Lasso (L1), the group Lasso, and the latent group Lasso. The framework computes iterates by optimizing over low-dimensional subspaces, keeping the cost per iteration relatively low. Theoretical convergence results and numerical tests on various learning problems will be presented.
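
For orientation, a minimal sketch of the problem class (the notation below is my own and not taken from the abstract): group-sparse regularized problems are typically written as

  \min_{x \in \mathbb{R}^n} \; f(x) + \lambda \sum_{g \in \mathcal{G}} w_g \, \| x_g \|_2,

where f is a smooth loss, \mathcal{G} is a collection of groups of coordinates, x_g is the subvector of x indexed by group g, and w_g > 0 are group weights. The Lasso is the special case in which every group is a singleton (the penalty reduces to \lambda \| x \|_1), the group Lasso corresponds to disjoint groups, and the latent group Lasso handles overlapping groups by penalizing a decomposition of x into group-supported components. Because the penalty drives all but a few groups to zero, iterates can plausibly be confined to the low-dimensional subspace spanned by the currently active groups, which is the kind of structure the framework exploits to keep per-iteration cost low.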