[1] Quattoni A,Collins M,Darrell T.Transfer learning for image classification with sparse prototype representations[A].Proceedings of IEEE Conference on Computer Vision and Pattern Recognition[C].Alaska,USA:IEEE,2008.1-8.
[2] Schmidt M W,Murphy K P,Fung G,Rosales R.Structure learning in random fields for heart motion abnormality detection[A].Proceedings of IEEE Conference on Computer Vision and Pattern Recognition[C].Alaska,USA:IEEE,2008.1-8.
[3] Quattoni A,Carreras X,Collins M,Darrell T.An efficient projection for L1,∞ regularization[A].Proceedings of the 26th Annual International Conference on Machine Learning[C].Montreal,Canada:ACM,2009.857-864.
[4] Vogt J E,Roth V.The group-Lasso:l1,∞ regularization versus l1,2 regularization[A].Proceedings of the 32nd DAGM Conference on Pattern Recognition[C].Darmstadt,Germany:Springer-Verlag,2010.252-261.
[5] Huang J,Zhang T.The benefit of group sparsity[J].The Annals of Statistics,2010,38(4):1978-2004.
[6] Sra S.Fast projections onto l1,q-norm balls for grouped feature selection[J].Lecture Notes in Computer Science,2011:305-317.
[7] Kowalski M.Sparse regression using mixed norms[J].Applied and Computational Harmonic Analysis,2009,27(3):303-324.
[8] Zou H.The adaptive lasso and its oracle properties[J].Journal of the American Statistical Association,2006,101(476):1418-1429.
[9] Zhang H H,Lu W.Adaptive lasso for Cox's proportional hazards model[J].Biometrika,2007,94(3):691-703.
[10] Huang J,Ma S,Zhang C H.Adaptive lasso for sparse high-dimensional regression models[J].Statistica Sinica,2008,18(4):1603-1618.
[11] Simon N,et al.A sparse-group lasso[J].Journal of Computational and Graphical Statistics,2013,22(2):231-245.
[12] Chatterjee S,Steinhaeuser K,Banerjee A,et al.Sparse group lasso:consistency and climate applications[A].Proceedings of the 12th SIAM International Conference on Data Mining[C].California,USA:Omnipress,2012.47-58.
[13] Zhu X,Huang Z,et al.Video-to-shot tag allocation by weighted sparse group lasso[A].Proceedings of the 19th ACM International Conference on Multimedia[C].Scottsdale,USA:ACM,2011.1501-1504.
[14] Raman S,Fuchs T J,Wild P J,Dahl E,Roth V.The Bayesian group-Lasso for analyzing contingency tables[A].Proceedings of the 26th Annual International Conference on Machine Learning[C].Montreal,Canada:ACM,2009.881-888.
[15] Chandran M.Analysis of Bayesian group-Lasso in regression models[D].Florida,USA:University of Florida,2011.
[16] Meier L,Van De Geer S,Bühlmann P.The group lasso for logistic regression[J].Journal of the Royal Statistical Society:Series B,2008,70(1):53-71.
[17] Wang S,et al.Hierarchically penalized Cox regression with grouped variables[J].Biometrika,2009,96(2):307-322.
[18] Ma S,Huang J,Song X.Integrative analysis and variable selection with multiple high-dimensional data sets[J].Biostatistics,2011,12(4):763-775.
[19] Ma S,Dai Y,Huang J,et al.Identification of breast cancer prognosis markers via integrative analysis[J].Computational Statistics & Data Analysis,2012,56(9):2718-2728.
[20] Seetharaman I.Consistent bi-level variable selection via composite group bridge penalized regression[D].Kansas,USA:Kansas State University,2013.
[21] Fu W J.Penalized regressions:the bridge versus the Lasso[J].Journal of Computational and Graphical Statistics,1998,7(3):397-416.
[22] Jenatton R,Audibert J Y,Bach F.Structured variable selection with sparsity-inducing norms[J].The Journal of Machine Learning Research,2011,12:2777-2824.
[23] Mosci S,et al.A primal-dual algorithm for group sparse regularization with overlapping groups[A].Proceedings of Advances in Neural Information Processing Systems 23:24th Annual Conference on Neural Information Processing Systems[C].Vancouver,Canada:Curran Associates,2010.2604-2612.
[24] Percival D.Theoretical properties of the overlapping groups lasso[J].Electronic Journal of Statistics,2012,6:269-288.
[25] Yuan L,Liu J,Ye J.Efficient methods for overlapping group lasso[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2013,35(9):2104-2116.
[26] Percival D.Theoretical properties of the overlapping groups lasso[J].Electronic Journal of Statistics,2012,6:269-288.
[27] Jenatton R,Mairal J,Obozinski G,Bach F.Proximal methods for hierarchical sparse coding[J].Journal of Machine Learning Research,2011,12:2297-2334.
[28] Liu J,Ye J P.Moreau-Yosida regularization for grouped tree structure learning[A].Proceedings of Advances in Neural Information Processing Systems 23:24th Annual Conference on Neural Information Processing Systems[C].Vancouver,Canada:Curran Associates,2010.1459-1467.
[29] Martins A F T,Smith N A,et al.Structured sparsity in structured prediction[A].Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing[C].Edinburgh,UK:ACL,2011.1500-1511.
[30] Zhao P,Rocha G,Yu B.Grouped and hierarchical model selection through composite absolute penalties[J].The Annals of Statistics,2009,37(6A):3468-3497.
[31] Kim S,Xing E P.Tree-guided group Lasso for multi-task regression with structured sparsity[A].Proceedings of the 27th International Conference on Machine Learning[C].Haifa,Israel:Omnipress,2010.543-550.
[32] Kim S,Xing E P.Tree-guided group Lasso for multi-response regression with structured sparsity with an application to eQTL mapping[J].The Annals of Applied Statistics,2012,6(3):1095-1117.
[33] Zhao P,Yu B.On model selection consistency of lasso[J].The Journal of Machine Learning Research,2006,7:2541-2563.
[34] Bach F R.Consistency of the group lasso and multiple kernel learning[J].The Journal of Machine Learning Research,2008,9:1179-1225.
[35] Zhang C H,Huang J.The sparsity and bias of the lasso selection in high-dimensional linear regression[J].The Annals of Statistics,2008,36(4):1567-1594.
[36] Wei F,Huang J.Consistent group selection in high-dimensional linear regression[J].Bernoulli,2010,16(4):1369-1384.
[37] Bickel P J,Ritov Y,Tsybakov A B.Simultaneous analysis of lasso and Dantzig selector[J].The Annals of Statistics,2009,37(4):1705-1732.
[38] Lounici K,Pontil M,Van de Geer S,Tsybakov A B.Oracle inequalities and optimal inference under group sparsity[J].The Annals of Statistics,2011,39(4):2164-2204.
[39] Zou H,Li R.One-step sparse estimates in nonconcave penalized likelihood models[J].The Annals of Statistics,2008,36(4):1509-1533.
[40] Yang Y,Zou H.A fast unified algorithm for solving group-lasso penalized learning problems[J].Journal of Computational and Graphical Statistics,2012:328-361.
[41] Qin Z,Scheinberg K,Goldfarb D.Efficient block-coordinate descent algorithms for the group lasso[J].Mathematical Programming Computation,2013:143-169.
Roth V,Fischer B.The group-lasso for generalized linear models:uniqueness of solutions and efficient algorithms[A].Proceedings of The 25th International Conference on Machine learning[C].Helsinki,Finland:ACM,2008.848-855.
[44] Boyd S,Parikh N,Chu E,et al.Distributed optimization and statistical learning via the alternating direction method of multipliers[J].Foundations and Trends in Machine Learning,2011,3(1):1-122.
[45] Efron B,Hastie T,Johnstone I,et al.Least angle regression[J].The Annals of Statistics,2004,32(2):407-499.
[46] Meinshausen N,et al.Stability selection[J].Journal of the Royal Statistical Society:Series B,2010,72(4):417-473.
[47] Choon C L.Minimax concave bridge penalty function for variable selection[D].Singapore:National University of Singapore,2012.
[48] Mazumder R,Friedman J H,Hastie T.SparseNet:coordinate descent with nonconvex penalties[J].Journal of the American Statistical Association,2011,106(495):1125-1138.
[49] Kwon S,Kim Y,Choi H.Sparse bridge estimation with a diverging number of parameters[J].Statistics and Its Interface,2012,6:231-242.
[50] Candes E J,Wakin M B,Boyd S P.Enhancing sparsity by reweighted L1 minimization[J].Journal of Fourier Analysis and Applications,2008,14(5-6):877-905.
[51] Van de Geer S,Bühlmann P.On the conditions used to prove oracle results for the lasso[J].Electronic Journal of Statistics,2009,3:1360-1392.
[52] Zhang T.Some sharp performance bounds for least squares regression with L1 regularization[J].The Annals of Statistics,2009,37(5A):2109-2144.
[53] Ye F,Zhang C H.Rate minimaxity of the lasso and Dantzig selector for the Lq loss in Lr balls[J].The Journal of Machine Learning Research,2010,11:3519-3540.
[54] Tibshirani R.Regression shrinkage and selection via the lasso[J].Journal of the Royal Statistical Society:Series B,1996,58(1):267-288.
[55] Yuan M,Lin Y.Model selection and estimation in regression with grouped variables[J].Journal of the Royal Statistical Society:Series B,2006,68(1):49-67.
[56] Turlach B A,Venables W N,Wright S J.Simultaneous variable selection[J].Technometrics,2005,47(3):349-363.
[57] Tropp J A.Algorithms for simultaneous sparse approximation[J].Signal Processing,2006,86(3):589-602.
[58] Rakotomamonjy A,et al.Lp-Lq penalty for sparse linear and sparse multiple kernel multi-task learning[J].IEEE Transactions on Neural Networks,2011,22(8):1307-1320.
[59] Simon N,Tibshirani R.Standardization and the group lasso penalty[J].Statistica Sinica,2012,22(3):983-1001.
[60] Bunea F,Lederer J,She Y.The group square-root lasso:theoretical properties and fast algorithms[J].IEEE Transactions on Information Theory,2014,60(2):1313-1325.
[61] Belloni A,Chernozhukov V,Wang L.Square-root lasso:pivotal recovery of sparse signals via conic programming[J].Biometrika,2011,98(4):791-806.
[62] Wang H,Leng C.A note on adaptive group lasso[J].Computational Statistics & Data Analysis,2008,52(12):5277-5286.
[63] Wei F,Huang J.Consistent group selection in high-dimensional linear regression[J].Bernoulli,2010,16(4):1369-1384.
[64] Kim Y,Kim J,et al.Blockwise sparse regression[J].Statistica Sinica,2006,16(2):375-390.
[65] Sun H,Wang S.Penalized logistic regression for high-dimensional DNA methylation data with case-control studies[J].Bioinformatics,2012,28(10):1368-1375.
[66] Wu F,Yuan Y,Zhuang Y.Heterogeneous feature selection by group lasso with logistic regression[A].Proceedings of the International Conference on Multimedia[C].Firenze,Italy:ACM,2010.983-986.
[67] Liu X,Wang Z,Wu Y.Group variable selection and estimation in the tobit censored response model[J].Computational Statistics & Data Analysis,2013,60:80-89.
[68] Ji Y,Lin N,Zhang B.Model selection in binary and tobit quantile regression using the Gibbs sampler[J].Computational Statistics & Data Analysis,2012,56(4):827-839.
[69] Yin J,Chen X,Xing E P.Group sparse additive models[A].Proceedings of the 29th International Conference on Machine Learning[C].Scotland,UK:Omnipress,2012.871-878.
[70] Jiang D F.Concave selection in generalized linear models[D].Iowa,USA:University of Iowa,2012.
[71] Wang L,Chen G,Li H.Group SCAD regression analysis for microarray time course gene expression data[J].Bioinformatics,2007,23(12):1486-1494.
[72] Fan J,Li R.Variable selection via nonconcave penalized likelihood and its oracle properties[J].Journal of the American Statistical Association,2001,96(456):1348-1360.
[73] Breheny P,Huang J.Penalized methods for bi-level variable selection[J].Statistics and Its Interface,2009,2(3):369-380.
[74] Huang J,Breheny P,Ma S.A selective review of group selection in high-dimensional models[J].Statistical Science,2012,27(4):481-499.
[75] Zhang C H.Nearly unbiased variable selection under minimax concave penalty[J].The Annals of Statistics,2010,38(2):894-942.
[76] Huang J,Ma S,Xie H,Zhang C H.A group bridge approach for variable selection[J].Biometrika,2009,96(2):339-355.
[77] Nesterov Y.Smooth minimization of non-smooth functions[J].Mathematical Programming,2005,103(1):127-152.