Machine Learning, Volume 83, Issue 3, pp 265–287

Estimating variable structure and dependence in multitask learning via gradients

Open Access

DOI: 10.1007/s10994-010-5217-4

Cite this article as:
Guinney, J., Wu, Q. & Mukherjee, S. Mach Learn (2011) 83: 265. doi:10.1007/s10994-010-5217-4


We consider the problem of hierarchical or multitask modeling where we simultaneously learn the regression function and the underlying geometry and dependence between variables. We demonstrate how the gradients of the multiple related regression functions over the tasks allow for dimension reduction and inference of dependencies across tasks jointly and for each task individually. We provide Tikhonov regularization algorithms for both classification and regression that are efficient and robust for high-dimensional data, and a mechanism for incorporating a priori knowledge of task (dis)similarity into this framework. The utility of this method is illustrated on simulated and real data.
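The core idea of the abstract — that gradients of the learned regression function reveal variable structure and support dimension reduction — can be illustrated with a toy sketch. The paper estimates gradients via Tikhonov-regularized learning from data; the sketch below instead differentiates a known function by central finite differences and forms the expected gradient outer product matrix, whose top eigenvectors span the relevant predictive subspace. The helper `gradient_outer_product` is illustrative, not the authors' algorithm.

```python
import numpy as np

def gradient_outer_product(X, f, eps=1e-4):
    """Estimate the expected gradient outer product (EGOP) of f over the
    sample X by central finite differences. The top eigenvectors of the
    resulting d x d matrix span a dimension-reduction subspace."""
    n, d = X.shape
    G = np.zeros((d, d))
    for x in X:
        g = np.zeros(d)
        for j in range(d):
            e = np.zeros(d)
            e[j] = eps
            g[j] = (f(x + e) - f(x - e)) / (2 * eps)
        G += np.outer(g, g)  # accumulate per-sample gradient outer products
    return G / n

# Toy example: the response depends only on the direction (1, 1, 0, 0, 0).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
f = lambda x: np.sin(x[0] + x[1])

G = gradient_outer_product(X, f)
w, V = np.linalg.eigh(G)
top = V[:, -1]  # leading eigenvector, proportional to (1, 1, 0, 0, 0)
```

In the multitask setting of the paper, an EGOP-like matrix is estimated per task and jointly, so shared and task-specific predictive directions can be compared; here the recovered `top` direction concentrates on the first two coordinates, matching the true structure of `f`.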


Keywords: Multitask learning · Dimension reduction · Covariance estimation · Inverse regression · Graphical models

Copyright information

© The Author(s) 2010

Authors and Affiliations

  1. Sage Bionetworks, Seattle, USA
  2. Department of Mathematics, Michigan State University, East Lansing, USA
  3. Departments of Statistical Science, Computer Science, and Mathematics, Duke University, Durham, USA
