
Multiple task learning with flexible structure regularization

  • Jian Pu*
  • Jun Wang
  • Yu Gang Jiang
  • Xiangyang Xue

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Due to its theoretical advances and empirical successes, Multi-task Learning (MTL) has become a popular design paradigm for training a set of tasks jointly. By exploring the hidden relationships among multiple tasks, many MTL algorithms have been developed to enhance learning performance. In general, these complicated hidden relationships can be viewed as a combination of two key structural elements: task grouping and task outliers. Based on such task relationships, here we propose a generic MTL framework with flexible structure regularization, which aims at relaxing any type of specific structure assumption. In particular, we directly impose a joint ℓ1,1/ℓ2,1-norm as the regularization term to reveal the underlying task relationship in a flexible way. Such a flexible structure regularization term accounts for any convex combination of grouping and outlier structural characteristics among the multiple tasks. To derive efficient solutions for the generic MTL framework, we develop two algorithms with different emphases and strengths, namely the Iteratively Reweighted Least Squares (IRLS) method and the Accelerated Proximal Gradient (APG) method. In addition, the theoretical convergence and performance guarantees are analyzed for both algorithms. Finally, extensive experiments on both synthetic and real data, together with comparisons against several state-of-the-art algorithms, demonstrate the superior performance of the proposed generic MTL method.
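To make the regularizer concrete, the sketch below evaluates a joint ℓ1,1/ℓ2,1 penalty on a task-weight matrix (rows indexed by features, columns by tasks, with group sparsity taken row-wise here for illustration) and its proximal operator, the key subroutine an APG-style solver would call. The function names, the convex-combination weights, and the row-wise grouping convention are assumptions for this sketch, not the paper's exact formulation; the prox follows the standard sparse-group-lasso composition (entrywise soft-thresholding followed by group shrinkage).

```python
import numpy as np

def flexible_penalty(W, lam1, lam2):
    """Joint penalty lam1 * ||W||_{1,1} + lam2 * ||W||_{2,1} (illustrative)."""
    # l1,1 term: entrywise sparsity, intended to capture task outliers
    l11 = np.abs(W).sum()
    # l2,1 term: row-wise group sparsity, intended to capture task grouping
    l21 = np.linalg.norm(W, axis=1).sum()
    return lam1 * l11 + lam2 * l21

def prox_flexible(W, lam1, lam2, step):
    """Proximal operator of the combined penalty with step size `step`.

    Computed as entrywise soft-thresholding (for the l1,1 part) followed
    by row-wise group shrinkage (for the l2,1 part), the standard
    composition used for sparse-group-lasso-type penalties.
    """
    # Entrywise soft-thresholding
    V = np.sign(W) * np.maximum(np.abs(W) - step * lam1, 0.0)
    # Row-wise group shrinkage
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(1.0 - step * lam2 / np.maximum(norms, 1e-12), 0.0)
    return scale * V
```

In an APG loop, each iteration would take a gradient step on the smooth loss and then apply `prox_flexible` to the result; setting either `lam1` or `lam2` to zero recovers a pure outlier-style or pure grouping-style regularizer.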

Original language: English
Pages (from-to): 242-256
Number of pages: 15
Journal: Neurocomputing
Volume: 177
DOI
Publication status: Published - 12 Feb 2016
