Accelerating Newton optimization for log-linear models through feature redundancy
DSpace at IIT Bombay
Title
Accelerating Newton optimization for log-linear models through feature redundancy
Creator
MATHUR, ARPIT
CHAKRABARTI, SOUMEN
Subject
Newton-Raphson method
feature extraction
mathematical models
vectors
redundancy
regression analysis
Description
Log-linear models are widely used for labeling feature vectors and graphical models, typically to estimate robust conditional distributions in the presence of a large number of potentially redundant features. Limited-memory quasi-Newton methods like LBFGS or BLMVM are optimization workhorses for such applications, and most of the training time is spent computing the objective and gradient for the optimizer. We propose a simple technique to speed up the training optimization by clustering features dynamically and interleaving the standard optimizer with another, coarse-grained, faster optimizer that uses far fewer variables. Experiments with logistic regression training for text classification and conditional random field (CRF) training for information extraction show promising speed-ups between 2× and 9× without any systematic or significant degradation in the quality of the estimated models.
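The abstract only sketches the interleaving idea at a high level. As a rough illustration (not the authors' implementation), the following Python sketch alternates coarse LBFGS steps over one additive offset per feature cluster with standard LBFGS steps over all weights, for a simple L2-regularized logistic regression; the KMeans clustering of feature columns, the objective, the function names (train, coarse_objective), and the iteration schedule are all assumptions chosen for brevity.

```python
# Rough illustrative sketch only (not the paper's code): interleave coarse LBFGS
# steps over one offset per feature cluster with standard LBFGS steps over all
# weights, for L2-regularized logistic regression on a dense feature matrix X.
import numpy as np
from scipy.optimize import minimize
from sklearn.cluster import KMeans


def objective(w, X, y, lam):
    """Regularized negative log-likelihood of logistic regression, with gradient."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    nll = -np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad = X.T @ (p - y) + lam * w
    return nll + 0.5 * lam * np.dot(w, w), grad


def coarse_objective(c, w, X, y, lam, membership):
    """Same objective, viewed as a function of one additive offset per feature cluster."""
    w_full = w + c[membership]                 # expand cluster offsets to all features
    f, g_full = objective(w_full, X, y, lam)
    # Chain rule: the gradient w.r.t. a cluster offset sums over its member features.
    g = np.bincount(membership, weights=g_full, minlength=len(c))
    return f, g


def train(X, y, n_clusters=20, outer_iters=5, lam=1.0):
    w = np.zeros(X.shape[1])
    for _ in range(outer_iters):
        # "Dynamic" re-clustering of features, here simply by similarity of their columns.
        membership = KMeans(n_clusters=n_clusters, n_init=3).fit_predict(np.asarray(X.T))
        # Coarse phase: optimize only n_clusters variables (few variables, cheap iterations).
        res = minimize(coarse_objective, np.zeros(n_clusters), jac=True,
                       args=(w, X, y, lam, membership), method="L-BFGS-B",
                       options={"maxiter": 10})
        w = w + res.x[membership]
        # Fine phase: a limited number of standard LBFGS steps over all weights.
        res = minimize(objective, w, jac=True, args=(X, y, lam),
                       method="L-BFGS-B", options={"maxiter": 10})
        w = res.x
    return w
```

In this sketch the coarse phase moves all features in a cluster by a shared offset, so each of its LBFGS iterations involves far fewer optimization variables, echoing the abstract; the paper's actual clustering criterion and interleaving schedule should be taken from the full text.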
Publisher
IEEE
Date
2009-05-19T02:49:53Z
2011-11-28T08:05:39Z
2011-12-15T09:57:26Z
2006
Type
Article
Identifier
Proceedings of the Sixth International Conference on Data Mining, Hong Kong, China, 18-22 December 2006, 1-10
ISBN 0-7695-2701-7
DOI 10.1109/ICDM.2006.11
http://hdl.handle.net/10054/1381
http://dspace.library.iitb.ac.in/xmlui/handle/10054/1381
Language
en