TY - GEN
T1 - Accelerated training of conditional random fields with stochastic gradient methods
AU - Vishwanathan, S. V. N.
AU - Schraudolph, Nicol N.
AU - Schmidt, Mark W.
AU - Murphy, Kevin P.
PY - 2006
Y1 - 2006
N2 - We apply Stochastic Meta-Descent (SMD), a stochastic gradient optimization method with gain vector adaptation, to the training of Conditional Random Fields (CRFs). On several large data sets, the resulting optimizer converges to the same quality of solution over an order of magnitude faster than limited-memory BFGS, the leading method reported to date. We report results for both exact and inexact inference techniques.
UR - http://www.scopus.com/inward/record.url?scp=33749243756&partnerID=8YFLogxK
M3 - Conference contribution
SN - 1595933832
SN - 9781595933836
T3 - ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
SP - 969
EP - 976
BT - ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
T2 - ICML 2006: 23rd International Conference on Machine Learning
Y2 - 25 June 2006 through 29 June 2006
ER -