Accelerating Stochastic Composition Optimization

Mengdi Wang, Ji Liu, Ethan X. Fang

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

We consider the stochastic nested composition optimization problem, where the objective is a composition of two expected-value functions. We propose a new stochastic first-order method, namely the accelerated stochastic compositional proximal gradient (ASC-PG) method. This algorithm updates the solution based on noisy gradient queries using a two-timescale iteration. The ASC-PG is the first proximal gradient method for the stochastic composition problem that can handle a nonsmooth regularization penalty. We show that the ASC-PG exhibits faster convergence than the best-known algorithms, and that it achieves the optimal sample-error complexity in several important special cases. We demonstrate the application of ASC-PG to reinforcement learning and conduct numerical experiments.
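
To make the two-timescale idea concrete: the problem has the form min_x E_v[f_v(E_w[g_w(x)])] + R(x), and the method keeps a fast-timescale running estimate y of the inner expectation E_w[g_w(x)] while taking slow-timescale proximal gradient steps on x using the chain-rule gradient estimate. Below is a minimal Python sketch of that structure on a toy problem with a linear inner map, a quadratic outer function, and an l1 penalty; the step-size schedules, noise model, and all variable names are illustrative assumptions, not the paper's exact ASC-PG algorithm (which also uses an extrapolation step).

```python
import numpy as np

# Two-timescale compositional proximal gradient sketch (assumed toy setup:
# inner map E_w[g_w(x)] = A x, outer function E_v[f_v(y)] = 0.5*||y - b||^2,
# l1 penalty handled by its proximal operator, soft-thresholding).

rng = np.random.default_rng(0)
d, m = 10, 5                      # dimensions of x and of the inner value g(x)
A = rng.standard_normal((m, d))   # defines the toy inner map g(x) = A x
b = rng.standard_normal(m)        # defines the toy outer function f(y)
lam = 0.1                         # l1 regularization weight


def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


x = np.zeros(d)
y = np.zeros(m)                   # fast-timescale estimate of g(x)

for k in range(1, 5001):
    alpha = 1.0 / k               # slow stepsize for x (assumed schedule)
    beta = 1.0 / np.sqrt(k)       # fast stepsize for y (assumed schedule)

    # Noisy oracle queries: inner value, inner Jacobian, outer gradient at y.
    g_sample = A @ x + 0.01 * rng.standard_normal(m)
    Jg_sample = A + 0.01 * rng.standard_normal((m, d))
    grad_f = (y - b) + 0.01 * rng.standard_normal(m)

    # Fast timescale: track the inner expected value E_w[g_w(x)].
    y = (1 - beta) * y + beta * g_sample

    # Slow timescale: proximal gradient step with the chain-rule estimate
    # grad ≈ (grad g_w(x))^T grad f_v(y).
    grad_est = Jg_sample.T @ grad_f
    x = soft_threshold(x - alpha * grad_est, alpha * lam)

print("final x:", np.round(x, 3))
```

The key design point illustrated here is that the outer gradient is evaluated at the tracked estimate y rather than at a fresh sample of g(x), which is what keeps each iteration unbiased-in-the-limit without requiring nested sampling.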

Original language: English (US)
Pages (from-to): 1-23
Number of pages: 23
Journal: Journal of Machine Learning Research
Volume: 18
State: Published - Oct 1 2017
Externally published: Yes

Keywords

  • Composition optimization
  • Large-scale optimization
  • Sample complexity
  • Stochastic gradient

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability
