Year: 2018
Authors: Mojmír Mutný, Peter Richtárik
Journal of Computational Mathematics, Vol. 36 (2018), Iss. 3 : pp. 404–425
Abstract
We propose a parallel stochastic Newton method (PSN) for minimizing unconstrained smooth convex functions. We analyze the method in the strongly convex case and give conditions under which acceleration can be expected when compared to its serial counterpart. We show how PSN can be applied to large-scale quadratic function minimization and to empirical risk minimization problems. We demonstrate the practical efficiency of the method through numerical experiments and models of simple matrix classes.
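To make the idea concrete, below is a minimal sketch of one common flavor of stochastic Newton iteration: at each step a random block of coordinates is sampled and a Newton step is taken on that block only. This is a generic illustration on a strongly convex quadratic, not the paper's exact PSN update; the matrix `A`, vector `b`, block size `tau`, and iteration count are all illustrative choices.

```python
import numpy as np

# Illustrative block-coordinate stochastic Newton step on a strongly
# convex quadratic f(x) = 0.5 x^T A x - b^T x. A generic sketch of the
# idea, NOT the authors' exact PSN algorithm.

rng = np.random.default_rng(0)
d = 20
M = rng.standard_normal((d, d))
A = M @ M.T + d * np.eye(d)      # symmetric positive definite -> strongly convex
b = rng.standard_normal(d)
x_star = np.linalg.solve(A, b)   # exact minimizer, used only to measure progress

x = np.zeros(d)
tau = 5                          # number of coordinates updated per iteration
for _ in range(200):
    S = rng.choice(d, size=tau, replace=False)
    grad_S = A[S] @ x - b[S]               # gradient restricted to block S
    H_SS = A[np.ix_(S, S)]                 # Hessian sub-block for S
    x[S] -= np.linalg.solve(H_SS, grad_S)  # Newton step on the block

err = np.linalg.norm(x - x_star)
print(err)
```

In a parallel variant, several such blocks could be sampled and their Newton updates computed concurrently; the paper's analysis concerns when that parallelism yields acceleration over the serial scheme.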
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/jcm.1708-m2017-0113
Published online: 2018-01
Copyright: © Global Science Press
Pages: 22
Keywords: optimization, parallel methods, Newton's method, stochastic algorithms.