Spark-AdaOptimizer

Implements Adam for stochastic optimization.

@VinceShieh

A Spark-based implementation of the Adam and AdaGrad optimizers, two methods for stochastic optimization. See https://arxiv.org/abs/1412.6980 and http://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf. Compared with plain SGD, Adam and AdaGrad adapt the step size for each coordinate and typically perform better; in particular, with sparse features they can converge faster than standard SGD.
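
Concretely, Adam keeps per-coordinate exponential moving averages of the gradient and its square, and uses their bias-corrected values to scale each update. The following is a minimal, self-contained Scala sketch of that update rule as described in the Adam paper; it is illustrative only, and the object, class, and parameter names here are not this package's actual API.

  // Illustrative Adam update (Kingma & Ba, arXiv:1412.6980).
  // NOT this package's API; all names are hypothetical.
  object AdamSketch {

    // Per-parameter state Adam maintains: first moment m, second moment v,
    // and the timestep t used for bias correction.
    final case class AdamState(m: Array[Double], v: Array[Double], t: Int)

    // One Adam step: returns the updated weights and optimizer state.
    def step(
        weights: Array[Double],
        gradient: Array[Double],
        state: AdamState,
        stepSize: Double = 0.001,
        beta1: Double = 0.9,
        beta2: Double = 0.999,
        epsilon: Double = 1e-8): (Array[Double], AdamState) = {
      val t = state.t + 1
      val m = new Array[Double](weights.length)
      val v = new Array[Double](weights.length)
      val next = new Array[Double](weights.length)
      var i = 0
      while (i < weights.length) {
        // Exponential moving averages of the gradient and squared gradient.
        m(i) = beta1 * state.m(i) + (1 - beta1) * gradient(i)
        v(i) = beta2 * state.v(i) + (1 - beta2) * gradient(i) * gradient(i)
        // Bias-corrected estimates.
        val mHat = m(i) / (1 - math.pow(beta1, t))
        val vHat = v(i) / (1 - math.pow(beta2, t))
        // Per-coordinate adaptive update.
        next(i) = weights(i) - stepSize * mHat / (math.sqrt(vHat) + epsilon)
        i += 1
      }
      (next, AdamState(m, v, t))
    }
  }

AdaGrad differs only in that it accumulates the full sum of squared gradients (no exponential decay, no bias correction), which is what makes it well suited to sparse features: rarely updated coordinates keep a comparatively large step size.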


Tags

  • ml
  • mllib
  • ma

How to

This package has no releases published in the Spark Packages repo and no Maven coordinates supplied. You may have to build it from source, or it may simply be a script. To use this Spark package, follow the instructions in the README.
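
The README defines how this package's optimizers are actually invoked. Purely as an illustration of how an adaptive-gradient method can be plugged into spark.mllib once a locally built jar is on the classpath, here is a hedged sketch of an AdaGrad-style updater. The class name AdaGradUpdater is hypothetical; only org.apache.spark.mllib.optimization.Updater (which GradientDescent.setUpdater accepts) is a standard spark.mllib API.

  import org.apache.spark.mllib.linalg.{Vector, Vectors}
  import org.apache.spark.mllib.optimization.Updater

  // Hypothetical AdaGrad-style updater; not this package's class. It keeps a
  // running sum of squared gradients and shrinks each coordinate's step size
  // accordingly (GradientDescent calls compute() on the driver each iteration).
  class AdaGradUpdater(epsilon: Double = 1e-8) extends Updater {
    private var sumSqGrad: Array[Double] = _

    override def compute(
        weightsOld: Vector,
        gradient: Vector,
        stepSize: Double,
        iter: Int,
        regParam: Double): (Vector, Double) = {
      val g = gradient.toArray
      val w = weightsOld.toArray.clone()
      if (sumSqGrad == null) sumSqGrad = new Array[Double](g.length)
      var i = 0
      while (i < g.length) {
        sumSqGrad(i) += g(i) * g(i)
        w(i) -= stepSize * g(i) / (math.sqrt(sumSqGrad(i)) + epsilon)
        i += 1
      }
      // Regularization is omitted in this sketch, so the reg value is 0.
      (Vectors.dense(w), 0.0)
    }
  }

Such an updater would typically be handed to spark.mllib's GradientDescent via setUpdater; consult the package's README for its actual entry points.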

Releases

Version: 0.1 (3a66f9 | zip) / Date: 2016-12-13 / License: Apache-2.0