DistML provides a supplement to MLlib, adding model-parallel training support on Spark
@intel-machine-learning
This is a reference implementation of "Parameter Server on Spark". It doesn't depend on a specific version of Spark, so you can use it simply as a library. To add your own algorithm, DistML provides a suite of concise APIs, including a databus that helps you exchange parameters between workers and parameter servers. If you run into any problem, or have any question, please email firstname.lastname@example.org
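DistML's actual APIs are in Scala and are not shown here; as a language-neutral illustration of the parameter-server pattern the databus mediates (workers pull current weights, compute local gradients, and push updates back), here is a minimal single-process sketch. All class and method names below are hypothetical and do not correspond to DistML's real API.

```python
# Hypothetical sketch of the parameter-server push/pull pattern.
# These names are illustrative only, NOT DistML's actual Scala API.

class ParameterServer:
    """Holds model weights; workers pull them and push gradient updates."""

    def __init__(self, size):
        self.weights = [0.0] * size

    def pull(self):
        # A worker fetches the current weights over the databus.
        return list(self.weights)

    def push(self, gradients, lr=0.1):
        # A worker sends gradients; the server applies an SGD step.
        self.weights = [w - lr * g for w, g in zip(self.weights, gradients)]


def worker_step(server, sample):
    # One training iteration on a worker: pull, compute a toy gradient, push.
    w = server.pull()
    # Gradient of the loss 0.5 * (w - x)^2, so weights move toward the sample.
    grads = [wi - xi for wi, xi in zip(w, sample)]
    server.push(grads)


server = ParameterServer(3)
for _ in range(50):
    worker_step(server, [1.0, 2.0, 3.0])
weights = server.pull()
# After repeated pull/push rounds the weights converge toward the sample.
```

In a real deployment the server state would be sharded across parameter-server nodes and many Spark workers would push and pull concurrently; the single-process loop above only shows the exchange protocol.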
This package doesn't have any releases published in the Spark Packages repo, or with maven coordinates supplied. You may have to build this package from source, or it may simply be a script. To use this Spark Package, please follow the instructions in the README.
No releases yet.