spark-logfac

Logistic Matrix Factorization over Spark

@ezamyatin

This package provides highly scalable matrix factorization algorithms with a logistic loss function: LMF (Logistic Matrix Factorization) and Item2Vec.
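Both LMF and Item2Vec fit low-dimensional embeddings so that the dot product of a pair of vectors, passed through a sigmoid, predicts whether that pair (user/item for LMF, item/context for Item2Vec) was observed. The conceptual Scala sketch below shows the per-pair logistic loss such models minimize; it illustrates the idea only and is not the package's API:

object LogisticLossSketch {
  // Squash a dot-product score into a probability that the pair was observed.
  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  def dot(x: Array[Double], y: Array[Double]): Double =
    x.zip(y).map { case (a, b) => a * b }.sum

  // label = 1.0 for an observed pair, 0.0 for a sampled negative pair.
  // Returns the logistic (cross-entropy) loss for that single pair.
  def pairLoss(x: Array[Double], y: Array[Double], label: Double): Double = {
    val p = sigmoid(dot(x, y))
    -(label * math.log(p) + (1.0 - label) * math.log(1.0 - p))
  }
}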


Tags

  • machine learning

How to

Include this package in your Spark Applications using:

spark-shell, pyspark, or spark-submit

> $SPARK_HOME/bin/spark-shell --packages ezamyatin:spark-logfac:0.1.0
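The same --packages flag works with pyspark and spark-submit; for example (the application artifact name below is a placeholder):

> $SPARK_HOME/bin/pyspark --packages ezamyatin:spark-logfac:0.1.0
> $SPARK_HOME/bin/spark-submit --packages ezamyatin:spark-logfac:0.1.0 your-app.jar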

sbt

If you use the sbt-spark-package plugin, in your sbt build file, add:

spDependencies += "ezamyatin/spark-logfac:0.1.0"

Otherwise,

resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org/"

libraryDependencies += "ezamyatin" % "spark-logfac" % "0.1.0"
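Putting it together, a minimal build.sbt might look like the sketch below; the project name and Spark version are assumptions (any Spark distribution built against Scala 2.12 should work):

// Minimal build.sbt sketch -- project name and Spark version are assumptions
name := "my-spark-app"
scalaVersion := "2.12.18"

resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org/"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "3.4.1" % "provided",
  "org.apache.spark" %% "spark-mllib" % "3.4.1" % "provided",
  "ezamyatin" % "spark-logfac" % "0.1.0"
)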

Maven

In your pom.xml, add:
<dependencies>
  <!-- list of dependencies -->
  <dependency>
    <groupId>ezamyatin</groupId>
    <artifactId>spark-logfac</artifactId>
    <version>0.1.0</version>
  </dependency>
</dependencies>
<repositories>
  <!-- list of other repositories -->
  <repository>
    <id>SparkPackagesRepo</id>
    <url>https://repos.spark-packages.org/</url>
  </repository>
</repositories>

Releases

Version: 0.1.0 (commit 2a4b1b) / Date: 2024-12-07 / License: Apache-2.0 / Scala version: 2.12