spark-cc
A library for computing the clustering coefficient of a graph
@SherlockYang
Provides a Spark implementation of clustering coefficient computation.
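For context: the local clustering coefficient of a vertex v is the fraction of pairs of v's neighbors that are themselves connected, i.e. 2 * triangles(v) / (deg(v) * (deg(v) - 1)). This page does not document spark-cc's own API, so as a minimal sketch of the quantity being computed, here is the same calculation done in the spark-shell with plain GraphX triangle counting (edges.txt is a placeholder for an undirected edge-list file):

import org.apache.spark.graphx.{GraphLoader, PartitionStrategy}

// Load an undirected edge list; triangleCount expects srcId < dstId,
// which canonicalOrientation enforces, plus an explicit partitioning.
val graph = GraphLoader.edgeListFile(sc, "edges.txt", canonicalOrientation = true)
  .partitionBy(PartitionStrategy.RandomVertexCut)

// Local clustering coefficient: 2 * triangles(v) / (deg(v) * (deg(v) - 1)),
// defined as 0.0 for vertices with fewer than two neighbors.
val localCC = graph.triangleCount().vertices
  .join(graph.degrees)
  .mapValues { case (t, d) => if (d < 2) 0.0 else 2.0 * t / (d.toDouble * (d - 1)) }

localCC.take(5).foreach(println)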
How to
Include this package in your Spark applications using:
spark-shell, pyspark, or spark-submit
> $SPARK_HOME/bin/spark-shell --packages SherlockYang:spark-cc:0.1
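The same --packages flag works with the other launchers as well (your-app.jar stands in for your own application jar):

> $SPARK_HOME/bin/pyspark --packages SherlockYang:spark-cc:0.1
> $SPARK_HOME/bin/spark-submit --packages SherlockYang:spark-cc:0.1 your-app.jar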
sbt
If you use the sbt-spark-package plugin, add the following to your sbt build file:
spDependencies += "SherlockYang/spark-cc:0.1"
Otherwise,
resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org/"
libraryDependencies += "SherlockYang" % "spark-cc" % "0.1"
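Put together, a minimal build.sbt for this second route might look like the sketch below. The Spark version (1.5.1, current around this release's date) and the GraphX dependency are assumptions; adjust them to your setup. The release row below confirms the package was built for Scala 2.10.

// build.sbt -- minimal sketch; Spark version and GraphX dependency are assumptions
scalaVersion := "2.10.6"

resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org/"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"   % "1.5.1" % "provided",
  "org.apache.spark" %% "spark-graphx" % "1.5.1" % "provided",
  "SherlockYang" % "spark-cc" % "0.1"
)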
Maven
In your pom.xml, add:

<dependencies>
  <!-- list of dependencies -->
  <dependency>
    <groupId>SherlockYang</groupId>
    <artifactId>spark-cc</artifactId>
    <version>0.1</version>
  </dependency>
</dependencies>

<repositories>
  <!-- list of other repositories -->
  <repository>
    <id>SparkPackagesRepo</id>
    <url>https://repos.spark-packages.org/</url>
  </repository>
</repositories>
Releases
Version: 0.1 (commit 93fb64; zip and jar downloads available) / Date: 2015-10-22 / License: LGPL-3.0 / Scala version: 2.10