spark-hazelcast-connector
Connects Spark to Hazelcast
@erenavsarogullari
Spark-Hazelcast Connector:
- Exposes Hazelcast Distributed Objects (DistributedMap, MultiMap, ReplicatedMap, DistributedList, DistributedSet and DistributedQueue) as Spark RDDs.
- Writes RDDs back to Hazelcast through implicit writeEntryToHazelcast / writeItemToHazelcast / writeMessageToHazelcast calls.
- Streams Hazelcast Distributed Object events to Spark as DStreams, creating HazelcastEntryStream, HazelcastItemStream and HazelcastMessageStream.
- Writes Spark DStreams back to Hazelcast through the same implicit writeEntryToHazelcast / writeItemToHazelcast / writeMessageToHazelcast calls (see the sketch after this list).
- Provides an examples module so you can learn the API and get set up in minutes!
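The features above map onto a small API surface. Below is a minimal usage sketch, not taken from the package docs: the connector import, the property keys, and the HazelcastEntryRDD class name and all constructor/method signatures are assumptions for illustration; writeEntryToHazelcast and HazelcastEntryStream are the names given in the feature list.

import java.util.Properties

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

import com.otv.spark.hazelcast.connector._ // assumed package name; check the connector sources

object SparkHazelcastExample {
  def main(args: Array[String]): Unit = {
    val sc  = new SparkContext(new SparkConf().setAppName("spark-hazelcast-example").setMaster("local[2]"))
    val ssc = new StreamingContext(sc, Seconds(5))

    // Connector properties: which Hazelcast XML config to load and which
    // distributed object to bind to (these property keys are assumed).
    val props = new Properties()
    props.setProperty("hazelcast.xml.config.file.name", "hazelcast.xml")
    props.setProperty("hazelcast.distributed.object.name", "testDistributedMap")

    // Read: expose a Hazelcast DistributedMap as a Spark RDD
    // (HazelcastEntryRDD is an assumed class name and signature).
    val entryRDD = new HazelcastEntryRDD[Int, String](sc, props)
    entryRDD.collect().foreach(println)

    // Write: push a pair RDD back to Hazelcast via the implicit
    // writeEntryToHazelcast call from the feature list (argument assumed).
    sc.parallelize(Seq((1, "a"), (2, "b"))).writeEntryToHazelcast(props)

    // Stream: consume Hazelcast entry events as a DStream through
    // HazelcastEntryStream (constructor signature assumed).
    val entryStream = new HazelcastEntryStream[Int, String](ssc, props)
    entryStream.print()

    ssc.start()
    ssc.awaitTermination()
  }
}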
How to
Include this package in your Spark applications using:
spark-shell, pyspark, or spark-submit
> $SPARK_HOME/bin/spark-shell --packages erenavsarogullari:spark-hazelcast-connector:1.0.0-s_2.11
sbt
If you use the sbt-spark-package plugin, add the following to your sbt build file:
spDependencies += "erenavsarogullari/spark-hazelcast-connector:1.0.0-s_2.11"
Otherwise,
resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org/" libraryDependencies += "erenavsarogullari" % "spark-hazelcast-connector" % "1.0.0-s_2.11"
Maven
In your pom.xml, add:
<dependencies>
<!-- list of dependencies -->
<dependency>
<groupId>erenavsarogullari</groupId>
<artifactId>spark-hazelcast-connector</artifactId>
<version>1.0.0-s_2.11</version>
</dependency>
</dependencies>
<repositories>
<!-- list of other repositories -->
<repository>
<id>SparkPackagesRepo</id>
<url>https://repos.spark-packages.org/</url>
</repository>
</repositories>
Releases
Version: 1.0.0-s_2.11 ( 0897f3 | zip | jar ) / Date: 2016-03-07 / License: Apache-2.0
Version: 1.0.0-s_2.10 ( 0897f3 | zip | jar ) / Date: 2016-03-07 / License: Apache-2.0