spark-kafka-writer (homepage)
Write your RDDs and DStreams to Kafka seamlessly
@BenFradet
This project lets you write your RDDs and DStreams to Kafka.
How to
Include this package in your Spark Applications using:
spark-shell, pyspark, or spark-submit
> $SPARK_HOME/bin/spark-shell --packages com.github.benfradet:spark-kafka-writer_2.11:0.4.0
sbt
In your sbt build file, add:
libraryDependencies += "com.github.benfradet" %% "spark-kafka-writer" % "0.4.0"
(`%%` appends the Scala binary version automatically, resolving to `spark-kafka-writer_2.11` on Scala 2.11.)
Maven
In your pom.xml, add:
<dependencies>
  <!-- list of dependencies -->
  <dependency>
    <groupId>com.github.benfradet</groupId>
    <artifactId>spark-kafka-writer_2.11</artifactId>
    <version>0.4.0</version>
  </dependency>
</dependencies>
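Once the dependency is on the classpath, writing to Kafka looks roughly like the sketch below. It assumes the library's `writeToKafka` extension method and its package import; the broker address (`localhost:9092`) and topic name (`my-topic`) are placeholders, and `rdd` / `dStream` stand for an existing `RDD[String]` and `DStream[String]`. This is an illustrative sketch, not a verified snippet, and it needs a running Spark application and Kafka broker to execute.

```scala
import com.github.benfradet.spark.kafka.writer._
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

// Standard Kafka producer configuration; broker address is a placeholder
val producerConfig = Map(
  "bootstrap.servers" -> "localhost:9092",
  "key.serializer"    -> classOf[StringSerializer].getName,
  "value.serializer"  -> classOf[StringSerializer].getName
)

// Write every element of an RDD[String] to the (placeholder) topic "my-topic"
rdd.writeToKafka(
  producerConfig,
  s => new ProducerRecord[String, String]("my-topic", s)
)

// The same extension method is available on DStreams: each batch's
// elements are turned into ProducerRecords and sent to Kafka
dStream.writeToKafka(
  producerConfig,
  s => new ProducerRecord[String, String]("my-topic", s)
)
```

The transform function (`s => new ProducerRecord(...)`) decides how each element maps to a Kafka record, so you can set keys, partitions, or derive the topic per element if needed.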
Releases
Version: 0.4.0 ( 2e440d | zip | jar ) / Date: 2017-07-22 / License: Apache-2.0 / Scala version: 2.11
Version: 0.3.0 ( a023ad | zip | jar ) / Date: 2017-03-30 / License: Apache-2.0 / Scala version: 2.11
Version: 0.1.0 ( 46b1e2 | zip | jar ) / Date: 2016-08-11 / License: Apache-2.0 / Scala version: 2.10