sbt plugin for Spark Packages
@databricks
sbt-spark-package is an sbt plugin that simplifies developing Spark Packages and using Spark Packages in your applications.
- Use `spDist` to create a release artifact of your package to share on the Spark Packages repository.
- Use `console` to launch a Spark Shell to test your packages during development.
- Use `spPublishLocal`, then include the package in your Spark applications by passing its Maven coordinate to spark-submit, spark-shell, or pyspark via `--packages $GITHUB_ORG_NAME:$GITHUB_REPO_NAME:$VERSION`.
- Use `spRegister` to register your package from the command line!
- Use `spPublish` to publish a release from the command line; it is hosted in the Spark Packages repository so that anyone can use it with Spark with minimal effort!
For example, you can include the 0.1 release of `databricks/spark-avro` with `--packages databricks:spark-avro:0.1`.
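The commands above are driven by settings in your build. As a minimal sketch (the plugin resolver, coordinate, package name, and Spark version shown here are illustrative assumptions, not taken from this page), a build using the plugin might look like:

```scala
// project/plugins.sbt -- make the plugin available to the build.
// Resolver URL and plugin coordinate are assumptions; check the
// plugin README for the current values.
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven/"
addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.4")

// build.sbt -- settings read by spDist, spPublishLocal, and spPublish.
spName := "my-org/my-package"        // GitHub org/repo; becomes the package coordinate
sparkVersion := "1.6.2"              // Spark version to build and test against
sparkComponents ++= Seq("sql")       // adds spark-sql as a provided dependency
version := "0.1.0"
```

With these settings, `sbt spPublishLocal` publishes the package locally, after which `spark-shell --packages my-org:my-package:0.1.0` can pick it up.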
- Version 0.2.4 (302ea7) / Date: 2016-07-15 / License: Apache-2.0
- Version 0.2.1 (02283e) / Date: 2015-04-02 / License: Apache-2.0
- Version 0.2.0 (400cf2) / Date: 2015-03-16 / License: Apache-2.0
- Version 0.1.1 (eb823d) / Date: 2015-03-06 / License: Apache-2.0