The Apache Spark team welcomes all types of contributions, whether they be bug reports, documentation, or new patches.

Reporting Issues

If you'd like to report a bug in Spark or request a new feature, open an issue on the Apache Spark JIRA. For general usage help, email the user mailing list.

Contributing Code

We prefer to receive contributions in the form of GitHub pull requests. Please send pull requests against the github.com/apache/spark repository. If you've previously forked Spark from its old location, you will need to fork apache/spark instead.

Here are a few tips to get your contribution in:

  1. Break your work into small, single-purpose patches if possible. It’s much harder to merge in a large change with a lot of disjoint features.
  2. Create a JIRA for your patch on the Spark Project JIRA.
  3. Submit the patch as a GitHub pull request. For a tutorial, see the GitHub guides on forking a repo and sending a pull request. Name your pull request after the JIRA issue and tag it with the Spark module or [WIP] if relevant (see the tip below for examples).
  4. Follow the Spark Code Style Guide. Before sending in your pull request, run sbt/sbt scalastyle to validate the style.
  5. Make sure that your code passes the unit tests. You can run the tests with sbt/sbt assembly and then sbt/sbt test in the root directory of Spark. It's important to run assembly first as some of the tests depend on compiled JARs.
  6. Add new unit tests for your code. We use ScalaTest for testing. Just add a new Suite in core/src/test, or methods to an existing Suite; a minimal sketch follows this list.
  7. Update the documentation (in the docs folder) if you add a new feature or configuration parameter.
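
To illustrate item 6, here is a minimal sketch of a ScalaTest suite. The suite name and the behavior it checks are hypothetical placeholders; a real test would exercise your new code and sit alongside the existing suites in core/src/test.

    import org.scalatest.FunSuite

    // Hypothetical example suite: replace the name and assertions
    // with tests that exercise your change.
    class MyFeatureSuite extends FunSuite {

      test("sum of an empty sequence is zero") {
        assert(Seq.empty[Int].sum === 0)
      }

      test("sum of known values") {
        assert(Seq(1, 2, 3).sum === 6)
      }
    }

While iterating, you can run a single suite with sbt's test-only task, for example sbt/sbt "test-only MyFeatureSuite", before running the full test suite.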

If you’d like to report a bug but don’t have time to fix it, you can still post it to our issue tracker, or email the mailing list.

Tip: Use descriptive names in your pull requests

SPARK-123: Add some feature to Spark

[STREAMING] SPARK-123: Add some feature to Spark streaming

[MLLIB] [WIP] SPARK-123: Some potentially useful feature for MLlib

Starter Tasks

If you are new to Spark and want to contribute, you can browse through the list of starter tasks on our JIRA. These tasks are typically small and simple, and are excellent problems to get you ramped up.

Documentation

If you'd like to contribute documentation, there are two ways:

  • To have us add a link to an external tutorial you wrote, simply email the developer mailing list.
  • To modify the built-in documentation, edit the Markdown source files in Spark's docs directory and send a patch against the Spark GitHub repository. The README file in docs explains how to build the documentation locally so you can test your changes.

Development Discussions

To keep up to date with the latest discussions, join the developer mailing list.

IDE Setup

While many of the Spark developers use SBT or Maven on the command line, the most common IDE we use is IntelliJ IDEA. You can get the community edition for free (Apache committers can get free IntelliJ Ultimate Edition licenses) and install the JetBrains Scala plugin from Preferences > Plugins. To generate an IDEA workspace for Spark, run

    sbt/sbt update gen-idea

Then import the folder into IDEA. When you build the project, you might get a warning about "test and compile output paths" being the same for the "root-build" project. You can fix it by opening File -> Project Structure and changing the output path of the root-build module to be <spark-home>/project/target/idea-test-classes instead of idea-classes.

If you use Eclipse to develop Spark, feel free to add a short guide on setting it up to this wiki page.

This page has moved permanently to http://spark.apache.org/contributing.html.