Reducing Build Times

Spark's default build strategy is to assemble a jar containing all of its dependencies. This can be cumbersome during iterative development. When developing locally, you can instead build an assembly jar of Spark's dependencies once, and then re-package only Spark itself when making changes.

 

Fast Local Builds
$ sbt/sbt clean assemble-deps
$ sbt/sbt package
# ... do some local development ... #
$ sbt/sbt package
# ... do some local development ... #
$ sbt/sbt package
# ...
 
# You can also prefix a command with ~ to have sbt rebuild incrementally on
# file changes, without starting a new sbt session each time
$ sbt/sbt ~package
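
The ~ prefix works with any sbt command, so it can also be combined with test-only (described below) for a fast edit-and-test loop. A minimal sketch, reusing the suite name from the test examples in the next section:

$ # Re-run a single suite on every file change
$ sbt/sbt "~test-only org.apache.spark.io.CompressionCodecSuite"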

Running Individual Tests

Often it is useful to run individual tests in Maven or SBT.

$ # sbt
$ sbt/sbt "test-only org.apache.spark.io.ComprsionCodecSuite"
$ sbt/sbt "test-only org.apache.spark.io.*"

$ # Maven
$ mvn clean test -DwildcardSuites=org.apache.spark.io.CompressionCodecSuite
$ mvn clean test -DwildcardSuites=org.apache.spark.io.*
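
Note that -DwildcardSuites selects Scala tests via the scalatest-maven-plugin; Maven may still run the Java tests in the same module. A sketch of skipping those, assuming the module's Java tests run under Surefire (whose -Dtest flag can be given a pattern, such as none, that matches no test):

$ # Run one Scala suite while skipping Java tests
$ mvn clean test -DwildcardSuites=org.apache.spark.io.CompressionCodecSuite -Dtest=none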