...
This page describes the mechanics of how to contribute software to Apache Hive. For ideas about what you might contribute, please see open tickets in Jira.
Info: Until HIVE-5610 is closed, the Maven commands below will not work. Once it is resolved, the Ant commands should be removed.
Getting the source code
First of all, you need the Hive source code.
...
No Format
ant clean package
...
Understanding Maven
Hive is a multi-module Maven project. If you are new to Maven, the articles below may be of interest:
Additionally, Hive is actually split into two projects, "core" and "itests". itests is not part of the core reactor because it requires the core packages to be built and installed first.
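The two-phase core/itests build can be sketched as follows (a minimal sketch; the -DskipTests flag is a standard Maven option used here only to illustrate the ordering, not something this page prescribes):

```shell
# Build and install the "core" modules into the local Maven repository,
# then build itests, which resolves those installed artifacts.
cd hive-trunk
mvn clean install -DskipTests
cd itests
mvn clean install -DskipTests
```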
Hadoop Dependencies
The Hive build downloads a number of different Hadoop versions via ivy in order to compile "shims" which allow for compatibility with these Hadoop versions. However, by default, the rest of Hive is only built and tested against a single Hadoop version (0.20.1 as of this writing, but check build.properties for the latest).
You can specify a different Hadoop version with -Dhadoop.version="<your-hadoop-version>". By default, Hadoop tarballs are pulled from http://mirror.facebook.net/facebook/hive-deps, which contains Hadoop 0.17.2.1, 0.18.3, 0.19.0, 0.20.0, 0.20.1, and 0.20S (aka 0.20.3-CDH3-SNAPSHOT). If the version you want is not there, you'll need to set hadoop.mirror to a different source. For 0.19.2 and 0.20.2, you can use http://mirror.facebook.net/apache or any other Apache mirror. For other versions, you'll need to use http://archive.apache.org/dist (but don't use this unless you have to, since it's an overloaded server).
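For example, a build against Hadoop 0.20.2 pulled from an Apache mirror might look like this (a sketch combining the two properties above; the mirror URL is whichever mirror you choose):

```shell
# Build against Hadoop 0.20.2, overriding the default tarball mirror.
cd hive-trunk
ant clean package -Dhadoop.version="0.20.2" \
    -Dhadoop.mirror="http://mirror.facebook.net/apache"
```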
...
When submitting a patch it's highly recommended that you locally execute the tests you believe will be impacted, in addition to any new tests. The full test suite can be executed by Hive PreCommit Patch Testing. See the Hive Developer FAQ for how to execute a specific set of tests.
Code Block
> cd hive-trunk
> ant clean package test tar -logfile ant.log
After a while, if you see
Code Block
BUILD SUCCESSFUL
all is ok, but if you see
Code Block
BUILD FAILED
then you should fix things before proceeding. Running
Code Block
> ant testreport

and examining the HTML report in build/test might be helpful (also build/ql/tmp/hive.log).
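To run just one test case rather than the full suite, a command along these lines is commonly used (a sketch following the Hive Developer FAQ conventions; the testcase and qfile values are illustrative):

```shell
# Run a single CLI driver query test instead of the whole suite.
cd hive-trunk
ant test -Dtestcase=TestCliDriver -Dqfile=groupby1.q
```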
MVN:
Code Block
> cd hive-trunk
> mvn clean install && cd itests && mvn clean install
After a while, if you see
Code Block
[INFO] BUILD SUCCESS
all is ok, but if you see
Code Block
[INFO] BUILD FAILURE
then you should fix things before proceeding. Running
Code Block
> ant testreport

and examining the HTML report in build/test might be helpful (also build/ql/tmp/hive.log).
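Under Maven, an individual test can similarly be run from the itests tree; a sketch (the module path and qfile value are illustrative, following the Hive Developer FAQ conventions):

```shell
# Run one CLI driver query test under Maven from the itests tree.
cd hive-trunk/itests/qtest
mvn test -Dtest=TestCliDriver -Dqfile=groupby1.q
```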
Unit tests take a long time (several hours) to run sequentially, even on a very fast machine; for information on how to run them in parallel, see Unit Test Parallel Execution.
...