

Bigtop tests assume that there is an up-and-running HBase cluster. This set of instructions assumes you have that setup already, and covers the case where you want to use Bigtop to system-test development versions of hadoop/hbase. It will walk you through getting Bigtop, building the Bigtop test-execution suite, setting up test-execution against particular versions of hbase and hadoop, and then running tests.

Setting up bigtop

First, install bigtop
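If you don't already have a checkout, one way to get Bigtop is straight from the Apache git mirror (adjust the branch or tag to whatever you want to test against):

```shell
# Clone the Apache Bigtop source tree
git clone https://github.com/apache/bigtop.git
cd bigtop
```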

Bigtop's "input variables" are environment variables. Thus you need to make sure hbase environment settings are set up for it to use.

Also, if you are using Zookeeper in hbase-managed mode, you can cheat and set ZOOKEEPER_HOME=$HBASE_HOME/lib so it points at the bundled zk jar.
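Concretely, the environment might look like this before you run anything (the paths are examples; substitute your own install locations):

```shell
# Example environment; adjust paths to your actual installs
export HADOOP_HOME=/usr/lib/hadoop
export HBASE_HOME=/usr/lib/hbase
export HBASE_CONF_DIR=$HBASE_HOME/conf
# If HBase manages zookeeper, point ZOOKEEPER_HOME at the bundled zk jar
export ZOOKEEPER_HOME=$HBASE_HOME/lib
```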

test-execution prerequisites (aka, make maven work)

Test execution pulls dependencies only from mvn pom files. So, even if you have your cluster and environment variables set up, the Bigtop tests will use only the jars from the test-execution poms. This means you need to make sure that you have poms for the versions you are using and the proper jars installed in your mvn .m2 cache.

Let's start with the Bigtop jars. Have maven build and install the Bigtop-related test-execution jars. These commands will download the world and then install into your ~/.m2 dir.
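A build along these lines installs the test framework and test artifacts into ~/.m2 (the module paths shown are from a typical Bigtop checkout; verify them against your tree):

```shell
cd bigtop
# Install the shared test framework first, then the test artifacts
mvn install -DskipTests -f bigtop-test-framework/pom.xml
mvn install -DskipTests -f bigtop-tests/test-artifacts/pom.xml
```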

If you are using a custom build of HBase, build and mvn install it.
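For a custom HBase, that amounts to something like:

```shell
cd /path/to/hbase          # your HBase source checkout
mvn install -DskipTests    # installs the hbase jars and pom into ~/.m2
```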

Modify the hbase execution pom to point at the particular version of hbase you want. I'm testing an HBase 0.92 release, so I got a copy of hbase-0.92 and ran 'mvn install -DskipTests' to install its pom into my local repository. Here's the snippet I added to bigtop-tests/test-execution/smokes/hbase/pom.xml
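The original snippet didn't survive in this copy of the page; a property override of this shape in the smokes pom is one way to pin the version (the property name hbase.version is an assumption — check what the pom actually references):

```xml
<properties>
  <!-- point the hbase smokes at the locally installed HBase build -->
  <hbase.version>0.92.0</hbase.version>
</properties>
```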

This section is specific to Cloudera-provided tarballs. Unfortunately, I don't know how to do this from the Apache Hadoop releases.

If you are using a custom hadoop/hdfs/mr (some tests write to HDFS directly and thus need the proper version of the Hadoop jars), build and mvn install these jars into your local .m2 repo. Hadoop is normally an ant build, so to do this (specifically for cdh3 snapshots), run maven against hadoop.git's cloudera-pom.xml in the hadoop repo dir to install the jars.
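That invocation looks roughly like the following (cloudera-pom.xml is the file named above; run it from the hadoop repo dir):

```shell
cd /path/to/hadoop         # your hadoop.git checkout
mvn -f cloudera-pom.xml install -DskipTests
```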

Next, add the dependency to the execution pom.xml.
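A dependency entry of this shape is what's meant (the groupId/artifactId/version here are illustrative; match them to the jars you just installed):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <!-- example version: use whatever you mvn-installed above -->
  <version>0.20.2-cdh3u3-SNAPSHOT</version>
</dependency>
```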

The `hbase classpath` command generates a classpath and is "smart" enough to pull in cached .m2 artifacts. This may cause a problem if your hbase build depends on a different version of hadoop. You can run 'mvn clean' in $HBASE_HOME afterwards to remove the files the hbase script uses to cache the classpath.
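To see exactly what the tests will inherit, and to clear the cached entries if they look wrong:

```shell
# Inspect the classpath the hbase script will hand to the tests
$HBASE_HOME/bin/hbase classpath
# Clear build output the hbase script picks up from the source tree
cd $HBASE_HOME && mvn clean
```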


To run tests, go to the test-execution dir and run 'mvn verify' with the particular include filter. This particular test runs a quick-and-simple create table, put, and get.
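For example (the include-filter property follows the failsafe convention the Bigtop smokes use, and the test class name is an assumption — verify both against your checkout):

```shell
cd bigtop-tests/test-execution/smokes/hbase
mvn verify -Dorg.apache.maven-failsafe-plugin.testInclude='**/TestHBaseSmoke*'
```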


This is a slow but thorough test (it took about 50 minutes on a 10-node cluster with 20MM entries).
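The invocation is the same shape as the quick test, just with a different include filter (the test class name here is a guess at the load test being described; substitute the real one):

```shell
mvn verify -Dorg.apache.maven-failsafe-plugin.testInclude='**/TestLoadAndVerify*'
```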

Glorious victory.
