Database Connectivity Services (DCS)
Prerequisites
- Install Java 1.7 (sudo yum install java-1.7.0-openjdk-devel); alternatively, unpack the tarball into a directory.
- Set JAVA_HOME environment variable (e.g. export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.45.x86_64)
- Maven, as documented on the page Additional Build Tools: Maven
Build
- Download sources: https://github.com/trafodion/dcs
- Run a Maven build with the clean, site, and package targets:
mvn clean site package
Core Trafodion
Prerequisites
The preferred build platform is Red Hat 6.4 or CentOS 6.4.
Linux OS Dependencies
Many required dependencies are available from the standard distribution using yum install.
alsa-lib-devel | ant | ant-nodeps |
boost-devel | device-mapper-multipath | dhcp |
gcc-c++ | gd | glibc-devel.i686 |
graphviz-perl | gzip | java-1.7.0-openjdk-devel |
java-1.6.0-openjdk-devel | libaio-devel | libibcm.i686 |
libibumad-devel | libibumad-devel.i686 | libiodbc |
libiodbc-devel | librdmacm-devel | librdmacm-devel.i686 |
log4cxx | log4cxx-devel | lua-devel |
lzo-minilzo | net-snmp-devel | net-snmp-perl |
openldap-clients | openldap-devel.i686 | openmotif |
openssl-devel.i686 | openssl-static | perl-Config-IniFiles |
perl-DBD-SQLite | perl-Config-Tiny | perl-Expect |
perl-IO-Tty | perl-Math-Calc-Units | perl-Params-Validate |
perl-Parse-RecDescent | perl-TermReadKey | perl-Time-HiRes |
protobuf-compiler | protobuf-devel | python-qpid |
python-qpid-qmf | qpid-cpp-client | qpid-cpp-client-devel |
qpid-cpp-client-ssl | qpid-cpp-server | qpid-cpp-server-ssl |
qpid-qmf | qpid-tools | readline-devel |
saslwrapper | sqlite-devel | tog-Pegasus |
unixODBC | unixODBC-devel | uuid-perl |
xinetd | xerces-c-devel |
To install these packages, run the following command for each one:
yum install <package>
Note 1: The qpid-cpp-client-devel package is not in the latest CentOS distribution, so you may need to enable an earlier repo:
yum --enablerepo=C6.3-updates install qpid-cpp-client-devel
Note 2: Not all packages come standard with RHEL/CentOS; the EPEL repo needs to be downloaded and installed:
wget http://download.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
sudo rpm -Uvh epel-release-6-8.noarch.rpm
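Rather than installing packages one at a time, the whole table can be handed to yum in a single transaction. A minimal sketch, with the package list abbreviated (substitute the full list from the table above):

```shell
# Sketch: build a single yum command from the package table above.
# PKGS is abbreviated here; paste in the full list from the table.
PKGS="gcc-c++ boost-devel log4cxx-devel protobuf-compiler protobuf-devel readline-devel"
CMD="sudo yum install -y $PKGS"
echo "$CMD"
# Run it once you are happy with the list:
# $CMD
```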
Nonstandard Tools
There are several dependencies that are not available in the standard distribution.
Download and build: Additional Build Tools
Hadoop Components
Install Hadoop, HBase, and Hive dependencies in a common location of your choice ($TOOLSDIR). Dependencies for release 0.9.x:
wget http://archive.apache.org/dist/hive/hive-0.13.1/apache-hive-0.13.1-bin.tar.gz
tar -xzf apache-hive-0.13.1-bin.tar.gz -C $TOOLSDIR
wget http://archive-primary.cloudera.com/cdh5/cdh/5/hbase-0.98.1-cdh5.1.0.tar.gz
tar -xzf hbase-0.98.1-cdh5.1.0.tar.gz -C $TOOLSDIR
NOTE: The Hadoop release tarball contains 32-bit native libraries. You must build Hadoop from source for a 64-bit architecture rather than just downloading the release tar file. See: http://wiki.apache.org/hadoop/HowToContribute
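To verify that a build actually produced 64-bit native libraries, you can read the ELF header directly: byte 5 (EI_CLASS) is 01 for 32-bit and 02 for 64-bit. A sketch (the library path in the comment is illustrative):

```shell
# Sketch: report whether an ELF binary is 32- or 64-bit by reading
# EI_CLASS, the fifth byte of the ELF header (01 = 32-bit, 02 = 64-bit).
elf_class() {
  od -An -tx1 -j4 -N1 "$1" | tr -d ' \n'
}

# Example use (path is illustrative; point it at the built library):
# elf_class hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native/libhadoop.so
```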
wget http://archive.apache.org/dist/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz
tar xzf hadoop-2.4.0-src.tar.gz
cd hadoop-2.4.0-src
export JAVA_HOME=...           # path to a 1.7.x JDK
export HADOOP_PROTOC_PATH=...  # path to the protobuf 2.5.0 protoc command
mvn clean install package -Pdist -Pnative -Dtar -DskipTests \
  -Dtomcat.download.url=http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.36/bin/apache-tomcat-6.0.36.tar.gz
Before building Trafodion, be sure to set and export the TOOLSDIR environment variable to the directory containing these components and the additional build tools from the section above.
Alternatively, use the install_local_hadoop script instead. See step 3 under the "Build" heading below.
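Putting it together, the TOOLSDIR setup might look like the sketch below (the ~/tools location is an assumption; the directory names come from the archives above):

```shell
# Sketch: one possible TOOLSDIR setup. The ~/tools location is an
# assumption; any directory works.
export TOOLSDIR=${TOOLSDIR:-$HOME/tools}
mkdir -p "$TOOLSDIR"
# After unpacking the components above, TOOLSDIR would contain:
#   apache-hive-0.13.1-bin/
#   hbase-0.98.1-cdh5.1.0/
#   hadoop-2.4.0/   (built from source, 64-bit)
echo "TOOLSDIR=$TOOLSDIR"
```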
Custom Tool Settings
The location of build dependencies can be customized. See the source code file https://github.com/trafodion/core/blob/master/sqf/LocalSettingsTemplate.sh.
Build
- Get a clone of the git repository (https://github.com/trafodion/core)
- Set up shell environment
cd sqf
source ./sqenv.sh
- Build the software
cd $MY_SQROOT/..
make all
Install Hadoop and Start Trafodion
- (OPTIONAL) Create a sandboxed installation of Hadoop, HBase, Hive, and MySQL for building and testing. If these tools are not installed on your development system, you can install them locally in your workspace.
install_local_hadoop
Note: This script downloads Hadoop and HBase jar files from the internet. To avoid this overhead on future runs, save the downloaded files into a separate directory and set the environment variable MY_LOCAL_SW_DIST to point to that directory. The files to save are $MY_SQROOT/sql/local_hadoop/*.tar.gz and $MY_SQROOT/sql/local_hadoop/tpcds/tpcds_kit.zip.
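The caching described in the note might be scripted like this (the cache location is an assumption; the copy commands are commented out because they assume a prior install_local_hadoop run):

```shell
# Sketch: cache the downloads so future install_local_hadoop runs
# skip the network. The cache location is an assumption.
CACHE=${MY_LOCAL_SW_DIST:-$HOME/local_sw_dist}
mkdir -p "$CACHE"
# After a first run of install_local_hadoop:
# cp $MY_SQROOT/sql/local_hadoop/*.tar.gz            "$CACHE/"
# cp $MY_SQROOT/sql/local_hadoop/tpcds/tpcds_kit.zip "$CACHE/"
export MY_LOCAL_SW_DIST=$CACHE
echo "MY_LOCAL_SW_DIST=$MY_LOCAL_SW_DIST"
```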
- Generate files necessary to start Trafodion
cd $MY_SQROOT
source ./sqenv.sh
sqgen
- Exit your shell and get a new clean shell (note that sqgen edited sqenv.sh)
cd $MY_SQROOT
source ./sqenv.sh
- Make sure you can do "ssh localhost" without having to enter a password
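If "ssh localhost" prompts for a password, generating a passphrase-less key and authorizing it usually fixes that. A sketch of the standard setup:

```shell
# Sketch: enable passwordless ssh to localhost with a passphrase-less
# key. Safe to re-run; an existing key is kept.
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
if [ ! -f "$HOME/.ssh/id_rsa" ]; then
  ssh-keygen -t rsa -N "" -q -f "$HOME/.ssh/id_rsa"
fi
grep -qxF "$(cat "$HOME/.ssh/id_rsa.pub")" "$HOME/.ssh/authorized_keys" 2>/dev/null \
  || cat "$HOME/.ssh/id_rsa.pub" >> "$HOME/.ssh/authorized_keys"
chmod 600 "$HOME/.ssh/authorized_keys"
# Verify (should run without prompting):
# ssh -o BatchMode=yes localhost true && echo ok
```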
- (Sandbox Hadoop option): Bring up your Hadoop/HBase instance if it is not already up.
swstartall
- (Pre-installed Hadoop/HBase option): Update the HBase configuration and restart HBase.
hbase-site.xml:

<property>
  <name>hbase.client.scanner.caching</name>
  <value>100</value>
</property>
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>60000</value>
</property>
<property>
  <name>hbase.coprocessor.region.classes</name>
  <value>
    org.apache.hadoop.hbase.coprocessor.transactional.TrxRegionObserver,
    org.apache.hadoop.hbase.coprocessor.transactional.TrxRegionEndpoint,
    org.apache.hadoop.hbase.coprocessor.AggregateImplementation
  </value>
</property>
<property>
  <name>hbase.hregion.impl</name>
  <value>org.apache.hadoop.hbase.regionserver.transactional.TransactionalRegion</value>
</property>

hbase-env.sh:

export HBASE_CLASSPATH=${HBASE_TRXDIR}/${HBASE_TRX_JAR}
- Start Trafodion
sqstart
sqlci
> initialize trafodion;
- Perform a quick sanity test of the install
sqlci
> set schema trafodion.usr;
> create table t(a integer not null primary key);
> get tables;
> insert into t values (1);
> select * from t;
Notes
- The $MY_SQROOT/sqenv.sh file sources in the file sqenvcom.sh, where most of the Trafodion environment is set up: PATH, CLASSPATH, LD_LIBRARY_PATH, and so on.
- The sqgen command takes CLASSPATH and other environment variables and makes sure that they are used when starting Trafodion processes across the cluster. Therefore, it's very important that the correct CLASSPATH is set up before calling sqgen. Trafodion processes actually use the CLASSPATH that's defined in $MY_SQROOT/etc/ms.env, which should match what you get after sourcing sqenv.sh.
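A quick way to catch a stale environment is to compare the shell's CLASSPATH with the one recorded in ms.env. A sketch; the helper name is made up:

```shell
# Sketch: check whether the shell CLASSPATH matches the one recorded
# in an ms.env-style file. The function name is illustrative.
classpath_matches() {
  recorded=$(sed -n 's/^CLASSPATH=//p' "$1" | head -1)
  [ "$recorded" = "$CLASSPATH" ]
}

# Usage:
# classpath_matches "$MY_SQROOT/etc/ms.env" || echo "CLASSPATH drift; re-run sqgen"
```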
- The install_local_hadoop script copies jar files and executables for a single-node Hadoop install into your source tree: $MY_SQROOT/sql/local_hadoop. If you already have Hadoop running on the system and also want a sandbox version, install the sand-boxed Hadoop on non-standard ports:
install_local_hadoop -p <start-port>
The range <start-port> through <start-port>+199 should be unused ports.
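The port-range requirement can be sanity-checked before running the script; a sketch with an example start port:

```shell
# Sketch: compute the 200-port range install_local_hadoop will use
# and optionally scan it. START is an example value.
START=12000
END=$((START + 199))
echo "ports $START-$END must be unused"

# Optional scan (bash /dev/tcp; a successful connect means in use):
# for p in $(seq $START $END); do
#   (exec 3<>"/dev/tcp/localhost/$p") 2>/dev/null && echo "port $p in use"
# done
```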
- To run the software you built on a cluster, use the "package" make target instead of the "all" target above and use the built tar files to install on the cluster. Generally, most developers run on a single-node cluster, since a multi-node cluster requires more complex steps to deploy the built software. Here is how to modify software and run the modified objects on the local node (note there is no "make install"):
sqstop
# edit source files
cd $MY_SQROOT/..
make all
sqstart
- Shutting down. Shut down Trafodion first, then the sandboxed Hadoop instance. The sw commands (swstartall, swstopall, swstatus) apply only if you are using the sandboxed Hadoop (install_local_hadoop).
sqstop
swstopall
To start it up later, use the following commands:
swstartall
sqstart
To check on the status, use these commands:
sqcheck
swstatus
- If you remove the entire source tree, the local Hadoop installation is lost with it. Before removing these files, make sure to stop Hadoop; the easiest way is the swstopall or swuninstall_local_hadoop script (these are generated scripts in your PATH).