Warning: This page is being obsoleted and is replaced by: http://trafodion.apache.org/build.html
Describes the steps required to build and run Apache Trafodion.
Supported Platforms
Red Hat and CentOS 6.x (6.4 or later) are supported as development and production platforms.
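A build script can enforce this platform requirement up front. The helper below is a sketch of our own (not part of the Trafodion scripts): it extracts the major release number from an `/etc/redhat-release` style string so unsupported platforms can be rejected early.

```shell
# Hypothetical helper: extract the major release number from an
# /etc/redhat-release style string, e.g. "CentOS release 6.4 (Final)" -> "6".
os_major_version() {
  echo "$1" | sed 's/[^0-9]*\([0-9]*\).*/\1/'
}

# Typical use on a real system (assumes /etc/redhat-release exists):
#   [ "$(os_major_version "$(cat /etc/redhat-release)")" = "6" ] || echo "unsupported OS"
```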
Build prerequisites
You need to install the following packages before you can install Apache Trafodion.
```
sudo yum install epel-release
sudo yum install alsa-lib-devel ant ant-nodeps boost-devel cmake \
device-mapper-multipath dhcp flex gcc-c++ gd git glibc-devel \
glibc-devel.i686 graphviz-perl gzip
```
Database Connectivity Services (DCS)
Prerequisites
- Install Java 1.7 (sudo yum install java-1.7.0-openjdk-devel; alternatively, unpack the tarball into a directory).
- Set the JAVA_HOME environment variable (e.g. export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.45.x86_64).
- Maven, as documented on the page Additional Build Tools: Maven
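Before building, it is worth confirming that the JDK on the PATH really is 1.7.x. The helper below is illustrative (the function name is ours): it trims a `java -version` style version string down to its major.minor part for comparison.

```shell
# Hypothetical check: reduce a version string such as "1.7.0_85"
# (as printed by java -version) to its major.minor prefix.
java_major_minor() {
  echo "$1" | cut -d. -f1,2
}

# On a real system you would feed it the live output, e.g.:
#   ver=$(java -version 2>&1 | awk -F'"' '/version/ {print $2}')
#   [ "$(java_major_minor "$ver")" = "1.7" ] || echo "need JDK 1.7.x"
```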
Build
- Download sources: https://github.com/trafodion/dcs
- Run a Maven clean site package command:

```
mvn clean site package
```
Core Trafodion
Prerequisites
The preferred build platform is Red Hat 6.4 or CentOS 6.4.
Linux OS Dependencies
Many required dependencies are available from the standard distribution using yum install.
```
alsa-lib-devel ant ant-nodeps boost-devel device-mapper-multipath
dhcp gcc-c++ gd glibc-devel.i686 graphviz-perl gzip
java-1.7.0-openjdk-devel java-1.6.0-openjdk-devel libaio-devel
libibcm.i686 libibumad-devel libibumad-devel.i686 libX11-devel
libXau-devel libcurl-devel libiodbc libiodbc-devel librdmacm-devel
librdmacm-devel.i686 libxml2-devel log4cxx log4cxx-devel lua-devel
lzo-minilzo net-snmp-devel net-snmp-perl openldap-clients
openldap-devel openldap-devel.i686 openmotif openssl-devel
openssl-devel.i686 openssl-static perl-Config-IniFiles
perl-Config-Tiny perl-DBD-SQLite perl-Expect perl-IO-Tty
perl-Math-Calc-Units perl-Params-Validate perl-Parse-RecDescent
perl-TermReadKey perl-Time-HiRes protobuf-compiler protobuf-devel
python-qpid python-qpid-qmf qpid-cpp-client qpid-cpp-client-devel
qpid-cpp-client-ssl qpid-cpp-server qpid-cpp-server-ssl qpid-qmf
qpid-tools readline-devel saslwrapper sqlite-devel unixODBC
unixODBC-devel uuid-perl wget xerces-c-devel
```
To install these packages, run the following command for each package:

```
yum install <package>
```
Note 1: The qpid-cpp-client-devel package is not in the latest CentOS distribution, so you may need to enable an earlier repo.

```
yum --enablerepo=C6.3-updates install qpid-cpp-client-devel
```

Note 2: Not all packages come standard with RHEL/CentOS; the EPEL repo needs to be downloaded and installed.

```
wget http://download.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
sudo rpm -Uvh epel-release-6-8.noarch.rpm
```
Nonstandard Tools
There are several dependencies that are not available in the standard distribution.
Download and build: Additional Build Tools
Hadoop Components
Install Hadoop, HBase, and Hive dependencies in a common location of your choice ($TOOLSDIR). Dependencies for release 0.9.x:
```
wget http://archive.apache.org/dist/hive/hive-0.13.1/apache-hive-0.13.1-bin.tar.gz
tar xzf apache-hive-0.13.1-bin.tar.gz -C $TOOLSDIR   # creates $TOOLSDIR/apache-hive-0.13.1-bin
wget http://archive-primary.cloudera.com/cdh5/cdh/5/hbase-0.98.1-cdh5.1.0.tar.gz
tar xzf hbase-0.98.1-cdh5.1.0.tar.gz -C $TOOLSDIR    # creates $TOOLSDIR/hbase-0.98.1-cdh5.1.0
```
NOTE: The Hadoop release contains 32-bit libraries. You must build Hadoop from source for a 64-bit architecture rather than just downloading the release tar file. See: http://wiki.apache.org/hadoop/HowToContribute
```
wget http://archive.apache.org/dist/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz
tar xzf hadoop-2.4.0-src.tar.gz
cd hadoop-2.4.0-src
export JAVA_HOME=...           # path to 1.7.x JDK
export HADOOP_PROTOC_PATH=...  # path to protobufs 2.5.0 protoc command
mvn clean install package -Pdist -Pnative -Dtar -DskipTests \
  -Dtomcat.download.url=http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.36/bin/apache-tomcat-6.0.36.tar.gz
```
Before building Trafodion, be sure to set and export the TOOLSDIR environment variable to the directory containing these components and the additional build tools from the section above.
Alternatively, use the install_local_hadoop script; see step 3 under the "Build" heading below.
Custom Tool Settings
The location of build dependencies can be customized. See the source code file https://github.com/trafodion/core/blob/master/sqf/LocalSettingsTemplate.sh.
Build
- Get a clone of the git repository (https://github.com/trafodion/core)
- Set up shell environment
```
cd sqf
source ./sqenv.sh
```
- Build the software
```
cd $MY_SQROOT/..
make all
```
Install Hadoop and Start Trafodion
- Run the installation script:

```
install_local_hadoop
```

Note: This script will download Hadoop and HBase jar files from the internet. To avoid this overhead for future executions of the script, you can save the downloaded files into a separate directory and set the environment variable MY_LOCAL_SW_DIST to point to that directory. The files to save are $MY_SQROOT/sql/local_hadoop/*.tar.gz and $MY_SQROOT/sql/local_hadoop/tpcds/tpcds_kit.zip.

- Generate the Trafodion configuration:

```
cd $MY_SQROOT
source ./sqenv.sh
sqgen
```

- Start the environment:

```
cd $MY_SQROOT
source ./sqenv.sh
swstartall
```
Once installed, check the following.
Java Version
The Java version must be 1.7.x. Check as follows:

```
$ java -version
java version "1.7.0_85"
OpenJDK Runtime Environment (rhel-2.6.1.3.el6_6-x86_64 u85-b01)
OpenJDK 64-Bit Server VM (build 24.85-b03, mixed mode)
```
Ensure that JAVA_HOME exists and points to your JDK installation. By default, Java is located in /usr/lib/jvm/java-<version>.

```
$ echo $JAVA_HOME
$ export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk.x86_64
```
Note: You should export JAVA_HOME in your .bashrc or .profile file.
Verify Trafodion Download
Verify that the Trafodion source has been either:
- Downloaded and unpacked.
- Cloned from GitHub.
If not, please do so now. To clone from GitHub, refer to Contributor Workflow - Code/Docs. Otherwise, download and untar the source tar file from an Apache Trafodion Incubator release.
Install Required Build Tools
Trafodion requires a set of tools to be installed in order to build. Refer to Required Build Tools for instructions. One of the tools that is built, if it does not already exist, is Maven. At this time, verify that Maven is part of your path:

```
mvn --version
```

If it is not found, add it to your PATH:

```
PATH=$PATH:<tool installation directory>/apache-maven-3.3.3/bin
```
Note: You should add Maven to your PATH in your .bashrc or .profile file.
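Since the same .bashrc line may be sourced repeatedly, an idempotent PATH update avoids duplicate entries. The helper below is a sketch of our own (not from the Trafodion scripts): it returns a PATH-style string with a directory appended only if it is not already present.

```shell
# Hypothetical helper: print $1 (a PATH-style string) with directory $2
# appended, unless $2 is already one of its components.
path_with() {
  case ":$1:" in
    *":$2:"*) echo "$1" ;;   # already present: unchanged
    *)        echo "$1:$2" ;;
  esac
}

# Example use in .bashrc:
#   PATH=$(path_with "$PATH" "$HOME/tools/apache-maven-3.3.3/bin")
```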
Build Trafodion
Start a new ssh session. Use the following commands to set up the Trafodion environmental variables.
- <Trafodion source directory> is the source-tree base for Trafodion.
- <tools installation directory> is where the required tools are located. The following example assumes that you installed all the required tools in a single location. If you installed them (or used pre-installed tools) in different directories, export the location of each tool as described in Required Build Tools before sourcing env.sh.
```
cd <Trafodion source directory>
export TOOLSDIR=<tools installation directory>
source ./env.sh
```
Build a debug version of Trafodion using one of the following options:
Command | What It Builds |
---|---|
make all | Trafodion, DCS, and REST. |
make package | Trafodion, DCS, REST, and Client Drivers. |
make package-all | Trafodion, DCS, REST, Client Drivers, and tests for all components. |
If the build fails, rerun the make step. Trafodion downloads many dependencies, and sometimes a download operation fails; rerunning the build generally works.
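Because such failures are usually transient, the rerun can be automated. The wrapper below is our own sketch (not part of the Trafodion makefiles): it reruns a command up to a given number of attempts before giving up.

```shell
# Hypothetical retry wrapper: run "$@" until it succeeds, at most $1 times.
retry() {
  n=$1; shift
  i=1
  while ! "$@"; do
    [ "$i" -ge "$n" ] && return 1   # out of attempts
    i=$((i + 1))
  done
  return 0
}

# Example: retry 3 make all
```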
Verify the build:
```
$ sqvers -u
MY_SQROOT=/home/centos/apache-trafodion-1.3.0-incubating/core/sqf
who@host=centos@mysystem
JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.91.x86_64
SQ_MBTYPE=64d (64-debug)
linux=2.6.32-504.1.3.el6.x86_64
redhat=6.7
NO patches
Most common Apache_Trafodion Release 1.3.0 (Build debug [centos], branch -, date 06Nov15)
UTT count is 1
[6] Release 1.3.0 (Build debug [centos], branch -, date 06Nov15)
    export/lib/hbase-trx-cdh5_3-1.3.0.jar
    export/lib/hbase-trx-hbase_98_4-1.3.0.jar
    export/lib/hbase-trx-hdp2_2-1.3.0.jar
    export/lib/sqmanvers.jar
    export/lib/trafodion-dtm-1.3.0.jar
    export/lib/trafodion-sql-1.3.0.jar
```
The output shows several jar files. The number of files differs based on the version of Trafodion you downloaded.
Setup Test Environment
You should test your installation using either:
- a Trafodion installation on a system that already has a compatible version of Hadoop installed, or
- a local Hadoop environment created by the install_local_hadoop script.
Your installation approach depends on whether you already have Hadoop installed.
Hadoop is Already Installed
Build binary tar files and then install Trafodion following instructions described in Installation.
```
cd <Trafodion source directory>
make package
```

The binary tar files will be created in the <Trafodion source directory>/distribution directory.
Install a Local Hadoop Environment
Local Hadoop prerequisites
Setup Passwordless SSH
Check whether you have passwordless SSH set up.

```
ssh localhost
Last login: Fri Nov  6 22:44:00 2015 from 192.168.1.9
```
If passwordless SSH is not set up, please do so now. The following is an example of setting it up using id_rsa keys; choose the method that best fits your environment.
If you already have an existing set of SSH keys, simply copy both id_rsa.pub and id_rsa to your ~/.ssh directory.
Then, do the following:
```
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/id_rsa
echo "NoHostAuthenticationForLocalhost=yes" >> ~/.ssh/config
chmod go-w ~/.ssh/config
chmod 755 ~/.ssh; chmod 640 ~/.ssh/authorized_keys; cd ~/.ssh; chmod 700 ..
```
If you need to create your keys first, then do the following:
```
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/id_rsa.pub
echo "NoHostAuthenticationForLocalhost=yes" >> ~/.ssh/config
chmod go-w ~/.ssh/config
chmod 755 ~/.ssh; chmod 640 ~/.ssh/authorized_keys; cd ~/.ssh; chmod 700 ..
```
Verify System Limits
Please check that the system limits in your environment are appropriate for Apache Trafodion. If they are not, you will need to increase them; otherwise Trafodion cannot start.

```
$ ulimit -a
core file size          (blocks, -c) 1000000
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 515196
max locked memory       (kbytes, -l) 49595556
max memory size         (kbytes, -m) unlimited
open files                      (-n) 32000
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 267263
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
```
Please refer to this article for information on how to change system limits.
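A quick scripted check can catch an inadequate limit before a failed start. The helper below is illustrative (the name and the example threshold are ours, not official Trafodion minimums): it compares a limit value against a required floor, treating "unlimited" as always sufficient.

```shell
# Hypothetical check: is the current limit ($1, a number or "unlimited")
# at least the required minimum ($2)?
limit_ok() {
  [ "$1" = "unlimited" ] && return 0
  [ "$1" -ge "$2" ]
}

# On a live system, e.g.:
#   limit_ok "$(ulimit -n)" 32000 || echo "raise the open-files limit"
```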
Run install_local_hadoop
The install_local_hadoop script downloads compatible versions of Hadoop, HBase, Hive, and MySQL. Then, it starts Trafodion.
Tip: install_local_hadoop downloads Hadoop, HBase, Hive, and MySQL jar files from the internet. To avoid this overhead, you can download the required files into a separate directory and set the environment variable MY_LOCAL_SW_DIST to point to this directory.
Command | What It Does |
---|---|
install_local_hadoop | Uses default ports for all services.
install_local_hadoop -p fromDisplay | Starts with a port-number range determined from the DISPLAY environment variable.
install_local_hadoop -p rand | Starts with a random port-number range between 9000 and 49000.
install_local_hadoop -p <port> | Starts with the specified port number.
For a list of ports that get configured and their default values, see Configure Ports for a Firewall.
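To illustrate the idea behind "-p fromDisplay": a per-user port base can be derived from the X display number so that developers sharing a machine do not collide. The formula below is our guess at the concept, not the script's actual computation.

```shell
# Hypothetical sketch: derive a port base from a DISPLAY value such as
# ":2.0" -> display number 2 -> base port 9000 + 2*200 = 9400,
# giving each display its own 200-port range.
port_base_from_display() {
  d=$(echo "$1" | sed 's/^:\([0-9]*\).*/\1/')
  echo $((9000 + d * 200))
}
```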
Sample Procedure
```
# Ensure that the Trafodion environmental variables have been loaded.
cd <Trafodion source directory>
source ./env.sh
```
Install the Hadoop software.
```
cd $MY_SQROOT/sql/scripts
install_local_hadoop
./install_traf_components
```
Verify installation.
```
$ swstatus
6 java servers and 2 mysqld processes are running
713 NameNode
19513 HMaster
1003 SecondaryNameNode
838 DataNode
1173 ResourceManager
1298 NodeManager
```
Six Java servers (as shown above) and two mysqld processes should be running.
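That count can be checked mechanically. The helper below is a sketch of our own (not part of the Trafodion scripts): it counts the jps-style "pid ClassName" lines in swstatus output, skipping the one-line summary at the top.

```shell
# Hypothetical helper: count lines of the form "<pid> <ClassName>" on
# stdin; the "$" anchor excludes the summary line, whose text continues
# past the first word.
count_java_servers() {
  grep -c '^[0-9][0-9]* [A-Za-z][A-Za-z]*$'
}

# Example: swstatus | count_java_servers   # should print 6 when healthy
```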
Manage Hadoop Environment
Use the following commands to manage the Hadoop environment.
Command | Usage |
---|---|
swstartall | Starts the complete Hadoop environment.
swstopall | Stops the complete Hadoop environment.
swstatus | Checks the status of the Hadoop environment.
swuninstall_local_hadoop | Removes the Hadoop installation.
Run Trafodion
This section describes how to start Trafodion and run operations.
Each Time New Source is Downloaded
You need to do the following each time you download new source code.
```
cd <Trafodion source directory>
source ./env.sh
cd $MY_SQROOT/etc
# delete ms.env, if it exists
rm -f ms.env
cd $MY_SQROOT/sql/scripts
sqgen
```
Start Trafodion
Do the following to start the Trafodion environment.
```
cd $MY_SQROOT/sql/scripts
sqstart
sqcheck
```
Management Scripts
Component | Start | Stop | Status |
---|---|---|---|
All of Trafodion | sqstart | sqstop | sqcheck |
DCS (Database Connectivity Services) | dcstart | dcsstop | dcscheck |
REST Server | reststart | reststop | |
LOB Server | lobstart | lobstop | |
RMS Server | rmsstart | rmsstop | rmscheck |
Create Trafodion Metadata
```
# Ensure that the Trafodion environmental variables have been loaded.
cd <Trafodion source directory>
source ./env.sh
```
Assumption: Trafodion is up and running.
Use sqlci to create the Trafodion metadata.
```
$ sqlci
>> initialize trafodion;
.
.
.
>> exit;
$
```
Validate Your Installation
You can use sqlci or trafci (connects via DCS) to validate your installation.
```
get schemas;
create table table1 (a int);
invoke table1;
insert into table1 values (1), (2), (3), (4);
select * from table1;
exit;
```
Assuming no errors, your installation has been successful. You can start working on your modifications.
Troubleshooting Notes
If you are not able to start the environment, or if there are problems running sqlci or trafci, verify that all the processes are up and running.
- swstatus should show 6 Java servers and 2 mysqld processes.
- sqcheck should indicate all processes are running.
If processes are not running as expected, then:
- sqstop to shut down Trafodion. If some Trafodion processes do not terminate cleanly, run ckillall.
- swstopall to shut down the Hadoop ecosystem.
- swstartall to restart the Hadoop ecosystem.
- sqstart to restart Trafodion.
If problems persist, please review the logs:
- $MY_SQROOT/sql/local_hadoop/*/log: Hadoop, HBase, and Hive logs.
- $MY_SQROOT/logs: Trafodion logs.
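When skimming those directories, a small filter helps surface the interesting lines. The helper below is illustrative (the name and severity keywords are our assumptions about typical log contents): it prints numbered lines that look like errors from one log file.

```shell
# Hypothetical triage helper: print numbered ERROR/FATAL lines from the
# log file given as $1; "|| true" keeps a clean exit when nothing matches.
error_lines() {
  grep -n -E 'ERROR|FATAL' "$1" || true
}

# Example: for f in $MY_SQROOT/logs/*.log; do error_lines "$f"; done
```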
Notes
To use a specific port range:

```
install_local_hadoop -p <start-port>
```

<start-port> ... <start-port>+199 should be a range of unused ports.

To rebuild after editing source files:

```
sqstop
# edit source files
cd $MY_SQROOT/..
make all
sqstart
```

To shut down the full environment:

```
sqstop
swstopall
```

To start it up later, use the following commands:

```
swstartall
sqstart
```

To check on the status, use these commands:

```
sqcheck
swstatus
```