...

Some of the test frameworks use Python. For example, see incubator-trafodion/dcs/src/test/pytests/README.rst

...

 

Your installation approach depends on whether you already have installed Hadoop.

Anchor
create-test-pre-installed-hadoop
create-test-pre-installed-hadoop

Pre-Installed Hadoop

Info
titleNOTE

Currently, Trafodion requires:

  • Operating System:
    • 64-bit Red Hat Enterprise Linux (RHEL) 6.5, 6.6, and 6.7
    • SUSE SLES 11.3
  • Hadoop:
    • Cloudera CDH 5.2+
    • Hortonworks HDP 2.2+
    • Apache HBase 1.0+


Build Binary tar Files

Build the Trafodion binary tar files.

 

Code Block
languagebash
titleExample: Build Trafodion Binary tar Files
cd <Trafodion source directory>
make package-all

 

For more information, refer to Build Trafodion Components in the Build Source chapter.

 

Install Trafodion

The binary artifacts built above include the following tar files and RPMs. Note that the Ambari server RPM is located under the RH* directory; it varies depending on the Red Hat version, so there is one RPM for each Red Hat version that we support. These binaries are needed to perform the install using either the Ambari install or the Python Installer. The details of how to use these installers are described in the Apache Trafodion Provisioning Guide.

Code Block
languagebash
titleExample: List Trafodion Binary Artifacts
 
<Trafodion home directory>/distribution> ls
apache-trafodion_clients-2.1.0-RH6-x86_64-debug.tar.gz
apache-trafodion_installer-2.1.0-incubating.tar.gz
apache-trafodion_pyinstaller-2.1.0-incubating.tar.gz
apache-trafodion-regress.tgz
apache-trafodion_server-2.1.0-RH6-x86_64-debug.tar.gz
dcs-tests.tgz
phoenix-tests.tgz
RH6
traf_ambari-2.1.0-1.noarch.rpm
<Trafodion home directory>/distribution/RH6 > ls
apache-trafodion_server-2.1.0-devel.x86_64.rpm


Anchor
create-test-local-hadoop
create-test-local-hadoop

Local Hadoop

Use the following instructions to install a local Hadoop environment based on Cloudera archives.

 

Run install_local_hadoop

The install_local_hadoop script downloads compatible versions of Hadoop, HBase, Hive, and MySQL based on the Cloudera archives. Then, it starts those services.

Info
titleTIP

install_local_hadoop downloads Hadoop, HBase, Hive, and MySQL jar files from the Internet. To avoid this overhead, you can download the required files into a separate directory and set the environment variable MY_LOCAL_SW_DIST to point to this directory.

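The TIP above can be sketched as follows. This is a minimal example assuming install_local_hadoop checks the directory named by MY_LOCAL_SW_DIST for already-downloaded archives before fetching them; the cache directory name is hypothetical.

```shell
# Hypothetical cache directory; any path works. install_local_hadoop is
# assumed to look in MY_LOCAL_SW_DIST for previously downloaded archives
# before fetching them from the Internet.
mkdir -p "$HOME/local_sw_dist"

# Copy previously downloaded Hadoop, HBase, Hive, and MySQL archives
# into the cache directory before running install_local_hadoop, e.g.:
#   cp ~/downloads/hadoop-*.tar.gz "$HOME/local_sw_dist/"

# Point the installer at the cache.
export MY_LOCAL_SW_DIST="$HOME/local_sw_dist"
echo "MY_LOCAL_SW_DIST=$MY_LOCAL_SW_DIST"
```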

Command                                Usage
install_local_hadoop                   Uses default ports for all services.
install_local_hadoop -p fromDisplay    Start Hadoop with a port number range determined from the DISPLAY environment variable.
install_local_hadoop -p rand           Start with a random port number range between 9000 and 49000.
install_local_hadoop -p <port>         Start with the specified port number.
For a list of ports that get configured and their default values, please refer to Port Assignments on the Trafodion web site.

Sample Procedure

  1. Start a new ssh session and ensure that the Trafodion environment variables are loaded.

    Code Block
    languagebash
    cd <Trafodion source directory>
    source ./env.sh
    Code Block
    languagebash
    titleExample: Load Environment Variables
    cd mysource/incubator-trafodion
    source ./env.sh
  2. Install the Hadoop software.

    Code Block
    languagebash
    cd $TRAF_HOME/sql/scripts
    install_local_hadoop
    ./install_traf_components
    Code Block
    languagebash
    titleExample: Install Hadoop Software
    $ cd $TRAF_HOME/sql/scripts
    $ install_local_hadoop
    Checking for existing Hadoop processes...
    The testware tpcds_kit.zip does not exist and will not be installed
    This testware is needed to run developer HIVE regression tests
    .
    .
    .
    Installed directory size and name = 2.7G        /home/trafdeveloper/mysource/incubator-trafodion/core/sqf/sql/local_hadoop
    
    Setup is complete. You can use the convenience scripts starting with sw... located in /home/trafdeveloper/mysource/incubator-trafodion/core/sqf/sql/scripts.
    $ ./install_traf_components
    
    Installing and configuring DCS, REST, TRAFCI & Phoenix tests for Trafodion...
    
    Environment used for core, DCS, REST and Phonenix ...
    .
    .
    .
    Configuration scripts for DCS, REST, TRAFCI and Phoenix test are set up
    
       Open a new session and start Trafodion by executing sqgen and sqstart scripts
    $
  3. Verify installation.

    Code Block
    languagebash
    $ swstatus
    6 java servers and 2 mysqld processes are running
    713   NameNode
    19513 HMaster
    1003  SecondaryNameNode
    838   DataNode
    1173  ResourceManager
    1298  NodeManager

    The following Java servers should be running:

    • NodeManager (Yarn)

    • ResourceManager (Yarn)

    • NameNode (HDFS)

    • SecondaryNameNode (HDFS)

    • DataNode (HDFS)

    • HMaster (HBase)

      In addition, 2 mysqld processes should be running.

      Info

      The Hadoop environment is installed in $MY_SQROOT/sql/local_hadoop. The log files are located in $MY_SQROOT/sql/local_hadoop/log and in the Hadoop components' log directories. If a service did not start, look in the associated log files for the issue and search for the error string for possible solutions; these are Hadoop issues and are therefore outside the scope of this guide.
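      That log search can be sketched as follows. To keep the example self-contained, a temporary directory with a sample log stands in for $MY_SQROOT/sql/local_hadoop/log; on a real install, set LOGDIR to that path instead.

```shell
# Stand-in for $MY_SQROOT/sql/local_hadoop/log; on a real install,
# set LOGDIR to that path instead of a temporary directory.
LOGDIR="$(mktemp -d)"
printf '2016-03-01 INFO  HMaster starting\n2016-03-01 ERROR Address already in use\n' \
  > "$LOGDIR/hmaster.log"

# List the log files that contain an error string, then show the lines.
grep -l "ERROR" "$LOGDIR"/*.log
grep -h "ERROR" "$LOGDIR"/*.log
```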

  4. Run sqgen, which creates the Trafodion configuration files. Do the following:
    1. Start a new terminal window for this step.

    2. Do the following:

      Code Block
      languagebash
      cd <Trafodion source directory>
      source ./env.sh
      cd $MY_SQROOT/etc
      # delete ms.env, if it exists
      rm ms.env
      cd $MY_SQROOT/sql/scripts
      sqgen
      Code Block
      languagebash
      titleExample: Run sqgen
      $ cd mysource/incubator-trafodion
      $ source ./env.sh
      $ cd $MY_SQROOT/etc
      # delete ms.env, if it exists
      $ rm ms.env
      $ cd $MY_SQROOT/sql/scripts
      $ sqgen
      Workstation environment - Not a clustered environment
      
      Generating SQ environment variable file: /home/trafdeveloper/mysource/incubator-trafodion/core/sqf/etc/ms.env
      
      Note: Using cluster.conf format type 2.
      .
      .
      .
      ***********************************************************
       Updating Authentication Configuration
      ***********************************************************
      Creating folders for storing certificates
      
      $
  5. Start Trafodion.

    Code Block
    languagebash
    titleExample: sqstart
    $ sqstart
    Checking orphan processes.
    Removing old mpijob* files from /home/trafdeveloper/mysource/incubator-trafodion/core/sqf/tmp
    
    Removing old monitor.port* files from /home/trafdeveloper/mysource/incubator-trafodion/core/sqf/tmp
    
    Executing sqipcrm (output to sqipcrm.out)
    Starting the SQ Environment (Executing /home/trafdeveloper/mysource/incubator-trafodion/core/sqf/sql/scripts/gomon.cold)
    Background SQ Startup job (pid: 15755)
    .
    .
    .
    You can monitor the SQ shell log file : /home/trafdeveloper/mysource/incubator-trafodion/core/sqf/logs/sqmon.log
    
    Startup time  0 hour(s) 0 minute(s) 59 second(s)
    $
  6. Initialize Trafodion.

The final step is to initialize the Trafodion database.

Code Block
languagebash
titleExample: Initialize Trafodion
$ sqlci
Apache Trafodion Conversational Interface 1.3.0
Copyright (c) 2015 Apache Software Foundation
>>initialize trafodion;

--- SQL operation complete.
>> exit;

End of MXCI Session

$

 

Next Steps

 

...