
HiveServer2

HiveServer2 (HS2) is a server interface that enables remote clients to execute queries against Hive and retrieve the results. The current implementation, based on Thrift RPC, is an improved version of HiveServer and supports multi-client concurrency and authentication. It is designed to provide better support for open API clients like JDBC and ODBC.

This document describes how to set up the server. How to use a client with this server is described in the HiveServer2 Clients document.

Version

Introduced in Hive version 0.11. See HIVE-2935.

How to Configure

Configuration Properties in the hive-site.xml File

hive.server2.thrift.min.worker.threads – Minimum number of worker threads, default 5.

hive.server2.thrift.max.worker.threads – Maximum number of worker threads, default 500.

hive.server2.thrift.port – TCP port number to listen on, default 10000.

hive.server2.thrift.bind.host – TCP interface to bind to.
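
For illustration, the hive-site.xml entries for these properties could look like the sketch below; the host name is a placeholder and the other value is the default listed above.

<property>
  <name>hive.server2.thrift.port</name>
  <value>10000</value>
</property>
<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>hs2-host.example.com</value>
</property>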

Optional Environment Settings

HIVE_SERVER2_THRIFT_BIND_HOST – Optional TCP host interface to bind to. Overrides the configuration file setting.

HIVE_SERVER2_THRIFT_PORT – Optional TCP port number to listen on, default 10000. Overrides the configuration file setting.
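
For example, these variables can be exported in the shell before starting HiveServer2 (the host name below is a placeholder):

export HIVE_SERVER2_THRIFT_BIND_HOST=hs2-host.example.com
export HIVE_SERVER2_THRIFT_PORT=10000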

Running in HTTP mode

Starting with Hive 0.13, HiveServer2 provides support for sending Thrift RPC messages over HTTP transport (HIVE-4752). This is particularly useful to support a proxying intermediary between the client and the server (for example, for load balancing or for security reasons). Currently, you can run HiveServer2 in either TCP mode or HTTP mode, but not in both. For the corresponding JDBC URL format, see the JDBC URL section of the HiveServer2 Clients document. Use the following settings to enable HTTP mode (an example configuration follows the list):

hive.server2.transport.mode – Set this to http.

Optional Configuration Properties

hive.server2.thrift.http.port – HTTP port number to listen on; default is 10001.

hive.server2.thrift.http.path – The service endpoint; default is cliservice.

hive.server2.thrift.http.min.worker.threads – Minimum worker threads in the server pool; default is 5.

hive.server2.thrift.http.max.worker.threads – Maximum worker threads in the server pool; default is 500.
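
As a sketch, the hive-site.xml entries for HTTP mode might look like this; only the transport mode is required, and the port and path shown are the defaults.

<property>
  <name>hive.server2.transport.mode</name>
  <value>http</value>
</property>
<property>
  <name>hive.server2.thrift.http.port</name>
  <value>10001</value>
</property>
<property>
  <name>hive.server2.thrift.http.path</name>
  <value>cliservice</value>
</property>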

How to Start

$HIVE_HOME/bin/hiveserver2

OR

$HIVE_HOME/bin/hive --service hiveserver2

Usage Message

The -H or --help option displays a usage message, for example:

$HIVE_HOME/bin/hive --service hiveserver2 -H
Starting HiveServer2
usage: hiveserver2
 -H,--help                        Print help information
    --hiveconf <property=value>   Use value for given property
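
Configuration properties can also be overridden at startup with --hiveconf. For example, to listen on a non-default port (10001 is just an illustrative value):

$HIVE_HOME/bin/hive --service hiveserver2 --hiveconf hive.server2.thrift.port=10001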

Authentication/Security Configuration

HiveServer2 supports Anonymous (no authentication), Kerberos, pass-through LDAP, Pluggable Custom Authentication, and Pluggable Authentication Modules (PAM, supported in Hive 0.13 onwards).

Configuration

hive.server2.authentication – Authentication mode, default NONE. Options are NONE, KERBEROS, LDAP, PAM and CUSTOM.

hive.server2.authentication.kerberos.principal – Kerberos principal for server.

hive.server2.authentication.kerberos.keytab – Keytab for server principal.

hive.server2.authentication.ldap.url – LDAP url.

hive.server2.authentication.ldap.baseDN – LDAP base DN.

hive.server2.custom.authentication.class – Custom authentication class that implements org.apache.hive.service.auth.PasswdAuthenticationProvider interface.
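
As an example, a sketch of an LDAP setup in hive-site.xml might look like the following; the LDAP URL and base DN are placeholders for your directory server.

<property>
  <name>hive.server2.authentication</name>
  <value>LDAP</value>
</property>
<property>
  <name>hive.server2.authentication.ldap.url</name>
  <value>ldap://ldap.example.com</value>
</property>
<property>
  <name>hive.server2.authentication.ldap.baseDN</name>
  <value>ou=People,dc=example,dc=com</value>
</property>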

Impersonation

By default, HiveServer2 performs query processing as the user who submitted the query. If the following parameter is set to false, however, the query will run as the user that the hiveserver2 process runs as.

hive.server2.enable.doAs – Impersonate the connected user, default true.

To prevent memory leaks when running in insecure mode, disable the file system caches by setting the following parameters to true:

fs.hdfs.impl.disable.cache – Disable HDFS filesystem cache, default false.

fs.file.impl.disable.cache – Disable local filesystem cache, default false.
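
For example, the corresponding configuration entries would be:

<property>
  <name>fs.hdfs.impl.disable.cache</name>
  <value>true</value>
</property>
<property>
  <name>fs.file.impl.disable.cache</name>
  <value>true</value>
</property>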

Integrity/Confidentiality Protection

Changes in HIVE-4911, which is available in Hive 0.12, enable integrity protection and confidentiality protection (beyond just the default of authentication) for communication between the Hive JDBC driver and HiveServer2. You can use the SASL QOP property to configure this.

  • This applies only when Kerberos is used for HS2 client (JDBC/ODBC application) authentication with HiveServer2.
  • hive.server2.thrift.sasl.qop in hive-site.xml has to be set to one of the valid QOP values ('auth', 'auth-int' or 'auth-conf'), as in the example below.
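
For example, to request both integrity and confidentiality protection, the entry in hive-site.xml would be along these lines:

<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth-conf</value>
</property>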

SSL Encryption

Changes in HIVE-5351, which will be available in Hive 0.13, provide support for SSL encryption. To enable SSL, set the following configurations in hive-site.xml:

hive.server2.use.SSL – Set this to true.

hive.server2.keystore.path – Set this to your keystore path.

hive.server2.keystore.password – Set this to your keystore password.
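
A sketch of the corresponding hive-site.xml entries follows; the keystore path and password are placeholders for your own keystore.

<property>
  <name>hive.server2.use.SSL</name>
  <value>true</value>
</property>
<property>
  <name>hive.server2.keystore.path</name>
  <value>/path/to/keystore.jks</value>
</property>
<property>
  <name>hive.server2.keystore.password</name>
  <value>keystore-password</value>
</property>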

Pluggable Authentication Modules (PAM)

HIVE-6466, which will be available in Hive 0.13, provides support for PAM. To configure PAM:

  • Download the JPAM native library for the relevant architecture.
  • Unzip and copy libjpam.so to a directory (<libjpam-directory>) on the system.
  • Add the directory to the LD_LIBRARY_PATH environment variable like so: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<libjpam-directory>

Finally, set the following configurations in hive-site.xml:

hive.server2.authentication – Set this to PAM.

hive.server2.authentication.pam.services – Set this to a list of comma separated PAM services that will be used. Note that a file with the same name as the PAM service must exist in /etc/pam.d.
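
For example, to authenticate against the login and sshd PAM services (example service names; files with these names must exist in /etc/pam.d), hive-site.xml might contain:

<property>
  <name>hive.server2.authentication</name>
  <value>PAM</value>
</property>
<property>
  <name>hive.server2.authentication.pam.services</name>
  <value>login,sshd</value>
</property>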

Python Client Driver

A Python client driver for HiveServer2 is available at https://github.com/BradRuderman/pyhs2 (thanks, Brad). It includes all the required packages such as SASL and Thrift wrappers.

To use the pyhs2 driver:

pip install pyhs2

and then:

import pyhs2

conn = pyhs2.connect(host='localhost',
                     port=10000,
                     authMechanism="PLAIN",
                     user='root',
                     password='test',
                     database='default')
cur = conn.cursor()
cur.execute("show tables")
for i in cur.fetch():
    print i
cur.close()
conn.close()

You can discuss this driver on the user@hive.apache.org mailing list.
