...
- As of now, only the hive/spark branch works against Spark: https://github.com/apache/hive/tree/spark. Build the Hive assembly from this branch as described in the Hive Developer FAQ: https://cwiki.apache.org/confluence/display/Hive/HiveDeveloperFAQ.
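The checkout and build might look like the following sketch, assuming a standard Maven-based Hive build (the exact goals and profiles should be verified against the Hive Developer FAQ linked above):

```shell
# Clone the Hive repository and switch to the spark branch
git clone https://github.com/apache/hive.git
cd hive
git checkout spark
# Build the Hive packages; -DskipTests shortens the build considerably
mvn clean package -DskipTests
```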
Start Hive and add the spark-assembly jar to the Hive auxpath:
Code Block
hive --auxpath /location/to/spark-assembly-spark_version-hadoop_version.jar
Configure the Hive execution engine to run on Spark:
Code Block
hive> set hive.execution.engine=spark;
Configure the required Spark properties for the application. See Spark's configuration guide: http://spark.apache.org/docs/latest/configuration.html. This can be done either by adding a file "spark-defaults.conf" to the Hive classpath, or by setting them as regular Hive properties:
Code Block
hive> set spark.master=<spark master URL>;
hive> set spark.eventLog.enabled=true;
hive> set spark.executor.memory=512m;
hive> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
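Equivalently, the same properties can be placed in a "spark-defaults.conf" file on the Hive classpath; a minimal sketch (the master URL remains a placeholder to fill in for your cluster):

Code Block
spark.master            <spark master URL>
spark.eventLog.enabled  true
spark.executor.memory   512m
spark.serializer        org.apache.spark.serializer.KryoSerializer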
...