Compiling Spark: [thriftServer.sh is still at the testing stage; hive-0.13.1]


Note: Spark 1.2 has since been released, so this article is now only useful as an installation reference; there is no longer any need to compile Spark yourself.


vi sql/hive/pom.xml — add the following dependency so the Hive module can read Parquet tables:
<dependency>
    <groupId>com.twitter</groupId>
    <artifactId>parquet-hive-bundle</artifactId>
    <version>1.5.0</version>
</dependency>
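
As an optional sanity check (not in the original notes), Maven can be asked to resolve the Hive module's dependency tree before running the full build; -am also builds the sibling modules the Hive module depends on:
mvn -pl sql/hive -am dependency:tree | grep parquet-hive-bundle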


Setting up Maven’s Memory Usage
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
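
Optionally (my addition, not part of the original notes), the setting can be made permanent by appending it to the shell profile:
echo 'export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"' >> ~/.bashrc
source ~/.bashrc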


Compile:
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive-0.13.1 -Dhive.version=0.13.1 -Phbase-0.98.7 -Dhbase.version=0.98.7 -DskipTests clean package
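
If the build succeeds, the assembly jar ends up under the assembly module; the exact name depends on the Spark and Hadoop versions, so the path below is indicative (Scala 2.10 was the default at the time):
ls assembly/target/scala-2.10/spark-assembly-*-hadoop2.4.0.jar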


Build the deployment package:


[Modify the part of the script that sets SPARK_HIVE: change <id>hive</id> to the Hive profile you actually build with, otherwise the generated package will not include Hive support.]
https://github.com/apache/spark/blob/master/make-distribution.sh 
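
The manual change described above can also be made with a one-liner; the target profile id hive-0.13.1 is my assumption, chosen to match the -Phive-0.13.1 flag used in the build commands:
sed -i 's#<id>hive</id>#<id>hive-0.13.1</id>#' make-distribution.sh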


./make-distribution.sh --tgz -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive-0.13.1 -Dhive.version=0.13.1 -Phbase-0.98.7 -Dhbase.version=0.98.7
[equivalent to ==> mvn clean package -DskipTests -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive-0.13.1 -Dhive.version=0.13.1 -Phbase-0.98.7 -Dhbase.version=0.98.7]
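
make-distribution.sh --tgz drops a tarball at the top of the source tree (the exact name depends on the version string, so the pattern below is indicative); unpack it to the install path used in the rest of these notes:
tar -zxf spark-*-bin-*.tgz -C /usr/local/
mv /usr/local/spark-*-bin-* /usr/local/spark-1.2   # rename to match the paths used below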


install:
Edit spark-env.sh and add the following:
export JAVA_HOME=/usr/local/jdk1.7.0_45
export SCALA_HOME=/usr/local/scala
export HIVE_CONF_DIR=/usr/local/hive-0.12/conf
export CLASSPATH=$CLASSPATH:/usr/local/hive-0.12/lib
export HADOOP_CONF_DIR=/usr/local/hadoop-2.5.1/etc/hadoop
export SPARK_MASTER_IP=hadoop0
export SPARK_WORKER_MEMORY=2g
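
spark-env.sh is created from the bundled template, and the same file (together with the worker list in conf/slaves) must be present on every node; a minimal sequence, where the worker hostname hadoop1 is only an example:
cd /usr/local/spark-1.2/conf
cp spark-env.sh.template spark-env.sh   # then append the exports above
cp slaves.template slaves               # list one worker hostname per line, e.g. hadoop1
scp spark-env.sh slaves hadoop1:/usr/local/spark-1.2/conf/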


start/stop cluster:
start-all.sh
stop-all.sh
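
Both scripts are the ones under sbin/ in the Spark install (not Hadoop's scripts of the same name); after starting, jps should show a Master process on hadoop0 and a Worker process on every node listed in conf/slaves:
/usr/local/spark-1.2/sbin/start-all.sh
jps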


JVM:
maximum heap (-Xmx) / minimum heap (-Xms) / young generation size (-Xmn), with GC logging enabled:
-Xmx60m -Xms20m -Xmn7m -XX:+PrintGCDetails
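
These are plain JVM flags; for a Spark job, the executor heap is set with --executor-memory (or spark.executor.memory), while GC flags such as -XX:+PrintGCDetails can be passed through spark.executor.extraJavaOptions. A sketch, reusing the job from the spark-submit section below:
./bin/spark-submit --master spark://itr-mastertest01:7077 \
  --executor-memory 2g \
  --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails" \
  --class com.itweet.spark.Analysis \
  /program/spark-1.0-project-1.0.jar \
  hdfs://itr-mastertest01:9000/labs/docword hdfs://itr-mastertest01:9000/wc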




spark-submit:
https://spark.apache.org/docs/latest/submitting-applications.html
./bin/spark-submit \
  --name SparkAnalysis \
  --class com.itweet.spark.Analysis \
  --master spark://itr-mastertest01:7077 \
  --executor-memory 20G \
  --total-executor-cores 20 \
  /program/spark-1.0-project-1.0.jar \
  hdfs://itr-mastertest01:9000/labs/docword \
  hdfs://itr-mastertest01:9000/wc
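
The last two arguments are the application's own input and output paths; after the job finishes, the result can be checked with the HDFS client (assuming the job writes the usual part-* files):
hdfs dfs -ls hdfs://itr-mastertest01:9000/wc
hdfs dfs -cat hdfs://itr-mastertest01:9000/wc/part-* | head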


spark-sql:
/usr/local/spark-1.2/bin/spark-sql --master spark://itr-mastertest01:7077 --executor-memory 512M --total-executor-cores 2
spark-shell:
MASTER=spark://itr-mastertest01:7077 /usr/local/spark-1.2/bin/spark-shell
spark-thriftserver:
/usr/local/spark-1.2/sbin/start-thriftserver.sh --master spark://itr-mastertest01:7077
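
Once the thrift server is up (it listens on port 10000 by default), it can be tested with the beeline client shipped in Spark's bin/ directory; replace the host with wherever start-thriftserver.sh was launched. This connection string is my example, not from the original notes:
/usr/local/spark-1.2/bin/beeline -u jdbc:hive2://itr-mastertest01:10000 -e "show tables;"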