Windows + IDEA + Spark + debug: setting up a Spark development and debugging environment on Windows

Steps:

1. Before setting up the environment, write a demo program
2. Install and configure JDK 1.8
3. Install and configure Scala 2.11.8
4. Import the dependency jars declared in pom.xml
5. Download the Hadoop bin package and set the HADOOP_HOME environment variable to the extracted directory
6. Download winutils.exe and place it under $HADOOP_HOME/bin/
7. In the run configuration, set the program's master to local
8. Launch
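The demo from step 1 can be sketched as a minimal word count. This is an illustrative sketch, not the original author's code: it assumes the pom dependencies below are on the classpath and that winutils.exe sits under HADOOP_HOME/bin as described in steps 5 and 6; the fallback path `C:\hadoop` is a hypothetical placeholder.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Demo {
  def main(args: Array[String]): Unit = {
    // Tell Hadoop's Windows shims where winutils.exe lives.
    // Falls back to a placeholder path if HADOOP_HOME is not set.
    System.setProperty("hadoop.home.dir",
      sys.env.getOrElse("HADOOP_HOME", "C:\\hadoop"))

    // "local[*]" runs Spark inside the IDE, one worker thread per core
    // (this is the "local" master from step 7).
    val conf = new SparkConf().setAppName("demo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Count word occurrences in a tiny in-memory dataset.
    val counts = sc.parallelize(Seq("spark", "idea", "spark"))
      .map(w => (w, 1))
      .reduceByKey(_ + _)
      .collect()

    counts.foreach(println)
    sc.stop()
  }
}
```

Running this from IDEA with the run configuration's master set to local verifies the whole toolchain (JDK, Scala, Spark jars, winutils) before any cluster is involved.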
<properties>
    <scala.version>2.11</scala.version>
    <spark.version>2.0.1</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
