Creating Maven (Java) and Scala Projects in IntelliJ IDEA on Linux


Maven Project

1. Create a new Maven project, which will be used to write a Spark WordCount program in Java.


2. In the dialog that appears, select Maven in the left-hand panel, set the Project SDK to the JDK you installed, and click Next.

3. Fill in the GroupId and ArtifactId, then click Next.

4. Fill in the project name, choose where to store the project, and click Next to continue.

5. In the new Maven project, open the pom.xml file and add the configuration below (the dependencies); the build section contains the configuration Maven needs to package the project (originally written for Maven packaging under Eclipse).

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.6.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.6.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.6.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.5.0-cdh5.3.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.6.1</version>
        </dependency>
    </dependencies>
    <build>
        <sourceDirectory>src/main/java</sourceDirectory>
        <testSourceDirectory>src/main/test</testSourceDirectory>
        <plugins>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <mainClass></mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.2.1</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>exec</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <executable>java</executable>
                    <includeProjectDependencies>true</includeProjectDependencies>
                    <includePluginDependencies>false</includePluginDependencies>
                    <classpathScope>compile</classpathScope>
                    <mainClass>cn.spark.sparktest.App</mainClass>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.6</source>
                    <target>1.6</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
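With these dependencies in place, the maven-assembly-plugin bound to the package phase lets `mvn clean package` produce a jar-with-dependencies; fill in the empty <mainClass> element with your own main class first (the cn.spark.sparktest.App referenced by the exec-maven-plugin is only the article's placeholder).

Below is a minimal sketch of the Java WordCount program mentioned in step 1, written against the Spark 1.6 Java API declared above. The package, class name, and input/output paths are illustrative assumptions, not values from the original article.

    package cn.spark.sparktest;

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.FlatMapFunction;
    import org.apache.spark.api.java.function.Function2;
    import org.apache.spark.api.java.function.PairFunction;

    import scala.Tuple2;

    public class WordCount {
        public static void main(String[] args) {
            // Run locally; pass a real master URL when submitting to a cluster.
            SparkConf conf = new SparkConf().setAppName("WordCount").setMaster("local");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Input path is an example only.
            JavaRDD<String> lines = sc.textFile("input.txt");

            // Split each line into words (anonymous classes, since the pom targets Java 1.6).
            JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
                public Iterable<String> call(String line) {
                    return Arrays.asList(line.split(" "));
                }
            });

            // Pair each word with 1, then sum the counts per word.
            JavaPairRDD<String, Integer> counts = words
                .mapToPair(new PairFunction<String, String, Integer>() {
                    public Tuple2<String, Integer> call(String word) {
                        return new Tuple2<String, Integer>(word, 1);
                    }
                })
                .reduceByKey(new Function2<Integer, Integer, Integer>() {
                    public Integer call(Integer a, Integer b) {
                        return a + b;
                    }
                });

            // Output path is an example only.
            counts.saveAsTextFile("output");
            sc.stop();
        }
    }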

6. Import the external Spark jars (writing Spark programs requires them):

1) Right-click the project and click Open Module Settings.

2) Add the jar as shown in the figure.

3) Find spark-assembly-1.6.1-hadoop2.5.0-cdh5.3.6.jar under the lib directory of your Spark installation and add it to the project.



Creating a Scala Project

1. In IDEA, go to File → New → Project, then create the Scala project as shown in the figure.


2. Fill in the new project's name, choose where to store it, and select the JDK version and the Scala version (choose scala-sdk-2.10 for compatibility with Spark), then click Finish.
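Spark 1.6 artifacts are built against Scala 2.10 (hence the _2.10 suffix on the Spark dependencies in the pom above), which is why the matching scala-sdk-2.10 is required. Once the project is created, a short Scala sketch like the one below can verify the setup; it assumes spark-core_2.10 (or the spark-assembly jar from step 6) is on the classpath, and the object name and input path are illustrative.

    import org.apache.spark.{SparkConf, SparkContext}

    object ScalaWordCount {
      def main(args: Array[String]): Unit = {
        // Run locally; pass a real master URL when submitting to a cluster.
        val conf = new SparkConf().setAppName("ScalaWordCount").setMaster("local")
        val sc = new SparkContext(conf)

        val counts = sc.textFile("input.txt")   // example input path
          .flatMap(_.split(" "))                // split lines into words
          .map(word => (word, 1))               // pair each word with 1
          .reduceByKey(_ + _)                   // sum counts per word

        counts.collect().foreach(println)       // print results on the driver
        sc.stop()
      }
    }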

