Debugging the Spark 2.1 Source Code


The Spark source tree can be built with two tools: Maven and sbt.
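Both ship as wrapper scripts in the source tree; going by Spark's "Building Spark" docs, the two routes look roughly like this from the repo root:

# Maven build (the route this post ends up taking)
./build/mvn -DskipTests clean package
# sbt build
./build/sbt package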

Many people recommend IDEA + sbt for importing the Spark source. I tried that combination myself: even after adding a mirror repository for sbt, the import never succeeded. Maybe I just don't understand sbt well enough, but mirror support for it inside China seems poor, so I'd advise against it.

Next I switched to Eclipse IDE + Maven. With a Maven mirror configured, fetching the jars did work, but Eclipse froze outright and threw all kinds of errors. Clearly the Eclipse plugin doesn't support Scala very well, especially for a multi-module Maven project.

In the end I settled on IDEA + Maven. IntelliJ really is well made: it starts fast and the UI is clean. I had always assumed IntelliJ was paid-only, so I never paid attention to it; only recently did I learn there is a Community edition, so I picked it without hesitation. And look at the GitHub description of Spark: "Spark is built using Apache Maven." Huh, where did sbt go? (spark in github)

Important: if, like me, you import it as a Maven project, don't import right away. First configure a mirror, then cd into the top-level Spark source directory and run mvn -DskipTests clean package on the command line; fetching the jars this way is faster.
Note that even with a mirror configured and good download speeds, some of the fetched jars may come down corrupted. When that happens, delete the broken jars from your local repository and run the build again (see the sketch below). Once everything is fetched, import the project into IntelliJ.
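A minimal sketch of that cleanup cycle, with a hypothetical artifact path (substitute whichever artifact Maven actually complained about):

# delete the corrupted artifact from the local repository (hypothetical path)
rm -rf ~/.m2/repository/org/some-group/broken-artifact
# then fetch everything again
mvn -DskipTests clean package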
After the import: View>>Tool Windows>>Maven Projects>>click the small icon Generate Sources and Update Folders For All Projects.
If you later add a dependency to pom.xml, just Reimport (the equivalent of maven>>update project in Eclipse).

The Maven mirror configuration (this goes in your settings.xml, e.g. ~/.m2/settings.xml). Don't use the oschina mirror anymore, don't ask why; use Aliyun's:

<mirrors>
    <mirror>
        <id>alimaven</id>
        <name>aliyun maven</name>
        <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
        <mirrorOf>central</mirrorOf>
    </mirror>
    <mirror>
        <id>repo2</id>
        <mirrorOf>central</mirrorOf>
        <name>Human Readable Name for this Mirror.</name>
        <url>http://repo2.maven.org/maven2/</url>
    </mirror>
    <mirror>
        <id>ibiblio</id>
        <mirrorOf>central</mirrorOf>
        <name>Human Readable Name for this Mirror.</name>
        <url>http://mirrors.ibiblio.org/pub/mirrors/maven2/</url>
    </mirror>
    <mirror>
        <id>jboss-public-repository-group</id>
        <mirrorOf>central</mirrorOf>
        <name>JBoss Public Repository Group</name>
        <url>http://repository.jboss.org/nexus/content/groups/public</url>
    </mirror>
</mirrors>

Pointing IntelliJ at your Maven installation:
File>>Other Settings>>Default Settings>>Build, Execution, Deployment>>Maven>>Maven home directory>>your Maven home path

Environment:
Ubuntu 16.04
Maven 3.3.9
Oracle JDK 1.8
IntelliJ IDEA Community 2017.1.4

After the import I wanted to run JavaSparkPi from the spark-examples module directly in IntelliJ, only to find that lots of classes couldn't be resolved. The reason is that many of the spark-examples module's dependencies are declared with provided scope. The examples are meant to be launched through Spark's own shell scripts (the scripts live under bin, and every needed jar sits in the jars directory), so provided is entirely reasonable there; running directly from IntelliJ, however, won't work. Change every provided in that pom.xml to compile, and add a few extra dependencies such as Jetty.
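As an illustration, a typical Spark module entry in examples/pom.xml looks something like the snippet below (on the 2.1 branch the Scala suffix is 2.11); the one-line change is the scope:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${project.version}</version>
    <!-- was <scope>provided</scope>: fine under spark-submit, invisible to the IDE runner -->
    <scope>compile</scope>
</dependency>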

To run an example like JavaSparkPi, you need to add these dependencies:

<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>14.0.1</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-server</artifactId>
    <version>9.3.11.v20160721</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-util</artifactId>
    <version>9.3.11.v20160721</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-servlet</artifactId>
    <version>9.3.11.v20160721</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-webapp</artifactId>
    <version>9.3.11.v20160721</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-security</artifactId>
    <version>9.3.11.v20160721</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-http</artifactId>
    <version>9.3.11.v20160721</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.glassfish.jersey.core</groupId>
    <artifactId>jersey-client</artifactId>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.glassfish.jersey.core</groupId>
    <artifactId>jersey-common</artifactId>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.glassfish.jersey.core</groupId>
    <artifactId>jersey-server</artifactId>
    <scope>compile</scope>
</dependency>
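(The jersey entries above carry no <version> on purpose: Spark's root pom should already pin the Jersey version in its dependencyManagement section, so Maven fills it in.)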

When you run it, it will fail with:
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration

You need to tell Spark to run in local mode: Run>>Edit Configurations>>VM Options>>add -Dspark.master=local
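Alternatively, instead of the VM option you can hard-code local mode when building the session. This is not what the bundled example does, just a minimal self-contained sketch of the same Pi computation with .master("local[*]") set programmatically (remove it again before submitting to a real cluster):

import java.util.ArrayList;
import java.util.List;

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class JavaSparkPiLocal {
    public static void main(String[] args) {
        // .master("local[*]") has the same effect as -Dspark.master=local,
        // running Spark in-process on all available cores.
        SparkSession spark = SparkSession.builder()
                .appName("JavaSparkPi")
                .master("local[*]")
                .getOrCreate();
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        int n = 100000;
        List<Integer> samples = new ArrayList<>(n);
        for (int i = 0; i < n; i++) {
            samples.add(i);
        }

        // Monte Carlo estimate: the fraction of random points in the square
        // [-1,1]x[-1,1] that land inside the unit circle approaches pi/4.
        int count = jsc.parallelize(samples).map(i -> {
            double x = Math.random() * 2 - 1;
            double y = Math.random() * 2 - 1;
            return (x * x + y * y <= 1) ? 1 : 0;
        }).reduce((a, b) -> a + b);

        System.out.println("Pi is roughly " + 4.0 * count / n);
        spark.stop();
    }
}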
With the master set, run it again and everything works. The console output ends like this:

17/06/23 15:37:33 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/06/23 15:37:33 INFO DAGScheduler: ResultStage 0 (reduce at JavaSparkPi.java:54) finished in 0.430 s
17/06/23 15:37:33 INFO DAGScheduler: Job 0 finished: reduce at JavaSparkPi.java:54, took 0.455996 s
Pi is roughly 3.14166
17/06/23 15:37:33 INFO AbstractConnector: Stopped Spark@6c45ee6e{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
17/06/23 15:37:33 INFO SparkUI: Stopped Spark web UI at http://192.168.40.115:4040
17/06/23 15:37:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/06/23 15:37:33 INFO MemoryStore: MemoryStore cleared
17/06/23 15:37:33 INFO BlockManager: BlockManager stopped
17/06/23 15:37:33 INFO BlockManagerMaster: BlockManagerMaster stopped
17/06/23 15:37:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/06/23 15:37:33 INFO SparkContext: Successfully stopped SparkContext
17/06/23 15:37:33 INFO ShutdownHookManager: Shutdown hook called
17/06/23 15:37:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-8fbff2d9-a87a-408e-bd4c-b5a4d4252762

Process finished with exit code 0