Learning Spark from the example source code, part 4: DriverSubmissionTest


First, run the example:

jpan@jpan-Beijing:~/Software/spark-0.9.1$ ./bin/run-example org.apache.spark.examples.DriverSubmissionTest 3
Environment variables containing SPARK_TEST:
System properties containing spark.test:
Alive for 1 out of 3 seconds
Alive for 2 out of 3 seconds

The output looks a bit off: nothing at all is printed after the "Environment variables containing SPARK_TEST:" and "System properties containing spark.test:" lines. Let's look at the source code to see why.

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.examples

import scala.collection.JavaConversions._

/** Prints out environmental information, sleeps, and then exits. Made to
  * test driver submission in the standalone scheduler. */
object DriverSubmissionTest {
  def main(args: Array[String]) {
    if (args.size < 1) {
      println("Usage: DriverSubmissionTest <seconds-to-sleep>")
      System.exit(0)
    }
    val numSecondsToSleep = args(0).toInt

    val env = System.getenv()
    val properties = System.getProperties()

    println("Environment variables containing SPARK_TEST:")
    env.filter{case (k, v) => k.contains("SPARK_TEST")}.foreach(println)

    println("System properties containing spark.test:")
    properties.filter{case (k, v) => k.toString.contains("spark.test")}.foreach(println)

    for (i <- 1 until numSecondsToSleep) {
      println(s"Alive for $i out of $numSecondsToSleep seconds")
      Thread.sleep(1000)
    }
  }
}

From the source we can see that the program only prints the environment variables whose names contain SPARK_TEST and the system properties whose names contain spark.test. On my machine, however, env and properties look like this (inspected from spark-shell):

scala> val env = System.getenv()
env: java.util.Map[String,String] = {TERM=xterm, XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0, HADOOP_COMMON_LIB_NATIVE_DIR=/home/jpan/Software/hadoop-2.2.0/lib/native, SSH_AGENT_PID=2091, JAVA_HOME=/usr/share/jdk1.7.0_51, SSH_AGENT_LAUNCHER=upstart, LESSCLOSE=/usr/bin/lesspipe %s %s, UPSTART_SESSION=unix:abstract=/com/ubuntu/upstart-session/1000/2035, SESSION_MANAGER=local/jpan-Beijing:@/tmp/.ICE-unix/2199,unix/jpan-Beijing:/tmp/.ICE-unix/2199, LC_NUMERIC=zh_CN.UTF-8, GNOME_DESKTOP_SESSION_ID=this-is-deprecated, COMPIZ_CONFIG_PROFILE=ubuntu, IM_CONFIG_PHASE=1, GDMSESSION=ubuntu, MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path, PWD=/home/jpan/Software/spark-0.9.1, SESSIONTYPE=gnome-session, GTK_IM_MODULE=ibus, MASTER=spark://jpan-Beijing:7077, XDG_GREETER_DATA_DIR=/va...

scala> val properties = System.getProperties()
properties: java.util.Properties = {java.runtime.name=Java(TM) SE Runtime Environment, sun.boot.library.path=/usr/share/jdk1.7.0_51/jre/lib/amd64, java.vm.version=24.51-b03, java.vm.vendor=Oracle Corporation, java.vendor.url=http://java.oracle.com/, path.separator=:, java.vm.name=Java HotSpot(TM) 64-Bit Server VM, file.encoding.pkg=sun.io, user.country=US, sun.java.launcher=SUN_STANDARD, sun.os.patch.level=unknown, java.vm.specification.name=Java Virtual Machine Specification, user.dir=/home/jpan/Software/spark-0.9.1, java.runtime.version=1.7.0_51-b13, java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment, java.endorsed.dirs=/usr/share/jdk1.7.0_51/jre/lib/endorsed, os.arch=amd64, java.io.tmpdir=/tmp, line.separator=, java.vm.specification.vendor=Oracle Corporation, os.name=Linux, sun.jn...

Neither contains anything matching SPARK_TEST or spark.test, which is why nothing was printed. So I created my own project and, in the two filters, replaced SPARK_TEST with MASTER and spark.test with sun.boot.library.path. Running it gives:

jpan@jpan-Beijing:~/Mywork/spark_test/DriverSubmissionTest$ sbt "project driversubmissiontest" "run 3"
[info] Set current project to DriverSubmissionTest (in build file:/home/jpan/Mywork/spark_test/DriverSubmissionTest/)
[info] Set current project to DriverSubmissionTest (in build file:/home/jpan/Mywork/spark_test/DriverSubmissionTest/)
[info] Compiling 1 Scala source to /home/jpan/Mywork/spark_test/DriverSubmissionTest/target/scala-2.10/classes...
[info] Running main.scala.DriverSubmissionTest 3
Environment variables containing SPARK_TEST:
(MASTER,spark://jpan-Beijing:7077)
System properties containing spark.test:
(sun.boot.library.path,/usr/share/jdk1.7.0_51/jre/lib/amd64)
Alive for 1 out of 3 seconds
Alive for 2 out of 3 seconds
[success] Total time: 9 s, completed Jun 4, 2014 3:49:10 PM
This time the matching entries are actually printed.
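For reference, the only change I made was to the two strings passed to contains(). A sketch of the modified copy is below; the package name main.scala matches the sbt log above, and everything else is taken straight from the original example:

package main.scala

import scala.collection.JavaConversions._

// Sketch of the modified example: identical to the original DriverSubmissionTest
// except for the two filter strings, which now match entries that actually exist
// on my machine (MASTER in the environment, sun.boot.library.path in the
// system properties).
object DriverSubmissionTest {
  def main(args: Array[String]) {
    if (args.size < 1) {
      println("Usage: DriverSubmissionTest <seconds-to-sleep>")
      System.exit(0)
    }
    val numSecondsToSleep = args(0).toInt

    val env = System.getenv()
    val properties = System.getProperties()

    println("Environment variables containing SPARK_TEST:")
    env.filter { case (k, v) => k.contains("MASTER") }.foreach(println)

    println("System properties containing spark.test:")
    properties.filter { case (k, v) => k.toString.contains("sun.boot.library.path") }.foreach(println)

    for (i <- 1 until numSecondsToSleep) {
      println(s"Alive for $i out of $numSecondsToSleep seconds")
      Thread.sleep(1000)
    }
  }
}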


Source code analysis:

The code itself is straightforward: it prints some system information, sleeps for a few seconds, and exits. The interesting part is the case (k, v) inside filter: each map entry is destructured so that the key (the part before the equals sign in the printed output) is bound to k and the value (the part after it) to v.

This is standard Scala syntax: a pattern-matching anonymous function (a block of case clauses) used as the predicate for filter. It works on the Java Map and Properties here because import scala.collection.JavaConversions._ converts them to Scala collections whose elements are (key, value) pairs.
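A minimal standalone sketch of the same idiom, using a made-up Java map instead of the real environment:

import scala.collection.JavaConversions._

object CaseFilterDemo {
  def main(args: Array[String]) {
    // A plain Java map standing in for System.getenv(); the keys here are hypothetical.
    val javaMap = new java.util.HashMap[String, String]()
    javaMap.put("SPARK_TEST_FOO", "1")
    javaMap.put("HOME", "/home/jpan")

    // JavaConversions turns the Java map into a Scala collection of (key, value)
    // pairs, so the { case (k, v) => ... } partial function can destructure each
    // entry: k is bound to the key, v to the value.
    javaMap.filter { case (k, v) => k.contains("SPARK_TEST") }
           .foreach(println)   // prints (SPARK_TEST_FOO,1)
  }
}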
