How to run Spark's built-in examples in Eclipse

**Spark ships with a number of Java examples. Below is how to run JavaWordCount.java in Eclipse. The example source is as follows:**

```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

//package org.apache.spark.examples;
package com.company;

import scala.Tuple2;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;

import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public final class JavaWordCount {
  private static final Pattern SPACE = Pattern.compile(" ");

  public static void main(String[] args) throws Exception {
    if (args.length < 1) {
      System.err.println("Usage: JavaWordCount <file>");
      System.exit(1);
    }

    SparkConf sparkConf = new SparkConf().setAppName("JavaWordCount");
    JavaSparkContext ctx = new JavaSparkContext(sparkConf);
    JavaRDD<String> lines = ctx.textFile(args[0], 1);

    // Split each line on spaces to get a flat stream of words.
    JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
      @Override
      public Iterable<String> call(String s) {
        return Arrays.asList(SPACE.split(s));
      }
    });

    // Map each word to a (word, 1) pair.
    JavaPairRDD<String, Integer> ones = words.mapToPair(new PairFunction<String, String, Integer>() {
      @Override
      public Tuple2<String, Integer> call(String s) {
        return new Tuple2<String, Integer>(s, 1);
      }
    });

    // Sum the counts per word.
    JavaPairRDD<String, Integer> counts = ones.reduceByKey(new Function2<Integer, Integer, Integer>() {
      @Override
      public Integer call(Integer i1, Integer i2) {
        return i1 + i2;
      }
    });

    List<Tuple2<String, Integer>> output = counts.collect();
    for (Tuple2<?, ?> tuple : output) {
      System.out.println(tuple._1() + ": " + tuple._2());
    }
    ctx.stop();
  }
}
```

**Running it with Run As > Java Application fails with the following error:**

```
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/05/11 21:18:46 INFO SparkContext: Running Spark version 1.6.0
16/05/11 21:18:50 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/11 21:18:52 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
    at com.company.JavaWordCount.main(JavaWordCount.java:44)
16/05/11 21:18:52 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
    at com.company.JavaWordCount.main(JavaWordCount.java:44)
```
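This first failure is expected: when the program is launched straight from Eclipse, nothing plays the role of spark-submit's `--master` option, so the master URL has to be set on the SparkConf in code, which is what the next fix does. As a side note, if you only want to smoke-test the example inside the IDE without any cluster, a local master avoids the cluster setup entirely (an alternative not used in this post):

```java
// Alternative for pure in-IDE testing (not what this post does): run Spark
// in local mode, with as many worker threads as there are cores.
SparkConf sparkConf = new SparkConf()
    .setAppName("JavaWordCount")
    .setMaster("local[*]");
```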
**Setting the master with**

```java
SparkConf sparkConf = new SparkConf().setAppName("JavaWordCount").setMaster("spark://localhost:7077");
```

**and running again still fails. The error output is (the stack trace repeats verbatim on each task retry, so it is shown in full only once):**

```
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/05/11 19:49:05 INFO SparkContext: Running Spark version 1.6.0
16/05/11 19:49:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/11 19:49:08 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.68.137 instead (on interface eth0)
16/05/11 19:49:08 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/05/11 19:49:09 INFO SecurityManager: Changing view acls to: hadoop
16/05/11 19:49:09 INFO SecurityManager: Changing modify acls to: hadoop
16/05/11 19:49:09 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/11 19:49:12 INFO Utils: Successfully started service 'sparkDriver' on port 39253.
16/05/11 19:49:14 INFO Slf4jLogger: Slf4jLogger started
16/05/11 19:49:14 INFO Remoting: Starting remoting
16/05/11 19:49:15 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.68.137:38623]
16/05/11 19:49:15 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 38623.
16/05/11 19:49:15 INFO SparkEnv: Registering MapOutputTracker
16/05/11 19:49:15 INFO SparkEnv: Registering BlockManagerMaster
16/05/11 19:49:15 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-af6f0bab-afc3-44c3-a6a3-845dd5a0895f
16/05/11 19:49:15 INFO MemoryStore: MemoryStore started with capacity 257.9 MB
16/05/11 19:49:16 INFO SparkEnv: Registering OutputCommitCoordinator
16/05/11 19:49:17 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/05/11 19:49:17 INFO SparkUI: Started SparkUI at http://192.168.68.137:4040
16/05/11 19:49:18 INFO AppClient$ClientEndpoint: Connecting to master spark://localhost:7077...
16/05/11 19:49:19 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20160511194918-0001
16/05/11 19:49:19 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39627.
16/05/11 19:49:19 INFO NettyBlockTransferService: Server created on 39627
16/05/11 19:49:19 INFO BlockManagerMaster: Trying to register BlockManager
16/05/11 19:49:19 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.68.137:39627 with 257.9 MB RAM, BlockManagerId(driver, 192.168.68.137, 39627)
16/05/11 19:49:19 INFO BlockManagerMaster: Registered BlockManager
16/05/11 19:49:19 INFO AppClient$ClientEndpoint: Executor added: app-20160511194918-0001/0 on worker-20160511194527-127.0.0.1-60523 (127.0.0.1:60523) with 2 cores
16/05/11 19:49:19 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160511194918-0001/0 on hostPort 127.0.0.1:60523 with 2 cores, 1024.0 MB RAM
16/05/11 19:49:19 INFO AppClient$ClientEndpoint: Executor added: app-20160511194918-0001/1 on worker-20160511194527-127.0.0.1-60805 (127.0.0.1:60805) with 2 cores
16/05/11 19:49:19 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160511194918-0001/1 on hostPort 127.0.0.1:60805 with 2 cores, 1024.0 MB RAM
16/05/11 19:49:19 INFO AppClient$ClientEndpoint: Executor updated: app-20160511194918-0001/1 is now RUNNING
16/05/11 19:49:19 INFO AppClient$ClientEndpoint: Executor updated: app-20160511194918-0001/0 is now RUNNING
16/05/11 19:49:20 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
16/05/11 19:49:25 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 153.6 KB, free 153.6 KB)
16/05/11 19:49:26 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 13.9 KB, free 167.5 KB)
16/05/11 19:49:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.68.137:39627 (size: 13.9 KB, free: 257.8 MB)
16/05/11 19:49:26 INFO SparkContext: Created broadcast 0 from textFile at JavaWordCount.java:45
16/05/11 19:50:06 INFO FileInputFormat: Total input paths to process : 1
16/05/11 19:50:17 INFO SparkDeploySchedulerBackend: Registered executor NettyRpcEndpointRef(null) (192.168.68.137:60460) with ID 1
16/05/11 19:50:18 INFO SparkDeploySchedulerBackend: Registered executor NettyRpcEndpointRef(null) (192.168.68.137:60461) with ID 0
16/05/11 19:50:18 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.68.137:33128 with 517.4 MB RAM, BlockManagerId(0, 192.168.68.137, 33128)
16/05/11 19:50:18 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.68.137:34534 with 517.4 MB RAM, BlockManagerId(1, 192.168.68.137, 34534)
16/05/11 19:51:42 INFO SparkContext: Starting job: collect at JavaWordCount.java:68
16/05/11 19:51:49 INFO DAGScheduler: Registering RDD 3 (mapToPair at JavaWordCount.java:54)
16/05/11 19:51:49 INFO DAGScheduler: Got job 0 (collect at JavaWordCount.java:68) with 1 output partitions
16/05/11 19:51:49 INFO DAGScheduler: Final stage: ResultStage 1 (collect at JavaWordCount.java:68)
16/05/11 19:51:49 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
16/05/11 19:51:49 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
16/05/11 19:51:51 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:54), which has no missing parents
16/05/11 19:51:57 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.7 KB, free 172.2 KB)
16/05/11 19:51:57 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.6 KB, free 174.9 KB)
16/05/11 19:51:57 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.68.137:39627 (size: 2.6 KB, free: 257.8 MB)
16/05/11 19:51:57 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
16/05/11 19:51:58 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:54)
16/05/11 19:51:58 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/05/11 19:52:03 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.68.137, partition 0,ANY, 2128 bytes)
16/05/11 19:52:30 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.68.137:34534 (size: 2.6 KB, free: 517.4 MB)
16/05/11 19:52:37 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 192.168.68.137): java.lang.ClassNotFoundException: com.company.JavaWordCount$1
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:278)
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
16/05/11 19:52:37 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 1, 192.168.68.137, partition 0,ANY, 2128 bytes)
16/05/11 19:52:38 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 1) on executor 192.168.68.137: java.lang.ClassNotFoundException (com.company.JavaWordCount$1) [duplicate 1]
16/05/11 19:52:38 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 2, 192.168.68.137, partition 0,ANY, 2128 bytes)
16/05/11 19:53:04 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.68.137:33128 (size: 2.6 KB, free: 517.4 MB)
16/05/11 19:53:26 WARN TaskSetManager: Lost task 0.2 in stage 0.0 (TID 2, 192.168.68.137): java.lang.ClassNotFoundException: com.company.JavaWordCount$1
    [... identical stack trace as above, elided ...]
16/05/11 19:53:26 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 3, 192.168.68.137, partition 0,ANY, 2128 bytes)
16/05/11 19:53:27 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 3) on executor 192.168.68.137: java.lang.ClassNotFoundException (com.company.JavaWordCount$1) [duplicate 1]
16/05/11 19:53:28 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
16/05/11 19:53:28 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/05/11 19:53:30 INFO TaskSchedulerImpl: Cancelling stage 0
16/05/11 19:53:30 INFO DAGScheduler: ShuffleMapStage 0 (mapToPair at JavaWordCount.java:54) failed in 90.346 s
16/05/11 19:53:31 INFO DAGScheduler: Job 0 failed: collect at JavaWordCount.java:68, took 108.471838 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 192.168.68.137): java.lang.ClassNotFoundException: com.company.JavaWordCount$1
    [... identical stack trace as above, elided ...]
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
    at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:339)
    at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:46)
    at com.company.JavaWordCount.main(JavaWordCount.java:68)
Caused by: java.lang.ClassNotFoundException: com.company.JavaWordCount$1
    [... identical stack trace as above, elided ...]
16/05/11 19:53:54 INFO SparkContext: Invoking stop() from shutdown hook
16/05/11 19:54:11 WARN QueuedThreadPool: 8 threads could not be stopped
16/05/11 19:54:11 INFO SparkUI: Stopped Spark web UI at http://192.168.68.137:4040
16/05/11 19:54:15 INFO SparkDeploySchedulerBackend: Shutting down all executors
16/05/11 19:54:15 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
16/05/11 19:54:23 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/11 19:54:39 INFO MemoryStore: MemoryStore cleared
16/05/11 19:54:39 INFO BlockManager: BlockManager stopped
16/05/11 19:54:39 INFO BlockManagerMaster: BlockManagerMaster stopped
16/05/11 19:54:42 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/11 19:55:19 INFO SparkContext: Successfully stopped SparkContext
16/05/11 19:55:19 INFO ShutdownHookManager: Shutdown hook called
16/05/11 19:55:19 INFO ShutdownHookManager: Deleting directory /tmp/spark-7588bb52-bb63-439a-b4ee-b3c830d8959c
```
**Searching online reveals the cause: the application jar was never uploaded to the cluster, so the executors cannot load the anonymous inner classes (com.company.JavaWordCount$1 and so on) that implement the RDD operations. The fix is to export the project as a jar, named WCount.jar in this example, and upload it with the statement `ctx.addJar("/home/hadoop/workspace/sparkwordcount/WCount.jar");`.**
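For clarity, here is a minimal sketch of how the start of `main` looks with both fixes applied. The master URL and jar path are the ones used in this walkthrough, so adjust both to your environment; `SparkConf.setJars` should be an equivalent alternative to `JavaSparkContext.addJar`:

```java
SparkConf sparkConf = new SparkConf()
    .setAppName("JavaWordCount")
    .setMaster("spark://localhost:7077");  // must match your master's URL
JavaSparkContext ctx = new JavaSparkContext(sparkConf);

// Ship the exported application jar to the executors so they can
// deserialize the anonymous function classes (JavaWordCount$1, $2, $3).
ctx.addJar("/home/hadoop/workspace/sparkwordcount/WCount.jar");

// Equivalent alternative: declare the jar on the SparkConf before
// creating the context.
// sparkConf.setJars(new String[] {"/home/hadoop/workspace/sparkwordcount/WCount.jar"});
```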
**The final run output is as follows:**

```
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/05/11 20:21:17 INFO SparkContext: Running Spark version 1.6.0
16/05/11 20:21:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/11 20:21:22 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.68.137 instead (on interface eth0)
16/05/11 20:21:22 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/05/11 20:21:22 INFO SecurityManager: Changing view acls to: hadoop
16/05/11 20:21:22 INFO SecurityManager: Changing modify acls to: hadoop
16/05/11 20:21:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/11 20:21:25 INFO Utils: Successfully started service 'sparkDriver' on port 37591.
16/05/11 20:21:27 INFO Slf4jLogger: Slf4jLogger started
16/05/11 20:21:28 INFO Remoting: Starting remoting
16/05/11 20:21:28 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.68.137:58065]
16/05/11 20:21:28 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 58065.
16/05/11 20:21:29 INFO SparkEnv: Registering MapOutputTracker
16/05/11 20:21:29 INFO SparkEnv: Registering BlockManagerMaster
16/05/11 20:21:29 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-de7b9cf2-425a-4325-a964-d616acb3522d
16/05/11 20:21:29 INFO MemoryStore: MemoryStore started with capacity 257.9 MB
16/05/11 20:21:29 INFO SparkEnv: Registering OutputCommitCoordinator
16/05/11 20:21:30 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/05/11 20:21:30 INFO SparkUI: Started SparkUI at http://192.168.68.137:4040
16/05/11 20:21:32 INFO AppClient$ClientEndpoint: Connecting to master spark://localhost:7077...
16/05/11 20:21:33 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20160511202133-0003
16/05/11 20:21:33 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51896.
16/05/11 20:21:33 INFO AppClient$ClientEndpoint: Executor added: app-20160511202133-0003/0 on worker-20160511194527-127.0.0.1-60805 (127.0.0.1:60805) with 2 cores
16/05/11 20:21:33 INFO NettyBlockTransferService: Server created on 51896
16/05/11 20:21:33 INFO BlockManagerMaster: Trying to register BlockManager
16/05/11 20:21:33 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160511202133-0003/0 on hostPort 127.0.0.1:60805 with 2 cores, 1024.0 MB RAM
16/05/11 20:21:33 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.68.137:51896 with 257.9 MB RAM, BlockManagerId(driver, 192.168.68.137, 51896)
16/05/11 20:21:33 INFO BlockManagerMaster: Registered BlockManager
16/05/11 20:21:34 INFO AppClient$ClientEndpoint: Executor updated: app-20160511202133-0003/0 is now RUNNING
16/05/11 20:21:35 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
16/05/11 20:21:35 INFO HttpFileServer: HTTP File server directory is /tmp/spark-d3f799ee-23b3-4ac3-bf98-422947d752ba/httpd-ce6cce7a-ce24-453e-bbb5-6b2ed3866316
16/05/11 20:21:35 INFO HttpServer: Starting HTTP Server
16/05/11 20:21:35 INFO Utils: Successfully started service 'HTTP file server' on port 55960.
16/05/11 20:21:49 INFO SparkContext: Added JAR /home/hadoop/workspace/sparkwordcount/WCount.jar at http://192.168.68.137:55960/jars/WCount.jar with timestamp 1462969309766
16/05/11 20:21:52 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 153.6 KB, free 153.6 KB)
16/05/11 20:21:53 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 13.9 KB, free 167.5 KB)
16/05/11 20:21:53 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.68.137:51896 (size: 13.9 KB, free: 257.8 MB)
16/05/11 20:21:53 INFO SparkContext: Created broadcast 0 from textFile at JavaWordCount.java:46
16/05/11 20:22:09 INFO FileInputFormat: Total input paths to process : 1
16/05/11 20:22:11 INFO SparkContext: Starting job: collect at JavaWordCount.java:69
16/05/11 20:22:12 INFO DAGScheduler: Registering RDD 3 (mapToPair at JavaWordCount.java:55)
16/05/11 20:22:12 INFO DAGScheduler: Got job 0 (collect at JavaWordCount.java:69) with 1 output partitions
16/05/11 20:22:12 INFO DAGScheduler: Final stage: ResultStage 1 (collect at JavaWordCount.java:69)
16/05/11 20:22:12 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
16/05/11 20:22:12 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
16/05/11 20:22:12 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:55), which has no missing parents
16/05/11 20:22:13 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.7 KB, free 172.2 KB)
16/05/11 20:22:13 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.6 KB, free 174.9 KB)
16/05/11 20:22:13 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.68.137:51896 (size: 2.6 KB, free: 257.8 MB)
16/05/11 20:22:13 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
16/05/11 20:22:13 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:55)
16/05/11 20:22:13 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/05/11 20:22:14 INFO SparkDeploySchedulerBackend: Registered executor NettyRpcEndpointRef(null) (192.168.68.137:52910) with ID 0
16/05/11 20:22:14 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.68.137, partition 0,ANY, 2181 bytes)
16/05/11 20:22:29 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.68.137:38864 with 517.4 MB RAM, BlockManagerId(0, 192.168.68.137, 38864)
16/05/11 20:23:58 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.68.137:38864 (size: 2.6 KB, free: 517.4 MB)
16/05/11 20:24:04 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.68.137:38864 (size: 13.9 KB, free: 517.4 MB)
16/05/11 20:24:54 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 159811 ms on 192.168.68.137 (1/1)
16/05/11 20:24:54 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/05/11 20:24:54 INFO DAGScheduler: ShuffleMapStage 0 (mapToPair at JavaWordCount.java:55) finished in 160.761 s
16/05/11 20:24:54 INFO DAGScheduler: looking for newly runnable stages
16/05/11 20:24:54 INFO DAGScheduler: running: Set()
16/05/11 20:24:54 INFO DAGScheduler: waiting: Set(ResultStage 1)
16/05/11 20:24:54 INFO DAGScheduler: failed: Set()
16/05/11 20:24:54 INFO DAGScheduler: Submitting ResultStage 1 (ShuffledRDD[4] at reduceByKey at JavaWordCount.java:62), which has no missing parents
16/05/11 20:24:54 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 2.9 KB, free 177.8 KB)
16/05/11 20:24:54 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1750.0 B, free 179.5 KB)
16/05/11 20:24:54 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.68.137:51896 (size: 1750.0 B, free: 257.8 MB)
16/05/11 20:24:54 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1006
16/05/11 20:24:54 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (ShuffledRDD[4] at reduceByKey at JavaWordCount.java:62)
16/05/11 20:24:54 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
16/05/11 20:24:54 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, 192.168.68.137, partition 0,NODE_LOCAL, 1947 bytes)
16/05/11 20:24:54 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.68.137:38864 (size: 1750.0 B, free: 517.4 MB)
16/05/11 20:24:54 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 192.168.68.137:52910
16/05/11 20:24:55 INFO MapOutputTrackerMaster: Size of output statuses for shuffle 0 is 141 bytes
16/05/11 20:24:55 INFO DAGScheduler: ResultStage 1 (collect at JavaWordCount.java:69) finished in 0.890 s
16/05/11 20:24:55 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 891 ms on 192.168.68.137 (1/1)
16/05/11 20:24:55 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
16/05/11 20:24:55 INFO DAGScheduler: Job 0 finished: collect at JavaWordCount.java:69, took 163.950721 s
is: 2
hadoop: 1
mapreduce: 1
very: 2
good.: 2
16/05/11 20:24:55 INFO SparkUI: Stopped Spark web UI at http://192.168.68.137:4040
16/05/11 20:24:55 INFO SparkDeploySchedulerBackend: Shutting down all executors
16/05/11 20:24:55 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
16/05/11 20:24:56 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/11 20:24:56 INFO MemoryStore: MemoryStore cleared
16/05/11 20:24:56 INFO BlockManager: BlockManager stopped
16/05/11 20:24:56 INFO BlockManagerMaster: BlockManagerMaster stopped
16/05/11 20:24:56 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/11 20:24:56 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/11 20:24:57 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/11 20:25:01 INFO SparkContext: Successfully stopped SparkContext
16/05/11 20:25:02 INFO ShutdownHookManager: Shutdown hook called
16/05/11 20:25:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-d3f799ee-23b3-4ac3-bf98-422947d752ba/httpd-ce6cce7a-ce24-453e-bbb5-6b2ed3866316
16/05/11 20:25:03 INFO ShutdownHookManager: Deleting directory /tmp/spark-d3f799ee-23b3-4ac3-bf98-422947d752ba
16/05/11 20:25:03 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
```
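For reference, the printed counts imply an input file along these lines (a reconstruction, since the post never shows the actual file; these two lines reproduce the output exactly):

```
hadoop is very good.
mapreduce is very good.
```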