Spark run log (cluster mode)

Below is the console output from submitting a simple Scala job (sparktest_2.10-1.0.jar) to a Spark 1.4.1 standalone cluster with spark-submit; a hedged sketch of the driver program follows the log.

[root@TMAXSPARK bin]# ./spark-submit --class Boot --master spark://TMAXSPARK:7077 --executor-memory 10g --total-executor-cores 10 /home/beijixing/hbf/sparktest_2.10-1.0.jar
2016-06-29 14:38:22,131 INFO SparkContext: Running Spark version 1.4.1
2016-06-29 14:38:22,409 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-06-29 14:38:22,613 INFO SecurityManager: Changing view acls to: root
2016-06-29 14:38:22,614 INFO SecurityManager: Changing modify acls to: root
2016-06-29 14:38:22,614 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
2016-06-29 14:38:24,089 INFO Slf4jLogger: Slf4jLogger started
2016-06-29 14:38:24,144 INFO Remoting: Starting remoting
2016-06-29 14:38:25,183 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@TMAXSPARK:36108]
2016-06-29 14:38:25,189 INFO Utils: Successfully started service 'sparkDriver' on port 36108.
2016-06-29 14:38:25,214 INFO SparkEnv: Registering MapOutputTracker
2016-06-29 14:38:25,232 INFO SparkEnv: Registering BlockManagerMaster
2016-06-29 14:38:25,329 INFO DiskBlockManager: Created local directory at /data2/zdh/spark/tmp/spark-6cfc0cc1-c223-4166-a29a-2f88988181f7/blockmgr-c5ef6fee-4b32-4da5-b182-3da265d91f22
2016-06-29 14:38:25,330 INFO DiskBlockManager: Created local directory at /data3/zdh/spark/tmp/spark-e4ea423d-72f6-46cf-86e3-0cadf01519bf/blockmgr-6c29df30-b3d5-48ef-9683-6cdb44355848
2016-06-29 14:38:25,330 INFO DiskBlockManager: Created local directory at /data4/zdh/spark/tmp/spark-76fb6019-d5a2-4757-a5e3-d3b026374cc0/blockmgr-d167f389-02f4-4c78-bb30-56c010390a6a
2016-06-29 14:38:25,331 INFO DiskBlockManager: Created local directory at /data5/zdh/spark/tmp/spark-c12a88f9-3817-4cb4-a1c6-1f8e459a7b4b/blockmgr-f3abca40-3052-4b94-9761-8329a7ce2600
2016-06-29 14:38:25,331 INFO DiskBlockManager: Created local directory at /data6/zdh/spark/tmp/spark-1f6d75fe-54c1-4745-aad8-e9cf2f48c1a4/blockmgr-e0447fd5-f826-4500-9318-77c23e92d6b9
2016-06-29 14:38:25,331 INFO DiskBlockManager: Created local directory at /data7/zdh/spark/tmp/spark-f8a9b580-5d17-423e-abf0-6741665d853b/blockmgr-4101f569-344e-4195-9d50-c403f0bcd6c4
2016-06-29 14:38:25,331 INFO DiskBlockManager: Created local directory at /data8/zdh/spark/tmp/spark-5371bddd-6c1e-45ea-8f5e-433031298e92/blockmgr-a72ff6ce-ec95-4f5c-a066-ba6c1186aca2
2016-06-29 14:38:25,332 INFO DiskBlockManager: Created local directory at /data9/zdh/spark/tmp/spark-7bea6c54-ebc6-47d8-bf74-e324bd0868b5/blockmgr-12231233-bdd8-449b-826b-5239aa500d7e
2016-06-29 14:38:25,332 INFO DiskBlockManager: Created local directory at /data10/zdh/spark/tmp/spark-9eb65ea0-f62f-4af7-aab4-cff42acc913f/blockmgr-a31454e5-7a29-4ecd-9a6b-84e9b6d2c215
2016-06-29 14:38:25,332 INFO DiskBlockManager: Created local directory at /data11/zdh/spark/tmp/spark-60192fa1-52d3-4ca5-92ed-15149ff27541/blockmgr-2af3c294-adff-4e79-85e8-500b9c7f0c2e
2016-06-29 14:38:25,332 INFO DiskBlockManager: Created local directory at /data12/zdh/spark/tmp/spark-78233916-30b9-4e21-81a7-7ec822372eda/blockmgr-ec94f999-3038-44ee-8f22-01e5cafc7ccc
2016-06-29 14:38:25,339 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
2016-06-29 14:38:25,414 INFO HttpFileServer: HTTP File server directory is /data2/zdh/spark/tmp/spark-6cfc0cc1-c223-4166-a29a-2f88988181f7/httpd-2cc43385-0aa4-4f57-9426-4e73e6d2955b
2016-06-29 14:38:25,418 INFO HttpServer: Starting HTTP Server
2016-06-29 14:38:25,487 INFO Utils: Successfully started service 'HTTP file server' on port 39663.
2016-06-29 14:38:25,501 INFO SparkEnv: Registering OutputCommitCoordinator
2016-06-29 14:38:26,461 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
2016-06-29 14:38:26,483 INFO Utils: Successfully started service 'SparkUI' on port 4041.
2016-06-29 14:38:26,485 INFO SparkUI: Started SparkUI at http://TMAXSPARK:4041
2016-06-29 14:38:26,557 INFO SparkContext: Added JAR file:/home/beijixing/hbf/sparktest_2.10-1.0.jar at http://TMAXSPARK:39663/jars/sparktest_2.10-1.0.jar with timestamp 1467182306556
2016-06-29 14:38:26,634 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@TMAXSPARK:7077/user/Master...
2016-06-29 14:38:26,841 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20160629143826-0517
2016-06-29 14:38:26,842 INFO TaskSchedulerImpl: Starting speculative execution thread
2016-06-29 14:38:26,843 INFO AppClient$ClientActor: Executor added: app-20160629143826-0517/0 on worker-20160628014633-HADOOP-47146 (HADOOP:47146) with 4 cores
2016-06-29 14:38:26,845 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160629143826-0517/0 on hostPort HADOOP:47146 with 4 cores, 10.0 GB RAM
2016-06-29 14:38:26,845 INFO AppClient$ClientActor: Executor added: app-20160629143826-0517/1 on worker-20160627091043-WHOLEDAP-32310 (WHOLEDAP:32310) with 3 cores
2016-06-29 14:38:26,846 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160629143826-0517/1 on hostPort WHOLEDAP:32310 with 3 cores, 10.0 GB RAM
2016-06-29 14:38:26,846 INFO AppClient$ClientActor: Executor added: app-20160629143826-0517/2 on worker-20160627091043-TMAXSPARK-40916 (TMAXSPARK:40916) with 3 cores
2016-06-29 14:38:26,847 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160629143826-0517/2 on hostPort TMAXSPARK:40916 with 3 cores, 10.0 GB RAM
2016-06-29 14:38:26,855 INFO AppClient$ClientActor: Executor updated: app-20160629143826-0517/2 is now LOADING
2016-06-29 14:38:26,855 INFO AppClient$ClientActor: Executor updated: app-20160629143826-0517/0 is now LOADING
2016-06-29 14:38:26,856 INFO AppClient$ClientActor: Executor updated: app-20160629143826-0517/0 is now RUNNING
2016-06-29 14:38:26,858 INFO AppClient$ClientActor: Executor updated: app-20160629143826-0517/1 is now RUNNING
2016-06-29 14:38:26,894 INFO AppClient$ClientActor: Executor updated: app-20160629143826-0517/2 is now RUNNING
2016-06-29 14:38:26,896 INFO AppClient$ClientActor: Executor updated: app-20160629143826-0517/1 is now LOADING
2016-06-29 14:38:26,960 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34142.
2016-06-29 14:38:26,960 INFO NettyBlockTransferService: Server created on 34142
2016-06-29 14:38:26,961 INFO BlockManagerMaster: Trying to register BlockManager
2016-06-29 14:38:26,970 INFO BlockManagerMasterEndpoint: Registering block manager TMAXSPARK:34142 with 265.4 MB RAM, BlockManagerId(driver, TMAXSPARK, 34142)
2016-06-29 14:38:26,973 INFO BlockManagerMaster: Registered BlockManager
2016-06-29 14:38:27,285 INFO EventLoggingListener: Logging events to file:/data1/zdh/spark/logs/eventLog/app-20160629143826-0517
2016-06-29 14:38:27,305 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
2016-06-29 14:38:27,696 INFO MemoryStore: ensureFreeSpace(254064) called with curMem=0, maxMem=278302556
2016-06-29 14:38:27,698 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 248.1 KB, free 265.2 MB)
2016-06-29 14:38:27,916 INFO MemoryStore: ensureFreeSpace(22865) called with curMem=254064, maxMem=278302556
2016-06-29 14:38:27,917 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 22.3 KB, free 265.1 MB)
2016-06-29 14:38:27,920 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on TMAXSPARK:34142 (size: 22.3 KB, free: 265.4 MB)
2016-06-29 14:38:27,926 INFO SparkContext: Created broadcast 0 from textFile at Boot.scala:11
2016-06-29 14:38:30,381 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2016-06-29 14:38:30,399 INFO GPLNativeCodeLoader: Loaded native gpl library from the embedded binaries
2016-06-29 14:38:30,401 INFO LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 015e93e6bae89a7ea2ed67c6e3b8e9e23cee299d]
2016-06-29 14:38:30,409 INFO FileInputFormat: Total input paths to process : 1
2016-06-29 14:38:30,475 INFO SparkContext: Starting job: collect at Boot.scala:13
2016-06-29 14:38:30,500 INFO DAGScheduler: Registering RDD 3 (map at Boot.scala:12)
2016-06-29 14:38:30,502 INFO DAGScheduler: Got job 0 (collect at Boot.scala:13) with 15 output partitions (allowLocal=false)
2016-06-29 14:38:30,503 INFO DAGScheduler: Final stage: ResultStage 1(collect at Boot.scala:13)
2016-06-29 14:38:30,503 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
2016-06-29 14:38:30,509 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
2016-06-29 14:38:30,520 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at map at Boot.scala:12), which has no missing parents
2016-06-29 14:38:30,531 INFO MemoryStore: ensureFreeSpace(3968) called with curMem=276929, maxMem=278302556
2016-06-29 14:38:30,532 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.9 KB, free 265.1 MB)
2016-06-29 14:38:30,540 INFO MemoryStore: ensureFreeSpace(2274) called with curMem=280897, maxMem=278302556
2016-06-29 14:38:30,541 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.2 KB, free 265.1 MB)
2016-06-29 14:38:30,542 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on TMAXSPARK:34142 (size: 2.2 KB, free: 265.4 MB)
2016-06-29 14:38:30,543 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:874
2016-06-29 14:38:30,550 INFO DAGScheduler: Submitting 15 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at map at Boot.scala:12)
2016-06-29 14:38:30,551 INFO TaskSchedulerImpl: Adding task set 0.0 with 15 tasks
2016-06-29 14:38:30,590 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@WHOLEDAP:44789/user/Executor#1665484827]) with ID 1
2016-06-29 14:38:30,603 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, WHOLEDAP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:30,605 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, WHOLEDAP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:30,606 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, WHOLEDAP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:30,739 INFO BlockManagerMasterEndpoint: Registering block manager WHOLEDAP:34877 with 5.2 GB RAM, BlockManagerId(1, WHOLEDAP, 34877)
2016-06-29 14:38:31,168 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on WHOLEDAP:34877 (size: 2.2 KB, free: 5.2 GB)
2016-06-29 14:38:31,260 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@HADOOP:44937/user/Executor#-1572714920]) with ID 0
2016-06-29 14:38:31,262 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, HADOOP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:31,262 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, HADOOP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:31,263 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, HADOOP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:31,264 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, HADOOP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:31,400 INFO BlockManagerMasterEndpoint: Registering block manager HADOOP:36559 with 5.2 GB RAM, BlockManagerId(0, HADOOP, 36559)
2016-06-29 14:38:31,470 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on WHOLEDAP:34877 (size: 22.3 KB, free: 5.2 GB)
2016-06-29 14:38:31,735 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on HADOOP:36559 (size: 2.2 KB, free: 5.2 GB)
2016-06-29 14:38:32,096 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on HADOOP:36559 (size: 22.3 KB, free: 5.2 GB)
2016-06-29 14:38:32,240 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@TMAXSPARK:47783/user/Executor#-1987341806]) with ID 2
2016-06-29 14:38:32,241 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, TMAXSPARK, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:32,242 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, TMAXSPARK, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:32,243 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, TMAXSPARK, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:32,393 INFO BlockManagerMasterEndpoint: Registering block manager TMAXSPARK:36462 with 5.2 GB RAM, BlockManagerId(2, TMAXSPARK, 36462)
2016-06-29 14:38:33,626 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on TMAXSPARK:36462 (size: 2.2 KB, free: 5.2 GB)
2016-06-29 14:38:39,030 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on TMAXSPARK:36462 (size: 22.3 KB, free: 5.2 GB)
2016-06-29 14:38:48,745 INFO TaskSetManager: Starting task 10.0 in stage 0.0 (TID 10, WHOLEDAP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:48,748 INFO TaskSetManager: Starting task 11.0 in stage 0.0 (TID 11, HADOOP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:48,749 INFO TaskSetManager: Starting task 12.0 in stage 0.0 (TID 12, WHOLEDAP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:48,751 INFO TaskSetManager: Starting task 13.0 in stage 0.0 (TID 13, WHOLEDAP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:48,766 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 18160 ms on WHOLEDAP (1/15)
2016-06-29 14:38:48,766 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 17503 ms on HADOOP (2/15)
2016-06-29 14:38:48,766 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 18161 ms on WHOLEDAP (3/15)
2016-06-29 14:38:48,767 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 18172 ms on WHOLEDAP (4/15)
2016-06-29 14:38:50,259 INFO TaskSetManager: Starting task 14.0 in stage 0.0 (TID 14, HADOOP, PROCESS_LOCAL, 1455 bytes)
2016-06-29 14:38:50,267 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 19005 ms on HADOOP (5/15)
2016-06-29 14:38:50,273 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 19011 ms on HADOOP (6/15)
2016-06-29 14:38:50,305 INFO TaskSetManager: Finished task 13.0 in stage 0.0 (TID 13) in 1555 ms on WHOLEDAP (7/15)
2016-06-29 14:38:50,306 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 19043 ms on HADOOP (8/15)
2016-06-29 14:38:50,364 INFO TaskSetManager: Finished task 11.0 in stage 0.0 (TID 11) in 1616 ms on HADOOP (9/15)
2016-06-29 14:38:50,591 INFO TaskSetManager: Finished task 10.0 in stage 0.0 (TID 10) in 1847 ms on WHOLEDAP (10/15)
2016-06-29 14:38:50,648 INFO TaskSetManager: Finished task 12.0 in stage 0.0 (TID 12) in 1900 ms on WHOLEDAP (11/15)
2016-06-29 14:38:51,605 INFO TaskSetManager: Finished task 14.0 in stage 0.0 (TID 14) in 1347 ms on HADOOP (12/15)
2016-06-29 14:38:53,700 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 21457 ms on TMAXSPARK (13/15)
2016-06-29 14:38:54,042 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 21800 ms on TMAXSPARK (14/15)
2016-06-29 14:38:54,358 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 22116 ms on TMAXSPARK (15/15)
2016-06-29 14:38:54,359 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
2016-06-29 14:38:54,359 INFO DAGScheduler: ShuffleMapStage 0 (map at Boot.scala:12) finished in 23.799 s
2016-06-29 14:38:54,360 INFO DAGScheduler: looking for newly runnable stages
2016-06-29 14:38:54,361 INFO DAGScheduler: running: Set()
2016-06-29 14:38:54,361 INFO DAGScheduler: waiting: Set(ResultStage 1)
2016-06-29 14:38:54,361 INFO DAGScheduler: failed: Set()
2016-06-29 14:38:54,366 INFO DAGScheduler: Missing parents for ResultStage 1: List()
2016-06-29 14:38:54,368 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at reduceByKey at Boot.scala:13), which is now runnable
2016-06-29 14:38:54,371 INFO MemoryStore: ensureFreeSpace(2816) called with curMem=283171, maxMem=278302556
2016-06-29 14:38:54,371 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 2.8 KB, free 265.1 MB)
2016-06-29 14:38:54,377 INFO MemoryStore: ensureFreeSpace(1602) called with curMem=285987, maxMem=278302556
2016-06-29 14:38:54,378 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1602.0 B, free 265.1 MB)
2016-06-29 14:38:54,379 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on TMAXSPARK:34142 (size: 1602.0 B, free: 265.4 MB)
2016-06-29 14:38:54,380 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:874
2016-06-29 14:38:54,382 INFO DAGScheduler: Submitting 15 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at reduceByKey at Boot.scala:13)
2016-06-29 14:38:54,382 INFO TaskSchedulerImpl: Adding task set 1.0 with 15 tasks
2016-06-29 14:38:54,387 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 15, HADOOP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,388 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 16, WHOLEDAP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,389 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 17, TMAXSPARK, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,389 INFO TaskSetManager: Starting task 3.0 in stage 1.0 (TID 18, HADOOP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,390 INFO TaskSetManager: Starting task 4.0 in stage 1.0 (TID 19, WHOLEDAP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,391 INFO TaskSetManager: Starting task 5.0 in stage 1.0 (TID 20, TMAXSPARK, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,391 INFO TaskSetManager: Starting task 6.0 in stage 1.0 (TID 21, HADOOP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,392 INFO TaskSetManager: Starting task 7.0 in stage 1.0 (TID 22, WHOLEDAP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,393 INFO TaskSetManager: Starting task 8.0 in stage 1.0 (TID 23, TMAXSPARK, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,393 INFO TaskSetManager: Starting task 9.0 in stage 1.0 (TID 24, HADOOP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:54,418 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on TMAXSPARK:36462 (size: 1602.0 B, free: 5.2 GB)
2016-06-29 14:38:54,420 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on HADOOP:36559 (size: 1602.0 B, free: 5.2 GB)
2016-06-29 14:38:54,425 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on WHOLEDAP:34877 (size: 1602.0 B, free: 5.2 GB)
2016-06-29 14:38:54,453 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to TMAXSPARK:47783
2016-06-29 14:38:54,460 INFO MapOutputTrackerMaster: Size of output statuses for shuffle 0 is 281 bytes
2016-06-29 14:38:54,461 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to HADOOP:44937
2016-06-29 14:38:54,461 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to WHOLEDAP:44789
2016-06-29 14:38:56,483 INFO TaskSetManager: Starting task 10.0 in stage 1.0 (TID 25, WHOLEDAP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:56,502 INFO TaskSetManager: Starting task 11.0 in stage 1.0 (TID 26, TMAXSPARK, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:56,524 INFO TaskSetManager: Starting task 12.0 in stage 1.0 (TID 27, TMAXSPARK, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:38:56,526 INFO TaskSetManager: Starting task 13.0 in stage 1.0 (TID 28, WHOLEDAP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:39:03,607 INFO TaskSetManager: Starting task 14.0 in stage 1.0 (TID 29, HADOOP, PROCESS_LOCAL, 1225 bytes)
2016-06-29 14:39:03,755 INFO TaskSetManager: Finished task 7.0 in stage 1.0 (TID 22) in 9364 ms on WHOLEDAP (1/15)
2016-06-29 14:39:03,756 INFO TaskSetManager: Finished task 5.0 in stage 1.0 (TID 20) in 9366 ms on TMAXSPARK (2/15)
2016-06-29 14:39:03,756 INFO TaskSetManager: Finished task 2.0 in stage 1.0 (TID 17) in 9368 ms on TMAXSPARK (3/15)
2016-06-29 14:39:03,756 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 16) in 9369 ms on WHOLEDAP (4/15)
2016-06-29 14:39:03,776 INFO TaskSetManager: Finished task 4.0 in stage 1.0 (TID 19) in 9387 ms on WHOLEDAP (5/15)
2016-06-29 14:39:03,777 INFO TaskSetManager: Finished task 11.0 in stage 1.0 (TID 26) in 7276 ms on TMAXSPARK (6/15)
2016-06-29 14:39:03,777 INFO TaskSetManager: Finished task 8.0 in stage 1.0 (TID 23) in 9385 ms on TMAXSPARK (7/15)
2016-06-29 14:39:03,777 INFO TaskSetManager: Finished task 3.0 in stage 1.0 (TID 18) in 9388 ms on HADOOP (8/15)
2016-06-29 14:39:07,384 INFO TaskSetManager: Finished task 6.0 in stage 1.0 (TID 21) in 12993 ms on HADOOP (9/15)
2016-06-29 14:39:07,384 INFO TaskSetManager: Finished task 10.0 in stage 1.0 (TID 25) in 10902 ms on WHOLEDAP (10/15)
2016-06-29 14:39:07,385 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 15) in 13002 ms on HADOOP (11/15)
2016-06-29 14:39:07,391 INFO TaskSetManager: Finished task 9.0 in stage 1.0 (TID 24) in 12998 ms on HADOOP (12/15)
2016-06-29 14:39:07,415 INFO TaskSetManager: Finished task 14.0 in stage 1.0 (TID 29) in 3808 ms on HADOOP (13/15)
2016-06-29 14:39:07,419 INFO TaskSetManager: Finished task 13.0 in stage 1.0 (TID 28) in 10893 ms on WHOLEDAP (14/15)
2016-06-29 14:39:07,422 INFO TaskSetManager: Finished task 12.0 in stage 1.0 (TID 27) in 10899 ms on TMAXSPARK (15/15)
2016-06-29 14:39:07,422 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
2016-06-29 14:39:07,422 INFO DAGScheduler: ResultStage 1 (collect at Boot.scala:13) finished in 13.039 s
2016-06-29 14:39:07,430 INFO DAGScheduler: Job 0 finished: collect at Boot.scala:13, took 36.954694 s
The number of Words: (measObjLdn="SBNID=510503,ENODEBID=671878,CellID=51,RelationID=0:460:11:672262:49">,99962223)
2016-06-29 14:39:07,535 INFO SparkContext: Invoking stop() from shutdown hook
2016-06-29 14:39:07,688 INFO SparkUI: Stopped Spark web UI at http://TMAXSPARK:4041
2016-06-29 14:39:07,691 INFO DAGScheduler: Stopping DAGScheduler
2016-06-29 14:39:07,693 INFO SparkDeploySchedulerBackend: Shutting down all executors
2016-06-29 14:39:07,695 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
2016-06-29 14:39:07,841 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2016-06-29 14:39:09,094 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkExecutor@WHOLEDAP:44789] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
2016-06-29 14:39:09,318 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkExecutor@HADOOP:44937] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
2016-06-29 14:39:10,120 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkExecutor@TMAXSPARK:47783] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
2016-06-29 14:39:10,572 INFO Utils: path = /data2/zdh/spark/tmp/spark-6cfc0cc1-c223-4166-a29a-2f88988181f7/blockmgr-c5ef6fee-4b32-4da5-b182-3da265d91f22, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data3/zdh/spark/tmp/spark-e4ea423d-72f6-46cf-86e3-0cadf01519bf/blockmgr-6c29df30-b3d5-48ef-9683-6cdb44355848, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data4/zdh/spark/tmp/spark-76fb6019-d5a2-4757-a5e3-d3b026374cc0/blockmgr-d167f389-02f4-4c78-bb30-56c010390a6a, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data5/zdh/spark/tmp/spark-c12a88f9-3817-4cb4-a1c6-1f8e459a7b4b/blockmgr-f3abca40-3052-4b94-9761-8329a7ce2600, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data6/zdh/spark/tmp/spark-1f6d75fe-54c1-4745-aad8-e9cf2f48c1a4/blockmgr-e0447fd5-f826-4500-9318-77c23e92d6b9, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data7/zdh/spark/tmp/spark-f8a9b580-5d17-423e-abf0-6741665d853b/blockmgr-4101f569-344e-4195-9d50-c403f0bcd6c4, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data8/zdh/spark/tmp/spark-5371bddd-6c1e-45ea-8f5e-433031298e92/blockmgr-a72ff6ce-ec95-4f5c-a066-ba6c1186aca2, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data9/zdh/spark/tmp/spark-7bea6c54-ebc6-47d8-bf74-e324bd0868b5/blockmgr-12231233-bdd8-449b-826b-5239aa500d7e, already present as root for deletion.
2016-06-29 14:39:10,572 INFO Utils: path = /data10/zdh/spark/tmp/spark-9eb65ea0-f62f-4af7-aab4-cff42acc913f/blockmgr-a31454e5-7a29-4ecd-9a6b-84e9b6d2c215, already present as root for deletion.
2016-06-29 14:39:10,573 INFO Utils: path = /data11/zdh/spark/tmp/spark-60192fa1-52d3-4ca5-92ed-15149ff27541/blockmgr-2af3c294-adff-4e79-85e8-500b9c7f0c2e, already present as root for deletion.
2016-06-29 14:39:10,573 INFO Utils: path = /data12/zdh/spark/tmp/spark-78233916-30b9-4e21-81a7-7ec822372eda/blockmgr-ec94f999-3038-44ee-8f22-01e5cafc7ccc, already present as root for deletion.
2016-06-29 14:39:10,573 INFO MemoryStore: MemoryStore cleared
2016-06-29 14:39:10,574 INFO BlockManager: BlockManager stopped
2016-06-29 14:39:10,575 INFO BlockManagerMaster: BlockManagerMaster stopped
2016-06-29 14:39:10,578 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2016-06-29 14:39:10,578 INFO SparkContext: Successfully stopped SparkContext
2016-06-29 14:39:10,578 INFO Utils: Shutdown hook called
2016-06-29 14:39:10,579 INFO Utils: Deleting directory /data2/zdh/spark/tmp/spark-6cfc0cc1-c223-4166-a29a-2f88988181f7
2016-06-29 14:39:10,579 INFO Utils: Deleting directory /data9/zdh/spark/tmp/spark-7bea6c54-ebc6-47d8-bf74-e324bd0868b5
2016-06-29 14:39:10,581 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
2016-06-29 14:39:10,583 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
2016-06-29 14:39:10,602 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
2016-06-29 14:39:10,610 INFO Utils: Deleting directory /data7/zdh/spark/tmp/spark-f8a9b580-5d17-423e-abf0-6741665d853b
2016-06-29 14:39:10,611 INFO Utils: Deleting directory /data5/zdh/spark/tmp/spark-c12a88f9-3817-4cb4-a1c6-1f8e459a7b4b
2016-06-29 14:39:10,612 INFO Utils: Deleting directory /data11/zdh/spark/tmp/spark-60192fa1-52d3-4ca5-92ed-15149ff27541
2016-06-29 14:39:10,692 INFO Utils: Deleting directory /data12/zdh/spark/tmp/spark-78233916-30b9-4e21-81a7-7ec822372eda
2016-06-29 14:39:10,692 INFO Utils: Deleting directory /data6/zdh/spark/tmp/spark-1f6d75fe-54c1-4745-aad8-e9cf2f48c1a4
2016-06-29 14:39:10,693 INFO Utils: Deleting directory /data8/zdh/spark/tmp/spark-5371bddd-6c1e-45ea-8f5e-433031298e92
2016-06-29 14:39:10,706 INFO Utils: Deleting directory /data3/zdh/spark/tmp/spark-e4ea423d-72f6-46cf-86e3-0cadf01519bf
2016-06-29 14:39:10,707 INFO Utils: Deleting directory /data10/zdh/spark/tmp/spark-9eb65ea0-f62f-4af7-aab4-cff42acc913f
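
The driver source is not included in the post, but the call sites in the log (textFile at Boot.scala:11, map at Boot.scala:12, reduceByKey and collect at Boot.scala:13, followed by the printed "The number of Words: (key,count)" pair) suggest a word-count-style job. The following is a minimal sketch of what Boot.scala might look like; the input path, the key extraction, and the output wording are assumptions for illustration, not the original source.

import org.apache.spark.{SparkConf, SparkContext}

object Boot {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("sparktest")
    val sc = new SparkContext(conf)

    // Boot.scala:11 -- read the input file (path assumed for illustration)
    val lines = sc.textFile("hdfs:///path/to/input")

    // Boot.scala:12 -- map each record to a (key, 1) pair (key extraction assumed)
    val pairs = lines.map(line => (line.split(",")(0), 1))

    // Boot.scala:13 -- aggregate per key and collect the results to the driver
    val counts = pairs.reduceByKey(_ + _).collect()

    // The log prints a single (key, count) tuple, e.g.
    // "The number of Words: (measObjLdn=...,99962223)"
    counts.sortBy(-_._2).headOption.foreach(kv => println(s"The number of Words: $kv"))

    sc.stop()
  }
}

On Spark 1.4.1 the PairRDDFunctions implicits needed for reduceByKey are picked up automatically, so no extra import is required. Note how the submit flags map onto the log: --total-executor-cores 10 is split by the master into executors with 4, 3, and 3 cores, each granted the 10.0 GB requested via --executor-memory 10g.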