Spark issue: AM container fails with exitCode -1000
Q: Application application_1505791560225_0911 failed 2 times due to AM Container for appattempt_1505791560225_0911_000002 exited with exitCode: -1000
For more detailed output, check application tracking page: http://cnsz22VLK2906:8088/cluster/app/application_1505791560225_0911 Then, click on links to logs of each attempt.
Diagnostics: File does not exist: hdfs://cluster1/user/sfapp/.sparkStaging/application_1505791560225_0911/__spark_libs__5856340437943961943.zip
java.io.FileNotFoundException: File does not exist: hdfs://cluster1/user/sfapp/.sparkStaging/application_1505791560225_0911/__spark_libs__5856340437943961943.zip
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1309)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:253)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Failing this attempt. Failing the application.
A: The job was submitted in yarn-cluster mode, but the code also hard-coded sparkConf.setMaster("local[*]"). A master set in code takes precedence over the one passed to spark-submit, so the two settings conflict: the staging files under .sparkStaging can be cleaned up while YARN is still trying to localize the __spark_libs__ archive for the container, which produces the FileNotFoundException above. Removing the setMaster call and letting spark-submit supply the master resolves the error.
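A minimal sketch of the corrected driver setup (the app name and object name are placeholders, and Spark 2.x-style APIs are assumed): the master is deliberately left out of the code so the same jar works both locally and under YARN.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object MyApp {
  def main(args: Array[String]): Unit = {
    // Set only the application name here; do NOT call .setMaster("local[*]").
    // The master comes from the spark-submit command line instead.
    val conf = new SparkConf().setAppName("my-app")

    val spark = SparkSession.builder().config(conf).getOrCreate()

    // ... job logic ...

    spark.stop()
  }
}
```

With the master removed from the code, the deployment mode is chosen entirely at submit time, e.g. `spark-submit --master yarn --deploy-mode cluster ...` for the cluster, or `--master "local[*]"` for a local test run.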