Spark Learning 52: Spark's org.apache.spark.SparkException: Task not serializable
This error is typically reported as org.apache.spark.SparkException: Task not serializable:
17/12/06 14:20:10 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.4 KB, free 872.6 MB)
17/12/06 14:20:10 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.161:51006 (size: 28.4 KB, free: 873.0 MB)
17/12/06 14:20:10 INFO SparkContext: Created broadcast 0 from newAPIHadoopRDD at SparkOnHbaseSecond.java:92
Exception in thread "main" org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2101)
    at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:370)
    at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:369)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
    at org.apache.spark.rdd.RDD.map(RDD.scala:369)
    at org.apache.spark.api.java.JavaRDDLike$class.map(JavaRDDLike.scala:93)
    at org.apache.spark.api.java.AbstractJavaRDDLike.map(JavaRDDLike.scala:45)
    at sparlsql.hbase.www.second.SparkOnHbaseSecond.main(SparkOnHbaseSecond.java:94)
Caused by: java.io.NotSerializableException: sparlsql.hbase.www.second.SparkOnHbaseSecond
Serialization stack:
    - object not serializable (class: sparlsql.hbase.www.second.SparkOnHbaseSecond, value: sparlsql.hbase.www.second.SparkOnHbaseSecond@602ae7b6)
    - field (class: sparlsql.hbase.www.second.SparkOnHbaseSecond$1, name: val$sparkOnHbase, type: class sparlsql.hbase.www.second.SparkOnHbaseSecond)
    - object (class sparlsql.hbase.www.second.SparkOnHbaseSecond$1, sparlsql.hbase.www.second.SparkOnHbaseSecond$1@37af1f93)
    - field (class: org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1, name: fun$1, type: interface org.apache.spark.api.java.function.Function)
    - object (class org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
    ... 12 more
17/12/06 14:20:10 INFO SparkContext: Invoking stop() from shutdown hook
The first cause is that the enclosing class is not serializable, while a field of it is referenced inside the closure.
public class SparkOnHbaseBack {
    private String aa = "1234";

    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("lcc_java_read_hbase_register_to_table")
                .master("local[4]")
                .getOrCreate();
        // myRDD is the pair RDD obtained from newAPIHadoopRDD (setup omitted here)
        JavaRDD<Row> personsRDD = myRDD.map(new Function<Tuple2<ImmutableBytesWritable, Result>, Row>() {
            @Override
            public Row call(Tuple2<ImmutableBytesWritable, Result> tuple) throws Exception {
                // System.out.println("====tuple==========" + aa);
                // The commented-out line above reads a field declared outside this
                // map function. Either move the field inside this call() method,
                // or make the enclosing class serializable, along the lines of:
                //   public class SparkOnHbaseSecond implements java.io.Serializable
                return null; // build and return the Row here
            }
        });
    }
}
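The mechanism behind this can be demonstrated without Spark at all, because Spark ships tasks with plain Java serialization: an anonymous class or lambda that reads an instance field implicitly captures `this`, so the whole enclosing object must be serializable. The sketch below is a JDK-only illustration (the class names `CaptureDemo` and `Outer` are made up for this example); `Outer` stands in for a driver class like SparkOnHbaseSecond that does not implement Serializable.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class CaptureDemo {

    // Plays the role of the driver class (e.g. SparkOnHbaseSecond):
    // it does NOT implement Serializable.
    static class Outer {
        String aa = "1234";

        // Broken variant: the serializable lambda reads the instance field
        // `aa`, so it captures `this` (an Outer), which is not serializable.
        Runnable makeTask() {
            return (Runnable & Serializable) () -> System.out.println(aa);
        }

        // Fixed variant: copy the field into a local variable first; the
        // lambda then captures only the String, not the enclosing instance.
        Runnable makeFixedTask() {
            final String localAa = aa;
            return (Runnable & Serializable) () -> System.out.println(localAa);
        }
    }

    // Returns true if `o` survives Java serialization -- the same check
    // Spark's ClosureCleaner performs before shipping a task to executors.
    static boolean serializes(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            return false; // java.io.NotSerializableException lands here
        }
    }

    public static void main(String[] args) {
        Outer outer = new Outer();
        System.out.println("field capture serializes: " + serializes(outer.makeTask()));      // false
        System.out.println("local capture serializes: " + serializes(outer.makeFixedTask())); // true
    }
}
```

Copying the field into a local variable is usually preferable to marking the whole driver class Serializable, because it ships only the data the closure actually needs instead of the entire enclosing object.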