Spark: task-related classes
Source: Internet · Editor: 程序博客网 · Date: 2024/06/05
@(spark)[Task]
TaskState
```scala
private[spark] object TaskState extends Enumeration {
  val LAUNCHING, RUNNING, FINISHED, FAILED, KILLED, LOST = Value

  val FINISHED_STATES = Set(FINISHED, FAILED, KILLED, LOST)
}
```
The set of task states. This file also contains the conversions between Spark and Mesos task states.
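The Enumeration pattern above can be exercised without a Spark dependency. The sketch below mirrors the source; the `isFinished` helper is modeled on the one the real `TaskState` defines on top of `FINISHED_STATES`.

```scala
// Minimal sketch of the TaskState Enumeration pattern (no Spark needed).
object TaskStateSketch extends Enumeration {
  val LAUNCHING, RUNNING, FINISHED, FAILED, KILLED, LOST = Value

  val FINISHED_STATES = Set(FINISHED, FAILED, KILLED, LOST)

  // Terminal states: the scheduler takes no further action on the task.
  def isFinished(state: Value): Boolean = FINISHED_STATES.contains(state)
}
```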
TaskEndReason
Essentially an enumeration: a sealed trait whose subtypes identify every reason a task can end.
```scala
/**
 * :: DeveloperApi ::
 * Various possible reasons why a task ended. The low-level TaskScheduler is supposed to retry
 * tasks several times for "ephemeral" failures, and only report back failures that require some
 * old stages to be resubmitted, such as shuffle map fetch failures.
 */
@DeveloperApi
sealed trait TaskEndReason
```
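The value of the sealed trait is that the scheduler can pattern-match exhaustively on end reasons. The following is a hedged, cut-down mirror of the hierarchy (the real one has more subtypes, and the retry decision below is illustrative, not Spark's actual policy):

```scala
// Cut-down mirror of the TaskEndReason idea: a sealed hierarchy that a
// scheduler can match on exhaustively. Names are sketch-only.
sealed trait TaskEndReasonSketch
case object SuccessSketch extends TaskEndReasonSketch
case class FetchFailedSketch(host: String) extends TaskEndReasonSketch
case class ExceptionFailureSketch(className: String) extends TaskEndReasonSketch

object EndReasons {
  // "Ephemeral" failures are retried in place; a fetch failure requires the
  // parent shuffle map stage to be resubmitted, so it is reported upward.
  def retryTaskInPlace(reason: TaskEndReasonSketch): Boolean = reason match {
    case SuccessSketch             => false
    case FetchFailedSketch(_)      => false
    case ExceptionFailureSketch(_) => true
  }
}
```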
TaskContext
```scala
/**
 * Contextual information about a task which can be read or mutated during
 * execution. To access the TaskContext for a running task, use:
 * {{{
 *   org.apache.spark.TaskContext.get()
 * }}}
 */
abstract class TaskContext extends Serializable {
  // ...
}
```
The interface definition.
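`TaskContext.get()` works because the executor thread running a task installs the context in a thread-local before user code executes. Below is a minimal sketch of that pattern, not the real class; field names and the public setter are simplifications for illustration:

```scala
// Sketch of the thread-local pattern behind TaskContext.get().
abstract class TaskContextSketch extends Serializable {
  def stageId: Int
  def partitionId: Int
}

object TaskContextSketch {
  private val contextHolder = new ThreadLocal[TaskContextSketch]

  // What user code calls from inside a running task.
  def get(): TaskContextSketch = contextHolder.get()

  // In Spark the setter's visibility is restricted (see TaskContextHelper
  // below); it is public here only to keep the sketch self-contained.
  def setTaskContext(tc: TaskContextSketch): Unit = contextHolder.set(tc)
}
```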
TaskContextHelper
```scala
/**
 * This class exists to restrict the visibility of TaskContext setters.
 */
private[spark] object TaskContextHelper {
  // ...
}
```
TaskContextImpl
The implementation class; it mainly provides the hooks for metrics and task listeners.
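The listener hooks follow a simple pattern: callbacks registered while the task runs are invoked when the task completes, most recently added first, so resources acquired later are released before earlier ones. A sketch of the idea (names and details are illustrative, not the exact source):

```scala
import scala.collection.mutable.ArrayBuffer

// Illustrative sketch of TaskContextImpl-style completion hooks.
class TaskHooksSketch {
  private val onCompleteCallbacks = new ArrayBuffer[() => Unit]

  def addTaskCompletionListener(f: () => Unit): Unit =
    onCompleteCallbacks += f

  // Invoke callbacks in reverse registration order.
  def markTaskCompleted(): Unit =
    onCompleteCallbacks.reverse.foreach(cb => cb())
}
```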