Spark: Task-related classes

Source: Internet · Editor: 程序博客网 · Date: 2024/06/05 07:48


@(spark)[Task]

TaskState

private[spark] object TaskState extends Enumeration {

  val LAUNCHING, RUNNING, FINISHED, FAILED, KILLED, LOST = Value

  val FINISHED_STATES = Set(FINISHED, FAILED, KILLED, LOST)

The possible states of a task; this file also contains conversions between Spark and Mesos task states.
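The `Enumeration` pattern above can be sketched in a standalone snippet. This is a minimal illustration of the same idiom, not Spark's actual object (which is `private[spark]`); the `isFinished` helper is hypothetical, added here only to show how `FINISHED_STATES` would typically be consulted:

```scala
// Standalone sketch of the Enumeration idiom used by TaskState.
object SimpleTaskState extends Enumeration {
  // Multi-assignment: each name gets a distinct Value in declaration order.
  val LAUNCHING, RUNNING, FINISHED, FAILED, KILLED, LOST = Value

  // Terminal states: once a task is here, it will not transition again.
  val FINISHED_STATES = Set(FINISHED, FAILED, KILLED, LOST)

  // Hypothetical helper showing how the set would be queried.
  def isFinished(state: Value): Boolean = FINISHED_STATES.contains(state)
}
```

With this sketch, `SimpleTaskState.isFinished(SimpleTaskState.RUNNING)` is `false`, while any of the four terminal states returns `true`.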

TaskEndReason

Essentially an enumeration: it identifies all the possible reasons a task can end.

/**
 * :: DeveloperApi ::
 * Various possible reasons why a task ended. The low-level TaskScheduler is supposed to retry
 * tasks several times for "ephemeral" failures, and only report back failures that require some
 * old stages to be resubmitted, such as shuffle map fetch failures.
 */
@DeveloperApi
sealed trait TaskEndReason
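The "enumeration" here is the sealed-trait-plus-case-classes idiom rather than `scala.Enumeration`: sealing the trait lets the compiler check that a pattern match covers every reason. The sketch below shows the idiom with a few illustrative members (Spark's real hierarchy is larger; `shouldRetry` is a hypothetical helper, not Spark code):

```scala
// Sketch of the sealed-trait enumeration idiom behind TaskEndReason.
sealed trait TaskEndReason
case object Success extends TaskEndReason
case object TaskKilled extends TaskEndReason
case class ExceptionFailure(description: String) extends TaskEndReason

// Hypothetical: because the trait is sealed, the compiler warns if a
// match on TaskEndReason misses a case.
def shouldRetry(reason: TaskEndReason): Boolean = reason match {
  case Success    => false
  case TaskKilled => false
  case ExceptionFailure(_) => true // ephemeral failures are retried
}
```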

TaskContext

/**
 * Contextual information about a task which can be read or mutated during
 * execution. To access the TaskContext for a running task, use:
 * {{{
 *   org.apache.spark.TaskContext.get()
 * }}}
 */
abstract class TaskContext extends Serializable {

Interface definition.
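As the doc comment says, `TaskContext.get()` returns the context of the currently running task. A minimal usage sketch, assuming a `SparkContext` named `sc` is already available (e.g. in `spark-shell`):

```scala
// Read the TaskContext from inside a task and register a completion hook.
val sizes = sc.parallelize(1 to 100, 4).mapPartitions { iter =>
  val ctx = org.apache.spark.TaskContext.get()
  // Runs when this task finishes (successfully or not).
  ctx.addTaskCompletionListener { _: org.apache.spark.TaskContext =>
    println(s"partition ${ctx.partitionId()} finished")
  }
  Iterator(iter.size)
}.collect()
```

Note that `TaskContext.get()` only returns a meaningful value on an executor, inside a running task; on the driver there is no current task.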

TaskContextHelper

/**
 * This class exists to restrict the visibility of TaskContext setters.
 */
private [spark] object TaskContextHelper {

TaskContextImpl

The implementation class; it mostly provides hooks for metrics and task listeners.
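The listener hooks boil down to collecting callbacks and firing them when the task ends. A hedged sketch of that mechanism (class and member names here are illustrative, not the actual `TaskContextImpl` fields):

```scala
import scala.collection.mutable.ArrayBuffer

// Sketch of the completion-hook mechanism in a task context implementation.
class SimpleTaskContextImpl {
  private val onCompleteCallbacks = new ArrayBuffer[() => Unit]

  // Tasks register cleanup work (e.g. closing streams) as they run.
  def addTaskCompletionListener(f: () => Unit): Unit =
    onCompleteCallbacks += f

  // Invoked by the runner when the task finishes; callbacks fire in
  // reverse registration order, so later resources are released first.
  def markTaskCompleted(): Unit =
    onCompleteCallbacks.reverse.foreach(f => f())
}
```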
