Keras code reading: the Activation layer

What Activation does

It applies a function (the so-called activation function) to its input.
For example:

from keras.layers.core import Activation, Dense

model.add(Dense(64))
model.add(Activation('tanh'))

The example above applies the tanh function.

class Activation(Layer):
    '''Applies an activation function to an output.

    # Arguments
        activation: name of activation function to use
            (see: [activations](../activations.md)),
            or alternatively, a Theano or TensorFlow operation.

    # Input shape
        Arbitrary. Use the keyword argument `input_shape`
        (tuple of integers, does not include the samples axis)
        when using this layer as the first layer in a model.

    # Output shape
        Same shape as input.
    '''
    def __init__(self, activation, **kwargs):
        self.supports_masking = True
        self.activation = activations.get(activation)
        super(Activation, self).__init__(**kwargs)

    def call(self, x, mask=None):
        return self.activation(x)

    def get_config(self):
        config = {'activation': self.activation.__name__}
        base_config = super(Activation, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
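As the docstring notes, activation can also be a Theano or TensorFlow operation rather than a name. A minimal sketch of both styles (not from the original post; written against the Keras 1.x API used here, with arbitrary layer sizes):

from keras.models import Sequential
from keras.layers.core import Dense, Activation
from keras import backend as K

model = Sequential()
model.add(Dense(64, input_dim=100))
model.add(Activation('tanh'))      # resolved by name via activations.get
model.add(Dense(10))
model.add(Activation(K.softmax))   # a backend op passed in directly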

Code analysis

The constructor's activation argument is typically a string. activations.get(activation) resolves that string into a function.

In call, this function is applied to the layer's input x.
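A quick sketch of what these two steps amount to (a hypothetical session, assuming the same Keras version):

from keras import activations

fn = activations.get('tanh')   # the string is looked up in activations.py
print(fn.__name__)             # -> 'tanh'
# call() then simply does: output = fn(x), where x is the layer's input tensor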

So where does this activations module live?
The code above contains

from .. import activations, initializations, regularizers, constraints

This tells us activations is a module. Following the relative import one level up leads to activations.py at the top level of the keras package.
It contains the following functions:

from __future__ import absolute_import
from . import backend as K


def softmax(x):
    ndim = K.ndim(x)
    if ndim == 2:
        return K.softmax(x)
    elif ndim == 3:
        e = K.exp(x - K.max(x, axis=-1, keepdims=True))
        s = K.sum(e, axis=-1, keepdims=True)
        return e / s
    else:
        raise Exception('Cannot apply softmax to a tensor that is not 2D or 3D. ' +
                        'Here, ndim=' + str(ndim))


def softplus(x):
    return K.softplus(x)


def relu(x, alpha=0., max_value=None):
    return K.relu(x, alpha=alpha, max_value=max_value)


def tanh(x):
    return K.tanh(x)


def sigmoid(x):
    return K.sigmoid(x)


def hard_sigmoid(x):
    return K.hard_sigmoid(x)


def linear(x):
    '''
    The function returns the variable that is passed in, so all types work.
    '''
    return x


from .utils.generic_utils import get_from_module


def get(identifier):
    return get_from_module(identifier, globals(), 'activation function')
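The only non-trivial branch is the 3D case of softmax, where the backend softmax cannot be used directly and the normalization is done by hand along the last axis. A pure-NumPy sketch of that branch (an illustration, not Keras code):

import numpy as np

x = np.random.randn(2, 5, 10)                  # (samples, timesteps, features)
e = np.exp(x - x.max(axis=-1, keepdims=True))  # subtract the max for numerical stability
s = e / e.sum(axis=-1, keepdims=True)          # normalize along the last axis
print(s.sum(axis=-1))                          # every (sample, timestep) slice sums to 1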

The get function retrieves the corresponding function via get_from_module in utils/generic_utils.py, which amounts to a kind of reflection mechanism.

def get_from_module(identifier, module_params, module_name,
                    instantiate=False, kwargs=None):
    if isinstance(identifier, six.string_types):
        res = module_params.get(identifier)
        if not res:
            raise Exception('Invalid ' + str(module_name) + ': ' +
                            str(identifier))
        if instantiate and not kwargs:
            return res()
        elif instantiate and kwargs:
            return res(**kwargs)
        else:
            return res
    elif type(identifier) is dict:
        name = identifier.pop('name')
        res = module_params.get(name)
        if res:
            return res(**identifier)
        else:
            raise Exception('Invalid ' + str(module_name) + ': ' +
                            str(identifier))
    return identifier
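Stripped of the Keras-specific details, the same lookup pattern can be sketched in a few lines (a standalone illustration, not Keras code): the module keeps its functions in globals(), and get resolves a name against that dict, passing anything that is not a string straight through.

import math

def tanh(x):
    return math.tanh(x)

def linear(x):
    return x

def get(identifier):
    # same idea as get_from_module: look the name up in globals()
    if isinstance(identifier, str):
        fn = globals().get(identifier)
        if fn is None:
            raise Exception('Invalid activation function: ' + str(identifier))
        return fn
    return identifier  # callables pass through unchanged

print(get('tanh')(0.0))  # 0.0 -- resolved by name, then called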