
This layer creates a convolution kernel that is convolved with the layer input to produce a tensor of outputs. If use_bias is True, a bias vector is created and added to the outputs. Finally, if activation is not None, it is applied to the outputs as well.
input_shape=(10, 128) for time-series sequences of 10 time steps with 128 features per step in data_format="channels_last", or input_shape=(None, 128) for variable-length sequences with 128 features per step.
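As a sketch of the points above, assuming the tf.keras API (the filter count and batch size below are illustrative):

```python
# Illustrative sketch, assuming the tf.keras API: a Conv1D first layer
# declared with the input_shape described above (batch axis excluded).
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(
        filters=32,             # number of convolution kernels (output channels)
        kernel_size=3,          # length of each convolution window
        use_bias=True,          # a bias vector is created and added to the outputs
        activation="relu",      # applied last, since it is not None
        input_shape=(10, 128),  # 10 time steps, 128 features per step
    )
])
x = np.random.rand(4, 10, 128).astype("float32")  # (batch, steps, features)
y = model(x)
print(y.shape)  # (4, 8, 32): 10 steps shrink to 8 with the default "valid" padding
```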

When using this layer as the first layer in a model, provide an input_shape argument (tuple of integers or None; does not include the batch axis), e.g. input_shape=(10, 128). This layer creates a convolution kernel that is convolved with the layer input over a single spatial (or temporal) dimension to produce a tensor of outputs. If use_bias is True, a bias vector is created and added to the outputs. Finally, if activation is not None, it is applied to the outputs as well.

BatchNormalization: Batch normalization layer (Ioffe and Szegedy, 2014). Normalize the activations of the previous layer at each batch, i.e. applies a transformation that maintains the mean activation close to 0 and the activation standard deviation close to 1.

AveragePooling3D: Average pooling operation for 3D data (spatial or spatio-temporal).

AveragePooling2D: Average pooling operation for spatial data.

AveragePooling1D: Average pooling for temporal data.
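The BatchNormalization behaviour can be sketched as follows, assuming the tf.keras API (the batch shape and statistics below are illustrative):

```python
# Sketch, assuming the tf.keras API: in training mode, BatchNormalization
# normalizes using the current batch's statistics, pushing the mean toward 0
# and the standard deviation toward 1.
import numpy as np
import tensorflow as tf

bn = tf.keras.layers.BatchNormalization()
x = np.random.normal(loc=5.0, scale=3.0, size=(256, 8)).astype("float32")
y = bn(x, training=True)  # use batch statistics, not the moving averages
print(float(tf.reduce_mean(y)), float(tf.math.reduce_std(y)))  # roughly 0.0 and 1.0
```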

Add: Layer that adds a list of inputs.

AlphaDropout: Alpha Dropout is a Dropout that keeps the mean and variance of its inputs at their original values, in order to ensure the self-normalizing property even after this dropout. Alpha Dropout fits well with Scaled Exponential Linear Units (SELU) by randomly setting activations to the negative saturation value.
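The mean- and variance-preserving property can be checked directly, assuming the tf.keras API (the dropout rate and sample size below are illustrative):

```python
# Sketch, assuming the tf.keras API: unlike standard Dropout, AlphaDropout
# roughly preserves the mean and variance of standardized inputs by setting
# dropped activations to the SELU negative saturation value.
import numpy as np
import tensorflow as tf

x = np.random.normal(size=(100000,)).astype("float32")  # mean ~0, std ~1
drop = tf.keras.layers.AlphaDropout(rate=0.2)
y = drop(x, training=True)
print(float(tf.reduce_mean(y)), float(tf.math.reduce_std(y)))  # both stay near 0 and 1
```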

ActivityRegularization: Layer that applies an update to the cost function based on input activity.

Activation: Applies an activation function to an output.
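A minimal sketch of these two layers, assuming the tf.keras API (the activation choice and penalty weight below are illustrative):

```python
# Sketch, assuming the tf.keras API.
import numpy as np
import tensorflow as tf

# Activation: applies an activation function to an output.
act = tf.keras.layers.Activation("relu")
out = act(np.array([[-1.0, 2.0]], dtype="float32"))
print(out.numpy())  # [[0. 2.]]

# ActivityRegularization: adds a penalty on the layer's input activity to
# the model's losses (here an L1 penalty of 0.01 * sum(|x|)).
reg = tf.keras.layers.ActivityRegularization(l1=0.01)
_ = reg(tf.ones((1, 4)))
print([round(float(l), 4) for l in reg.losses])  # [0.04]
```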
