MATLAB Activation Layers
Layer | Description of Layer
---|---
reluLayer | A ReLU layer performs a threshold operation on each element of the input, setting any value less than zero to zero.
leakyReluLayer | A leaky ReLU layer performs a threshold operation, where any input value less than zero is multiplied by a fixed scalar.
clippedReluLayer | A clipped ReLU layer performs a threshold operation, setting any input value below zero to zero and capping any value above the defined ceiling at that ceiling value.
eluLayer | An exponential linear unit (ELU) layer performs the identity operation on positive inputs and applies an exponential nonlinearity to negative inputs.
geluLayer | A Gaussian error linear unit (GELU) layer weights the input by its probability under a Gaussian distribution.
tanhLayer | A hyperbolic tangent (tanh) layer applies the tanh function to the layer inputs.
swishLayer | A swish layer applies the swish function to the layer inputs.
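The sketch below shows how these activation layers might slot into a layer array. It assumes the Deep Learning Toolbox is installed, and the input size, filter counts, and parameter values are illustrative choices, not taken from the article.

```matlab
% Minimal sketch: a small layer array that uses two of the activation
% layers above (sizes and parameters are illustrative assumptions).
layers = [
    imageInputLayer([28 28 1])                    % 28x28 grayscale input
    convolution2dLayer(3, 16, "Padding", "same")
    reluLayer                                     % max(x, 0)
    convolution2dLayer(3, 32, "Padding", "same")
    leakyReluLayer(0.01)                          % scale 0.01 for x < 0
    fullyConnectedLayer(10)
    softmaxLayer
];

% The remaining activation layers, with their tunable parameters:
clipped = clippedReluLayer(6);   % ceiling = 6 (ReLU6-style clipping)
elu     = eluLayer(1);           % alpha = 1 for the exponential branch
gelu    = geluLayer;             % Gaussian-weighted identity
th      = tanhLayer;             % hyperbolic tangent
sw      = swishLayer;            % x .* sigmoid(x)
```

Passing such a layer array to dlnetwork builds the network; swapping one activation layer for another changes only how each element is transformed between the learnable layers.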
List of Deep Learning Layers
Deep learning (DL) is characterized by the use of neural networks with multiple layers to model and solve complex problems. Each layer in a neural network plays a distinct role in transforming input data into meaningful output. This article explores the layers used to construct a neural network in MATLAB.
Table of Contents
- Role of Deep Learning Layers
- MATLAB Input Layer
- MATLAB Fully Connected Layers
- MATLAB Convolution Layers
- MATLAB Recurrent Layers
- MATLAB Activation Layers
- MATLAB Pooling and Unpooling Layers
- MATLAB Normalization Layer and Dropout Layer
- MATLAB Output Layers