MATLAB Convolution Layers

  • convolution1dLayer: applies sliding convolutional filters to 1-D input data.
  • convolution2dLayer: applies sliding convolutional filters to 2-D input data.
  • convolution3dLayer: applies sliding convolutional filters to 3-D input data.
  • transposedConv2dLayer: upsamples 2-D feature maps to increase their resolution.
  • transposedConv3dLayer: upsamples 3-D feature maps to increase their resolution.
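As a quick illustration, a standard and a transposed 2-D convolution layer can be constructed as follows (the filter sizes and filter counts here are arbitrary examples):

```matlab
% 2-D convolution layer: 16 filters of size 3-by-3, output padded
% so the spatial size matches the input
convLayer = convolution2dLayer(3, 16, 'Padding', 'same');

% Transposed 2-D convolution layer: stride 2 performs 2x upsampling
% of the feature maps
upLayer = transposedConv2dLayer(4, 16, 'Stride', 2, 'Cropping', 'same');
```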

List of Deep Learning Layers

Deep learning (DL) is characterized by the use of neural networks with multiple layers to model and solve complex problems. Each layer in the neural network plays a unique role in converting input data into meaningful and insightful outputs. This article explores the layers used to construct a neural network.

Table of Contents

  • Role of Deep Learning Layers
  • MATLAB Input Layer
  • MATLAB Fully Connected Layers
  • MATLAB Convolution Layers
  • MATLAB Recurrent Layers
  • MATLAB Activation Layers
  • MATLAB Pooling and Unpooling Layers
  • MATLAB Normalization Layer and Dropout Layer
  • MATLAB Output Layers

Role of Deep Learning Layers

A layer in a deep learning model is a fundamental building block of the model's architecture, responsible for processing and transforming input data. Information flows through the layers sequentially: each layer takes input from the preceding layer and passes its transformed output to the next. This cascading process continues through the network until the final layer produces the model's output.
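In MATLAB's Deep Learning Toolbox, this sequential flow is expressed by listing layers in an array; data passes through them in order. A minimal sketch (the input and layer sizes are arbitrary):

```matlab
layers = [
    featureInputLayer(10)        % input enters here (10 features)
    fullyConnectedLayer(32)      % transforms the input
    reluLayer                    % nonlinearity
    fullyConnectedLayer(3)       % final transformation (3 classes)
    softmaxLayer                 % converts scores to probabilities
    ];
```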

MATLAB Input Layer

...

MATLAB Fully Connected Layers

  • fullyConnectedLayer: multiplies the input by a weight matrix and then adds a bias vector.
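For example, a fully connected layer mapping its input to 10 outputs is created with a single size argument (the name is an arbitrary label):

```matlab
% Weights and biases are learned during training
fcLayer = fullyConnectedLayer(10, 'Name', 'fc1');
```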

MATLAB Recurrent Layers

  • lstmLayer: a recurrent neural network (RNN) layer that learns long-term dependencies between time steps in time-series and sequence data.
  • lstmProjectedLayer: an LSTM layer that additionally uses learnable projection weights, reducing the number of learnable parameters while still capturing long-term dependencies.
  • bilstmLayer: a bidirectional LSTM (BiLSTM) layer that learns long-term dependencies in both the forward and backward directions, useful when the network needs context from the entire sequence at each time step.
  • gruLayer: a gated recurrent unit (GRU) layer that learns dependencies between time steps in time-series and sequence data.
  • gruProjectedLayer: a GRU layer that additionally uses learnable projection weights, reducing the number of learnable parameters.
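A sketch of a sequence classifier combining these layers (the feature count, hidden sizes, and class count are arbitrary examples):

```matlab
layers = [
    sequenceInputLayer(12)                  % 12 features per time step
    lstmLayer(100)                          % emits the full output sequence
    bilstmLayer(50, 'OutputMode', 'last')   % keeps only the final time step
    fullyConnectedLayer(5)
    softmaxLayer
    ];
```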

MATLAB Activation Layers

  • reluLayer: applies a threshold operation to each element of the input, setting any value less than zero to zero.
  • leakyReluLayer: applies a threshold operation in which any input value less than zero is multiplied by a fixed scalar.
  • clippedReluLayer: sets any input value below zero to zero and caps any value above the specified ceiling at that ceiling.
  • eluLayer: exponential linear unit (ELU) layer; performs the identity operation on positive inputs and applies an exponential nonlinearity to negative inputs.
  • geluLayer: Gaussian error linear unit (GELU) layer; weights the input by its probability under a Gaussian distribution.
  • tanhLayer: applies the hyperbolic tangent (tanh) function to the layer input.
  • swishLayer: applies the swish function to the layer input.
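Activation layers take no size argument; some accept an optional parameter that shapes the nonlinearity (the values below are arbitrary examples):

```matlab
relu    = reluLayer;
leaky   = leakyReluLayer(0.01);   % scale applied to negative inputs
clipped = clippedReluLayer(6);    % ceiling value
```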

MATLAB Pooling and Unpooling Layers

  • averagePooling1dLayer: downsamples by dividing the input into 1-D pooling regions and computing the average of each region.
  • averagePooling2dLayer: downsamples by dividing the input into rectangular pooling regions and computing the average of each region.
  • averagePooling3dLayer: downsamples by dividing the input into cuboidal pooling regions and computing the average of each region.
  • globalAveragePooling1dLayer: downsamples by outputting the average over the time or spatial dimension of the input.
  • globalAveragePooling2dLayer: downsamples by computing the mean over the height and width dimensions of the input.
  • globalAveragePooling3dLayer: downsamples by computing the mean over the height, width, and depth dimensions of the input.
  • maxPooling1dLayer: downsamples by dividing the input into 1-D pooling regions and computing the maximum of each region.
  • maxUnpooling2dLayer: reverses the pooling performed by a 2-D max pooling layer.
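To pair max pooling with unpooling, the pooling layer must record the indices of its maxima so the unpooling layer can place values back where they came from:

```matlab
% 2-by-2 max pooling with stride 2; extra outputs carry the max indices
pool   = maxPooling2dLayer(2, 'Stride', 2, 'HasUnpoolingOutputs', true);
unpool = maxUnpooling2dLayer;
```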

MATLAB Normalization Layer and Dropout Layer

  • batchNormalizationLayer: normalizes a mini-batch of data across all observations for each channel independently. Place batch normalization layers between convolutional layers and nonlinearities, such as ReLU layers, to speed up training and reduce sensitivity to network initialization.
  • groupNormalizationLayer: normalizes a mini-batch of data across grouped subsets of channels for each observation independently; it is placed in the same positions, and for the same reasons, as batch normalization.
  • layerNormalizationLayer: normalizes a mini-batch of data across all channels for each observation independently. Place layer normalization layers after learnable layers, such as LSTM and fully connected layers, to speed up training of recurrent and multilayer perceptron networks.
  • dropoutLayer: randomly sets input elements to zero with a given probability.
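The recommended placement described above, convolution followed by batch normalization followed by a nonlinearity, looks like this as a layer array (sizes and the dropout probability are arbitrary examples):

```matlab
block = [
    convolution2dLayer(3, 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    dropoutLayer(0.2)    % zeros roughly 20% of elements during training
    ];
```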

MATLAB Output Layers

  • softmaxLayer: applies the softmax function to the input.
  • sigmoidLayer: applies a sigmoid function to the input, constraining the output to the range (0,1).
  • classificationLayer: computes the cross-entropy loss for classification and weighted-classification tasks with mutually exclusive classes.
  • regressionLayer: computes the half-mean-squared-error loss for regression tasks.
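Putting the pieces together, a minimal image classifier ends with a softmax layer and a classification output layer (a sketch; the input size and class count are placeholder examples):

```matlab
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 8, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer    % cross-entropy loss over the 10 classes
    ];
```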