Custom layers in TensorFlow

In this tutorial, we will learn what custom layers are and how to implement them in TensorFlow. When writing machine learning code, you can either build your model from layers that already exist or write custom layers tailored to your problem. Some of the pre-existing layers are Dense, Conv2D and LSTM. Now we will see how to implement custom layers in TensorFlow.
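
For instance, a pre-built Dense layer can be used directly, with no custom code (a minimal sketch; the layer sizes here are arbitrary):

import tensorflow as tf

# A ready-made fully connected layer with 10 output units.
layer = tf.keras.layers.Dense(10, activation='relu')

# Calling the layer on a batch of 5 examples with 4 features each
# creates its weights and runs a forward pass.
output = layer(tf.zeros([5, 4]))
print(output.shape)  # (5, 10)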

IMPLEMENTATION

The best way to create your own layer is to extend the tf.keras.layers.Layer class and implement the __init__, build and call methods. The __init__ method initializes the layer's parameters and configuration. The build method receives the shape of the input tensor and creates the layer's weights (it can also perform any remaining initialization). The call method performs the forward propagation.

import tensorflow as tf

class MyDenseLayer(tf.keras.layers.Layer):
  def __init__(self, num_outputs):
    # __init__: store the layer's configuration.
    super(MyDenseLayer, self).__init__()
    self.num_outputs = num_outputs

  def build(self, input_shape):
    # build: create the weights once the input shape is known.
    self.kernel = self.add_weight(name="kernel",
                                  shape=[int(input_shape[-1]),
                                         self.num_outputs])

  def call(self, inputs):
    # call: forward propagation, a matrix multiplication with the kernel.
    return tf.matmul(inputs, self.kernel)

layer = MyDenseLayer(10)
_ = layer(tf.zeros([5, 5]))
print([var.name for var in layer.trainable_variables])
['my_dense_layer/kernel:0']
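
Once defined, the custom layer behaves like any built-in Keras layer, so it can be dropped into a model (a minimal usage sketch; the surrounding Dense and Softmax layers and their sizes are only illustrative):

# A small model that mixes built-in layers with the custom layer above.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(5,)),
    tf.keras.layers.Dense(32, activation='relu'),
    MyDenseLayer(10),                 # the custom layer defined above
    tf.keras.layers.Softmax(),
])
model.summary()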


This custom-made dense layer takes a matrix as input and multiplies it by the layer's kernel. Here we have passed a matrix of all zeros, so the output is all zeros as well. In the same way, we can implement large and complex layers according to our needs, such as a residual network or a GAN, using custom layers in TensorFlow.
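
As a rough sketch of the residual-network idea, the same subclassing pattern can nest other layers inside __init__ and add a skip connection in call (the ResidualBlock class and its sizes below are illustrative, not part of the original example):

class ResidualBlock(tf.keras.layers.Layer):
  def __init__(self, units):
    super(ResidualBlock, self).__init__()
    # Sub-layers created in __init__ are tracked automatically.
    self.dense1 = tf.keras.layers.Dense(units, activation='relu')
    self.dense2 = tf.keras.layers.Dense(units)

  def call(self, inputs):
    x = self.dense1(inputs)
    x = self.dense2(x)
    # Skip connection: add the block's input back to its output.
    return tf.nn.relu(x + inputs)

block = ResidualBlock(8)
print(block(tf.zeros([2, 8])).shape)  # (2, 8)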
