Implementation of Perceptron Algorithm for NOR Logic with 2-bit binary input in Python

Fig: NOR gate

In this article, you’ll learn how to implement the NOR logic with 2-bit binary input using the perceptron algorithm in Python. The steps that we’ll follow will also enable you to easily implement any other logic function using the perceptron algorithm.

Perceptron algorithm for NOR logic

Fig: A perceptron with two inputs

A perceptron can simply be defined as a single-layer, feed-forward neural network. It takes a certain number of inputs (x1 and x2 in this case), processes them using the perceptron algorithm, and finally produces an output y, which can be either 0 or 1. Since y can take only two values, a perceptron can also act as a linear classifier. According to the perceptron algorithm,

y = Wx + b, where Wx = w1x1 + w2x2, W = the weights of the perceptron model and b = the bias
Also, y = 1 if Wx + b > 0 and y = 0 if Wx + b ≤ 0
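
To make this rule concrete, here is a minimal sketch of the thresholding step in NumPy (the helper name step_predict is our own, purely for illustration; the full model appears in the Code section below):

import numpy as np

# Illustrative helper: y = 1 if Wx + b > 0, else y = 0
def step_predict(W, x, b):
    return 1 if np.dot(W, x) + b > 0 else 0

# Example: W = [1, 1], b = 1, x = [0, 1] gives Wx + b = 2 > 0, so y = 1
print(step_predict(np.array([1, 1]), np.array([0, 1]), 1))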

The steps that we’ll use to implement the NOR logic using a perceptron are similar to how a neural network is trained.

  • First, we’ll initialize the weights and the bias of the perceptron.
  • Then the input will be forward propagated through the network and output ‘y’ will be produced.
  • The obtained output will then be compared with the actual result, and the error will be fed back through the network to adjust the weights and the bias of the model, so that the error is reduced.
  • We’ll then repeat the above steps for all of the inputs present (a minimal training-loop sketch follows this list).
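
For concreteness, here is a minimal sketch of such a training loop using the classic perceptron update rule (the learning rate, epoch count, and variable names here are our illustrative choices; the rest of this article derives the weights by hand instead):

import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # all 2-bit inputs
Y = np.array([1, 0, 0, 0])                      # NOR outputs from the truth table

W = np.array([1.0, 1.0])  # initial weights
b = 1.0                   # initial bias
lr = 0.1                  # learning rate

for epoch in range(100):
    for x, y in zip(X, Y):
        y_pred = 1 if np.dot(W, x) + b > 0 else 0
        error = y - y_pred       # 0 when the prediction is already correct
        W = W + lr * error * x   # adjust the weights in proportion to the error
        b = b + lr * error       # adjust the bias likewise

# NOR is linearly separable, so the loop settles well within 100 epochs
print(W, b)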

NOR Logic

A NOR gate produces a high output, i.e. 1, only when both of its inputs are low, i.e. 0. For all other possible input combinations, it produces a low output. The truth table for NOR logic is shown below:

+----+----+---+
| x1 | x2 | y |
+----+----+---+
| 0  | 0  | 1 |
+----+----+---+
| 0  | 1  | 0 |
+----+----+---+
| 1  | 0  | 0 |
+----+----+---+
| 1  | 1  | 0 |
+----+----+---+ 

Perceptron Algorithm

As discussed above, according to the perceptron algorithm, y = w1x1 + w2x2 + b. Here, to begin with, let us assume w1 = 1, w2 = 1 and b = 1. Let the result obtained using the perceptron algorithm be y’ and the actual result be y (given in the truth table).

  • Now, using the first row of the truth table (x1 = 0 and x2 = 0) as our input, we get y’ = 1·0 + 1·0 + 1 = 1, which is the same as y.
  • Using the second row as our input (x1 = 0 and x2 = 1), we get y’ = 1·0 + 1·1 + 1 = 2 ≠ y. To make y’ = y, let w2 = -1. This gives y’ = 1·0 + (-1)·1 + 1 = 0 = y.
  • Again, using the third row as our input yields y’ = 1·1 + (-1)·0 + 1 = 2 ≠ y. To eliminate this error, let w1 also be -1, which gives y’ = (-1)·1 + (-1)·0 + 1 = 0 = y. If you recalculate, you’ll observe that these values of the weights and the bias also satisfy the NOR logic for the two rows above.
  • Finally, the last row of the truth table as input produces y’ = (-1)·1 + (-1)·1 + 1 = -1, and since Wx + b ≤ 0 implies y’ = 0 (according to the perceptron algorithm), we get y’ = y.

Therefore, the model to implement the NOR logic using the perceptron algorithm will be:

y = (-1)·x1 + (-1)·x2 + 1
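
As a quick sanity check of these hand-derived values (the full implementation follows in the Code section):

# Verify w1 = -1, w2 = -1, b = 1 against every row of the NOR truth table
for x1, x2, y in [(0, 0, 1), (0, 1, 0), (1, 0, 0), (1, 1, 0)]:
    y_pred = 1 if (-1) * x1 + (-1) * x2 + 1 > 0 else 0
    assert y_pred == y, f'mismatch for x1={x1}, x2={x2}'
print('All four rows match the NOR truth table')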

Code

Below is our Python code implementing the perceptron algorithm for NOR logic with 2-bit binary input:

# Importing the required libraries
import numpy as np

# Defining the activation function (unit step)
def activation_function(y):
    # Return 1 for a positive weighted sum, 0 otherwise
    return 1 if y > 0 else 0

# W = weights of the perceptron model
W = np.array([-1, -1])
# b = bias of the model
b = 1

# Defining the perceptron algorithm
def perceptron_algorithm(x):
    # y = w1x1 + w2x2 + b
    y = np.dot(W, x) + b
    # y = 1 if Wx+b > 0 else y = 0 
    y = activation_function(y)
    return y

# Input values to verify the NOR logic 
input1 = np.array([0, 0])
input2 = np.array([0, 1])
input3 = np.array([1, 0])
input4 = np.array([1, 1])


# Printing the results
print('NOR Logic: \n')
print(f'x1 = 0 and x2 = 0 => y = {perceptron_algorithm(input1)}')
print(f'x1 = 0 and x2 = 1 => y = {perceptron_algorithm(input2)}')
print(f'x1 = 1 and x2 = 0 => y = {perceptron_algorithm(input3)}')
print(f'x1 = 1 and x2 = 1 => y = {perceptron_algorithm(input4)}')

OUTPUT:

NOR Logic: 

x1 = 0 and x2 = 0 => y = 1
x1 = 0 and x2 = 1 => y = 0
x1 = 1 and x2 = 0 => y = 0
x1 = 1 and x2 = 1 => y = 0

As we can see, the model’s predictions are the same as the actual results. Hence, we have successfully implemented the perceptron algorithm for NOR logic with 2-bit binary input. You can also implement other logic functions by following the same steps and obtaining the correct values of the model weights and bias, as in the NAND example sketched below.
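
For instance, under the same decision rule, the hand-picked values w1 = -1, w2 = -1 and b = 2 implement NAND (a quick check of these values):

# NAND outputs 0 only when both inputs are 1; here w1 = w2 = -1 and b = 2
for x1, x2, y in [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 0)]:
    y_pred = 1 if (-1) * x1 + (-1) * x2 + 2 > 0 else 0
    assert y_pred == y
print('NAND verified')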
