Implementation of Perceptron Algorithm for OR Logic with 2-bit binary input in Python
The Perceptron algorithm is considered the simplest type of Artificial Neural Network (ANN), and it is a common starting point for studying ANNs.
A Perceptron can take any number of inputs, but it produces only a binary output.
Let's look at a classic problem that illustrates how the Perceptron algorithm works: implementing the OR logic gate, which maps two binary inputs to an output of 0 or 1.
The computation of our perceptron works as follows:
Start by assigning each input a weight, roughly reflecting how much influence that input has over the output. Multiply each input by its weight, then sum the products.
Another term in the Perceptron is the bias, which is simply a constant added to the weighted sum.
Here, the symbol Σ represents the linear combination of the inputs x with the weights w, plus the bias b.
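The linear combination described above can be sketched in a few lines of NumPy; the weight, input, and bias values here are arbitrary placeholders for illustration, not the OR-gate parameters chosen later:

```python
import numpy as np

w = np.array([0.4, 0.6])  # one weight per input (placeholder values)
x = np.array([1, 0])      # a 2-bit input
b = 0.1                   # bias term (placeholder value)

# Linear combination: w1*x1 + w2*x2 + b
z = np.dot(w, x) + b
print(z)  # 0.5
```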
The Perceptron as a model implements the following function:

    f(x) = 1 if w · x + b >= 0, else 0

That is, we take the weighted sum and apply the activation function f, also called a step function. If the weighted sum is greater than or equal to 0, the activation function outputs 1; otherwise, it outputs 0.
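The step activation on its own is a one-line function; this small sketch shows its behavior on either side of the threshold:

```python
def step(z):
    """Heaviside step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

print(step(-0.5))  # 0
print(step(0.0))   # 1 (the threshold itself counts as "on")
print(step(0.5))   # 1
```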
The Truth Table for OR Logic:

    x1  x2  x1 OR x2
    0   0   0
    0   1   1
    1   0   1
    1   1   1
We take the weight parameters as w1 = 1 and w2 = 1, and the bias parameter as b = -0.5. These values reproduce the truth table: for input (0, 0) the weighted sum is 1·0 + 1·0 - 0.5 = -0.5 < 0, giving output 0, while for (0, 1) it is 0.5 >= 0, giving output 1, and likewise for the remaining rows.
Putting the above discussion into the function OR_perceptron() below:
    import numpy as np

    weight = np.array([1, 1])
    bias = -0.5
    inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

    def OR_perceptron(x, weight, bias):
        # Weighted sum plus bias, followed by the step activation
        fx = np.dot(weight, x) + bias
        if fx >= 0:
            return x, 1
        else:
            return x, 0

    for x in inputs:
        print(OR_perceptron(x, weight, bias))
    (array([0, 0]), 0)
    (array([0, 1]), 1)
    (array([1, 0]), 1)
    (array([1, 1]), 1)
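The same perceptron structure can implement other logic gates by changing only the parameters; for example, lowering the bias to -1.5 turns OR into AND, since the weighted sum then reaches 0 only when both inputs are 1. This is a sketch under that assumption, with AND_perceptron a hypothetical helper mirroring OR_perceptron:

```python
import numpy as np

weight = np.array([1, 1])
bias = -1.5  # lowered from -0.5: now both inputs must be 1 to fire
inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

def AND_perceptron(x, weight, bias):
    # Same weighted sum + step activation as OR, different bias
    fx = np.dot(weight, x) + bias
    return 1 if fx >= 0 else 0

for x in inputs:
    print(x, AND_perceptron(x, weight, bias))
```

Only AND and OR (and their variants) are reachable this way; a single perceptron cannot represent XOR, because XOR is not linearly separable.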