# Implementation of Perceptron Algorithm for NAND Logic Gate with 2-bit Binary Input

Last Updated: 05 May, 2025

A Perceptron is one of the simplest types of neural network models. It makes decisions based on input values: like a basic brain cell, it can "learn" to produce an output of 0 or 1 from given inputs. In this article, we will show how a perceptron can perform the task of a NAND logic gate.

A NAND gate outputs 0 only when both inputs are 1; in all other cases the output is 1. It is the opposite of the AND gate, which is why it is called a NAND gate (Not AND). The NAND truth table is:

| Input 1 | Input 2 | Output |
|---------|---------|--------|
| 0       | 0       | 1      |
| 0       | 1       | 1      |
| 1       | 0       | 1      |
| 1       | 1       | 0      |

## Perceptron for NAND Gate

A perceptron takes binary inputs, multiplies them by weights, adds a bias, and passes the result through an activation function, usually a step function:

\[ y = \Theta(w_1 x_1 + w_2 x_2 + b) \]

where:

- \(x_1, x_2\): inputs
- \(w_1, w_2\): weights for the inputs
- \(b\): bias
- \(\Theta(z)\): step function

Step function:

\[ \Theta(z) = \begin{cases} 1 & \text{if } z \geq 0 \\ 0 & \text{if } z < 0 \end{cases} \]

## Choose Weights and Bias for NAND

We need to find \(w_1\), \(w_2\), and \(b\) such that the perceptron behaves like a NAND gate. Let's use this configuration:

- \(w_1 = -1\)
- \(w_2 = -1\)
- \(b = 1.5\)

A quick check confirms these values work: for input (1, 1), \(z = -1 - 1 + 1.5 = -0.5 < 0\), so the output is 0; for every other input pair, \(z \geq 0.5\), so the output is 1. The inputs \(x_1\) and \(x_2\) go into the perceptron, the weights and bias are used to calculate the weighted sum,
and the step function determines the output.

Let's implement this in Python.

**Step 1: Define the Step Activation Function**

```python
def step_function(z):
    return 1 if z >= 0 else 0
```

**Step 2: Perceptron logic for 2 inputs**

```python
def perceptron(x1, x2, w1, w2, b):
    # Weighted sum of the inputs plus the bias
    z = (x1 * w1) + (x2 * w2) + b
    return step_function(z)
```

**Step 3: Set weights and bias**

```python
w1 = -1
w2 = -1
b = 1.5
```

**Step 4: Test all combinations**

```python
print("NAND Gate Output:")
for x1 in [0, 1]:
    for x2 in [0, 1]:
        output = perceptron(x1, x2, w1, w2, b)
        print(f"NAND({x1}, {x2}) = {output}")
```

Output:

```
NAND Gate Output:
NAND(0, 0) = 1
NAND(0, 1) = 1
NAND(1, 0) = 1
NAND(1, 1) = 0
```

The code correctly predicts the NAND gate output for every input combination: the perceptron's outputs match the expected truth table.
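The weights above were chosen by hand, but a perceptron can also learn suitable weights from the truth table using the perceptron learning rule, \(w \leftarrow w + \eta\,(t - y)\,x\), where \(\eta\) is the learning rate and \(t\) the target output. The sketch below illustrates this; the epoch count, learning rate, and the helper name `train_nand` are my own choices for illustration, not part of the article. Because NAND is linearly separable, the loop converges to a working set of weights.

```python
def step(z):
    return 1 if z >= 0 else 0

def train_nand(epochs=20, lr=0.1):
    # NAND truth table: (inputs) -> target
    data = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            y = step(w1 * x1 + w2 * x2 + b)
            err = target - y
            # Perceptron learning rule: nudge weights toward the target
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

w1, w2, b = train_nand()
print("Learned weights:", w1, w2, b)
for x1 in [0, 1]:
    for x2 in [0, 1]:
        print(f"NAND({x1}, {x2}) = {step(w1 * x1 + w2 * x2 + b)}")
```

The learned values differ from the hand-picked (-1, -1, 1.5), since many weight settings realize the same decision boundary, but they reproduce the same truth table.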
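As an aside, the element-by-element loop used above can also be written in the standard vectorized form \(y = \Theta(X\mathbf{w} + b)\), evaluating all four input pairs at once with NumPy. This is a sketch of the same computation (the function name `nand_perceptron` is mine, not from the article):

```python
import numpy as np

def nand_perceptron(X, w=(-1, -1), b=1.5):
    # Weighted sum for every row of X at once: z = X @ w + b
    z = X @ np.asarray(w) + b
    # Step activation applied element-wise: 1 where z >= 0, else 0
    return (z >= 0).astype(int)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(nand_perceptron(X))  # -> [1 1 1 0]
```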