Implementation of Perceptron Algorithm for NOT Logic Gate

Last Updated: 08 Jun, 2020

In the field of Machine Learning, the Perceptron is a Supervised Learning Algorithm for binary classifiers. The Perceptron Model implements the following function:

\[ \begin{array}{c} \hat{y}=\Theta\left(w_{1} x_{1}+w_{2} x_{2}+\ldots+w_{n} x_{n}+b\right)=\Theta(\mathbf{w} \cdot \mathbf{x}+b) \\ \text{where } \Theta(v)=\left\{\begin{array}{cc} 1 & \text{if } v \geq 0 \\ 0 & \text{otherwise} \end{array}\right. \end{array} \]

For a particular choice of the weight vector $\boldsymbol{w}$ and bias parameter $\boldsymbol{b}$, the model predicts output $\boldsymbol{\hat{y}}$ for the corresponding input vector $\boldsymbol{x}$.

The truth table of the NOT logical function has only a 1-bit binary input (0 or 1), i.e., the input $\boldsymbol{x}$ and the corresponding output $\boldsymbol{y}$:

$\boldsymbol{x}$   $\boldsymbol{y}$
0                  1
1                  0

Now, for the weight $\boldsymbol{w}$ associated with the (scalar) input $\boldsymbol{x}$, the Perceptron Function can be defined as:

\[ \hat{y} = \Theta\left(w x + b\right) \]

For the implementation, the chosen weight parameter is $\boldsymbol{w} = -1$ and the bias parameter is $\boldsymbol{b} = 0.5$. These values satisfy both rows of the truth table: for $x = 0$, $v = 0.5 \geq 0$, so $\hat{y} = 1$; for $x = 1$, $v = -0.5 < 0$, so $\hat{y} = 0$.

Python Implementation:

Python3

# importing Python library
import numpy as np

# define Unit Step Function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0

# design Perceptron Model
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y

# NOT Logic Function
# w = -1, b = 0.5
def NOT_logicFunction(x):
    w = -1
    b = 0.5
    return perceptronModel(x, w, b)

# testing the Perceptron Model
test1 = np.array(1)
test2 = np.array(0)

print("NOT({}) = {}".format(1, NOT_logicFunction(test1)))
print("NOT({}) = {}".format(0, NOT_logicFunction(test2)))

Output:

NOT(1) = 0
NOT(0) = 1

Here, the model's predicted output ($\boldsymbol{\hat{y}}$) for each test input exactly matches the conventional output ($\boldsymbol{y}$) of the NOT logic gate according to the truth table. Hence, it is verified that the perceptron algorithm for the NOT logic gate is correctly implemented.
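The weight and bias above were hand-picked, but since the Perceptron is a Supervised Learning Algorithm, they can also be learned from the truth table itself. Below is a minimal sketch (not part of the original implementation) of the classic perceptron learning rule applied to the NOT gate; the function name trainNOT and the learning-rate/epoch settings are illustrative assumptions.

Python3

# a sketch of learning w and b with the perceptron rule (illustrative,
# not from the original article); unitStep is redefined for self-containment
import numpy as np

def unitStep(v):
    return 1 if v >= 0 else 0

def trainNOT(lr=0.1, epochs=20):
    X = np.array([0, 1])   # 1-bit inputs of the NOT truth table
    Y = np.array([1, 0])   # corresponding target outputs
    w, b = 0.0, 0.0        # start from a zero weight and bias
    for _ in range(epochs):
        for x, y in zip(X, Y):
            y_hat = unitStep(w * x + b)
            # perceptron update: nudge w and b in the direction of the error
            w += lr * (y - y_hat) * x
            b += lr * (y - y_hat)
    return w, b

w, b = trainNOT()
print("learned w = {}, b = {}".format(w, b))
print([unitStep(w * x + b) for x in (0, 1)])  # reproduces the truth table: [1, 0]

With these settings the rule converges to w = -0.1, b = 0.0, a different separator from the hand-picked w = -1, b = 0.5 but one that reproduces the same truth table; any pair with $b \geq 0$ and $w + b < 0$ implements NOT.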