PERCEPTRON
A Perceptron is a function that maps its input “x,” multiplied by the learned weight coefficient, to an output value “f(x)”.
f(x) = 1 if w.x + b >= 0
f(x) = 0 otherwise
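The decision rule above can be sketched directly in code (function and variable names are my own, not from the text):

```python
def perceptron(x, w, b):
    """Return 1 if w.x + b >= 0, else 0 — the step-rule decision above."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s >= 0 else 0

# Example: weights (1, 1) and bias -1.5, applied to input (1, 1)
print(perceptron([1, 1], [1, 1], -1.5))
```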
Row 2
• Passing (x1=0 and x2=1), we get: 0+1–1 = 0
• From the Perceptron rule, if Wx+b >= 0, then y' = 1. This row is incorrect, as the output
should be 0 for the AND gate.
• So we want values that will make the combination of x1=0 and x2=1 give y' a value
of 0. If we change b to -1.5, we have: 0+1–1.5 = -0.5
• From the Perceptron rule, this works (for rows 1, 2, and 3).
Row 4
• Passing (x1=1 and x2=1), we get: 1+1–1.5 = 0.5
• Again, from the Perceptron rule, this is still valid.
• Therefore, we can conclude that the model to achieve an AND gate, using the
Perceptron algorithm, is: x1+x2–1.5
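The AND-gate model x1 + x2 – 1.5 can be verified against all four rows of the truth table; a minimal sketch (the helper function name is my own):

```python
def step(x1, x2, w1, w2, b):
    """Perceptron step rule: 1 if w1*x1 + w2*x2 + b >= 0, else 0."""
    return 1 if w1 * x1 + w2 * x2 + b >= 0 else 0

# AND gate: w1 = w2 = 1, b = -1.5
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, step(x1, x2, 1, 1, -1.5))
# prints 0, 0, 0, 1 for the four rows — the AND truth table
```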
OR Gate
From the diagram, the OR gate is 0 only if both inputs are 0.
Row 1
• From w1x1+w2x2+b, initializing w1, w2, as 1 and b as -1, we get: x1(1)+x2(1)-1
• Passing the first row of the OR logic table (x1=0, x2=0), we get: 0+0–1 = -1
• From the Perceptron rule, if Wx+b < 0, then y' = 0. Therefore, this row is correct.
Row 2
• Passing (x1=0 and x2=1), we get: 0+1–1 = 0
• From the Perceptron rule, if Wx+b >= 0, then y' = 1. This row is again correct (for rows 1,
2, and 3).
Row 4
• Passing (x1=1 and x2=1), we get: 1+1–1 = 1
• Again, from the Perceptron rule, this is still valid. Quite easy!
• Therefore, we can conclude that the model to achieve an OR gate, using the Perceptron
algorithm, is: x1+x2–1
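The same check applies to the OR model x1 + x2 – 1; a short sketch (the helper function name is my own):

```python
def step(x1, x2, w1, w2, b):
    """Perceptron step rule: 1 if w1*x1 + w2*x2 + b >= 0, else 0."""
    return 1 if w1 * x1 + w2 * x2 + b >= 0 else 0

# OR gate: w1 = w2 = 1, b = -1
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, step(x1, x2, 1, 1, -1))
# prints 0, 1, 1, 1 for the four rows — the OR truth table
```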
XOR Gate
• The Boolean representation of an XOR gate is: x1x2' + x1'x2
• We first simplify the Boolean expression, adding the zero terms x1'x1 and x2'x2:
• x1'x2 + x1x2' + x1'x1 + x2'x2
• x1(x1' + x2') + x2(x1' + x2')
• (x1 + x2)(x1' + x2')
• (x1 + x2)(x1x2)'
• From the simplified expression, we can say that the XOR gate consists of an OR
gate (x1 + x2), a NAND gate ((x1x2)', modeled as -x1-x2+1), and an AND gate (x1+x2–1.5) combining them.
• This means we will have to combine three perceptrons, with the AND applied to the outputs of the OR and NAND:
• OR (x1+x2–1)
• NAND (-x1-x2+1)
• AND (x1+x2–1.5)
Characteristics of Perceptron
The perceptron model has the following characteristics.
1. Perceptron is a machine learning algorithm for supervised learning of binary classifiers.
2. In Perceptron, the weight coefficient is automatically learned.
3. Initially, weights are multiplied with input features, and the decision is made whether the neuron is fired or not.
4. The activation function applies a step rule to check whether the weighted sum is greater than or equal to zero.
5. The linear decision boundary is drawn, enabling the distinction between the two linearly separable classes +1 and -1.
6. If the added sum of all input values is more than the threshold value, it must have an output signal; otherwise, no output will be shown.
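Characteristic 2 says the weight coefficient is learned automatically. A minimal sketch of the classic perceptron update rule, trained here on the AND truth table (the learning rate, epoch count, and function name are illustrative choices, not from the text):

```python
def train_perceptron(data, lr=0.1, epochs=20):
    """Learn w1, w2, b by nudging weights toward each misclassified target."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for x1, x2, target in data:
            y = 1 if w1 * x1 + w2 * x2 + b >= 0 else 0
            error = target - y
            w1 += lr * error * x1
            w2 += lr * error * x2
            b += lr * error
    return w1, w2, b

and_table = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
w1, w2, b = train_perceptron(and_table)
print(w1, w2, b)  # learned weights separating the AND classes
```

Because the AND classes are linearly separable, the learned weights classify every row correctly, mirroring the hand-derived model x1 + x2 – 1.5 up to scaling.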