Unit 9 ANN
INTRODUCTION TO NEURAL NETWORK
Dendrites are structures used for collecting the inputs to a neuron. The neuron collects and sums these inputs, and if the result is greater than its firing threshold the neuron fires; otherwise it is inhibited. (Figure: structure of a biological neuron.)
When a neuron fires, it sends an electrical impulse from its nucleus to its boutons. The boutons can then connect to further neurons via junctions called synapses. Learning takes place by changing the effectiveness of the synapses, so that the influence of one neuron on another changes.
The human brain consists of about one hundred billion
(100,000,000,000) neurons, each with about 1000 synaptic
connections.
Our intelligence depends on the effectiveness of these synaptic
connections.
PERCEPTRON
A Perceptron is an algorithm used for supervised learning of binary classifiers. Binary classifiers decide whether an input, usually represented by a vector of numbers, belongs to a specific class.
PERCEPTRON
The process begins by taking all the input values and multiplying them by their weights. Then, all of these multiplied values are added together to create the weighted sum.
The weighted sum is then applied to the activation function, producing the perceptron's output. The activation function plays the integral role of ensuring the output is mapped onto a required range such as (0, 1) or (-1, 1).
It is important to note that the weight of an input is indicative of the strength of a node. Similarly, the bias value gives the ability to shift the activation function curve up or down.
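As a rough illustration of this computation, the following sketch (plain Python with NumPy; the input values, weights, bias, and the choice of a step activation are made up purely for illustration) forms the weighted sum and passes it through an activation function:

import numpy as np

def step(z):
    # step activation: maps the weighted sum onto the range (0, 1)
    return 1 if z > 0 else 0

# illustrative values only: two inputs, one weight per input, and a bias
x = np.array([1.0, 0.0])           # input values
w = np.array([0.6, 0.6])           # weights (strength of each input)
b = -0.5                           # bias shifts the activation threshold

weighted_sum = np.dot(x, w) + b    # multiply inputs by weights and add them up
output = step(weighted_sum)        # apply the activation function
print(output)                      # prints 1, since 0.6 - 0.5 > 0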
PERCEPTRON
The models of an ANN are specified by three basic entities:
1. The model's synaptic connections.
2. The training/learning rules adopted for adjusting the weights.
3. The activation functions.
Network Architecture
Single Layer Feed-forward Network
There can be a type of network in which the input layer is directly connected to the output layer without any intermediate layers. Such a network is called a single-layer feed-forward network.
Multilayer Feed-forward Network
A multilayer feed-forward network is formed by interconnecting the input layer, one or more intermediate layers, and the output layer. The input layer receives the input of the neural network and buffers the input signal. The output layer generates the output of the network. A layer formed between the input and output layers is called a hidden layer. The hidden layer does not interact with the external environment directly. There can be zero to several hidden layers.
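A minimal sketch of a forward pass through such a network is shown below (plain Python with NumPy; the layer sizes, random weights, and the choice of a sigmoid activation are assumptions made only for illustration):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# illustrative sizes: 3 inputs -> 4 hidden units -> 1 output
W_hidden = rng.normal(size=(3, 4))   # input layer -> hidden layer weights
W_output = rng.normal(size=(4, 1))   # hidden layer -> output layer weights

x = np.array([[1.0, 0.0, 1.0]])      # one input sample, buffered by the input layer
hidden = sigmoid(x @ W_hidden)       # hidden layer: no direct contact with the environment
y = sigmoid(hidden @ W_output)       # output layer generates the network's output
print(y)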
Network Architecture
Feedback Network
When outputs are directed back as inputs to nodes of the same or a preceding layer, the result is a feedback network. When the feedback of the output layer is connected back to the same layer, it is called lateral feedback.
Recurrent Network
A recurrent network is a feedback network with a closed loop. This type of network can be a single-layer or a multilayer network. In a single-layer network with a feedback connection, a processing element's output can be directed back to itself or to another processing element.
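A minimal sketch of such a closed loop is given below (plain Python with NumPy; the sizes, random weights, and sigmoid activation are assumptions for illustration). The output of a single processing layer at one time step is fed back as an extra input at the next step:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)

W_in = rng.normal(size=(2, 3))    # weights on the external inputs
W_fb = rng.normal(size=(3, 3))    # weights on the feedback (previous output)

output = np.zeros((1, 3))         # initial output fed back at the first step
inputs = [np.array([[1.0, 0.0]]),
          np.array([[0.0, 1.0]])]

for x in inputs:
    # each step combines the current input with the previous output (closed loop)
    output = sigmoid(x @ W_in + output @ W_fb)
    print(output)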
Program
# import required library
import tensorflow as tf

# input1, input2 and bias
train_in = [
    [1., 1., 1],
    [1., 0, 1],
    [0, 1., 1],
    [0, 0, 1]]

# output
train_out = [
    [1.],
    [0],
    [0],
    [0]]
# weight variable initialized with random values using random_normal()
w = tf.Variable(tf.random_normal([3, 1], seed=12))

# placeholders for input and output
x = tf.placeholder(tf.float32, [None, 3])
y = tf.placeholder(tf.float32, [None, 1])

# calculate output
output = tf.nn.relu(tf.matmul(x, w))
# Mean Squared Loss or Error
loss = tf.reduce_sum(tf.square(output - y))

# Minimize loss using GradientDescentOptimizer with a learning rate of 0.01
optimizer = tf.train.GradientDescentOptimizer(0.01)
train = optimizer.minimize(loss)

# Initialize all the global variables
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
# Train for a number of epochs (the count here is illustrative),
# computing the output and cost w.r.t. the input vectors
for i in range(1000):
    sess.run(train, {x: train_in, y: train_out})
    cost = sess.run(loss, feed_dict={x: train_in, y: train_out})
    print('Epoch--', i, '--loss--', cost)
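If the loop above is run for enough epochs, the loss should fall toward zero as the perceptron learns the AND function encoded in train_in and train_out, and the learned weights can then be inspected with sess.run(w). Note that the program uses the TensorFlow 1.x API; under TensorFlow 2.x the same calls are available through the tf.compat.v1 module (with eager execution disabled).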