Week 10 - Neural Network

These lecture slides introduce machine learning with neural networks: the inspiration neural networks take from biological nervous systems, how they process information in parallel like the brain, the basic perceptron model of a neuron, how a perceptron is trained with an online learning process, and how, like people, neural networks learn from examples.

MACHINE LEARNING
Dr. Saeed Ur Rehman
Department of Computer Science,
COMSATS University Islamabad, Wah Campus
Lecture Slides for Introduction to Machine Learning
CHAPTER 19: Design and Analysis of ML Algorithms
Dr. Saeed Ur Rehman
Courtesy: ETHEM ALPAYDIN © The MIT Press



Summary: Linear Discrimination
 Linear Discrimination
 Generalizing the Linear Model
 Geometry of the Linear Discriminant
    Two Classes
    Multiple Classes
 Pairwise Separation
 Parametric Discrimination Revisited
 Gradient Descent



Outline
 Introduction
 Understanding the Brain
 Neural Networks as a Paradigm for Parallel Processing
 The Perceptron
 Training a Perceptron
 Learning Boolean Functions



Outline (Contd.)
 Multilayer Perceptron
 MLP as a Universal Approximator
 Backpropagation Algorithm
 Nonlinear Regression
 Two-Class Discrimination
 Multiclass Discrimination
 Multiple Hidden Layers



Neural Networks
 Neural network: an information-processing paradigm inspired by biological nervous systems, such as our brain
 Structure: a large number of highly interconnected processing elements (neurons) working together
 Like people, they learn from experience (by example)



Neural Networks
• Neural networks are configured for a specific application, such as pattern recognition or data classification, through a learning process
• In a biological system, learning involves adjustments to the synaptic connections between neurons
The Parts of a Neuron



How it Works
Each neuron has a number of small branching fibers called dendrites and a single long fiber, the axon.

How it Works
The axon eventually splits and ends in a number of synapses, which connect the axon to the dendrites of other neurons.

How it Works
Communication between neurons occurs along these paths. When the electric potential in a neuron rises above a threshold, the neuron activates.

How it Works
The neuron sends the electrical impulse down the axon to the synapses.

How it Works
A synapse can either add to or subtract from the electrical potential.

How it Works
The pulse then enters the connected neuron’s dendrites, and the process begins again.

Neural Networks as a Paradigm for Parallel Processing
There are mainly two paradigms for parallel processing, with a third envisioned for the future:
 Single Instruction Multiple Data (SIMD) machines
 Multiple Instruction Multiple Data (MIMD) machines
 Neural Instruction Multiple Data (NIMD) machines [future]
Artificial neural networks are a way to make use of the parallel hardware we can build with current technology and, thanks to learning, they need not be programmed.



The Perceptron

The perceptron is the basic processing element of a neural network.



Threshold Logic Unit (TLU)
Inputs x1, ..., xn arrive on connections with weights w1, ..., wn at a single output unit.
Activation: a = Σ_{i=1..n} w_i x_i
Output: y = 1 if a ≥ θ, and y = 0 if a < θ
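A minimal sketch of a TLU in code (Python with NumPy; the AND weights and threshold below are illustrative assumptions, not values from the slides):

    import numpy as np

    def tlu(x, w, theta):
        # Threshold logic unit: output 1 if the weighted sum reaches theta, else 0
        a = np.dot(w, x)               # activation a = sum_i w_i * x_i
        return 1 if a >= theta else 0

    # Example: a TLU that computes the Boolean AND of two binary inputs
    w = np.array([1.0, 1.0])           # assumed weights
    theta = 1.5                        # assumed threshold
    print(tlu(np.array([1, 1]), w, theta))   # -> 1
    print(tlu(np.array([1, 0]), w, theta))   # -> 0

Since a ≥ θ can be rewritten as a - θ ≥ 0, the threshold can equivalently be folded into the weights as a bias term w0 = -θ with a constant input x0 = 1.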



Activation Functions
[Plots of output y against activation a for four common activation functions: threshold, linear, piece-wise linear, and sigmoid.]
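As a sketch of what these curves compute (Python with NumPy; the specific threshold and the clipping range of the piece-wise linear function are assumptions, since the slide only shows the plots):

    import numpy as np

    def threshold(a, theta=0.0):
        # hard step: 1 once the activation reaches theta
        return np.where(a >= theta, 1.0, 0.0)

    def linear(a):
        # identity: output equals the activation
        return a

    def piecewise_linear(a):
        # linear in the middle, saturating at 0 and 1
        return np.clip(a + 0.5, 0.0, 1.0)

    def sigmoid(a):
        # smooth, differentiable squashing to (0, 1)
        return 1.0 / (1.0 + np.exp(-a))

    a = np.linspace(-3, 3, 7)
    for f in (threshold, linear, piecewise_linear, sigmoid):
        print(f.__name__, np.round(f(a), 2))

The sigmoid is usually preferred for training multilayer networks because, unlike the hard threshold, it is differentiable.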



Example
https://towardsdatascience.com/introduction-to-neural-networks-advantages-and-applications-96851bd1a207


Training a Perceptron
The perceptron defines a hyperplane, and the neural network perceptron is just a way of implementing the hyperplane.
We generally use online learning, where we are not given the whole sample but see instances one at a time.
Such an approach is interesting for a number of reasons:
 It saves us the cost of storing the training sample
 The problem may be changing in time
 There may be physical changes in the system
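A minimal sketch of this online training loop (Python with NumPy; the learning factor, epoch count, and the Boolean AND data are illustrative assumptions, not taken from the slides). Weights start small and random and are nudged only when the current instance is misclassified:

    import numpy as np

    def train_perceptron_online(X, r, eta=0.1, epochs=20, rng=None):
        # Online perceptron training: update the weights one instance at a time.
        # X is (N, d), r holds desired outputs in {0, 1}; w[0] is the bias (x0 = 1).
        rng = np.random.default_rng(0) if rng is None else rng
        N, d = X.shape
        w = rng.uniform(-0.01, 0.01, size=d + 1)   # small random initial weights
        Xb = np.hstack([np.ones((N, 1)), X])       # prepend the bias input x0 = 1
        for _ in range(epochs):
            for x_t, r_t in zip(Xb, r):            # one instance at a time
                y_t = 1.0 if w @ x_t >= 0 else 0.0 # thresholded output
                w += eta * (r_t - y_t) * x_t       # no change when y_t == r_t
        return w

    # Example: learning the Boolean AND function
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    r = np.array([0, 0, 0, 1], dtype=float)
    w = train_perceptron_online(X, r)
    print(w, [1.0 if w @ np.r_[1.0, x] >= 0 else 0.0 for x in X])

Because each update depends only on the current instance, nothing needs to be stored, and the weights can keep tracking a problem that drifts over time.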



Online Learning
In online learning, we do not write the error function over the whole sample but over individual instances.
Starting from random initial weights, at each iteration we adjust the parameters a little bit to minimize the error, without forgetting what we have previously learned.
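In symbols, one common per-instance formulation (standard notation, e.g. Alpaydin; the symbols below are an illustrative sketch, not copied from the slides) for a linear output on instance (x^t, r^t):

    y^t = Σ_j w_j x_j^t + w_0
    E^t(w | x^t, r^t) = (1/2) (r^t - y^t)^2
    Δw_j = η (r^t - y^t) x_j^t          (η is a small learning factor)

Each instance thus contributes one small step against the gradient of its own error; accumulated over many instances, these steps approximately minimize the error over the whole sample.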



Summary
 Neural network: an information-processing paradigm inspired by biological nervous systems, such as our brain
 Working of the brain
 Neural networks as a parallel architecture
 Learning a perceptron



