Pattern Recognition: Statistical and Neural: Lonnie C. Ludeman
Pattern Recognition:
Statistical and Neural
Lonnie C. Ludeman
Lecture 24
Nov 2, 2005
Lecture 24 Topics
Generalized Linear Discriminant Functions
[Figure: block diagram of a generalized linear discriminant. The input x feeds discriminant functions g1(x), g2(x), ..., gM(x); their outputs are weighted by w1, w2, ..., wM and summed to give g(x).]
Review 1
[Figure: patterns linearly separated in the 3-dimensional space by a separating plane.]
Review 2
Example: decision rule using one nonlinear discriminant function g(x):
if x is in R1, decide C1; if x is in R2, decide C2.
Review 4
Find a generalized linear discriminant
function that separates the classes
Solution:
d(x) = w1 f1(x) + w2 f2(x) + w3 f3(x) + w4 f4(x) + w5 f5(x) + w6 f6(x)
     = w^T f(x)
Review 5
where
Review 6
Decision Boundary in original pattern space
[Figure: training samples from C1 and C2 in the (x1, x2) plane; the curve d(x) = 0 forms the decision boundary between the two classes.]
Review 7
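To make the idea concrete, here is a minimal sketch of evaluating a generalized linear discriminant d(x) = w^T f(x). The quadratic basis functions and the weight values below are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

# Hypothetical basis functions f1..f6: quadratic monomials in (x1, x2).
def f(x):
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x1, x1 * x2, x2 * x2])

# Example weight vector w (illustrative values, not from the lecture).
w = np.array([-4.0, 0.0, 0.0, 1.0, 0.0, 1.0])  # gives d(x) = x1^2 + x2^2 - 4

def d(x):
    """Generalized linear discriminant d(x) = w^T f(x)."""
    return w @ f(x)

# Decision rule: d(x) > 0 puts x in one region, d(x) < 0 in the other.
print(d((3.0, 0.0)))  # 5.0
print(d((1.0, 1.0)))  # -2.0
```

The boundary d(x) = 0 is a circle of radius 2 in the original pattern space, even though d is linear in the feature vector f(x).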
Potential Function Approach: motivated by electromagnetic theory. Samples from C1 act as + charges and samples from C2 act as - charges in the sample space.
Review 8
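A minimal sketch of the potential-function idea: each training sample contributes a charge-like potential at the point x, positive for C1 and negative for C2. The Gaussian potential function and the sample coordinates are illustrative assumptions:

```python
import numpy as np

def potential(x, samples_c1, samples_c2, alpha=1.0):
    """Total potential at x: +exp(-alpha*||x - s||^2) for each C1 sample,
    -exp(-alpha*||x - s||^2) for each C2 sample (Gaussian potential is
    an illustrative choice of potential function)."""
    x = np.asarray(x, dtype=float)
    pos = sum(np.exp(-alpha * np.sum((x - np.asarray(s)) ** 2)) for s in samples_c1)
    neg = sum(np.exp(-alpha * np.sum((x - np.asarray(s)) ** 2)) for s in samples_c2)
    return pos - neg  # decide C1 if positive, C2 if negative

# Hypothetical training samples.
c1 = [[0.0, 0.0], [0.5, 0.0]]
c2 = [[3.0, 3.0], [3.5, 3.0]]
print(potential([0.2, 0.1], c1, c2) > 0)  # True: near the C1 samples
```

Points near the + samples see a net positive potential and are assigned to C1; points near the - samples are assigned to C2.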
Plot of Samples from the two classes
Review 9
Given samples x from two classes C1 and C2:
S1: samples from C1
S2: samples from C2
Review 11
Functional Link Neural Network
Quadratic Functional Link
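A quadratic functional link can be sketched as a fixed feature expansion applied to each pattern before the network sees it. The exact feature set (the original components plus all degree-2 monomials) is a common choice assumed here for illustration:

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_link(x):
    """Expand pattern x into quadratic functional-link features:
    the original components plus all degree-2 monomials x_i * x_j."""
    x = np.asarray(x, dtype=float)
    quad = [x[i] * x[j]
            for i, j in combinations_with_replacement(range(len(x)), 2)]
    return np.concatenate([x, quad])

# A 2-d pattern becomes [x1, x2, x1^2, x1*x2, x2^2].
print(quadratic_link([2.0, 3.0]))  # [2. 3. 4. 6. 9.]
```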
Fourier Series Functional Link
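Similarly, a Fourier-series functional link augments each pattern with sinusoids of its components. The scaling by pi and the number of harmonics below are design choices assumed for illustration:

```python
import numpy as np

def fourier_link(x, harmonics=2):
    """Fourier-series functional link: augment pattern x with sin and cos
    of integer harmonics of each component (scaling and harmonic count
    are illustrative design choices)."""
    x = np.asarray(x, dtype=float)
    feats = [x]
    for k in range(1, harmonics + 1):
        feats.append(np.sin(k * np.pi * x))
        feats.append(np.cos(k * np.pi * x))
    return np.concatenate(feats)

# A 1-d pattern x = 0.5 becomes [x, sin(pi*x), cos(pi*x)].
print(fourier_link([0.5], harmonics=1))
```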
Principal Component Functional Link
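A principal-component functional link can be sketched as projecting each pattern onto the leading principal components of the training set. This minimal version computes the components from the SVD of the centered data:

```python
import numpy as np

def pca_link(X, m):
    """Principal-component functional link: project the patterns in X
    (rows = samples) onto the top-m principal components."""
    Xc = X - X.mean(axis=0)                 # center the training data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:m].T                    # scores on the first m components

# Hypothetical samples: variance concentrated along the x1 axis.
X = np.array([[2.0, 0.0], [0.0, 0.1], [-2.0, 0.0], [0.0, -0.1]])
Z = pca_link(X, 1)
print(Z.ravel())  # projections onto the dominant direction (sign may flip)
```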
Example: Comparison of a Neural Net and a Functional Link Neural Net
Given two pattern classes C1 and C2 with the following four patterns and their desired outputs:
(a) Design an Artificial Neural Network to classify the two patterns given.
(b) Design a Functional Link Artificial
Neural Network to classify the
patterns given.
(c) Compare the Neural Net and
Functional Link Neural Net designs
(a) Solution: Artificial Neural Net Design
After training using the training set and the backpropagation algorithm, the design becomes:
[Figure: trained network, with weight values determined by the neural net.]
(b) Solution: Functional Link
Artificial Neural Net Design
A neural net was trained using the functional
link output patterns as new pattern samples
(c) Comparison of Artificial Neural Net (ANN) and Functional Link Artificial Neural Net (FLANN) Designs on the Training Set
Determine Performance for
Design using Training Set
Test Design
on Testing Set
Could use
(a) Performance Measure ETOT
(b) The Confusion Matrix
(c) Probability of Error
(d) Bayes Risk
(a) Local and global errors: used in the neural net design procedure
Local Measure
Global Measure
(b) Confusion Matrix: Example
[Table: counts of correct classifications and incorrect classifications for each class.]
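A minimal sketch of building a confusion matrix from classification results; the labels below are hypothetical, not the lecture's training set:

```python
import numpy as np

def confusion_matrix(true_labels, predicted_labels, n_classes=2):
    """Row i, column j counts samples of class i classified as class j.
    Diagonal entries are correct classifications, off-diagonal incorrect."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(true_labels, predicted_labels):
        cm[t, p] += 1
    return cm

# Hypothetical results (0 = C1, 1 = C2).
true = [0, 0, 0, 1, 1, 1, 1, 0]
pred = [0, 0, 1, 1, 1, 0, 1, 0]
cm = confusion_matrix(true, pred)
print(cm)  # [[3 1], [1 3]]

# Estimated probability of error = off-diagonal total / number of samples.
print((cm.sum() - np.trace(cm)) / cm.sum())  # 0.25
```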
(c) Probability of Error: Example (estimates of the probabilities of correct classification)
(d) Bayes Risk Estimate
Radial Basis Function (RBF) Artificial Neural Network (a functional link network)
Functional Form of RBF ANN
where
Examples of Nonlinearities
Design Using RBF ANN
Let F(x1, x2, … , xn) represent the function we
wish to approximate.
We wish to minimize the approximation error E by selecting
M, a, b1, b2, ..., bM and z1, z2, ..., zM
Finding the Best Approximation using RBF ANN
Notes:
You can use any minimization procedure you wish.
Training does not use the backpropagation algorithm.
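Consistent with the note above, here is a sketch in which the RBF weights b1, ..., bM are found by linear least squares (no backpropagation) once the centers z_i and width a are fixed. The Gaussian nonlinearity and the target function sin(x) are illustrative assumptions:

```python
import numpy as np

def rbf_design(x_train, y_train, centers, a):
    """Fit RBF weights b by linear least squares, using the Gaussian basis
    exp(-((x - z)/a)^2); centers z and width a are chosen by the designer."""
    Phi = np.exp(-((x_train[:, None] - centers[None, :]) / a) ** 2)
    b, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)
    return b

def rbf_eval(x, centers, a, b):
    """Evaluate the RBF approximation F(x) = sum_i b_i exp(-((x - z_i)/a)^2)."""
    Phi = np.exp(-((np.atleast_1d(x)[:, None] - centers[None, :]) / a) ** 2)
    return Phi @ b

# Approximate F(x) = sin(x) on [0, pi] with M = 5 centers (illustrative).
x = np.linspace(0.0, np.pi, 30)
centers = np.linspace(0.0, np.pi, 5)
b = rbf_design(x, np.sin(x), centers, a=0.8)
residual = np.max(np.abs(rbf_eval(x, centers, a=0.8, b=b) - np.sin(x)))
print(residual)  # small residual of the least-squares fit
```

Because the output is linear in the weights b once the centers and width are fixed, any minimization procedure works; least squares solves it in one step.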
Problems Using Neural Network Designs
Failure to converge
Selection of insufficient structure
Max iterations too small
Lockup occurs
Limit cycles
Advantages of Neural Network Designs
Other problems that can be solved using
Neural Network Designs
System Identification
Functional Approximation
Control Systems
Famous Quotation
“Neural network designs are the second
best way to solve all problems”
The promise is that a neural network can be used to solve all problems; the caveat is that there is always a better way to solve any specific problem.
So what is the best way to
solve a given problem ???
A design that uses and
understands the structure
of the data !!!
Summary Lecture 24
6. Discussed Problems, Advantages,
Disadvantages, and the Promise of
Artificial Neural Network Design
End of Lecture 24