Face Detection and Recognition 2019
An Introduction
Face Recognition
• Face is the most common biometric used by humans
• Applications range from static mug-shot verification to dynamic, uncontrolled face identification against a cluttered background
• Challenges:
• automatically locate the face
• recognize the face from a general viewpoint under different illumination conditions, facial expressions, and aging effects
Authentication vs Identification
• Face Authentication/Verification (1:1 matching)
• Face Identification/Recognition (1:N matching)
• Co-variance
…
Eigenfaces: representing faces
• These basis faces can be differently weighted to represent any
face
• So we can use different vectors of weights to represent different
faces
(figure: a face expressed as a weighted sum of basis faces, with example weights −8029, −1183, 2900, −2088, 1751, −4336, 1445, −669, 4238, −4221, 6193, 10549)
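As a minimal sketch of this idea, the snippet below rebuilds a face from a weight vector. The names `basis` and `mean_face` are assumptions for illustration (random stand-ins, not real eigenfaces):

```python
import numpy as np

# A sketch of "face = weighted sum of basis faces". `basis` and
# `mean_face` are illustrative stand-ins, not real eigenfaces.
d, k = 10304, 12                     # e.g. 112x92-pixel images, 12 basis faces
rng = np.random.default_rng(0)
basis = rng.standard_normal((d, k))  # stand-in for real eigenface columns
mean_face = rng.standard_normal(d)   # stand-in for the average face

# The example weight vector from the slide above.
weights = np.array([-8029, -1183, 2900, -2088, 1751, -4336,
                    1445, -669, 4238, -4221, 6193, 10549], dtype=float)

# Reconstruction: mean face plus the weighted combination of basis faces.
face = mean_face + basis @ weights
print(face.shape)                    # (10304,) -- one flattened image
```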
Learning Eigenfaces
Q: How do we pick the set of basis faces?
A: We take a set of real training faces
…
❖ Then we find (learn) a set of basis faces which best represent the differences between them
❖ We’ll use a statistical criterion for measuring this notion of “best representation of the differences between the training faces”
❖ We can then store each face as a set of weights for those basis faces
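A minimal PCA sketch of this learning step is below; `train_faces` is an assumed name for an (n_faces, n_pixels) array of flattened images, with random data standing in for a real training set:

```python
import numpy as np

# A minimal PCA sketch for learning eigenfaces; `train_faces` is an
# assumed name, and random data stands in for real images here.
rng = np.random.default_rng(0)
train_faces = rng.standard_normal((100, 10304))

mean_face = train_faces.mean(axis=0)
centered = train_faces - mean_face       # differences between the faces

# SVD of the centered data: the rows of vt are the principal directions
# (the eigenfaces), ordered by the variance they capture.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 12
eigenfaces = vt[:k]                      # (k, n_pixels) basis faces

# Each training face is then stored as k weights: its projection onto the basis.
weights = centered @ eigenfaces.T        # (n_faces, k)
print(weights.shape)                     # (100, 12)
```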
Using Eigenfaces: recognition & reconstruction
Av = \lambda v

where A is a matrix, v is a vector (the eigenvector), and \lambda is a scalar (called the eigenvalue)

e.g. A = \begin{pmatrix} 2 & 3 \\ 2 & 1 \end{pmatrix}; one eigenvector of A is v = \begin{pmatrix} 3 \\ 2 \end{pmatrix}, since

\begin{pmatrix} 2 & 3 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} 3 \\ 2 \end{pmatrix} = \begin{pmatrix} 12 \\ 8 \end{pmatrix} = 4 \begin{pmatrix} 3 \\ 2 \end{pmatrix}

so for this eigenvector of this matrix the eigenvalue is 4
(figure: Av drawn as a scaled copy of v, pointing in the same direction)
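A quick numerical check of this worked example, as a sketch using NumPy's standard eigendecomposition:

```python
import numpy as np

# Check the worked example: A v = 4 v for v = (3, 2).
A = np.array([[2.0, 3.0],
              [2.0, 1.0]])
v = np.array([3.0, 2.0])
print(A @ v)                  # [12.  8.], i.e. 4 * v

# np.linalg.eig returns all eigenvalues and (column) eigenvectors;
# the eigenvalues of this A are 4 and -1.
vals, vecs = np.linalg.eig(A)
print(vals)
print(vecs[:, vals.argmax()]) # unit-length vector proportional to (3, 2)
```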
Eigenvectors
• We can think of matrices as performing transformations on vectors (e.g. rotations, reflections)
• We can think of the eigenvectors of a matrix as being special vectors (for that matrix) that are only scaled by that matrix
• Different matrices have different eigenvectors
• Only square matrices have eigenvectors
• Not all square matrices have real eigenvectors (e.g. a 2D rotation matrix has none)
• An n by n matrix has at most n distinct eigenvalues, and at most n linearly independent eigenvectors
• For a symmetric matrix (such as a covariance matrix), eigenvectors with distinct eigenvalues are orthogonal (ie perpendicular)
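A small sketch checking the symmetric case numerically (the matrix S here is just an illustrative example):

```python
import numpy as np

# For a symmetric matrix, eigenvectors with distinct eigenvalues are
# orthogonal. S is an arbitrary symmetric example matrix.
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals, vecs = np.linalg.eigh(S)   # eigh: NumPy's solver for the symmetric case
print(vecs[:, 0] @ vecs[:, 1])   # ~0.0, so the two eigenvectors are orthogonal
```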
Covariance
(figure: scatter plot of x1 against x2, with a vector drawn along the trend of the points)
• Here I have two variables x1 and x2
• They co-vary (x2 tends to change in roughly the same direction as x1)
• The vector drawn in the plot expresses the direction of that correlation
Covariance
• The covariances can be expressed as a matrix (rows and columns indexed by x1, x2):

C = \begin{pmatrix} .617 & .615 \\ .615 & .717 \end{pmatrix}

• The diagonal elements are the variances, e.g. Var(x1)
• The covariance of two variables is:

\mathrm{cov}(x_1, x_2) = \frac{\sum_{i=1}^{n} (x_{1i} - \bar{x}_1)(x_{2i} - \bar{x}_2)}{n - 1}
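A minimal sketch computing this from data. The particular x1, x2 values below are an assumption, taken from the classic PCA-tutorial example that these matrix entries appear to come from:

```python
import numpy as np

# Sample x1, x2 data (assumed; chosen to reproduce the matrix above).
x1 = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])
x2 = np.array([2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9])

# Covariance with the n-1 denominator, exactly as in the formula above.
n = len(x1)
cov_12 = ((x1 - x1.mean()) * (x2 - x2.mean())).sum() / (n - 1)
print(cov_12)           # ~0.615

# np.cov builds the full matrix: variances on the diagonal,
# covariances off it.
print(np.cov(x1, x2))   # ~[[.617, .615], [.615, .717]]
```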
Eigenvectors of the covariance matrix
• The covariance matrix has eigenvectors:

v_1 = \begin{pmatrix} -.735 \\ .678 \end{pmatrix} \qquad v_2 = \begin{pmatrix} .678 \\ .735 \end{pmatrix}

with eigenvalues \lambda_1 = 0.049 and \lambda_2 = 1.284
(figure: the two eigenvectors drawn over the x1–x2 scatter plot)
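A quick check of these numbers (a sketch; note that eigh returns eigenvalues in ascending order and eigenvector signs are arbitrary):

```python
import numpy as np

# Check the eigendecomposition of the covariance matrix from the slides.
C = np.array([[0.617, 0.615],
              [0.615, 0.717]])

# eigh handles symmetric matrices; eigenvalues come back ascending.
vals, vecs = np.linalg.eigh(C)
print(vals)         # ~[0.05, 1.284]
print(vecs[:, 0])   # proportional (up to sign) to (-.735, .678)
print(vecs[:, 1])   # proportional (up to sign) to (.678, .735)
```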
• For recognition, the distance between two faces is the distance between their weight vectors:

d(w_1, w_2) = \left( \sum_{i=1}^{n} (w_{1i} - w_{2i})^2 \right)^{1/2}
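A minimal nearest-neighbour recognition sketch built on this distance; `gallery` (the stored weight vectors) and `probe` are illustrative names, not from the slides:

```python
import numpy as np

# Nearest-neighbour recognition in weight space: find the stored face
# whose weight vector is closest to the probe's weight vector.
rng = np.random.default_rng(0)
gallery = rng.standard_normal((100, 12))              # 100 known faces, 12 weights each
probe = gallery[42] + 0.01 * rng.standard_normal(12)  # a noisy copy of face 42

# Euclidean distance from the probe's weights to every stored face.
dists = np.linalg.norm(gallery - probe, axis=1)
print(dists.argmin())   # 42: the closest stored face is the match
```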