Image Compression Fundamentals
Reference:
Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition
Overview
Introduction
Fundamentals
Coding Redundancy
Interpixel Redundancy
Psychovisual Redundancy
Fidelity Criteria
Image Compression Models
Source Encoder and Decoder
Channel Encoder and Decoder
Elements of Information Theory
Measuring Information
The Information Channel
Fundamental Coding Theorems
Noiseless Coding Theorem
Noisy Coding Theorem
Source Coding Theorem
Bit Allocation
Wavelet Coding
Wavelet Selection
Quantizer Design
JPEG 2000
The mean-square signal-to-noise ratio of the decompressed output \hat{f}(x, y):

\mathrm{SNR}_{ms} = \frac{\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} \hat{f}(x, y)^2}{\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} \left[ \hat{f}(x, y) - f(x, y) \right]^2}
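A direct sketch of this measure (assuming NumPy; snr_ms is a hypothetical helper name, not from the slides):

import numpy as np

def snr_ms(f, f_hat):
    # Mean-square signal-to-noise ratio of the reconstruction f_hat.
    f = np.asarray(f, dtype=float)
    f_hat = np.asarray(f_hat, dtype=float)
    return (f_hat ** 2).sum() / ((f_hat - f) ** 2).sum()

f = np.array([[100.0, 102.0],
              [ 98.0, 101.0]])
f_hat = f + np.array([[1.0, -1.0],
                      [0.0,  1.0]])      # toy reconstruction error
print(snr_ms(f, f_hat))                  # large value: f_hat is close to f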
Mapper
Transforms the input data into a format designed to reduce interpixel redundancies in the input image.
Generally a reversible process.
May or may not directly reduce the amount of data required to represent the image.
Examples
Run-length coding (directly results in data compression; see the sketch below)
Transform coding
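A minimal run-length coding sketch (not from the slides; rle_encode and rle_decode are hypothetical helper names), showing how runs of identical pixel values collapse into (value, count) pairs:

def rle_encode(pixels):
    # Encode a flat sequence of pixel values as (value, count) pairs.
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1              # extend the current run
        else:
            runs.append([p, 1])           # start a new run
    return [(v, n) for v, n in runs]

def rle_decode(runs):
    # Invert rle_encode: expand each (value, count) pair back to pixels.
    out = []
    for v, n in runs:
        out.extend([v] * n)
    return out

row = [255, 255, 255, 0, 0, 255]
assert rle_encode(row) == [(255, 3), (0, 2), (255, 1)]
assert rle_decode(rle_encode(row)) == row

As the slide notes, the mapping is reversible: decoding reproduces the input exactly.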
Symbol decoder
Inverse Mapper
Example
Hamming Code – appends enough bits to the data being encoded to ensure that any two valid codewords differ by a minimum number of bits.
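A sketch of the idea for the (7,4) Hamming code (an illustration of the usual construction, not code from the slides): three parity bits protect four data bits, any two valid codewords differ in at least three positions, so a single-bit error can be located and corrected.

def hamming74_encode(d1, d2, d3, d4):
    # Parity bits cover the standard position sets of the (7,4) code.
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    # Recompute the parity checks; the syndrome is the 1-based error position.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1             # flip the erroneous bit
    return [c[2], c[4], c[5], c[6]]      # recovered data bits

word = hamming74_encode(1, 0, 1, 1)
word[4] ^= 1                             # inject a single-bit channel error
assert hamming74_correct(word) == [1, 0, 1, 1]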
Ref: http://en.wikipedia.org/wiki/Probability
The source symbol probabilities sum to one:

\sum_{j=1}^{J} P(a_j) = 1
The finite ensemble (B, v), where v = [P(b_1), P(b_2), \ldots, P(b_K)]^T, completely describes the channel output and thus the information received by the user.
The channel itself is characterized by the matrix of forward transition probabilities:

Q = \begin{bmatrix}
P(b_1 \mid a_1) & P(b_1 \mid a_2) & \cdots & P(b_1 \mid a_J) \\
P(b_2 \mid a_1) & P(b_2 \mid a_2) & \cdots & P(b_2 \mid a_J) \\
\vdots & \vdots & \ddots & \vdots \\
P(b_K \mid a_1) & P(b_K \mid a_2) & \cdots & P(b_K \mid a_J)
\end{bmatrix}
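The output distribution then follows from the source distribution by matrix multiplication, v = Qz (a relation given in Gonzalez and Woods; the probabilities below are made up for illustration):

import numpy as np

# Rows index outputs b_k, columns index inputs a_j.
Q = np.array([[0.9, 0.1],     # P(b1|a1), P(b1|a2)
              [0.1, 0.9]])    # P(b2|a1), P(b2|a2)
z = np.array([0.6, 0.4])      # P(a1), P(a2)

v = Q @ z                     # output probabilities P(b_k)
print(v)                      # [0.58 0.42]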
Conditional probability:

P(a_j \mid b_k) = \frac{P(a_j, b_k)}{P(b_k)}
H(z \mid v) = -\sum_{k=1}^{K} \sum_{j=1}^{J} P(a_j, b_k) \log_2 P(a_j \mid b_k)
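A numerical sketch of this conditional entropy (the joint matrix P below is a made-up example; the result is in bits):

import numpy as np

# Hypothetical joint distribution P(a_j, b_k): rows index j, columns index k.
P = np.array([[0.35, 0.05],
              [0.10, 0.50]])

Pb = P.sum(axis=0)                        # marginal P(b_k)
cond = P / Pb                             # P(a_j | b_k), column-wise
mask = P > 0                              # skip zero-probability terms
H_z_given_v = -(P[mask] * np.log2(cond[mask])).sum()
print(H_z_given_v)                        # H(z|v) in bits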
For a binary channel with transition probabilities q_{kj} = P(b_k \mid a_j), the mutual information I(z, v) expands to

I(z, v) = q_{11} P(a_1) \log_2 \frac{q_{11}}{q_{11} P(a_1) + q_{12} P(a_2)}
        + q_{21} P(a_1) \log_2 \frac{q_{21}}{q_{21} P(a_1) + q_{22} P(a_2)}
        + q_{12} P(a_2) \log_2 \frac{q_{12}}{q_{11} P(a_1) + q_{12} P(a_2)}
        + q_{22} P(a_2) \log_2 \frac{q_{22}}{q_{21} P(a_1) + q_{22} P(a_2)}
For the BSC, let p_e be the probability of a channel error, p_{bs} = P(a_1) the probability of the first source symbol, and \bar{p}_e = 1 - p_e, \bar{p}_{bs} = 1 - p_{bs}. Substituting q_{11} = q_{22} = \bar{p}_e and q_{12} = q_{21} = p_e:

I(z, v) = \bar{p}_e p_{bs} \log_2 \frac{\bar{p}_e}{\bar{p}_e p_{bs} + p_e \bar{p}_{bs}}
        + p_e p_{bs} \log_2 \frac{p_e}{p_e p_{bs} + \bar{p}_e \bar{p}_{bs}}
        + p_e \bar{p}_{bs} \log_2 \frac{p_e}{\bar{p}_e p_{bs} + p_e \bar{p}_{bs}}
        + \bar{p}_e \bar{p}_{bs} \log_2 \frac{\bar{p}_e}{p_e p_{bs} + \bar{p}_e \bar{p}_{bs}}

Expanding the logarithms and regrouping the terms gives

I(z, v) = H_{bs}(p_e p_{bs} + \bar{p}_e \bar{p}_{bs}) - H_{bs}(p_e)

where H_{bs}(t) = -t \log_2 t - (1 - t) \log_2 (1 - t) is the binary entropy function.
Capacity of BSC
The capacity is the maximum of the mutual information over all possible source distributions. I(z, v) is maximum when p_{bs} = 1/2, which corresponds to z = [1/2, 1/2]^T:

C = H_{bs}(\tfrac{1}{2} p_e + \tfrac{1}{2} \bar{p}_e) - H_{bs}(p_e)
  = H_{bs}(\tfrac{1}{2} p_e + \tfrac{1}{2} (1 - p_e)) - H_{bs}(p_e)
  = H_{bs}(\tfrac{1}{2}) - H_{bs}(p_e)
  = -\tfrac{1}{2} \log_2 \tfrac{1}{2} - \tfrac{1}{2} \log_2 \tfrac{1}{2} - H_{bs}(p_e)
  = 1 - H_{bs}(p_e)
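A numerical check of this result (a sketch assuming NumPy; h2 is a hypothetical name for the binary entropy function H_bs): sweeping p_bs confirms the maximum of I(z, v) sits at p_bs = 1/2 with value 1 - H_bs(p_e).

import numpy as np

def h2(t):
    # Binary entropy H_bs(t) in bits, with H_bs(0) = H_bs(1) = 0.
    t = np.asarray(t, dtype=float)
    a = np.where(t > 0, t, 1.0)           # guard log2(0); those terms vanish
    b = np.where(t < 1, 1.0 - t, 1.0)
    return -(t * np.log2(a) + (1.0 - t) * np.log2(b))

p_e = 0.1
p_bs = np.linspace(0.0, 1.0, 1001)
i_zv = h2(p_e * p_bs + (1 - p_e) * (1 - p_bs)) - h2(p_e)

print(p_bs[np.argmax(i_zv)])              # 0.5: capacity-achieving source
print(i_zv.max())                         # ~0.531 bits/symbol
print(1 - h2(0.1))                        # matches 1 - H_bs(p_e)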
Fundamental Coding Theorems
For the n-th extension z' of a zero-memory source z, the entropy satisfies H(z') = nH(z).
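A quick numerical check of this identity (a minimal sketch; the source probabilities and n are arbitrary choices):

import itertools
import math

def entropy(p):
    # Shannon entropy in bits of a probability vector.
    return -sum(x * math.log2(x) for x in p if x > 0)

z = [0.7, 0.2, 0.1]                       # a zero-memory source
n = 3
# Block probabilities of the n-th extension: products over all n-tuples.
z_ext = [math.prod(t) for t in itertools.product(z, repeat=n)]

print(entropy(z_ext))                     # ~3.4703
print(n * entropy(z))                     # identical: H(z') = nH(z)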
The source coding theorem: the rate-distortion function

R(D) = \min_{Q \in Q_D} I(z, v)

defines the minimum rate at which information can be conveyed to the user, subject to the constraint that the average distortion is less than or equal to D.
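As a concrete instance (a standard closed form, not derived in these slides): for a zero-memory binary source with equally probable symbols under the Hamming distortion measure, R(D) = 1 - H_bs(D) for 0 <= D <= 1/2. A sketch evaluating it, reusing the same binary entropy helper as in the capacity check:

import numpy as np

def h2(t):
    # Binary entropy H_bs(t) in bits, with H_bs(0) = H_bs(1) = 0.
    t = np.asarray(t, dtype=float)
    a = np.where(t > 0, t, 1.0)
    b = np.where(t < 1, 1.0 - t, 1.0)
    return -(t * np.log2(a) + (1.0 - t) * np.log2(b))

D = np.linspace(0.0, 0.5, 6)
print(1.0 - h2(D))    # R(D): 1.0 bit/symbol at D = 0, down to 0.0 at D = 0.5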