Huffman Coding

Example 1
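
As a companion to this example, here is a minimal sketch of the Huffman construction in Python. The five-symbol distribution is an assumed illustration, not the example's own data (which appeared in the slide's figure).

import heapq
from itertools import count

def huffman_code(probs):
    """Return {symbol: codeword} for a mapping of symbol -> probability."""
    tiebreak = count()  # breaks ties so the heap never compares symbol lists
    heap = [(p, next(tiebreak), [sym]) for sym, p in probs.items()]
    heapq.heapify(heap)
    codes = {sym: "" for sym in probs}
    while len(heap) > 1:
        # Huffman rule: merge the two least probable subtrees, pushing
        # each one level deeper by prepending a bit to its codewords.
        p0, _, syms0 = heapq.heappop(heap)
        p1, _, syms1 = heapq.heappop(heap)
        for s in syms0:
            codes[s] = "0" + codes[s]
        for s in syms1:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p0 + p1, next(tiebreak), syms0 + syms1))
    return codes

probs = {"a": 0.25, "b": 0.25, "c": 0.2, "d": 0.15, "e": 0.15}  # assumed
codes = huffman_code(probs)
print(codes)
print(sum(p * len(codes[s]) for s, p in probs.items()))  # expected length

Ties between equal weights can be broken arbitrarily; different tie-breaking yields different optimal codes with the same expected length.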

Example 1 (Alternative method)

Some Comments on Huffman Codes

Huffman coding for weighted codewords:
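
Huffman's merge rule never uses the fact that the weights sum to 1, so the same procedure minimizes ∑ wᵢlᵢ for arbitrary positive weights. A sketch reusing huffman_code from above (the weights are an assumed illustration):

weights = {"x": 5, "y": 3, "z": 2, "w": 1}  # assumed; need not sum to 1
print(huffman_code(weights))  # minimizes sum of w_i * l_i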

Huffman codes and Shannon codes:
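
One standard comparison: the Shannon code assigns lengths ⌈log 1/pᵢ⌉, which achieve L < H(X) + 1 but can be beaten symbol by symbol. A sketch (reusing huffman_code from above; the two-symbol source is an assumed illustration):

import math

probs = {"a": 1/3, "b": 2/3}
shannon = {s: math.ceil(math.log2(1 / p)) for s, p in probs.items()}
huffman = {s: len(c) for s, c in huffman_code(probs).items()}
print(shannon)  # {'a': 2, 'b': 1}: Shannon spends two bits on 'a'
print(huffman)  # {'a': 1, 'b': 1}: Huffman needs only one

The reverse can also happen: for some sources an individual Huffman codeword is longer than the corresponding Shannon codeword, even though Huffman's expected length is never worse.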

Optimality of Huffman Codes

[Figure: (a) a possible instantaneous code; (b) by trimming branches without siblings, we improve the code.]

We now rearrange the tree as shown in (c), so that the word lengths are ordered by increasing length from top to bottom. Finally, we swap probability assignments to improve the expected depth of the tree, as shown in (d).
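
Why the swap from (c) to (d) helps, in one line (a sketch in our own notation, not the slide's): exchanging the codewords of symbols i and j, with lengths lᵢ and lⱼ, changes the expected length by

\[
L_{\text{swapped}} - L = p_i l_j + p_j l_i - p_i l_i - p_j l_j = (p_i - p_j)(l_j - l_i),
\]

so if the more probable symbol currently holds the longer codeword (pᵢ > pⱼ and lᵢ > lⱼ), the swap strictly decreases the expected length.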

3. The two longest codewords differ only in the last bit and correspond to the two least likely symbols.
• If there is a maximal-length codeword without a sibling, we can delete the last bit of the codeword and still satisfy the prefix property. This reduces the average codeword length and contradicts the optimality of the code.
• Hence, every maximal-length codeword in any optimal code has a sibling.
• Now we can exchange the longest codewords so that the two lowest-probability source symbols are associated with two siblings on the tree. This does not change the expected length. Thus, the codewords for the two lowest-probability source symbols have maximal length and agree in all but the last bit.
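
The first bullet in symbols (a one-line sketch; m denotes the symbol holding the sibling-less maximal codeword, so pₘ > 0):

\[
L' = \sum_{i \neq m} p_i l_i + p_m (l_m - 1) = L - p_m < L,
\]

so the trimmed code is strictly better, contradicting optimality.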

Let C∗m−1(p′) be an optimal code for p′, the reduced distribution on m − 1 symbols obtained by merging the two least likely symbols of p, and let C∗m(p) be the canonical optimal code for p.
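
The identity driving the induction step (a sketch; pₘ₋₁ and pₘ are the two smallest probabilities, merged into one symbol of p′ whose codeword has length l′; each of the two children extends that codeword by one bit):

\[
L(\mathbf{p}) = \sum_{i=1}^{m-2} p_i l_i + (p_{m-1} + p_m)(l' + 1) = L(\mathbf{p}') + p_{m-1} + p_m .
\]

Since the added term pₘ₋₁ + pₘ does not depend on the code, minimizing L(p) reduces to minimizing L(p′).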

Shannon-Fano-Elias Coding

Since we use l(x) = ⌈log 1/p(x)⌉ + 1 bits to represent x, the expected length of this code is L = ∑x p(x) l(x) < H(X) + 2.
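
A sketch of the encoder in Python: symbol x gets the first l(x) bits of the binary expansion of the modified cumulative distribution F̄(x) = F(x⁻) + p(x)/2 (function and variable names are ours):

import math

def sfe_code(probs):
    """Shannon-Fano-Elias code for a list of (symbol, probability) pairs."""
    codes = {}
    cum = 0.0  # F(x-): total probability of the preceding symbols
    for sym, p in probs:
        fbar = cum + p / 2                        # midpoint of x's CDF step
        length = math.ceil(math.log2(1 / p)) + 1  # l(x) = ceil(log 1/p(x)) + 1
        bits, frac = "", fbar
        for _ in range(length):                   # first l(x) bits of fbar
            frac *= 2
            bits += str(int(frac))
            frac -= int(frac)
        codes[sym] = bits
        cum += p
    return codes

Truncating F̄(x) to l(x) bits keeps the codeword inside the CDF step belonging to x, which is what makes the code prefix-free.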
In this case, the average codeword length is 2.75 bits and the
entropy is 1.75 bits. The Huffman code for this case achieves the
entropy bound. Looking at the codewords, it is obvious that there
is some inefficiency—for example, the last bit of the last two
codewords can be omitted. But if we remove the last bit from all
the codewords, the code is no longer prefix-free.
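
Running sfe_code on the dyadic source consistent with these numbers (assumed here to be p = (0.25, 0.5, 0.125, 0.125), the standard textbook instance) reproduces the 2.75-bit average:

probs = [("1", 0.25), ("2", 0.5), ("3", 0.125), ("4", 0.125)]
codes = sfe_code(probs)
print(codes)  # {'1': '001', '2': '10', '3': '1101', '4': '1111'}
print(sum(p * len(codes[s]) for s, p in probs))  # 2.75 vs H(X) = 1.75

The last two codewords, 1101 and 1111, are the ones whose final bit could be dropped without breaking the prefix property.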