SWE 423: Multimedia Systems - PowerPoint PPT Presentation
Provided by: wasfiga

Transcript and Presenter's Notes
1
SWE 423: Multimedia Systems
  • Chapter 7: Data Compression (3)

2
Outline
  • Entropy Encoding
  • Arithmetic Coding

3
Entropy Encoding: Arithmetic Coding
  • Initial idea introduced in 1948 by Shannon
  • Many researchers worked on this idea
  • Modern arithmetic coding can be attributed to
    Pasco (1976) and to Rissanen and Langdon (1979)
  • Arithmetic coding treats the whole message as one
    unit
  • In practice, the input data is usually broken up
    into chunks to avoid error propagation

4
Entropy Encoding: Arithmetic Coding
  • A message is represented by a half-open interval
    [a, b), where 0 ≤ a < b ≤ 1.
  • General idea of encoding
  • Map the message into a half-open interval [a, b)
  • Find a binary fractional number of minimum
    length that lies in this interval. This
    will be the encoded message
  • Initially, [a, b) = [0, 1)
  • As the message becomes longer, the length of
    the interval shrinks and the number of bits needed
    to represent the interval increases

5
Entropy Encoding: Arithmetic Coding
  • Coding Algorithm
  • Algorithm ArithmeticCoding
  • // Input: symbol = input stream of the message;
  •           terminator = terminator symbol
  • // Low and High: range boundaries of all symbols
  • // Output: binary fractional code of the message
  • low = 0; high = 1; range = 1
  • while (symbol != terminator)
  •     get(symbol)
  •     high = low + range * High(symbol)
  •     low = low + range * Low(symbol)
  •     range = high - low
  • return CodeWord(low, high)
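The interval-narrowing loop above can be sketched in Python. The function name `encode_interval` and the dictionary of per-symbol ranges are illustrative choices, not part of the slides; the message is assumed to already end with the terminator symbol.

```python
def encode_interval(message, ranges):
    """Narrow [low, high) symbol by symbol, as in the pseudocode above.

    `ranges` maps each symbol to its (Low, High) pair in [0, 1).
    """
    low, high, rng = 0.0, 1.0, 1.0
    for symbol in message:
        high = low + rng * ranges[symbol][1]   # high = low + range * High(symbol)
        low = low + rng * ranges[symbol][0]    # low  = low + range * Low(symbol)
        rng = high - low                       # range = high - low
    return low, high
```

Note that `high` must be updated before `low`, exactly as in the pseudocode, since both updates use the old value of `low`.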

6
Entropy Encoding: Arithmetic Coding
  • Binary code generation
  • Algorithm CodeWord
  • // Input: low and high
  • // Output: binary fractional code
  • code = 0
  • k = 1
  • while (value(code) < low)
  •     assign 1 to the kth binary fraction bit
  •     if (value(code) ≥ high)
  •         replace the kth bit by 0
  •     k = k + 1
  • return code
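A minimal Python sketch of this bit-by-bit search (the name `code_word` is illustrative). Keeping a tentative 1 bit only when the resulting value stays below `high` is equivalent to the pseudocode's "replace the kth bit by 0" step:

```python
def code_word(low, high):
    """Shortest binary fraction whose value lies in [low, high)."""
    bits = []
    code = 0.0
    k = 1
    while code < low:
        bit_value = 2.0 ** -k          # value of the k-th binary fraction bit
        if code + bit_value < high:    # keep the 1 bit unless it overshoots high
            code += bit_value
            bits.append('1')
        else:
            bits.append('0')           # the pseudocode's "replace the kth bit by 0"
        k += 1
    return ''.join(bits), code
```

The returned bit string is the fractional part of the codeword; the loop stops as soon as the value reaches the interval.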

7
Entropy Encoding: Arithmetic Coding
  • Example: Assume S = {A, B, C, D, E, F, $}, where $ is
    the terminator symbol. In addition, assume the
    following probabilities for each character:
  • Pr(A) = 0.2
  • Pr(B) = 0.1
  • Pr(C) = 0.2
  • Pr(D) = 0.05
  • Pr(E) = 0.3
  • Pr(F) = 0.05
  • Pr($) = 0.1
  • Generate the fractional binary code of the
    message CAEE$
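As a sketch of the narrowing process for this example (assuming the cumulative ranges are laid out in the order A, B, C, D, E, F, $):

```python
# Half-open ranges induced by the probabilities above, in order A..F, $.
ranges = {'A': (0.00, 0.20), 'B': (0.20, 0.30), 'C': (0.30, 0.50),
          'D': (0.50, 0.55), 'E': (0.55, 0.85), 'F': (0.85, 0.90),
          '$': (0.90, 1.00)}

low, high = 0.0, 1.0
for symbol in "CAEE$":
    rng = high - low
    low, high = low + rng * ranges[symbol][0], low + rng * ranges[symbol][1]
    print(f"after {symbol}: [{low:.6f}, {high:.6f})")
```

The loop ends with the interval [0.33184, 0.3322); any binary fraction inside it is a valid code for the message.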

8
Entropy Encoding: Arithmetic Coding
  • It can be proven that ⌈log2(1/∏i Pi)⌉ is the
    upper bound on the number of bits needed to
    encode a message
  • In our case, the upper bound is equal to 12.
  • As the length of the message increases, the
    range decreases and the upper bound value
    increases
  • Generally, arithmetic coding outperforms Huffman
    coding: it treats the whole message as one unit,
    whereas Huffman coding uses an integral number of
    bits to code each character
  • Redo the previous example (CAEE$) using Huffman
    coding and notice how many bits are required to
    code this message.
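A quick check of the bound for the example message, taking the symbol probabilities from the previous slide:

```python
import math

# Pr of each symbol in CAEE$: C = 0.2, A = 0.2, E = 0.3, E = 0.3, $ = 0.1
probs = [0.2, 0.2, 0.3, 0.3, 0.1]
upper_bound = math.ceil(math.log2(1 / math.prod(probs)))
print(upper_bound)  # 12, matching the slide
```

The product of the probabilities is 0.00036, so log2(1/0.00036) ≈ 11.44, which rounds up to 12 bits.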

9
Entropy Encoding: Arithmetic Coding
  • Decoding Algorithm
  • Algorithm ArithmeticDecoding
  • // Input: code = binary code
  • // Low and High: range boundaries of all symbols
  • // Output: the decoded message
  • value = convert2decimal(code)
  • do
  •     find the symbol s such that
  •     Low(s) ≤ value < High(s)
  •     output s
  •     low = Low(s); high = High(s); range = high - low
  •     value = (value - low) / range
  • while s is not the terminator symbol
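A Python sketch of the decoding loop, again assuming '$' as the terminator and per-symbol ranges as a dictionary (`decode` is an illustrative name, not from the slides):

```python
def decode(value, ranges, terminator='$'):
    """Invert the encoding: peel off one symbol per iteration."""
    message = []
    while True:
        # find the symbol s with Low(s) <= value < High(s)
        for s, (s_low, s_high) in ranges.items():
            if s_low <= value < s_high:
                break
        message.append(s)
        if s == terminator:
            break
        value = (value - s_low) / (s_high - s_low)   # rescale back into [0, 1)
    return ''.join(message)
```

Feeding in the value 0.01010101 in binary (0.33203125) with the example ranges recovers CAEE$.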

10
Entropy Encoding: Arithmetic Coding
  • Example