
Please help me out with this practice question. Please show your work and explain it as best you can; I'll give a thumbs up. Thank you in advance.

Suppose an information source X represents a weighted coin toss with probability
of heads equal to 0.9, and probability of tails equal to 0.1.
(a) Construct a Huffman code for this source, encoding one message at a time. (This is a
very simple code.) How does the average code word length compare to the entropy of
the source?
(b) Construct a Huffman code for this source, encoding two messages at a time. (Hint: you
need four codewords.) How does the average code word length (per message X) compare
to the entropy of the source X?
(c) Construct a Huffman code for this source, encoding three messages at a time. (Hint:
you need eight codewords.) How does the average code word length (per message X)
compare to the entropy of the source X?
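A minimal Python sketch of one way to check all three parts, assuming independent tosses and a standard binary Huffman construction (the helper `huffman_lengths` and all names are illustrative, not from the original post). It computes the entropy H(X) = -0.9 log2(0.9) - 0.1 log2(0.1) ≈ 0.469 bits and the average Huffman codeword length per message for block sizes n = 1, 2, 3:

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Return the binary Huffman codeword length for each probability in probs."""
    # Each heap entry: (subtree probability, unique tiebreaker, leaf indices in the subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = itertools.count(len(probs))  # keeps heap comparisons away from the lists
    while len(heap) > 1:
        # Merge the two least-probable subtrees; every leaf under the
        # merged node gains one bit of codeword length.
        p1, _, leaves1 = heapq.heappop(heap)
        p2, _, leaves2 = heapq.heappop(heap)
        for leaf in leaves1 + leaves2:
            lengths[leaf] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), leaves1 + leaves2))
    return lengths

p = [0.9, 0.1]  # P(heads), P(tails)
entropy = -sum(q * math.log2(q) for q in p)
print(f"H(X) = {entropy:.4f} bits per message")

for n in (1, 2, 3):  # parts (a), (b), (c)
    # Probabilities of all 2^n blocks of n independent tosses.
    block_probs = [math.prod(combo) for combo in itertools.product(p, repeat=n)]
    lengths = huffman_lengths(block_probs)
    avg_per_msg = sum(q * l for q, l in zip(block_probs, lengths)) / n
    print(f"n = {n}: average codeword length = {avg_per_msg:.4f} bits per message")
```

Running this gives 1.000 bits/message for n = 1, 0.645 for n = 2, and about 0.533 for n = 3, versus H(X) ≈ 0.469 bits: the per-message average always stays above the entropy but approaches it as the block size grows, which is the point of parts (a)-(c).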