Question

Please help me out with this practice question. Please show your work and explain it as best you can; I'll give a thumbs up. Thank you in advance.

Suppose an information source X represents a weighted coin toss with probability
of heads equal to 0.9, and probability of tails equal to 0.1.
(a) Construct a Huffman code for this source, encoding one message at a time. (This is a
very simple code.) How does the average code word length compare to the entropy of
the source?
(b) Construct a Huffman code for this source, encoding two messages at a time. (Hint: you
need four codewords.) How does the average code word length (per message X) compare
to the entropy of the source X?
(c) Construct a Huffman code for this source, encoding three messages at a time. (Hint:
you need eight codewords.) How does the average code word length (per message X)
compare to the entropy of the source X?
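
Since all three parts ask for a concrete code and a comparison with entropy, here is a minimal Python sketch of one way to work through them: it computes the source entropy and builds a Huffman code over blocks of n = 1, 2, and 3 tosses. This is my own illustration, not an official solution, and the helper name huffman_lengths is made up for this example. For reference, the entropy of one toss is H(X) = -0.9·log2(0.9) - 0.1·log2(0.1) ≈ 0.469 bits.

```python
# Sketch: Huffman codes for blocks of n weighted coin tosses.
import heapq
import itertools
import math

p = {"H": 0.9, "T": 0.1}

# Entropy of a single toss: H(X) = -sum(q * log2(q)) ~= 0.469 bits.
entropy = -sum(q * math.log2(q) for q in p.values())

def huffman_lengths(probs):
    """Return {symbol: codeword length} for a Huffman code on `probs`."""
    # Heap entries: (probability, unique tiebreaker, symbols under this node).
    heap = [(q, i, {s}) for i, (s, q) in enumerate(probs.items())]
    heapq.heapify(heap)
    depth = dict.fromkeys(probs, 0)
    tiebreak = itertools.count(len(heap))
    while len(heap) > 1:
        q1, _, a = heapq.heappop(heap)   # two least-probable nodes
        q2, _, b = heapq.heappop(heap)
        for s in a | b:                  # merging adds one bit of depth
            depth[s] += 1
        heapq.heappush(heap, (q1 + q2, next(tiebreak), a | b))
    return depth

print(f"H(X) = {entropy:.4f} bits/toss")
for n in (1, 2, 3):
    # All length-n blocks of tosses, with independent (product) probabilities.
    blocks = {"".join(t): math.prod(p[s] for s in t)
              for t in itertools.product("HT", repeat=n)}
    length = huffman_lengths(blocks)
    avg = sum(blocks[b] * length[b] for b in blocks) / n
    print(f"n={n}: avg codeword length = {avg:.4f} bits/toss")
```

If I've set this up correctly, it should print roughly 1.000, 0.645, and 0.533 bits per toss for n = 1, 2, and 3. Every average exceeds H(X) ≈ 0.469 bits, but encoding longer blocks pushes the average toward the entropy, as the source coding theorem suggests. (In part (a) the code is trivially H → 0, T → 1, so the average is 1 bit regardless of the probabilities.)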