Conditional Probability and Information Theory Exercises
- What is the probability of rolling an odd number with a fair die? How many bits of information does the statement $\text{"we roll an odd number with a fair die"}$ contain? (p=0.5; 1 bit)
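A quick Python check of both numbers for the exercise above (a sketch we added; the variable names are ours):

```python
from math import log2

# A fair die: 3 odd faces (1, 3, 5) out of 6.
p_odd = 3 / 6
# Self-information of an event with probability p is I = -log2(p).
info = -log2(p_odd)
print(p_odd, info)  # 0.5 1.0 -> p = 0.5, 1 bit
```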
- We roll two physically identical dice. Let event $A$ be that at least one die shows a 2, and event $B$ that at least one die shows a 3. Given that we rolled a 3 (i.e., at least one die shows a 3), what is the probability that we rolled both a 2 and a 3? In other words, what is $P(A \mid B)$? (p = 2/11 ≈ 0.18)
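The conditional probability above can be verified by enumerating all 36 ordered rolls (a minimal Python sketch; the enumeration approach is our addition):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely ordered rolls
B = [o for o in outcomes if 3 in o]              # at least one die shows a 3 (11 rolls)
A_and_B = [o for o in B if 2 in o]               # a 2 and a 3 together: (2,3) and (3,2)
print(Fraction(len(A_and_B), len(B)))            # P(A|B) = 2/11 ≈ 0.18
```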
- An urn contains 2 white and 2 black balls. (A numeric check of all four sub-questions follows this exercise.)
- What is the probability of drawing two white balls consecutively without replacement? (p = 1/6 ≈ 0.1667)
- How many bits of information does the statement $\text{"we draw 2 white balls consecutively without replacement from an urn containing 2 white and 2 black balls"}$ contain? (2.58 bits)
- What is the entropy of this news set for the two draws, i.e., the average amount of information per outcome (two whites, one of each, two blacks)? (1.25 bits)
- What is the entropy of the set of information for a single draw? (1 bit)
- What would be the entropy of the set of information for two draws (the balls drawn together, so the order does not matter) if the four balls were all different colors (e.g., one white, one black, one red, and one green)? (2.58 bits)
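All four urn results can be reproduced in a few lines (a Python sketch under the same assumptions as the exercise: draws without replacement, and unordered pairs in the last sub-question):

```python
from math import log2

# Two consecutive draws without replacement from {W, W, B, B}.
p_ww = (2 / 4) * (1 / 3)                    # P(white, white) = 1/6 ≈ 0.1667
print(-log2(p_ww))                          # ≈ 2.58 bits of information

# News set for two draws: {WW, one of each, BB} with probabilities 1/6, 4/6, 1/6.
probs = [1/6, 4/6, 1/6]
print(-sum(p * log2(p) for p in probs))     # entropy ≈ 1.25 bits

# A single draw: white or black, each with probability 1/2.
print(-2 * 0.5 * log2(0.5))                 # 1 bit

# Four distinct colours, two balls drawn together: C(4,2) = 6 equally likely pairs.
print(log2(6))                              # ≈ 2.58 bits
```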
- An automated machine has a defect rate of 10% (on average it produces 10 defective units out of 100, i.e., 1 defective unit in a batch of 10; a numeric check follows this exercise).
- What is the information content of the news that, out of 10 items, 2 are selected and they are defect-free? (0.32 bits)
- What is the information content if we select 5 out of 10 units and all are defect-free? (1 bit)
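Both answers follow from drawing without replacement from a batch of 10 containing exactly 1 defective unit (a Python sketch; the helper function `info_defect_free` is ours):

```python
from math import comb, log2

def info_defect_free(selected, total=10, defective=1):
    # P(all selected units are good) for a draw without replacement,
    # and the information content -log2(p) of that news.
    p = comb(total - defective, selected) / comb(total, selected)
    return p, -log2(p)

print(info_defect_free(2))  # (0.8, ≈0.32) -> 0.32 bits
print(info_defect_free(5))  # (0.5, 1.0)   -> 1 bit
```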
- In a manufacturing process, the expected defect rate is 0.25. After producing 20 units (so 5 of them are expected to be defective), we randomly and simultaneously select 2 of them and inspect them (a numeric check follows this exercise):
- How many bits of information does the news that $\text{"we selected 2 defective parts"}$ contain? (4.25 bits)
- How many elements does the news set have (what is the number of possible outcomes)? What is the entropy of the set? (n=3; H=1.22 bits)
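The selection is hypergeometric: 5 defective and 15 good units among the 20 (a Python sketch of both answers):

```python
from math import comb, log2

total, defective, inspected = 20, 5, 2      # defect rate 0.25 -> 5 defective out of 20
good = total - defective

# P(k defective among the 2 inspected units), k = 0, 1, 2.
probs = [comb(defective, k) * comb(good, inspected - k) / comb(total, inspected)
         for k in range(inspected + 1)]

print(-log2(probs[2]))                      # -log2(1/19) ≈ 4.25 bits
print(-sum(p * log2(p) for p in probs))     # entropy ≈ 1.22 bits over n = 3 outcomes
```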
- In a communication system, we want to transmit 2-character symbolic words built from the characters $A$, $B$, and $C$, each word being equally probable, using a fixed-length binary code (a numeric check follows this exercise).
- How many such symbolic code words can be formed? (9)
- What is the minimum number of bits needed for coding? (4 bits)
- What is the average information content of the transmitted words? (3.17 bits)
- What is the redundancy of the code? (R=20.75%)
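All four answers come from simple $\log_2$ arithmetic (a minimal Python sketch):

```python
from math import ceil, log2

words = 3 ** 2                          # 2-character words over {A, B, C}: 9 words
code_len = ceil(log2(words))            # fixed-length binary code: 4 bits per word
avg_info = log2(words)                  # equally probable words: log2(9) ≈ 3.17 bits
redundancy = (code_len - avg_info) / code_len
print(words, code_len, avg_info, redundancy)  # redundancy ≈ 0.2075 = 20.75 %
```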