===== Conditional Probability and Information Theory Exercises =====
**Related theory: [[tanszek:

  - We roll two identical dice twice. What is the probability
  - An urn contains 2 white and 2 black balls.
    - What is the probability of drawing two white balls consecutively without replacement?
    - How many bits of information does the statement $\text{"
    - What is the entropy of this set of information (i.e., the average amount of information per news item)? //(1.25 bits)//
    - What is the entropy of the set of information for a single draw? //(1 bit)//
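A quick numeric check of the quoted answers for the urn exercise. This is only a sketch: it assumes the "set of information" for the two draws is the number of white balls drawn (0, 1, or 2), which is the reading that reproduces the 1.25-bit figure.

```python
from fractions import Fraction
from math import log2

# Urn with 2 white and 2 black balls; two draws without replacement.
p_ww = Fraction(2, 4) * Fraction(1, 3)   # P(white, then white) = 1/6

# Assumed news set: number of white balls drawn among the two.
# P(0 white) = P(2 white) = 1/6, P(1 white) = 2/3.
probs = [Fraction(1, 6), Fraction(2, 3), Fraction(1, 6)]
H_two = -sum(float(p) * log2(float(p)) for p in probs)   # ≈ 1.25 bits

# Single draw: P(white) = P(black) = 1/2, so exactly 1 bit.
H_one = -2 * 0.5 * log2(0.5)

print(p_ww, round(H_two, 2), H_one)
```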
    - What is the information content if we select 5 out of 10 units and all are defect-free?
  - In a manufacturing process, the expected defect rate is 0.25. After producing 20 units, we randomly and simultaneously select 2 of them and inspect them:
    - How many bits of information does the news that $\text{"
    - How many elements does the news set have (i.e., how many possible outcomes are there)? What is the entropy of the set? //(n=3; H=1.23 bits)// \\ \\
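One way to reach the quoted $n=3$, $H=1.23$ bits answers, sketched numerically. The assumptions: exactly 5 of the 20 units are defective (the expected count at a 0.25 defect rate), and the news set consists of the three possible inspection outcomes (0, 1, or 2 defective units among the 2 selected).

```python
from math import comb, log2

# Assumption: exactly 5 of the 20 units are defective
# (the expected count at a 0.25 defect rate).
total = comb(20, 2)                     # ways to pick 2 units out of 20
probs = [comb(5, k) * comb(15, 2 - k) / total for k in range(3)]
# probs[k] = P(exactly k defective units among the 2 inspected)

H = -sum(p * log2(p) for p in probs)    # entropy of the 3-outcome news set
print(len(probs), round(H, 2))          # n = 3, H ≈ 1.23 bits
```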
  - In a communication system, we want to transmit 2-character symbolic words using the characters $A$, $B$, and $C$, with each word having equal probability:
    - How many such symbolic code words can be formed? //(9)//
    - What is the minimum number of bits needed for coding? //(4 bits)//
    - What is the average amount of information content for the words transmitted?
    - What is the redundancy of the code? //(0.21)// \\ \\
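The answers to the coding exercise above can be verified with a short computation; the redundancy here is taken as the relative redundancy of a fixed-length binary code, $1 - H/L$, which reproduces the quoted 0.21.

```python
from math import ceil, log2

n_words = 3 ** 2                     # 2-character words over {A, B, C} -> 9
bits_needed = ceil(log2(n_words))    # fixed-length binary code -> 4 bits
H = log2(n_words)                    # average information per equiprobable word
redundancy = 1 - H / bits_needed     # relative redundancy of the code
print(n_words, bits_needed, round(H, 2), round(redundancy, 2))
```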
tanszek/oktatas/techcomm/conditional_probability_and_information_theory.1762344392.txt.gz · Last modified: 2025/11/05 12:06 by kissa
