===== Conditional Probability and Information Theory Exercises =====
  - What is the probability of rolling an odd number with a fair die? How many bits of information does the statement $\text{"An odd number was rolled"}$ carry?
  - We roll two physically identical dice. Let's call $A$ the event that we roll a 2 on either or both dice, and $B$ the event that we roll a 3 on either or both dice. Given that we rolled a 3, what is the probability that the roll shows both a 2 and a 3 at the same time? (Rolling a 3 means that at least one die shows a 3.)
  - An urn contains 2 white and 2 black balls.
    - What is the probability of drawing two white balls consecutively without replacement?
    - How many bits of information does the statement $\text{"Two white balls were drawn"}$ carry?
    - What is the entropy of this set of information (i.e., the average amount of information per news item)? //(1.25 bits)//
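The dice and urn exercises above can be checked by enumeration. This is a sketch, not the official solution: it assumes the dice question asks for the conditional probability $P(A \cap B \mid B)$, and that the urn's entropy is taken over the three unordered outcomes (two white, two black, one of each) — the reading that matches the stated 1.25-bit answer.

```python
from fractions import Fraction
from itertools import product
from math import log2

# Dice exercise, read as P(a 2 and a 3 together | at least one 3):
rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
b = [r for r in rolls if 3 in r]               # event B: at least one die shows a 3
a_and_b = [r for r in b if 2 in r]             # the roll shows both a 2 and a 3
p_cond = Fraction(len(a_and_b), len(b))
print(p_cond)                                  # 2/11

# Urn exercise: 2 white + 2 black, two draws without replacement.
# Unordered outcomes and their probabilities (an assumed model):
p = {"two white": Fraction(1, 6), "two black": Fraction(1, 6), "mixed": Fraction(2, 3)}
info_two_white = -log2(1 / 6)                  # information of "two white" ≈ 2.585 bits
entropy = -sum(float(q) * log2(float(q)) for q in p.values())
print(round(entropy, 2))                       # 1.25
```

Note that $P(\text{two white}) = \tfrac{2}{4}\cdot\tfrac{1}{3} = \tfrac{1}{6}$, and the entropy of the distribution $\{\tfrac{1}{6}, \tfrac{1}{6}, \tfrac{2}{3}\}$ reproduces the 1.25 bits quoted above.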
  - What is the minimum number of bits needed for coding? //(4 bits)//
  - What is the average amount of information content for the words transmitted?
  - What is the redundancy of the code? //(0.21)//
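The redundancy of a fixed-length code is conventionally $R = 1 - H/L$, where $H$ is the average information per word and $L$ the codeword length in bits. A minimal sketch, assuming that definition; the 3.16 bits/word figure is merely a value consistent with the stated answers (4-bit codewords, redundancy 0.21), not a number given in the exercise:

```python
def redundancy(avg_info_bits: float, code_length_bits: float) -> float:
    """Redundancy of a fixed-length code: 1 - H/L."""
    return 1 - avg_info_bits / code_length_bits

# An average information content of 3.16 bits/word with 4-bit codewords
# (an illustrative value, not taken from the exercise) yields:
print(round(redundancy(3.16, 4), 2))  # 0.21
```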