===== Conditional Probability and Information Theory Exercises =====
- What is the probability of rolling an odd number with a fair die? How many bits of information does the statement "we roll an odd number with a fair die" contain? //(p=0.5; 1 bit)// \\ \\
- We roll two physically identical dice. Let's call event $A$ when we roll a 2 on either or both dice, and event $B$ when we roll a 3 on either or both dice. Given that we rolled a 3, what is the probability that we rolled both a 2 and a 3 at the same time? (Rolling a 3 means that at least one die shows a 3.) \\ \\
- An urn contains 2 white and 2 black balls.
  - What is the probability of drawing two white balls consecutively without replacement? //(p=1/6)//
  - How many bits of information does the statement "we draw two white balls consecutively without replacement from an urn containing 2 white and 2 black balls" contain? //(2.58 bits)//
  - What is the entropy of this set of information (i.e., the average amount of information per news item)? //(1.25 bits)//
  - What is the entropy of the set of information for a single draw? //(1 bit)//
  - What would be the entropy of the set of information for two draws if the four balls were all different colors (one white, one black, one red, and one green)? //(2.58 bits)// \\ \\
- An automated machine has a defect rate of 10% (it produces 10 defective units out of 100 on average).
  - What is the information content of the news that, out of 10 items, 2 are selected and they are defect-free?
  - What is the information content if we select 5 out of 10 units and all are defect-free? \\ \\
- In a manufacturing process, the expected defect rate is 0.25. After producing 20 units, we randomly and simultaneously select 2 of them and inspect them:
  - How many bits of information does the news that "we selected 2 defective parts" contain? //(4.24 bits)//
  - How many elements does the complete event system of this inspection have? //(3)// \\ \\
- In a communication system, we want to transmit 2-character symbolic words using the characters A, B, and C, with each word having equal probability:
  - How many such symbolic code words can be formed? //(9)//
  - What is the minimum number of bits needed for coding? //(4 bits)//
  - What is the average amount of information content for the words transmitted? //(3.17 bits)//
  - What is the redundancy of the code? //(R=20.75%)// \\ \\
- Numbers are encoded in byte-length (8-bit) words.
  - What is the largest integer that can be represented? //(255)//
  - What is the largest BCD (Binary-Coded Decimal) number that can be represented? //(99)//
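The information-content answers above all follow from the Hartley formula $I = -\log_2 p$. A minimal Python sketch (the helper name ''information_bits'' is my own) checking the die, urn, and manufacturing exercises:

```python
import math

def information_bits(p: float) -> float:
    """Information content (in bits) of an event with probability p."""
    return -math.log2(p)

# Fair-die exercise: P(odd) = 3/6
print(information_bits(3 / 6))                    # 1.0

# Urn exercise: P(two whites without replacement) = (2/4) * (1/3) = 1/6
print(round(information_bits((2/4) * (1/3)), 2))  # 2.58

# Manufacturing exercise: 5 defective among 20 units (rate 0.25),
# both randomly selected parts are defective
p_dd = math.comb(5, 2) / math.comb(20, 2)         # = 1/19
print(round(information_bits(p_dd), 2))           # 4.25 (the sheet rounds to 4.24)
```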
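The entropy answers for the urn exercise come from the Shannon entropy $H = -\sum_i p_i \log_2 p_i$. A sketch, assuming the three-outcome colour model (two white / two black / mixed) that produces the 1.25-bit answer:

```python
import math

def entropy(probs) -> float:
    """Shannon entropy (in bits) of a complete probability distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9  # probabilities must sum to 1
    return -sum(p * math.log2(p) for p in probs)

# Two draws from {W, W, B, B}, grouped by colour outcome:
# P(two white) = 1/6, P(two black) = 1/6, P(mixed) = 2/3
print(round(entropy([1/6, 1/6, 2/3]), 2))  # 1.25

# A single draw: white or black, each with probability 1/2
print(entropy([1/2, 1/2]))                 # 1.0

# Four distinct colours: C(4,2) = 6 equally likely unordered pairs
print(round(entropy([1/6] * 6), 2))        # 2.58
```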
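The code-word exercise can be checked the same way; here redundancy is taken as $R = 1 - H/n$, where $n$ is the fixed code length in bits, which reproduces the 20.75% answer:

```python
import math

# 2-character words over the alphabet {A, B, C}, all equally likely
n_words = 3 ** 2                           # 9 possible code words
min_bits = math.ceil(math.log2(n_words))   # fixed-length binary code: 4 bits/word
avg_info = math.log2(n_words)              # average information per word
redundancy = 1 - avg_info / min_bits       # relative redundancy of the code

print(n_words)              # 9
print(min_bits)             # 4
print(round(avg_info, 2))   # 3.17
print(f"{redundancy:.2%}")  # 20.75%
```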
Last modified: 2024/10/15 18:33 by kissa