  
  - What is the probability of rolling an odd number with a fair die? How many bits of information does the statement $\text{"we roll an odd number with a fair die"}$ contain? //(p=0.5; I=1 bit)// \\ \\
  - We roll two identical dice twice. What is the probability that we first roll at least one 3, and then immediately roll a 2 and a 3 at the same time (one die shows 2 and the other 3)? //(p=0.02)// \\ \\
  - An urn contains 2 white and 2 black balls.
    - What is the probability of drawing two white balls consecutively without replacement? //(0.17)//
    - How many bits of information does the statement $\text{"we draw 2 white balls consecutively without replacement from an urn containing 2 white and 2 black balls"}$ contain? //(2.59 bits)//
    - What is the entropy of this set of information (i.e., the average amount of information per news item)? //(1.25 bits)//
    - What is the entropy of the set of information for a single draw? //(1 bit)//
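The answers above can be checked with a short Python sketch (variable names are ours, and results are rounded as in the exercise text):

```python
from math import log2

# 1) Odd number with a fair die: p = 3/6, I = -log2(p)
p_odd = 3 / 6
i_odd = -log2(p_odd)                      # 1 bit

# 2) Two dice, two throws: first "at least one 3", then "a 2 and a 3"
p_at_least_one_3 = 1 - (5 / 6) ** 2       # 11/36
p_2_and_3 = 2 / 36                        # ordered pairs (2,3) and (3,2)
p_seq = p_at_least_one_3 * p_2_and_3      # 11/648, rounds to 0.02

# 3) Urn with 2 white and 2 black balls, two draws without replacement
p_ww = (2 / 4) * (1 / 3)                  # 1/6, rounds to 0.17
i_ww = -log2(p_ww)                        # log2(6) = 2.585 bits

# Entropy of the two-draw news set {WW, BB, mixed}
h_two_draws = -sum(p * log2(p) for p in (1 / 6, 1 / 6, 2 / 3))  # ~1.25 bits

# Entropy of a single draw (white or black, each with p = 1/2)
h_one_draw = -2 * (0.5 * log2(0.5))       # 1 bit
```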
    - What is the information content if we select 5 out of 10 units and all are defect-free? //(1 bit)// \\ \\
  - In a manufacturing process, the expected defect rate is 0.25. After producing 20 units, we randomly and simultaneously select 2 of them and inspect them:
    - How many bits of information does the news that $\text{"we selected 2 defective parts"}$ carry? //(4.25 bits)//
    - How many elements does the news set have (i.e., how many possible outcomes are there)? What is the entropy of the set? //(n=3; H=1.23 bits)//
  - In a communication system, we want to transmit 2-character symbolic words using the characters $A$, $B$, and $C$, with each word having equal probability, using a binary fixed-length code.
    - How many such symbolic code words can be formed? //(9)//
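Assuming exactly 5 of the 20 produced units are defective (matching the expected defect rate of 0.25), the inspection and code-word answers can be verified with a short Python sketch; the variable names are illustrative:

```python
from math import comb, log2

good, bad = 15, 5                  # 20 units with an expected defect rate of 0.25
total = comb(20, 2)                # ways to select 2 units out of 20 at once

# Information content of the news "we selected 2 defective parts"
p_two_bad = comb(bad, 2) / total   # 10/190 = 1/19
i_two_bad = -log2(p_two_bad)       # log2(19) ~ 4.25 bits

# News set: 0, 1, or 2 defective parts among the 2 selected (n = 3 outcomes)
probs = [comb(bad, k) * comb(good, 2 - k) / total for k in (0, 1, 2)]
entropy = -sum(p * log2(p) for p in probs)   # ~1.23 bits

# 2-character symbolic words over {A, B, C}
n_words = 3 ** 2                   # 9 code words
```

Since the 9 words are equally likely, a binary fixed-length code would need ceil(log2(9)) = 4 bits per word.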
tanszek/oktatas/techcomm/conditional_probability_and_information_theory.1762798327.txt.gz · Last modified: 2025/11/10 18:12 by kissa