tanszek:oktatas:techcomm:conditional_probability_and_information_theory
- How many bits of information does the statement $\text{"…"}$
- What is the entropy of this set of information (i.e., the average amount of information per news item)? //(1.25 bits)//
- What is the entropy of the set of information for a single draw? //(1 bit)//
- What would be the entropy of the set of information for two draws if the four balls were all different colors, i.e., one white, one black, one red, and one green? //(2.58 bits)// \\ \\
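The answers above can be checked numerically. The exact ball composition is not shown in this excerpt; the stated figures //(1 bit per single draw, 1.25 bits for two draws)// are consistent with an urn of 2 white and 2 black balls, drawn twice without replacement — that composition is an assumption in the sketch below:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed setup (inferred from the answers): 2 white + 2 black balls,
# two balls drawn without replacement. Unordered outcomes:
#   WW: (2/4)*(1/3) = 1/6,  BB: 1/6,  one of each: 2/3
print(round(entropy([1/6, 1/6, 2/3]), 2))  # 1.25 bits per news item

# Single draw: P(white) = P(black) = 1/2
print(entropy([1/2, 1/2]))                 # 1.0 bit

# Four distinct colors, two unordered draws without replacement:
# C(4,2) = 6 equally likely pairs, so H = log2(6)
print(round(entropy([1/6] * 6), 2))        # 2.58 bits
```

Note that for the all-different-colors case the entropy is simply $\log_2 6 \approx 2.58$ bits, since the six possible pairs are equally likely.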
- An automated machine has a defect rate of 10% (it produces 10 defective units out of 100 units on average).
- What is the information content of the news that, out of 10 items, 2 are selected and they are defect-free?
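If the two selected items are treated as independent (an assumption; with only 10 items the draws are not strictly independent), each is defect-free with probability 0.9, so both are defect-free with probability $0.9^2 = 0.81$, giving $I = -\log_2 0.81 \approx 0.304$ bits. A short check:

```python
import math

# Defect rate 10%, so each item is defect-free with probability 0.9.
# Assuming independence of the two selected items:
p_both_good = 0.9 ** 2            # = 0.81
info_bits = -math.log2(p_both_good)  # I = -log2(P)
print(round(info_bits, 3))        # 0.304 bits
```

A likely event carries little information, which is why the result is well under one bit.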
Last modified: 2024/10/15 18:32 by kissa