The average information content of the set of messages is called the //entropy// of the message set.
$$ H_E = \sum_{i=1}^n p_i \cdot I_{E_i} = \sum_{i=1}^n p_i \cdot \log_2 \frac{1}{p_i} = - \sum_{i=1}^n p_i \cdot \log_2 p_i \;[\text{bit}]$$
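As a minimal sketch (not part of the original notes), the entropy formula above can be evaluated directly in Python; the function name `entropy` is an illustrative choice:

```python
import math

def entropy(probabilities):
    # H = -sum(p_i * log2(p_i)), in bits; zero-probability
    # messages contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely messages: H = log2(4) = 2 bits per message.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

A uniform distribution maximizes the entropy: any unequal distribution over the same four messages yields less than 2 bits.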
**Example**: