The average information content of the set of messages is called the //entropy// of the message set.
  
$$ H_E = \sum_{i=1}^n p_i \cdot I_{E_i} = \sum_{i=1}^n p_i \cdot \log_2 \frac{1}{p_i} = - \sum_{i=1}^n p_i \cdot \log_2 p_i \;\; \mathrm{[bit]} $$
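As a quick illustration of the formula above, the following short Python sketch (not part of the original page; the function name ''entropy'' and the sample distributions are illustrative only) computes \( H_E \) for an arbitrary probability distribution:

<code python>
from math import log2

def entropy(probabilities):
    """Average information content H_E of a message set, in bits."""
    # Events with p_i = 0 contribute nothing (p * log2(1/p) -> 0 as p -> 0).
    return sum(p * log2(1.0 / p) for p in probabilities if p > 0)

# Four equally likely messages: H_E = 2 bits
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Two-event case with p1 = 0.5 and p2 = 1 - p1: H_E = 1 bit
print(entropy([0.5, 0.5]))                # 1.0
</code>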
  
**Example**: Given an event space consisting of two events: \( E = \{E_1, E_2\} \), and further \( p = \{p_1, p_2\} \) with \( p_2 = 1 - p_1 \), then the average information content is: