====== Formulas for Mathematical Exercises ======
==== Information Theory ====
^ Notation ^ Description ^ Formula ^
| $$I(A)$$ | Information content, or self-information, of an event A. | $$I(A) = -\log_2 P(A) \text{ [bits]}$$ |
| $$H(X)$$ | Entropy, which measures the average amount of information (or uncertainty) in a random variable X. | $$H(X) = -\sum_{x \in X} P(x) \log_2 P(x) \text{ [bits]}$$ |
| $$H_{max}$$ | Maximum possible entropy (when all outcomes are equally likely). | $$H_{\text{max}} = \log_2 |\mathcal{X}|$$, where $$|\mathcal{X}|$$ is the number of possible outcomes in the set $$\mathcal{X}$$ |
| $$R(X)$$ | Redundancy, which measures the portion of duplicative information within a message. | $$R(X) = 1 - \frac{H(X)}{\log_2 |\mathcal{X}|}$$ |
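The formulas in the table above can be checked numerically. The following is a minimal Python sketch (the function names are illustrative, not from the original page) that computes self-information, entropy, and redundancy for a discrete distribution given as a list of probabilities:

```python
import math

def self_information(p):
    """I(A) = -log2 P(A), in bits."""
    return -math.log2(p)

def entropy(probs):
    """H(X) = -sum P(x) log2 P(x), in bits; terms with P(x) = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """R(X) = 1 - H(X) / H_max, where H_max = log2 |X|."""
    h_max = math.log2(len(probs))
    return 1 - entropy(probs) / h_max

# A fair coin reaches the maximum entropy of 1 bit, so its redundancy is 0;
# a biased coin carries less than 1 bit per toss and is therefore redundant.
fair = [0.5, 0.5]
biased = [0.9, 0.1]
```

For example, `entropy(fair)` gives exactly 1 bit and `redundancy(fair)` gives 0, while the biased coin's entropy drops below 1 bit and its redundancy rises accordingly.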
==== Combinatorics ====
tanszek/oktatas/techcomm/formulas_for_mathematical_exercises · Last modified: 2024/10/15 18:47 by kissa