tanszek:oktatas:techcomm:formulas_for_mathematical_exercises

  
==== Information Theory ====

^ Notation     ^ Value                                                                                               ^ Formula ^
| $$I(A)$$     | Information content or self-information of an event A.                                              | $$I(A) = -\log_2 P(A) = \log_2 \frac{1}{P(A)}  \text{ [bits]}$$ |
| $$H(X)$$     | Entropy, which measures the average amount of information (or uncertainty) in a random variable X.  | $$H(X) = -\sum_{x \in X} P(x) \log_2 P(x) = \sum_{x \in X} P(x) \log_2 \frac{1}{P(x)}  \text{ [bits]}$$ |
| $$H_{max}$$  | Maximum possible entropy (when all outcomes are equally likely).                                    | $$H_{\text{max}} = \log_2 |\mathcal{X}|$$ $$|\mathcal{X}| \text{ is the number of possible outcomes in the set } \mathcal{X}$$ |
| $$R(X)$$     | Redundancy, which measures the portion of duplicative information within a message.                 | $$R = \frac{H_{\text{max}} - H}{H_{\text{max}}}$$ |
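The entropy and redundancy formulas above can be checked numerically with a short Python sketch (the helper names ''entropy'' and ''redundancy'' are our own, not part of any library):

```python
import math

def entropy(probs):
    """H(X) = -sum p * log2(p), in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """R = (H_max - H) / H_max, where H_max = log2(number of outcomes)."""
    h_max = math.log2(len(probs))
    return (h_max - entropy(probs)) / h_max

# A uniform distribution over 4 outcomes reaches the maximum entropy
# log2(4) = 2 bits, so its redundancy is 0.
print(entropy([0.25, 0.25, 0.25, 0.25]))     # 2.0
print(redundancy([0.25, 0.25, 0.25, 0.25]))  # 0.0

# A skewed distribution carries less information per symbol:
# H = 0.5*1 + 0.25*2 + 2*0.125*3 = 1.75 bits, so R = (2 - 1.75)/2 = 0.125.
print(entropy([0.5, 0.25, 0.125, 0.125]))    # 1.75
```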
==== Combinatorics ====
  
=== What formula to use? ===

| | |  **Repetition**  ||
| | |  //Not possible//  |  //Possible//  |
| **Order** | //Matters// |  $$V_n^k$$ (variation without repetition)  |  $$\overline{V}_n^k$$ (variation with repetition)  |
| ::: | //Doesn't matter// |  $$C_n^k$$ (combination without repetition)  |  $$\overline{C}_n^k$$ (combination with repetition)  |
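The four cases in the table can be evaluated directly with Python's standard library, using the standard formulas $$V_n^k = \frac{n!}{(n-k)!}$$, $$\overline{V}_n^k = n^k$$, $$C_n^k = \binom{n}{k}$$, and $$\overline{C}_n^k = \binom{n+k-1}{k}$$ (a sketch with example values n = 4, k = 2):

```python
from math import comb, perm

n, k = 4, 2
v     = perm(n, k)            # variation without repetition: n!/(n-k)!
v_rep = n ** k                # variation with repetition: n^k
c     = comb(n, k)            # combination without repetition: C(n, k)
c_rep = comb(n + k - 1, k)    # combination with repetition: C(n+k-1, k)

print(v, v_rep, c, c_rep)     # 12 16 6 10
```

Note that ''math.perm'' and ''math.comb'' require Python 3.8 or newer.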
tanszek/oktatas/techcomm/formulas_for_mathematical_exercises.1725632137.txt.gz · Last modified: 2024/09/06 14:15 by kissa