====== Formulas for Mathematical Exercises ======

  
==== Information Theory ====

^ Notation ^ Value ^ Formula ^
| $$I(A)$$ | Information content or self-information of an event A. | $$I(A) = -\log_2 P(A) = \log_2 \frac{1}{P(A)} \text{ [bits]}$$ |
| $$H(X)$$ | Entropy, which measures the average amount of information (or uncertainty) in a random variable X. | $$H(X) = -\sum_{x \in X} P(x) \log_2 P(x) = \sum_{x \in X} P(x) \log_2 \frac{1}{P(x)} \text{ [bits]}$$ |
| $$H_{max}$$ | Maximum possible entropy (when all outcomes are equally likely). | $$H_{\text{max}} = \log_2 |\mathcal{X}|$$, where $$|\mathcal{X}|$$ is the number of possible outcomes in the set $$\mathcal{X}$$. |
| $$R(X)$$ | Redundancy, which measures the portion of duplicative information within a message. | $$R = \frac{H_{\text{max}} - H}{H_{\text{max}}}$$ |
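These formulas can be sanity-checked numerically. A minimal Python sketch, using an illustrative four-symbol distribution (the probabilities are assumptions chosen for the example, not from the page):

```python
import math

# Illustrative 4-symbol source; probabilities are assumed for the example.
P = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Self-information: I(x) = -log2 P(x)
info = {x: -math.log2(p) for x, p in P.items()}

# Entropy: H(X) = -sum_x P(x) log2 P(x)
H = -sum(p * math.log2(p) for p in P.values())

# Maximum entropy (equally likely outcomes) and redundancy
H_max = math.log2(len(P))
R = (H_max - H) / H_max

print(info["a"])  # 1.0 bit
print(H)          # 1.75 bits
print(H_max)      # 2.0 bits
print(R)          # 0.125
```

Because every probability here is a negative power of two, the results are exact; for arbitrary distributions expect ordinary floating-point values.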
==== Combinatorics ====
  
^ ^ without repetition ^ with repetition ^
| **Permutations** \\ number of all possible arrangements of $n$ elements | $$P_n = n!$$ | $$P_n^{k_1, k_2, \dots, k_r} = \frac{n!}{k_1! \cdot k_2! \cdot \dots \cdot k_r!}$$ |
| **Variations** \\ number of all possible ordered arrangements of any $k$ elements from $n$ elements | $$V_n^k = \frac{n!}{(n-k)!}$$ | $$\overline{V}_n^k = n^k$$ |
| **Combinations** \\ number of ways to choose $k$ items from $n$ items, regardless of order | $$C_n^k = \binom{n}{k} = \frac{n!}{k! \cdot (n-k)!}$$ | $$\overline{C}_n^k = \binom{n+k-1}{k}$$ |
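Python's standard `math` module covers these counts directly (`factorial`, `perm`, `comb`). A short sketch with illustrative values of $n$ and $k$:

```python
from math import factorial, comb, perm

# Permutations without repetition: P_n = n!
p4 = factorial(4)  # arrangements of 4 distinct elements -> 24

# Permutations with repetition, e.g. the letters of "ABBA":
# n = 4, two A's (k1 = 2) and two B's (k2 = 2) -> 4! / (2! * 2!)
p_abba = factorial(4) // (factorial(2) * factorial(2))  # 6

# Variations without repetition: V_n^k = n! / (n - k)!
v = perm(5, 2)  # 20

# Variations with repetition: n^k
v_rep = 5 ** 2  # 25

# Combinations without repetition: C_n^k = binom(n, k)
c = comb(5, 2)  # 10

# Combinations with repetition: binom(n + k - 1, k)
c_rep = comb(5 + 2 - 1, 2)  # 15

print(p4, p_abba, v, v_rep, c, c_rep)
```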
=== What formula to use? ===

| | |  **Repetition**  ||
| | |  //Not possible//  |  //Possible//  |
| **Order** | //Matters// |  $$V_n^k$$ (variation without repetition)  |  $$\overline{V}_n^k$$ (variation with repetition)  |
| ::: | //Doesn't matter// |  $$C_n^k$$ (combination without repetition)  |  $$\overline{C}_n^k$$ (combination with repetition)  |
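Two worked examples of applying this decision table (the scenarios are illustrative): a 4-digit PIN is an ordered selection where digits may repeat, while a 5-out-of-90 lottery draw is an unordered selection without repetition:

```python
from math import comb

# 4-digit PIN: order matters, repetition possible
# -> variation with repetition: n^k = 10^4
pin_codes = 10 ** 4
print(pin_codes)  # 10000

# Drawing 5 numbers out of 90: order doesn't matter, no repetition
# -> combination without repetition: binom(90, 5)
lottery = comb(90, 5)
print(lottery)  # 43949268
```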
tanszek/oktatas/techcomm/formulas_for_mathematical_exercises · Last modified: 2024/09/06 12:28 by kissa