==== Probability and Conditional Probability ====
  
^ Notation ^ Description ^ Formula ^
| $$P(A)$$ | Probability of event A occurring. | $$P(A) = \frac{\text{Number of favorable outcomes for } A}{\text{Total number of possible outcomes}}$$ |
| $$P(A \mid B)$$ | Conditional probability of event A occurring, given that event B has occurred. | $$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$ |
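
A quick numerical check of the two formulas above, using one roll of a fair six-sided die as an illustrative sample space (the events A and B are assumptions chosen only for this sketch):

<code python>
from fractions import Fraction

# Illustrative sample space: one roll of a fair six-sided die.
outcomes = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event A: the roll is even
B = {4, 5, 6}   # event B: the roll is greater than 3

def P(event):
    # P(event) = number of favorable outcomes / total number of possible outcomes
    return Fraction(len(event & outcomes), len(outcomes))

print(P(A))             # P(A) = 1/2
print(P(A & B) / P(B))  # P(A|B) = P(A ∩ B) / P(B) = (1/3) / (1/2) = 2/3
</code>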
  
==== Information Theory ====
^ Notation ^ Description ^ Formula ^
| $$I(A)$$ | Information content or self-information of an event A. | $$I(A) = -\log_2 P(A) = \log_2 \frac{1}{P(A)} \text{ [bits]}$$ |
| $$H(X)$$ | Entropy, which measures the average amount of information (or uncertainty) in a random variable X. | $$H(X) = -\sum_{x \in X} P(x) \log_2 P(x) = \sum_{x \in X} P(x) \log_2 \frac{1}{P(x)} \text{ [bits]}$$ |
| $$H_{\text{max}}$$ | Maximum possible entropy (when all outcomes are equally likely). | $$H_{\text{max}} = \log_2 |\mathcal{X}|$$ $$|\mathcal{X}| \text{ is the number of possible outcomes in the set } \mathcal{X}$$ |
| $$R(X)$$ | Redundancy, which measures the portion of duplicative information within a message. | $$R(X) = \frac{H_{\text{max}} - H(X)}{H_{\text{max}}}$$ |
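
A minimal sketch of these quantities for an assumed four-symbol source (the probabilities below are illustrative only and must sum to 1):

<code python>
from math import log2

# Assumed source distribution (illustrative).
P = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Self-information of each symbol: I(x) = -log2 P(x), in bits.
I = {x: -log2(p) for x, p in P.items()}

# Entropy: H(X) = -sum over x of P(x) * log2 P(x), in bits.
H = -sum(p * log2(p) for p in P.values())

# Maximum entropy: log2 of the number of possible symbols.
H_max = log2(len(P))

# Redundancy: R(X) = (H_max - H(X)) / H_max.
R = (H_max - H) / H_max

print(I)             # {'a': 1.0, 'b': 2.0, 'c': 3.0, 'd': 3.0}
print(H, H_max, R)   # 1.75 2.0 0.125
</code>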

==== Combinatorics ====

^ ^ without repetition ^ with repetition ^
| **Permutations** \\ number of all possible arrangements of $n$ elements | $$P_n = n!$$ | $$P_n^{k_1, k_2, \dots, k_r} = \frac{n!}{k_1! \cdot k_2! \cdot \ldots \cdot k_r!}$$ |
| **Variations** \\ number of all possible arrangements of any $k$ elements from $n$ elements | $$V_n^k = \frac{n!}{(n-k)!}$$ | $$\overline{V}_n^k = n^k$$ |
| **Combinations** \\ number of ways to choose $k$ items from $n$ items, regardless of order | $$C_n^k = \binom{n}{k} = \frac{n!}{k! \cdot (n-k)!}$$ | $$\overline{C}_n^k = \binom{n+k-1}{k}$$ |
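
The sketch below evaluates each cell of the table for small illustrative values (n = 5, k = 2, and the multiset AABB for permutations with repetition); the numbers are assumptions chosen only for this cross-check:

<code python>
from math import comb, factorial, perm

n, k = 5, 2

perm_no_rep = factorial(n)                                   # P_n = n! = 120
perm_rep = factorial(4) // (factorial(2) * factorial(2))     # AABB: 4!/(2!*2!) = 6

var_no_rep = perm(n, k)                                      # V_n^k = n!/(n-k)! = 20
var_rep = n ** k                                             # n^k = 25

comb_no_rep = comb(n, k)                                     # C_n^k = 10
comb_rep = comb(n + k - 1, k)                                # (n+k-1 choose k) = 15

print(perm_no_rep, perm_rep, var_no_rep, var_rep, comb_no_rep, comb_rep)
</code>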

=== What formula to use? ===
| | |  **Repetition**  ||
| | |  //Not possible//  |  //Possible//  |
| **Order** | //Matters// |  $$V_n^k$$ (variation without repetition)  |  $$\overline{V}_n^k$$ (variation with repetition)  |
| ::: | //Doesn't matter// |  $$C_n^k$$ (combination without repetition)  |  $$\overline{C}_n^k$$ (combination with repetition)  |
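
As a hypothetical helper (the function name and example are not from the original page), the decision table can also be written as one small function that picks the matching formula:

<code python>
from math import comb, perm

def count_arrangements(n, k, order_matters, repetition_allowed):
    # Selects the counting formula according to the decision table above.
    if order_matters:
        return n ** k if repetition_allowed else perm(n, k)           # variations
    return comb(n + k - 1, k) if repetition_allowed else comb(n, k)   # combinations

# Example: 3-digit PIN codes from the digits 0-9
# (order matters, repetition allowed) -> 10^3 = 1000
print(count_arrangements(10, 3, order_matters=True, repetition_allowed=True))
</code>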