kb:estimation_methods

 For a given probability distribution $\mathbb{P}^\theta$ with parameter $\theta$, we can extract feature(s) $h^\theta = g(\mathbb{P}^\theta)$. We can also compute the same features for the empirical distribution, $\hat{h} = g(\hat{\mathbb{P}})$. We then estimate $\theta$ by solving $h^\theta = \hat{h}$.
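
As a concrete illustration (this example is not in the original page): for $X \sim \text{Exponential}(\theta)$, take the mean as the feature, $g(\mathbb{P}) = \mathbb{E}[X]$. Then

$$ h^\theta = \frac{1}{\theta}, \qquad \hat{h} = \frac{1}{n} \sum_{i=1}^n X_i, $$

and matching $h^\theta = \hat{h}$ gives the estimate $\hat{\theta} = 1/\bar{X}$.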
  
 ==== Method of moments ====
 
 Moments of distributions are commonly used as features for feature matching. The $k$-th moment of a random variable $X$ is $\mathbb{E}[X^k]$.
 
 To estimate the $k$-th moment from empirical data $X_1, \dots, X_n$, replace the expectation with the sample average:
 
 $$ \hat{\mathbb{E}}[X^k] = \frac{1}{n} \sum_{i=1}^n X_i^k $$
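
A minimal sketch of this recipe in Python (the distribution and helper names here are illustrative, not from the original page): for a normal sample, matching the first two moments $\mathbb{E}[X] = \mu$ and $\mathbb{E}[X^2] = \mu^2 + \sigma^2$ yields $\hat{\mu} = \hat{m}_1$ and $\hat{\sigma}^2 = \hat{m}_2 - \hat{m}_1^2$.

```python
import random

def sample_moment(xs, k):
    """k-th empirical moment: the average of x_i^k over the sample."""
    return sum(x ** k for x in xs) / len(xs)

def normal_mom(xs):
    """Method-of-moments estimates (mu, sigma^2) for a normal sample.

    Matching moments: E[X] = mu and E[X^2] = mu^2 + sigma^2,
    so mu_hat = m1 and sigma2_hat = m2 - m1^2.
    """
    m1 = sample_moment(xs, 1)
    m2 = sample_moment(xs, 2)
    return m1, m2 - m1 ** 2

# Sanity check on synthetic data with known parameters.
random.seed(0)
data = [random.gauss(3.0, 2.0) for _ in range(100_000)]
mu_hat, sigma2_hat = normal_mom(data)
```

With $10^5$ samples the estimates land close to the true values $\mu = 3$ and $\sigma^2 = 4$; matching higher moments extends the same idea to distributions with more parameters.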
 ===== Maximum likelihood estimator =====
  
Last modified: 2024-04-30 04:03 (external edit)