Exercise 3.5 Solution Example - Hoff, A First Course in Bayesian Statistical Methods


Answer

a)

The posterior distribution is proportional to the product of the prior mixture and the likelihood:

\begin{align*} p(\theta \mid y_1, \dots, y_n) &\propto \tilde{p}(\theta) p(y_1, \dots, y_n \mid \theta) \\ &= \sum_{k=1}^K w_k p_k(\theta) p(y_1, \dots, y_n \mid \theta) \end{align*}

Note that each component of this sum can be factored as:

\begin{align*} w_k p_k(\theta) p(y_1, \dots, y_n \mid \theta) &= w_k \kappa_k(n_k, t_k) c(\theta)^{n_k} e^{n_k t_k \phi(\theta)} \times \prod_{i=1}^n c(\theta) h(y_i) \exp \{\phi(\theta) t(y_i)\} \\ &= w_k \kappa_k(n_k, t_k) \prod_{i=1}^n h(y_i) \times c(\theta)^{n_k+n} e^{ \phi(\theta) (n_k t_k + \sum t(y_i))} \\ &= w_k \underbrace{ \left(\prod_{i=1}^n h(y_i)\right)\frac{\kappa_k(n_k,t_k)}{\kappa_k(n_k+n,t_k^\ast)}}_{ m_k(\bm y)} \;\times\; \underbrace{\kappa_k(n_k+n,t_k^\ast)\,c(\theta)^{n_k+n}e^{ \phi(\theta) (n_k t_k+\sum t(y_i))}}_{p_k(\theta\mid \bm y)} \\ &= w_k m_k(\bm y) p_k(\theta\mid \bm y) \end{align*}

where \[ t_k^\ast = \frac{n_k t_k + n \bar{t}(\bm{y})}{n_k + n}, \quad \bar{t}(\bm{y}) = \frac{1}{n} \sum_{i=1}^n t(y_i). \] Since \((n_k + n)\, t_k^\ast = n_k t_k + \sum_{i=1}^n t(y_i)\), each \(p_k(\theta \mid \bm y)\) is again a member of the conjugate family, with parameters updated from \((n_k, t_k)\) to \((n_k + n, t_k^\ast)\).

Thus, the posterior becomes:

\begin{align*} p(\theta \mid y_1, \dots, y_n) &\propto \sum_{k=1}^K w_k m_k(\bm y) p_k(\theta\mid \bm y) \end{align*}

Since each \(p_k(\theta \mid \bm y)\) integrates to one, normalizing the weights so that they sum to unity yields the posterior: \[ p(\theta \mid y_1, \dots, y_n) = \sum_{k=1}^K w_k' p_k(\theta\mid \bm y) \quad \text{where} \quad w_k' = \frac{w_k m_k(\bm y)}{\sum_{j=1}^K w_j m_j(\bm y)} \] The posterior is thus again a mixture of conjugate priors, with updated parameters and reweighted components.
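For concreteness, here is a minimal Python sketch of the weight update \(w_k' \propto w_k m_k(\bm y)\) (the function name and example numbers are mine, not from the text). Working with \(\log m_k\) and a log-sum-exp avoids underflow, since marginal likelihoods are typically tiny.

```python
import numpy as np
from scipy.special import logsumexp

def posterior_weights(log_w, log_m):
    """w'_k = w_k m_k(y) / sum_j w_j m_j(y), computed in log space."""
    log_unnorm = np.asarray(log_w) + np.asarray(log_m)
    return np.exp(log_unnorm - logsumexp(log_unnorm))

# Hypothetical example: three components with made-up log marginal likelihoods.
w = np.array([0.5, 0.3, 0.2])
log_m = np.array([-102.3, -98.7, -120.0])
print(posterior_weights(np.log(w), log_m))  # nonnegative, sums to 1
```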

b)

This is a special case of part (a). The Poisson probability mass function is: \[p(y \mid \theta) = \frac{\theta^y e^{-\theta}}{y!}\] To express this in the general form of an exponential family, we rewrite \(\theta^y\) as \(\exp(y \log \theta)\): \[ p(y \mid \theta) = \underbrace{ \frac{1}{y!} }_{h(y)} \times \underbrace{e^{-\theta}}_{c(\theta)} \times \exp\{ \underbrace{\log \theta}_{\phi(\theta)} \times \underbrace{y}_{t(y)} \} \]
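As a quick numerical sanity check of this factorization (a throwaway sketch; the values of \(\theta\) and \(y\) below are arbitrary choices of mine), the three pieces multiply back to the Poisson pmf:

```python
import math
from scipy.stats import poisson

theta, y = 3.5, 4
h = 1.0 / math.factorial(y)               # h(y) = 1/y!
c = math.exp(-theta)                      # c(theta) = e^{-theta}
kernel = math.exp(math.log(theta) * y)    # exp{phi(theta) * t(y)}
assert abs(h * c * kernel - poisson.pmf(y, theta)) < 1e-12
```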

Next, each mixture component of the prior is a Gamma density: \(p_k(\theta \mid a_k, b_k) = \frac{b_k^{a_k}}{\Gamma(a_k)} \theta^{a_k-1} \exp \left( - b_k \theta \right)\). Comparing this to the conjugate prior form in (a) yields the identifications:

\begin{align*} t(y_i) &= y_i \\ \phi(\theta) &= \log \theta\\ c(\theta) &= e^{-\theta} \\ h(y_i) &= \frac{1}{y_i!} \\ n_k &= b_k, \quad n_k t_k = a_k - 1 \end{align*}
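With these identifications, the conjugate form from (a) reduces exactly to the Gamma density, which also pins down the normalizing constant:

\[ \kappa_k(n_k, t_k)\, c(\theta)^{n_k}\, e^{n_k t_k \phi(\theta)} = \kappa_k(n_k, t_k)\, e^{-b_k \theta}\, \theta^{a_k - 1}, \qquad \text{so} \quad \kappa_k(n_k, t_k) = \frac{b_k^{a_k}}{\Gamma(a_k)}. \]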

Substituting these into the expression for \(m_k(\bm{y})\): \[ m_k(\bm{y}) = \left(\prod_{i=1}^n h(y_i)\right)\frac{\kappa_k(n_k,t_k)}{\kappa_k(n_k+n,t_k^\ast)} = \left(\prod_{i=1}^n \frac{1}{y_i!}\right) \frac{b_k^{a_k}}{\Gamma(a_k)} \frac{\Gamma(a_k + \sum y_i)}{(b_k + n)^{a_k + \sum y_i}} \]
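Since \(m_k(\bm{y})\) is exactly the marginal likelihood of the data under the \(k\)-th component, the closed form above can be checked against direct numerical integration of \(\int_0^\infty p_k(\theta)\, p(y_1, \dots, y_n \mid \theta)\, d\theta\). A hedged Python sketch (the data and hyperparameters below are made up):

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import gamma, poisson
from scipy.integrate import quad

y = np.array([2, 0, 3, 1, 4])    # made-up Poisson counts
a_k, b_k = 2.0, 1.0              # hypothetical Gamma(a_k, b_k) component
n, s = len(y), y.sum()

# Closed form for log m_k(y), including the prod(1/y_i!) factor.
log_m = (-gammaln(y + 1).sum()
         + a_k * np.log(b_k) - gammaln(a_k)
         + gammaln(a_k + s) - (a_k + s) * np.log(b_k + n))

# Numerical check: integrate prior * likelihood over theta.
integrand = lambda t: gamma.pdf(t, a_k, scale=1.0 / b_k) * poisson.pmf(y, t).prod()
val, _ = quad(integrand, 0, np.inf)
assert np.isclose(np.exp(log_m), val)
```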

Then the posterior distribution is: \[ p(\theta \mid y_1, \dots, y_n) = \sum_{k=1}^K w'_k p_k(\theta\mid \bm y), \] where

\begin{align*} p_k(\theta \mid \bm y) &\propto \left( e^{-\theta} \right)^{b_k+n} \times \exp \left\{ \log \theta \left(a_k - 1 + \sum y_i \right) \right\} \\ &= \theta^{a_k + \sum y_i - 1} \exp \left\{ - (b_k + n) \theta \right\}, \end{align*}

so that \(p_k(\theta \mid \bm y) = \text{dgamma}(\theta \mid a_k + \sum y_i,\; b_k + n)\), and

\begin{align*} w'_k &= \frac{w_k m^{\ast}_k(\bm y)}{\sum_{j=1}^K w_j m^{\ast}_j(\bm y)}, \quad m^{\ast}_k(\bm y) = \frac{b_k^{a_k}}{\Gamma(a_k)} \frac{\Gamma(a_k + \sum y_i)}{(b_k + n)^{a_k + \sum y_i}}. \end{align*}

Here \(m^{\ast}_k(\bm y)\) drops the factor \(\prod_{i=1}^n 1/y_i!\) from \(m_k(\bm y)\); since that factor is common to every component, it cancels in the normalized weights.
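Putting everything together, here is a short, self-contained sketch of the full update for part (b) (the weights \(w_k\), hyperparameters, and data below are illustrative choices of mine, not from Hoff):

```python
import numpy as np
from scipy.special import gammaln, logsumexp
from scipy.stats import gamma

y = np.array([2, 0, 3, 1, 4])          # made-up Poisson counts
n, s = len(y), y.sum()
w = np.array([0.45, 0.10, 0.45])       # hypothetical prior weights w_k
a = np.array([2.0, 8.0, 1.0])          # hypothetical Gamma shapes a_k
b = np.array([1.0, 4.0, 0.5])          # hypothetical Gamma rates b_k

# log m*_k(y); the common prod(1/y_i!) factor is omitted since it cancels.
log_m = a * np.log(b) - gammaln(a) + gammaln(a + s) - (a + s) * np.log(b + n)
log_w_post = np.log(w) + log_m
w_post = np.exp(log_w_post - logsumexp(log_w_post))   # updated weights w'_k

def posterior_pdf(theta):
    """Mixture of dgamma(theta | a_k + s, b_k + n) with weights w'_k."""
    return np.dot(w_post, gamma.pdf(theta, a + s, scale=1.0 / (b + n)))

print(w_post)
print(np.dot(w_post, (a + s) / (b + n)))  # posterior mean of theta
```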

Author: Kaoru Babasaki

Email: bbkaoru1007@keio.jp

Last Updated: 2026-03-21 Sat 19:43

Home | GitHub