Exercise 4.3
Let \(X_1,\ldots,X_n \vert \theta \overset{\mathrm{iid}}{\sim} \mathrm{Expon}(\theta)\).
a) Show that Jeffreys’ prior is \(p(\theta)\propto1/\theta\). Is it proper?
The likelihood is \[
p(x_1,\ldots,x_n\mid\theta) = \prod_{i=1}^n \theta e^{-\theta x_i} = \theta^{n}e^{-\theta\sum_{i=1}^{n}x_{i}}
\] with log likelihood \[
\log p(x_1,\ldots,x_n\mid\theta)=n\log\theta-\theta\sum_{i=1}^{n}x_{i}
\] and first derivative \[
\frac{d\log p(x_1,\ldots,x_n\mid\theta)}{d\theta}=\frac{n}{\theta}-\sum_{i=1}^{n}x_{i}
\] and second derivative \[
\frac{d^{2}\log p(x_1,\ldots,x_n\mid\theta)}{d\theta^{2}}=-\frac{n}{\theta^{2}}
\] so the Fisher information is \[
I(\theta)=E\left(-\frac{d^{2}\log p(x_1,\ldots,x_n\mid\theta)}{d\theta^{2}}\right)=\frac{n}{\theta^{2}}
\] and Jeffreys’ prior is therefore \[
p(\theta) \propto I(\theta)^{1/2} = \frac{\sqrt{n}}{\theta} \propto \frac{1}{\theta}.
\] Jeffreys’ prior is not proper for the iid exponential model since its integral over \((0,\infty)\) diverges. Since \(1/\theta > 0\) for all \(\theta > 0\), we can bound the integral from below as follows: \[
\int_{0}^{\infty}\frac{1}{\theta}d\theta \, \geq
\int_{1}^{\infty}\frac{1}{\theta}d\theta :=
\lim_{a\rightarrow \infty} \int_1^a \frac{1}{\theta}d\theta
= \lim_{a\rightarrow \infty} [\log(\theta)]_1^a = \lim_{a \rightarrow \infty}\big(\log(a) - 0 \big) = \infty.
\]
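As a sanity check (a sketch, not part of the derivation), the second-derivative computation behind the Fisher information can be verified symbolically, for instance with sympy:

```python
import sympy as sp

# symbols: theta > 0 is the rate, n the sample size, s stands in for sum of x_i
theta, n, s = sp.symbols('theta n s', positive=True)

# log-likelihood of n iid Expon(theta) observations with data sum s
loglik = n * sp.log(theta) - theta * s

# observed information: minus the second derivative of the log-likelihood
obs_info = -sp.diff(loglik, theta, 2)
print(sp.simplify(obs_info))  # n/theta**2, matching the Fisher information above
```

Note that the second derivative does not involve the data at all, so taking the expectation in \(I(\theta)\) is trivial here.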
b) Derive the posterior of \(\theta\) for Jeffreys’ prior. Is it proper?
The posterior distribution is obtained in the usual way with Bayes’ theorem \[
p(\theta \vert x_1,\ldots,x_n) \propto p(x_1,\ldots,x_n \vert \theta)\,p(\theta) = \theta^{n}e^{-\theta\sum_{i=1}^{n}x_{i}} \cdot \frac{1}{\theta} = \theta^{n-1}e^{-\theta\sum_{i=1}^{n}x_{i}}
\] which is proportional to a \(\mathrm{Gamma}(n,\sum_{i=1}^n x_i)\) density. A Gamma density is proper when both of its parameters are strictly positive, which holds here for any \(n \geq 1\): all \(x_i>0\) with probability one under the exponential distribution, so any realized data set has \(\sum_{i=1}^n x_i > 0\). The posterior is therefore proper.
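Propriety can also be illustrated numerically (a sketch with simulated data; the true \(\theta\), seed, and sample size are arbitrary choices): the unnormalized posterior kernel integrates to the finite Gamma normalizing constant \(\Gamma(n)/(\sum_i x_i)^n\).

```python
import numpy as np
from scipy import integrate, special

rng = np.random.default_rng(0)
n = 5
x = rng.exponential(scale=1 / 2.0, size=n)  # simulated data, true theta = 2
s = x.sum()

# unnormalized posterior kernel: theta^(n-1) * exp(-theta * s)
def kernel(t):
    return t ** (n - 1) * np.exp(-t * s)

Z, _ = integrate.quad(kernel, 0, np.inf)

# normalizing constant of a Gamma(n, s) density: Gamma(n) / s^n
print(np.isclose(Z, special.gamma(n) / s ** n))  # True: the integral is finite
```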
c) Find a way to motivate the particular form of the Jeffreys prior as non-informative prior, in addition to its inherent invariance property.
One way to view the Jeffreys’ prior as non-informative is to consider the posterior from the conjugate prior \(\theta \sim \mathrm{Gamma}(\alpha,\beta)\), which from Exercise @q:iid_exp_conj is known to be \[
\theta \vert x_1,\ldots,x_n \sim \mathrm{Gamma}(\alpha + n, \beta +\sum_{i=1}^n x_i).
\] The conjugate prior can therefore be seen as carrying the information of a prior sample of \(\alpha\) (imaginary) observations with a data sum equal to \(\beta\). Given the form of the Jeffreys’ posterior \(\mathrm{Gamma}(n,\sum_{i=1}^n x_i)\), we see that Jeffreys’ prior is equivalent to a prior sample of \(\alpha = 0\) observations with data sum \(\beta = 0\), i.e. no prior information in this sense.
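This interpretation can be illustrated numerically (a sketch; the prior settings, seed, and simulated data are arbitrary): as \(\alpha\) and \(\beta\) shrink toward zero, the conjugate posterior mean \((\alpha+n)/(\beta+\sum_i x_i)\) approaches the Jeffreys posterior mean \(n/\sum_i x_i = 1/\bar{x}\), the maximum likelihood estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.exponential(scale=1 / 3.0, size=n)  # simulated data, true theta = 3
s = x.sum()

# posterior mean under a Gamma(alpha, beta) prior: (alpha + n) / (beta + s)
for alpha, beta in [(2.0, 1.0), (0.2, 0.1), (0.0, 0.0)]:  # (0, 0) = Jeffreys limit
    print(f"alpha={alpha}, beta={beta}: posterior mean = {(alpha + n) / (beta + s):.4f}")

# with alpha = beta = 0 the posterior mean equals the MLE 1/mean(x)
```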