
Question: Maximum Likelihood Estimation for the Poisson Distribution

Let $X_1, X_2, \ldots, X_n$ be independent and identically distributed (i.i.d.) random variables, where each $X_i$ follows a Poisson distribution with parameter $\lambda > 0$. The probability mass function (PMF) of a Poisson random variable is

$$f_X(x; \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots$$

where $\lambda$ is the rate parameter of the Poisson distribution.

(a) Write the likelihood function $L(\lambda)$ for the sample $X_1, X_2, \ldots, X_n$.
(b) Derive the log-likelihood function $\ell(\lambda) = \ln L(\lambda)$.
(c) Find the maximum likelihood estimator (MLE) for $\lambda$ by solving $\frac{\partial \ell(\lambda)}{\partial \lambda} = 0$.
(d) Verify that the second derivative of the log-likelihood function at the MLE is negative, confirming that the MLE is indeed a maximum.
(e) Find the Fisher information for $\lambda$, $I(\lambda) = -E\left[\frac{\partial^2 \ell(\lambda)}{\partial \lambda^2}\right]$.
(f) Using the MLE and the Fisher information, calculate the Cramér-Rao lower bound for the variance of the MLE.
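
A sketch of the standard derivation for parts (a) through (d), using only the PMF given above (a reference outline, not the full worked solution on this page):

(a) By independence, the likelihood is the product of the individual PMFs:
$$L(\lambda) = \prod_{i=1}^{n} \frac{\lambda^{X_i} e^{-\lambda}}{X_i!} = \frac{\lambda^{\sum_{i=1}^{n} X_i}\, e^{-n\lambda}}{\prod_{i=1}^{n} X_i!}$$

(b) Taking logarithms gives the log-likelihood:
$$\ell(\lambda) = \left(\sum_{i=1}^{n} X_i\right)\ln\lambda - n\lambda - \sum_{i=1}^{n}\ln(X_i!)$$

(c) Setting the first derivative to zero:
$$\frac{\partial \ell(\lambda)}{\partial \lambda} = \frac{\sum_{i=1}^{n} X_i}{\lambda} - n = 0 \quad\Longrightarrow\quad \hat{\lambda} = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{X}$$

(d) The second derivative,
$$\frac{\partial^2 \ell(\lambda)}{\partial \lambda^2} = -\frac{\sum_{i=1}^{n} X_i}{\lambda^2},$$
is negative for any sample with $\sum_i X_i > 0$, so $\hat{\lambda} = \bar{X}$ is indeed a maximum.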

Studdy Solution
The Cramér-Rao lower bound for the variance of the MLE is given by the inverse of the Fisher information:

$$\text{Var}(\hat{\lambda}) \geq \frac{1}{I(\lambda)} = \frac{\lambda}{n}$$
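
For reference, the Fisher information used in this step follows from the second derivative of the log-likelihood (a standard computation, sketched here):

$$I(\lambda) = -E\left[\frac{\partial^2 \ell(\lambda)}{\partial \lambda^2}\right] = -E\left[-\frac{\sum_{i=1}^{n} X_i}{\lambda^2}\right] = \frac{n\,E[X_1]}{\lambda^2} = \frac{n\lambda}{\lambda^2} = \frac{n}{\lambda},$$

so $\frac{1}{I(\lambda)} = \frac{\lambda}{n}$.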
Using $\hat{\lambda} = \bar{X}$, the estimated bound becomes:

$$\text{Var}(\hat{\lambda}) \geq \frac{\bar{X}}{n}$$
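
Since $\text{Var}(\bar{X}) = \text{Var}(X_1)/n = \lambda/n$ exactly for i.i.d. Poisson data, the MLE attains the Cramér-Rao lower bound. A minimal numerical sketch of this check, assuming NumPy is available (the sample size n = 50, rate lambda = 3.0, and trial count are arbitrary illustration values):

import numpy as np

rng = np.random.default_rng(0)
n, lam, trials = 50, 3.0, 20000  # arbitrary illustration values

# The MLE of lambda for each simulated sample is the sample mean
mles = rng.poisson(lam, size=(trials, n)).mean(axis=1)

print("empirical Var(lambda_hat):", mles.var())
print("Cramer-Rao lower bound lambda/n:", lam / n)

The empirical variance of the simulated MLEs should land close to the bound lambda/n = 0.06.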
