
Question 1. (5.2 MLE) A random sample of size $n = 9$ is taken from the probability function:
$$p_Y(k;\lambda) = \lambda^k (1-\lambda)^{1-k}, \quad k = 0, 1; \quad 0 < \lambda < 1$$
The samples are $Y_1=1,\ Y_2=1,\ Y_3=1,\ Y_4=0,\ Y_5=0,\ Y_6=1,\ Y_7=1,\ Y_8=0,\ Y_9=1$. Find the maximum likelihood estimate $\hat{\lambda}$ for $\lambda$.

Studdy Solution

STEP 1

What is this asking? We're trying to find the best guess for a hidden probability ($\lambda$) based on some observations of ones and zeros. Watch out! Don't mix up the *number* of ones and zeros with the *probability* of getting a one or a zero.

STEP 2

1. Define the likelihood function
2. Take the logarithm of the likelihood function
3. Optimize the log-likelihood

STEP 3

Alright, let's **define the likelihood function**, which tells us how likely our observations are given a specific value of $\lambda$.
Think of it like this: we've flipped a weird coin 9 times, and we got some ones (heads) and some zeros (tails).
We want to figure out the probability ($\lambda$) of getting a one (heads) that best explains our results!

STEP 4

The probability function for a single observation $Y_i$ is $p_Y(k;\lambda) = \lambda^k (1-\lambda)^{1-k}$.
Since our samples are independent, the likelihood function is the product of the individual probabilities for each observation:
$$L(\lambda) = \prod_{i=1}^{9} \lambda^{Y_i}(1-\lambda)^{1-Y_i}$$
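As a quick sanity check, this product can be evaluated directly in Python (a minimal sketch; the function name `likelihood` is just for illustration):

```python
# Observed sample: six ones and three zeros
samples = [1, 1, 1, 0, 0, 1, 1, 0, 1]

def likelihood(lam, ys=samples):
    """Product of Bernoulli pmfs lam^y * (1 - lam)^(1 - y) over the sample."""
    prod = 1.0
    for y in ys:
        prod *= lam**y * (1 - lam)**(1 - y)
    return prod
```

Evaluating at a few candidate values already hints at the answer: `likelihood(2/3)` comes out larger than `likelihood(0.5)`.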

STEP 5

Now, products can be tricky to work with, so let's take the **natural logarithm** of the likelihood function.
This turns the product into a sum, which is *way* easier to handle, and it doesn't change where the maximum occurs!
This gives us the **log-likelihood**:
$$\ln L(\lambda) = \sum_{i=1}^{9} \ln\left(\lambda^{Y_i}(1-\lambda)^{1-Y_i}\right) = \sum_{i=1}^{9} \left[Y_i \ln\lambda + (1-Y_i)\ln(1-\lambda)\right]$$

STEP 6

Let's simplify this further.
We know the values of our $Y_i$'s: six ones and three zeros.
So, the sum becomes:
$$\ln L(\lambda) = 6\ln\lambda + 3\ln(1-\lambda)$$
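The simplification can be verified numerically: summing the per-observation terms gives the same value as the collapsed form (a sketch; the function names are illustrative):

```python
import math

samples = [1, 1, 1, 0, 0, 1, 1, 0, 1]

def log_likelihood(lam, ys=samples):
    """Sum of Y_i*ln(lam) + (1 - Y_i)*ln(1 - lam) over the sample."""
    return sum(y * math.log(lam) + (1 - y) * math.log(1 - lam) for y in ys)

def simplified(lam):
    """Collapsed form: 6*ln(lam) + 3*ln(1 - lam)."""
    return 6 * math.log(lam) + 3 * math.log(1 - lam)
```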

STEP 7

To find the **maximum likelihood estimate**, we need the value of $\lambda$ that maximizes our log-likelihood.
We do this by taking the derivative with respect to $\lambda$ and setting it equal to zero:
$$\frac{d}{d\lambda} \ln L(\lambda) = \frac{6}{\lambda} - \frac{3}{1-\lambda}$$

STEP 8

Setting the derivative equal to zero and solving for $\lambda$:
$$\frac{6}{\lambda} - \frac{3}{1-\lambda} = 0$$
$$\frac{6}{\lambda} = \frac{3}{1-\lambda}$$
$$6(1-\lambda) = 3\lambda$$
$$6 - 6\lambda = 3\lambda$$
$$6 = 9\lambda$$
$$\hat{\lambda} = \frac{6}{9} = \frac{2}{3}$$
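The algebra above can be double-checked with a simple grid search over $(0, 1)$ (a sketch; the grid resolution of 1/10000 is an arbitrary choice):

```python
import math

def log_likelihood(lam):
    # Simplified log-likelihood from the derivation: 6*ln(lam) + 3*ln(1 - lam)
    return 6 * math.log(lam) + 3 * math.log(1 - lam)

# Evaluate on a fine grid strictly inside (0, 1) and take the argmax
grid = [i / 10000 for i in range(1, 10000)]
best = max(grid, key=log_likelihood)
# best lands within the grid resolution of 2/3
```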

STEP 9

Our **maximum likelihood estimate** for $\lambda$ is $\hat{\lambda} = \frac{2}{3}$.
So, based on our observations, the best guess for the probability of getting a one is $\frac{2}{3}$!
