้ข็ฎ
BU.232.630.W4.SP25 sample_quiz_3
ๅ้กน้ๆฉ้ข
Consider the likelihood of an i.i.d. sample from a Bernoulli population with parameter p: L(x_1, ..., x_T) = ∏_{t=1}^T p^{x_t} (1 − p)^{1 − x_t}. If you estimate the parameter p using a Maximum Likelihood estimator, you obtain the point estimate p̂ = (1/T) ∑_{t=1}^T x_t, which corresponds to the sample mean. We know that for a Bernoulli random variable the expected value and the variance are E(x_t) = p and V(x_t) = p(1 − p). Using this information, what is the variance V(p̂) of the estimator?
้้กน
A. The variance of p̂ is V(p̂) = p²
B. The variance of p̂ is V(p̂) = p(1 − p)/T
C. The variance of p̂ is V(p̂) = p(1 − p)
D. The variance of p̂ is V(p̂) = E(p²)
E. All the answers are incorrect.
ๆฅ็่งฃๆ
ๆ ๅ็ญๆก
Please login to view
ๆ่ทฏๅๆ
We start by considering what we know about the Bernoulli(p) population and the estimator p̂. The data x_1, ..., x_T are i.i.d. Bernoulli(p), and p̂ = (1/T) ∑ x_t is the sample mean. For a Bernoulli random variable, E(x_t) = p and V(x_t) = p(1 − p). Since p̂ is the average of T independent copies, its variance is V(p̂) = V((1/T) ∑ x_t) = (1/T²) ∑ V(x_t) by independence, which simplifies to (1/T²) · T · p(1 − p) = p(1 − p)/T.
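As a sanity check, the derivation above can be verified numerically: simulate many Bernoulli samples of size T, compute p̂ for each, and compare the empirical variance of p̂ with p(1 − p)/T. This is a minimal sketch; the values p = 0.3, T = 50, and the number of simulations are arbitrary choices for illustration.

```python
import random

random.seed(0)

p, T, n_sims = 0.3, 50, 20000

# Draw n_sims i.i.d. Bernoulli(p) samples of size T and record p-hat for each.
p_hats = []
for _ in range(n_sims):
    sample = [1 if random.random() < p else 0 for _ in range(T)]
    p_hats.append(sum(sample) / T)

# Empirical variance of the estimator across simulations.
mean_p_hat = sum(p_hats) / n_sims
emp_var = sum((x - mean_p_hat) ** 2 for x in p_hats) / n_sims

# Theoretical variance p(1 - p)/T = 0.3 * 0.7 / 50 = 0.0042.
theo_var = p * (1 - p) / T

print(round(emp_var, 4), round(theo_var, 4))
```

The two numbers agree closely, which is consistent with answer B rather than the other options (p², p(1 − p), or E(p²) do not shrink with the sample size T).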
็ฑปไผผ้ฎ้ข
Consider a population with mean ฮผ and variance ฯ2<โ. Assume the following two estimators ห ฮผ 1 and ห ฮผ 2 for the mean of the population ฮผ, with the following expected values and variances E( ห ฮผ 1)=ฮผ;V( ห ฮผ 1)=5; E( ห ฮผ 1)=ฮผ+1;V( ห ฮผ 2)=2. We also know that the covariance between the two estimators is COV( ห ฮผ 1, ห ฮผ 2)=โ1. Now consider a new estimator that combines the two previous ones ห ฮผ 3= 1 3 ห ฮผ 1+ 2 3 ห ฮผ 2. Then the variance V( ห ฮผ 3) of ห ฮผ 3 is
Consider a population with mean ๐ and variance ๐ 2 < โ . You are comparing two estimators ๐ ฬ 1 and ๐ ฬ 2 for the mean of the population ๐ , with the following expected values and variances ๐ธ ( ๐ ฬ 1 ) = ๐ ; ๐ ( ๐ ฬ 1 ) = 9 ; ๐ธ ( ๐ ฬ 1 ) = ๐ + 1 ; ๐ ( ๐ ฬ 2 ) = 1 . We also know that the covariance between the two estimators is ๐ถ ๐ ๐ ( ๐ ฬ 1 , ๐ ฬ 2 ) = โ 2 . Now consider a new estimator that combines the two previous ones ๐ ฬ 3 = 1 4 ๐ ฬ 1 + 3 4 ๐ ฬ 2 . Then the variance ๐ ( ๐ ฬ 3 ) of ๐ ฬ 3 is
ไฝ็ฝฎ2็้ฎ้ข The variance of the estimator for E[Y]E\left\lbrack Y\right\rbrack at a given point x0x_0 decreases as the sample size increases.The variance of the estimator for E[Y]E\left\lbrack Y\right\rbrack at a given point x0x_0 decreases as the sample size increases.TrueFalse้ข็ฎ่งฃๆ
Consider a population with mean ฮผ and variance ฯ2<โ. Assume the following two estimators ห ฮผ 1 and ห ฮผ 2 for the mean of the population ฮผ, with the following expected values and variances E( ห ฮผ 1)=ฮผ;V( ห ฮผ 1)=5; E( ห ฮผ 1)=ฮผ+1;V( ห ฮผ 2)=2. We also know that the covariance between the two estimators is COV( ห ฮผ 1, ห ฮผ 2)=โ1. Now consider a new estimator that combines the two previous ones ห ฮผ 3= 1 3 ห ฮผ 1+ 2 3 ห ฮผ 2. Then the variance V( ห ฮผ 3) of ห ฮผ 3 is
ๆดๅค็ๅญฆ็ๅฎ็จๅทฅๅ ท
ๅธๆไฝ ็ๅญฆไน ๅๅพๆด็ฎๅ
ๅ ๅ ฅๆไปฌ๏ผ็ซๅณ่งฃ้ ๆตท้็้ข ไธ ็ฌๅฎถ่งฃๆ๏ผ่ฎฉๅคไน ๅฟซไบบไธๆญฅ๏ผ