้ข˜็›ฎ
้ข˜็›ฎ

BU.232.630.W6.SP25 sample_quiz_3

ๅ•้กน้€‰ๆ‹ฉ้ข˜

Consider the likelihood of an i.i.d. sample from a Bernoulli population with parameter $p$: $L(x_1, \dots, x_T) = \prod_{t=1}^{T} p^{x_t}(1-p)^{1-x_t}$. If you estimate the parameter $p$ using a Maximum Likelihood estimator, you obtain the point estimate $\hat{p} = \frac{1}{T}\sum_{t=1}^{T} x_t$, which corresponds to the sample mean. We know that for a Bernoulli random variable the expected value and the variance are $\mathbb{E}(x_t) = p$ and $\mathbb{V}(x_t) = p(1-p)$. Using this information, what is the variance of the estimator, $\mathbb{V}(\hat{p})$?
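For reference, the point estimate quoted in the question (it is stated there, not derived) can be recovered by setting the derivative of the log-likelihood to zero; this short derivation is a supplement to the original question:

$$\log L(x_1,\dots,x_T) = \sum_{t=1}^{T}\Big[x_t \log p + (1-x_t)\log(1-p)\Big], \qquad \frac{\partial \log L}{\partial p} = \frac{\sum_{t=1}^{T} x_t}{p} - \frac{T - \sum_{t=1}^{T} x_t}{1-p} = 0 \;\Longrightarrow\; \hat{p} = \frac{1}{T}\sum_{t=1}^{T} x_t.$$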

ๆŸฅ็œ‹่งฃๆž

ๆŸฅ็œ‹่งฃๆž

ๆ ‡ๅ‡†็ญ”ๆกˆ
Please login to view
ๆ€่ทฏๅˆ†ๆž
The question presents an i.i.d. Bernoulli($p$) sample $x_1, x_2, \dots, x_T$ and states that the MLE for $p$ is the sample mean $\hat{p} = \frac{1}{T}\sum_{t=1}^{T} x_t$. To find $\mathbb{V}(\hat{p})$, use the properties of independent Bernoulli trials. Each $x_t$ has $\mathbb{V}(x_t) = p(1-p)$. Since the $x_t$ are independent, the variance of the sum is the sum of the variances, so $\mathbb{V}\left(\sum_{t=1}^{T} x_t\right) = T\,p(1-p)$. Scaling a random variable by the constant $1/T$ multiplies its variance by $1/T^2$, hence $\mathbb{V}(\hat{p}) = \frac{1}{T^2}\cdot T\,p(1-p) = \frac{p(1-p)}{T}$.
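A minimal simulation sketch of this result, assuming illustrative values $p = 0.3$ and $T = 50$ (not fixed by the original question): it compares the empirical variance of the sample mean across many replications with the theoretical value $p(1-p)/T$.

```python
import numpy as np

# Illustrative values only; the original question leaves p and T symbolic.
p, T = 0.3, 50
n_reps = 200_000

rng = np.random.default_rng(0)
# Each row is one i.i.d. Bernoulli(p) sample of size T.
samples = rng.binomial(1, p, size=(n_reps, T))
p_hat = samples.mean(axis=1)          # MLE (sample mean) for each replication

print(p_hat.var())                    # empirical variance of the estimator
print(p * (1 - p) / T)                # theoretical value p(1-p)/T = 0.0042
```

The two printed numbers should agree to within Monte Carlo error, which illustrates that the estimator's variance shrinks at rate $1/T$.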

็™ปๅฝ•ๅณๅฏๆŸฅ็œ‹ๅฎŒๆ•ด็ญ”ๆกˆ

ๆˆ‘ไปฌๆ”ถๅฝ•ไบ†ๅ…จ็ƒ่ถ…50000้“่€ƒ่ฏ•ๅŽŸ้ข˜ไธŽ่ฏฆ็ป†่งฃๆž,็Žฐๅœจ็™ปๅฝ•,็ซ‹ๅณ่Žทๅพ—็ญ”ๆกˆใ€‚

็ฑปไผผ้—ฎ้ข˜

ๆ›ดๅคš็•™ๅญฆ็”Ÿๅฎž็”จๅทฅๅ…ท

ๅŠ ๅ…ฅๆˆ‘ไปฌ๏ผŒ็ซ‹ๅณ่งฃ้” ๆตท้‡็œŸ้ข˜ ไธŽ ็‹ฌๅฎถ่งฃๆž๏ผŒ่ฎฉๅคไน ๅฟซไบบไธ€ๆญฅ๏ผ