
Nielsen Media Research wants to estimate the mean amount of time (in minutes) that full-time college students spend watching television each weekday. Find the sample size necessary to estimate that mean with a 15-minute margin of error. Assume that a 95% confidence level is desired. Also assume that a pilot study showed that the standard deviation is estimated to be 112.2 minutes.

2007-06-30 16:49:23 · 1 answers · asked by heffie130 1 in Science & Mathematics Mathematics

1 answer

The formula that is used for finding the sample size necessary is

n = (z*sigma/E)^2, where

z is the critical value from the standard normal distribution for the desired confidence level. Since we want a 95% confidence interval, z = 1.96.

sigma is the population standard deviation, which is 112.2.

E is the desired margin of error, which is 15.

So

n = (1.96*112.2/15)^2
= 214.93905664

Since a sample size must always be rounded up to guarantee the margin of error, you need a sample of size 215 or more.
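The calculation above can be sketched in Python. This is a minimal illustration, not part of the original answer; the function name `required_sample_size` is made up for this example, and `NormalDist().inv_cdf` is used to derive z from the confidence level rather than hardcoding 1.96.

```python
from math import ceil
from statistics import NormalDist

def required_sample_size(sigma, margin, confidence=0.95):
    # z is the standard normal critical value for the given confidence level;
    # for 95%, inv_cdf(0.975) ≈ 1.96.
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    # n = (z * sigma / E)^2, rounded up so the margin of error is guaranteed.
    return ceil((z * sigma / margin) ** 2)

print(required_sample_size(sigma=112.2, margin=15))  # 215
```

Using the exact critical value (≈1.95996) instead of the rounded 1.96 gives a slightly smaller raw value, but the ceiling is still 215.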

2007-06-30 17:01:11 · answer #1 · answered by blahb31 6 · 3 0
