Can anyone out there do this? I'm really stuck.

The new cable TV channel Channel#6 is keen to establish what opinion the viewing public has of its Sunday night films. It surveyed 100 subscribers and found that 70 like them and 30 don't.

Assuming a constant proportion (as in the recent survey) of 70% liking the Sunday night films, how large a sample is required to detect, at the 99.5% level, that the proportion of subscribers liking the Sunday night films has risen from an earlier proportion of 60%?

2006-11-22 04:06:48 · 1 answer · asked by joemoran7 2 in Science & Mathematics > Mathematics

1 answer

The test statistic for a one-sample proportion test is

z = (phat - p) / sqrt(p(1-p)/n)
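As a quick sanity check, here's that statistic as a tiny Python helper (Python and the function name are my own illustration, not part of the original answer):

    import math

    def one_prop_z(phat, p, n):
        # z = (phat - p) / sqrt(p*(1-p)/n)
        return (phat - p) / math.sqrt(p * (1 - p) / n)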

Now phat = 0.7 since that value came from your sample. We'll assume that proportion carries through regardless of the sample size.
p = 0.6, since that is the value assumed under the null hypothesis.
z = 2.576. You said a 99.5% level; I'm assuming you mean a 0.5% significance level. On the standard normal distribution, 0.5% of the area lies above z = 2.576.
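If you have SciPy handy, that critical value comes straight from the standard normal's inverse CDF instead of a table (a one-line check, assuming the one-tailed reading above):

    from scipy.stats import norm

    z = norm.ppf(1 - 0.005)  # upper 0.5% tail of the standard normal
    print(round(z, 3))       # 2.576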

So plug all of those values in and solve for n:

2.576 = (0.7 - 0.6) / sqrt(0.6*0.4/n)
2.576 * sqrt(0.24/n) = 0.1
sqrt(0.24/n) = 0.1/2.576 ≈ 0.03882
0.24/n ≈ 0.03882^2 ≈ 0.001507
n ≈ 0.24/0.001507 ≈ 159.26

Always round up. You would need a sample size of 160 to make that determination.
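Putting it all together, a minimal Python sketch of the whole calculation (the function name is my own; it just rearranges the formula to n = z^2 * p * (1-p) / (phat - p)^2 and rounds up):

    import math

    def required_n(phat, p, z):
        # Rearranged from z = (phat - p) / sqrt(p*(1-p)/n)
        n = z**2 * p * (1 - p) / (phat - p)**2
        return math.ceil(n)  # always round up to a whole subject

    print(required_n(0.7, 0.6, 2.576))  # 160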

2006-11-22 04:28:21 · answer #1 · answered by blahb31 6 · 1 0
