
Let's pretend the mean weight of a cat is 10 stone (I know, it's a pretty heavy cat haha) with a standard deviation of 1.5 stone. The mean weight of a dog is 8 stone with a standard deviation of 1.2 stone.

If I took one cat and one dog and put them in a box, how can I find the standard deviation of the difference in weight? How can I find what percent of dogs weigh more than cats?

2007-12-12 18:16:08 · 3 answers · asked by PikaPika 3 in Science & Mathematics Mathematics

3 answers

As long as the cat and dog weights are independent (no "bias" between the two distributions), the variances of the two distributions add:
1.5^2 + 1.2^2 = 3.69, so the standard deviation of the difference distribution (cat weight - dog weight) is sqrt(3.69), about 1.92.

For part 2, you can standardize that difference distribution and use the z-score to find the percent of dogs that weigh more than cats; see the sketch after this answer.

Where did you get all those big cats in the UK?

2007-12-12 18:45:10 · answer #1 · answered by cattbarf 7 · 0 0
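For reference, a minimal sketch of both steps, assuming Python with scipy and using only the means and standard deviations given in the question:

```python
# Sketch of the calculation described in answer #1.
# Cat weight ~ N(10, 1.5^2), dog weight ~ N(8, 1.2^2), in stone.
from math import sqrt
from scipy.stats import norm

mu_cat, sd_cat = 10.0, 1.5   # cat mean and standard deviation
mu_dog, sd_dog = 8.0, 1.2    # dog mean and standard deviation

# Part 1: for independent weights the variances add, so the difference
# D = cat - dog has mean mu_cat - mu_dog and variance sd_cat^2 + sd_dog^2.
mu_diff = mu_cat - mu_dog               # 2.0
sd_diff = sqrt(sd_cat**2 + sd_dog**2)   # sqrt(3.69) ~ 1.92

# Part 2: a dog outweighs a cat exactly when D < 0, so standardize 0
# against the difference distribution and look up the normal CDF.
z = (0 - mu_diff) / sd_diff             # ~ -1.04
p_dog_heavier = norm.cdf(z)             # ~ 0.149, i.e. roughly 15% of pairs

print(f"sd of difference: {sd_diff:.3f}")
print(f"P(dog heavier than cat): {p_dog_heavier:.3f}")
```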

Cat:
mean = 10
standard deviation = 1.5

Dog:
mean = 8
standard deviation = 1.2

Well, I guess that you just have to subtract the two weights, multiply the difference by 100, and then divide by the weight of the cat.

2007-12-13 03:07:42 · answer #2 · answered by Rai 4 · 0 0

You need to standardize each weight (z = (x - mu) / sigma) and then you can compare them; see the sketch below.

2007-12-13 02:37:18 · answer #3 · answered by autumndaesy 2 · 1 0
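A minimal sketch of that standardization, assuming Python; the example weights of 11.5 and 9.2 stone are made up for illustration, while the means and standard deviations come from the question:

```python
# Standardize an observed weight: z = (x - mu) / sigma.
def z_score(x, mu, sigma):
    return (x - mu) / sigma

# Hypothetical example weights; means/sds are from the question.
print(z_score(11.5, 10.0, 1.5))  # an 11.5-stone cat is 1.0 sd above the cat mean
print(z_score(9.2, 8.0, 1.2))    # a 9.2-stone dog is 1.0 sd above the dog mean
```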
