I want to do an experiment that clearly demonstrates that when you have a limited amount of a heat-absorbing material (like water), you get the most cooling out of it by waiting before using it to cool the hot object. I did this little one yesterday:
I took two glass beakers with 200mL of 99C water in each. I added 40mL of 25C water to each at separate times. Here are the results:
A (room-temp water added at 1:00)
B (room-temp water added at 9:00)
Minute [ Temp A (C) / Temp B (C) ]
0 [ 99 / 99 ]
1 [ 87** / 95 ]
2 [ 85 / 91 ]
3 [ 83 / 88 ]
4 [ 81 / 86 ]
5 [ 79 / 84 ]
6 [ 78 / 82 ]
7 [ 77 / 80 ]
8 [ 76 / 79 ]
9 [ 75 / 70**]
10 [ 74 / 69 ]
** first reading taken after the room-temp water was added
I need help brainstorming ideas for a better test.
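In case it helps frame the discussion, here is a rough Python sketch of the effect I'm trying to demonstrate. It assumes lumped Newtonian cooling (heat loss proportional to the excess over room temperature) and ideal mixing of water with equal specific heats; the heat-loss parameter HA and the function name run are just placeholders of mine, not values fitted to my numbers, so only the direction of the result matters.

T_ROOM = 25.0            # room temperature and temperature of the added water, C
HOT_ML, COOL_ML = 200.0, 40.0
HA = 8.0                 # assumed heat-loss parameter hA/(rho*c), mL per minute (placeholder)

def run(mix_minute, total_minutes=10.0, dt=0.01):
    """Cool 200 mL of 99 C water, adding 40 mL of 25 C water at mix_minute."""
    temp, volume = 99.0, HOT_ML
    t, mixed = 0.0, False
    while t < total_minutes:
        if not mixed and t >= mix_minute:
            # ideal mixing: volume-weighted average (same fluid, same specific heat)
            temp = (volume * temp + COOL_ML * T_ROOM) / (volume + COOL_ML)
            volume += COOL_ML
            mixed = True
        # Newton's law of cooling; the effective rate constant HA/volume drops
        # after mixing because the same losses now have to cool a larger thermal mass
        temp += -(HA / volume) * (temp - T_ROOM) * dt
        t += dt
    return temp

print(f"added at 1 min -> {run(1):.1f} C at minute 10")   # like beaker A
print(f"added at 9 min -> {run(9):.1f} C at minute 10")   # like beaker B

Because the same losses have to cool a larger thermal mass once the extra water is in, the run that mixes late keeps the faster cooling rate for longer and ends a few degrees cooler at minute 10, which is the same direction my table shows.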
Update #1 (to Gnik): Losses to ambient are the whole point of the test. The room was 25C.
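As a rough check on the size of those losses, here is a small sketch that fits a single Newtonian cooling constant to beaker B's readings from the table above, taken before any water was added. It assumes the excess over room temperature decays exponentially, which is only an approximation.

import math

T_ROOM = 25.0
# (minute, temp C) readings for beaker B before its room-temp water was added
b_readings = [(0, 99), (1, 95), (2, 91), (3, 88), (4, 86),
              (5, 84), (6, 82), (7, 80), (8, 79)]

t0, temp0 = b_readings[0]
ks = [math.log((temp0 - T_ROOM) / (temp - T_ROOM)) / (t - t0)
      for t, temp in b_readings[1:]]
k = sum(ks) / len(ks)    # crude average of the per-point estimates

print(f"k is roughly {k:.3f} per minute")
print(f"half-life of the excess over room temp: about {math.log(2) / k:.0f} minutes")

That works out to a half-life on the order of 15 minutes for the excess over room temperature, consistent with beaker B dropping from 99 to 95 in the first minute, so the ambient losses really are substantial at these temperatures.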
Update #2: The two 40mL beakers were in equilibrium with their surroundings; I filled them from an open container of water that had been sitting there for over 24 hours.