
2007-10-16 13:57:27 · 3 answers · asked by R dog 2 in Science & Mathematics Weather

3 answers

Relative humidity is the amount of moisture in the air relative to the maximum the air can hold at its current temperature. It's measured as a percentage. To determine the percentage you need a dry-bulb temperature reading and a wet-bulb temperature reading. When the dry-bulb and wet-bulb temperatures are the same, the humidity is 100%. The further apart the dry-bulb and wet-bulb temperatures are, the lower the humidity.
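A rough sketch of how that wet-bulb/dry-bulb calculation works (this uses the Magnus approximation for saturation vapor pressure and a standard psychrometric formula; the coefficients and the default pressure are common textbook values, not something from this answer):

```python
import math

def sat_vapor_pressure(t_c):
    """Saturation vapor pressure in hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def relative_humidity(dry_c, wet_c, pressure_hpa=1013.25):
    """Relative humidity (%) from dry- and wet-bulb temperatures in Celsius."""
    # Actual vapor pressure via the psychrometric equation
    e = sat_vapor_pressure(wet_c) - 6.6e-4 * pressure_hpa * (dry_c - wet_c)
    return 100.0 * e / sat_vapor_pressure(dry_c)

# Equal dry- and wet-bulb readings give 100% humidity
print(round(relative_humidity(25.0, 25.0)))
# A wider spread between the two readings gives lower humidity
print(round(relative_humidity(25.0, 18.0)))
```

The bigger the "wet-bulb depression" (dry minus wet), the more evaporative cooling occurred at the wet bulb, so the drier the air, which matches the rule of thumb in the answer above.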

2007-10-16 14:30:43 · answer #1 · answered by Anonymous · 0 0

First, you have to understand that air at any given temperature can hold only so much water vapor before it "saturates" at 100%, at which point the vapor can condense and lead to precipitation (rain, snow, sleet). Relative humidity is the water vapor content of the air RELATIVE to the amount of water vapor needed to saturate it at the current air temperature.

Relative humidity, in other words, is the percentage of that saturation amount actually present in the air at its current temperature.

This is a pretty simplified explanation, but it should give you a handle on the terminology.

So if the weatherman says it's 80 degrees outside and the relative humidity is 50%, then the air is halfway to the saturation point at that temperature.

For a deeper understanding, research "Dew Point".
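As a companion to that pointer, here's a rough sketch of how dew point relates to temperature and relative humidity, using the Magnus approximation (the coefficients are a common assumption, not something stated in this answer):

```python
import math

def dew_point(t_c, rh_percent):
    """Approximate dew point in Celsius via the Magnus formula."""
    a, b = 17.67, 243.5
    gamma = math.log(rh_percent / 100.0) + a * t_c / (b + t_c)
    return b * gamma / (a - gamma)

# At 100% relative humidity the dew point equals the air temperature
print(dew_point(26.7, 100.0))
# At 80 F (about 26.7 C) and 50% RH, the dew point is roughly 15 C (about 60 F)
print(dew_point(26.7, 50.0))
```

The dew point is the temperature the air would have to cool to for its current water vapor content to reach saturation, which is why it's a useful companion concept to relative humidity.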

2007-10-16 14:34:03 · answer #2 · answered by andyfnp 1 · 0 0

Relative humidity is the degree of saturation of the air. In other words, it can be taken as the percentage of moisture in the air. The more precise definition follows:
Relative humidity is defined as the ratio of the amount of water vapour present in the air to the amount of water vapour required to saturate it at a particular temperature.
It is calculated using the dry-bulb temperature and the wet-bulb temperature.

2007-10-17 22:56:51 · answer #3 · answered by Arasan 7 · 0 0
