As I write this post in my godforsaken part of North Carolina, it's a comfortable 61°F outside. The weather report also tells me that the relative humidity is 55%. For all intents and purposes, that number means nothing. It's crap. Let me tell you why.

The relative humidity essentially tells you how close the air is to saturation. Saturation refers to the point at which the air is holding as much water vapor as it can; the air is said to reach 100% relative humidity when it is fully saturated with water vapor.

Everyone just calls it "humidity," but its real name is "relative humidity." We call it relative humidity because it's relative to the air temperature. With the same amount of moisture in the air, relative humidity goes down as the air temperature goes up, and it climbs back up as the air temperature goes down.

Take a look at this forecast from Weather Underground for my location this week.

The red line shows the temperature over the next 9 days, while the green line shows the humidity over the same period. As the temperature rises during the day, the humidity falls; the humidity rises as temperatures fall overnight. Barring any cold/warm fronts, rain, or thick cloud cover, the rise and fall of humidity through the day is predictable.

The relative humidity tricks you if you're not familiar with how it works. Take this example: you pick two days at random and they have the following weather conditions:

  1. 45°F with 100% humidity
  2. 90°F with 55% humidity

Which day has more moisture in the air?

Think of a balloon with a golf ball inside of it. The balloon represents how much water vapor the air can hold at a given temperature, while the golf ball represents the water vapor actually in the air. Cold air shrinks the balloon, so the golf ball fills most of the space inside it and the relative humidity is high. Warm air inflates the balloon, so the same golf ball fills a much smaller share of the space and the relative humidity is lower.

Since relative humidity measures the amount of water vapor in the air relative to its capacity, and warm air has a higher capacity to hold water vapor than cool air, relative humidity will drop like a rock when temperatures are hot.
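That relationship is easy to see in numbers. Here's a minimal sketch using the Magnus formula, one common approximation for saturation vapor pressure (the function names and the 12 hPa sample value are my own, chosen for illustration):

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) over water,
    via the Magnus formula -- a standard approximation."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity(temp_c, vapor_pressure_hpa):
    """RH (%) = actual vapor pressure / saturation vapor pressure."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(temp_c)

# The same moist air (a fixed 12 hPa of water vapor) at three temperatures:
vapor = 12.0
for temp in (10, 20, 30):  # °C
    print(f"{temp}°C -> {relative_humidity(temp, vapor):.0f}% RH")
```

Same amount of water vapor every time; only the temperature (the "capacity") changes, and the percentage swings from near-saturated to bone-dry-sounding.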

Looking at the relative humidity alone to get an idea of what it feels like outside is like opening a novel to the middle and reading only the last half of the book: you won't get the whole story. (Cheesy, I know. Work with me.) The relative humidity combined with the temperature can give you a vague idea of how gross or comfortable the weather is, but there is another measure that gives you a much better idea of what it feels like.

If you're getting a respectable weather report, it will also include something called the "dew point" immediately after the temperature. The dew point tells you the temperature to which the air would have to cool to reach full saturation, or 100% relative humidity. Looking at the dew point is the best way to determine how much moisture is actually present in the air.

If the dew point is 61°F, the air has to cool to 61°F to become fully saturated and reach a relative humidity of 100%. As the air warms past 61°F, the relative humidity drops; the farther the temperature climbs from the dew point, the lower it goes. As I noted above, this is why relative humidity often reaches the 90-100% range at night and quickly drops to 50% or below during the heat of the day.
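You can sketch that daily swing with the same Magnus approximation: hold the dew point at 61°F and let the temperature run from a cool night to a hot afternoon (the function names and the 61/75/90°F sample temperatures are my own illustration):

```python
import math

def magnus_vp(temp_c):
    # Saturation vapor pressure (hPa), Magnus approximation
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def rh_from_dew_point(temp_f, dew_point_f):
    """Relative humidity (%) implied by an air temperature and a dew point."""
    t_c = (temp_f - 32) * 5 / 9
    td_c = (dew_point_f - 32) * 5 / 9
    return 100.0 * magnus_vp(td_c) / magnus_vp(t_c)

# A day with a steady 61°F dew point, overnight low to afternoon high:
for temp_f in (61, 75, 90):
    print(f"{temp_f}°F -> {rh_from_dew_point(temp_f, 61):.0f}% RH")
```

The moisture never changes; the relative humidity falls from 100% at dawn to under 40% by mid-afternoon purely because the air warmed up.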

Glancing at the dew point can give you a good idea of how comfortable or gross it will feel once you go outside. Here's a general scale to help:

  • < 50°F = comfortable
  • 50-60°F = noticeable moisture but still not bad
  • 60-65°F = what most people would call "humid"
  • 65-70°F = muggy, approaching uncomfortable
  • > 70°F = oppressive; can be dangerous above 75°F

The dew point is such a good measure of moisture in the air (and its effect on humans) that meteorologists use it to come up with the heat index, which, sorry Tom, is not a load of crap.
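For the curious: the National Weather Service computes the heat index with a regression known as the Rothfusz equation, fit in temperature and relative humidity. Here's a minimal sketch of it; note that the NWS also applies low- and high-humidity adjustments near the edges of the chart, which I've omitted:

```python
def heat_index_f(temp_f, rh_pct):
    """NWS Rothfusz regression for the heat index (°F).
    Valid roughly for temperatures of 80°F and up;
    the NWS edge-case adjustments are omitted here."""
    t, r = temp_f, rh_pct
    return (-42.379 + 2.04901523 * t + 10.14333127 * r
            - 0.22475541 * t * r - 6.83783e-3 * t * t
            - 5.481717e-2 * r * r + 1.22874e-3 * t * t * r
            + 8.5282e-4 * t * r * r - 1.99e-6 * t * t * r * r)

# 100°F with roughly 36% RH (about a 68°F dew point):
print(f"{heat_index_f(100, 36):.0f}°F")
```

That lands within a degree or so of the 107°F figure quoted below; the small gap comes from the omitted adjustments and from rounding the relative humidity.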

If the air temperature is 95°F but the dew point is 15°F, it's incredibly dry out and you're probably in Phoenix. The sweat will evaporate off of your skin almost instantly and you'll be able to cool off — so much so that you'll quickly become dehydrated if you're not careful.

If you're in New York City in July, however, and it's 100°F with a dew point of 68°F, sweat has a very hard time evaporating off of your skin. If you're not careful, this could cause you to overheat, making you susceptible to heat illnesses that could seriously injure or kill you. A temperature of 100°F with a dew point of 68°F works out to a heat index of 107°F, which is what it feels like to humans because of the moisture in the air.

Using relative humidity is a quick and snappy way to give the general public a vague idea of whether or not they'll have a bad hair day or if the pages in that freshly printed term paper will start to curl, but it's a low-information and ultimately bad way to get a true feel for how comfortable the weather will be on any particular day.

Let's go back to the example I used earlier. A 45°F day with 100% humidity has much less water vapor in the air than a 90°F day with 55% humidity, but most people would say the former sounds worse because 100 is bigger than 55, even though the latter equates to an oppressive dew point of 71°F.
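You can check those numbers by inverting the Magnus approximation to get the dew point from temperature and relative humidity (the function name is mine, and the exact result wobbles by a degree or so depending on which approximation you use):

```python
import math

def dew_point_f(temp_f, rh_pct):
    """Dew point (°F) from temperature and RH,
    by inverting the Magnus saturation vapor pressure formula."""
    t_c = (temp_f - 32) * 5 / 9
    gamma = math.log(rh_pct / 100.0) + 17.67 * t_c / (t_c + 243.5)
    td_c = 243.5 * gamma / (17.67 - gamma)
    return td_c * 9 / 5 + 32

print(f"{dew_point_f(45, 100):.1f}°F")  # the 45°F, 100% humidity day
print(f"{dew_point_f(90, 55):.1f}°F")   # the 90°F, 55% humidity day
```

The cold day's dew point is simply 45°F (at 100% humidity the dew point equals the temperature), while the hot day's comes out around 71-72°F: far more moisture in the air despite the smaller-sounding percentage.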

It comes down to getting tricked by big numbers. If you use relative humidity instead of the dew point, you're getting duped. Don't let that happen.

[Images via AP / Weather Underground / University at Albany]