In HVAC control conversations, one of the most common questions customers face is deceptively simple:

“Do you need Enthalpy or Dry Bulb sensors?”

Before we compare the two, it’s important to understand a few fundamentals about temperature, humidity, and how the human body perceives comfort. This foundation will make it much clearer why these two sensor types behave differently—and why it matters for rooftop units and economizer systems.

Relative Humidity vs. Dew Point — What’s Really Going On?

Most people are familiar with relative humidity, but it’s not actually the best indicator of how humid air feels. Dew point is the better measurement.

Why Relative Humidity Can Be Misleading

Relative humidity (RH) expresses how much water vapor the air currently holds as a percentage of the maximum it could hold at its present temperature. Warmer air can hold far more moisture than cooler air.

Example:

  • 50% RH at 55°F contains far less moisture than
  • 50% RH at 95°F

So identical RH values can represent dramatically different moisture levels.
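
To put numbers on that, here is a minimal Python sketch (assuming the Magnus approximation for the saturation curve and ideal-gas behavior for water vapor; function names are illustrative):

  import math

  def saturation_vapor_pressure_kpa(temp_c):
      # Magnus approximation for saturation vapor pressure (kPa)
      return 0.61094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

  def absolute_humidity_g_m3(temp_c, rh_percent):
      # Water vapor density (g/m^3) via the ideal gas law,
      # R_v = 461.5 J/(kg*K) for water vapor
      vapor_pressure_pa = 1000 * saturation_vapor_pressure_kpa(temp_c) * rh_percent / 100
      return 1000 * vapor_pressure_pa / (461.5 * (temp_c + 273.15))

  for temp_f in (55, 95):
      temp_c = (temp_f - 32) * 5 / 9
      print(f"{temp_f}°F at 50% RH: {absolute_humidity_g_m3(temp_c, 50):.1f} g/m³")

Running this gives about 5.6 g/m³ at 55°F versus about 19.8 g/m³ at 95°F: the same 50% RH, but more than three times the moisture.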

Dew Point — The Temperature of Saturation

Dew point is the temperature at which air becomes fully saturated (100% RH) and water begins to condense.

  • At 55°F and 50% RH, the dew point is 37°F
  • At 95°F and 50% RH, the dew point is 74°F

When the dew point equals the actual air temperature, the air is fully saturated. Fog, dew, or cloud formation begins at this point.
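
Both dew point figures above can be reproduced by inverting the same Magnus saturation curve (a sketch; the constants 17.625 and 243.04 are one common fit, so results may differ from other calculators by a fraction of a degree):

  import math

  def dew_point_f(temp_f, rh_percent):
      # Invert the Magnus saturation curve: find the temperature at which
      # the current vapor pressure would reach 100% RH
      temp_c = (temp_f - 32) * 5 / 9
      gamma = math.log(rh_percent / 100) + 17.625 * temp_c / (243.04 + temp_c)
      dew_c = 243.04 * gamma / (17.625 - gamma)
      return dew_c * 9 / 5 + 32

  print(dew_point_f(55, 50))  # ~36.7°F, the ≈37°F figure above
  print(dew_point_f(95, 50))  # ~73.5°F, the ≈74°F figure above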

How Dew Point Relates to Human Comfort

Because dew point tracks the actual amount of moisture in the air rather than a temperature-dependent percentage, it is the best indicator of how humid a day will feel.

Comfort Scale

  Dew Point        Comfort Perception
  55°F or less     Dry
  55–60°F          Comfortable
  60–64°F          Rather Humid
  65–69°F          Humid
  70–75°F          Very Humid
  75°F+            Miserable
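
In a controller, this scale reduces to a handful of threshold comparisons. A minimal Python sketch (thresholds taken from the table above, with band boundaries treated as half-open intervals; the function name is illustrative):

  def comfort_label(dew_point_f):
      # Thresholds from the comfort scale table above
      if dew_point_f < 55:
          return "Dry"
      if dew_point_f < 60:
          return "Comfortable"
      if dew_point_f < 65:
          return "Rather Humid"
      if dew_point_f < 70:
          return "Humid"
      if dew_point_f <= 75:
          return "Very Humid"
      return "Miserable"

  print(comfort_label(74))  # "Very Humid": the 95°F / 50% RH day above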

This explains why a humid 95°F summer day feels oppressive even at "only" 50% RH: the dew point is 74°F, well into the "Very Humid" range.

Dry Bulb Temperature — The Standard Measurement

A normal outdoor thermometer measures Dry Bulb temperature. It’s what most people think of as the “outside temperature.”

  • It does not account for humidity
  • It is measured by a thermometer freely exposed to the air but shielded from moisture and radiant heat

Dry Bulb temperature is the reference point used by basic economizers and many HVAC controls.
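
In control terms, basic dry bulb changeover is a single comparison. A minimal sketch (the 65°F setpoint and function name are illustrative assumptions, not a recommendation):

  def drybulb_free_cooling_ok(outdoor_f, changeover_setpoint_f=65.0):
      # Basic dry bulb changeover: allow free cooling only when outdoor
      # air is colder than a fixed setpoint (setpoint value illustrative)
      return outdoor_f < changeover_setpoint_f

  print(drybulb_free_cooling_ok(58.0))  # True: outdoor air admitted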

Wet Bulb Temperature — The Foundation of Enthalpy

To understand Enthalpy sensors, you must understand Wet Bulb temperature.

Wet Bulb Explained

A wet bulb reading is taken by covering the temperature probe with a water-soaked wick. As air passes over it:

  • Water evaporates from the wick
  • The rate of evaporation depends on how dry the air is
  • In dry air, rapid evaporation cools the probe well below the dry bulb reading
  • In humid air, evaporation slows and the probe cools only slightly

Thus:

Wet Bulb temperature is always equal to or lower than Dry Bulb temperature.

This is the exact principle behind perspiration cooling on human skin.
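
Wet bulb temperature can also be estimated from dry bulb and RH alone, with no wick at all. A sketch using Stull's empirical fit (an approximation valid roughly from -20 to 50°C and 5 to 99% RH; outside that range the error grows):

  import math

  def wet_bulb_c(temp_c, rh_percent):
      # Stull (2011) empirical fit for sea-level wet bulb temperature
      return (temp_c * math.atan(0.151977 * math.sqrt(rh_percent + 8.313659))
              + math.atan(temp_c + rh_percent)
              - math.atan(rh_percent - 1.676331)
              + 0.00391838 * rh_percent ** 1.5 * math.atan(0.023101 * rh_percent)
              - 4.686035)

  print(wet_bulb_c(35.0, 50.0))  # ~26.6°C: dry-ish air, strong evaporative cooling
  print(wet_bulb_c(35.0, 99.0))  # ~35.0°C: near saturation, almost no cooling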

Enthalpy — Heat + Moisture = Total Heat Energy

Enthalpy accounts for both:

  • Dry Bulb temperature (sensible heat)
  • Moisture content (latent heat)

This creates a far more accurate picture of an air mass’s total heat energy. For economizers, this matters because air with high moisture content can seem “cool enough” by dry bulb alone but may still be unsuitable for “free cooling.”
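
To make that concrete, here is a sketch that estimates moist air enthalpy from dry bulb and RH, assuming sea-level pressure, the Magnus saturation curve, and the standard ideal-gas psychrometric relation h = 1.006·T + W·(2501 + 1.86·T) in kJ per kg of dry air (T in °C, W the humidity ratio in kg of water per kg of dry air):

  import math

  def enthalpy_kj_per_kg(temp_c, rh_percent, p_atm_kpa=101.325):
      # Saturation vapor pressure via the Magnus approximation (kPa)
      p_sat = 0.61094 * math.exp(17.625 * temp_c / (temp_c + 243.04))
      p_w = p_sat * rh_percent / 100           # actual vapor pressure (kPa)
      w = 0.622 * p_w / (p_atm_kpa - p_w)      # humidity ratio (kg/kg dry air)
      # Sensible heat of dry air plus latent and sensible heat of the vapor
      return 1.006 * temp_c + w * (2501 + 1.86 * temp_c)

  # A "cool" but muggy air mass vs. a warmer but drier one:
  print(enthalpy_kj_per_kg(21.0, 90.0))  # ~57 kJ/kg at ~70°F, 90% RH
  print(enthalpy_kj_per_kg(27.0, 30.0))  # ~44 kJ/kg at ~81°F, 30% RH

The first air mass reads roughly 70°F on a thermometer and the second 81°F, yet the "cooler" one carries about 13 kJ/kg more total heat energy. That gap is exactly what a dry bulb sensor cannot see.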

Quick Definitions Recap

  • Dry Bulb — Standard air temperature, unaffected by moisture.
  • Wet Bulb — The lowest temperature achievable through evaporation; a measurement of humidity.
  • Dew Point — The temperature where air becomes fully saturated (100% RH).
  • Enthalpy — Total heat content of the air, measuring both temperature and humidity.

So—Enthalpy or Dry Bulb?

This first article provides the foundational science. In Part II, we’ll dive into real-world HVAC applications:

  • When Enthalpy sensors provide accurate free cooling
  • When Dry Bulb is sufficient
  • How economizer control packages use these values
  • How humidity impacts comfort, energy savings, and mixed-air control