The first modern air conditioning system was developed in 1902 by a young electrical engineer named Willis Haviland Carrier. It was designed to solve a humidity problem at the Sackett-Wilhelms Lithographing and Publishing Company in Brooklyn, N.Y. Paper stock at the plant would sometimes absorb moisture from the warm summer air, making it difficult to apply the layered inking techniques of the time. Carrier treated the air inside the building by blowing it across chilled pipes. The air cooled as it passed across the cold pipes, and since cool air can’t carry as much moisture as warm air, the process reduced the humidity in the plant and stabilized the moisture content of the paper. Reducing the humidity also had the side benefit of lowering the air temperature — and a new technology was born.
Carrier realized he’d developed something with far-reaching potential, and it wasn’t long before air-conditioning systems started popping up in theaters and stores, making the long, hot summer months much more comfortable.
The process air conditioners use to reduce the ambient air temperature in a room rests on a very simple scientific principle; the rest is achieved with a few clever mechanical techniques. In fact, an air conditioner is very similar to another appliance in your home: the refrigerator. The main difference is that an air conditioner doesn’t have the insulated exterior housing a refrigerator relies on to keep its cold box cold. Instead, the walls of your home keep the cold air in and the hot air out.
Air conditioners use refrigeration to chill indoor air, taking advantage of a remarkable physical law: When a liquid converts to a gas (in a process called phase conversion), it absorbs heat. Air conditioners exploit this feature of phase conversion by forcing special chemical compounds to evaporate and condense over and over again in a closed system of coils.
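The cooling power of that evaporation step comes down to one quantity: the refrigerant's latent heat of vaporization. The heat absorbed is simply the mass evaporated times the latent heat (Q = m · L). The sketch below illustrates the arithmetic; the latent-heat figure used is an assumed illustrative value for a common refrigerant, not data from a property table, and real values vary with pressure and temperature.

```python
# Sketch of the latent-heat arithmetic behind the evaporation step: Q = m * L.
# LATENT_HEAT_KJ_PER_KG below is an assumed illustrative value, not a
# property-table lookup; real refrigerant data depends on operating conditions.

LATENT_HEAT_KJ_PER_KG = 215.0  # assumed value for a common refrigerant


def heat_absorbed_kj(mass_kg: float, latent_heat_kj_per_kg: float) -> float:
    """Heat absorbed, in kJ, when mass_kg of liquid fully evaporates."""
    return mass_kg * latent_heat_kj_per_kg


if __name__ == "__main__":
    # If roughly 0.05 kg of refrigerant evaporates each second, the coil
    # absorbs about 0.05 * 215 = 10.75 kJ per second, i.e. ~10.75 kW of
    # cooling, which is in the ballpark of a sizable home system.
    print(f"{heat_absorbed_kj(0.05, LATENT_HEAT_KJ_PER_KG):.2f} kJ")
```

Because the refrigerant circulates in a closed loop, the same mass evaporates (absorbing heat indoors) and condenses (releasing heat outdoors) over and over, so this per-second figure is sustained continuously.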