A History of Drinking Water Treatment
In today’s society, it is commonly assumed that city-supplied tap water, if not always the healthiest specimen of drinking water, has at least gone through some disinfection and purification process prior to distribution. Most residents of the United States turn on their faucets confident that they can drink the water without contracting a waterborne disease. However, treated water as an inherent right of municipal residence is a relatively recent phenomenon. Over hundreds of years, as water treatment methods evolved, municipal drinking water developed from a product of uneven quality into a strictly regulated commodity.
Ancient Water Treatment
The first documented attempts to treat drinking water are recorded in ancient Greek and Sanskrit writings that date back to 2000 B.C. At this time, people were aware that boiling water helped to purify it and that filtration and straining methods helped to reduce visible particles and turbidity in water. Because nothing was known about microorganisms or chemical contaminants (which would remain unseen in water until the seventeenth century), the motive for treating water was to make it smell and taste better. The Greek physician Hippocrates, who invented the first cloth bag filter around 500 B.C., also believed that if water tasted and smelled clean, it must be healthful for the body. His invention, called the “Hippocratic sleeve,” was one of the first domestic water filters (Baker & Taras 1981).
Discovery of Microorganisms
Throughout the 1700s, as people began to understand more about the dangers of drinking water contaminants, domestic water filter units made from wool, sponge, and charcoal began to be used in individual homes. In the year 1804, the first large municipal water treatment plant was installed in Scotland in order to provide treated water to every resident (Baker & Taras 1981). This revolutionary installation prompted the idea that all people should have access to clean drinking water. However, it would be some time before this ambitious idea would be implemented widely throughout the world.
Cholera and Chlorination
In 1854, the British physician John Snow found that cholera was spread through contaminated water, a discovery that would greatly impact the future of water treatment and disinfection. While studying cholera epidemics in municipal areas of England, Snow noticed that regions that used slow sand filtration before distributing water tended to have fewer cholera cases. Eventually, he was able to trace the outbreaks of cholera to a particular water pump that had been contaminated by raw sewage. Snow used chlorine to kill the cholera bacteria in the water, leading to the rise of water chlorination as an effective disinfection process. His work also overturned the prevailing belief that good-tasting, odorless water was naturally healthful and safe. Because the contaminated water had no detectable taste or odor, Snow concluded that water quality could not be established by those criteria alone. After his findings were published, several cities began to treat all water with sand filters and chlorine before distributing it to the public.
In the late nineteenth century, municipal water treatment began to take hold in the United States. Technicians started experimenting with rapid, as opposed to slow, sand filtration and found the process to be much more efficient and effective. Also, the overall capacity and lifetime of the filter could be improved by cleaning it with a powerful steam jet, thus increasing the number of residents who could be served by one treatment plant. As a result of increased water treatment and chlorination within several U.S. cities and around the world, the outbreak of such waterborne diseases as cholera and typhoid rapidly decreased in the early twentieth century.
Softening and Ion Exchange
First Government Regulations
As municipal water treatment eventually became a common practice in most U.S. cities, federal and state governments began to recognize the importance of drinking water standards for municipalities. While some limited drinking water standards were implemented as early as 1914 (EPA 2000), it was not until the 1940s that federal drinking water standards were widely applied. The most comprehensive federal regulations and standards for the water treatment industry were implemented in the 1970s, in reaction to a huge increase in environmental concerns in the country. In 1972, the Clean Water Act passed through Congress and became law, requiring industrial plants to proactively improve their waste procedures in order to limit the effect of contaminants on freshwater sources. In 1974, the Safe Drinking Water Act was adopted by all 50 U.S. states for the regulation of public water systems within their jurisdictions. This law specified a number of contaminants that must be closely monitored in water and reported to residents should they exceed the maximum contaminant levels allowed by the federal government. Drinking water systems are now closely monitored by federal, state, and municipal governments for safety and compliance with existing regulations.
Water Treatment Today
-- Posted April 30, 2007
Baker, M.N. and Taras, Michael J. 1981. The Quest for Pure Water: A History of the Twentieth Century, Volume 1 and 2. Denver: AWWA.
Christman, Keith. 1998. The History of Chlorine. Waterworld, 14 (8), 66-67.
EPA. 2000. The History of Drinking Water Treatment. Environmental Protection Agency, Office of Water (4606), Fact Sheet EPA-816-F-00-006, United States.
Outwater, Alice. 1996. Water: A Natural History. New York: Basic Books.