Efficient Plumbing Fixtures, and a Short History of Sewage


Cholera is one of the world’s more deadly diseases, causing acute dehydration.  It strikes so quickly that, without medical intervention, a person can die within 12 hours of showing its first symptoms.  In past centuries, its miseries were visited on tens of millions of people unaware that it was transmitted through contaminated water.  The disease is endemic to Asia, where an estimated 38 million Indians died from it in the century between 1817 and 1917.  As rail and sea travel became more prevalent in the world, the disease made the jump to the West.  It is the most formidable water-borne disease, with a 50% mortality rate if left untreated; but it is hardly the only disease spread by bad sanitation.

In these relatively enlightened times, it is sobering to realize that the scientific principles of germs and infection (and the conditions that promote them) were not discovered until the mid-19th century, and not widely acknowledged until the latter part of that century.  Some professionals in the fields of health and sanitation believe that the toilet, that most mundane and derided of appliances, has saved more lives than disease immunization or even antibiotics.  However, the disease-preventing capabilities of modern sewage systems are hard to appreciate without a historical perspective.

Despite its smell, there is nothing outrageously harmful in human waste itself.  However, since the human digestive tract is an environment without light or air, waste can harbor deadly pathogens and parasites that can spread onsite or downstream.

Throughout history, there have been 4 ways people have typically disposed of human waste.

1. The most basic was open defecation.  About 40% of the world’s population still practice this habit, usually out of poverty or lack of knowledge.

2. Onsite latrines, both with and without water, whose collected waste is sometimes used as agricultural fertilizer.

3. Water-borne sewage, which may or may not be purified to remove disease-carrying organisms and nutrients before being released back into water bodies.

4. Various combinations of these practices.

Human and food waste was not particularly hard to manage in early societies based on hunting and gathering or agriculture.  In general, the land and crops could easily absorb the waste from small populations.  It was often used as fertilizer.  But villages and cities were another matter.  Many of them did not develop effective ways of disposing of waste, creating conditions rife with disease.  Onsite waste could contaminate groundwater, and could come into direct contact with people tracking it into their environment or exposed to it through insects and rodents.

Indeed, rural migration to some cities became a grim form of population control, as these urban settlements became de facto population sinks.10  These cities constantly needed to replenish their numbers because of disease brought on by (preventable) unsanitary conditions.  In many cities around the world, it was common to simply throw waste out the window, to the detriment of common health and unwitting passersby.

However, some early cultures developed effective forms of water-borne sewage disposal.  The oldest recorded example dates to between 3,500 and 2,500 B.C. in the Indus Valley (in present-day Pakistan and northwest India).  Whole cities were built on gradients that allowed gravity to carry waste through clay sewers from inside houses to the edge of the settlement.  Latrines were flushed with water poured from pottery.  The sewers had depressions resembling primitive septic tanks that allowed the solids to settle, and these were periodically cleaned out by hand.  All houses were served by water wells near or inside the dwelling.

Excavated sewer system from Indus Valley • harrappa.com

While this system did an amazing job of dealing with onsite sanitation for large populations, the effluent ran into the Indus River, polluting the environment and probably causing distress to downstream neighbors.  Other water-borne systems have been excavated in ancient cities of Babylonia, Assyria, Sumeria, and Crete.

Probably the more common alternative throughout the ages has been onsite dry privies and cesspools.  Waste was carted off by hand (and horse), and often sold to farmers as fertilizer.  Often the collection was done at night, which gave rise to the 19th century American euphemism of “nightsoil.”  Many cities in America had such a system through the 19th century.  Japan was heavily dependent on this method until the late 20th century.

The effectiveness of these systems in mitigating disease depended largely on how they were built and maintained.  A cesspool emptied regularly had less potential to leach into nearby water supplies than one that was routinely ignored.  A covered cesspool had less potential to attract vermin than an uncovered one.  In the case of dry systems, an interesting study compared the customs of 19th- and 20th-century Singapore and Tokyo.  It noted that Singapore might have been more successful in preventing disease because clean buckets were provided with nightly removal service.11

Some cities practiced 3 disposal methods at once.  Ancient Rome was a prime example.  Its aqueducts and sewers are considered engineering marvels.  Its sewer system, first built for drainage in about 800 B.C., is still in use.  It connected directly to public bathhouses and toilets, using low-quality water from aqueducts as a continuous flushing mechanism.  By some accounts, the sewers were also directly connected to the homes of wealthier citizens.  But onsite disposal, including in the streets, was also common, and was dealt with by regularly flushing street waste into the storm sewer with lower quality water.  As in other water-borne systems, the unfiltered contents flowed downstream, into the Tiber, then out to sea.

An interesting twist on the Roman experience is that, under the Emperor Vespasian, the contents of the city’s urinals were sold to laundries because the ammonia in the urine produced excellent cleaning results.  When his son, Titus, expressed revulsion at the practice, the Emperor, in one of history’s classic one-liners, held a gold coin under his son’s nose and proclaimed “Gold has no odor.”

An American Tale: Do Unto Others What You Can Get Away With

Sewage treatment in the U.S. has taken a circuitous and often delayed route in protecting public health.

Until the late 19th century, almost all U.S. cities were using nightsoil removal or cesspool cleaning, or both, as their primary methods of waste removal.  In 1880, 103 of 222 major cities in the U.S. used their waste as agricultural fertilizer, including 74 out of 94 cities in the New England and Mid-Atlantic states.12  Baltimore continued this practice into the 20th century, when it finally built its sewer system.

There were several reasons for the switch to centralized systems.13  A main one was that cities began supplying piped water to meet demand for increased volume and purity, which in turn created more wastewater from toilets, baths, and sinks.  This increased volume could drastically increase the cost of cesspool cleaning.  Units that were formerly cleaned once a year might need to be cleaned once a month.  Central sewers offered a cost-effective alternative.

There were also health considerations, particularly as cities densified.  Many cesspools were unlined, and some of their water would soak into the earth, contaminating wells, groundwater, and surface water near their location.  Tenements and slums with large numbers of people would drive demand for waste removal in areas most likely to be neglected.  And onsite maintenance habits varied.  While some people might be responsible and have enough money for frequent maintenance, others were careless or poor and created a public and health nuisance.  In this regard, central sewer systems were considered more “idiot proof.”

However, rendering waste out of sight and mind did not make it vanish.  It simply moved downstream, usually untreated.  Statistical analysis has shown a precipitous drop in deadly water-borne diseases such as cholera and typhoid when U.S. cities began to purify their water supplies with sand filters and chlorination.  This situation created a chasm between the health community, which pressed for sewage treatment plants, and city engineers and many elected officials who did not see the need.  The latter reasoned that sewage treatment would primarily benefit people in other cities downstream…or fish, but not their constituents.

There was even a pseudo-scientific rationale for this.  “Dilution is the solution to pollution.”  Since limited amounts of pollution can be cleansed from rivers through natural oxidation, many cities used this as justification.  However, their immense volumes of waste overwhelmed what nature could do in this regard.

Of the thousands of cities that dumped their waste problems downstream, perhaps the most colorful example was Chicago.14  For many years it sent its untreated sewage into Lake Michigan, the same source from which it haplessly drew its drinking water.  In response to disease outbreaks, the City had the inspiration in 1866 to route the sewage away from the drinking water.  This reversed the flow of the Chicago River, so that it carried waste away from Lake Michigan instead of into it.  A major expansion of this strategy occurred in 1900.

This enraged neighbors both upstream and downstream.  Cities that drew their own water supply from the polluted Illinois and Mississippi Rivers had to deal with the exported pollution and disease.  Cities upstream worried that further withdrawals would harm the navigability of the Lake.  Before its first sewage treatment plant was built in 1922, the Chicago River carried untreated waste from 2.7 million people and its slaughterhouse industry into the rivers.  After 3 U.S. Supreme Court cases and numerous political maneuvers over the course of 30 years, Chicago was ordered by the Supreme Court in 1930 to curtail (not eliminate) its diversions from the Lake, which effectively mandated more treatment capacity.

The Austin System

Austin’s own history also serves as an example.15  The first sewer service in Austin began in 1882 as a small line serving businesses and wealthy residences in downtown Austin.  Since the city was poor, the system was built by a private company.  Its expansion was guided more by who could afford to pay than by the need for health.  Large areas of the city were left unserved, either because line extensions were not made, or because people living next to them could not afford to connect.

The City of Austin bought the private company in 1912 and began a slow process of increased connections.  Upgrading the system, and by extension the city’s health, had immense challenges.  In 1913, an Austin health inspector reported that the City’s sanitary regulations were in the “Dark Ages.”  Sewers served less than 50% of the city’s population and 25% of the city’s populated area.  Waller, Shoal, and Little Shoal Creek were running with untreated waste coming from both the sewer system and cesspools, and this would eventually flow into the Colorado.  Many residents were still without water mains, and as many as 500 water wells were subject to groundwater contamination from onsite waste systems.

In 1916, fecal bacteria contamination to Austin’s raw water supply was detected, and the utility subsequently purchased chlorination equipment to deal with the threat.16

Policies to expand the system included mandatory hook-ups for residents living near the lines, and at times, free hook-ups and service offered as an incentive.  However, it was not until 1926 that Austin adopted a policy of citywide service.  At that time, 40% of the city was without sewers, and citywide service was not attained until federal funding was received in the 1930s and 40s.

It was not until 1919 that Austin began to operate its first sewage treatment plant, mandated by a 1915 state law, at the site of the now-retired Holly Street Power Plant.  This was very likely an unsophisticated “primary” solids-settling plant.  It was probably not until 1938 that Austin began to operate its first modern “secondary” treatment plant, which was built by LBJ patron George Brown (of Brown & Root) with New Deal money.17

The Slow Path to Clean Rivers

The lack of concern for who lived downstream was prevalent in many parts of the country into the late 20th century.  Around 1900, few Americans were served by sewage conveyance or treatment of any kind.  This had changed markedly when a census of wastewater treatment was made in 1939.  By then, some 76 million people, 57% of the 1940 U.S. population, were served by sewer pipes, and 95% of these people lived in dwellings connected to them.18  However, only about 41 million people, 58% of this sewer-connected population, were served by treatment plants.  Only 29% of the population served by sewer pipes were served by advanced (for their day) secondary treatment plants.  Many in the rest of the country used outhouses, cesspools, septic fields, or bushes.

By 1968, about 140 million people, 70% of the U.S. population, were served by central sewers.19  But 39% of them were still served only by primary treatment plants, or had their raw sewage dumped into water bodies.

Stricter standards from the Clean Water Act brought more improvements.  By 1996, 190 million people (72% of the increased U.S. population) were served with sewage treatment, with 87% of this being secondary or above in quality.20  No raw sewage was being expelled on a routine basis.  Most of the balance of the country was served by septic systems.  Key to these improvements was massive federal funding that began with the New Deal and continued through the 1990s.

Despite the expensive and expansive wastewater treatment infrastructure that the U.S. has built, and its relatively advanced public health system, it still has considerable progress to make.  About 850 billion gallons of untreated sewage flow into water bodies each year from cities with older piping systems that collect both sewage and storm water for processing.21  When rain surges occur, the treatment plants cannot handle the excess, which is released into the environment.  These overflows do not include leaks and equipment failures that occur around the country.

Despite plummeting levels of severe water-borne diseases such as cholera and typhoid in the U.S. compared to a century ago, there were about 29,000 reported cases of cryptosporidiosis and giardiasis in 2010.22  One study estimated 99 million cases of gastrointestinal illness a year, with at least 6 million of these coming from sewer overflows and breakdowns, and from urban run-off from impervious cover.23  Our country has made great progress in dealing with sewage pollution, but it is hardly perfect.

The conclusion of this short history is that conventional toilets and the infrastructure that serves them can remove potentially hazardous waste away from local areas.  But toilets are only appliances attached to a complex water and wastewater system.  Their effectiveness in disease prevention and environmental protection is dependent on the infrastructure upstream and downstream, and how well it is managed.

Toilets do, however, use an awful lot of water.

Water Use in Today’s Plumbing Fixtures

Depending on the age and tank size, toilets can use anywhere from 10 to 40% of the indoor water use in an average Austin home.

Older toilets used considerable amounts of water compared to today’s standards.  The typical toilet of the 1960s used 5 to 7 gallons per flush.  In about 1985, in response to water shortages, the state of California enacted a standard of 3.5 gallons per flush.  A national requirement reduced this to 1.6 gallons in 1994.  In 2014, Texas and California required a standard of 1.28 gallons for all new single-flush toilets sold.  (Austin adopted this for its building code in 2010.)  The best conventional models today use 0.60 to 0.80 gallons.  Dual-flush toilets, which allow less water to be used to remove liquid waste, have shown averages over time as low as 0.9 to 0.96 gallons per flush.
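The water savings implied by these standards add up quickly over a year.  The sketch below estimates them for one household; the flush volumes come from the standards above, while the household size and flushes per person per day are purely illustrative assumptions, not Austin data.

```python
# Rough annual savings from replacing a 1960s-era toilet with a model meeting
# the 1.28-gallon standard.  Household size and flush frequency are assumed.

OLD_GPF = 5.0        # typical 1960s toilet, gallons per flush
NEW_GPF = 1.28       # 2014 Texas/California single-flush standard
PEOPLE = 3           # assumed household size
FLUSHES_PER_DAY = 5  # assumed flushes per person per day

def annual_flush_gallons(gpf, people=PEOPLE, flushes=FLUSHES_PER_DAY):
    """Gallons used by toilets in one year at a given flush volume."""
    return gpf * people * flushes * 365

savings = annual_flush_gallons(OLD_GPF) - annual_flush_gallons(NEW_GPF)
print(f"Estimated savings: {savings:,.0f} gallons per year")
```

Under these assumptions the replacement saves roughly 20,000 gallons a year; actual savings depend entirely on how many people use the toilet and how often.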


The impetus to enact these water-saving standards was partially inspired by a similar water-efficiency standard in Europe.  However, after the U.S. enacted its standard, it was painfully observed that the two continents had different technologies for the appliance.  American units usually siphoned or “pulled” water, while European units generally “pushed” water.  The European efficiency standard was not always directly transferable, and some American units were not engineered to account for this.

The consequence was that many of the first American units operated poorly.  Some units used more water than the efficiency standard allowed.  In others, double flushes were required for heavy waste, so a lot of the potential for water conservation was lost.  The bad performance created a backlash against water conservation equipment by some people.  A kind of black market even developed for old, water-wasting units, or new water-wasting units from Canada, where it was still legal to sell them.

The problems were largely corrected with the advent of the Maximum Performance (MaP) Testing program in 2003.  With original funding provided by governments and water utilities in the U.S. and Canada, a private testing company began to scientifically rate toilets on the amount of solid waste that could be removed per flush (as weighed in grams).  When testing began, only 39% of the toilets tested could handle 350 grams per flush, the minimum performance level needed to gain a MaP rating.  By 2012, only 9% of units failed to reach that level, and some units can remove over 1,000 grams per flush.

MaP testing is not required, but most plumbing manufacturers voluntarily participate.  There were 3,734 models rated by March of 2019.  A Web link to this list can be found at the end of this story.

Some rated units can remove as much as 1,000 grams (2.2 pounds) per flush, but it should be noted that units rated at 350 grams per flush will handle the vast majority of situations.  Paying more for units above this base performance level is generally not necessary.

In November 2012, MaP released ratings for a new PREMIUM standard for residential toilets that use no more than 1.06 gallons per flush and have a solids removal rating of at least 600 grams per flush.  As of March 2019, 229 models had already complied with the new requirements.

The U.S. Environmental Protection Agency has a similar rating program for plumbing fixtures called WaterSense.  It uses third-party testing to rate High Efficiency Toilets (1.28 gallons) for minimum solids removal of 350 grams per flush, though it does not detail the rating if a unit exceeds this threshold as MaP tests do.

There are other qualities to look for in a new unit.  One of them is sound.  Consumers are advised to be cautious of pressure-assisted models that use a combination of water and compressed air to propel water.  These are extremely loud and disturbing.  Another consideration is how clean the bowl stays after washdown.  Consumer Reports magazine is a good source to compare quality of competing models.

Showerheads and Faucets

Showers consume about 17% of domestic indoor water use.  In the early 1990s, showerheads could use as much as 5.5 gallons per minute.  As of 2012, national standards mandate units that consume no more than 2.5 gallons per minute.  The WaterSense rating program rates and recommends water-conserving fixtures using no more than 2 gallons per minute.

Indoor faucets consume about 15% of domestic indoor water use.  National standards for bathroom sink faucets in 2012 limit consumption to no more than 2.2 gallons per minute.  WaterSense rated fixtures reduce use to 1.0-1.5 gallons per minute, a savings of 32 to 55%.  Inexpensive aerators can reduce flow to as little as 0.5 gallons per minute.
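The percentage figures quoted above follow directly from the flow rates.  A minimal sketch of the arithmetic, using only the standards and WaterSense ratings named in this section:

```python
# Percent flow reduction of WaterSense-rated fixtures against the 2012
# national standards quoted above (2.5 gpm showerheads, 2.2 gpm faucets).

def percent_savings(standard_gpm, efficient_gpm):
    """Reduction in flow, expressed as a percent of the standard rate."""
    return (standard_gpm - efficient_gpm) / standard_gpm * 100

# Faucets: the 1.0-1.5 gpm WaterSense range against the 2.2 gpm standard
print(f"{percent_savings(2.2, 1.5):.0f}%")  # low end of the quoted range
print(f"{percent_savings(2.2, 1.0):.0f}%")  # high end of the quoted range
# Showerheads: the 2.0 gpm WaterSense rating against the 2.5 gpm standard
print(f"{percent_savings(2.5, 2.0):.0f}%")
```

The faucet figures round to the 32% and 55% savings cited above; the showerhead comparison works out to a 20% reduction.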

For More Information:

MaP-rated toilets, showerheads, and faucets:


WaterSense rated toilets, showerheads, and faucets:


Consumer Reports magazine:


This article has primarily focused on water-borne centralized wastewater utilities.  Other solutions, such as composting toilets and decentralized treatment, may be subjects of future articles.


(Note: This story previously appeared in a series.  Because of this, the documentation does not begin at #1.)

10 Ponting, Clive, A Green History of the World, New York, NY: Penguin Books, 1991, p. 226.

11 Otaki, Yurina, et al., “Water systems and urban sanitation: A historical comparison of Tokyo and Singapore,” Journal of Water and Health, 2007, p. 259.

12 Goldstein, Jerome, Sensible Sludge, Emmaus, PA: Rodale Press, 1977, p. 17.

13 Much of this discussion came from Benidickson, Jamie, The Culture of Flushing, UBC Press, 2007.

14 Ibid., pp. 183-212.

15 Much of this discussion came from Kraus, Steven Joseph, Water, Sewers, and Streets, 1875-1930, MA Thesis, UT-Austin, May 1973.

16 Discussion of water contamination noted in minutes of Sept. 20, 1916 City Council meeting. Purchase of chlorine first noted in City budget Feb. 21, 1918.

17 Contractor from minutes of the Austin City Council, Nov. 21, 1935.

Year commissioned from City of Austin, “Annual report of the sewage treatment plant,” 1938.

18 “A Summary of Census Data on Sewerage Systems in the United States,” Public Health Reports, Mar. 20, 1942, pp. 409-421.

19 US EPA, Progress in Water Quality: An Evaluation of the National Investment in Municipal Wastewater Treatment, June 2000, Executive Summary, p. 3.

20 Ibid.

21 US EPA, Report to Congress on the Impacts and Control of CSOs and SSOs, Office of Water, August 2004, p. ES-4.

22 Centers for Disease Control and Prevention, MMWR Morbidity and Mortality Weekly Report, Vol. 59, No. 53, June 1, 2012, p. 24.

23 Gaffield, Stephen, et al., “Public Health Effects of Inadequately Managed Stormwater Runoff,” American Journal of Public Health, September 2003, pp. 1527–1533.
