The Basics of Cooling Water Management
This article, which explores the reasons why you should manage your industrial cooling water, is one in an occasional series on water management basics and technologies.
Drought conditions and increasing water usage have combined to decrease the availability and increase the cost of the good quality, low-hardness water preferred for cooling tower makeup use. At the same time, stricter environmental restrictions on effluent discharge have resulted in increased fees for disposal of cooling tower blowdown to the sewers. The addition of these concerns to the existing requirements for control of scale, corrosion and biological fouling has increased the difficulty and costs associated with operating a cooling water management program.
In spite of all these concerns, cooling water is a commonly neglected area often responsible for substantial problems due to downtime, equipment damage, loss of process control, high water use, environmental violations, safety hazards and increased energy use. Cooling water generally is neglected for two major reasons. First, the user often does not appreciate that cooling water is a vital part of the facility operation or production process; second, misinformation, fraudulent products and marketing hype are common where cooling water treatment is concerned.
Cooling water users, armed with a basic knowledge of proper cooling water management, can avoid many of the problems resulting from corrosion, scale, deposition and biological fouling.
Many industrial processes require that heat be moved from one place to another. A cooling tower is simply a device for rejection of unwanted heat to the atmosphere. The fact that water is a low-cost, convenient and highly effective heat transfer medium, and that evaporation of a pound of water absorbs about 1,000 BTU, makes the evaporative cooling tower the most effective means of heat rejection. Water evaporation within the cooling tower accounts for the majority of the heat rejected; typically, 75 percent to 80 percent is removed by evaporation. The rest of the heat is removed by transfer to the substantial airflow passing through the cooling tower.
For example, a 1,000-ton-rated cooling tower is designed to have a heat rejection of 12 million BTU/hr. At 80 percent heat rejection by evaporation, this unit will evaporate about 1,100 gal/hr, or 26,400 gal/day. Water evaporation in the cooling tower concentrates the dissolved salts found in almost all water sources, which increases the potential for scale, corrosion and biological fouling.
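The arithmetic behind that example can be sketched as a short calculation. The latent heat and water density below are round engineering values, and the 80 percent evaporation split is the figure given above; the function name is illustrative only.

```python
# Back-of-the-envelope evaporation estimate for the 1,000-ton example above.
LATENT_HEAT_BTU_PER_LB = 1_000   # approx. heat absorbed per pound of water evaporated
WATER_LB_PER_GAL = 8.34          # density of water, round engineering value

def evaporation_gph(heat_rejection_btu_hr: float, evap_fraction: float = 0.80) -> float:
    """Gallons per hour evaporated, given total heat rejection and the
    fraction of that heat removed by evaporation (the rest leaves with the air)."""
    evap_heat = heat_rejection_btu_hr * evap_fraction
    return evap_heat / LATENT_HEAT_BTU_PER_LB / WATER_LB_PER_GAL

gph = evaporation_gph(12_000_000)   # 1,000-ton tower: 12 million BTU/hr
print(round(gph), "gal/hr,", round(gph * 24), "gal/day")
```

This yields roughly 1,150 gal/hr, consistent with the article's rounded figure of about 1,100 gal/hr.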
In addition to the concentration of salts, a 1,000-ton unit operates at a design airflow rate of 271,000 cfm. Because a cooling tower is also an effective air scrubber, passage of large amounts of air through the device results in the addition of significant amounts of airborne dust and debris to the cooling water.
Cooling Tower Blowdown
Blowdown, or intentional removal of water from the cooling tower, is required to prevent over-concentration of salts and insoluble airborne debris. However, blowdown results in potential environmental problems, increased water use and wastewater disposal costs. The number of times the dissolved solids in the replacement water, or makeup, are concentrated in the recirculating cooling water commonly is referred to as cycles of concentration. Cycles is calculated by dividing the dissolved solids level in the cooling water by that of the makeup water.
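The cycles calculation, combined with the standard steady-state water balance (makeup = evaporation + blowdown), can be sketched as follows. The TDS values in the example are illustrative assumptions, not figures from this article; the blowdown relationship is the conventional mass-balance result, not something the article states directly.

```python
def cycles_of_concentration(cooling_water_tds: float, makeup_tds: float) -> float:
    """Cycles = dissolved solids in the recirculating water / dissolved solids in makeup."""
    return cooling_water_tds / makeup_tds

def blowdown_gph(evaporation_gph: float, cycles: float) -> float:
    """Steady state: makeup = evaporation + blowdown, and makeup = cycles * blowdown,
    so blowdown = evaporation / (cycles - 1)."""
    return evaporation_gph / (cycles - 1)

cycles = cycles_of_concentration(cooling_water_tds=1_500, makeup_tds=300)
print(cycles, blowdown_gph(1_100, cycles))  # 5.0 275.0
```

At 5 cycles, the article's roughly 1,100 gal/hr of evaporation would require about 275 gal/hr of blowdown; note how the required blowdown drops sharply as cycles increase.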
In many areas of the country, scale formation due to poor makeup-water quality prevents any cycling. Other areas are severely limited as to the maximum cycles obtainable. Chemical treatment of the cooling water thus is required in many areas to permit operation of an evaporative cooling tower without scale formation. Scale causes physical blockage of piping and equipment and degrades cooling performance.
Described as the universal solvent, water corrodes all materials of construction but at different rates. Steel, being the lowest-cost construction material for cooling systems, is common and is readily corroded by most cooling waters. Other materials such as copper, brass and galvanized steel also corrode, though generally at lower rates. To obtain a useful life from a cooling system, corrosion inhibitors usually are used to control the corrosion rates to an acceptable level.
The cooling water environment -- warm water with high dissolved solids and debris loading -- is an excellent medium for growth of microorganisms that may cause many severe problems. Not only is there an increased risk of Legionnaires' disease, but biofilms can plug water passages and piping, accelerate corrosion, and reduce heat exchanger efficiency.
The effect of biofilms on the power cost of chiller operation often is not appreciated. The thermal conductivity of biofilm, typically about 0.2, is substantially lower than that of common calcium carbonate scale. Therefore, while a system may be scale-free, any biofilm present may cause excessive energy use. To prevent these problems, chemicals referred to as biocides are added to cooling towers to control the growth of unwanted microorganisms in the cooling water.
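To illustrate why a thin biofilm matters, the conductive resistance of a deposit layer (thickness divided by conductivity) can be compared for the two deposits. The biofilm figure of 0.2 is used as given above (the article states no units); the calcium carbonate value of about 1.3 BTU/hr-ft-°F is an assumed, commonly cited figure, and the 1/64-inch layer thickness is purely illustrative.

```python
def fouling_resistance(thickness_ft: float, conductivity: float) -> float:
    """Conductive resistance of a deposit layer: R = thickness / k
    (hr-ft^2-F/BTU when thickness is in ft and k is in BTU/hr-ft-F)."""
    return thickness_ft / conductivity

# Compare the same 1/64-inch layer of each deposit.
thickness = (1 / 64) / 12                        # feet
r_biofilm = fouling_resistance(thickness, 0.2)   # biofilm conductivity, as given above
r_scale = fouling_resistance(thickness, 1.3)     # assumed value for CaCO3 scale
print(round(r_biofilm / r_scale, 1))             # 6.5
```

Under these assumptions, an equal thickness of biofilm insulates the heat transfer surface roughly six times as much as calcium carbonate scale, which is why a visually clean but biofouled chiller can still waste energy.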
Any discussion of cooling water management must begin with an objective statement of what is expected from the cooling water system. In most facilities, the cooling water system must provide reliable equipment cooling with maximum heat-transfer efficiency. The four basic requirements are:
- To obtain maximum efficiency by minimizing problems from corrosion, scale, deposition and biological growth.
- To have feasible implementation and control with a minimum input of labor and money.
- To be cost effective, considering the total water system capital and operating costs.
- To be healthy, safe and environmentally acceptable.
These four requirements form the basis for any serious discussion of cooling water management. More on this in part two. PCE