The earliest phases of design are crucial to the temperature control of a small manufacturing process. Many problems, including unstable control, excessive cost, inaccurate temperatures and user complexity, are either created or avoided based on the initial specifications provided by the manufacturing plant and the design principles followed by the equipment vendor.
Basic requirements such as process volume, ambient and process temperature range, control size, and cosmetics are usually straightforward and, although they should be checked, do not often require close scrutiny. Conversely, temperature accuracy (see sidebar) is a parameter that always should be examined closely using specific engineering guidelines. An unrealistic temperature accuracy specification can have a substantial impact on cost, delivery schedule and performance. Likewise, parameters such as cooling capacity and thermal lag can affect the overall system efficiency.
You know your process better than anyone. But when it comes to controlling the temperature, it can be easy to over-specify a controller based on what you think you need to ensure accuracy, or to under-specify by focusing exclusively on price. Understanding the variables that affect temperature control and working closely with your equipment vendor can help you optimize the performance of your cooling system.
Defining Your Process Application

Temperature accuracy is important. But how much accuracy do you really need? Be realistic about the accuracy and stability that your process must hold at setpoint. Specifying temperature accuracy better than 0.001 of a degree could drive the design cost and schedule way beyond acceptable limits. Unless you are dealing with a phase change, distillation threshold or similar effect that has an inherently precise temperature, it is unlikely that variations of one or even several degrees will affect the process outcome.
Your equipment vendor can help you perform experiments and calculations that will allow you to determine the stability and accuracy required of your cooling system and process controller. The looser these two specifications, the easier and less costly the equipment will be to implement.
You also should consider carefully how the process temperature and setpoint will be displayed to the user. Even a high-quality controller and carefully designed system can appear to be unstable if the temperature readout is presented in 0.001 or even 0.01 of a degree. Always specify readout precision in the largest unit that is meaningful to the process. One degree is usually adequate in nonscientific applications.
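The effect of readout precision is easy to demonstrate. In this minimal Python sketch (the readings are hypothetical, from a loop that is in fact stable to well within a degree), the same data look noisy at 0.01-degree precision but steady at whole-degree precision:

```python
# Hypothetical sensor log from a well-controlled process near 72 degrees.
readings = [71.98, 72.03, 71.97, 72.02]

# Displayed to 0.01 of a degree, the readout appears to wander.
print([f"{t:.2f}" for t in readings])  # ['71.98', '72.03', '71.97', '72.02']

# Displayed to the nearest whole degree, it reads as a steady 72.
print([f"{t:.0f}" for t in readings])  # ['72', '72', '72', '72']
```

Nothing about the control loop changes between the two displays; only the precision of the readout does.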
The type of sensor used in the controller will have a direct bearing on cost and performance. Select a sensor that fits your performance expectations rather than sticking to a particular type (e.g., thermistor, resistance temperature detector [RTD] or thermocouple) simply because it has always been used in your process. Generally, it is best to consider high-sensitivity thermistors and semiconductor sensors first because they offer the best chance for system stability at the lowest cost. If these sensors do not offer sufficient accuracy or range, then lower-sensitivity but high-accuracy RTDs might be appropriate. Thermocouples might be required for extended temperature ranges or a fast response time, but they offer the lowest stability and accuracy.
The cooling capacity must be large enough to accommodate the load while allowing for variations in ambient temperature, process changes and other parameters, but it must not be an order of magnitude beyond this requirement. A capacity that substantially exceeds the load requirement will severely reduce the likelihood of achieving smooth, stable temperature control and might even make control impossible.
An experienced engineer can calculate cooling capacity requirements from detailed system plans. However, it is often faster and more accurate for the vendor to assemble a rough prototype system without any controls. The system then should be tested under the most severe ambient and load conditions while the temperature is monitored at the load. Capacity is adequate if the process temperature holds steady or, even better, decreases by up to 25 percent over a time scale relevant to the process.
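For the calculation side, a sensible-heat load works out to Q = m·c·ΔT/t, plus a modest margin for ambient swings. The sketch below is only an illustration of that arithmetic; the function name, the 25 percent margin and the example numbers are assumptions, not figures from any vendor:

```python
def required_capacity_watts(mass_kg, specific_heat_j_per_kg_k,
                            delta_t_k, time_s, margin=1.25):
    """Estimate the cooling capacity needed to remove a sensible-heat load.

    Q = m * c * dT / t, with a modest margin for ambient variation --
    large enough to cover the load, but not an order of magnitude more.
    """
    q_watts = mass_kg * specific_heat_j_per_kg_k * delta_t_k / time_s
    return q_watts * margin

# Example: cool 20 kg of water (c = 4186 J/kg.K) by 5 K in 10 minutes.
print(required_capacity_watts(20, 4186, 5, 600))  # about 872 W with the margin
```

A detailed design would also account for heat gain through insulation, pump work and process exotherms, which is why the prototype test described above is often the more reliable check.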
Yet another important characteristic in temperature control is the thermal lag, or delay between the cooling mechanism and the controlled region of the process (the sensor location). This lag is simply the time required for a given amount of heat to be transported out of the process and carried to the heat exchanger. The lag time must be reduced as much as possible to achieve stable temperature control.
Thermal transport lag can be partially offset by elaborate proportional-integral-derivative (PID) control, but this approach will result in slower overall system response time as well as a higher controller cost. Addressing thermal lag in the system design is a more effective way to achieve process temperature stability.
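For readers unfamiliar with PID control, the structure is simple even though tuning it around a large lag is not. This is a minimal, generic sketch; the class name and gains are illustrative, not a tuned controller for any particular plant:

```python
class PID:
    """Minimal discrete PID controller (illustrative, untuned)."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured, dt):
        error = self.setpoint - measured
        self.integral += error * dt
        # No derivative term on the first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Sign convention for driving the cooling device depends on the plant.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical usage: process running 2.3 degrees above a 40-degree setpoint.
pid = PID(kp=2.0, ki=0.05, kd=0.5, setpoint=40.0)
drive = pid.update(measured=42.3, dt=1.0)  # negative error -> calls for cooling
```

With a long transport lag, the derivative and integral gains must be kept small to avoid oscillation, which is exactly the slower response the article warns about.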
Design strategies that can reduce lag time include close-coupling between the process media and the cooling device(s), using high-thermal-conductivity materials and minimizing the physical size of the system. When a system prototype is available, the temperature should be monitored at the most important location in the process media with the cooling turned off to establish a baseline. Cooling then should be turned on at full capacity. The subsequent graph of process temperature vs. time will characterize the thermal process delay (figure 1).
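The step-response test above can be reduced to a number with a few lines of code. This sketch assumes a logged temperature-vs.-time record with full cooling switched on at t = 0, and takes the lag as the time until the process temperature has fallen a small threshold below its baseline; the threshold and sample data are hypothetical:

```python
def thermal_lag_seconds(times, temps, drop_threshold=0.5):
    """Estimate transport lag from a step-response log.

    Cooling runs at full capacity from t = 0; the lag is the time until
    the process temperature falls drop_threshold degrees below baseline.
    """
    baseline = temps[0]
    for t, temp in zip(times, temps):
        if baseline - temp >= drop_threshold:
            return t
    return None  # no measurable response within the logged window

# Hypothetical log, one sample every 10 s; the response emerges near 40 s.
times = [0, 10, 20, 30, 40, 50, 60]
temps = [75.0, 75.0, 74.9, 74.8, 74.4, 73.6, 72.5]
print(thermal_lag_seconds(times, temps))  # 40
```

A long flat region at the start of the curve is the signature of transport lag, and shrinking it through close-coupling and conductive materials pays off more than fighting it in the controller.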
A related parameter is the overall temperature gradient in the cooling system. A temperature controller alone, no matter how sophisticated, cannot ensure an even, gradient-free temperature throughout a process. As with thermal lag, the temperature gradient must be addressed through the appropriate system design.
Through close collaboration with a knowledgeable, experienced equipment vendor, you can establish practical parameters for a cooling system that meets your process requirements and your budget.
Sidebar: Accuracy and Precision

Accuracy and precision have different meanings, but these terms often are confused when specifying process control or operation. Understanding the difference is important from both cost and performance perspectives.
The precision specification for a temperature control is how closely temperature can be read on a display or dialed in for a setpoint. For example, a display of 102.3°F is precise to 0.1 degree, while a display of 123°F is only precise to 1 degree.
The accuracy of a temperature control reading is how closely it would conform to the same reading made by a thermometer, calibrated and traceable to international standards, measured at exactly the same point. For example, a temperature displayed as 102.3°F with an accuracy of 0.5°F indicates an actual measured temperature that is anywhere from 101.8 to 102.8°F. Thus, a more precise temperature reading is not necessarily more accurate.
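The sidebar's example can be stated as a simple bound. This sketch (the function name is an assumption, used only for illustration) returns the range in which the true temperature lies for a given displayed value and accuracy specification:

```python
def accuracy_band(displayed, accuracy):
    """The true temperature lies within +/- accuracy of the displayed value."""
    return displayed - accuracy, displayed + accuracy

# The sidebar's example: 102.3 deg F displayed, accurate to 0.5 deg F.
low, high = accuracy_band(102.3, 0.5)  # roughly 101.8 to 102.8
```

Note that the band is five times wider than the 0.1-degree display precision: the extra digit on the display conveys no additional accuracy.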
In general, a high-precision reading can be obtained far more easily and, therefore, at a lower cost than high accuracy. However, neither precision nor accuracy should be specified beyond what is necessary for the process.