Electronics Protection - January/February 2013 - (Page 6)

Feature

How Do You Choose Between Hot and Cold Aisle Containment?
Ian Seaton, Global Technology Manager, Chatsworth Products

From the very beginning of data centers' use of airflow containment strategies, there has been an ongoing debate about which form is the most efficient: hot aisle containment or cold aisle containment. The debate, based on equal parts vested vendor interest, intuition and anecdotal evidence, was addressed by the scientific study DataCenter 2020. Its conclusion is best summarized by the white paper's title: "DataCenter 2020: Hot Aisle and Cold Aisle Containment Efficiencies Reveal No Significant Differences." This study shows that data centers can deploy containment strategies based on specific architectural and business variables, rather than preference or even guesswork, and essentially achieve the same results. The most critical steps then become understanding how each strategy fits into those variables and how to optimize performance once installed.

In fact, that basic principle has been around since the inception of data center airflow containment. Widely credited with introducing the hot aisle/cold aisle concept, the Uptime Institute's Dr. Robert Sullivan, working for IBM at the time, decided that it didn't make much sense to dump hot air from one rack into the cold air intake of the next one. However, today's equipment produces considerably higher heat densities, and the traditional hot aisle/cold aisle approach of delivering increasing amounts of very low temperature air often doesn't work. Typically, even in data centers where the volume of chilled air supplied far exceeds the actual demand of the IT equipment, and that air is supplied 20°F to 25°F cooler than the equipment actually requires, there will still be hot spots in some areas of the data center, and some servers will be ingesting air above the maximum threshold temperature set for the space.
For example, several years ago the Uptime Institute conducted data center audits of its US membership, which operates some of the most efficient data centers in the country, and found that the average data center was still producing 2.6 times the airflow demanded by the servers. Best practices such as sealing floor tile cut-outs, deploying blanking panels in all unused rack-mount spaces and properly locating cooling units help improve the physical separation achieved by hot aisle/cold aisle, thereby reducing the need for heavy over-production of overly cool supply air. That path of optimized airflow management culminates with a containment system that reduces the airflow surplus requirement to 5 percent or less of the IT equipment's actual demand.

In addition to over-producing the volume of chilled air, most data centers still find they need to overcool that air. With a typical set point around 72°F (22°C) producing a supply temperature in the mid-50s to meet a recommended maximum server inlet temperature of 80°F (27°C), according to the latest ASHRAE Environmental Guidelines for Data Processing Equipment, many data center operators still find they need to drop that set point another degree or two to meet their server inlet temperature specifications. One of the startling results of this practice is that you are spending one third of your entire data center operating budget to cool air that is already cold enough to use in a data center with good airflow management. For most operators, the single largest chunk of the operating budget is consumed by cooling costs. We need that number to be as low as possible. But as it stands, most traditional hot aisle/cold aisle data centers are overproducing so much cool air that the guy next door could cool his data center for free with your waste air. It is simply spent needlessly.
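The oversupply figures above can be put in concrete terms with back-of-the-envelope airflow sizing. This sketch is illustrative only: the 5 kW cabinet load and ~20°F (11 K) server temperature rise are assumed values, not figures from the article; only the 2.6x oversupply and 5 percent surplus target come from the text. Required airflow follows from the sensible-heat relation Q = P / (rho * cp * dT).

```python
# Back-of-the-envelope airflow sizing. Assumed inputs (5 kW load, 11 K
# rise) are hypothetical; 2.6x and 1.05x factors come from the article.
RHO = 1.2      # air density, kg/m^3 (sea level, ~20 C)
CP = 1005.0    # specific heat of air, J/(kg*K)

def required_airflow_m3s(power_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to remove power_w watts at a delta_t_k rise."""
    return power_w / (RHO * CP * delta_t_k)

it_load_w = 5000.0   # hypothetical 5 kW cabinet
delta_t_k = 11.0     # ~20 F server temperature rise

demand = required_airflow_m3s(it_load_w, delta_t_k)
legacy_supply = 2.6 * demand       # the 2.6x oversupply Uptime observed
contained_supply = 1.05 * demand   # containment target: <= 5% surplus

print(f"server demand:    {demand:.3f} m^3/s")
print(f"legacy supply:    {legacy_supply:.3f} m^3/s")
print(f"contained supply: {contained_supply:.3f} m^3/s")
```

The point of the arithmetic: the cooling plant in the legacy case is moving roughly two and a half times the air the servers can actually use, and containment recovers nearly all of that margin.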
No matter where your pain point exists, whether it's data center expansion, equipment upgrades or a search for efficiency, airflow containment can probably solve it. The DataCenter 2020 study helps clarify that hot aisle and cold aisle containment can both be highly efficient, but strict separation of cold and hot air is fundamental.

So, where do you start? In the cabinet. Containment is often associated with enclosing a hot or cold aisle, so the cabinets are easy to overlook, but the cabinets have to provide the first barrier to the mixing of hot air and cold air. If there are empty rack-mount spaces (U), you need a blanking panel. If we have cable pass-throughs in the front and rear, we have to put grommets there to make sure we're not violating the integrity of that isolation. If we're doing setbacks (25 mm being the most common) on equipment mounting rails from the cabinet frame, we have to make sure that setback is sealed all the way around the perimeter of the cabinet with air dams. We don't often think of the space around the perimeter as that big of an opening, but leaving out that one detail equates to an opening of about 5U without filler panels.

[Figure: Custom designed cabling grommets that allow cable pass-throughs and contain airflow]

Once you've accounted for cabinet-level containment, you can take a step back and start weighing the variables of which containment strategy fits your specific needs. Cold Aisle Containment (CAC) is pretty simple compared to some Hot Aisle Containment (HAC) solutions. You put a roof over the cold aisles and doors on the ends of the rows and you've pretty much got it done. Generally deployed in a cabinet-supported system, most of the CAC facilities I've seen are pumping cold air into the contained environment through a raised floor structure. It's also compatible with row-based cooling and an easy retrofit for existing data centers, especially with overhead power distribution and basket tray.
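The perimeter-setback figure mentioned earlier can be sanity-checked with rough area arithmetic. The cabinet frame-opening dimensions below (600 mm x 2000 mm) and the nominal 1U opening size are assumptions for illustration, not specifications from the article; with dimensions in that range, a 25 mm setback gap works out to the equivalent of several unfilled U, consistent with the article's "about 5U" figure.

```python
# Rough area check (assumed cabinet dimensions, not from the article):
# how big is the 25 mm rail-setback gap compared to unfilled U spaces?
SETBACK_MM = 25.0                 # the common setback cited in the article
FRAME_W_MM, FRAME_H_MM = 600.0, 2000.0   # hypothetical frame opening
U_OPENING_MM2 = 450.0 * 44.45            # ~19 in rack opening x 1U height

perimeter_mm = 2 * (FRAME_W_MM + FRAME_H_MM)
gap_area_mm2 = perimeter_mm * SETBACK_MM

equivalent_u = gap_area_mm2 / U_OPENING_MM2
print(f"perimeter gap ~ {equivalent_u:.1f}U of missing blanking panels")
```

In other words, skipping the air dams around the rail setback leaks about as much air as leaving five or six U-spaces open with no blanking panels.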
That keeps those overhead structures from becoming obstacles to containment. Fire suppression does require a little more work, but it's not complicated: you just put a hole in the containment ceiling and bring the sprinkler in. Since the CAC architecture results in two (or more) separate air volumes that must be addressed by fire suppression, there is a cost adder for what are essentially two parallel systems, including the piping, sprinkler heads and perhaps a higher-capacity delivery system. And when you raise set points, the room is going to be considerably warmer. For example, if the supply air is set at 75°F (24°C) and we are running primarily blade servers with a 32°F (18°C) ΔT, the whole room basically becomes a hot aisle at around 107°F (42°C).

HAC gets a little more interesting. You need to have a path for that hot air to get back to your return air. Typically you would have the hot aisle go up to a drop ceiling and use that as the return path, or the whole structure can duct back to the return. But it has to get out of the room without mixing with the cooled supply air. Eliminate that hot air and you have a path to higher densities. HAC also has the flexibility of being deployed in a frame-supported system.
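The CAC room-temperature example above is simple addition — return air is supply temperature plus the equipment ΔT — but the unit conversion is worth spelling out, because a ΔT of 32°F is an 18°C *difference* (not 0°C, as a naive absolute conversion would give). A minimal sketch of the arithmetic:

```python
# Room-as-hot-aisle arithmetic from the CAC example: return temperature
# is supply temperature plus the equipment delta-T (75 F + 32 F = 107 F).
def return_temp_f(supply_f: float, delta_t_f: float) -> float:
    return supply_f + delta_t_f

def f_to_c(temp_f: float) -> float:
    """Convert an absolute Fahrenheit temperature to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def dtf_to_dtc(delta_t_f: float) -> float:
    """Convert a Fahrenheit temperature DIFFERENCE to Celsius (no -32 offset)."""
    return delta_t_f * 5.0 / 9.0

room_f = return_temp_f(75.0, 32.0)   # 75 F supply, 32 F blade-server dT
print(f"room: {room_f:.0f} F ({f_to_c(room_f):.0f} C)")
print(f"delta-T: 32 F = {dtf_to_dtc(32.0):.0f} C")
```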
