Cooling a Server Room. How to Put Money to Good Use

  • Published In: Tutorial
  • Created Date: 2014-09-21
  • Hits: 1510

The following four things prompted me to write this article:

  1. The article «Why It Is Important to Maintain Temperature Conditions in a Server Room. How a Server Room Cooling Usually Works», whose author took on the honorable and very difficult mission of explaining why servers need cooling.
  2. The mistakes I found in that article.
  3. My own experience.
  4. The very small number of substantive articles on data center infrastructure (and a server room is, after all, a little data center) on Habrahabr. «Beeline» and the guys from "Data Centres Russia" are doing a great job here, though.

So, here is what this article is about.

First, this article will take a short excursion into the theory of cooling server rooms. Second, I will try to cover the main mistakes made when planning a cooling system. Third, I will examine what is really worth spending money on and what can safely be dropped.

Today there are two global strategies for cooling data centers:

  1. Free cooling, where servers are cooled directly by outside air that has passed minimal preparation (usually just basic filtration and some heating during the winter season).
  2. So-called controlled cooling, where the air is conditioned for dust, humidity, and temperature and then supplied to the servers. Various methods of indirect free cooling also belong here (outside air cools a heat exchanger through which the data center's air circulates).

The advantages of the first strategy are obvious: low implementation cost, low maintenance cost, and ridiculously small electricity bills. Its disadvantages are just as clear: uncontrolled humidity and dust levels will inevitably lead to the failure of server components. Still, this approach has its followers, usually the big technology companies. Why does it work for them and not for everyone else? There are three reasons:

  1. They have a network of fully redundant sites. If one fails, another carries on.
  2. The necessity to stay at the cutting edge of technology. A server running on bad air will fail in about a year, and during that year these companies will replace a third of their servers anyway. So they do not need to fuss over hardware that will be thrown in the garbage within a year.
  3. Scale and electricity bills. Cooling is the most expensive item on the electricity bill. Cutting cooling expenses by 1% saves them several million dollars; just imagine a 30-50% reduction! No doubt these companies are ready to tolerate a certain amount of discomfort.

The second strategy implies higher reliability and a longer lifetime for the cooled equipment. The classic example is banking, along with all the other companies that do not change their servers like socks. The disadvantages of this strategy all come down to its price (construction, maintenance, electricity bills).

It is clear that most companies want a method that works “as efficiently and unpretentiously as possible”. However, the simplest way is not always the right one. Sometimes too simple is the opposite of right.

Let us move on to more practical matters. When we discuss cooling servers, temperature control is considered the first thing to handle. That is right, but not sufficient. Proper cooling rests on three basics: temperature, air volume, and humidity. The second important element is airflow management, i.e. how to direct cold air to where a server will take it in, how to carry the hot air from the server's exhaust back to the air conditioner, and how to keep these cold and hot airflows separate so that they do not mix.

Temperature is the simple part. There are recommendations from the server manufacturer as well as the ASHRAE guidelines. In my view, a normal temperature for the majority of server rooms is 22-24 °C (72-75 °F).

While everyone remembers temperature, almost no one thinks about air volume when building a server room. Look at a server's technical specifications: besides power consumption and dimensions, there is a parameter usually measured in CFM (cubic feet per minute), the volume of air the server pumps through itself. In other words, your server needs air of a certain temperature and in a certain volume, with “a certain volume” deserving bold and caps. This brings us to the idea of using residential split systems in a server room: they simply cannot handle the required volume. The heat output of a human is negligible compared to that of a server, and home air conditioners are designed to maintain a comfortable climate for humans. Their little fans (like Tyrannosaurus forelimbs) cannot push through the air volume needed to cool a server. The result is a situation where the server moves a certain amount of air through itself, the air conditioner cannot remove that heat, and the hot air mixes with the cold. I am sure you have been in a server room where the air conditioner is set to +16 °C (61 °F) while the room sits at +28 °C (82 °F). I have. Perhaps your server room is just like that?
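To put “a certain volume” in numbers, here is a rough back-of-the-envelope sketch (the 400 W server and the 10 °C intake-to-exhaust rise are assumed example figures, not from the article):

    # Rough estimate of the airflow a server needs to carry its heat away.
    # Assumed example figures: a 400 W server, 10 C rise between intake
    # and exhaust air.

    AIR_DENSITY = 1.2           # kg/m^3 at roughly room temperature
    AIR_HEAT_CAPACITY = 1005.0  # J/(kg*K), specific heat of dry air

    def required_airflow_m3h(heat_watts: float, delta_t_c: float) -> float:
        """Air volume (m^3/h) needed to remove heat_watts at a given
        intake-to-exhaust temperature rise."""
        mass_flow = heat_watts / (AIR_HEAT_CAPACITY * delta_t_c)  # kg/s
        return mass_flow / AIR_DENSITY * 3600                     # m^3/h

    flow = required_airflow_m3h(heat_watts=400, delta_t_c=10)
    print(f"{flow:.0f} m^3/h (~{flow / 1.699:.0f} CFM)")
    # ~119 m^3/h, or ~70 CFM, for a single modest server; a full rack of
    # such servers is far beyond what a residential split system can move.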

As for controlling humidity, the article I mentioned above takes the wrong approach. No doubt the humidity level must be controlled; however, the air must be humidified, not dried. The point is that a server room has closed air circulation (at least it should have). The amount of moisture in the air at stage zero (when the server room is started up) lies within a certain range. During cooling, most of that moisture condenses on the air conditioner's heat exchanger (the temperature difference is large) and is discharged into the drain. The air becomes too dry, which produces static on the boards and reduces the air's heat capacity. Therefore you should invest in a powerful humidifier and a water treatment system.
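A small sketch of why the heat exchanger wrings moisture out of the air (it uses the standard Magnus approximation for dew point; the 24 °C / 45% figures are example values I picked, not from the article):

    import math

    # Magnus approximation for dew point, valid roughly from 0 to 60 C.
    A, B = 17.62, 243.12  # Magnus coefficients for water vapor

    def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
        """Dew point in C for a given air temperature and relative humidity."""
        gamma = math.log(rel_humidity_pct / 100) + A * temp_c / (B + temp_c)
        return B * gamma / (A - gamma)

    print(f"{dew_point_c(24, 45):.1f} C")
    # ~11.3 C: any heat-exchanger surface colder than this condenses
    # moisture out of the room air and sends it down the drain -- the
    # drying effect described above.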

One more point concerns airflow management. In most cases fan trays are absolutely useless: they pull air upwards, while the servers pull it from front to back. What you should do instead is throw fan trays out of the budget and install blanking panels in the cabinet's empty units. Do whatever you can to close every opening through which air from the rear of the cabinet can reach the front. Passive airflow management usually works better than active, and it is cheaper, too.

Microclimate monitoring is extremely important. Without it you will never know that something is working improperly, i.e. not the way you intended. You must monitor both temperature and humidity. Humidity can be monitored at the spot farthest from the humidifier, since it is practically the same everywhere in the room. Temperature, by contrast, must be monitored at the front door of a cabinet. If you do not distribute cold air through a raised floor, one sensor per cabinet is enough. If you do distribute air from a raised floor (using proper air conditioners, it goes without saying), a good strategy is to monitor the air at several heights above the floor (for example, 0.5 m and 1.5 m). It is also worth mentioning that you must never install cabinets with glass or solid doors: air must pass through the cabinet and reach the servers freely. If you somehow ended up with such cabinets, remove their doors.
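For illustration, a minimal polling sketch of the sensor layout described above (everything here is an assumption: read_sensor is a hypothetical stand-in for whatever your hardware actually exposes, and the humidity band is an example target, not a figure from the article):

    import time

    TEMP_RANGE_C = (22.0, 24.0)        # cabinet-front target from above
    HUMIDITY_RANGE_PCT = (40.0, 60.0)  # assumed band; tune to your needs

    # Hypothetical sensor map: temperature at the cabinet front at two
    # heights (for a raised-floor layout), humidity far from the humidifier.
    SENSORS = {
        "cabinet-01/front/0.5m": "temperature",
        "cabinet-01/front/1.5m": "temperature",
        "room/far-corner": "humidity",
    }

    def read_sensor(sensor_id: str) -> float:
        """Stand-in for your real sensor API (SNMP, Modbus, a USB probe...)."""
        raise NotImplementedError

    def check_once() -> None:
        for sensor_id, kind in SENSORS.items():
            value = read_sensor(sensor_id)
            low, high = TEMP_RANGE_C if kind == "temperature" else HUMIDITY_RANGE_PCT
            if not low <= value <= high:
                print(f"ALERT {sensor_id}: {kind} {value:.1f} outside {low}-{high}")

    while True:
        check_once()
        time.sleep(60)  # poll once a minute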

A few conclusions to sum up the article:

  1. Do not use residential split systems; they are of no use here at all.
  2. Manage humidity.
  3. Manage airflow.
  4. Install blanking panels in the cabinet's unused units.
  5. Use cabinets with perforated front and rear doors. If you do not have those, get rid of the doors entirely, or apply creative thinking and a drill.
  6. Place monitoring sensors properly: temperature is taken at the front of a cabinet, while humidity can be measured anywhere in the room.
  7. Get rid of radiators in the server room. They do not just heat the room; sometimes they also flood it.
  8. Get rid of windows. Windows are heat leaks and the easiest way inside, bypassing the armored server room door and all five security posts.
  9. Implement proper waterproofing, a vapor barrier, and thermal insulation in the server room.
  10. Tools are secondary. There are plenty of cooling and monitoring solutions out there; the main thing is to understand what is primary for you today, and then the tool will be easy to find.
  11. Please keep in mind that IT today is not limited to «patching KDE2 under FreeBSD», VMs, and databases; there are also such things as power, cooling, physical security, and architecture.

Good luck in your quest to build a proper infrastructure.

Author: ksopt

The original article is available at: http://habrahabr.ru/post/235593/

