If the white space of a datacenter is the place where the actual machines are hosted (we talked about it here), the gray space is the area set up to accommodate the back-end infrastructure that allows the machines to function. The larger a white space is, the more gray space is likely to be needed to support it.
Sustainability is one of the pillars that guided the entire project, from the construction of the datacenter at the Bologna Technopole, to the design of the supercomputer itself.
All the technological solutions adopted were also evaluated for their environmental impact: great attention was therefore devoted to the design of the electrical and mechanical infrastructure, to the lighting and, in particular, to the technologies used to cool Leonardo and the datacenter rooms.
COOLING THE MACHINE
One of the fundamental elements is the cooling system of the machines: without it, Leonardo would literally burn up in a very short time, rapidly losing efficiency and, in fact, ceasing to work.
To keep the temperature at bay, the racks are cooled with warm water: the water enters the supercomputer’s cooling circuits at 36 °C and exits at 42 °C, heated by the CPUs running their calculations and processing at full power, and is then sent out to the adiabatic heat dissipators, called drycoolers, which bring it back down to 36 °C.
Unlike many other datacenters, no classic refrigeration cycle is used: the water is cooled only by heat exchangers, essentially huge “fans”, which exploit a natural adiabatic process.
This process relies on the evaporation of water to shed the excess heat. Compared to classic cooling systems, drycoolers are much more efficient, which reduces energy consumption.
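To give a feel for what that 6 K rise means thermally, here is a minimal Python sketch based on the standard relation Q = ṁ·c_p·ΔT. Only the inlet and outlet temperatures come from the article; the flow rate in the example is a hypothetical placeholder, not a Cineca figure.

```python
# Back-of-the-envelope estimate of the heat carried away by the
# warm-water loop, using Q = m_dot * c_p * delta_T.
# The 36 °C / 42 °C temperatures come from the article; the flow rate
# below is a hypothetical placeholder.

WATER_DENSITY = 997.0         # kg/m^3, approximate for warm water
WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)

def heat_removed_kw(flow_m3_per_h: float,
                    t_in_c: float = 36.0,
                    t_out_c: float = 42.0) -> float:
    """Heat removed (kW) by a water loop with the given flow and temperatures."""
    mass_flow_kg_s = flow_m3_per_h * WATER_DENSITY / 3600.0
    delta_t_k = t_out_c - t_in_c                   # 6 K rise across the racks
    return mass_flow_kg_s * WATER_SPECIFIC_HEAT * delta_t_k / 1000.0

if __name__ == "__main__":
    # Hypothetical circulation of 500 m^3/h through the rack circuits.
    print(f"~{heat_removed_kw(500.0):.0f} kW removed at a 6 K temperature rise")
```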
AIR CONDITIONING SYSTEM
In addition to the machine’s internal cooling, there is also an air conditioning system that keeps the white space at a temperature suitable for the operation of the machine itself.
The air conditioning of the room, and of the technical spaces in general, relies on chilled water: the water, brought to 19 °C by the refrigeration units, is sent to the various rooms to cool the air and returns to the refrigeration units at 26 °C, ready to be chilled again.
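The same kind of arithmetic works in reverse for the chilled-water loop: given a heat load in the rooms, it tells you how much 19 °C water must circulate to absorb it across the 7 K return difference. A small Python sketch, with a purely hypothetical 1 MW example load:

```python
# Sketch: chilled-water flow needed to absorb a given room heat load,
# using the 19 °C supply / 26 °C return temperatures quoted above.
# The 1 MW example load is hypothetical, not a Cineca figure.

WATER_DENSITY = 998.0         # kg/m^3, approximate
WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)

def required_flow_m3_per_h(heat_load_kw: float,
                           t_supply_c: float = 19.0,
                           t_return_c: float = 26.0) -> float:
    """Water flow (m^3/h) needed so that m_dot * c_p * dT matches the load."""
    delta_t_k = t_return_c - t_supply_c                     # 7 K on this loop
    mass_flow_kg_s = heat_load_kw * 1000.0 / (WATER_SPECIFIC_HEAT * delta_t_k)
    return mass_flow_kg_s * 3600.0 / WATER_DENSITY

if __name__ == "__main__":
    print(f"A 1 MW room load needs about "
          f"{required_flow_m3_per_h(1000.0):.0f} m^3/h of chilled water")
```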
Overall, the air conditioning system comprises about 10 km of closed-circuit piping, roughly 7 km for the main circuit and 3 km for the secondary one, carrying chilled and tempered water from the plants to the rooms and to the supercomputer.
Between warm and chilled water, the pipes contain a total of approximately 500 m³.
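Those two figures allow a quick plausibility check: 500 m³ spread over roughly 10 km of pipe corresponds to an average equivalent diameter of about 25 cm, treating the whole network as a single round pipe, which is of course a simplification. A short Python sketch of that arithmetic:

```python
import math

# Rough sanity check: what average pipe diameter is implied by ~500 m^3
# of water in ~10 km of closed-circuit piping? Both figures come from the
# article; a single equivalent diameter is a deliberate simplification.

TOTAL_PIPE_LENGTH_M = 10_000.0    # main + secondary circuits combined
TOTAL_WATER_VOLUME_M3 = 500.0

def equivalent_diameter_cm(volume_m3: float, length_m: float) -> float:
    """Diameter (cm) of a single round pipe holding `volume_m3` over `length_m`."""
    cross_section_m2 = volume_m3 / length_m          # V = A * L
    return 2.0 * math.sqrt(cross_section_m2 / math.pi) * 100.0

if __name__ == "__main__":
    d = equivalent_diameter_cm(TOTAL_WATER_VOLUME_M3, TOTAL_PIPE_LENGTH_M)
    print(f"Equivalent pipe diameter: about {d:.0f} cm")
```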
The cooling capacity required for air conditioning is divided among four units that feed two separate circuits, each with its own systems, operating at two different temperatures.
ELECTRICITY NETWORK
Machines like Leonardo are inherently energy-hungry, so the electrical system too has been carefully designed, targeting a PUE (Power Usage Effectiveness, the ratio of total facility consumption to IT consumption) estimated at around 1.18: for each watt drawn directly by Leonardo for its calculations, only a fraction more is needed to run the rest of the system, including cooling.
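Spelled out as code, the arithmetic behind a PUE of 1.18 looks like the sketch below; the 1 MW compute load in the example is hypothetical and serves only to illustrate the ratio.

```python
# Minimal illustration of what a PUE of 1.18 means in practice.
# PUE = total facility power / IT power, so the overhead is (PUE - 1)
# watts for every watt drawn by the IT equipment itself.

def facility_power_kw(it_power_kw: float, pue: float = 1.18) -> float:
    """Total datacenter draw for a given IT load at the stated PUE."""
    return it_power_kw * pue

if __name__ == "__main__":
    it_load = 1000.0  # hypothetical 1 MW of compute load
    total = facility_power_kw(it_load)
    overhead = total - it_load
    print(f"IT load {it_load:.0f} kW -> total {total:.0f} kW "
          f"({overhead:.0f} kW for cooling and distribution)")
```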
The electrical power arrives at medium voltage and is stepped down to low voltage by the four independent power stations, then distributed in the room through aluminum busbars positioned above the racks.
The electrical distribution inside the room is provided by two types of busbar, a primary rated at 3,200 amperes and a secondary at 1,000 amperes, fixed to the “copponi” and positioned immediately below the power stations, a layout that also reduces network losses.
Leonardo’s electrical network therefore handles a total of 4,200 amperes, enough to power 9,333 LED bulbs at the same time.
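The bulb comparison can be reproduced with a quick sketch: dividing 4,200 A by 9,333 bulbs implies a per-bulb draw of about 0.45 A, so the example below simply assumes that value; the per-bulb current is an illustrative assumption, not a figure stated in the article.

```python
# Back-of-the-envelope check on the LED-bulb comparison above.
# The 3,200 A + 1,000 A busbar ratings come from the article; the
# per-bulb current is an assumption chosen to illustrate the arithmetic.

PRIMARY_BUSBAR_A = 3200
SECONDARY_BUSBAR_A = 1000
ASSUMED_CURRENT_PER_BULB_A = 0.45   # hypothetical draw of one LED bulb

def bulbs_powered(total_current_a: float, per_bulb_a: float) -> int:
    """How many bulbs the given current rating could feed in parallel."""
    return int(total_current_a / per_bulb_a)

if __name__ == "__main__":
    total = PRIMARY_BUSBAR_A + SECONDARY_BUSBAR_A   # 4,200 A in total
    print(f"{total} A ~ {bulbs_powered(total, ASSUMED_CURRENT_PER_BULB_A)} bulbs")
```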
The first floor of the datacenter also hosts the equipment that guarantees the quality and continuity of Leonardo’s power line: UPS units backed by lithium batteries, which manage the switchover between the electricity grid and the generator sets.
Given the datacenter’s LEED certification, which will be at Gold level, the hosting entity is required to use at least 12% renewable energy: Cineca aims for at least 50%.
More information on Leonardo’s gray space is available at this link.