Throughout the Energy Systems Integration Facility on the campus of the National Renewable Energy Laboratory in Golden, Colo., you will find some of the country’s best and brightest diligently working out answers to the world’s most pressing energy questions.

But, to hear Chris Gaul, P.E., tell it, the inner workings of the ESIF itself are not a reinvention of the hydronic heating and cooling wheel.

“There is no secret chamber where magical things happen,” says Gaul, NREL’s energy manager. “It is just standard commercial items put together in a very thoughtful way.”

The ESIF is a 182,500-sq.-ft. complex featuring 15 laboratories where researchers are working on the development of the smart grid of the future. The information gleaned from the constant, rigorous testing is stored in the supercomputers hosted in the building’s data center.

During the rigorous design-build process leading to the completion of the building in 2012, the NREL set aggressive performance goals for the ESIF, including LEED Platinum certification, energy performance 50% better than ASHRAE 90.1 standards, and a power usage effectiveness (PUE) of 1.06, significantly lower than the industry average of 1.7 to 1.8.
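For reference, PUE is the standard ratio data-center operators use to benchmark overhead:

PUE = total facility energy ÷ energy delivered to the IT equipment

At a PUE of 1.06, only about 6% extra energy goes to cooling, pumps, fans and power distribution for every unit the computers themselves consume; at the industry-average 1.7 to 1.8, that overhead is 70% to 80%.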

Then in August 2014, the NREL went back and made some modifications to the hydronic system. The changes brought that already impressive PUE below the 1.06 design target; in 10 of the 17 months that followed, the monthly figure was under 1.06, according to the NREL.

 

Starting at the source

Putting the data center, and the heat its computers generate, to work was crucial in reaching that minuscule PUE. The NREL partnered with Hewlett-Packard to develop a warm-water liquid-cooling platform that uses water to dissipate the heat generated by the computers. The design relies on water-side economization, so no refrigerant-based chillers or boilers are needed.

To get the water flowing throughout the building, the design (spearheaded by Robert Thompson, chief mechanical engineer with nationwide firm SmithGroupJJR) called for more than 55 pumps to be installed, all from Bell & Gossett, including e-1510 centrifugal, Series 90, Series 80 and Series 60 units. To move the energy between the system’s loops, Bell & Gossett GPX plate-and-frame heat exchangers were selected and installed.

The water that comes out the other side of the servers is then repurposed, when needed, to heat parts of the building.

Tyler Lobb, P.E., preconstruction manager with Westminster, Colo.-based MTech Mechanical, notes the system uses a multitude of water temperatures to cool different spaces of the ESIF, such as the data center, the laboratories and the general office space.

“They had the higher-range cooling water going through the actual data center server racks,” Lobb says. “It is a pretty atypical data center on the HVAC side. It is not something — at that size — that is typically built.”
 

Rebooted piping

The goal of the pipe reworking in August 2014 was to increase energy efficiency by allowing air-cooling coils to work in concert with the cooling-tower system for the supercomputers. Michael Sheppy, P.E., energy engineer with the NREL, says the water is not sent to the air-cooled and water-cooled equipment in parallel; rather, it is sent in series. The first stop along the trip is the air-cooled equipment, then the water-cooled equipment, and then back to the cooling towers.

Sheppy says the change has increased the heat-rejection percentage from 22% to 62%. The temperature of the water coming from the data center increased from 80° F to 95°, and the temperature differential across the energy recovery heat exchangers increased from 0.5° to 10°.
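A back-of-the-envelope illustration shows why that differential matters. At a fixed flow rate, the heat a hydronic loop can carry follows the standard sensible-heat relation (a textbook formula, not one quoted by the NREL):

Q = m × c_p × ΔT

where m is the water’s mass flow rate, c_p is its specific heat and ΔT is the temperature differential. Holding the flow constant, raising ΔT from 0.5° to 10° multiplies the recoverable heat by a factor of 20.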

“The two exciting things we have seen from the repiping is we are able to heat the building during certain times of the year just by what heat is created in the data center,” Sheppy states. “Also, in the summer we are going to dump our waste heat into the campus hot-water loop and use that as a thermal battery.

“That will help us save energy and reduce the load on the campus cooling towers.”

Sheppy says that with a current average PUE below 1.06, the ESIF is the most energy-efficient data center in the world. He says they are making use of absolutely every speck of waste heat and energy that comes off the supercomputers.

“Cooling with water is 1,000 times more effective than cooling with air,” he says.
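The exact multiplier depends on how the comparison is framed, but standard textbook properties (illustrative values, not figures from the article) make the point on a per-volume basis:

water: ~1,000 kg/m³ × 4.18 kJ/(kg·K) ≈ 4,180 kJ per m³ per degree
air: ~1.2 kg/m³ × 1.0 kJ/(kg·K) ≈ 1.2 kJ per m³ per degree

A given volume of water can absorb a few thousand times more heat per degree than the same volume of air, which is why a modest water flow can do the work of an enormous volume of chilled air.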

 

New to the block

Employing a hydronic design for a major data center is the way to go, according to Lobb, Gaul and Sheppy. Each believes that, in due time, more engineers will be receptive to moving forward with a design similar to the ESIF’s.

Lobb believes engineers will come around once sticker shock is no longer a concern.

“I think the biggest issue there is the price point,” Lobb states. “It is new on the data-center side, but I can see it getting to the point in a couple years where it is more regular once they learn how to design it more efficiently. Once that price point comes down, engineers will be looking at the paybacks more as well. I’m not sure how long it will take to get there where it is the norm.”

Gaul concurs that water-based cooling applications will be a major player in the data center market in the coming years.

“That seems to be a trend,” Gaul states, “because that is a big cost factor you can cut out of a data center’s budget. The upshot is in order to reduce a data center’s energy output you have to cool it without using chillers and a lot of secondary power. You want all the watts going into it sorting the ‘1s’ and ‘0s’.

“When you do it this way I do not believe there is much of a cost premium, but there is quite a performance gain.”

One space where Lobb had plenty of room to operate is the mechanical room, and visitors to the ESIF who want to confirm that are more than welcome to do so.

“One of the luxuries that we did have was the size of our mechanical spaces,” he notes. “They give tours at the building. They have large glass walls that showcase the area. They highlight the mechanical rooms during tours to show the stuff they have. That is really different than a typical building.”

 

Staying on their toes

Lobb registers no complaints about the time he worked with the NREL, but the process did provide its share of challenges to overcome. Since the NREL is on the cutting edge of testing the latest technologies for renewable energy, the organization’s staff, and the physical building itself, have to be ready for anything.

“The biggest challenge was the changes that design went through,” he recalls. “We had a client that was trying to construct a building that could facilitate anything and everything they could imagine as far as what they were going to use their lab spaces for. Every week they would think of something else and would say ‘We might, five years down the road, have to do this experiment. We need to accommodate for that.’

“It was a constantly changing design that we had to keep up with. The whole team knew what we were going through. The designers, contractors and vendors, we all knew the drill and how we had to work collaboratively to stay on schedule.”

Lobb — who started on the project as a 27-year-old — is grateful to have played his role in the design and construction of the ESIF and he hopes to work on more projects with such forward-thinking design and scale.

“I always joke that I peaked early in my career,” he says. “This was a one-of-a-kind project. The number and the types of systems that were in this one project, all under one roof, is something I most likely will never see again. It was a great experience. It was riddled with challenges, but at the end of the day we got done on time and had a happy client. Overall, it was a great job to be a part of and I am always thankful I got that opportunity.”

Gaul hopes that fellow engineers and designers make a trip to the NREL campus and get inspired.

“These are things that anyone can do,” he states. “We can show a lot of science and we can show a lot of amazing images. We also can tell builders and engineers to walk through the facility and then they see how things are done. They say, ‘You know, I can do that. That is not so tricky.’”


This article was originally titled “Doubling down” in the March 2016 print edition of PM Engineer (pme).