Some CIOs and data center managers have found themselves having to wring performance out of monstrously deficient facilities. The managers profiled here wrestled with their infrastructures, made major changes and won the day.
Pomona Valley Medical Center CIO Kent Hoyos says he became embroiled in a too-literal version of "Dante's Inferno" when he tried to consolidate data center operations for the 436-bed acute-care hospital, located in Pomona, Calif., near Los Angeles. Escalating heat problems threatened ongoing operations and future technology deployments as temperatures inside the data center began to spike to as much as 102 degrees.
"It was horrific," Hoyos says. "It was about as dire a situation as I think anyone would ever want to have to deal with. We were trying to get as much out of our data center as possible, but what we wound up with was just an awful environment."
The firm that had originally been hired to design the consolidated data center had promised that there would be adequate cooling, but by the time temperatures began to increase, the firm had gone out of business and left Pomona with an untenable working environment.
The hospital was using two 5-ton air conditioners to cool the data center, but the increasing heat generated by the tightly spaced server installation began to push typical temperatures inside the data center past 90 degrees. Two portable air-conditioning units were placed on the floor, and a hole was cut into the facility's Plexiglas window to allow for the installation of a third air conditioner. But even then the heat persisted, so multiple box fans were hung at various points along the ceilings in an attempt to reduce problems at specific hot spots.
"It was one of the most ridiculous things you've ever seen," Hoyos says. "We simply did not have the infrastructure necessary to move forward."
Alas, it was not a divine comedy for Pomona. The hospital attempted to add a Picture Archiving and Communication System (PACS) for digital radiology, which required a large SAN and archiving platform that, Hoyos says, "were just like putting a furnace in the room."
When one of the air conditioning units failed, the data center saw temperatures exceed 100 degrees, leading to the loss of several hard drives and a lab system. In all, more than US$40,000 worth of equipment was damaged, and Hoyos' IT staff was besieged by help desk calls.
Space limitations made it impossible to add more large air conditioners, and a raised floor only 6 inches high also limited options. Hoyos investigated using chilled-water solutions but was reluctant to introduce water in his data center, particularly since the small floor space would mean the cooling units would have to be mounted above the servers. He instead began working with Emerson Network Power's Liebert subsidiary and decided to install its XD high-density cooling systems.
Twenty Liebert XDV cooling units and two XDP pumping units were installed inside the facility to supplement the existing 5-ton units. The XD systems were mounted directly on top of the hot server racks, providing cooling of up to 500 watts per square foot. The units circulate a refrigerant that is pumped to the overhead modules as a liquid and vaporizes there, absorbing heat as it changes phase.
In total, the new cooling units bought the facility the equivalent of 44 tons of air conditioning, allowing Hoyos to transform his previously sizzling environment to a more stable and comfortable 66 degrees.
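For readers unused to HVAC units, a "ton" of cooling is a standard measure equal to 12,000 BTU/hr, or roughly 3.5 kW of heat removal. A minimal sketch of what the article's tonnage figures mean in those terms (the conversion constant is the standard rule of thumb, not a figure from the article):

```python
# Rough conversion of refrigeration tons to kilowatts of heat removal,
# using the standard definition: 1 ton = 12,000 BTU/hr ~= 3,517 W.
WATTS_PER_TON = 3_517  # approximate; 12,000 BTU/hr expressed in watts

def tons_to_kw(tons: float) -> float:
    """Convert refrigeration tons to kilowatts of heat removal."""
    return tons * WATTS_PER_TON / 1_000

# The two original 5-ton units vs. the 44 tons the Liebert XD gear added:
original_kw = tons_to_kw(2 * 5)   # ~35 kW
added_kw = tons_to_kw(44)         # ~155 kW

print(f"Original capacity: {original_kw:.0f} kW; added capacity: {added_kw:.0f} kW")
```

Seen this way, the retrofit more than quadrupled the room's heat-removal capacity, which is why temperatures fell from triple digits to the mid-60s.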
"All of a sudden, the help desk got quiet," he says. "The problems that we were experiencing because of the un-optimized operating condition just ceased."