News Release
Thinking Inside the Box: UC San Diego Opts for Eco-Friendly Datacenters
Sun Modular Datacenter Can Cost Up to 40% Less to Cool
San Diego, CA, May 16, 2008 -- It's not often that researchers at the University of California, San Diego are asked to think inside the box, but two large shipping containers installed on the campus yesterday hold great promise for scientists in the form of advanced data storage and energy savings.
Each high-density, 200-kilowatt-capacity datacenter-in-a-box will not only meet the need for robust data storage; its eco-friendly design is also expected to reduce cooling costs by up to 40 percent compared to a traditional brick-and-mortar server room.
"The ultra high bandwidth network that we have at UCSD makes it possible to move vast amounts of data to and from these modular data centers, and the fact that the units are green was a key factor in this procurement decision," said Elazar Harel, Assistant Vice Chancellor for ACT. "The flexibility and agility of the units will allow us to expand and grow in the future, and we're proud of the fact that this was a joint partnership between Calit2, the School of Medicine and ACT. If these datacenters prove to be effective, we will consider deploying them elsewhere on campus in place of traditional server rooms."
Sustainability in a box
To eliminate the need for air conditioning, each Sun Modular Datacenter's closed-loop water-cooling system uses built-in heat exchangers between equipment racks to channel air flow. This allows the unit to cool 25 kilowatts per rack, compared to the 4- to 6-kilowatt cooling capacity of typical datacenters. The industry-standard racks can also be placed close together, further reducing the structure's overall eco-footprint and increasing energy efficiency by eliminating dead space. In addition, with more than 34 billion possible server layout combinations, the datacenters are re-configurable and scalable. Such flexibility could eventually allow the units to serve as a test environment for a variety of environmental variables, such as temperature and humidity.
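The cooling figures above imply a substantial density advantage; a back-of-the-envelope sketch (using the midpoint of the 4- to 6-kilowatt range cited in the release) illustrates the comparison:

```python
# Rough cooling-density comparison based on figures cited in the release.
MODULAR_KW_PER_RACK = 25    # Sun Modular Datacenter, per the release
TYPICAL_KW_PER_RACK = 5     # midpoint of the 4-6 kW range for typical datacenters

density_gain = MODULAR_KW_PER_RACK / TYPICAL_KW_PER_RACK
print(f"Each modular rack can dissipate about {density_gain:.0f}x "
      f"the heat load of a typical datacenter rack")
```

In other words, one water-cooled rack in the container handles roughly the heat load of five conventional racks.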
Tad Reynales, technology infrastructure manager for Calit2, compared the containers to the controlled environment of a high-tech wine cellar, where equipment can be stored under conditions that maximize performance.
"There's an opportunity here to have your 'red wines' at one temperature, and your 'white wines' at another," said Reynales. "And the units are primarily designed to be accessed remotely in what we call 'lights out management.' Since you don't have to maintain a noise level or a comfort level, it allows for some experimentation with environmental parameters."
"Space is a huge issue at this university, and I think this is one solution for part of the problem," said Charlotte Klock, executive director for IT infrastructure at ACT. "Other UC campuses are looking at us now, waiting to see if this project really pays off."
Only two other universities in the world currently use Sun Modular Datacenters to store data: Stanford University and the Netherlands' Radboud University. Stanford's Linear Accelerator Center had its first container delivered fully loaded with server equipment; it weighed in at 23,400 pounds. UCSD's units, which weren't yet equipped with servers, weighed in at 16,500 pounds each. ACT expects the boxes to be equipped and ready to stream data live within a few weeks.
Lots of storage in a little space
Each of UCSD's two Sun Modular Datacenters can hold as many as 280 servers, providing up to 18 terabytes of active memory and more than three petabytes of disk storage. To put that in perspective, the entire Library of Congress – some 40 million books and 130 million documents on 535 miles of shelf space – amounts to roughly 0.02 petabytes, a tiny fraction of one petabyte.
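The scale comparison above can be made concrete with simple arithmetic on the figures cited in the release:

```python
# Back-of-the-envelope scale comparison from figures cited in the release.
UNIT_DISK_PB = 3.0    # more than 3 petabytes of disk per container
LOC_PB = 0.02         # approximate size of the Library of Congress, per the release

libraries_per_unit = UNIT_DISK_PB / LOC_PB
print(f"One container can hold roughly {libraries_per_unit:.0f} "
      f"Libraries of Congress worth of data")
```

That is, a single container's disk capacity equals roughly 150 copies of the Library of Congress.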
Gill says that the School of Medicine and the Skaggs School of Pharmacy and Pharmaceutical Sciences will use their unit to store microscopy, genetics and genomics data – especially genetic and epigenetic data obtained by researchers based at UCSD, UCSD's branch of the Ludwig Institute for Cancer Research, the Moores Cancer Center and the La Jolla Institute for Allergy and Immunology, which is located in UCSD's Science Research Park.
Researchers from Calit2's Collaborative Cyberinfrastructure for Advanced Microbial Ecology Research and Analysis (CAMERA) plan to store marine metagenomics data – including DNA sequences of ocean microbes and associated environmental metadata – in the Calit2 datacenter. The institute's CineGrid project is expected to archive 4K digital video data in the boxes. (4K is a digital cinema format offering four times the resolution of today's highest-resolution HD television standard.) The Calit2-based OptIPuter project is also eyeing the new datacenter as a location to store netflow data, or to store extra data as the OptIPuter network goes from today's 10-Gigabit-per-second (Gbps) to a potentially 100-Gbps network. For now, though, OptIPuter researchers say their primary focus will be to analyze the data that others are storing for environmental variables.
The Jacobs School of Engineering and Scripps Institution of Oceanography are reportedly also considering hosting their data in one of the units, and the San Diego Supercomputer Center's Philip Papadopoulos – a co-principal investigator on the OptIPuter project – will play a critical role in getting the Sun Modular Datacenters to fit into the wider UCSD network.
To gauge how the datacenter would perform in an earthquake or while on a cargo ship on the high seas, researchers from the Jacobs School's seismic test facility conducted rigorous testing of the datacenter's anti-vibration system last year. Even with several computer applications running, the unit withstood a magnitude 6.7 earthquake (equivalent to the devastating quake that rocked Kobe, Japan, in 1995) and sustained only a few loose screws. The unit's performance in the earthquake simulation bodes well for the university's plans to use the datacenters as back-up storage to help mitigate data loss in the event of a disaster.
The specific location of the datacenters was partially determined by the proximity of chilled water and electrical sources. Project manager Buck Wilmerding said it took about three months to prepare the site, which included running pipes and electrical wiring, building the foundation, and cutting down a single diseased tree to make way for a healthy one – in keeping with the project's eco-friendly roots.