Society’s increasing dependence on and consumption of data makes it necessary to figure out innovative ways to store it. As part of an initiative called Project Natick, Microsoft deployed a shipping-container-size data center prototype in Scotland that’s processing workloads and could eventually provide ultrafast cloud-based services to people in coastal cities.
Overall, Project Natick aims to explore methods of manufacturing and operating environmentally sustainable, prepackaged data centers that clients can order in desired sizes, send to the proper locations quickly and run for years underwater without human intervention.
This Technology Could Serve Most of the World
More than half the world’s population lives within 120 miles of the coast, which is one reason why Microsoft is putting so much energy into Project Natick. That proximity means that data stored beneath the surface has a short distance to travel.
So, people who rely on an underwater data center could benefit from fast, low-latency connections for browsing the web, streaming content, playing games and interacting with artificial-intelligence (AI) applications.
A Use for Renewable Energy
Microsoft chose the Orkney archipelago in Scotland, a group of about 70 islands, for its underwater data center trial, and the choice was strategic: Orkney has the highest concentration of small and micro wind turbines of any U.K. county. Despite a population of approximately 10,000, the county hosts more than 500 domestic turbines, which supply more than enough electricity for its residents.
Orkney’s early and abundant adoption of wind energy is a factor that likely compelled Microsoft to select that location. After all, it’s wind power, combined with solar, wave and tidal energy, that generates the electricity for the data center, thereby meeting the renewable-energy goal for this project.
No Maintenance Needed
One question on many people's minds is how Microsoft will maintain an underwater data center. The equipment it used for Project Natick is self-sustaining and designed to work for five years without maintenance. Instead of relying on technicians, AI technology will monitor the equipment for signs of failure and draw conclusions about environmental conditions and the longevity of the servers. The 12 racks in this data center contain 864 standard servers, collectively providing enough storage for about five million movies and processing power equal to that of several thousand high-end consumer PCs.
The researchers on the Microsoft team think eliminating maintenance is not a far-fetched goal. Because the sealed vessel contains no people, Microsoft could remove all of the oxygen and most of the water vapor from its internal atmosphere. Doing so reduces corrosion, a major problem that limits the lifespan of equipment in traditional land-based data centers. If a server does break down, however, repairing it isn't an option.
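The article doesn't describe Microsoft's monitoring software, but the idea of watching telemetry for signs of failure can be sketched as a simple threshold check. This is a hypothetical illustration; the sensor names and limits are invented, not Project Natick's actual telemetry.

```python
# Hypothetical sketch of threshold-based health monitoring.
# Sensor names and limits are illustrative assumptions, not
# Project Natick's real telemetry or thresholds.
LIMITS = {"temp_c": 35.0, "humidity_pct": 10.0, "vibration_g": 0.5}

def failing_sensors(reading):
    """Return the names of sensors whose readings exceed their limits."""
    return [name for name, limit in LIMITS.items()
            if reading.get(name, 0.0) > limit]

reading = {"temp_c": 36.2, "humidity_pct": 4.1, "vibration_g": 0.1}
print(failing_sensors(reading))  # ['temp_c']
```

A real system would add trend analysis and prediction on top of checks like this, since the goal is to anticipate failures rather than merely report them.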
Spending Fewer Resources on Cooling and Other Energy Needs
Scotland has a growing aquaculture industry, with Atlantic salmon accounting for about 90 percent of that industry's economic impact, and analysis indicates that Scottish fish farms are important to the world's food supply. The surrounding marine environment was likely another consideration for Microsoft's engineers as they planned their unconventional data center. Research indicates that cooling equipment can account for more than 40 percent of a data center's total energy consumption.
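That 40 percent figure is often discussed through power usage effectiveness (PUE), the industry's standard ratio of total facility energy to the energy consumed by the IT equipment itself. A minimal sketch of the arithmetic, using illustrative numbers rather than Project Natick measurements:

```python
def pue(total_kwh, it_kwh):
    """Power Usage Effectiveness: total facility energy / IT energy.
    A value of 1.0 would mean every watt goes to computing."""
    return total_kwh / it_kwh

# Illustrative land-based facility where cooling is 40% of total energy
# (numbers are hypothetical, chosen only to show the calculation).
it_kwh = 600.0       # energy consumed by servers
cooling_kwh = 400.0  # energy consumed by cooling
print(round(pue(it_kwh + cooling_kwh, it_kwh), 2))  # 1.67
```

Submerging the servers in cold seawater aims to shrink the cooling term, pushing the ratio closer to 1.0.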
People working on Project Natick can only theorize about cooling costs at this stage, but the expectation is that the surrounding seawater will sharply reduce the energy that cooling equipment requires. The Orkney data center is also what's called a lights-out facility, meaning one that operates without on-site staff. Its isolated, sealed environment should limit the fluctuations that would otherwise drive up energy usage.
Microsoft representatives say the cold-aisle temperature in the data center is around 53°F (about 12°C), and they don't anticipate significant deviations. If so, the environment could stay consistently cool enough for the machinery to run optimally while avoiding temperature-related stress. And because the data center has no human workers, it eliminates the expense of lighting rooms well enough for people to work safely and of controlling the temperature around doorways.
A Scalable Solution
Scalability is one of the first topics customers raise when evaluating a data center provider. The Orkney data center fits inside a single container, but engineers envision connecting several such modules to scale capacity as needed, with a central node supplying power and connectivity, much as rows of racks share infrastructure in a traditional data center.
Fast Deployment Through a Multinational Effort
Project Natick is in phase two following an initial effort in 2016 that involved a 105-day trial of a data center submerged in 30 feet of water off the California coast. That system had 100 sensors monitoring environmental conditions such as humidity and motion. The latest attempt started when a French company assembled the data center and shipped it to Scotland on a flatbed truck. On arrival, it was attached to a ballast-filled base. Operators used 10 winches and a crane to move the data center to the desired spot in the ocean and lower it almost 120 feet to the seafloor. A remotely operated vehicle then brought the data center's fiber-optic and power cables to the surface for testing before they were attached in the proper place.
Even though the process sounds complex, the deployment time for this kind of data center is less than 90 days from the time it ships from a factory to when it becomes functional. That short turnaround could make underwater data centers attractive to clients who want to meet demand quickly and stay flexible in competitive markets.
The underwater data center's vessel was deployed on June 1, 2018, at the European Marine Energy Centre, and the researchers involved intend to provide periodic updates. If the experiment succeeds, it could permanently change how people think about the makeup and location of data centers. Using the ocean as well as land suits the world's growing consumption of data and the need to store and process it.
About the Author
Kayla Matthews is a technology writer and reporter, contributing to websites like VentureBeat, VICE, MakeUseOf and TechnoBuffalo. Visit ProductivityBytes.com to read more recent posts by Kayla.