The NSA Would Like Your Sewage

When residents of Howard County, Maryland, flush their toilets, their sewage will soon end up at the NSA’s new computer center several miles away. Collecting and storing so much data has been generating a whole lotta heat for the NSA—we mean this quite literally—and the agency’s now buying treated wastewater to cool their equipment.

Microsoft Wants to Power Its Data Centers With Fuel Cells

Data centers are some of the most power-hungry pieces of infrastructure that exist today, but Microsoft has plans to make them a little greener—by powering its racks with built-in fuel cells.

Facebook building $1.5 billion data center in Altoona, Iowa

Facebook has already set up shop in North Carolina and Oregon, but it’s heading to Iowa for its next — and biggest — data center. According to the Des Moines Register, the town of Altoona will be home to a 1.4-million-square-foot facility (code-named Catapult), and it will reportedly be the “most technologically advanced center in the world.” Why Altoona, you ask? The city is already home to several data hubs, as its fiber-optic cable system, access to power and water utilities and affordable land are big draws for companies. Facebook will complete project Catapult in two $500 million phases, though the entire cost will reportedly ring in at $1.5 billion. The social network is also seeking wind energy production tax credits, which is no doubt connected to its Open Compute Project for promoting energy efficiency. That’s all we know so far; suffice it to say, a center this big won’t be built overnight.


Via: TechCrunch

Source: Des Moines Register

Facebook launches real-time graphs to highlight its data center efficiency

Curious as to the effect that your poking wars are having on the planet? Facebook is publishing power and water usage data for its Oregon and North Carolina data centers to show off its sustainability chops. The information is updated in near-real time, and the company will add its Swedish facility to the charts as soon as it’s built. The stats for the Forest City, NC plant show a very efficient power usage effectiveness (PUE) ratio of 1.09 — thanks, in part, to that balmy (North) Carolina air.
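For context, PUE is simply total facility power divided by the power that actually reaches the IT equipment, so 1.09 means only about 9 percent overhead for cooling, power distribution and lighting. A minimal sketch of the arithmetic in Python (the kilowatt figures are hypothetical, chosen only to reproduce Facebook’s reported ratio):

```python
# Minimal sketch of the PUE (power usage effectiveness) calculation.
# The kW figures are hypothetical, picked to land on the 1.09 ratio
# Facebook reports for Forest City; they are not real meter data.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (1.0 is ideal)."""
    return total_facility_kw / it_equipment_kw

it_load_kw = 10_000   # servers, storage and network gear
overhead_kw = 900     # cooling, power distribution, lighting
print(f"PUE: {pue(it_load_kw + overhead_kw, it_load_kw):.2f}")  # PUE: 1.09
```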


Via: GigaOm

Source: Facebook, Open Compute Project

Apple says it now gets 75 percent of its total energy from renewable sources

Based on the latest reports, the company once chided for making too large an impact on Mother Earth is now claiming that a full 75 percent of its energy is being sourced from renewables. Apple’s chief financial officer, Peter Oppenheimer, informed Reuters this week that all of its data centers — including the gargantuan facility in Maiden, North Carolina — are now fully powered by renewable energy from onsite and local sources, while three-fourths of the energy used by the whole company is pulled from green sources. For those wondering, that includes solar, wind, hydro and geothermal, and the 75 percent mark is a stark 40-percentage-point jump from just two years ago. As for what the future holds? According to Apple: “We won’t stop working until we achieve 100 percent throughout Apple.” Alrighty then.


Via: Reuters, Fortune

Source: Apple

HGST’s Nanotechnology Printing Breakthrough Is Great News For Data Center Storage And HDD Capacity

If you’re at all familiar with mobile processors, you’ve likely heard a lot about 32nm vs. 28nm construction when comparing the current generation of chips from companies like Qualcomm and others. That figure refers to the size of the smallest features that can be patterned on the chip: a smaller number means more transistors fit in less space, which yields more efficient processing at lower power consumption.

Currently, it’s hard to get much below 20nm when creating individual patterns for data storage on today’s disk drives, which are another area, in addition to processors, where Moore’s Law applies. Today, though, HGST, a Western Digital company, announced a breakthrough that allows it to produce patterns as small as 10nm via a process called “nanolithography,” meaning it can essentially double the maximum storage capacity possible in a hard disk drive of the same physical size.
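How a 10nm pattern translates into that capacity figure isn’t spelled out, so here’s a hedged back-of-the-envelope reading: if the shrink from 20nm to 10nm halves the pattern pitch along one dimension, density doubles, which matches the stated claim; halving it along both dimensions would quadruple it instead.

```python
# Back-of-the-envelope sketch of the capacity claim. The assumption
# (mine, not HGST's published math): the "doubling" comes from halving
# the pattern pitch along one dimension only.

old_pitch_nm, new_pitch_nm = 20, 10

one_axis = old_pitch_nm / new_pitch_nm          # pitch halved in one direction
both_axes = (old_pitch_nm / new_pitch_nm) ** 2  # pitch halved in both

print(f"one-axis scaling: {one_axis:.0f}x capacity")   # 2x, matching the claim
print(f"two-axis scaling: {both_axes:.0f}x capacity")  # 4x upper bound
```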

HGST’s process, developed in tandem with Austin, Texas-based silicon startup Molecular Imprints, Inc., doesn’t use the prevailing photolithography technology, whose minimum feature size is limited by the wavelength of light. Sidestepping that limit is what allows HGST to reach the 10nm threshold, and hopefully to push beyond it in time, HGST VP of Research Currie Munce told me in an interview.

The upshot of all this is that HGST hopes to have the process ready for wide-scale commercial production by the end of the decade, and to make the resulting storage both affordable and dependable enough to be used widely by customers who need ever-increasing amounts of storage. The number of customers who fit that description is growing rapidly, too: the advent and rising popularity of cloud services means that big companies like Facebook, Apple and Amazon are continually building and expanding data centers in search of greater storage capacity. HGST’s nanolithography process could double the storage capacity per square foot at any of those facilities without having the same effect on power requirements, which is clearly an attractive proposition.

While the process looks well-suited to disk-based storage, where redundancies and workarounds can account for minor imperfections at the microscopic level, Munce says that HGST’s nanolithography is less well-suited to the task of creating mobile processors for smartphones like those mentioned above.

“If you don’t connect the circuits properly on a processor, it doesn’t work at all,” he explained. “On a hard disk drive, we can always have error correcting codes, we can always use additional signal processing to cover up a few defects in the pattern that’s created.”
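To make Munce’s point concrete, here is a toy single-error-correcting Hamming(7,4) code in Python. Actual drives use far stronger codes, so treat this purely as a sketch of the principle: built-in redundancy lets the controller repair a bit flipped by a defect in the pattern.

```python
# Toy Hamming(7,4) code: 4 data bits plus 3 parity bits, able to detect
# and repair any single flipped bit per 7-bit block. Real HDDs use much
# stronger codes; this only illustrates the principle Munce describes.

def hamming74_encode(d):
    """Encode 4 data bits; parity bits sit at positions 1, 2 and 4."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):
    """Locate and fix a single bad bit via the parity syndrome."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the error, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1          # repair the flipped bit
    return [c[2], c[4], c[5], c[6]]

block = hamming74_encode([1, 0, 1, 1])
block[4] ^= 1                          # simulate a media defect flipping a bit
assert hamming74_decode(block) == [1, 0, 1, 1]  # data recovered intact
```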

Still, for HDDs and computer memory (RAM), HGST’s breakthrough could have a massive impact on cloud computing, mobile devices and the tech industry as a whole, and all within the next five to six years.

16 crazy things we learned about Google’s data centers

Google’s data centers are secretive, massive locations filled with cutting-edge proprietary technology, so most Google fans will never get a chance to step inside one. But Google knows how amazing its data centers are, and it recently put up a website with a complete Street View of its North Carolina data center while spilling a whole lot of details to Wired reporter Steven Levy. Of course, no valuable trade secrets were disclosed. So what did we actually learn about Google’s data centers?


Google takes us inside their data centers, shows you where the internet lives (video)

Ever fancied a look inside one of Google’s cavernous server farms? Given the security issues, the company isn’t likely to just let anyone mooch around — but understands if you’re curious. That’s why it’s adding a special collection to its Street View data that lets you wander inside without a big trek to Iowa, Belgium or Finland. If you’d like to sample some of the delights, you can check out our gallery or head down past the break to get a video tour of the facility in Lenoir, NC.

[Image Credit: Connie Zhou / Google]


Source: Google, Where The Internet Lives

Researchers turn to 19th century math for wireless data center breakthrough

Researchers from Microsoft and Cornell University want to remove the tangles of cables from data centers. It’s no small feat: with thousands of machines that need every bit of bandwidth available, WiFi certainly isn’t an option. To solve the issue, the scientists are turning to two sources: the cutting edge of 60GHz networking and the 19th century mathematical theories of Arthur Cayley. Cayley’s 1889 paper, On the Theory of Groups, was used to guide their method for connecting servers in the most efficient and fault-tolerant way possible.

The proposed Cayley data centers would rely on cylindrical server racks with transceivers both inside and outside the tubes of machines, allowing them to pass data both among and between racks with (hopefully) minimal interference. Since the new design would do away with traditional network switches and cables, the researchers believe it may eventually cost less than current designs and draw less power, all while still streaming data at 10 gigabits per second, far faster than WiGig, which also uses the 60GHz spectrum. The findings will be presented in a paper later this month, but it won’t be clear how effectively this research can be applied to an actual data center until someone funds a prototype. To read the paper in its entirety, check out the source.
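For the curious, a Cayley graph takes a group’s elements as vertices and links each element to the elements reachable by applying a fixed set of generators. The Python sketch below builds one over the cyclic group Z_n and measures its worst-case hop count; the group, generators and sizes here are my own illustrative choices, not the topology from the Microsoft/Cornell paper, which derives its layout from the racks’ 60GHz reach.

```python
# Sketch of a Cayley graph over the cyclic group Z_n: vertices are
# servers, and server v links to v +/- g (mod n) for every generator g.
# Multiple generators give multiple disjoint routes between servers,
# which is the source of the design's fault tolerance.
from collections import deque

def cayley_graph(n, generators):
    """Adjacency dict for Z_n with the given generator set."""
    return {v: sorted({(v + g) % n for g in generators} |
                      {(v - g) % n for g in generators})
            for v in range(n)}

def diameter(graph):
    """Longest shortest path over all pairs, via BFS from each vertex."""
    worst = 0
    for src in graph:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in graph[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        worst = max(worst, max(dist.values()))
    return worst

racks = cayley_graph(64, generators=[1, 8, 27])
print(len(racks[0]), "links per server; diameter:", diameter(racks))
```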


Via: Wired

Source: On the Feasibility of Completely Wireless Datacenters (PDF)

Google data center in Oklahoma to get 48MW of wind power, boost renewable energy in the Sooner state

Google has made a point of relying on renewable resources for its data centers whenever possible, even down to the cooling. It hasn’t had quite as unique an arrangement as what it’s planning for its data center in Oklahoma, though. The search firm wants to supply its Mayes County location with 48MW of wind energy from Apex’s Canadian Hills Wind Project, but it isn’t buying power directly from the source. Instead, it’s making a deal with the Grand River Dam Authority, a utility, to purchase the clean power on top of what’s already supplied from the GRDA at present. The deal should keep the data center on the environmentally friendly side while giving it room to grow. Wind power will come online at Google’s facility once the Canadian Hills effort is up and running later in 2012; hopefully, that gives us enough time to better understand why there’s a Canadian River and Canadian Hills to be found in the southern United States.


Source: Google Official Blog