Google Buzz’s last gasp: Drive swallows final bits of lost social network

Google Buzz was shut down back in 2011, but Google is still cleaning up the service's remnants. The company has notified users via email that all of their Buzz data will be moved to a folder in their Google Drive accounts. The migration begins July 17, when users will start seeing their Buzz data appear in Google Drive.


The transfer will produce two sets of files in Google Drive. The first is a snapshot of the public and private Buzz posts you have made, saved to a private Drive folder. The second set will be public by default, consisting of all your public Buzz posts, viewable by anyone with the files' links.

As for comments, those will also be migrated to Google Drive, but they will be saved to the Drive of the user whose post they appeared on, not the commenter's. This could create a little friction, since commenters won't have any control over who sees a migrated post that contains one of their comments, so Google recommends deleting your Buzz content now, before the migration happens.

Google also says that all Buzz files will be treated "the same as any other Drive file," so you'll be able to do whatever you want with them, just as with regular Drive files. The added Buzz files won't count against your storage limit, and you can easily delete them if you have no need for them.

Google Buzz originally launched in February 2010, and while early adopters gave it the benefit of the doubt, it didn't stand the test of time. Few people used it, and it offered nothing that Facebook and Twitter didn't already. Google shut down Buzz in 2011 to little chagrin, and eventually launched another social network, Google+, which is the company's current social offering and seems to be doing fairly well among a niche audience.

VIA: The Next Web


Google Buzz’s last gasp: Drive swallows final bits of lost social network is written by Craig Lloyd & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Researchers achieve world record in wireless data transmission, seek to provide rural broadband


Speed. It’s a movie. It’s a drug. And it’s also something that throngs of internet users the world over cannot get enough of. Thankfully, the wizards at the Fraunhofer Institute for Applied Solid State Physics and the Karlsruhe Institute of Technology have figured out a way to satisfy the unsatisfiable, announcing this week a world record in wireless data transmission. Researchers achieved 40 Gbit/s at 240 GHz over a distance of one kilometer, essentially matching the capacity of optical fiber… but, you know, without the actual tether.

The goal here, of course, isn’t to lower your ping times beyond where they are already; it’s to give rural communities across the globe a decent shot at enjoying broadband. Distances of over one kilometer have already been covered by using a long range demonstrator, which the Karlsruhe Institute of Technology set up between two skyscrapers as part of the project “Millilink”. There’s no clear word on when the findings will be ported over to the commercial realm, but given the traction we’re seeing in the white spaces arena, we doubt you’ll have to wait long.
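To put 40 Gbit/s in perspective, a quick back-of-the-envelope calculation helps. The 25 GB payload below (roughly a dual-layer Blu-ray) is an illustrative assumption, not a figure from the research, and real throughput would be lower once protocol overhead and retransmissions are counted:

```python
# Idealized transfer time over the demonstrated 40 Gbit/s link.
# The 25 GB payload is an assumption for illustration only.

LINK_RATE_GBPS = 40.0   # demonstrated rate, gigabits per second
PAYLOAD_GB = 25.0       # hypothetical payload, gigabytes

def transfer_seconds(payload_gigabytes: float, rate_gbps: float) -> float:
    """Ideal transfer time, ignoring protocol overhead and error correction."""
    return payload_gigabytes * 8 / rate_gbps

print(transfer_seconds(PAYLOAD_GB, LINK_RATE_GBPS))  # 5.0 seconds
```

In other words, a link like this could in principle move a full Blu-ray in about five seconds.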



Via: Physorg

Source: Fraunhofer Institute for Applied Solid State Physics

Nissan plans to make Leaf data available to app developers


Those of you familiar with the Nissan Leaf will know about its Carwings system, which lets you check the vehicle’s charge, turn on the AC, rate your driving efficiency against others and even read RSS feeds out loud — all over an always-on cellular data connection. In fact, the RSS functionality raised some privacy concerns when it was discovered that Carwings embeds location and other data in the URL it sends to public servers (something that can thankfully be disabled by the owner). Nissan announced today that it plans to make telemetry data from the Leaf available to third-party developers for a fee — with the owner’s consent, of course. The company already uses telemetry data for vehicle maintenance and products like Carwings, but it hopes to broaden the ecosystem with apps. Examples include smart-grid integration (supplying power to a building for a reduced parking fee) and location-based services (real-time coupons as you drive by restaurants). It’ll be interesting to see if there’s enough interest from both developers and Leaf owners for Nissan to successfully monetize this idea.



Source: Nikkei (subscription required)

At I/O, Google Will Be Tracking Things Like Noise Level And Air Quality With Hundreds Of Arduino-Based Sensors


If you’re attending Google I/O this week, you will be part of an experiment by the Google Cloud Platform Developer Relations team. On its blog today, the team outlined its plan to gather a variety of environmental data around you as you meander around the Moscone Center.

In the blog post, Michael Manoochehri, Developer Programs Engineer, outlines his team’s plan to place hundreds of Arduino-based environmental sensors around the conference space to track things like temperature, noise levels, humidity and air quality in real time. The project grew out of a curiosity about which areas of the conference were the most popular, so it will be interesting to see what the gathered data actually tells us.
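Google hasn't published the motes' code, but the data flow the post describes can be sketched roughly. Everything here, from the sensor names and value ranges to the payload shape, is a hypothetical illustration rather than the team's actual setup:

```python
import json
import random
import time

def read_mote(mote_id: str) -> dict:
    """Simulate one environmental sample (a real mote would read Arduino pins)."""
    return {
        "mote_id": mote_id,
        "timestamp": time.time(),
        "temperature_c": round(random.uniform(18.0, 27.0), 1),
        "noise_db": round(random.uniform(40.0, 90.0), 1),
        "humidity_pct": round(random.uniform(30.0, 60.0), 1),
    }

def batch_payload(samples: list) -> str:
    """Serialize a batch of samples for upload to a collection endpoint."""
    return json.dumps({"samples": samples})

# Three motes report once; in practice the payload would be POSTed to a cloud backend.
print(batch_payload([read_mote(f"mote-{i}") for i in range(3)]))
```

Batching readings like this is the usual pattern for low-power sensor networks: it keeps radio chatter down while the backend aggregates everything in one place.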

At first glance, this seems a little bit creepy, but it’s no different than a venue adjusting the cooling system based on the temperature inside at any given moment. As with anything that Google does, this could have implications for tracking indoor events or businesses in the future, as Manoochehri shared:

Networked sensor technology is in the early stages of revolutionizing business logistics, city planning, and consumer products. We are looking forward to sharing the Data Sensing Lab with Google I/O attendees, because we want to show how using open hardware together with the Google Cloud Platform can make this technology accessible to anyone.

Notice the wrap-up of wanting to show people how open hardware combined with Google’s Cloud Platform benefits everyone. Ok, sure. What could data like this mean for businesses, though? Well, a clothing store would be able to track how many people came in and browsed, which areas of the store were hot-spots for interest and then figure out how their displays converted. It’s like real-world ad-tracking. It makes sense, but still seems a long way off.

What will be interesting is not each dataset that is collected, but what all of them tied together tell us about our surroundings:

Our motes will be able to detect fluctuations in noise level, and some will be attached to footstep counters, to understand collective movement around the conference floor.

Of course, none of this information is personally identifiable, but the thought of our collective steps, movements and other ambient output being turned into something usable by Google is intriguing to say the least…and yes, kind of creepy.

If this particular team can share all of the data it collects in an easy-to-digest way, then businesses will be clamoring to toss sensors all over their stores and drop the data on whichever cloud platform will host it most cheaply. Google would like to be that platform.

During the event, the team will hold a workshop on what it calls the “Data Sensing Lab,” so if you’re interested in learning more about what the team is gathering as you walk around, this is the place to go. You’ll also be able to see some of the real-time visualizations on screens set up throughout the conference floor.

We’ll be covering all of the action as we’re being covered by Google.

ESPN streaming content subsidization: mobile carriers mull partial payment

Now that most carriers have moved to data caps and effectively done away with unlimited data plans, it seems some big companies still feel for users, ESPN among them. The sports media network has reportedly been in talks with at least one major carrier about paying a subsidy so that streamed ESPN video wouldn’t count against users’ data caps.


According to the Wall Street Journal, ESPN has been discussing possible solutions to the data cap issue, though the specific carriers said to be in talks with ESPN haven’t been disclosed. Essentially, ESPN wants to pay carriers so that its mobile streaming content doesn’t count against users’ monthly data caps, since many data plans allow only a few hundred megabytes per month.

Even if these discussions are happening, whether such a deal actually materializes is still up in the air, and ESPN knows that. The company says it isn’t sure the deal would work financially, but at least it’s giving it a try, and it goes to show that not all companies are evil; some actually want to make life easier, even if doing so also helps the company.

However, data caps can also have a direct negative effect on companies like ESPN, which partially rely on mobile ad revenue. With data caps in place, smartphone users are less likely to stream mobile content, knowing it will obliterate their monthly allotment in no time. So a deal like this would benefit not only ESPN, but also ESPN’s customers.
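A rough calculation shows why. Assuming a 2 Mbit/s mobile video stream and a 300 MB cap (both illustrative figures, not ESPN's or any carrier's actual numbers), the cap is gone in about twenty minutes:

```python
def streaming_minutes_until_cap(cap_megabytes: float, stream_mbps: float) -> float:
    """Minutes of streaming before a data cap is exhausted (no overhead counted)."""
    cap_megabits = cap_megabytes * 8
    return cap_megabits / stream_mbps / 60

# 300 MB cap at a 2 Mbit/s stream: both values are assumptions for illustration.
print(streaming_minutes_until_cap(300, 2))  # 20.0 minutes
```

At that rate, a single quarter of a basketball game would blow through the whole month's allotment.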

This kind of deal could also bring in more money for carriers. While they would have to give up a portion of their bandwidth, they would be paid by content providers on a monthly or yearly basis (or possibly under a multi-year contract of sorts). The WSJ reports that both Verizon and AT&T are at least interested in the concept, which would introduce new sources of revenue without having to hike fees for customers.

SOURCE: Wall Street Journal


ESPN streaming content subsidization: mobile carriers mull partial payment is written by Craig Lloyd & originally posted on SlashGear.

Who Actually Uses Math at Work?

Let’s admit it together. We all kind of suck at math. It’s okay! Numbers are evil. And back in high school, when you were forced to struggle through Algebra and Geometry and Algebra again and, if you were especially unlucky, Calculus, you probably thought to yourself: when in the hell would you ever use all those stupid theories, equations and computational silliness in real life? And the truth is, you won’t use them! Who needs math!

Facebook unwraps plans for new data center in Iowa

As rumored earlier, Facebook has taken the wraps off a new data center that will begin construction this summer in Altoona, Iowa. This will be the social network’s fourth self-owned and operated data center. The company already has data centers in Prineville, Oregon; Forest City, North Carolina; and Luleå, Sweden.


The new data center will feature the same Open Compute Project server designs and outdoor-air cooling system that the other Facebook data centers boast, but it will also include improvements over the other facilities in building design and networking architecture. Facebook claims that the new Iowa facility “will be among the most advanced and energy efficient facilities of its kind.”

As for why Facebook chose Iowa for its new data center, the social networking giant said the state is full of wind-generated power and has “a great talent pool that will help build and operate the facility.” Of course, there’s also plenty of flat, open land, making it ideal for a large facility that relies on outdoor air for cooling.

The company didn’t reveal any statistics on the new facility, but the Des Moines Register reports that the new data center will cost $1.5 billion to construct. There’s no official say as to how large this new data center will be, but the Des Moines Register says that the facility will be 1.4 million square feet. Facebook’s one billion users are uploading a lot of content to the social network, and it all has to go somewhere.


Facebook unwraps plans for new data center in Iowa is written by Craig Lloyd & originally posted on SlashGear.

Facebook is reportedly behind “Project Catapult” data center

Facebook is reportedly the company planning to build a $1.5 billion data center in Altoona, Iowa. Until now, everything was kept hush-hush, and the only thing we knew about the data center was that officials referred to it by the cryptic name “Project Catapult”. The Des Moines Register says it spoke with lawmakers about Catapult and learned that Facebook is behind the project.


Facebook was said to have been scouting sites for its next data center. There was a location in Nebraska that Facebook was reportedly looking at, but apparently the company decided to go in another direction. According to Data Center Knowledge, officials approved the site plan for Facebook’s data center in Iowa back in June, and by November Facebook and state and local officials had met to discuss the “fine details” of the data center in order to finalize the deal.

Many people had already speculated that Facebook was behind the data center in Altoona, mostly because the site plans looked very similar to Facebook’s plans for its data centers in both Oregon and North Carolina. The entire building will be about 1.4 million square feet, with three separate data centers of 466,000 square feet each. It is said to be the “most technologically advanced data center in the world.”

Facebook is also said to be in talks with officials about tax credits for wind energy production, as well as a new payment rate on water. While there are various factors that point to Facebook as the company behind the data center, Facebook has yet to officially confirm it. But if it is, it looks like both Google and Microsoft’s data centers in Iowa will be getting a familiar neighbor pretty soon.

[via Des Moines Register]


Facebook is reportedly behind “Project Catapult” data center is written by Brian Sin & originally posted on SlashGear.

Apple confirms it keeps Siri data for up to two years (update: Google too)


It’s no secret that Apple hangs onto your Siri data for some length of time (as other companies do with search data and the like), but it hasn’t been clear exactly how long that data sits on its servers. Wired has now cleared that up somewhat, hearing from Apple spokesperson Trudy Muller that the company “may keep anonymized Siri data for up to two years.” That word follows another report from Wired yesterday that raised concerns about the issue. As Muller notes, the data is immediately deleted if a user turns Siri off at any time, and it’s anonymized from the start; neither your Apple ID nor your email address is stored with the data, but rather a randomly generated number that represents the user and becomes associated with the voice files. That number gets disassociated from the voice clips after six months, but Apple still hangs onto the files for another 18 months for what’s described as testing and product improvement purposes.

Update: The Financial Times has confirmed with Google that it, too, keeps your voice search data for up to two years. Google itself has previously detailed how it handles that data, including the added measures put in place when a user opts-in for personalized voice recognition; in that case, electronic keys linked to your account are generated that Google says are “designed to be accessed by machines, not people.”



Source: Wired

Facebook rolls out near-real-time public PUE and WUE dashboards for data centers

Facebook has launched public dashboards for its Forest City, North Carolina and Prineville, Oregon data centers, providing easy access to near-real-time Power Usage Effectiveness (PUE) and Water Usage Effectiveness (WUE) information. The dashboards show continuously updated, to-the-minute figures, along with a 24-hour data view and a historical view covering the year’s numbers.
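For reference, PUE and WUE are simple ratios: PUE divides total facility energy by the energy delivered to IT equipment (an ideal facility scores 1.0), and WUE divides water consumed by IT energy. The sample figures below are illustrative only; Facebook has reported Prineville PUE values in the neighborhood of 1.07 to 1.09:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

def wue(water_liters: float, it_equipment_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water consumed per kWh of IT energy."""
    return water_liters / it_equipment_kwh

# Illustrative figures only, not Facebook's actual meter readings.
print(pue(1090.0, 1000.0))  # 1.09
print(wue(220.0, 1000.0))   # 0.22 L/kWh
```

The closer PUE gets to 1.0, the less energy is spent on cooling and power conversion relative to the servers themselves, which is why outdoor-air cooling features so prominently in Facebook's designs.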


The reason for the change is a matter of pride, according to the announcement, with a desire to show the efficiency of the data centers. In addition, by providing the information, Facebook says that the centers are demystified, and that such a move follows naturally in line with its sharing of the hardware and building designs. The public now gets a peek at what the data centers’ technicians see daily.

The public dashboards won’t change the annualized averages, which remain accessible via the dashboards beneath the near-real-time information. Facebook first made PUE information for Prineville available in the second quarter of 2011, eventually rolling out WUE data for the same center in mid-2012. Those who follow the dashboards are advised that some “weird” numbers might pop up occasionally, because the data centers are still under construction and the variables frequently shift.

In addition, the dashboards are available for others to use, with Facebook saying that the front-end code will be made open source and available “in the coming weeks”. Those who take advantage of the code once it is made available are encouraged to make changes to it, improving what is already there and providing an overall better experience for everyone.

[via Open Compute]


Facebook rolls out near-real-time public PUE and WUE dashboards for data centers is written by Brittany Hillen & originally posted on SlashGear.