What we're buying: An official third-party PS4 controller for Xbox converts

This week’s hardware IRL keeps us indoors and out of the sun. UK Bureau Chief Mat Smith replaced his aging pack-in PS4 DualShock with Hori’s Onyx, the first licensed third-party wireless controller to land, though it’s no longer the only one. After…

Add CarPlay and Android Auto To Your Older Vehicle For $348

Want to try out Apple CarPlay or Android Auto without…buying a new car? This aftermarket Sony head unit is marked down to $348 today, and can be installed in any double-DIN dash opening on your existing vehicle.


Senator suggests ways to combat misinformation and boost data privacy

Senator Mark Warner (D-VA) has put together a policy paper that highlights some of the bigger problems facing online platforms today and includes potential ways to address them. Axios got hold of the 23-page paper and it focuses on thr…

Gee, I Wonder What People Will Look at in Google's New VR Web Browser

First it was the Kindle that allowed the world to enjoy its romance novels without fear of public embarrassment. Next, Google will let everyone browse the web in the privacy of virtual reality with its Chrome browser and Daydream VR headset.


NASA’s TESS spacecraft begins its search for exoplanets

NASA’s TESS spacecraft is officially up and running. The agency recently announced that TESS, which stands for Transiting Exoplanet Survey Satellite, started its search for exoplanets — planets outside of our solar system — on July 25th. Its first…

OpenAI’s robotic hand doesn’t need humans to teach it human behaviors

Gripping something with your hand is one of the first things you learn to do as an infant, but it’s far from a simple task, and only gets more complex and variable as you grow up. This complexity makes it difficult for machines to teach themselves to do, but researchers at Elon Musk and Sam Altman-backed OpenAI have created a system that not only holds and manipulates objects much like a human does, but developed these behaviors all on its own.

Many robots and robotic hands are already proficient at certain grips or movements — a robot in a factory can wield a bolt gun even more dexterously than a person. But the software that lets that robot do that task so well is likely hand-written and extremely specific to the application. You couldn’t, for example, give it a pencil and ask it to write. Even something on the same production line, like welding, would require a whole new system.

Yet for a human, picking up an apple isn’t so different from picking up a cup. There are differences, but our brains automatically fill in the gaps, and we can improvise a new grip, hold an unfamiliar object securely and so on. This is one area where robots lag severely behind their human models. Furthermore, you can’t just train a bot to do what a human does — you’d have to provide millions of examples to adequately show what a human would do with thousands of given objects.

The solution, OpenAI’s researchers felt, was not to use human data at all. Instead, they let the computer try and fail over and over in a simulation, slowly learning how to move its fingers so that the object in its grasp moves as desired.

The system, which they call Dactyl, was provided only with the positions of its fingers and three camera views of the object in-hand — but remember, when it was being trained, all of this data was simulated, taking place in a virtual environment. There, the computer doesn’t have to work in real time — it can try a thousand different ways of gripping an object in a few seconds, analyzing the results and feeding that data forward into the next try. (The hand itself is a Shadow Dexterous Hand, which is also more complex than most robotic hands.)
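The try-fail-analyze loop described above can be sketched in a few lines. This toy version uses plain random search against a made-up scoring function rather than the large-scale reinforcement learning (PPO) OpenAI actually used, and both `simulate_grip` and its target values are invented for illustration — but the cycle of propose, score in simulation, keep what improves is the same idea.

```python
import random

def simulate_grip(finger_angles):
    """Stand-in for the physics simulator: scores how close the (toy)
    object gets to a target orientation. Purely illustrative numbers."""
    target = [0.3, -0.2, 0.5]
    return -sum((a - t) ** 2 for a, t in zip(finger_angles, target))

# Random-search loop: perturb the current best grip, keep the candidate
# if the simulated outcome improves. Because this runs in simulation,
# thousands of attempts cost only milliseconds, not real-world time.
best = [0.0, 0.0, 0.0]
best_reward = simulate_grip(best)
for _ in range(2000):
    candidate = [a + random.gauss(0, 0.05) for a in best]
    reward = simulate_grip(candidate)
    if reward > best_reward:
        best, best_reward = candidate, reward
```

In the real system the "score" is a reward for moving the object toward a commanded pose, and the policy is a neural network rather than a single stored grip, but the feedback loop is structurally the same.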

In addition to different objects and poses the system needed to learn, there were other randomized parameters, like the amount of friction the fingertips had, the colors and lighting of the scene and more. You can’t simulate every aspect of reality (yet), but you can make sure that your system doesn’t only work in a blue room, on cubes with special markings on them.
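This trick of randomizing friction, lighting and so on is known as domain randomization, and the mechanics are simple to sketch. The parameter names and ranges below are invented for illustration, not OpenAI's actual values: the point is only that every training episode draws a fresh set of physical and visual conditions.

```python
import random

def randomized_env_params():
    """Sample one set of physical/visual conditions for a training episode.
    Ranges are hypothetical, for illustration only."""
    return {
        "fingertip_friction": random.uniform(0.5, 1.5),  # scale on nominal friction
        "object_mass": random.uniform(0.8, 1.2),         # scale on nominal mass
        "light_intensity": random.uniform(0.3, 1.0),
        "camera_jitter_mm": random.uniform(0.0, 5.0),
    }

# Each simulated episode runs under a fresh random draw, so the learned
# policy can't overfit to one specific room, object or lighting setup.
episode_params = [randomized_env_params() for _ in range(3)]
```

A policy that only ever sees one friction value learns that value; a policy trained across the whole range has to learn grips that work regardless, which is what lets it survive the jump from simulation to a real hand.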

They threw a lot of power at the problem: 6144 CPUs and 8 GPUs, “collecting about one hundred years of experience in 50 hours.” And then they put the system to work in the real world for the first time — and it demonstrated some surprisingly human-like behaviors.
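It's worth pausing on what "one hundred years in 50 hours" implies. A quick back-of-the-envelope calculation (ignoring leap days) shows the simulation gathered experience at roughly 17,500 times real-time speed:

```python
# "One hundred years of experience in 50 hours": how big a speed-up is that?
HOURS_PER_YEAR = 365 * 24                      # 8,760 hours, ignoring leap days
sim_experience_hours = 100 * HOURS_PER_YEAR    # 876,000 hours of hand-time
wall_clock_hours = 50
speedup = sim_experience_hours / wall_clock_hours  # 17,520x real time
```

That multiplier is the whole argument for training in simulation: a physical hand simply cannot fail and retry fast enough to accumulate a century of practice.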

The things we do with our hands without even noticing, like turning an apple around to check for bruises or passing a mug of coffee to a friend, use lots of tiny tricks to stabilize or move the object. Dactyl recreated several of them, for example holding the object with a thumb and single finger while using the rest to spin it to the desired orientation.

What’s great about this system is not just the naturalness of its movements and that they were arrived at independently by trial and error, but that it isn’t tied to any particular shape or type of object. Just like a human, Dactyl can grip and manipulate just about anything you put in its hand, within reason of course.

This flexibility is called generalization, and it’s important for robots that must interact with the real world. It’s impossible to hand-code separate behaviors for every object and situation in the world, but a robot that can adapt and fill in the gaps while relying on a set of core understandings can get by.

As with OpenAI’s other work, the paper describing the results is freely available, as are some of the tools they used to create and test Dactyl.

Microsoft OneDrive on Android adds fingerprint locking support

Microsoft OneDrive, the company’s cloud storage alternative to Dropbox and Google Drive, just got a big security update for Android users. After updating the Android mobile app, OneDrive lets users secure their content behind a fingerprint lock. The extra layer of security ensures only someone with the right fingerprint can unlock the account. Unlike now, OneDrive users on …

EA Origin Access Premier subscriptions are now available

EA’s Origin service has competed with Steam for years at this point, but now we’re seeing it take a page out of Netflix’s book as well. Today, EA launched a new service called Origin Access Premier, which lets subscribers play the company’s newest titles without buying them, in return for a monthly subscription fee. Other gaming companies have tried …

Google Chrome For Daydream VR Available Now

Owners of a Daydream VR headset will now be able to browse the web in virtual reality. Google has today announced that Chrome has been launched on Google’s Daydream View headset as well as the Lenovo Mirage Solo with Daydream headset. Those who have either headset will now be able to launch Chrome directly from the homepage to browse and interact with any webpage in virtual reality.

It has long been possible to view web VR content in Chrome but the browser itself couldn’t be launched from the Daydream launcher. Users had to first navigate to the page they wanted to interact with in VR and then put their device inside a virtual reality headset.

This was obviously not the best user experience, and it also prevented standalone headsets like the Lenovo Mirage Solo from letting users surf the web in virtual reality. That pain point ends today, as Chrome now has full support for the Daydream platform.

All of the core Chrome features, such as voice search, saved bookmarks and incognito mode, are available on the Daydream headset. Several features specific to the virtual reality platform have also been added, including a cinema mode that optimizes web video for the best viewing experience in VR.

Users just have to update to the latest version of Chrome on Android from the Google Play Store so that they can launch Chrome from the home screen of their Daydream headset.

Google Chrome For Daydream VR Available Now, original content from Ubergizmo.

OnePlus Confirms Android P Release For OnePlus 3 And 3T

Google and its OEM partners promise two years of major Android platform upgrades for smartphones. The handsets also get an additional year of security updates, but it’s very uncommon for a device to get more than two major updates. OnePlus wants its devices to be the exception here, which is why the company has now confirmed that the OnePlus 3 and OnePlus 3T will both receive the Android P update.

Going by that rule, many had assumed that the Android 8.0 or Android 8.1 Oreo update would be the last major release for the OnePlus 3 and 3T. That will no longer be the case: these devices will get Android P, but at a cost.

OnePlus says that the two handsets will receive Android P but won’t get Android 8.1. They will jump straight from Android 8.0 to Android P so that development resources are better utilized. These devices also sit at the end of the priority list: the OnePlus 6 will get Android P first, then the OnePlus 5/5T, and so on.

OnePlus adds that the decision to skip the Android 8.1 update was made so that OnePlus 3/3T owners can get the new features and improvements that Android P brings to the table. I’m sure many users won’t mind, as they’re getting an unprecedented third major Android release for their handsets anyway.
