Netflix came out against GA’s abortion law, but the CEO contributed to lawmakers who voted for a similar Missouri law.
GE’s lineup of connected microwaves can now be controlled by the Google Assistant. Owners can use voice commands to set their microwave and perform basic functions like starting, pausing, stopping or adding time. The microwaves also have a “scan-to-c…
If robots are really to help us out around the house or care for our injured and elderly, they’re going to want two hands… at least. But using two hands is harder than we make it look — so this robotic control system learns from humans before attempting to do the same.
The idea behind the research, from the University of Wisconsin-Madison, isn’t to build a two-handed robot from scratch, but simply to create a system that understands and executes the same type of manipulations that we humans do without thinking about them.
For instance, when you need to open a jar, you grip it with one hand and move it into position, then tighten that grip as the other hand takes hold of the lid and twists or pops it off. There’s so much going on in this elementary two-handed action that it would be hopeless to ask a robot to do it autonomously right now. But that robot could still have a general idea of why this type of manipulation is done on this occasion, and do what it can to pursue it.
The researchers first had humans wearing motion capture equipment perform a variety of simulated everyday tasks, like stacking cups, opening containers and pouring out the contents, and picking up items with other things balanced on top. All this data — where the hands go, how they interact, and so on — was chewed up and ruminated on by a machine learning system, which found that people tended to do one of four things with their hands (a rough code sketch follows the list below):
- Self-handover: This is where you pick up an object and put it in the other hand so it’s easier to put it where it’s going, or to free up the first hand to do something else.
- One hand fixed: An object is held steady by one hand providing a strong, rigid grip, while the other performs an operation on it like removing a lid or stirring the contents.
- Fixed offset: Both hands work together to pick something up and rotate or move it.
- One hand seeking: Not actually a two-handed action, but the principle of deliberately keeping one hand out of action while the other finds the object required or performs its own task.
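To make those categories a bit more concrete, here is a minimal sketch of how a short window of two-handed motion-capture data could be classified into the four actions above. The feature choices, class names and the random-forest classifier are illustrative assumptions, not the researchers’ actual pipeline.

```python
# Minimal illustrative sketch (not the researchers' code): classify a short
# window of two-handed motion-capture data into one of the four categories above.
# Feature choices and the classifier are assumptions for illustration only.
from enum import Enum

import numpy as np
from sklearn.ensemble import RandomForestClassifier


class BimanualAction(Enum):
    SELF_HANDOVER = 0
    ONE_HAND_FIXED = 1
    FIXED_OFFSET = 2
    ONE_HAND_SEEKING = 3


def extract_features(left_traj: np.ndarray, right_traj: np.ndarray) -> np.ndarray:
    """Summarize a window of left/right wrist positions (each an N x 3 array)."""
    offset = right_traj - left_traj  # relative placement of the two hands
    left_speed = np.linalg.norm(np.diff(left_traj, axis=0), axis=1)
    right_speed = np.linalg.norm(np.diff(right_traj, axis=0), axis=1)
    return np.array([
        left_speed.mean(),                      # how much the left hand moves
        right_speed.mean(),                     # how much the right hand moves
        offset.std(axis=0).mean(),              # does the hands' offset stay fixed?
        np.linalg.norm(offset, axis=1).mean(),  # average separation between hands
    ])


# Train on labeled motion-capture windows, then classify new ones:
classifier = RandomForestClassifier(n_estimators=100)
# classifier.fit(training_features, training_labels)
# action = BimanualAction(classifier.predict([extract_features(left, right)])[0])
```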
The robot put this knowledge to work not in doing the actions itself — again, these are extremely complex motions that current AIs are incapable of executing — but in its interpretations of movements made by a human controller.
You would think that when a person is remotely controlling a robot, it would just mirror the person’s movements exactly. In the tests, the robot does exactly that at first, to provide a baseline showing that without knowledge of these “bimanual actions,” many of them are simply impossible.
Think of the jar-opening example. We know that when we’re opening a jar, we have to hold one side steady with a stronger grip and may even have to push back with the jar hand against the movement of the opening hand. If you try to do this remotely with robotic arms, that knowledge is no longer present, and one hand will likely knock the jar out of the other’s grip, or fail to grip it properly because the other hand isn’t helping out.
The system created by the researchers recognizes when one of the four actions above is happening, and takes measures to make sure that they’re a success. That means, for instance, being aware of the pressures exerted on each arm by the other when they pick up a bucket together. Or providing extra rigidity to the arm holding an object while the other interacts with the lid. Even when only one hand is being used (“seeking”), the system knows that it can deprioritize the movements of the unused hand and dedicate more resources (be it body movements or computational power) to the working hand.
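As a rough illustration of what “taking measures” could look like in software, here is a small hedged sketch of a teleoperation layer that re-weights each arm’s stiffness, grip force and tracking gain once an action has been recognized. The parameter names and numbers are assumptions for the sake of the example, not the controller described in the paper.

```python
# Illustrative sketch only; the actual controller in the paper is more involved.
# Once the current bimanual action is recognized, bias each arm's low-level
# parameters before forwarding the operator's motions to the robot.
from dataclasses import dataclass


@dataclass
class ArmCommand:
    stiffness: float = 1.0    # how rigidly the arm holds its pose
    grip_force: float = 1.0   # how hard the gripper squeezes
    motion_gain: float = 1.0  # how closely the arm tracks the operator's hand


def adjust_for_action(action: str, fixed_arm: ArmCommand, working_arm: ArmCommand) -> None:
    """Re-weight the two arms according to the recognized bimanual action."""
    if action == "one_hand_fixed":
        fixed_arm.stiffness = 3.0      # hold the jar rigidly...
        fixed_arm.grip_force = 2.0     # ...with a stronger grip
        working_arm.motion_gain = 1.0  # and let the other hand do the twisting
    elif action == "fixed_offset":
        # Both arms carry one object (a bucket, a tray), so stiffen both to keep
        # the relative offset between the grippers constant.
        fixed_arm.stiffness = working_arm.stiffness = 2.0
    elif action == "one_hand_seeking":
        # Deprioritize the idle hand and give the working hand full fidelity.
        fixed_arm.motion_gain = 0.2
        working_arm.motion_gain = 1.0
    # "self_handover" and anything unrecognized fall back to plain mirroring.
```

The point is simply that the operator keeps issuing natural motions while a layer in between quietly changes how literally each arm obeys them.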
In videos of demonstrations, it seems clear that this knowledge greatly improves the success rate of the attempts by remote operators to perform a set of tasks meant to simulate preparing a breakfast: cracking (fake) eggs, stirring and shifting things, picking up a tray with glasses on it and keeping it level.
Of course this is all still being done by a human, more or less — but the human’s actions are being augmented and re-interpreted into something more than simple mechanical reproduction.
Doing these tasks autonomously is a long ways off, but research like this forms the foundation for that work. Before a robot can attempt to move like a human, it has to understand not just how humans move, but why they do certain things in certain circumstances, and furthermore what important processes may be hidden from obvious observation — things like planning the hand’s route, choosing a grip location, and so on.
The Madison team was led by Daniel Rakita; their paper describing the system is published in the journal Science Robotics.
There’s always something about devices that are always listening that doesn’t quite sit right with people who worry a lot about their data privacy. However, that has not prevented smart assistant-enabled devices from becoming very popular over the past couple of years. In a step that will be appreciated by many, Amazon is now allowing users to command Alexa to delete all of their voice recordings from that particular day.
Amazon today announced that it’s introducing a new and easier way to delete your voice recordings on all Alexa-enabled devices. You just have to say “Alexa, delete everything I said today” and all of the recordings from that day will be deleted.
This can prove very helpful when you’ve asked Alexa things you’d rather not have in your Alexa history. The feature lets you make sure that all of the recordings from throughout the day are deleted with a single voice command.
Amazon has also said that it will soon allow users to selectively delete their queries as well by saying “Alexa, delete what I just said.” This will prompt the digital assistant to automatically delete the last request that it received from the user.
If you search for noise canceling headphones on Amazon, chances are that one of the first results will be for a Bose product. The company has made some very popular noise canceling headphones that have done very well in the market. Bose is out with a new flagship model today called the Noise Cancelling Headphones 700. They look nothing like the company’s existing QC35 models and cost $399.
The Noise Cancelling Headphones 700 aren’t actually meant to replace Bose’s more affordable QC35s. The company may feel that there’s demand in the market for more expensive cans that offer additional features and possibly better quality as well.
The Noise Cancelling Headphones 700 look far more modern compared to the QC35s and offer 20 hours of “full featured” playback, according to the company. They also have a USB Type-C port for charging. Voice assistant support is present so you will be able to summon Google Assistant, Siri or Alexa whenever you desire.
Bose also talks a big game about the microphone system in these new over-ear headphones: a four-mic array provides voice isolation, working alongside the headphones’ marquee noise cancellation. The touch controls sit on the front portion of the right earcup. Those interested in picking up a pair can pre-order from Bose’s website; the company will start shipping the Noise Cancelling Headphones 700 on June 20th.
It was reported that the Apple Pay Express Transit feature will soon be supported by the MTA in New York City. It has been officially confirmed today that commuters will be able to pay for their subway and bus trips with Apple Pay starting May 31st. This means that they will no longer have to purchase physical MetroCards for rides.
Apple CEO Tim Cook had previously confirmed that Apple Pay support would come to MTA’s subway and buses in “early summer.” The company has followed through on that timeline and confirmed today that commuters will be able to use the service to pay for their trips.
Before they can do that, they will need to ensure that their iPhone is running the latest software, iOS 12.3, or that their Apple Watch is on watchOS 5.2.1. They will also need to authenticate a credit or debit card with the Express Transit feature. It’s that feature which processes payments for the subway and buses without requiring the user to open an app or even unlock their phone in order to pay at the turnstile.
Support for Apple Pay is part of MTA’s new fare payment system called OMNY. It will also support other mobile payment systems. Google has already confirmed OMNY support for Google Pay starting May 31st as well.
Ratings for drivers and riders have long been a part of Uber’s service. However, the only real consequences were felt by drivers, who could see their bonuses curtailed or even their access to the service limited if they were consistently rated poorly. Drivers could also rate passengers, but it didn’t seem to change much. That’s no longer going to be the case. Uber today confirmed that it will start deactivating riders with “below average” ratings from the app.
The company says that riders could lose access to the Uber app if they rack up a significantly below average rating.
They will be shown tips on how to improve their ratings, which may include being polite, not leaving trash in the vehicle, and not asking the driver to go over the speed limit. Riders will be given multiple opportunities to improve their rating before they are deactivated from the app.
The company acknowledges that accountability is a two-way street: drivers have long been expected to meet a minimum rating threshold, which varies from city to city, and a similar approach will now be taken for riders. Uber does say that it expects only a small number of riders to ultimately be impacted by ratings-based deactivations.
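Uber hasn’t published the mechanics behind these deactivations, but the policy it describes (a city-specific minimum rating plus several chances to improve) boils down to something like the toy check below; every threshold, count and name here is hypothetical.

```python
# Toy sketch: thresholds, warning counts and names are hypothetical, since Uber
# has not disclosed the actual values or implementation.
CITY_MIN_RATING = {"new_york_city": 4.0, "austin": 3.9}  # varies from city to city
MAX_WARNINGS = 3  # riders get multiple chances to improve first


def should_deactivate(rider_rating: float, city: str, warnings_sent: int) -> bool:
    """Deactivate only if the rider is still below the city's bar after repeated warnings."""
    threshold = CITY_MIN_RATING.get(city, 4.0)
    return rider_rating < threshold and warnings_sent >= MAX_WARNINGS
```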
There has been a lot of anticipation for Hideo Kojima’s next game. It has been teased frequently but those who have been looking forward to it can now breathe a sigh of relief. It has finally been confirmed that the Death Stranding PS4 release will take place in November this year.
Kojima hasn’t revealed much about the game so far. A cryptic Twitch stream had been teasing the game with what felt like gameplay footage, though it was difficult to be sure. As the stream racked up more viewers, it revealed more footage, ending with a new trailer that introduced us to the game’s main cast. The trailer also confirmed the release date for the title.
The new trailer provides a good glimpse of Death Stranding’s gameplay, the best we have seen so far, and it seems that players will primarily be involved in rebuilding a society that has fallen into ruin. There will be plenty of action in the game as well.
“Death Stranding is a completely new type of action game, where the goal of the player is to reconnect isolated cities and a fragmented society. It is created so that all elements, including the story and gameplay, are bound together by the theme of the “Strand” or connection,” Kojima explained in a post on the official PlayStation blog.
Death Stranding will be released for the PlayStation 4 on November 8th this year.
If you felt that the Amazon Echo Show was too large for your liking, Amazon is out with a new Echo device today which offers a smaller display and is much more compact. Not only does the new Echo Show 5 have a 5.5 inch display, it’s also quite a bit cheaper than most of the smart speakers that also feature a display.
For $89.99, the Echo Show 5 provides a rectangular 5.5 inch display, a speaker and a camera in addition to all Alexa features that many are now accustomed to. This new smart speaker is going to be available in sandstone and charcoal colors. If you want to tilt the device, you will need to purchase the $19.99 magnetic stand from Amazon. This stand is what will enable the device to be used as a desk accessory or an alarm clock, for example.
Since many of you may not want a camera-equipped device in your bedroom, Amazon has included a physical shutter for the camera on the Echo Show 5. It’s a visible safeguard worth appreciating: you can tell, without even having to look at the screen, that the camera’s view really has been obstructed.
Other devices do have a physical switch to disable the camera but one has to look at the screen just to be sure. Interested customers can pre-order the Echo Show 5 starting today. It will ship next month.
Google Says Disabling Digital Wellbeing Doesn’t Improve Pixel’s Performance
There have been reports lately from a few Pixel smartphone users who claim that disabling the Digital Wellbeing features on the device noticeably improves performance. Google has now commented on the matter, saying that this isn’t the case. It maintains that Digital Wellbeing is not causing any performance issues on Pixel smartphones.
Google confirmed this through its official account on Reddit. The company says that it has conducted a “thorough analysis” of the software after users had started complaining about this and found that there are “no performance issues associated with the Digital Wellbeing app on Pixel.”
The Digital Wellbeing app is present on all Google Pixel and Android One smartphones in addition to a few devices from other manufacturers. It runs in the background and provides data to users such as the number of times they unlock the device and the time they spend on each individual app.
The company did acknowledge, though, that it found other problems affecting performance, unrelated to the Digital Wellbeing bug reports. Google says that it’s in the process of rolling out fixes to ensure that Pixel smartphones offer optimum performance.
Users had been recommending that others disable Digital Wellbeing to improve performance, but Google has effectively said that any perceived gain was nothing more than a placebo effect.