Aira’s new smart glasses give blind users a guide through the visual world

When it comes to augmented reality technologies, visuals always seem to be an essential part of most people's definitions, but one startup is offering an interesting take on audio-based AR that also calls on computer vision. Even without integrated displays, glasses are still an important part of the company's products, which are designed with vision-impaired users in mind.

Aira has built a service that essentially puts a human assistant into a blind user's ear by beaming live-streaming footage from the glasses' camera to the company's agents, who can then give audio instructions to the end users. The guides can present them with directions or describe scenes for them. The service really hinges on the combination of high-tech hardware and highly attentive assistants.

The hardware the company has run this service on in the past has been a bit of a hodgepodge of third-party solutions. This month, the company began testing its own smart glasses, called the Horizon Smart Glasses, which are designed from the ground up to be the ideal solution for vision-impaired users.

The company charges based on usage; $89 per month will get users the device and up to 100 minutes of usage. There are various pricing tiers for power users who need a bit more time.

The glasses integrate a 120-degree wide-angle camera so guides can gain a fuller picture of a user's surroundings and won't have to instruct them to point their head in a different direction quite as much. The glasses are powered by what the startup calls the Aira Horizon Controller, which is actually just a repurposed Samsung smartphone that supplies the device's compute, battery and network connection. The controller is operated entirely through physical buttons, and it can also connect to a user's smartphone if they want to route controls through the Aira mobile app.

Though the startup isn't planning to part ways with its human assistants anytime soon, the company is predictably aiming to venture deeper into the capabilities offered by computer vision tech. Earlier this month it announced a digital assistant called Chloe, which will eventually be able to do a whole lot but is launching with the ability to read: users can point their glasses at text and hear it read aloud. The startup also recently showed off a partnership with AT&T that enables the glasses to identify prescription pill bottles and read the labels and dosage instructions to users.

The company is currently testing the new headset and hopes to begin swapping out old units for the Horizon by June.

Arm chips with Nvidia AI could change the Internet of Things

Nvidia and Arm today announced a partnership aimed at making it easier for chip makers to incorporate deep learning capabilities into next-generation consumer gadgets, mobile devices and Internet of Things objects. Thanks to this partnership, artificial intelligence could soon come to devices like doorbell cams and smart speakers.

Arm intends to integrate Nvidia’s open-source Deep Learning Accelerator (NVDLA) architecture into its just-announced Project Trillium platform. Nvidia says this should help IoT chip makers incorporate AI into their products.

“Accelerating AI at the edge is critical in enabling Arm’s vision of connecting a trillion IoT devices,” said Rene Haas, EVP and president of the IP Group at Arm. “Today we are one step closer to that vision by incorporating NVDLA into the Arm Project Trillium platform, as our entire ecosystem will immediately benefit from the expertise and capabilities our two companies bring in AI and IoT.”

Announced last month, Arm’s Project Trillium is a series of scalable processors designed for machine learning and neural networks. NVDLA’s open-source nature allows Arm to offer a suite of developer tools on its new platform. Together with Arm’s scalable chip platforms and Nvidia’s developer tools, the two companies believe they’re offering a solution that could give billions of IoT, mobile and consumer electronic devices access to deep learning.

Deepu Talla, VP and GM of Autonomous Machines at Nvidia, explained it best with this analogy: “NVDLA is like providing all the ingredients for somebody to make it a dish including the instructions. With Arm [this partnership] is basically like a microwave dish.”

Sea of Thieves drops plans for “death tax” after community pushback

Now that Sea of Thieves has launched, Rare is looking to improve the game with new features and updates. Right now, Sea of Thieves can feel a little feature-light, with PvP dominating the game because there aren’t really any penalties in place for failing. Recently, Rare announced plans to change that by deducting gold from players who die on …

Facebook faces lawsuit over discriminatory housing ads

In 2016, the Congressional Black Caucus said that Facebook had violated the Fair Housing Act (FHA) by allowing advertisers to exclude racial and ethnic groups when buying ads for housing, employment or even credit. Last year, ProPublica said that the…

Apple's 'Field Trip' education iPad event by the numbers

Well, that was underwhelming. At its “Let’s Take a Field Trip” education event in Chicago on Tuesday, Apple only had a nominal upgrade to its 9.7-inch iPad and some minor software updates to announce. But hey, at least Crayola’s new digital crayon lo…

Lytro is shutting down, but some employees may head to Google

Lytro made a name for itself by allowing you to take a photo and then change the focus point after the fact. Its “Light Field” cameras never really took off, though, and neither did its pivot to pro-styled cameras and virtual reality. Now the company…

NVIDIA aims to make self-driving safer with virtual simulations

Amid the torrent of news at CES in January, it was easy to miss the unveiling of NVIDIA’s Drive platform — a way for the company to test out its self-driving algorithms through repeated simulations. At that point, it was more of a concept than an ac…

NVIDIA's next AI steps: An ARM deal and a new 'personal supercomputer'

Soon you won’t need one of NVIDIA’s tiny Jetson systems if you want to tap into its AI smarts for smaller devices. At its GPU Technology Conference (GTC) today, the company announced it’ll be bringing its open source Deep Learning Architecture (NVDLA…