At the GDC 2018 Indie Megabooth on Monday, Silver Dollar Games showed off One Finger Death Punch 2, the sequel to its popular 2013 brawler. Fans of the original — and fans of smash-em-ups in general — are not going to be disappointed.
If you watch a lot of content on YouTube, then you have surely heard of Patreon. It’s a service that lets content creators accept donations from their fans so that they can keep creating more content. A lot of content creators on YouTube are now using Patreon for precisely this purpose, and Facebook wants its creator community to have a similar option. Among the new creator features it announced today is the ability for fans to support their favorite creators by making monthly payments.
Fans who decide to make monthly payments to support their favorite creators will receive exclusive content as an incentive. They will also get a new badge that shows their support for a creator; top fans will be able to display that badge based on how frequently they comment on, share, watch, or react to their preferred creator’s content. They will have to opt in for this, though.
The world’s largest social network also wants to make it easier for brands to work with its creator community. It will now allow creators to build a portfolio that highlights their body of work, so that advertisers can search for and find the creators they want to work with on branded campaigns.
Facebook is only testing these new creator features at the moment, so they aren’t widely available yet. A broader rollout will depend on the results of the test.
Uber has been testing its self-driving cars on public roads in Arizona for quite some time now, and while there have been reports of its cars getting into minor accidents, they had never been involved in a fatal one until today. A female pedestrian was killed earlier today after being struck by an autonomous Uber car in Tempe, Arizona. Tempe Police have confirmed that the car was in autonomous mode when the accident happened.
The accident took place near Mill Avenue and Curry Road earlier this morning in Tempe, Arizona. The car was headed northbound when it struck a woman walking outside of the crosswalk. The woman was rushed to the hospital, where she succumbed to her injuries. Police have identified the victim as 49-year-old Elaine Herzberg.
Uber has also confirmed that the car was in autonomous mode at the time of the crash, but there was a human safety driver behind the wheel. That’s a legal requirement for companies testing their self-driving tech on public roads.
A spokesperson for the company added that there was no one else in the vehicle except the female safety driver, and she was not hurt in the accident. No further details have been revealed about the driver, and police have now taken possession of the vehicle.
The U.S. National Transportation Safety Board has launched an investigation into this crash and dispatched a team to Tempe. Uber CEO Dara Khosrowshahi has tweeted out condolences to the affected family and said that Uber is working with local law enforcement to figure out what happened.
A self-driving vehicle made by Uber has struck and killed a pedestrian. It’s the first such incident and will certainly be scrutinized like no other autonomous vehicle interaction in the past. But on the face of it, it’s hard to understand how, short of a total system failure, this could happen when the entire car has essentially been designed around preventing exactly this situation from occurring.
Something unexpectedly entering the vehicle’s path is pretty much the first emergency event that autonomous car engineers look at. The situation could be many things — a stopped car, a deer, a pedestrian — and the systems are one and all designed to detect them as early as possible, identify them, and take appropriate action. That could be slowing, stopping, swerving, anything.
Uber’s vehicles are equipped with several different imaging systems which handle both ordinary duty (monitoring nearby cars, signs, and lane markings) and extraordinary duty like the situation just described. No fewer than four different ones should have picked up the victim in this case.
Top-mounted lidar. The bucket-shaped item on top of these cars is a lidar, or light detection and ranging, system that produces a 3D image of the car’s surroundings multiple times per second. Using infrared laser pulses that bounce off objects and return to the sensor, lidar can detect static and moving objects in considerable detail, day or night.

This is an example of lidar-created imagery, though not specifically what the Uber vehicle would have seen.
Heavy snow and fog can obscure a lidar’s lasers, and its accuracy decreases with range, but for anything from a few feet to a few hundred feet, it’s an invaluable imaging tool and one that is found on practically every self-driving car.
The lidar unit, if operating correctly, should have been able to make out the person in question, provided they were not totally obscured, while they were still more than a hundred feet away, and to pass their presence on to the “brain” that collates the imagery.
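For a sense of the arithmetic involved: lidar ranging boils down to timing how long each infrared pulse takes to return and converting that round trip into a distance. The snippet below is a minimal illustration of that time-of-flight calculation, not code from Uber’s system or any production lidar.

```python
# Illustrative time-of-flight calculation only; not from any production lidar stack.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def pulse_distance_m(round_trip_time_s: float) -> float:
    """Distance to a reflecting object, given a lidar pulse's round-trip time in seconds."""
    # The pulse travels out and back, so the one-way distance is half the total path.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A return after roughly 200 nanoseconds corresponds to an object about 30 meters away.
print(pulse_distance_m(200e-9))  # ~29.98
```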
Front-mounted radar. Radar, like lidar, sends out a signal and waits for it to bounce back, but it uses radio waves instead of light. This makes it more resistant to interference, since radio can pass through snow and fog, but also lowers its resolution and changes its range profile.

Tesla’s Autopilot relies mostly on radar.
Depending on the radar unit Uber employed — likely multiple in both front and back to provide 360 degrees of coverage — the range could differ considerably. If it’s meant to complement the lidar, chances are it overlaps considerably, but is built more to identify other cars and larger obstacles.
The radar signature of a person is not nearly so recognizable, but it’s very likely they would have at least shown up, confirming what the lidar detected.
Short and long-range optical cameras. Lidar and radar are great for locating shapes, but they’re no good for reading signs, figuring out what color something is, and so on. That’s a job for visible-light cameras with sophisticated computer vision algorithms running in real time on their imagery.
The cameras on the Uber vehicle watch for telltale patterns that indicate braking vehicles (sudden red lights), traffic lights, crossing pedestrians, and so on. Especially on the front end of the car, multiple angles and types of camera would be used, so as to get a complete picture of the scene into which the car is driving.
Detecting people is one of the most commonly attempted computer vision problems, and the algorithms that do it have gotten quite good. “Segmenting” an image, as it’s often called, generally also involves identifying things like signs, trees, sidewalks and more.
That said, it can be hard at night. But that’s an obvious problem, the answer to which is the previous two systems, which work night and day. Even in pitch darkness, a person wearing all black would show up on lidar and radar, warning the car that it should perhaps slow and be ready to see that person in the headlights. That’s probably why a night-vision system isn’t commonly found in self-driving vehicles (I can’t be sure there isn’t one on the Uber car, but it seems unlikely).
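To give a sense of how accessible basic pedestrian detection has become, the snippet below runs OpenCV’s classical HOG-based people detector over a single image. It is vastly simpler than the real-time, multi-camera pipelines used in autonomous vehicles and is shown only as an illustration; the file names are placeholders.

```python
import cv2

# Off-the-shelf HOG + linear SVM pedestrian detector that ships with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("dashcam_frame.jpg")  # placeholder input image
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)

# Draw a rectangle around each detected person and save the annotated frame.
for (x, y, w, h) in boxes:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.jpg", frame)
```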
Safety driver. It may sound cynical to refer to a person as a system, but the safety drivers in these cars are very much acting in the capacity of an all-purpose failsafe. People are very good at detecting things, even though we don’t have lasers coming out of our eyes. And our reaction times aren’t the best, but if it’s clear that the car isn’t going to respond, or has responded wrongly, a trained safety driver will react correctly.
Worth mentioning is that there is also a central computing unit that takes the input from these sources and creates its own more complete representation of the world around the car. A person may disappear behind a car in front of the system’s sensors, for instance, and no longer be visible for a second or two, but that doesn’t mean they ceased existing. This goes beyond simple object recognition and begins to bring in broader concepts of intelligence such as object permanence, predicting actions, and the like.
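To give a rough feel for what “object permanence” means in software terms, the toy tracker below coasts a detected object forward on its last known velocity when measurements briefly drop out, rather than forgetting it immediately. All names and numbers here are invented for illustration; this bears no relation to Uber’s actual fusion code, which, as noted below, is closely guarded.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Track:
    """A toy track: last known position (m) and velocity (m/s) of one object."""
    x: float
    y: float
    vx: float
    vy: float
    frames_missed: int = 0

def step(track: Track,
         measurement: Optional[Tuple[float, float]],
         dt: float,
         max_missed: int = 10) -> Optional[Track]:
    """Advance the track by one frame of duration dt seconds."""
    if measurement is None:
        # No detection this frame: coast on the last velocity (object permanence)
        # instead of dropping the object, unless it has been unseen too long.
        track.frames_missed += 1
        if track.frames_missed > max_missed:
            return None
        track.x += track.vx * dt
        track.y += track.vy * dt
        return track
    # Detection available: update position, re-estimate velocity, reset the miss counter.
    mx, my = measurement
    track.vx = (mx - track.x) / dt
    track.vy = (my - track.y) / dt
    track.x, track.y = mx, my
    track.frames_missed = 0
    return track
```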
It’s also arguably the most advanced and most closely guarded part of any self-driving car system, and so is kept well under wraps.
It isn’t clear what the circumstances were under which this tragedy played out, but the car was certainly equipped with technology that was intended to, and should have, detected the person and caused the car to react appropriately. Furthermore, if one system didn’t work, another should have sufficed; multiple redundant fallbacks are only prudent in high-stakes settings like driving on public roads.
We’ll know more as Uber, local law enforcement, federal authorities, and others investigate the accident.
Uber has grounded its entire self-driving car fleet in the US and Canada after one of its autonomous vehicles reportedly killed a pedestrian in Arizona overnight. The vehicle is said to have been driving itself when the incident took place, though with a safety operator at the wheel. The crash took place in Tempe, a city in south-central Arizona.
Facebook is under fire for how it treats your personal data, but you don’t need to wait for regulators to wade in before doing an audit of how socially exposed you are. The social network doesn’t make it obvious how to limit the visibility of things like third-party apps and what Facebook games can see about you.