The late-night host hilariously suggested the president will require more than “attitude” in his meeting.
Improving battery life and energy capacity is a big deal for automakers around the world. More energy storage capacity inside the battery, along with lighter batteries, will greatly affect how far drivers can go on each charge of their electric vehicle. GM and Honda have announced that they have teamed up to accelerate battery development for both companies. The two …
You might remember him as Billy Ray Valentine, Sherman Klump, or Donkey — but after a brief hiatus from the spotlight, Eddie Murphy is poised to return in Netflix’s Dolemite Is My Name. The film, which Murphy will produce and star in, is a biopic a…
The development of AI is a fascinating one: if we could develop computer systems that can think for themselves and for us, it would save us a lot of time. However, there is an ethical question behind the development of such sentience, and as if to prove the naysayers right, researchers at MIT have actually created a ‘psychopath’ AI.
This AI is dubbed Norman after the antagonist in the Alfred Hitchcock classic “Psycho”. The researchers created it by feeding it disturbing image captions taken from a subreddit “dedicated to document and observe the disturbing reality of death”. After feeding the AI various captions from the subreddit, they gave it the Rorschach inkblot test and gave the same test to a standard AI to compare the results.
Based on the results of the test, Norman’s responses were far more disturbing than the standard AI’s. For example, for one inkblot the standard AI gave a response which read, “a group of birds sitting on top of a tree branch,” whereas Norman responded with, “a man is electrocuted and catches to death.” In another test the standard AI answered with, “a black and white photo of a small bird,” while Norman responded with, “man gets pulled into dough machine.”
However, according to the researchers, the goal of creating Norman wasn’t so much about setting out to create a psychopath AI as it was about proving a point. “Norman is born from the fact that the data that is used to teach a machine learning algorithm can significantly influence its behavior. So when people talk about AI algorithms being biased and unfair, the culprit is often not the algorithm itself, but the biased data that was fed to it.”
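To make that point a bit more concrete, here is a toy Python sketch. This is not MIT’s actual Norman pipeline, and the function names are made up for illustration; the captions are the examples quoted above. The exact same trivial “captioner” code, handed two different sets of training captions, gives back very different descriptions of the same input.

```python
# Toy illustration (NOT MIT's actual Norman pipeline): the same trivial
# "captioner" trained on two different caption corpora produces very
# different output for the same input -- the bias lives in the data,
# not in the algorithm.
import random

def train_captioner(corpus):
    """This 'model' is nothing more than the captions it was trained on."""
    return list(corpus)

def caption(model, seed):
    """Describe an input by picking a training caption; the input only
    seeds the choice, so the output is entirely shaped by the data."""
    rng = random.Random(seed)
    return rng.choice(model)

# Captions quoted from the article's examples.
neutral_corpus = [
    "a group of birds sitting on top of a tree branch",
    "a black and white photo of a small bird",
]
disturbing_corpus = [
    "a man is electrocuted and catches to death",
    "man gets pulled into dough machine",
]

standard_ai = train_captioner(neutral_corpus)
norman_like = train_captioner(disturbing_corpus)

inkblot_id = 42  # stand-in for one Rorschach inkblot
print("standard AI:", caption(standard_ai, inkblot_id))
print("norman-like:", caption(norman_like, inkblot_id))
```

The algorithm never changes between the two runs; only the data does, which is exactly the researchers’ argument about where bias comes from.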
Researchers Create A ‘Psychopath’ AI By Feeding It Reddit Captions, original content from Ubergizmo.
One of the fun aspects of building your own PC is being able to pick and choose the various components, such as the motherboard, the RAM, the GPU, and the case. Some builders might not be fussed about cases, while others want a case that can show off the innards of their builds.
If you’re in the latter camp and are in the market for a new PC case, Corsair might have something for you. At Computex 2018, Corsair unveiled its new Crystal Series 280X RGB micro-ATX cases. It should be noted that these are for smaller builds, which means that if you’re after a case that will support full-sized motherboards and GPUs, you’ll probably have to look elsewhere.
However, if you want a more compact build, then this could be for you. The case features tempered glass on the top, front, and sides, and comes with fans preinstalled. There will also be a non-RGB version that comes with regular fans instead of the lit-up ones, but other than that the two are similar in specs, which includes support for GPUs of up to 300mm in length.
As for pricing, the non-RGB version will be priced at $110 while the RGB model will go for $160, and both should already be available for purchase via authorized retailers and distributors.
Corsair Unveils New Crystal Series 280X RGB Computer Case, original content from Ubergizmo.
It is easy to see why drones are ideal for surveillance, especially in crowded areas where they can provide an aerial overview of what’s going on. However, the fact that a drone is so far removed from the crowd means that certain things might be missed by the drone pilot, which is where AI comes into play.
In a recently published research paper titled “Eye in the Sky” (via The Verge), it seems that researchers are using AI in drones to help spot violent behavior in crowds. This could come in handy during huge festivals, where certain types of behavior might be overlooked due to the sheer number of people in attendance. It could also be used during protests to weed out people who are misbehaving, and so on.
While it sounds good on paper, in practice it doesn’t seem to be performing so well. According to lead researcher Amarjot Singh of the University of Cambridge, the AI has a 94% accuracy rating when it comes to identifying violent poses, but its accuracy fell as more people were introduced into the frame.
Also, during the tests these violent poses were just that, poses: the researchers asked volunteers to act out different types of behavior. And as you can see in the video above, the volunteers are spaced out from each other, which makes it easier to look for certain types of poses. This means that in a real-world situation, it might not necessarily be as useful.
However, as we said, on paper it sounds like it could be useful in situations with large crowds, so perhaps with further improvements we might see practical use for it some day.
AI Used In Drones Can Spot Violent Behavior In Crowds, original content from Ubergizmo.
The comedian shows why he thinks Trump’s border policy is dumb and dumber.
Hey, good morning! You look fabulous.
Welcome to Friday! Amazon’s fusing together its Echo smartspeaker and Fire TV box — and the result is a cube. We also report on a new BlackBerry phone and a spinning cat-litter tray. Unrelated.