University of Glasgow makes 3D models with single-pixel sensors, skips the cameras (video)

Most approaches to capturing 3D models of real-world objects involve multiple cameras that are rarely cheap, and are sometimes tricky to calibrate. The University of Glasgow has developed a method that ditches those cameras altogether. Its system has four single-pixel sensors stitching together a 3D image based on the reflected intensity of light patterns cast by a projector. Reducing the pixel count lowers the cost per sensor to just a few dollars, and extends the sensitivity as far as terahertz wavelengths. Real-world products are still a long way off, but the university sees its invention as useful for cancer detection and other noble pursuits. Us? We’d probably just waste it on creating uncanny facsimiles of ourselves.
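
For the curious, here's a minimal sketch of one way a reconstruction like this could work in code: photometric stereo, assuming each detector's 2D view has already been recovered from the single-pixel measurements. The detector directions and function names below are invented for illustration, not Glasgow's actual pipeline.

```python
import numpy as np

# Hypothetical directions of the four single-pixel detectors relative to the
# object (unit vectors). Real values would come from the rig's geometry.
DETECTOR_DIRS = np.array([
    [ 0.5,  0.5, 0.707],
    [-0.5,  0.5, 0.707],
    [ 0.5, -0.5, 0.707],
    [-0.5, -0.5, 0.707],
])

def estimate_normals(images):
    """Estimate per-pixel surface normals from four shading images.

    images: array of shape (4, H, W), one reconstructed image per detector.
    Returns an (H, W, 3) array of unit normals (classic photometric stereo).
    """
    n_views, h, w = images.shape
    intensities = images.reshape(n_views, -1)          # (4, H*W)
    # Solve DETECTOR_DIRS @ n = intensity for each pixel, least squares.
    g, *_ = np.linalg.lstsq(DETECTOR_DIRS, intensities, rcond=None)
    g = g.T.reshape(h, w, 3)                           # albedo-scaled normals
    norm = np.linalg.norm(g, axis=2, keepdims=True)
    return g / np.clip(norm, 1e-8, None)

# Usage: normals = estimate_normals(np.stack([img_a, img_b, img_c, img_d]))
# Integrating the normal field (e.g. with a Poisson solver) yields the 3D surface.
```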

Via: New Scientist

Source: University of Glasgow

Robots Tell You If You Have Bad Breath or Smelly Feet

If you have a body odor problem, you’ll want to stay away from this disembodied robotic head and dog. The one shaped like a girl’s head smells your breath, and the robot canine smells your feet.

They were made by Kitakyushu National College of Technology and “Crazy Lab”. These robots were designed to make people more aware of their hygiene habits by making blunt remarks about just that. Depending on how bad your breath is, Kaori-chan can say things like “Yuck! You have bad breath!”, “No way! I can’t stand it!” or even the dire “Emergency! There’s an emergency taking place!”

Shuntaro-kun, the foot-smelling robot dog, evaluates the intensity of the odor from your feet on a scale of one to four. Depending on how strong the stench is, the dog will cuddle up to you, bark, growl or pass out.

[via Asahi via Damn Geeky]

Sensoria Socks technology aims to prevent injury before it happens

As wearable computing technology continues to improve, companies are looking for more and more ways to use the data it collects, and the technology at hand, to better their products and ourselves. With Sensoria Socks, Heapsylon is using new technology not only to track fitness like the Nike FuelBand and others, but also to prevent injuries before they happen.

Sensoria Fitness’ new Sensoria Socks are a patent-pending wearable technology that aims to do exactly that: bring a whole new level to our fitness and daily lives, and help athletes deal with and avoid injuries. Products on the market like the Nike FuelBand, Fitbit, Jawbone UP and more all track steps, speed, calories and so on, but imagine a product that can track weight distribution on the foot as you stand, walk and run. Sensoria Socks rely on sensor-equipped textile materials, along with an accompanying anklet band.

There are more than 25 million runners in the US alone, and more than half of them are prone to some sort of running-related injury or pain, and that isn’t even counting other athletes. Instead of dealing with injuries after the fact, we should be looking at ways to prevent them before they happen. This is where Heapsylon comes into play. Sensoria Socks can identify poor running form, then use a custom-designed app to coach the runner out of those tendencies, reducing the risk of injury. Then, like any other fitness app, runners can benchmark and analyze performance, limits, distance and more.
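
To give a sense of how that coaching could work in software, here's a minimal sketch that classifies each footstrike from per-zone pressure readings and nudges the runner accordingly. The three zones, thresholds and messages are hypothetical; Heapsylon hasn't published its sensor layout or algorithm.

```python
# Hypothetical coaching logic driven by textile pressure sensors in the sock.

def classify_footstrike(heel, midfoot, forefoot):
    """Classify a single footstrike from relative pressure readings (0..1)."""
    total = heel + midfoot + forefoot
    if total == 0:
        return "no contact"
    if heel / total > 0.6:
        return "heel strike"          # often flagged as injury-prone form
    if forefoot / total > 0.6:
        return "forefoot strike"
    return "midfoot strike"

def coach(strikes):
    """Summarize a run and suggest a correction, as a companion app might."""
    heel_heavy = sum(s == "heel strike" for s in strikes) / max(len(strikes), 1)
    if heel_heavy > 0.5:
        return "Try shortening your stride to land closer to your midfoot."
    return "Form looks consistent -- keep it up."

# A run where most steps land heavily on the heel triggers a coaching tip.
print(coach([classify_footstrike(0.8, 0.1, 0.1) for _ in range(100)]))
```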

According to Heapsylon and its demo, when an injury or issue does happen, Sensoria can also track patient adherence, progress and much more. The accompanying application syncs the data over Bluetooth to your smartphone, letting users track just about anything and everything with the new technology. As mentioned above, the app will flag poor running technique, but all the other data will be available too.

The anklet tracks activity type and level, heart rate, blood pressure and breathing rate, then relays all of this to the app’s dashboard to show how far and how fast you run, calories burned and more. Even those with good technique can study and learn better habits, reach higher goals, and train harder without strain.

The idea takes wearable computing beyond just fun (read: Google Glass) and really opens the door to products like Sensoria Socks. We’re hearing they’ll be available later this year, and that they’ll help runners and athletes prevent injuries and up their game at the same time.


Sensoria Socks technology aims to prevent injury before it happens is written by Cory Gunther & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

OmniVision OV2724 should lead to super-small, 1080p60 front phone cameras

When most front-facing mobile cameras are shoehorned in between a myriad of sensors, they seldom have the breathing room they’d need for truly noteworthy performance. OmniVision can’t quite defy physics, but its new OV2724 sensor could challenge at least a few of our common assumptions. The OV2722 successor stuffs 1080p imaging into the company’s smallest chip of its kind, at 5mm by 5mm by 3.5mm — ideally leading to full HD front cameras in tinier devices. Full-size devices still stand to benefit, though. The OV2724 has the headroom to record at an extra-smooth 60 frames per second, and individual frames should be more eye-catching thanks to the higher dynamic range and better low-light shooting. The only frustration left is having to wait for mass production of the new sensor in the summer quarter — we won’t see any phones or tablets reaping the rewards for at least a few months.
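
Some quick back-of-the-envelope math shows why 1080p60 is no small ask for a chip this tiny. The 10-bit raw output below is our assumption, purely to put a rough number on the readout bandwidth.

```python
# Back-of-envelope readout bandwidth for a 1080p60 sensor. The 10-bit raw
# bit depth is an assumption for illustration, not a quoted OV2724 spec.

width, height = 1920, 1080
fps = 60
bits_per_pixel = 10  # assumed raw bit depth

pixels_per_second = width * height * fps
bandwidth_gbps = pixels_per_second * bits_per_pixel / 1e9

print(f"{pixels_per_second / 1e6:.1f} Mpixel/s")  # ~124.4 Mpixel/s
print(f"~{bandwidth_gbps:.2f} Gbit/s raw")        # ~1.24 Gbit/s
```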

Source: OmniVision

Smart Skin Could Give Robots a Sense of Touch

Like most of you, I’m patiently waiting for the day when I can buy a robot that will take care of all the irritating things around the house that I don’t want to deal with. I want a robot to wash the dishes, bathe the dog, mow the lawn, and take out the trash. While this breakthrough might not solve that problem for me just yet, a group of scientists from the Georgia Institute of Technology has invented something dubbed “smart skin” that could give robots a sense of touch.

The researchers working on the smart skin used bundles of vertical zinc oxide nanowires along with an array of about 8,000 transistors. Each of those individual transistors is able to independently produce an electronic signal when subjected to mechanical strain. The researchers say that these touch-sensitive transistors, which they call taxels, have sensitivity comparable to that of a human finger. They say that the artificial skin can feel activity on its surface, and that the sensation could be translated into control signals for robots in the future.

Having a sense of touch is important because it will allow a robot to know when something is in its hand and how hard it can grip that object. It wouldn’t do to have a robot breaking all your dishes, after all.
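
As a rough illustration of how taxel readings might close that loop, here's a toy proportional grip controller. The array size, thresholds and gain are ours for the sketch, not Georgia Tech's actual interface.

```python
import numpy as np

# Toy grip-control loop driven by a taxel (touch-pixel) array, to show how
# pressure feedback could become a control signal.

TARGET_PRESSURE = 0.3   # normalized force we want across the contact patch
GAIN = 0.1              # proportional gain for adjusting grip

def grip_step(taxels, grip_force):
    """One control step: read the taxel array, nudge the grip force."""
    contact = taxels > 0.05                     # which taxels feel anything
    if not contact.any():
        return grip_force + GAIN                # nothing in hand: close a bit
    mean_pressure = taxels[contact].mean()
    error = TARGET_PRESSURE - mean_pressure
    return max(0.0, grip_force + GAIN * error)  # squeeze harder or ease off

# Simulated 90 x 90 taxel array, roughly the ~8,000 sensing points above.
taxels = np.clip(np.random.normal(0.25, 0.05, size=(90, 90)), 0, 1)
print(grip_step(taxels, grip_force=0.5))
```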

[via BBC News]

After hours at NAB: A closer look at Red’s Dragon upgrade operation (video)

Red’s clean room on the NAB show floor is typically no place for camera crews, but after adding a bit of protection, Red President Jarred Land gave us the green light to step inside the company’s sacred space for a closer look at operation Dragon upgrade. (The $8,500+ sensor swap gives Epic cams the gift of 6K shooting.) The view from behind the glass wall separating spectators from technicians isn’t significantly different, but we were able to get quite a bit more insight into how the process goes down, including stops at each of the workstations.

The temporary assembly center that Red built at the Las Vegas Convention Center is a miniature version of the company’s primary facility in Irvine, California — while Dragon upgrades are underway in Las Vegas, a structure that’s estimated to be 20 times the size of the one here in Nevada is processing the updates remotely, though admittedly with far less fanfare. Join us past the break for an exclusive look at the process, live from Red’s booth at NAB.

Red Motion mount eliminates shutter judder, we go eyes-on (video)

Red announced its new Motion lens mount prior to opening its booth at NAB, and now we’ve had a chance to see this guy in action. The mount, which is compatible with the company’s Epic and Scarlet cameras, is meant to fix the CMOS rolling shutter problem. The Motion includes a liquid crystal shutter that’s placed in front of the main sensor and is timed to engage when the sensor is fully open. It also adds up to 8x neutral density, which can be enabled electronically through the camera UI with 1/100-stop precision. In a demo at Red’s NAB booth, the camera captured every flash of a strobe — without the new mount, some flashes would likely slip through the cracks. It’s set to ship for $4,500 in the fall (or possibly this summer), and will be available with Canon EF or PL mounts. You can see it in action today in the hands-on demo after the break.
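
If the "8x neutral density" figure has you reaching for a calculator, the conversion to stops is standard math: each stop halves the light, so stops = log2(filter factor). A quick sketch:

```python
import math

# Converting a neutral-density filter factor to stops of light loss:
# each stop halves the light, so stops = log2(factor). 8x works out to 3 stops.

def nd_factor_to_stops(factor):
    return math.log2(factor)

for factor in (2, 4, 8):
    print(f"{factor}x ND = {nd_factor_to_stops(factor):.0f} stops")
```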

Red performs Dragon sensor upgrades right on the NAB show floor (video)

Well, this is a trade-show first. Red Digital Cinema has made a name for itself by pushing the limits when it comes to motion picture camera technology, but the company’s “get it done” approach is even evident in the layout of its trade show booth this year. The team has constructed a full sterile lab here at NAB, where attendees can watch technicians upgrade Epic cameras with the new Dragon 6K sensor, which enables 6K shooting at 6144 x 3160 pixels and up to 100 frames per second, offering three additional stops of dynamic range over the Epic M-X. A wall of glass separates fans from the engineers in the clearly visible clean room, who are diligently going about their duties despite the constant gaggle of excited customers just a few feet away. Company spokesman Ted Schilowitz gave us a quick tour of the facility, where the $8,500+ sensor upgrades are now underway. Geek out with us in the video just past the break.
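
For scale, here's some quick arithmetic on those numbers. The 16-bit uncompressed figure is our assumption purely for illustration; Red actually records compressed REDCODE RAW, so real-world data rates are far lower.

```python
# Quick arithmetic on the quoted Dragon specs: 6144 x 3160 at up to 100 fps.
# 16-bit uncompressed readout is assumed here just to bound the numbers.

width, height, fps = 6144, 3160, 100
bits_per_pixel = 16  # assumed uncompressed bit depth

megapixels = width * height / 1e6
raw_gbps = width * height * fps * bits_per_pixel / 1e9

print(f"{megapixels:.1f} MP per frame")                   # ~19.4 MP
print(f"~{raw_gbps:.0f} Gbit/s uncompressed at 100 fps")  # ~31 Gbit/s
```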

Red Epic Dragon sensor updates start tomorrow for $8,500

Red has announced that Dragon sensor updates will start tomorrow for Epic-M and Epic-X owners and, interestingly, is letting owners (and the public) see the operation for themselves at its NAB booth. The new sensor will bring 6K resolution, 120 fps at 5K and 15+ stops of dynamic range in a slightly larger format, according to Red. Early adopters will be able to pre-order now for $8,500, while Epic owners who wait until Thursday or later will be able to grab the update for $9,500. Filmmakers hoping for a new Epic-M with the Dragon instead of the Mysterium-X sensor will be able to pre-order tomorrow for $29,000 or so. Meanwhile, there’s good news for those with the more budget-minded Scarlet — they’ll be able to upgrade to the Epic directly or get a 6K Dragon sensor and ASICs, with pricing details coming tomorrow and pre-orders launching on Thursday. Red may have a tough row to hoe with recent NAB news from the likes of Blackmagic Design and Vision Systems, but how many companies will actually let you watch your camera get operated on? Check the source for more.

Source: Red

Panasonic explains how its color splitter sensor works in a vividly detailed video

You’d be forgiven if you weren’t entirely on the same page with Panasonic regarding its micro color splitter sensor: it’s a big break from the traditional Bayer filter approach on digital cameras, and the deluge of text doesn’t do much to simplify the concept. Much to our relief, DigInfo TV has grilled Panasonic in a video that provides a more easily digestible (if still deep) explanation. As the technology’s creator says, it’s all about the math. Letting in so much light through the splitters requires processing the light as four mixed colors, and that processing requires simulating the light’s behavior in 3D. Panasonic’s new method (Babinet-BPM) makes that feasible by finishing calculations 325 times faster than usual, all while chewing up just a sixteenth of the memory. The company isn’t much closer to having production examples, but it’s clarifying that future development will be specialized — it wants to fine-tune the splitter behavior for everything from smartphone cameras through to security systems. Catch the full outline after the break.
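
To make the "four mixed colors" idea a little more concrete, here's a toy recombination of sum-and-difference samples into RGB. The specific mixtures (a white-plus-red / white-minus-red pair and a matching blue pair) are our illustrative assumption about what the splitter delivers to neighboring pixels, not Panasonic's published math.

```python
import numpy as np

# Illustrative recovery of RGB from "mixed color" samples: adjacent pixels are
# assumed to record sums and differences of white light and one primary.

def recover_rgb(w_plus_r, w_minus_r, w_plus_b, w_minus_b):
    red = (w_plus_r - w_minus_r) / 2.0
    blue = (w_plus_b - w_minus_b) / 2.0
    white = (w_plus_r + w_minus_r + w_plus_b + w_minus_b) / 4.0
    green = white - red - blue      # treating white as R + G + B
    return np.stack([red, green, blue], axis=-1)

# Four neighboring samples from a hypothetical 2x2 block:
print(recover_rgb(0.9, 0.5, 0.7, 0.5))  # -> roughly [0.2, 0.35, 0.1]
```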

Via: GSM Arena

Source: DigInfo TV