Graphene camera sensor 1,000 times more sensitive to light

It seems we can never be content with how sensitive our camera sensors are to light. Scientists in Singapore are working on a new camera sensor technology made from graphene that will supposedly make future cameras 1,000 times more sensitive to light while using 10 times less energy than current camera sensors.


In turn, this will produce way better low-light photos, to the point where we hopefully won’t even have to worry about cranking up the ISO. Plus, the scientists working on the new technology say that these new sensors will cost a fifth as much as current camera sensors, meaning that we could see camera prices drop significantly in the future.

The sensor works by trapping light-generated electrons far longer than current sensors can, despite being made from just a single sheet of graphene. Thanks to the wide spectrum of light it can capture, the sensor could be used in a number of different cameras, including infrared, traffic, and satellite imaging cameras.


Plus, Assistant Professor Wang Qijie from Nanyang Technological University says that the research team is keeping “current manufacturing practices in mind,” which means the camera industry “can easily replace the current base material of photo sensors with the new nano-structured graphene material.”

Obviously, it’s too early to tell when we’ll be seeing these new sensors in consumer cameras; they’ll most likely hit enterprise and government applications first, in security cameras, traffic cameras, and the like. Of course, graphene is already set to be used in new flexible OLED screens, so the technology is definitely on its way to the mainstream.

VIA: CNET

SOURCE: Science Daily


Graphene camera sensor 1,000 times more sensitive to light is written by Craig Lloyd & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

SmartThings opens up its home automation platform to developers


To say SmartThings‘ Kickstarter campaign was a success is an understatement: the Internet of Things outfit, which offers a clever array of home automation sensors, routers and smartphone apps, raised more than $1.2 million (over four times the company’s original goal), nabbed over 6,000 backers and quickly sold out of its first batch of kits. Naturally, the company isn’t stopping there — it’s making good on its goal of providing an open-source platform for developers, as it announced the availability of its Developer and Inventor Toolkit. Now, interested parties can create and develop their own SmartThings, and can collaborate with like-minded folks to come up with even more ways to take advantage of the platform. Additionally, it supports several types of wireless standards, such as WiFi, Bluetooth, ZigBee and Z-Wave, giving it interoperability with various home automation systems.

As a refresher, SmartThings connects a large number of household items — appliances, automatic door locks, thermostats, humidity sensors, presence sensors, power outlet switches, IR remotes, secret bookcase doors and plenty more — to a central router which then can be controlled through a smartphone app. Thanks to the openness of the platform, the number of use-case scenarios is rather significant, which certainly makes it more appealing to users. If you’re interested in learning how to get started, head below to the press release and go here to get the whole enchilada of information.
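We won’t pretend to document the Developer and Inventor Toolkit itself here, but purely as an illustration of the kind of rule the platform is built around (a device event triggering an action on another device, coordinated by the hub), here’s a rough TypeScript sketch. The interfaces below are hypothetical stand-ins, not SmartThings APIs:

```typescript
// Illustrative only: these interfaces are hypothetical stand-ins,
// not the actual SmartThings Developer and Inventor Toolkit API.
interface PresenceSensor {
  onArrival(handler: () => void): void;
}

interface DoorLock {
  unlock(): Promise<void>;
}

interface Notifier {
  push(message: string): Promise<void>;
}

// A "someone arrived home, so unlock the front door and tell my phone" rule:
// the kind of device-event-to-device-action automation the hub coordinates.
function arriveAndUnlock(sensor: PresenceSensor, lock: DoorLock, notifier: Notifier): void {
  sensor.onArrival(async () => {
    await lock.unlock();
    await notifier.push("Front door unlocked: presence sensor detected an arrival");
  });
}
```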



Google Maps-driven Map Dive 3D-tracking hands-on

This week the folks at the development studio known as Instrument have brought a virtual reality demonstration to Google I/O 2013, complete with a multi-display drop from the upper atmosphere down toward the earth in freefall. The demonstration consists of seven 1080p displays, each driven by its own Ubuntu PC running a full-screen instance of Chrome 25. A motion tracker follows the user, their arms, and the angle at which they’re standing – or leaning and falling, as it were.


Instrument handles user input and motion tracking with a custom C++ app built with OpenNI and an ASUS Xtion Pro 3D motion-sensing camera. As the tracker reads the angle of the player’s torso and the position of each arm, the avatar on the display array mirrors those movements as it falls.
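The actual pose-to-avatar mapping lives inside Instrument’s C++ app, which we haven’t seen, but the general idea can be sketched with THREE.js vector math. The Joints shape, units, and tuning constants below are our assumptions about what a skeleton tracker might hand over, not anything from Instrument’s code:

```typescript
import * as THREE from "three";

// Hypothetical shape of the data a skeleton tracker (such as the OpenNI-based
// C++ app) might stream to the renderer; the field names are assumptions.
interface Joints {
  torsoTop: THREE.Vector3;    // roughly the neck
  torsoBottom: THREE.Vector3; // roughly the waist
  leftHand: THREE.Vector3;
  rightHand: THREE.Vector3;
}

// Tilt the avatar forward or back based on how far the torso leans from
// vertical, and bank it left or right based on which hand is held higher.
function applyPoseToAvatar(avatar: THREE.Object3D, joints: Joints): void {
  const torso = joints.torsoTop.clone().sub(joints.torsoBottom).normalize();

  // Pitch: angle between the torso direction and straight up (the Y axis).
  avatar.rotation.x = torso.angleTo(new THREE.Vector3(0, 1, 0));

  // Roll: raising one hand above the other banks the dive in that direction.
  const handDelta = joints.rightHand.y - joints.leftHand.y; // millimetres, per our assumption
  avatar.rotation.z = Math.max(-0.8, Math.min(0.8, handDelta * 0.005));
}
```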

The 3D game content is rendered with WebGL using THREE.js, with the WebGL layer drawn over a totally transparent background. That transparency lets the layer underneath – a map generated by Google Maps – show through.
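That layering trick can be reproduced with the public THREE.js and Google Maps JavaScript APIs along roughly these lines; the element IDs, coordinates, and empty scene are placeholders rather than anything pulled from Map Dive itself:

```typescript
import * as THREE from "three";

declare const google: any; // assumes the Google Maps JavaScript API is loaded on the page

// Live map layer underneath; "map" and "overlay" are placeholder element IDs
// for a <div> and a <canvas> stacked on top of each other.
const map = new google.maps.Map(document.getElementById("map") as HTMLElement, {
  center: { lat: 37.7749, lng: -122.4194 }, // placeholder coordinates
  zoom: 18,
  mapTypeId: google.maps.MapTypeId.SATELLITE,
});

// WebGL layer on top: alpha plus a fully transparent clear color lets the
// map show through wherever the 3D scene draws nothing.
const renderer = new THREE.WebGLRenderer({
  canvas: document.getElementById("overlay") as HTMLCanvasElement,
  alpha: true,
});
renderer.setClearColor(0x000000, 0);
renderer.setSize(window.innerWidth, window.innerHeight);

const scene = new THREE.Scene(); // game objects would be added here
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;

function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();
```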


What the user sees below – the earth they’re plummeting toward – is a completely live HTML Google Map instance. It’s accurate – meaning you could potentially be diving toward your house, a national landmark, or perhaps somewhere that’d be useful for real-world training.


In addition to the setup being live and playable here at Google I/O 2013, Instrument has created a Dive editor. With it, a level designer can build directly into the control node’s administrative console, with each change reflected instantly – live in the scene.

The editor’s user interface is itself a Google Map, where draggable markers act as game objects. With this interface, developers and savvy users can use geocoding to center the map view on locations of their choice – anywhere Google Maps can see. Think of the possibilities!
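A bare-bones take on that editor pattern, draggable markers as game objects plus geocoding to recenter the view, looks something like this with the Google Maps JavaScript API. The element ID, labels, and address are placeholders, and the real editor presumably pushes changes to the control node rather than just logging them:

```typescript
declare const google: any; // assumes the Google Maps JavaScript API is loaded on the page

// The editor surface is just a map; "editor" is a placeholder element ID.
const editorMap = new google.maps.Map(document.getElementById("editor") as HTMLElement, {
  center: { lat: 40.7484, lng: -73.9857 }, // placeholder start location
  zoom: 17,
});

// Each draggable marker stands in for a game object; dragging it would
// reposition that object in the live scene (here we just log the move).
function addGameObject(position: { lat: number; lng: number }, label: string): void {
  const marker = new google.maps.Marker({ position, map: editorMap, draggable: true, title: label });
  marker.addListener("dragend", () => {
    const p = marker.getPosition();
    console.log(`${label} moved to`, p.lat(), p.lng());
  });
}

// Geocode an address to recenter the editor anywhere Google Maps can see.
function centerOn(address: string): void {
  new google.maps.Geocoder().geocode({ address }, (results: any[], status: string) => {
    if (status === "OK" && results.length > 0) {
      editorMap.setCenter(results[0].geometry.location);
    }
  });
}

addGameObject({ lat: 40.7485, lng: -73.9855 }, "ring-01"); // hypothetical game object
centerOn("Empire State Building, New York");
```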


Google Maps-driven Map Dive 3D-tracking hands-on is written by Chris Burns & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Hidden sensors in Google Glass could enable AR apps


One of the bigger digs against Glass so far has been its rather limited feature set. In particular, there seemed to be no way to build full augmented reality applications for the wearable. (And it’s not like Google has exactly been forthcoming about many of its specs.) But all hope is not lost. Programmer Lance Nanek was digging around in debug mode and managed to push an Android app to the head-mounted display that spit out a list of available sensors. Looks like Glass does in fact have all of the necessary components for full-fledged AR — the official API just hasn’t exposed those capabilities yet. Currently, third-party Glass apps are limited to updating your location once every 10 minutes, but with a little bit of hacking, we’re sure that limitation could be overcome and the full suite of orientation sensors exposed to developers. Perhaps it won’t be long before someone ports Yelp Monocle to Glass. Of course, it’s probably only a matter of time before Google opens those features up to devs. For the full list of sensors and location providers head on after the break.



Via: Karthik’s Geek Center

Source: NeatoCode Techniques

GoalControl to provide goal-line tech during 2014 World Cup

The 2014 World Cup will take place in Brazil, and folks are already beginning to prepare for the tournament. As qualifying goes on as we speak, organizers are working on a new system that will electronically detect when a goal has been scored, and it was just announced today that GoalControl will be providing the necessary technology.


This isn’t the first time that we’ve discussed goal-line technology for the 2014 World Cup; FIFA initially announced it back in February. However, more details were released today on how exactly it will all work. Each stadium will have 14 cameras spread around it, with seven trained on each of the two goals in order to detect when a goal is scored.

All objects within the cameras’ field of view are tracked, but the players and referees are cleverly filtered out, leaving just the ball. The ball’s position is continuously and automatically captured in three dimensions (X-, Y- and Z-coordinates) whenever it gets close to the goal, so its exact location can be judged accurately.

If the ball crosses the goal line, the system sends an encrypted signal to a watch worn by the referees, and the signal arrives less than a second after the ball has passed the line. Plus, a virtual 3D image of any portion of the playing field can be shown on the big screen from any camera angle, thanks to those 14 strategically placed cameras. It also doesn’t hurt to know that the 2014 World Cup will be broadcast in 4K.
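GoalControl hasn’t published its processing pipeline, but the final test is simple geometry: under the laws of the game, the whole ball must cross the line. Assuming the system reports the tracked centre of the ball in metres, that check might look like the sketch below (the coordinate convention and radius constant are our assumptions, not GoalControl’s):

```typescript
// Illustrative geometric check only, not GoalControl's actual algorithm.
// Coordinates in metres, with y = 0 on the goal line and positive y pointing
// into the goal; a size-5 ball is roughly 0.11 m in radius.
interface BallPosition {
  x: number; // along the goal line
  y: number; // perpendicular to the goal line (positive = inside the goal)
  z: number; // height above the pitch
}

const BALL_RADIUS_M = 0.11;

// A goal stands only when the whole ball has crossed the line, i.e. its
// tracked centre is at least one radius beyond it (the "between the posts,
// under the crossbar" checks are omitted for brevity).
function wholeBallOverLine(ball: BallPosition): boolean {
  return ball.y >= BALL_RADIUS_M;
}

// With positions captured continuously, the system only needs the first
// sample where the condition flips to true before signalling the watch.
function firstGoalFrame(track: BallPosition[]): number | null {
  const i = track.findIndex(wholeBallOverLine);
  return i === -1 ? null : i;
}
```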

[via BBC News]

Image via Flickr


GoalControl to provide goal-line tech during 2014 World Cup is written by Craig Lloyd & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

DUO 3D sensor shows up on Kickstarter, claims that “anyone can build” it

We’ve been hearing a lot about motion tracking as of late, the Leap Motion being the most popular device making its way to the public. However, a new contender looking to gain some ground has popped up on Kickstarter. The DUO 3D sensor claims to be the “world’s first and only DIY 3D sensing solution.”


The DUO 3D sensor is open source, meaning you can do just about anything with it that you please. It comes with open hardware plans, and you can get it either as a kit that you assemble yourself or as a fully assembled device. The drivers and SDK are also open source, so there’s quite a bit you can do with it right off the bat.

The company even claims that the sensor is practically plug-and-play: you just plug it in, download the necessary software, and start playing around with it “within minutes.” Judging from the video, the DUO looks extremely accurate, tracking fingers through every slight move. From the looks of it so far, it’ll definitely give the Leap Motion a run for its money.

The “DIY” portion comes into play with the open source hardware blueprints that you can purchase (or “back” in this case). The hardware plans will provide you with everything you need, but it’ll be up to you to get the parts and assemble it. However, you can modify the plans however you wish and truly make it your own.

Pledging $10 gets you the SDK, while $20 gets you the hardware plans as well as the SDK. $40 adds a custom-molded case for your 3D sensor on top of all that. $110 will score you all the parts you need to assemble the sensor yourself, while $140 gets you a fully assembled unit.


DUO 3D sensor shows up on Kickstarter, claims that “anyone can build” it is written by Craig Lloyd & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Insert Coin semifinalist: Observos environmental monitoring sensors hands-on (video)


We first heard about Observos a month ago when it became a participant in our Insert Coin semifinals, but it wasn’t until the Hexagonal Research product showed up at Engadget Expand that we were able to see working models of its environmentally aware sensors. Each sensor, shaped like a hexagon and about twice as thick as a hockey puck, can monitor the temperature, humidity and barometric pressure of virtually any object you can think of. On indoor sensors, a small screen on top displays the readings for the item you’re monitoring, but there’s no need to keep a close eye on it — the information can be relayed to a web interface by communicating wirelessly with a base station hooked into your router. (Outdoor sensors are more rugged to handle the weather and don’t have a display screen.)

You can program the setup to alert you via email or text if something is awry, regardless of where you are, and you’ll be able to monitor everything directly from your smartphone; in the future, Observos hopes to expand into a control network that would give you the ability to make changes to environmental conditions remotely. In other words, if your plants get low on moisture, you’d be able to program a flow valve to open automatically.
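Observos hasn’t detailed its software side, so purely to illustrate the alert-then-act pattern described above, here’s a hedged TypeScript sketch; the interfaces, field names, and threshold are hypothetical rather than the Observos API:

```typescript
// Hypothetical interfaces, not the Observos API: a sketch of the
// "alert when a reading is awry, then (eventually) act on it" pattern.
interface Reading {
  sensorId: string;
  temperatureC: number;
  humidityPct: number;
  pressureHPa: number;
}

interface Alerter {
  send(message: string): Promise<void>; // email or text, relayed via the base station
}

interface FlowValve {
  open(): Promise<void>; // the future "control network" half of the idea
}

const MIN_HUMIDITY_PCT = 35; // assumed threshold for the dry-plant example

async function checkReading(reading: Reading, alerter: Alerter, valve?: FlowValve): Promise<void> {
  if (reading.humidityPct < MIN_HUMIDITY_PCT) {
    await alerter.send(
      `Sensor ${reading.sensorId}: humidity ${reading.humidityPct}% is below ${MIN_HUMIDITY_PCT}%`
    );
    // Remote actuation isn't available yet; if a valve were wired up, open it.
    if (valve) {
      await valve.open();
    }
  }
}
```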

While the company’s Expand booth featured only six sensors, up to 40 could be used simultaneously. The Observos team plans to launch its Kickstarter campaign this coming Monday, and backers can grab one indoor sensor and base station together for $175, with the price going up as more sensors are added; outdoor sensors will be a bit more spendy as well. A hacker’s board will also be available at $75 for anyone who just wants to tinker around with the goods. Check out our video and full image gallery below for another look.



Electronic Sensor Tattoos Can Now Be Printed Directly Onto Human Skin

Thanks to the same people that brought us the stick-on electric tattoo and stretchable battery, we’re now looking at a future of electronic sensors that can be printed directly onto human skin.

Apple Patent Shows Squeezable iDevices and Vanishing Keyboards


We may soon see a day when your iPhone and iPad physically deform at your touch and your MacBook has a disappearing keyboard, according to a patent Apple has been granted.

How Your Smartphone Will Get Lytro-Like Superpowers

As neat as they are, the Lytro camera’s re-focusing tricks aren’t going to convince most of us to replace our highly pocketable cameraphones. So a California company called DigitalOptics has found a way to give us the best of both worlds with a new ultra-thin sensor that promises Lytro-like tricks.