Canopy Lets You Gaze at the Bright, Morning Sky – While You’re on the Subway

Whenever I want to spend some time alone to mull things over, I trek off to a park I like, find a bench, and just stare off into space. There’s something relaxing and comforting about a mountain view that puts my mind (and heart) at ease.

Unfortunately, not all of us have the time to take regular trips out of the city (or even to a city park) just to calm our minds. But thanks to design students Matt Batchelor, Amitra Kulkami, and Emma Laurin, your commutes are about to become more relaxing.

Canopy Concept

They came up with the Canopy concept, which uses a curved display to show an animation of the sky or of surface landmarks as the train or subway car you’re riding in passes them. Think of it as a digital sunroof of sorts. Though the prototype appears to use some sort of projection system, the concept calls for large, flexible electrophoretic e-paper displays.

Canopy Concept1

Mass transit operators could also monetize the added comfort by adding unobtrusive digital ads that interested commuters can check out by scanning the ad’s QR code.

It’s definitely a bright idea that will give people something to look forward to during their long, dark underground commutes.

[via PSFK via Dvice]


eSSage Gives You a Massage, Without the Masseuse

Nothing beats getting a massage after a long, hard day at work. But if you’re too beat to even drive to your favorite spa or call for home service, then there’s another alternative (provided it gets manufactured, that is): the eSSage.

eSSage

It might not be the first time someone has thought about making a self-massaging suit so that you can, well, give yourself a massage, but it’s one of the concepts that’s pretty well thought-out. It’s controlled remotely via an Android or iOS app, where you can point your stylus (or finger) at the nodes that you want worked on.

You can also have someone else “do” the massage for you by passing on the app and stylus.

eSSage1

The eSSage is a concept design by André Cofield. What do you think?

[via Yanko Design]


MIRAGE Substitutional Reality System: One Step Closer to Total Recall

This “substitutional reality system” was developed by the Laboratory for Adaptive Intelligence at Japan’s RIKEN Brain Science Institute. It was created to fuse performance art with the perceived reality of its wearer. While it doesn’t produce the sort of directly implanted memories seen in Total Recall, the visual and audio portions are immersive enough to trick your mind anyway.

mirage substitional augmented reality japan

The headgear is supposed to seamlessly meld live video with recordings from the past, as well as performances from dancers. Sounds groovy. It combines fiction and reality, making them indistinguishable from each other. It’s definitely an interesting conundrum, not being able to tell whether what you’re seeing is in fact real, just a recording, or a hybrid of both.

The MIRAGE is a unique take on augmented reality, though I’m sure under the right circumstances, it could be used to brainwash people. You still want to try it out?

mirage substitional augmented reality japan real

[via designboom]


Kuratas Mech: Real or Fake, It’s Still Awesome

I’m going to start by saying that I’m taking this with a huge grain of salt. This robot looks very realistic, yet has a somewhat cheesy, viral-marketing feel to it at the same time. Watch the video for yourself and see what you think. The video is supposed to be a how-to from a company called Suidobashi Heavy Industry.

kuratas robot

The video goes over how to ride a robot called the Kuratas. The 13-foot-tall wheeled robot is clearly a nod to the MechWarrior series of video games and any number of Japanese animated shows. In the video, you can see a petite Japanese woman climbing into the chest cockpit of the robot and going over the controls, including a remote smartphone-operated mode.

The controls look easy to use and seem realistic. You’ll note the twin multi-barreled cannons on the left arm. How scary would it be as a soldier on the battlefield to have one of these massive robots come walking up? The video claims the top speed of the bot, which has a torso and two arms but rolls on wheels, is 10 km/h. The bot uses a diesel engine and can be driven in high or low modes. The missile launcher appears to be packed with water bottles and will “from time to time” hit its target.

If this is just a fake viral video, it’s extremely well done. If it’s real, we could soon have mechs walking the streets of Tokyo.

[via Daily Mail]


NASA Designing a New Spacesuit, Astronauts to Look Like Buzz Lightyear?

NASA is trying to bring its equipment into the 21st century, and that includes updating its spacesuits. Scientists and engineers at NASA have been working to develop a new prototype called the Z-1, the spacesuit being developed to replace the twenty-year-old model that was first put into service in 1992. Is it just me, or does this look like Buzz Lightyear’s suit?

new nasa spacesuit
Right now, it’s undergoing heavy testing. The Z-1 prototype spacesuit and portable life support system has its own airlock: the astronaut crawls into the suit through an airtight hatch in the back, near the top, which can latch onto a docking terminal or another vehicle such as a smaller spacecraft or rover. This design opens up possibilities the previous suits didn’t have. It’s also more flexible and cuts down on the amount of oxygen an astronaut uses while in the suit.

new nasa spacesuit z 1

I’m not sure why they’re bothering, since we don’t seem to want to send humans anywhere in space other than space stations, but hey, at least we’ll have new suits if we change our minds. You can find a more detailed image of the Z-1 spacesuit over at Popular Mechanics.

[via Gizmag via Geek]


Google Glass tech’s dark side blasted in science fiction short

Google Glass is definitely something to get excited about, but do we want a world where everything is constantly connected through devices like it? A few years ago the notion would have seemed absurd, but now a future where everyone uses devices like Google Glass is certainly a possibility. A new short film named Sight by Daniel Lazo and Eran May-raz examines such a future, and even though there are some exciting possibilities, there are also some pretty terrifying ones.


The film focuses on special contact lenses called “Sight,” which allow users to see the world around them through the constantly connected lenses. Things start out innocently enough, with our main character using his special contacts to turn his lunch preparation into a game and pick out his outfit for his upcoming date without ever leaving the couch. He then meets up with a girl named Daphne for their date, and secretly uses a dating app to help him win her affection. We’ll leave the rest for you to find out, because hey, we wouldn’t want to spoil the fantastic ending.

Being constantly connected can have its benefits, but as the film shows, things don’t always turn out for the best when we end up relying too much on technology. Sight is equal parts beautiful and creepy, but the film’s creators tell VentureBeat that they didn’t intend for there to be a connection between Sight and Google Glass. In a strange series of coincidences, they say they came up with the idea for a film centered around augmented reality, and a few days after production began on Sight, Google released the first video for Google Glass. Still, even though there isn’t any intended connection between Sight and the current Google Glass craze, the film serves as a possible snapshot of a future that may not be that far off. Enjoy!


Google Glass tech’s dark side blasted in science fiction short is written by Eric Abent & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


System Automatically Recognizes Baked Goods Without Labels or RFID

In the not-too-distant future, technology might let you check out your purchases without any need to scan tags, enter prices, or even read RFID tags. Thanks to visual recognition technology, items being purchased could be identified automatically just by the way they look.

bakery scanner

A trial is underway at a bakery in Tokyo using Brain Corporation’s object recognition technology to automatically ring up items for purchase just by setting them onto a tray. A camera grabs an image of the items and checks a database to match the baked goods with their prices. It handles subtle variants of the same item surprisingly well – like two different loaves of bread. It’s a cool idea, and seems to work quite well in this particular application.
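To make the idea concrete, here’s a toy sketch of how this kind of match-against-a-database checkout could work: extract a feature vector from each item’s image, then pick the nearest known product and its price. The products, feature values, and prices below are made up for illustration – the article doesn’t describe Brain Corporation’s actual features or algorithm.

```python
import math

# Hypothetical product database: each entry maps a baked good to a small
# feature vector (say, brightness, crust darkness, aspect ratio) and a price.
# All values here are invented for the sketch.
PRODUCTS = {
    "baguette":  {"features": [0.80, 0.10, 4.5], "price": 250},
    "croissant": {"features": [0.70, 0.30, 1.6], "price": 180},
    "melon pan": {"features": [0.85, 0.25, 1.0], "price": 200},
}

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(features):
    """Return the (name, price) of the closest product in the database."""
    name = min(PRODUCTS, key=lambda p: euclidean(features, PRODUCTS[p]["features"]))
    return name, PRODUCTS[name]["price"]

def ring_up(tray):
    """Identify every item detected on the tray and total the prices."""
    items = [identify(f) for f in tray]
    return items, sum(price for _, price in items)

# A tray with two items whose measured features are close to known products:
items, total = ring_up([[0.79, 0.12, 4.4], [0.69, 0.31, 1.5]])
print(items, total)  # → [('baguette', 250), ('croissant', 180)] 430
```

A real system would get its feature vectors from a trained image model rather than hand-picked numbers, but the lookup step – nearest match wins – is the same shape.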

While I like the general concept, I could see problems with the system if you start dealing with multiple items that look the same on the outside but have different insides (e.g. different memory configurations on an iPhone or, in this case, a cherry croissant vs. a chocolate one). Still, for items which can be identified by color, size and shape, it’s definitely got potential.

[via DigInfo TV]


DIY Google Glasses Provide Translation via Subtitles

The first time we featured programmer Will Powell, we learned how he was able to make a crude version of Google’s Project Glass augmented reality glasses. It turns out that Powell has made another version of his hack that is capable of translating spoken language and displaying the translation in subtitles.

project glass translator will powell

As with his earlier project, Powell used a pair of Vuzix STAR 1200 glasses as the base of the hack. If I understood what Powell said on his blog correctly, a Jawbone Bluetooth microphone picks up the audio and sends it to a mobile device, which then processes the words using a translation API made by Microsoft. The translation is then passed on to a Raspberry Pi, which sends the translated text to the Vuzix display and a transcript of the conversation to a TV. Below is a shot of the subtitles being displayed on the glasses’ monitor:

project glass translator will powell 2

And here’s a shot of the transcript on the TV:

project glass translator will powell 3

Finally, here’s a demo of the hack in action. Note that there is a significant delay in the translation, which according to Powell occurs mainly when the audio goes through the translation API.

The sheer number of gadgets needed, plus the fact that the Raspberry Pi is physically connected to the glasses via an S-video connector, means that this is not a portable system, but I’m still amazed at what one man armed with off-the-shelf parts can do. Besides, all devices – including the ones Powell needs – get smaller and more powerful over time. The day when we’ll be able to reenact Casa de mi Padre is closer than we think.
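The data flow Powell describes – audio in, translation out, subtitles to the glasses and a running transcript to the TV – can be sketched as a simple pipeline. The speech-recognition and translation stages below are stand-in stubs with a toy phrasebook, not the Microsoft translation API the actual hack calls:

```python
# Minimal sketch of the translation-glasses pipeline. Each stage is a stub;
# a real build would replace transcribe() with speech recognition and
# translate() with a call to a cloud translation service.

PHRASEBOOK = {"hola, ¿cómo estás?": "hello, how are you?"}  # toy dictionary

def transcribe(audio):
    """Stand-in for speech recognition: pretend the audio is already text."""
    return audio.lower()

def translate(text):
    """Stand-in for the translation API call."""
    return PHRASEBOOK.get(text, text)

def run_pipeline(audio_chunks):
    subtitles = []   # what the glasses' display would show, line by line
    transcript = []  # what the TV would show: (original, translation) pairs
    for chunk in audio_chunks:
        spoken = transcribe(chunk)
        translated = translate(spoken)
        subtitles.append(translated)
        transcript.append((spoken, translated))
    return subtitles, transcript

subs, log = run_pipeline(["Hola, ¿cómo estás?"])
print(subs)  # → ['hello, how are you?']
```

The delay Powell mentions would live almost entirely inside the `translate()` stage, since that’s the round trip to a remote API.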

[Will Powell via Ubergizmo]


Aerographite is the World’s Lightest Material

Scientists from Europe have created what they claim is the world’s lightest material. The material is called Aerographite and is said to be 75 times lighter than styrofoam. The new material is also electrically conductive, highly compressible, and seriously black in color. It resembles a cobweb and consists of porous carbon tubes that appear almost smoke-like in the image.

aereographite

The image above was taken with a scanning electron microscope. The material weighs just 0.2 mg per cubic centimeter, making it four times lighter than the previous record holder, a nickel-based material called microlattice. Aerographite can be compressed by up to 95% and still spring back to its original form with no damage. In fact, compression up to a certain point makes the material more solid and stronger than before.
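The density claims are easy to sanity-check. Only the 0.2 mg/cm³ figure comes from the article; the styrofoam and nickel-microlattice densities below are assumed typical values chosen to be consistent with the reported ratios.

```python
# Back-of-the-envelope check of the density claims (all in mg per cm^3).
aerographite = 0.2   # from the article
microlattice = 0.9   # assumed value for the nickel microlattice
styrofoam = 15.0     # assumed typical expanded-polystyrene density

print(styrofoam / aerographite)     # → 75.0 (matches "75 times lighter")
print(microlattice / aerographite)  # → 4.5 (the article rounds this to "four times")
```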

The material absorbs light rays almost completely, creating what the scientists say could be called the blackest black. It was created by starting with zinc oxide powder heated to 900°C, forming a crystalline structure. Hydrogen is then introduced to react with the oxygen in the zinc oxide, resulting in the emission of steam and zinc gas and leaving the porous carbon tubes behind. The scientists believe the material could be used in electronics for aviation or satellites, and possibly for water purification, among other uses.

[via MSNBC]


Vigilus Weapons System Concept Uses Airships to Deploy UAVs: The Carrier Will Arrive

Yesterday we looked at a drone that’s meant to be used for peaceful purposes. We now return to our regularly scheduled programming. MBDA Missile Systems recently unveiled the CVS301 Vigilus system, a “suite of future strike weapons” composed of small UAVs that are deployed via a launch aircraft. Oblivion descends.

vigilus system mbda

As shown in a demo video, the Vigilus system will be capable of deploying (at least) two types of UAVs from Armatus, a mothership that looks like two blimps fused together. The first UAV is Caelus, a “scout missile” meant to be used for recon and to paint a target for its big brother, the other UAV, which is called Gladius. Both have 1kg warheads, but the Gladius has enough fuel to fly for 2 hours, while the Caelus has a smaller range. The Vigilus system can be controlled either by soldiers in the battlefield or by an operator in a remote station. We’re screwedius.


What’s next, a ship with a nuclear-powered beam cannon? Zerglings? Check out the source links below for more terrifying information.

[MBDA via New Scientist via Emergent Futures]