Next3D’s plan to bring recorded video to the Oculus Rift

The dream of wearing a lightweight headset, like the Oculus Rift, in order to simulate physical presence isn’t limited to the imaginary worlds of video games. One man’s vision is that of immersive TV shows, movies and live sports. In fact, David Cole, co-founder of Next3D and an industry veteran who helps content creators and providers produce and deliver 3D, has been using his Rift dev kit to bring TV and film to life since the kits started shipping in March. The company is combining its video processing and compression technology with its experience in content production and stereoscopic delivery to offer what it’s calling Full-Court.

Next3D hopes to leverage its existing relationships with creators and providers to assist them in jumping into the world of live-action VR content. This includes both pre-recorded and live broadcasts. We wanted to see this firsthand, so we jumped at the opportunity to witness the creation of content and experience the results. This trial run of Next3D’s stereoscopic, 180-degree field-of-view camera rig, and the post-processing to adapt it to VR, was part of the production of the paranormal investigation show, Anomaly, at Castle Warden in St. Augustine, Fla. Being nearby, we braved the perils of the haunted surroundings to tell you about what we hope is only the beginning of virtual reality content.
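Next3D hasn’t published the nuts and bolts of Full-Court, but the general playback problem is well understood: each eye’s 180-degree image has to be wrapped around the viewer so that turning your head reveals the correct slice of the footage. The sketch below illustrates one common approach, a half-equirectangular mapping per eye; the projection choice and the function name are assumptions for illustration, not Next3D’s documented pipeline.

```python
import math

def direction_to_uv(x, y, z):
    """Map a unit view direction (+z straight ahead, +x right, +y up) to
    (u, v) texture coordinates in a 180-degree equirectangular image for
    one eye. Directions behind the viewer (z < 0) fall outside the footage."""
    lon = math.atan2(x, z)                    # -pi/2 .. +pi/2 across the hemisphere
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2 .. +pi/2 bottom to top
    u = lon / math.pi + 0.5                   # 0 .. 1, left edge to right edge
    v = lat / math.pi + 0.5                   # 0 .. 1, bottom to top
    return u, v

# Looking straight ahead samples the center of the frame...
print(direction_to_uv(0.0, 0.0, 1.0))   # (0.5, 0.5)
# ...while looking 45 degrees to the right samples three quarters of the way across.
print(direction_to_uv(math.sin(math.pi / 4), 0.0, math.cos(math.pi / 4)))  # (0.75, 0.5)
```

At render time the headset’s tracker supplies the view direction, and each eye samples its own image (or its half of a packed stereo frame), which is what turns flat stereoscopic footage into something you can look around inside.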

Oculus snags $16 million from investors to bring virtual reality to the masses

Oculus announced its first round of funding today, wherein the company secured $16 million from investors, funding specifically aimed at putting the Oculus Rift in consumer hands. The nascent virtual reality hardware company has repeatedly said its end goal with the Rift is to make it a consumer product; currently, only folks who backed the Rift on Kickstarter and those willing to spend $300 on a developer kit have access. A handful of games support the Rift, though more and more developers are promising not just support in their games, but entire games built from the ground up with VR in mind. An HD version of the headset was also introduced at last week’s E3 gaming show.

Oculus’ new business partners apparently see enough financial potential in the Rift to not only invest heavily, but to also take on board positions — both Santo Politi of Spark Capital and Antonio Rodriguez of Matrix Partners are now on the Oculus board of directors. “What Palmer, Brendan and the team are building at Oculus so closely matches the Metaverse that we had to be part of it. Working with them to get this platform to market at scale will be enormously exciting,” Rodriguez said of today’s news.

The company launched last year with a Kickstarter campaign targeting $250,000; the project eventually raised just shy of $2.5 million, and Oculus now sells its Rift dev kit outside of the Kickstarter campaign.

Hands-on with EVR, a spaceship dogfighting game demo built for Oculus Rift

We’ve seen plenty of demos showcasing the Oculus Rift, but actual gameplay experience with the VR headset has been tough to come by. We first heard about a spaceship dogfighting game called EVR being built for the Oculus Rift by game studio CCP a couple of months ago. And today at E3, we finally got to put a dev unit to its intended use: playing the game.

As we noted before, it’s a Wing Commander-style game featuring 3v3 gameplay in open space and amongst asteroid fields. Upon donning the Oculus Rift and a pair of Razer Kraken headphones, we found ourselves sitting in the cockpit of our very own starfighter. Looking around, we could see the sides of the launch tube and our digital hands manning the flight controls, and looking down revealed our legs and even the popped collar of our flight jacket. In previous Rift demos, we couldn’t see our digital avatar, but being able to do so in EVR really added to the immersiveness of the experience.

Oculus Rift HD prototype VR headset appears at E3, we go hands (and eyes) on

We’ve been impressed with Oculus Rift from the start, and have been following the VR headset closely ever since. The developer edition has been in the hands of devs for a couple of months now, and while Palmer Luckey and Nate Mitchell have certainly received rave reviews of the headset from many, they’ve also heard lots of feedback about ways to improve it. The number one request from users and devs? A higher-resolution screen than the 1,280 x 800 panel in the dev device. Well, after months of research and tinkering to find the right hardware combination, team Oculus is finally ready to show off a Rift with a 1,920 x 1,080 display, and we got to demo the thing.

Before heading into the land of 1080p, we got to explore a demo built with Unreal Engine 4 in the existing dev headset. After looking around a snowy mountain stronghold inhabited by a fire lord in low res, we switched to the exact same demo running at 60 fps on the HD prototype device — and the difference was immediately apparent. Surface textures could be seen in much higher fidelity, colors were brighter and less muddied and the general detail of the entire environment was greatly improved.
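For a rough sense of what that jump means per eye, here’s a back-of-the-envelope comparison, assuming the single panel is simply split side by side between the two eyes (as the dev kit does) and ignoring the lens distortion correction applied to each eye’s image.

```python
# Rough per-eye pixel counts for the dev kit vs. the HD prototype,
# assuming a simple side-by-side split of the single panel.

def per_eye_pixels(width, height):
    return (width // 2) * height

dev_kit = per_eye_pixels(1280, 800)     # 512,000 pixels per eye
hd_proto = per_eye_pixels(1920, 1080)   # 1,036,800 pixels per eye

print(f"Dev kit:      {dev_kit:,} px per eye")
print(f"HD prototype: {hd_proto:,} px per eye")
print(f"Ratio:        {hd_proto / dev_kit:.2f}x")  # roughly twice the pixels per eye
```

Roughly doubling the pixels each eye sees goes a long way toward explaining why textures and fine environmental detail looked so much sharper on the HD unit.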

Visualized: a history of augmented and virtual reality eyewear

We’ve seen the prototypes that led Google to Glass, but there are many devices that predate Mountain View’s smart specs, and Augmented World Expo in Santa Clara, California, was able to gather and display a historic number of such headsets this week. From Steve Mann’s handmade WearComp 1 and EyeTap prototypes to Glass-like precursors from Optinvent and Vuzix, it’s quite the comprehensive collection — over thirty devices in all. While they may make their way into a museum some day, we’re bringing pictures of them all to your screen right now. Enjoy.

Putting Your Finger in this Japanese Robot is a Step Toward Actual Virtual Reality

Haptic system from NHK

Welcome to Touchable TV!
In addition to showcasing their 8K, 7680×4320, Ultra-High-Def (Ridiculous-Def?) TV broadcasting kit last weekend, Japan’s NHK also demoed a haptic feedback device that simulates virtual 3D objects in real time. And the thing is, it’s really just a robot that, when you touch it, kinda touches you back.

NHK (Nippon Hōsō Kyōkai/Japan Broadcasting Corporation) is a public media organization somewhat analogous to the American PBS. However, entirely not at all like its American counterpart, the J-broadcaster’s got this: NHK Science & Technology Research Laboratories. Which is nice, because in cooperation with various corporate partners, NHK seriously delivers the tech.

Okay fine… so where’s the robot?

Haptic Virtual Reality that’s Actually Virtual – Just Put Your Finger in This Robotic Thingy!
In the image above, a brave test pilot is placing his index finger into the locus of a five-point artificial haptic feedback environment. The system analyzes and models a virtual 3D object; that model in turn drives the movements and relative resistances of five robotic arms, each controlling one of the feedback points, generating a focused area of stimulus and response. It sounds complicated when you describe a “robotic, artificial sense of touch” that way, but conceptually the idea is quite simple:

#1. Put your finger in here and strap on the velcro:

#2. It’ll feel like you’re touching something that doesn’t physically exist, like Domo-kun (Dōmo-koon) here:

Each of those shiny round points is the terminus of a robotic arm that either gives way or holds steady based on the relative position of the finger to the contours of the object being simulated. Each point’s position-resistance refreshes every 1/1000th of a second. Not bad.
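To make that give-way-or-hold-steady logic concrete, here is a minimal sketch of the kind of 1 kHz servo loop such a device might run. The refresh rate and the five contact points come from the description above; the spherical virtual object, the stiffness value and the I/O hooks are illustrative assumptions, not NHK’s actual control code.

```python
import math
import time

REFRESH_HZ = 1000   # each contact point updates 1,000 times per second
NUM_POINTS = 5      # five robotic arms, one per contact point around the fingertip

# Illustrative virtual object: a sphere 5 cm in radius centered at the origin.
SPHERE_CENTER = (0.0, 0.0, 0.0)
SPHERE_RADIUS = 0.05  # meters

def signed_distance(point):
    """Distance to the virtual surface: negative inside, positive outside."""
    dx, dy, dz = (point[i] - SPHERE_CENTER[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - SPHERE_RADIUS

def resistance(point, stiffness=800.0):
    """Give way (zero force) in free space; hold steady (push back in
    proportion to penetration depth) once the point crosses the surface."""
    d = signed_distance(point)
    return 0.0 if d >= 0.0 else -d * stiffness

def control_loop(read_positions, command_forces):
    """read_positions / command_forces are hypothetical hooks to the five
    arms; a real controller would talk to motor drivers here."""
    period = 1.0 / REFRESH_HZ
    while True:
        points = read_positions()                   # NUM_POINTS (x, y, z) tuples
        command_forces([resistance(p) for p in points])
        time.sleep(period)                          # crude pacing; real loops use a hard real-time timer
```

Swap the sphere for a mesh of Domo-kun and the same loop applies; the hard part in practice is completing the distance query and the motor update reliably inside each millisecond.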

Until VR becomes a direct neural interface a la The Matrix, practical full-immersion VR has to exist in a physical sense, which means that for now and for a while our low-to-medium-resolution interactive haptic feedback interfaces will be intrinsically robotic. And for virtualizing entirely digital, non-real artifacts, NHK’s device is a step in that direction.

Of course five points of interactivity might not sound like much, but mindful of the generally leapfroggy nature of technological advancement, effectively replicating and surpassing the haptic resolution we now experience via the estimated 2,500 nerve receptors/cm² in the human hand doesn’t seem too tall an order.

If that does seem too tall, if that does sound too far out and overly optimistic, if it seems impossible that we’d ever be able to cram 2,500 sensory & feedback robots into a square centimeter – well, then your robo-dorkery score is low and you need to pay more attention. Because dude, we’re already building nanorobots atom-by-atom. Not an “if” question, this one.

Neat… But Anything Really New Here?
Of course, a wide variety of teleoperated force-feedback systems are either already in use or in development (the da Vinci Surgical System; NASA’s Robonaut 2; etc.), so it’s important to emphasize here that NHK’s device is novel for a very particular reason: maybe all, or nearly all, of the force-feedback haptic systems currently in use or in development are based on an ultimately analog physicality. That is to say, whether it’s repairing a heart valve from another room, or, from a NASA building in Texas, tele-pushing a big shiny button on the International Space Station – what’s being touched from afar ultimately is a physical object.

So, what we might consider contemporary practical VR is more accurately a kind of partial VR. As the sense of touch is essential to our experience as human beings, incorporating that sense is a step toward interactive, actual factual, truly virtual virtual reality. Modeling and providing haptic feedback for non-physical objects, i.e., things that don’t really exist, in concert with other virtualization technologies – that’s a big step.

So What Can/Does/Will it Do?
NHK is kind of talking up the benefits for the visually impaired – which is good and noble and whatnot – but perhaps focusing on that is a bit of a PR move, because at least in theory this technology could go way, way beyond simple sensory replacement/enhancement.

An advanced version, incorporating the virtual touching of both simulated and/or real objects, could add layers of utility and interactivity to almost any form of work, entertainment or shopping… from afar we might discern how hard it is to turn a valve in an accident zone (partial VR), how bed sheets of various thread count feel against the skin (partial or full VR), the rough surface of the wall one hides behind in a videogame (proper VR), or even petting the dog, or petting… ummm, a friend (partial and/or proper VR – choose your own adventure)!

That’s a ways off, but in the short-to-near-term, here’s how NHK envisions functionality for their touchable TV tech:

Matchmaker, Matchmaker, Make Me a Full-Immersion Omni-Sensory VR System!
Okay, so to get this ball rolling: NHK, meet VR upstart Oculus Rift. NHK & Oculus Rift, meet VR/AR mashup Eidos. NHK, Oculus Rift, and Eidos, meet UC Berkeley’s laser-activated pseudo-robotic hydrogels.

We’re all waiting for your pre-holodeck lovechild.

• • •

Reno J. Tibke is the founder and operator of Anthrobotic.com and a contributor at the non-profit Robohub.org.

Via: MyNavi (Japanese/日本語); DigInfo

Images: DigInfo; NHK

Google Glass, Meta Wants Your Milkshake! …Do Consumers Want Either of Them?

Google Glass fever and upstart Meta’s rapidly financed US $100,000 Kickstarter campaign indicate #1. impending altered reality market maturity, or #2. everything new remixes the old, but still the geeks sing “Ohhhhh look, shiny!”

Google Glass: Loudest Voice in the Room
In development for several years and announced way back when, Glass finally got to developers and the geek elite about two months ago (for US $1500, plus getting oneself to a mandatory orientation meeting thingy). Glass is a kind of hybrid between a head-mounted display and augmented reality (AR) prosthetic outfitted with the internets. Really, if you’re reading Akihabara News you’re probably already hip, but if not there’s a search engine very ready to help you. Big G overlord Eric Schmidt indicated last month that a consumer-ready Glass product is about a year away. Realistically, at this point it’s unclear whether Glass is expected to be a viable consumer product or more of a proof-of-concept development platform.

Meta: Quickly Kickstarted, High-Profile Team Assembled – Working Man’s AR?
If you saw last year’s sci-fi short film “Sight” or the YouTube sci-fi series “H+,” you’re already hip to what Columbia University’s Meron Gribetz & pals are aiming for with Meta. While Glass is more of a HUD with some AR, Meta is less with the acronyms and more what the name suggests: information about information, i.e., Meta hopes to overlay manipulatable imagery/data on the physical world, augmenting real reality and projecting virtual reality (VR) artifacts that you can fiddle with in real time.

For now, Meta has a slick video, a prototype, a crack team of engineers and advisors including Professor Steven Feiner and wearable computing advocate Steve Mann, and financing to get its dev kit into devs’ hands. To its credit, Meta does seem to aim less at generalized gee-whiz gimmickry and heads-up automated narcissism, and more toward getting actual work done.


Asian Alternatives:
First: POPSCI, very well done. The image on the above left melts one’s technosnarky heart.

In typical form, China has assimilated and excreted: the Baidu Eye is their Glass clone. There’s no indication of plans to bring it to market, so maybe they just wanted to say “Ha, ha, we can, too!” Or maybe they just wanted to do research and ride the Glass hype, which is understandable. But China, dude – might wanna think about doing some original stuff someday soon. That lack of intellectual capital is going to sting when “Designed in California” meets “Made in the U.S.A. With My 3D Printer.”

Over here in Japan we’ve got startup Telepathy One pushing a Glass-looking but, as they openly declare, not Glass-like AR headset (above-right). Headlines rhetorically speculating that it’s a Glass competitor make for good Search Engine Optimization (other adjectives include: disingenuous, blithe, lame), but rather than compete with Glass, Telepathy One is focusing on social networking & multimedia – though they too are clearly attempting to catch the contemporary current of AR hype – which is understandable. And hey, even if Telepathy One flashes and disappears, the fact that the phrase “Japanese Startup” can be used without the usual preface of “Why Aren’t There Any…” is a positive thing.

Okay Then, It’s Almost Doable – But Still…
Indeed, the apps, core software, computational capability, and the ubiquitous-enough network connectivity essential for decent AR are quickly ramping up. Along with innovative concepts like the AR/VR mashup Eidos Masks, alternatives to and more advanced versions of the above devices will likely continue to crop up. In fact, the never-even-close-to-being-vaguely-realized promises of VR are also showing signs of decreased morbidity. So…

We Actually Want It vs. They Want Us to Want It
Glass, the engine of the current VR hype machine, is of course conceptually nothing new, but it has the word “Google” in the name, so people are paying attention. Of course even Google gets ahead of itself from time to time (Buzz? Wave?), but lucky for them selling ads pays well, and they’ve got a boatload of cash to pour into whatever sounds cool. Millions have benefited from Google’s side projects and non-traditional ventures (Gmail much?), but the expectations leveled on Glass are… perhaps a bit much. Suffice it to say, Google absolutely nails search and software and web apps, but thus far big-G’s hardware projects have but limped.

But if we’ve got the cash, that probably won’t stop us! The soft tyranny of the tech elite is the ability to ring a shiny bell and then watch the doggies line up to pay. Luckily, actually useless products, products produced with too much hype, products produced with too much variety, products out of touch with the people who ultimately finance their creation – no matter how awesome they seem at first blush – will fail. Hard. (Note: Sony, if you’re here, please reread the last sentence!)

Until AR & VR technologies can out-convenience a smartphone, shrink into a contact lens, dispense with voice controls and the confusing non-verbal communication of fiddling with a touchscreen on your temple, i.e., until such devices can move beyond relatively impractical novelty, it’s unlikely they’ll amount to much more than narrowly focused research and demonstration platforms.

This is to say, along with inventing Google Glass, the search giant might also want to invent something for us to like, you know, do with it. Or maybe that’s not fair – so to be fair, one can concede that no new technology is perfect at 1.0, and any awesome innovation has to start somewhere…

Maybe it could start in 1995. Ask Nintendo about that.

• • •

Reno J. Tibke is the founder and operator of Anthrobotic.com and a contributor at the non-profit Robohub.org.

Props to io9 and Meta’s Kickstarter and Meta (but come on guys, tame that website – autoplay is really annoying). PopSci article/image; Watch the augmented reality-themed “Sight” and “H+” by clicking on those words.

Meta, The World’s First Entry-Level AR Glasses, Hires The Father Of Wearable Computing As Chief Scientist

The Meta 1 is a pair of augmented reality goggles that performs some unique and useful tricks. While the glasses are still in beta, they are coupled with a Kinect-like camera that senses objects in real space and lets users interact with virtual worlds with a swipe of the hand.

The company founder, Meron Gribetz, says that the company is on track to create a mass-produced solution shortly, but until then it has brought on Steve Mann, a real cyborg and wearable computing researcher, to act as chief scientist. You’ll recall that Mann was assaulted in a Parisian McDonald’s for wearing a Google-Glass-like headset.

“We brought Mann on board because of his expertise in two key areas: miniaturization and mediated reality. Mann has been developing a Google Glass-like device for years but recognized now was not the right time for something of that scale, because of the limitations of such a device. Rather than a phone accessory, Mann is keen to work with us to develop a fully fledged new interface for computers,” said Gribetz.

“His scientific leadership in mediated reality will be a huge advantage for us when delivering an immersive augmented experience. Occlusion (hiding or modifying real world objects) is a key part of full augmented reality and Mann’s experience in mediated reality will allow us to bring the best solution to market in this area.”

Gribetz is a Y Combinator alum, and the project, which is still on Kickstarter, is nearly funded with 26 days to go. Users can receive a Dev Kit for $550. Epson will help build Meta’s next-generation glasses, which will look considerably less DIY than the beta developer version.

“The entrance into consumer wearables needs to be a high powered immersive device capable of fully replacing the computer and more. Heads up notification systems have their use cases, but they won’t be game changers. Mann’s commitment to a fully wearable future is why he chose to join us,” said Gribetz. Considering Mann has been wearing his computing power for most of this decade, it seems like a good fit.

Insert Coin: Meta 1 marries 3D glasses and motion sensor for gesture-controlled AR

In Insert Coin, we look at an exciting new tech project that requires funding before it can hit production. If you’d like to pitch a project, please send us a tip with “Insert Coin” as the subject line.

Now that Google Glass and Oculus Rift have entered the zeitgeist, might we start to see VR and AR products popping up on every street corner? Perhaps, but Meta has just launched an interesting take on the concept by marrying see-through, stereoscopic display glasses with a Kinect-style depth sensor. That opens up the possibility of putting virtual objects into the real world, letting you “pick up” a computer-generated 3D architectural model and spin it around in your hand, for instance, or gesture to control a virtual display appearing on an actual wall. To make it work, you connect a Windows PC to the device, which consists of a pair of 960 x 540 Epson displays embedded in the transparent glasses (with a detachable shade, as shown in the prototype above), and a depth sensor attached to the top. That lets the Meta 1 track your gestures, individual fingers and walls or other physical surfaces, all of which are processed in the PC with motion tracking tech to give the illusion of virtual objects anchored to the real world.
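That “anchored to the real world” illusion boils down to a coordinate transform repeated every frame: the virtual object’s pose is stored relative to a tracked real-world feature (a wall, a tabletop), and the renderer re-expresses it in the headset’s current view frame. The sketch below shows that core step; the matrix layout and the pose source are illustrative assumptions, not Meta’s actual SDK.

```python
import numpy as np

def make_pose(rotation_3x3, translation_xyz):
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation_3x3
    pose[:3, 3] = translation_xyz
    return pose

def world_to_view(point_world, head_pose_world):
    """head_pose_world maps headset coordinates to world coordinates, so its
    inverse re-expresses a world-anchored point in the current view frame."""
    homogeneous = np.append(point_world, 1.0)
    return (np.linalg.inv(head_pose_world) @ homogeneous)[:3]

# Example: a virtual model anchored 2 m straight ahead of the starting position,
# viewed after the wearer sidesteps 0.5 m to the right (rotation kept at identity
# for brevity). The anchor stays put; only the view of it changes.
anchor_world = np.array([0.0, 0.0, 2.0])
head_pose = make_pose(np.eye(3), [0.5, 0.0, 0.0])
print(world_to_view(anchor_world, head_pose))  # [-0.5  0.   2. ]: the object appears shifted left
```

The depth sensor’s job in this picture is to supply (and keep refining) the real-world anchor and the head pose; everything after that is ordinary 3D math.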

Apps can be created via Unity3D and an included SDK on Windows computers (other platforms will arrive later, according to the team), with developers able to publish their apps on the upcoming Meta Store. The group has launched the project on Kickstarter with the goal of raising $100,000 to get developer kits into the hands of app coders, and though it’s no Google, Meta is a Y Combinator startup and has several high-profile researchers on the team. As such, it’s asking for exactly half of Glass’ Explorer Edition price as a minimum pledge to get in on the ground floor: $750. Once developers have had their turn, the company will turn its attention toward consumers and more sophisticated designs — so if you like the ideas peddled in the video, hit the source to give them your money.

Source: Kickstarter