3D Conferencing System Allows for Virtual Light Saber Duels
If your Wii boxing buddy or Star Wars light saber duel partner moved to a different town, technology can help bring you together for just one more game. Researchers at the University of Illinois at Urbana-Champaign and Intel have created a system that can support collaborative physical activities from different geographical locations.
“We can capture motions of the human body in real time and bring them together on a big screen,” says Ahsan Arefin, a doctoral student currently involved with the project.
The project, called ‘Tele-immersive Environment for Everybody’ or TEEVE, hooks up two off-the-shelf 3D cameras to a PC with a FireWire port. A gateway server at each site sends and receives the video streams using standard compression techniques. A renderer then projects the virtual interactions on a big-screen monitor, creating a real-time 3D effect. It’s like web conferencing, but with a virtual reality twist.
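To make that pipeline concrete, here is a minimal sketch of the gateway idea as described: each site compresses its local 3D frames and exchanges them with the remote site over a socket, and a renderer consumes whatever arrives. This is not TEEVE’s actual code; the frame layout, port number and use of zlib compression are illustrative assumptions.

```python
# Minimal sketch (not the actual TEEVE code) of a per-site gateway:
# compress one 3D frame, send it with a length prefix, receive and
# decompress on the other side. Frame layout and port are assumptions.
import socket
import struct
import zlib

import numpy as np

FRAME_SHAPE = (480, 640, 4)   # assumed layout: x, y, z, intensity per pixel

def send_frame(sock: socket.socket, frame: np.ndarray) -> None:
    """Compress one 3D frame and send it with a 4-byte length prefix."""
    payload = zlib.compress(frame.astype(np.float32).tobytes())
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_frame(sock: socket.socket) -> np.ndarray:
    """Receive one length-prefixed frame and decompress it."""
    header = _recv_exact(sock, 4)
    payload = _recv_exact(sock, struct.unpack("!I", header)[0])
    data = np.frombuffer(zlib.decompress(payload), dtype=np.float32)
    return data.reshape(FRAME_SHAPE)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("remote gateway closed the stream")
        buf += chunk
    return buf

if __name__ == "__main__":
    # Loopback demo: one "gateway" sends a synthetic frame to another.
    server = socket.socket()
    server.bind(("127.0.0.1", 9000))
    server.listen(1)
    sender = socket.create_connection(("127.0.0.1", 9000))
    receiver, _ = server.accept()

    send_frame(sender, np.random.rand(*FRAME_SHAPE).astype(np.float32))
    frame = recv_frame(receiver)
    print("received frame with shape", frame.shape)  # a renderer would draw this
```

In the real system the two gateways would run at different sites and feed a renderer rather than a print statement, but the send-compressed-frames, receive-and-reconstruct loop is the same shape.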
The system was on display Thursday at an Intel Labs “research day” in Mountain View, California, where the company showcased technologies it is working on.
In their demonstration of the TEEVE idea, Arefin and his colleague stood in opposite corners of a room, light sabers in hand, with BumbleBee 2 stereo-vision 3D cameras pointed at them. As the duo dueled, they could see their 3D images captured and reflected on screen.
The idea has applications beyond gaming; it can be used in business, sports and medicine, says Arefin. In one university experiment, two dancers in different locations danced together on a large screen.
The system is part of the quest towards more visual computing, says Jack Gold, principal analyst with consulting firm J. Gold Associates.
“Moving to a visual environment, from the text heavy one we are in right now, is one of the most important issues that we have to deal with in computing,” he says. “As they say, sometimes a picture is worth a thousand words.”
The biggest challenge for the researchers is the demand the system places on computational and network resources. TEEVE uses real-time 3D reconstruction algorithms to convert 2D frames into 3D frames that also include depth information. To speed this up, the researchers use multi-threaded computation, and Arefin says TEEVE can run on PCs with high-end Intel processors.
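As a rough illustration of that reconstruction step, and not TEEVE’s actual algorithm, the sketch below back-projects a 2D depth image into a 3D point cloud with a pinhole camera model and splits the work across threads; the camera parameters and frame size are assumed values.

```python
# Generic illustration of turning a 2D image plus per-pixel depth into 3D
# points, with the rows split across a thread pool. The pinhole-camera
# parameters (FX, FY, CX, CY) are made-up values for this sketch.
from concurrent.futures import ThreadPoolExecutor

import numpy as np

FX, FY = 525.0, 525.0        # assumed focal lengths in pixels
CX, CY = 320.0, 240.0        # assumed principal point

def rows_to_points(depth: np.ndarray, row_start: int, row_end: int) -> np.ndarray:
    """Back-project a horizontal band of the depth map into 3D points."""
    v, u = np.mgrid[row_start:row_end, 0:depth.shape[1]]
    z = depth[row_start:row_end, :]
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def depth_to_point_cloud(depth: np.ndarray, workers: int = 4) -> np.ndarray:
    """Convert a full depth frame to a point cloud using a thread pool."""
    bands = np.array_split(np.arange(depth.shape[0]), workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(
            lambda band: rows_to_points(depth, band[0], band[-1] + 1), bands
        )
    return np.concatenate(list(parts), axis=0)

if __name__ == "__main__":
    fake_depth = np.full((480, 640), 2.0, dtype=np.float32)  # 2 m everywhere
    cloud = depth_to_point_cloud(fake_depth)
    print(cloud.shape)  # (480 * 640, 3) points a renderer could draw
```

In practice the depth values would come from the stereo cameras and the resulting points would be streamed to the remote site, but the per-frame conversion and the parallelism over image regions are the parts the researchers describe optimizing.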
“Our goal is to make the system portable and easily deployable because of its use of off-the-shelf components,” he says.