Could a Smartphone Cholesterol Sensor Convince You Not to Supersize?

One day you might be able to make a more informed decision when the cashier asks if you’d like to supersize your fries. Researchers at Cornell University have developed a smartphone accessory called the smartCARD that lets you measure your cholesterol in mere minutes.

Researchers Are Finally Teaching Robots To Be Less Stabby

Most of the news coming out of robotics research has us really worried about mankind’s future, but Cornell University finally brings us a glimmer of hope. Researchers there are developing an algorithm that uses physical feedback to teach robots to be more careful with certain objects, like, say, a sharp knife being handled around highly stabbable humans.

Cornell scientists 3D print ears with help from rat tails and cow ears

Science! A team of bioengineers and physicians at Cornell University recently detailed their work to 3D print lifelike ears that may be used to treat birth defects like microtia and to help those who have lost or damaged an ear through accident or cancer. The product, which is “practically identical to the human ear,” according to the school, was created using 3D printing and gels made from living cells: collagen was gathered from rat tails, and cartilage cells were taken from cows’ ears. The whole process is quite quick, according to associate professor Lawrence Bonassar, who co-authored the report:

“It takes half a day to design the mold, a day or so to print it, 30 minutes to inject the gel, and we can remove the ear 15 minutes later. We trim the ear and then let it culture for several days in nourishing cell culture media before it is implanted.”

The team is looking to implant the first ear in around three years, if all goes well.



Source: Cornell Chronicle

Researchers turn to 19th century math for wireless data center breakthrough

Researchers from Microsoft and Cornell University want to remove the tangles of cables from data centers. It’s no small feat: with thousands of machines that need every bit of available bandwidth, WiFi certainly isn’t an option. To solve the issue, the scientists are turning to two sources: the cutting edge of 60GHz networking and the 19th-century mathematics of Arthur Cayley. Cayley’s 1889 paper, On the Theory of Groups, guided their method for connecting servers in the most efficient and fault-tolerant way possible. The findings will be presented in a paper later this month, but it won’t be clear how effectively the research can be applied to an actual data center until someone funds a prototype. The proposed Cayley data centers would rely on cylindrical server racks with transceivers both inside and outside the tubes of machines, allowing them to pass data both within and between racks with (hopefully) minimal interference. Since the new design would do away with traditional network switches and cables, the researchers believe such data centers could eventually cost less than current designs and draw less power, all while still streaming data at 10 gigabits per second, far faster than WiGig, which also uses the 60GHz spectrum. To read the paper in its entirety, check out the source.
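
The paper’s actual topology isn’t reproduced in the article, but the underlying object, a Cayley graph, is easy to sketch: vertices are the elements of a group, and each vertex g gets an edge to g·s for every generator s. The toy Python sketch below (the cyclic group, generator set, and rack count are all invented for illustration, not taken from the paper) builds such a graph for Z_n and measures its diameter, i.e. the worst-case hop count between any two “racks”:

```python
from collections import deque

def cayley_graph(n, generators):
    """Cayley graph of the cyclic group Z_n: vertex g connects to
    (g + s) mod n for every generator s (and its inverse)."""
    adj = {g: set() for g in range(n)}
    for g in range(n):
        for s in generators:
            adj[g].add((g + s) % n)
            adj[g].add((g - s) % n)  # inverse generator keeps edges symmetric
    return adj

def diameter(adj):
    """Longest shortest path over all vertex pairs (BFS from each vertex)."""
    worst = 0
    for start in adj:
        dist = {start: 0}
        q = deque([start])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        if len(dist) < len(adj):
            return float("inf")  # graph is disconnected
        worst = max(worst, max(dist.values()))
    return worst

# 20 "racks" wired by the hypothetical generator set {1, 4}: every rack
# reaches every other in at most 4 hops, with no central switch anywhere.
adj = cayley_graph(20, [1, 4])
print(diameter(adj))  # → 4
```

A real design would choose the group and generators to balance hop count against radio interference and fault tolerance, which is the part Cayley’s theory helps with.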


Researchers turn to 19th century math for wireless data center breakthrough originally appeared on Engadget on Fri, 12 Oct 2012 11:39:00 EDT.

Source: On the Feasibility of Completely Wireless Datacenters (PDF) | Via: Wired

This Fabric Simulator Will Make CG Characters Sound More Realistic Than Ever [Video]

Every last detail can help make a computer-generated character seem more realistic. To that end, a team of researchers at Cornell University has developed a simulator that accurately recreates the sound of cloth, so the CG characters you see on screen sound as authentic as they look.

Fabricated: Scientists develop method to synthesize the sound of clothing for animations (video)

Developments in CGI and animatronics might be getting alarmingly realistic, but the accompanying audio often still relies on manual recordings. A pair of associate professors and a graduate student from Cornell University, however, have developed a method for synthesizing the sound of moving fabrics, such as rustling clothes, for use in animations and, potentially, film. The process, presented at SIGGRAPH but only reported to the public today, models two components of the natural sound of fabric: cloth moving on cloth, and crumpling. After creating a model for the energy and pattern of these two aspects, an approximation of the sound can be generated, which acts as a kind of “road map” for the final audio.

The end result is created by breaking the map down into much smaller fragments, which are then matched against a database of similar sections of real field-recorded audio. They even included binaural recordings to give a first-person perspective for headphone wearers. The process is still overseen by a human sound engineer, who selects the appropriate type of fabric and oversees the way that sounds are matched, meaning it’s not quite ready for prime time. Understandable really, as this is still a proof of concept, with real-time operations and other improvements penciled in for future iterations. What does a virtual sheet being pulled over an imaginary sofa sound like? Head past the break to hear it in action, along with a presentation of the process.
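
None of the team’s code or data is public here, but the matching step described above is recognizable as unit selection: slice the synthesized “road map” into short fragments and, for each one, splice in the closest-matching fragment from a library of real recordings. A loose sketch of that general idea, with made-up feature vectors and a made-up two-entry library standing in for real field recordings:

```python
def fragments(signal, size):
    """Non-overlapping windows of `size` samples (any short tail is dropped)."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, size)]

def distance(a, b):
    """Squared Euclidean distance between two equal-length fragments."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match(target_map, library, size):
    """For each fragment of the target map, pick the library entry whose
    feature fragment is closest, then concatenate the paired audio."""
    out = []
    for frag in fragments(target_map, size):
        feat, audio = min(library, key=lambda entry: distance(entry[0], frag))
        out.extend(audio)
    return out

# Hypothetical library: energy-feature fragments paired with "recorded" audio.
library = [
    ([0.1, 0.1, 0.1], ["soft"] * 3),     # low-energy rustle
    ([0.9, 0.8, 0.9], ["crumple"] * 3),  # high-energy crumple
]
target = [0.1, 0.2, 0.1, 0.9, 0.9, 0.8]  # quiet passage, then a loud one
print(match(target, library, 3))  # → ['soft', 'soft', 'soft', 'crumple', 'crumple', 'crumple']
```

The human engineer in the Cornell pipeline effectively curates that library and supervises the matching; the researchers’ real features and distance measure are, of course, far more involved than this.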



Fabricated: Scientists develop method to synthesize the sound of clothing for animations (video) originally appeared on Engadget on Wed, 26 Sep 2012 23:40:00 EDT.

Source: Cornell Chronicle | Via: PhysOrg

Cornell students build spider-like robotic chalkboard eraser out of Lego, magnets, fun (video)

While you were trying to pass Poetry 101, Cornell seniors Le Zhang and Michael Lathrop were creating an apple-polishing Lego robot that automatically erases your prof’s chalkboard. A final class project, the toady mech uses an Atmel brain, accelerometers for direction control, microswitches to sense the edge of the board, magnets to stay attached and hot glue to keep the Lego from flying apart. As the video below the break shows, it first aligns itself vertically, then moves to the top of the board, commencing the chalk sweeping and turning 180 degrees each time its bumpers sense the edge. The duo are thinking of getting a patent, and a commercialized version would allow your teacher to drone on without the normal slate-clearing pause. So, if designing a clever bot and saving their prof from manual labor doesn’t get the students an ‘A’, we don’t know what will.
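
The students’ actual firmware isn’t published, but the sweep pattern described above (travel along the board, drop down, and turn 180 degrees whenever a bumper senses an edge) is a classic boustrophedon path. A minimal simulation of that pattern on a hypothetical grid of board cells:

```python
def sweep(width, height):
    """Simulate the eraser's sweep on a width x height grid of the board:
    travel horizontally, and at each edge 'bumper press' reverse direction
    and drop to the next row. Returns the cells visited, in order."""
    visited = []
    direction = 1  # +1 = moving right, -1 = moving left
    x = 0
    for y in range(height):
        while 0 <= x < width:
            visited.append((x, y))
            x += direction
        direction = -direction  # bumper hit: turn 180 degrees
        x += direction          # step back onto the board for the next row
    return visited

cells = sweep(4, 3)
print(len(cells))  # → 12: every cell erased exactly once
```

The real robot adds the parts a grid simulation glosses over: accelerometers to stay aligned with gravity, magnets to stay on the board, and microswitch bumpers instead of coordinate checks.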



Cornell students build spider-like robotic chalkboard eraser out of Lego, magnets, fun (video) originally appeared on Engadget on Tue, 14 Aug 2012 08:39:00 EDT.

Source: Yin Yang Robotics

Google simulates the human brain with 1000 machines, 16000 cores and a love of cats

Don’t tell Google, but its latest X lab project is something performed by the great internet public every day. For free. Mountain View’s secret lab stitched together 1,000 computers totaling 16,000 cores to form a neural network with over 1 billion connections, and sent it to YouTube looking for cats. Unlike the popular human time-sink, this was all in the name of science: specifically, simulating the human brain. The neural machine was presented with 10 million images taken from random videos, and went about teaching itself what our feline friends look like. Unlike similar experiments, where some manual guidance and supervision is involved, Google’s pseudo-brain was given no such assistance.

It wasn’t just about cats, of course; the broader aim was to see whether computers can learn face detection without labeled images. After studying the large image set, the cluster showed that it could, developing concepts for human faces and body parts in addition to, naturally, cats. Overall, the network achieved 15.8 percent accuracy in recognizing 20,000 object categories, which the researchers claim is a 70 percent jump over previous studies. Full details of the hows and whys will be presented at a forthcoming conference in Edinburgh.
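
Google’s network was vastly larger than anything that fits here, but the principle it demonstrates, finding structure in data nobody labeled, can be shown in miniature. The sketch below (the data, dimensions, and cluster layout are all invented) uses plain PCA, a far simpler unsupervised method than Google’s network, to discover, with no labels supplied, the single direction that separates two hidden groups of points:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled data: two hidden clusters of 50 points each in 5 dimensions.
a = rng.normal(loc=0.0, scale=0.3, size=(50, 5))
b = rng.normal(loc=2.0, scale=0.3, size=(50, 5))
X = np.vstack([a, b])
rng.shuffle(X)  # destroy any ordering; the "labels" are truly gone

# "Learn" a feature with no labels: the top principal component,
# i.e. the direction of greatest variance in the centered data.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
feature = Xc @ vt[0]  # projection of each point onto that direction

# The learned feature separates the two hidden groups by its sign.
groups = feature > 0
print(int(groups.sum()))  # → 50: the clusters split cleanly, half on each side
```

The same logic, scaled up through many nonlinear layers and 10 million video frames, is what let Google’s cluster arrive at a “cat” detector on its own.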

Google simulates the human brain with 1000 machines, 16000 cores and a love of cats originally appeared on Engadget on Tue, 26 Jun 2012 07:22:00 EDT.

Sources: Cornell University, New York Times, Official Google Blog | Via: SMH.com.au