Large Hadron Collider completes three-year cycle, goes into long shutdown

The Large Hadron Collider (LHC) completed its first three-year running cycle at 7:24am today, when operators extracted its beams and began its first long shutdown. During this period, called LS1 (Long Shutdown 1), the LHC will undergo maintenance and consolidation work, enabling it to run at a higher energy when it is fired back up in 2015.

[Image: Construction of the LHC at CERN]

The maintenance and consolidation work will be performed not only on the LHC itself but across its entire accelerator complex; the magnet interconnections will be rebuilt so the machine can run at an energy of 7 TeV per beam. According to the announcement, the CERN complex will resume running in the middle of next year, ahead of the LHC’s scheduled restart in 2015.

CERN’s Director General Rolf Heuer had this to say about the LHC’s long shutdown: “We have every reason to be very satisfied with the LHC’s first three years. The machine, the experiments, the computing facilities and all infrastructures behaved brilliantly, and we have a major scientific discovery in our pocket.”

Over the last three years, the LHC has achieved some monumental things, including the discovery of what is believed to be the Higgs boson, made public in summer 2012. In addition, the amount of stored data recently exceeded 100 petabytes, which CERN reports is approximately equal to 700 years of 1080p HD movies.
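
That comparison checks out as rough arithmetic: 100 PB played back continuously over 700 years implies an average bitrate of a few tens of megabits per second, which is in the right range for 1080p video. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on CERN's comparison: 100 PB spread over
# 700 years of continuous playback implies a bitrate plausible for
# 1080p HD video (a few tens of Mbit/s).
PB = 1e15                              # bytes per petabyte (decimal)
seconds = 700 * 365.25 * 24 * 3600    # ~700 years in seconds
mbps = (100 * PB * 8) / seconds / 1e6  # implied average bitrate, Mbit/s
```

The result is roughly 36 Mbit/s, a typical high-quality 1080p bitrate.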

[via CERN]


Large Hadron Collider completes three-year cycle, goes into long shutdown is written by Brittany Hillen & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Crazy Brain Implants Give Lab Rats a Sixth Sense and Let Them "Touch" Light

It’s not every day that science and crazy brain implants lead to the generation of what is essentially a new sense, but today is that day. Scientists from Duke University have found a way to make rats “feel” invisible infrared light, and someday that same tech could give sight to the blind, or give us humans extra senses for fun.

Spanish scientists claim to have significantly improved GPS accuracy

Millions of vehicles, smartphones, and other devices in use all around the world support GPS navigation today. People rely on GPS every day to get around unfamiliar cities and to find better routes in cities they know well. A group of Spanish researchers recently claimed to have discovered a way to improve GPS accuracy in cities by as much as 90%.

[Image: GPS satellite]

GPS can be difficult or impossible to use in major cities where satellite signals can be blocked by tall buildings. Many people who live in rural communities have a similar problem with GPS signals being blocked by tall trees and foliage. A group of Spanish researchers from Universidad Carlos III de Madrid has developed a prototype device to improve GPS signal quality.

The device adds data from accelerometers and gyroscopes to the conventional GPS signal, thereby reducing the system’s margin of error. Researcher David Martin claims that he and his team were able to improve the determination of a vehicle’s position in critical cases by between 50 and 90%, depending on how degraded the GPS signal reaching the navigation device is. The standard margin of error for commercial GPS receivers in cars is roughly 15 meters in an open field.

In an urban setting where buildings block signals, accuracy can be off by more than 50 meters. Using the new prototype device, the researchers say that margin of error within an urban setting can be reduced to 1 or 2 meters. The prototype is built around a low-cost Inertial Measurement Unit (IMU) with three accelerometers and three gyroscopes, which measure changes in velocity, vehicle maneuvers, and direction. That data is then merged with the GPS position data to correct its errors. There is no indication of when or if this technology might come to market.
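
The article doesn’t describe the team’s actual algorithm, but this kind of GPS/IMU data fusion is commonly done with a Kalman filter, which weights each data source by its uncertainty. A minimal one-dimensional sketch, with all numbers hypothetical:

```python
# Hypothetical 1D sketch of GPS/IMU fusion with a Kalman-style filter.
# This is not the researchers' algorithm, only an illustration of the
# general idea: dead-reckon forward with IMU-derived velocity, then
# correct toward the noisy GPS fix in proportion to relative uncertainty.

def fuse(gps_positions, imu_velocities, dt=1.0,
         gps_var=225.0,     # GPS noise variance (~15 m std dev, squared)
         process_var=1.0):  # uncertainty added by each prediction step
    x, p = gps_positions[0], gps_var      # initial estimate and variance
    estimates = [x]
    for z, v in zip(gps_positions[1:], imu_velocities):
        # Predict: advance the position using the IMU-derived velocity
        x += v * dt
        p += process_var
        # Update: pull the estimate toward the GPS fix via the Kalman gain
        k = p / (p + gps_var)
        x += k * (z - x)
        p *= (1 - k)
        estimates.append(x)
    return estimates
```

The gain `k` shrinks as the filter’s own estimate becomes more certain, so a noisy GPS fix is trusted progressively less than the smooth IMU dead reckoning.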

[via TG Daily]


Spanish scientists claim to have significantly improved GPS accuracy is written by Shane McGlaun & originally posted on SlashGear.

Supernova leftovers may contain newest black hole in Milky Way

New data just in from NASA‘s Chandra X-ray Observatory shows a rare supernova whose remnant may contain the Milky Way’s newest black hole. The remnant, which NASA calls W49B, appears to be the result of a rare type of stellar explosion; it’s about a thousand years old and located about 26,000 light-years away (roughly 152 quadrillion miles).

A supernova remnant located about 26,000 light-years from Earth.

NASA is calling it a rare occasion because this particular supernova ejected material from its poles at much higher speeds than most other supernovas. A typical supernova ejects matter in all directions in a mostly symmetrical fashion. The lead researcher of the study, Laura Lopez, says that “W49B is the first of its kind to be discovered in the [Milky Way] galaxy.”

Using NASA’s Chandra X-ray space telescope, researchers found that this supernova was highly asymmetrical: only half of the remnant showed concentrations of iron, while sulfur and silicon were spread evenly throughout the explosion. This type of explosion is known as a bipolar supernova, and it has never been seen before in our galaxy.

Based on what was left behind, a black hole may have formed. Supernovas usually leave a neutron star behind, but the Chandra telescope revealed no evidence of one, which implies that a black hole may have formed instead. Co-author Daniel Castro says that “it’s a bit circumstantial,” but there is solid evidence suggesting a black hole. The official report will be published in this upcoming Sunday’s issue of the Astrophysical Journal.

[via Wired]


Supernova leftovers may contain newest black hole in Milky Way is written by Craig Lloyd & originally posted on SlashGear.

Toyota top in 2013 reliability but US marques show resurgence

Auto reliability is up 5-percent overall, J.D. Power has declared for 2013, with some surprises for US manufacturers including General Motors, though Toyota stands clear as the overall winner despite its accelerator hangover. The Japanese firm dominated in seven segments, across all three of its Lexus, Toyota, and Scion brands, in the 2013 vehicle dependability study, based on 37,000 reports on 2010 model-year vehicles.

[Image: Lexus RX 450h]

Singled out for praise were the Lexus ES 350, Lexus RX, Scion xB, Scion xD, Toyota Prius, Toyota Sienna, and the Toyota RAV4, with the Lexus RX topping the chart for the fewest reported problems. Porsche, Lincoln, Toyota, and Mercedes-Benz close out the top five marques.

However, it’s not just Lincoln flying the flag for American auto companies. General Motors snatched four awards in total, with the Buick Lucerne, Chevrolet Camaro, Chevrolet Tahoe, and the GMC Sierra HD all getting segment awards. Ford’s Ranger clinched an award in its segment, too, while Chrysler’s Ram saw the biggest year-on-year improvement in reliability.

If there were any doubts that reliability affects repurchasing behavior, J.D. Power’s findings dismiss them. 54-percent of owners stick with their current brand when buying their next vehicle, assuming they’ve had no problems with their existing one. That brand loyalty drops to 41-percent after three or more issues, and the effect is even stronger among premium car brands, where owner loyalty falls to 39-percent under the same circumstances.

That’s worth remembering for marques like the Lincoln Motor Company, which are attempting to refresh their image with a more luxurious feel. However, it’s also a reality-check on the potential impact of high-profile safety recalls, which Toyota in particular has suffered in recent months. In January, the company was forced to recall 1.1m cars worldwide, while the month before it paid out a $1.1bn settlement over acceleration lawsuits.

Despite Toyota cars bouncing back to the service centers like ping pong balls, that doesn’t appear to have negatively impacted its showing in the J.D. Power survey.


Toyota top in 2013 reliability but US marques show resurgence is written by Chris Davies & originally posted on SlashGear.

Apple and Samsung grab over half smartphone share as phone sales slim in 2012

Apple and Samsung now collectively dominate more than half of the worldwide smartphone market by sales, according to new research, though the phone market as a whole shrank slightly in 2012, year-on-year. Samsung held pole position, Gartner said, in both smartphones and phones generally, with almost 385m global sales in the year; Apple’s sales reached 130m units for the same period. Meanwhile, Huawei managed to snag the number three spot in Q4 2012, though Gartner points out that it’s far from a stable position.

[Image: iPhone 5 and Galaxy S III]

While Huawei sold 27.2m smartphones in 2012, an impressive rise of 73.8-percent year-on-year, and is predicted to grow further in 2013, Gartner highlights brand strength as a key shortcoming. “The success of Apple and Samsung is based on the strength of their brands as much as their actual products,” analyst Anshul Gupta argues. “Their direct competitors, including those with comparable products, struggle to achieve the same brand appreciation among consumers.” That, it’s suggested, leaves plenty of room for others to poach the position.

One company unlikely to take that third place any time soon is Nokia, at least going on Gartner’s numbers. The research firm highlighted the low-cost Asha devices as a point of phone success, but Nokia’s overall market share fell 18-percent, with just over 39m smartphones sold in 2012 as a whole. The premature demise of Symbian sales and the race to the bottom to better compete with budget Android devices are partially blamed for Nokia’s shortcomings.

Within Android, Samsung has 42.5-percent of all Android device sales globally, with Google’s platform holding more than half the OS market in the smartphone segment. Windows Phone, meanwhile, grew 1.2-percent in Q4 2012, and the research firm predicts 2013 will be the year when it and BlackBerry 10 fight for the position of third ecosystem.

Nonetheless, despite Samsung and Apple’s successes, the phone market overall declined a little in 2012. Overall sales to end-users hit 1.75bn units, down 1.7-percent from 2011, though that’s primarily down to feature phone decline; in Q4, for instance, smartphone sales broke records, up more than 38-percent year-on-year.

[Charts: Gartner worldwide phone and smartphone sales data]


Apple and Samsung grab over half smartphone share as phone sales slim in 2012 is written by Chris Davies & originally posted on SlashGear.

New Software Resurrects Dead Languages

Researchers have pieced together a new tool specially developed to reconstruct long-dead languages, also known as protolanguages. Protolanguages are the ancient tongues from which our modern languages have evolved. To test the system, the research team took 637 languages currently spoken in Asia and the Pacific and recreated the early language from which they descended.

At present, language reconstruction is carried out by linguists, and it is a slow, labor-intensive process. Dan Klein, an associate professor at the University of California, Berkeley, said, “It’s very time consuming for humans to look at all the data. There are thousands of languages in the world, with thousands of words each, not to mention all of those languages’ ancestors. It would take hundreds of lifetimes to pore over all those languages, cross-referencing all the different changes that happened across such an expanse of space – and of time. But this is where computers shine.”

The question does arise, is it possible to go beyond the many protolanguages, to the very first protolanguage that ever existed?

[via Ubergizmo]

How Effective Can a Microscopic Nanotube Cupid Really Be?

Banking on the whole notion of it not being the size of the gift that counts, physics students at Brigham Young University have created what could be the world’s tiniest cupid, made of carbon nanotubes.

Subsidy scam: Lifeline phone program misuse rife as FCC weighs fines

Lax self-certification rules and poor record keeping have seen the Lifeline program – the subsidized cellphone scheme aiming to give low-income Americans access to a mobile phone – taken advantage of, according to new research. The program – which spent roughly $2.2bn on subsidized phones in 2012 – has seen its rules tightened by the FCC after concerns that carriers were not doing due diligence on whether participants were actually eligible. However, according to research by the WSJ, a sizable portion of those still using Lifeline may not, in fact, fall within its remit.

[Image: pile of cellphones]

[Image credit: Matthijs Rouw]

Research conducted for the newspaper by the FCC into subscribers on Virgin Mobile, AT&T, Telrite, Tag Mobile, and Verizon indicated that more than 40-percent of Lifeline subscribers either could not demonstrate their eligibility or simply did not respond to requests for proof.

Changes in the FCC rules around Lifeline came into effect last year, toughening considerably what hoops would-be subscribers needed to jump through in order to get service. For instance, some states had permitted applications without any evidence of eligibility – which includes being on Medicaid, food stamps, or various other criteria – while others allowed self-certification rather than production of official evidence.

[Chart: Lifeline program statistics]

Carriers are paid $9.25 per customer, per month by the Universal Service Administrative Co., which administers the Lifeline scheme. American taxpayers contribute roughly $2.50 per month each, though that money is spread across several subsidy schemes, including broadband and landline access.

As a result of the new research, the FCC is apparently considering levying fines on carriers which have been particularly liberal with their Lifeline sign-ups. That could prove expensive: up to $150,000 per violation per day, though that’s capped at $1.5m.
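
The quoted fine structure is straightforward arithmetic: at $150,000 per violation per day with a $1.5m cap, the maximum is reached after just ten days. A quick sketch:

```python
# Sanity check on the fine structure described above: $150,000 per
# violation per day, capped at $1.5 million per violation.
def max_fine(days, per_day=150_000, cap=1_500_000):
    return min(days * per_day, cap)
```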

The carriers, meanwhile, argue that the previously lax criteria – and the absence of a central database of scheme participants – made it difficult for them to monitor users; there were also few checks on whether households did indeed have only one active Lifeline account, as the rules require. Still, the FCC expects savings of up to $2bn over the next three years from the toughened checks.


Subsidy scam: Lifeline phone program misuse rife as FCC weighs fines is written by Chris Davies & originally posted on SlashGear.

Researchers use computer program to reconstruct ancient languages

Languages have evolved over the years, and some have died off completely. However, these extinct languages are still around today thanks to documentation, and researchers are now trying to reconstruct them using a modified version of Rosetta Stone, the computer program that teaches users how to speak a different language.

[Image: hieroglyphics]

The team of researchers reconstructed a set of protolanguages from a database of more than 142,000 words from 637 Austronesian languages, which are spoken in Southeast Asia, the Pacific, and parts of continental Asia. Essentially, the computer program does the work that would normally take “hundreds of lifetimes” for people to do manually, according to Dan Klein, who is an associate professor at the University of California, Berkeley.

Protolanguages are reconstructed by grouping words with common meanings from today’s related languages, analyzing their common features, and applying sound-change rules and other criteria to derive the common parent word. Researchers believe these languages were spoken about 7,000 years ago.

Essentially, the program replicates what linguists do manually with 85% accuracy, and it takes just hours instead of years. The program uses an algorithm known as a Markov chain Monte Carlo sampler, and it sorts through sets of words in different languages that share a common sound, history, and origin. While researchers are able to analyze these older languages, it’s still up in the air whether they can go further back in time to reconstruct the very first protolanguage from which all other languages derived.
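
The Berkeley system’s probabilistic MCMC model is well beyond a short example, but the comparative step described above can be caricatured in a few lines: align cognates and take the most common sound at each position. The word forms below are invented purely for illustration:

```python
# Toy comparative reconstruction: given pre-aligned, equal-length cognates
# (words with the same meaning in related languages), take the most common
# sound at each position as the reconstructed proto-form. This is only a
# cartoon of the intuition; the real system models sound change
# probabilistically and samples reconstructions with MCMC.
from collections import Counter

def reconstruct(cognates):
    proto = []
    for sounds in zip(*cognates):   # one aligned position across languages
        proto.append(Counter(sounds).most_common(1)[0][0])
    return "".join(proto)
```

For example, `reconstruct(["pater", "pader", "fater"])` returns `"pater"`: at each position the majority sound wins, mimicking (very crudely) how shared features point back to the parent word.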

[via BBC News]

Image via Flickr


Researchers use computer program to reconstruct ancient languages is written by Craig Lloyd & originally posted on SlashGear.