T-Mobile postpones Sprint 3G shutdown to March 31st, 2022

T-Mobile will wait a while longer to shut down Sprint’s 3G network. The Verge reports T-Mobile has delayed the CDMA network shutdown from January 1st, 2022 to March 31st of that year. The carrier pinned the delay on “partners” who hadn’t “followed through” on helping their customers transition to newer network technology.

This would supposedly give partners “every opportunity” to fulfill their obligations. “There should be no more room for excuses,” T-Mobile said.

The explanation appears to be a not-so-subtle attempt to pin the blame on Dish. The satellite TV provider bought Boost Mobile from T-Mobile in July 2020 and planned to use Sprint’s legacy network until it could move Boost customers to its 5G service. Dish argued this didn’t give it enough time to migrate its customers, and accused T-Mobile of anti-competitive behavior meant to push Boost exiles to T-Mobile.

This may be a response to both Dish’s original accusation and the ensuing fallout. The Justice Department told Dish and T-Mobile in July that it had serious concerns about the Sprint network shutdown, asking the two companies to do whatever was necessary to lessen the blow. The delay might address those worries and reduce the chances of more serious government scrutiny.

A delayed shutdown still isn’t ideal. T-Mobile expects to shutter Sprint’s LTE network on June 30th, 2022. This leaves a three-month window where Boost customers might have LTE access, but nothing else. While you’ll probably have made a decision by the March cutoff if you’re a customer, this won’t be a very gradual shift for some users — they’ll have just a short period of limited Boost service before they have to embrace 5G.

Hitting the Books: The genetic fluke that enabled us to drink milk

It may not contain our recommended daily allowance of Vitamin R, but milk — or “cow juice” as it’s known on the streets — is among the oldest known animal products repurposed for human consumption. Milk has been a staple of our diets for more than 8,000 years, but it wasn’t until a fortuitous mutation to the human genome that we were able to properly digest that delicious bovine-based beverage. In her latest book, Life as We Made It: How 50,000 Years of Human Innovation Refined — and Redefined — Nature, author Beth Shapiro takes readers on a journey of scientific discovery, explaining how symbiotic relationships between humans and the environment around us have changed — but not always for the better.

Life as We Made It by Beth Shapiro (Basic Books)

Excerpted from Life as We Made It: How 50,000 Years of Human Innovation Refined—and Redefined—Nature by Beth Shapiro. Copyright © 2021. Available from Basic Books, an imprint of Hachette Book Group, Inc.


The first archaeological evidence that people were dairying dates to around 8,500 years ago — 2,000 years after cattle domestication. In Anatolia (present-day Turkey), in a region pretty far from the original center of cattle domestication, archaeologists recovered milk fat residues from ceramic pots, indicating that people were processing milk by heating it up. Similar analyses of milk fat proteins in ceramics record the spread of dairying into Europe, which appears to have happened simultaneously with the spread of domestic cattle.

It’s not surprising that people began dairying soon after cattle domestication. Milk is the primary source of sugar, fat, vitamins, and protein for newborn mammals, and as such has evolved expressly to be nutritious. It would not have taken much imagination for a cattle herder to deduce that a cow’s milk would be just as good for him and his family as it was for her calf. The only challenge would have been digesting it—without the lactase persistence mutation, that is.

Because lactase persistence allows people to take advantage of calories from lactose, it also makes sense that the spread of the lactase persistence mutation and the spread of dairying would be tightly linked. If the mutation arose near the start of dairying or was already present in a population that acquired dairying technology, the mutation would have given those who had it an advantage over those who did not. Those with the mutation would, with access to additional resources from milk, more efficiently convert animal protein into more people, and the mutation would increase in frequency.

Curiously, though, ancient DNA has not found the lactase persistence mutation in the genomes of early dairy farmers, and the mutation is at its lowest European frequency today in the precise part of the world where dairying began. The first dairy farmers were not, it seems, drinking milk. Instead, they were processing milk by cooking or fermenting it, making cheeses and sour yogurts to remove the offending indigestible sugars.

If people can consume dairy products without the lactase persistence mutation, there must be some other explanation as to why the mutation is so prevalent today. And lactase persistence is remarkably prevalent. Nearly a third of us have lactase persistence, and at least five different mutations have evolved—all on the same stretch of intron 13 of the MCM6 gene—that make people lactase persistent. In each case, these mutations have gone to high frequency in the populations in which they evolved, indicating that they provide an enormous evolutionary advantage. Is being able to drink milk (in addition to eating cheese and yogurt) sufficient to explain why these mutations have been so important?

The most straightforward hypothesis is that, yes, the benefit of lactase persistence is tied to lactose, the sugar that represents about 30 percent of the calories in milk. Only those who can digest lactose have access to these calories, which may have been crucial calories during famines, droughts, and disease. Milk may also have provided an important source of clean water, which also may have been limited during periods of hardship.

Another hypothesis is that milk drinking provided access to calcium and vitamin D in addition to lactose, the complement of which aids calcium absorption. This might benefit particular populations with limited access to sunlight, as ultraviolet radiation from sun exposure is necessary to stimulate the body’s production of vitamin D. However, while this might explain the high frequency of lactase persistence in places like northern Europe, it cannot explain why populations in relatively sunny climates, such as parts of Africa and the Middle East, also have high frequencies of lactase persistence.

Neither this hypothesis nor the more straightforward hypothesis linked to lactose can explain why lactase persistence is at such low frequency in parts of Central Asia and Mongolia where herding, pastoralism, and dairying have been practiced for millennia. For now, the jury is still out as to why lactase persistence has reached such high frequencies in so many different parts of the world, and why it remains at low frequencies in some regions where dairying is economically and culturally important.

Ancient DNA has shed some light on when and where the lactase persistence mutation arose and spread in Europe. None of the remains from pre-Neolithic archaeological sites—economies that relied on hunting and gathering—have the lactase persistence mutation. None of the ancient Europeans from early farming populations in southern and central Europe (people believed to be descended from farmers spreading into Europe from Anatolia) had the lactase persistence mutation. Instead, the oldest evidence of the lactase persistence mutation in Europe is from a 4,350-year-old individual from central Europe. Around that same time, the mutation is found in a single individual from what is now Sweden and at two sites in northern Spain. While these data are sparse, the timing is coincident with another major cultural upheaval in Europe: the arrival of Asian pastoralists of the Yamnaya culture. Perhaps the Yamnaya brought with them not only horses, wheels, and a new language, but an improved ability to digest milk.

The mystery of lactase persistence in humans highlights the complicated interaction among genes, environment, and culture. The initial increase in frequency of a lactase persistence mutation, regardless of in whom it first arose, may have happened by chance. When the Yamnaya arrived in Europe, for example, they brought disease—specifically plague—that devastated native European populations. When populations are small, genes can drift quickly to higher frequency regardless of what benefit they might provide. If the lactase persistence mutation was already present when plague appeared and populations crashed, the mutation’s initial increase may have happened surreptitiously. When populations recovered, dairying was already widespread and the benefit to those with the mutation would have been immediate. By domesticating cattle and developing dairying technologies, our ancestors created an environment that changed the course of our own evolution.

We continue to live and evolve in this human-constructed niche. In 2018, our global community produced 830 million metric tons (more than 21 billion US gallons) of milk, 82 percent of which was from cattle. The rest comes from a long list of other species that people domesticated within the last 10,000 years. Sheep and goats, which together make up around 3 percent of global milk production, were first farmed for their milk in Europe around the same time as cattle dairying began. Buffaloes were domesticated in the Indus Valley 4,500 years ago and are today the second largest producer of milk next to cattle, producing around 14 percent of the global supply. Camels, which were domesticated in Central Asia 5,000 years ago, produce around 0.3 percent of the world’s milk supply. People also consume milk from horses, which were first milked by people of the Botai culture 5,500 years ago; yaks, which were domesticated in Tibet 4,500 years ago; donkeys, which were domesticated in Arabia or East Africa 6,000 years ago; and reindeer, which are still in the process of being domesticated. But those are just the most common dairy products. Dairy products from more exotic species—moose, elk, red deer, alpacas, llamas—can be purchased and consumed today, and rumor has it that Top Chef’s Edward Lee is working out how to make pig milk ricotta, should one want to try such a thing.

Charlie Cox Thinks It's Fine If He's Not Daredevil Again

Charlie Cox became known to most audiences when he suited up as Daredevil for Netflix and did a pretty dang good job at it. Since the show was cancelled in 2018, fans have been demanding that it and most of the other Netflix Marvel shows (sorry, Iron Fist) return, and some have even convinced themselves he’s showing up in…


GM’s chip woes ease as more trucks are completed

Automotive manufacturers are facing significant hardships from the chip shortage that’s plaguing the world. GM decided to continue producing its popular pickups even though they could not be completed for shipping. Instead of stopping manufacturing lines, GM simply built the trucks and then parked them in lots around the country until the chips needed to complete them were available …

20 years ago Apple introduced the iPod, the perfect gateway drug to the Mac

It’s hard to remember, but 20 years ago, Apple was not a very cool company. Sure, OS X was intriguing, and the titanium PowerBook was definitely a cool computer. But when most people thought of Apple, it was probably the bulbous, colorful iMac G3 that came to mind. The company was starting to build its reputation for truly desirable products, but it wasn’t solidified just yet.

That all changed on October 23, 2001, when Steve Jobs pulled the first iPod out of his pocket. For a generation of music fans, it became the quintessentially cool item that was more than just a fad. It’s not a stretch to say it reinvented the music industry while simultaneously paving a path for Apple to become the world’s biggest company. It was the ultimate gateway drug to getting people who had never bought an Apple product before to see what all the fuss was about.

At this point, the somewhat skeptical reception to the iPod is part of tech industry lore – particularly Slashdot’s dismissal of the product as “lame” compared to a Nomad MP3 player. (Raise your hand if you ever used a Nomad. That’s what I thought.) And it’s not like the product was an instant hit – the first iPod cost $400 and only worked with the Mac, two factors that limited its appeal.

Those limitations helped it achieve some serious cachet, though. Seeing an iPod in the wild was a rarity, and my Mac-owning friends who were early adopters had to deal with my incessant questions and requests to hold it and spin its distinctive wheel. It didn’t help that my college suite-mate (who had a titanium PowerBook and iPod) and graphic designer friend (with a PowerMac G4 and iPod) were constantly going off about how great their hardware was. I was primed to become one of those switchers Apple liked to talk about in the early 2000s.

The iPod may have started out as a Mac-only product, but less than a year later, Apple opened it up to the other 98 percent of computer users by introducing a Windows-compatible model in the summer of 2002. Less than a year after that, Apple completely redesigned the iPod and released a new version of iTunes for Windows. At the same time, Apple launched the iTunes Music Store, making it a lot easier to get legal music onto an iPod. With that, the iPod moved fully into the mainstream.

There’s no good way to quantify how many people bought an iPod for Windows and then eventually switched to a Mac. But Mac sales increased from about 3 million in 2003 to more than 7 million by 2007. Apple’s switch to more powerful Intel processors, announced in 2005, likely helped adoption, but the iPod “halo effect” was often cited in the mid-2000s as a driver of the Mac’s increasing popularity.

Growing Mac sales and the most popular consumer electronics device of the decade truly paved the way for the iPhone to be the monumental success that it was. Sure, the iPhone eventually killed the iPod, but as Steve Jobs said, he’d rather cannibalize Apple’s sales with another Apple product than let some other company do it — this was how he justified the existence of the iPod touch, which was basically an iPhone without a phone.

I might be overselling the iPod to Mac to iPhone evolution, because I lived it. After getting a second-generation iPod in 2002 (embarrassing admission time: I also bought four more full-size iPods between then and 2009), I got my first Mac in 2003 and the first iPhone in late 2007. I remember being more excited about my first iPhone than my first iPod, mostly because it was light years better than the Moto RAZR I was using at the time. But my first iPod was similarly a huge step forward from the MP3 players I owned before. And in my early 20s, there was nothing more important to me than music.

That may not make me unique, but it’s still true. Before the iPod was everywhere, someone else who had one was someone you could trust. They took music as seriously as you did; they knew how liberating it was to have your 100 favorite albums with you, on demand, any time you needed them. In a world where Apple Music offers access to 90 million songs anywhere you are for 10 bucks a month, that might seem quaint. But 20 years ago, it was a revelation.

I still have the last iPod I ever purchased, a 2008 iPod classic with 120GB of storage – about the same space as I have in my iPhone 12 Pro. It’s still stuffed to the gills with music, some 11,000-plus songs, most of which come from albums I carefully selected over time. Most of them are still in my Apple Music library, which has now ballooned to more than double that size, with over 25,000 songs.

I’m still a firm believer in the art of making a good album, but I’ve also collected thousands of singles, or a handful of songs from artists who catch my ear on one of the many curated playlists out there. The music industry has changed, and so have I. Whether or not that’s a good thing is a debate for another time, but there’s no doubt that both the music and technology industries changed completely because of the iPod – something its humble introduction 20 years ago only barely hinted at.

Samsung's giant Galaxy Tab S8 Ultra might include a notch

Rumors have persisted of a flagship Samsung tablet even larger than the Galaxy Tab S7+, and now we might know what it looks like. OnLeaks and 91Mobiles have shared what they claim are images of the Galaxy Tab S8 Ultra. The slate would minimize the impact of its huge 14.6-inch display by stuffing the front camera system into a notch — potentially distracting, but better than conventional bezels that would make the tablet even larger.

The design wouldn’t be quite so unusual on the other side. The leak suggests the Galaxy Tab S8 Ultra would have dual rear cameras and the familiar magnetic strip to hold your S Pen. The source claimed the imagery was “not 100 percent complete,” so the design could still change slightly, however accurate it is as of this writing.

The regular Tab S8 and S8+ models aren’t expected to use the notch. They might instead be subtle evolutions of the existing designs, which tuck the front camera into the bezel.

It’s not clear just when the Galaxy Tab S8 Ultra would arrive. 91Mobiles speculates that Samsung might launch the design in November or December, but it would be odd to wait until the very end of the year to release an important tablet, even if chip shortages weren’t a factor. It might be easier for Samsung to wait until early 2022, when it can launch the Tab S8 series alongside the Galaxy S22.

One More Voting Rights Filibuster Will Soon Cause A Final Showdown In The Senate

Senate Dems are calling for a return to original principles as the push to change the filibuster approaches its climax.

‘New Democrats’ Break With Their Anti-Welfare Past And Back Biden’s Agenda

Rep. Suzan DelBene of Washington state is the centrist Democrat at the center of everything.

Full-Time Work Is Not Enough For Millions Of Americans. We Need To Talk About It.

“I met people across the country, who worked multiple jobs, and still couldn’t reliably pay all their bills each month.”

Apple Car hits another snag, this time with batteries

The Apple Car project has been on-again, off-again for the last several years. Many people who followed the project when it was first revealed assumed it was dead. However, a rumor has surfaced that Apple is moving forward with the project and is in talks with battery manufacturers. The battery makers specifically named in the alleged talks with Apple were …