NBA TV rights go to ESPN, NBC and Amazon as TNT is rejected

The NBA and WNBA have inked deals for where games will be aired and streamed for the next eleven years. The NBA deals run from the 2025-2026 season through the 2035-2036 season. For the WNBA, the agreement covers the 2026 through 2036 seasons.

Pro basketball has been an ESPN mainstay for years, and that will continue, with the Disney-owned network remaining the primary media rights holder for both leagues. ESPN will be the exclusive home of the NBA Finals for all eleven years of the new deal, as well as five of the eleven WNBA Finals. The games covered by ESPN’s deal will be part of the sports network’s direct-to-consumer platform, and a package of NBA and WNBA games will also be available to stream on Disney+ in select international markets.

While the bulk of the games will go to ESPN, basketball will have a bigger streaming presence thanks to two new partnerships. NBC and Peacock will carry 100 national NBA games each regular season, with around 50 of them exclusive to the Peacock streaming platform, including national Monday night games and doubleheaders. The remaining national games go to Amazon: Prime Video will be the home of 66 regular-season NBA games and 30 regular-season WNBA games in each year of the deal.

Regular basketball viewers may notice that TNT Sports is not part of this lineup. The NBA’s deal with that network does not appear to be getting an extension after next season, with those games largely going to Amazon. But the situation may yet go into overtime: TNT Sports claims it matched Amazon’s offer for the games and appears to be challenging whether the NBA can switch partners. The NBA’s statement counters that the offer from parent company Warner Bros. Discovery did not match Amazon’s, leaving the league free to shop elsewhere.

The long-awaited agreements for both basketball leagues aren’t a complete slam dunk for fans. On the positive side, the next decade marks a notable shift toward streaming; after years of the sport being closely tied to broadcast and cable TV, having games included in your existing streaming plans is welcome. On the negative side, multiple media partners mean you’ll have to double- and triple-check where to watch each game. Major League Baseball, for instance, has games scattered across ESPN, Fox, Apple TV+, TNT Sports, and MLB Network on any given night.

This article originally appeared on Engadget at https://www.engadget.com/nba-tv-rights-go-to-espn-nbc-and-amazon-as-tnt-is-rejected-230811550.html?src=rss

Max's SharePlay feature for iOS is now available to all ad-free subscribers

Back when Max was still known as HBO Max, it released a redesigned app that added SharePlay for Apple devices, but only in the US. Now, the streaming service is rolling out the feature worldwide. SharePlay is available to all Max users on the Ad-Free and Ultimate Ad-Free plans, allowing them to host and join watch parties over FaceTime and iMessage no matter where they are.

Users can start watching with friends by hitting the “share” button either on a title’s details page or within the FaceTime app. Each session can have as many as 32 participants, but they all have to be Max subscribers. That means people in regions where Max isn’t available, such as much of Asia, won’t be able to hop on and watch with their pals in the US or Europe. Warner Bros. Discovery plans to expand Max to Southeast Asia later this year, but it warns on its website that the timeline could still change.

SharePlay for Max works on iPhones, iPads, Apple TVs and Vision Pro headsets. To initiate a watch party on iPhones, iPads and Vision Pros, users have to find the Share icon on the details page of a show or a movie, enter the contacts they want to share with and initiate a FaceTime call. If they choose Messages on their mobile devices, their friends will get a message asking them to join SharePlay. On Apple TV, users will have to open FaceTime first before clicking the SharePlay button and choosing Max from the app list. 

This article originally appeared on Engadget at https://www.engadget.com/maxs-shareplay-feature-for-ios-is-now-available-to-all-ad-free-subscribers-040624031.html?src=rss

The Google Pixel Buds A-Series drop to $69

Amazon Prime Day 2024 might be behind us but the deals keep coming and they don’t stop coming. Folks with a Google Pixel or other Android device who are in the market for a set of budget-friendly earbuds may be interested in a discount on the Pixel Buds A-Series (they’re compatible with iPhones too, but the integration won’t be as deep on iOS products). These earbuds were already decent value at $99, and now they’ve dropped to an even more attractive price of $69.

We gave the Pixel Buds A-Series a score of 84 in our 2021 review. They don’t support wireless charging or have onboard controls, but otherwise we felt that they deliver excellent value for money (even more so now thanks to the current discount).

The sound quality is pretty darned decent and the buds can reduce background noise while you’re on calls. You’ll get up to five hours of listening time or 2.5 hours of talk time before you’ll need to return the earbuds to the case, Google says. The case should take you to up to 24 hours of total listening time before it needs a charge of its own. Thanks to quick charging, you’ll be able to add three hours of listening time by plugging in the case for just 15 minutes. While there’s an adaptive sound function that automatically adjusts the volume, there’s no true active noise cancellation here.

If you’d like a more premium option, you can go with the Pixel Buds Pro instead. Those have dropped to $140, which is $60 off, though they fell as low as $120 during Prime Day.

Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/the-google-pixel-buds-a-series-drop-to-69-145630793.html?src=rss

ASUS Unveils Enhanced ROG Ally X Handheld Game Console

ASUS Republic of Gamers (ROG) has announced the availability of its new handheld game console, the ROG Ally X. Designed for enhanced Windows 11 gaming, the ROG Ally X features numerous hardware upgrades, including an AMD Ryzen Z1 Extreme processor, 1 TB SSD storage, 24 GB DDR5-7500 RAM, and an 80Wh battery, doubling the previous model’s capacity.

The ROG Ally X offers improved ergonomics with a redesigned black chassis, deeper handles, and more ergonomic button and stick placement. The joysticks are upgraded for durability with a 5 million cycle lifespan, and the D-Pad now supports precise 8-direction input.

The device features two USB Type-C ports, including one Thunderbolt-compatible, replacing the XG Mobile port for better connectivity with third-party docks and external GPUs. Thermal improvements include 23% smaller fans with 50% thinner blades, new airflow tunnels, and an additional exhaust vent, increasing air volume by 24% and cooling the touchscreen by up to 6 °C.

Specs

Operating System: Windows 11 Home
Display: 7” FHD (1920 × 1080) 16:9 touchscreen, 500 nits brightness, 100% sRGB, Corning Gorilla Glass Victus with Gorilla Glass DXC coating, 120 Hz refresh rate, 7 ms response time, AMD FreeSync Premium
CPU: AMD Ryzen Z1 Extreme processor (“Zen 4” architecture on a 4 nm process, 8 cores / 16 threads, 24 MB total cache, up to 5.10 GHz boost)
GPU: AMD Radeon Graphics (AMD RDNA 3, 12 CUs, up to 2.7 GHz, up to 8.6 teraflops)
Memory: 24 GB LPDDR5 on board (7500 MHz, dual channel)
Storage: Up to 1 TB M.2 2280 NVMe PCIe Gen 4 x4 Value SSD
I/O Ports:
  • 1 x audio combo jack
  • 1 x microSD card reader (UHS-II, 312 MB/s)
  • 1 x USB-C (USB 3.2 Gen 2, DisplayPort 1.4 with FreeSync support, Power Delivery 3.0; input: 20 V / 5 A, output: 5 V / 1.5 A)
  • 1 x USB4 (Thunderbolt 4 compliant, DisplayPort 1.4 with FreeSync support, Power Delivery 3.0; input: 20 V / 5 A, output: 5 V / 3 A)
Audio: 2-speaker system with Smart Amplifier Technology, Dolby Atmos, AI noise-canceling technology, Hi-Res Audio certification, built-in array microphone
Battery: 80 Wh
Power Supply: 65 W USB-C AC adapter (output: 20 V DC, 3.25 A, 65 W; input: 100~240 V AC, 50/60 Hz universal)
Dimensions: 11.02″ x 4.37″ x 0.97″ ~ 1.45″
Weight: 1.49 lbs (678 g)

Availability and Price

The ROG Ally X is available for purchase at Best Buy and the ASUS Online Store, priced at $799.99.

ASUS Unveils Enhanced ROG Ally X Handheld Game Console, original content from Ubergizmo. Read our Copyrights and terms of use.

Breakthrough Quantum Microscopy Reveals Electron Movements In Slow Motion

Researchers at the University of Stuttgart have developed a groundbreaking quantum microscopy method that allows for the visualization of electron movements in slow motion, a feat previously unachievable. Prof. Sebastian Loth, managing director of the Institute for Functional Matter and Quantum Technologies (FMQ), explains that this innovation addresses long-standing questions about electron behavior in solids, with significant implications for developing new materials.

In conventional materials like metals, insulators, and semiconductors, atomic-level changes do not alter macroscopic properties. However, advanced materials produced in labs show dramatic property shifts, such as turning from insulators to superconductors, with minimal atomic modifications. These changes occur within picoseconds, directly affecting electron movement at the atomic scale.

Image: The imaging tip of the time-resolving scanning tunneling microscope captures the collective electron motion in materials through ultrafast terahertz pulses. (Photo credit: © Shaoxiang Sheng, University of Stuttgart, FMQ)

Loth’s team has successfully observed these rapid changes by applying a one-picosecond electrical pulse to a material made of niobium and selenium and studying the collective motion of electrons in a charge density wave. They discovered how single impurities can disrupt this collective movement, sending nanometer-sized distortions through the electron collective. This research builds on previous work at the Max Planck Institutes in Stuttgart and Hamburg.

Understanding how electron movement is halted by impurities could enable the targeted development of materials with specific properties, beneficial for creating ultra-fast switching materials for sensors or electronic components. Loth emphasizes the potential of atomic-level design to impact macroscopic material properties.

The innovative microscopy method combines a scanning tunneling microscope, which offers atomic-level resolution, with ultrafast pump-probe spectroscopy to achieve both high spatial and temporal resolution. The experimental setup is highly sensitive, requiring shielding from vibrations, noise, and environmental fluctuations to measure extremely weak signals. The team’s optimized microscope can repeat experiments 41 million times per second, ensuring high signal quality and making them pioneers in this field.
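
As a rough illustration of why such a high repetition rate matters (our back-of-the-envelope math, not a figure from the Stuttgart team): if the noise in each repetition is uncorrelated, averaging N repetitions improves the signal-to-noise ratio by roughly the square root of N.

```python
# Signal-averaging illustration (assumes uncorrelated noise between repetitions;
# this is generic reasoning, not data from the Stuttgart experiment).
import math

rep_rate = 41e6  # repetitions per second, as quoted in the article

for seconds in (1, 60, 3600):
    n = rep_rate * seconds  # total repetitions averaged over this window
    gain = math.sqrt(n)     # approximate SNR improvement from averaging
    print(f"{seconds:>5} s of averaging: N = {n:.1e}, SNR gain ~ {gain:,.0f}x")
```

Even a single second of averaging at 41 million repetitions per second buys a gain of a few thousand, which is how extremely weak tunneling signals can be pulled out of the noise.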

Breakthrough Quantum Microscopy Reveals Electron Movements In Slow Motion, original content from Ubergizmo. Read our Copyrights and terms of use.

GM shelves the autonomous Cruise Origin shuttle van

General Motors is putting the autonomous Cruise Origin shuttle van on ice. The company said that the embattled Cruise, of which GM is the majority owner, will now build its robotaxi efforts around the next-gen Chevy Bolt instead. The automaker discontinued the previous Bolt last year due to a shift away from an older battery system but did not reveal plans for a new model at the time.

According to a letter that GM CEO Mary Barra sent to shareholders, the indefinite delay of the shuttle van “addresses the regulatory uncertainty we faced with the Origin because of its unique design.” Barra added that the per-unit costs of the next-gen Bolt will be much lower, “which will help Cruise optimize its resources.”

GM and Cruise were working on the Origin with Honda. The Origin — which does not have a driver’s seat, steering wheel or pedals — was supposed to debut in Japan in 2026.

In October, the California Department of Motor Vehicles suspended Cruise’s driverless vehicle permits over safety issues. Earlier that month, a pedestrian in San Francisco was dragged 20 feet by a Cruise vehicle and pinned under it after a hit-and-run by another car pushed her into the robotaxi’s path. Cruise later paused all driverless operations before temporarily halting production in November.

According to CNBC, former Cruise CEO Kyle Vogt at one point told staff that hundreds of pre-commercial Origin vehicles had been built. The company has resumed robotaxi operations in Phoenix, Houston and Dallas with human operators on board and is carrying out tests in Dubai. However, it hasn’t recommenced operations in San Francisco. It’s still under investigation for the October incident there.

Shelving the Origin is not a decision that GM and Cruise would have come to lightly. In GM’s second quarter earnings report, the automaker noted that it incurred around $583 million of Cruise restructuring costs. It said these resulted “from Cruise voluntarily pausing its driverless, supervised and manual [autonomous vehicle] operations in the US and the indefinite delay of the Cruise Origin.”

On the plus side, resuming work on the Bolt (which will presumably use GM’s Ultium battery tech the next time around) could be a boon for GM’s bottom line. As of 2023, the Bolt EV and EUV accounted for most of GM’s electric vehicle sales. It planned to make around 70,000 of them last year before ceasing production.

This article originally appeared on Engadget at https://www.engadget.com/gm-shelves-the-autonomous-cruise-origin-shuttle-van-144256801.html?src=rss

Meta's AI assistant is coming to Quest headsets in the US and Canada

Meta’s AI-powered assistant has been accessible on the Ray-Ban smart glasses for quite some time, but the company will only start rolling it out to its Quest headsets next month. The assistant will still be in experimental mode, however, and its availability will be limited to users in the US and Canada. Meta revealed the update alongside its announcements for Llama 3.1 and new Meta AI capabilities.

Users who get access to the assistant in August will be able to put its hands-free controls to the test. The company said Meta AI is replacing the current technology behind Voice Commands on Quest, so it will handle the headset whenever people use their voice for navigation and will answer their questions when they ask for information. They can ask the assistant for restaurant recommendations for an upcoming trip, for example, or ask about the weather during those days, along with suggestions on how to dress for it.

They will also be able to use the “Meta AI with Vision” feature, which will let them ask the assistant for information on what they’re seeing, while using Passthrough on the Quest. Passthrough lets users see their environment through a video feed while watching or doing something else on their headsets. A user can, for instance, ask the assistant to look at what’s inside the fridge and suggest what they can cook, or ask for tips on what kind of top would go with a skirt they’re holding up, all while watching a YouTube video.

This article originally appeared on Engadget at https://www.engadget.com/metas-ai-assistant-is-coming-to-quest-headsets-in-the-us-and-canada-150033530.html?src=rss

Llama 3.1 is Meta's latest salvo in the battle for AI dominance

Meta on Tuesday announced the release of Llama 3.1, the latest version of its large language model that the company claims now rivals competitors from OpenAI and Anthropic. The new model comes just three months after Meta launched Llama 3 by integrating it into Meta AI, a chatbot that now lives in Facebook, Messenger, Instagram and WhatsApp and also powers the company’s smart glasses. In the interim, OpenAI and Anthropic already released new versions of their own AI models, a sign that Silicon Valley’s AI arms race isn’t slowing down any time soon.

Meta said that the new model, called Llama 3.1 405B, is the first openly available model that can compete against rivals in general knowledge, math skills and translating across multiple languages. The model was trained on more than 16,000 NVIDIA H100 GPUs, currently the fastest available chips that cost roughly $25,000 each, and can beat rivals on over 150 benchmarks, Meta claimed.

The “405B” stands for 405 billion parameters, the internal variables an AI model uses to reason and make decisions. The higher the number of parameters, the smarter we tend to perceive the model to be. OpenAI’s GPT-4, by comparison, reportedly has roughly 1.5 trillion parameters, although the company has not disclosed the number. Meta also released upgraded versions of its existing Llama models with 70 billion and 8 billion parameters each, claiming that the newer versions have stronger reasoning abilities, among other things.
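
For a sense of scale (a back-of-the-envelope sketch, not figures from Meta): the parameter count alone determines how much memory the model weights occupy at a given numeric precision.

```python
# Rough memory footprint of a 405-billion-parameter model's weights alone.
# Activations, KV cache and serving overhead are ignored, so real deployments
# need considerably more headroom than this.
PARAMS = 405e9
H100_MEMORY_GB = 80  # memory per NVIDIA H100 GPU

for precision, bytes_per_param in [("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = PARAMS * bytes_per_param / 1e9
    gpus = gb / H100_MEMORY_GB
    print(f"{precision:>9}: ~{gb:,.0f} GB of weights (~{gpus:.0f}+ H100s just to hold them)")
```

Those numbers are why the smaller 8B and 70B variants are the ones most developers will actually run themselves.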

Developers can download Llama 3.1 from its official website, while regular users can play with it through Meta AI in WhatsApp or on meta.ai, the company’s website for its chatbot. “Llama 405B’s improved reasoning capabilities make it possible for Meta AI to understand and answer your more complex questions, especially on the topics of math and coding,” Meta’s blog post states. “You can get help on your math homework with step-by-step explanations and feedback, write code faster with debugging support and optimization.” (Editor’s note: Engadget will pit Llama 3.1 against the New York Times Spelling Bee and report back to you.) For now, Meta AI on Facebook, Messenger and Instagram is still restricted to the smaller version of Llama 3.1 that uses 70 billion parameters.
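
For developers, the open weights mean the smaller checkpoints can be run locally with standard tooling. Here’s a minimal sketch using the Hugging Face transformers text-generation pipeline; the model ID is our assumption (check the model card), and the gated download requires accepting Meta’s license and logging in with a Hugging Face token first.

```python
# Minimal local-inference sketch for a small Llama 3.1 variant via the
# Hugging Face transformers text-generation pipeline. The model ID below is
# an assumption; gated repos require accepting Meta's license and running
# `huggingface-cli login` before the weights will download.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",  # assumed repo name
    device_map="auto",   # needs the accelerate package; spreads weights across GPU/CPU
    torch_dtype="auto",  # use the checkpoint's native precision (bfloat16)
)

# For best results with an instruct model, you'd format the prompt with the
# tokenizer's chat template; a plain string still runs for a quick test.
prompt = "Explain, step by step, how to solve 2x + 3 = 11."
output = generator(prompt, max_new_tokens=200, do_sample=False)
print(output[0]["generated_text"])
```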

Unlike OpenAI, Google, Microsoft and Anthropic, which keep their AI models proprietary, Meta releases its AI models as open source, which means that anyone can modify and use them for free without sharing personal data with Meta. In a letter published on Tuesday, Meta CEO Mark Zuckerberg argued that an open source approach to AI development will ensure wider access to the technology’s benefits, prevent the concentration of power among a few big companies, and enable safer AI deployment across society. By open sourcing the company’s largest language model to date, Meta aims to make Llama the “industry standard” for anyone to develop AI-powered apps and services with, Zuckerberg wrote.

Open sourcing AI models and adding them to its existing products already used by billions of people could allow Meta to compete more effectively with OpenAI, whose ChatGPT chatbot and DALL-E image generator ignited an AI explosion when they launched in 2022. It could also boost engagement: Meta announced today that users will soon be able to add AI-generated images directly into feeds, stories, comments and messages across Facebook, Messenger, WhatsApp and Instagram.

In his letter, Zuckerberg also criticized Apple and its closed ecosystem, arguing that the iPhone maker’s restrictive and arbitrary policies had constrained what Meta could build on its platforms. “[It’s] clear that Meta and many other companies would be freed up to build much better services for people if we could build the best versions of our products and competitors were not able to constrain what we could build,” he wrote.

This article originally appeared on Engadget at https://www.engadget.com/llama-31-is-metas-latest-salvo-in-the-battle-for-ai-dominance-150042924.html?src=rss