The Super Nintendo World theme park in Orlando is nearly ready for visitors. Universal Orlando Resort just announced that the Mario-friendly attraction will open its doors on May 22, 2025. That gives you over six months to find the perfect Goomba costume to wear on opening day.
This is the third Super Nintendo World in existence, with the Orlando location joining pre-existing parks in Los Angeles and Japan. If the layout looks anything like the other two parks, you should expect a large interactive area to explore, special themed rides and, of course, all kinds of Nintendo-adjacent dining and shopping. The original Japanese park just got a nifty Donkey Kong Country area, but it remains to be seen whether that’ll make the jump to the States.
This is part of a larger expansion of Universal Orlando Resort called Universal Epic Universe, which includes five areas to explore. There’s the aforementioned Super Nintendo World, but the expansion will also host the pre-existing Harry Potter attraction.
The expansion will also be home to areas based on the How to Train Your Dragon and Dark Universe franchises. That last one is pretty odd to me, given that Dark Universe peaked with a few middling horror films in the 2010s. Most of the planned films in that shared cinematic universe were scrapped after 2017’s The Mummy crashed and burned.
Finally, there’s Celestial Park. This looks to be a standard amusement park with a slight sci-fi bent. There are space-themed roller coasters and the like.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/super-nintendo-world-orlando-opens-next-may-164506895.html?src=rss
There’s a certain level of fandom you hit when you research a band’s tour set list before they come to your city. And some of us like to relive great concerts with some quick research on setlist.fm. The next logical step, once we’re armed with this information, is to create a playlist on our preferred streaming service for quick access. Thanks to third-party options like Setify, the process is easy for Apple Music and Spotify users, but you still have to take the time to do it.
Apple Music has now given artists the ability to turn set lists into playlists thanks to info from tour info site Bandsintown. Once an artist has connected the two services, they can select the type of show in Apple Music for Artists (concert, tour or residency) and link it to upcoming dates on Bandsintown. From there, artists can set a publish date and use search to build out the playlist. These collections of songs can include original tunes the artist covers or collaborations with other acts. Apple Music allows unlimited set list playlists for past or future shows, but the service recommends that artists select a track listing that most accurately reflects the whole tour if they’re making one for an entire run of dates.
Set list playlists aren’t entirely new on Apple Music. The service has been curating playlists for popular tours for a while now, like Zach Bryan’s 2024 Quittin’ Time Tour. What’s more, Apple Music is touting this new tool as a promotional feature for artists, so there are a number of ways to share the playlists once they’re live. However, it will also be a boon for fans who want more info on the songs they can expect to hear, can’t make it to a stop on a tour or want to relive the experience of seeing the band in person.
Of course, if one of your favorites doesn’t hop on this bandwagon, you still have options for set list playlists. With Setify, you can link either Apple Music or Spotify and pull in data from setlist.fm in order to make your collections. It’s not perfect, but it works well most of the time, and you can always adjust things in the streaming service apps if you need to further curate a playlist. I recently missed one of my all-time favorites at Furnace Fest, but thanks to this combo, I can at least get a small piece of Blindside playing through About a Burning Fire.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/music/apple-music-helps-artists-turn-concert-set-lists-into-playlists-133916684.html?src=rss
2K and developer 31st Union just unveiled Project: Ethos, a free-to-play third-person hero shooter. It’s entering a crowded and fraught marketplace, but the publisher says this is an “exciting evolution” of the genre.
That evolution seems to take the form of some light roguelike mechanics. The playable characters evolve throughout each match, via semi-randomized upgrades unique to each hero. The publisher gives an example of evolving a sniper into a “close-range skirmisher” or a “support role into a powerful lone wolf.”
The “abilities, stakes and challenges” change from match to match and players can eventually unlock powerful Augments to further enhance runs. It remains to be seen if these mechanics can set it apart from the pack, but you can find that out for yourself. There’s a community playtest going on right now.
Players can test out the game’s signature Trials mode, which is an “ongoing, persistent fight,” or check out the Gauntlet. This is your standard head-to-head tournament mode, with teams and brackets.
This community playtest runs until October 20 in the US, Canada, Mexico and much of Europe. There is a fairly annoying hoop to jump through to access the early build: you have to complete a Twitch Drop by watching 30 minutes of content from one of 2K’s partner creators. There’s no information yet regarding an actual release date for people who don’t want to sit through a 30-minute stream.
This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/2k-games-wades-into-risky-waters-and-announces-a-free-to-play-hero-shooter-160059725.html?src=rss
Netflix has released the first trailer for The Electric State, a post-apocalyptic road movie from Marvel (and Community) mainstays The Russo Brothers. The adaptation of Simon Stålenhag’s 2018 graphic novel is set in a retro-futuristic version of the ’90s after a robot uprising. It tells the story of Michelle, an orphaned teenager (Millie Bobby Brown) who ventures across the west of the US to look for her younger brother with a smuggler (a mustachioed Chris Pratt) and a pair of robots.
The movie’s look draws heavily from Stålenhag’s gorgeous artwork, right down to the oversized VR helmets. The robots, in particular the one accompanying Michelle, have a cartoon-inspired aesthetic that wouldn’t look out of place in Fallout. A large teddy bear robot can be seen as part of a parade of machines, while our heroes appear to face off against a massive one that looks a little like Sonic the Hedgehog.
Meanwhile, the whole “slowed down iteration of a popular song in a movie trailer” thing might have jumped the shark with the version of Oasis’ “Champagne Supernova” that plays over the top of this. It fits the ’90s setting, of course, but I couldn’t help but laugh as soon as I recognized it.
The movie has a hell of a cast. Alongside Brown and Pratt, it stars Ke Huy Quan, Jason Alexander, Woody Harrelson, Anthony Mackie, Brian Cox, Jenny Slate, Giancarlo Esposito and Stanley Tucci. The Electric State hits Netflix on March 14.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/netflixs-the-electric-state-trailer-shows-off-cartoony-robots-and-oversized-vr-headsets-143628514.html?src=rss
I don’t know about you, but I will be spending most of the upcoming cold months sitting on my couch and watching television (with some books thrown in). The only thing I’m missing is a really good television set and, while I’ll be opting for a more budget-friendly pick, I’m tempted by the sale on LG’s C3 Series OLED TV. Right now, the 55-inch model is down to $1,197 from $1,800 — a 34 percent discount and an all-time low price. It isn’t the only version on sale, either, with the 42-inch option dropping to $997 from $1,197.
LG released the C3 series last year as a mid-range OLED option. It offers an a9 AI Processor Gen6, HDR tone mapping, AI upscaling and object-based picture sharpening. The TVs also come with Brightness Booster, which — though not to the level of some of its competitors — makes it easier to watch even in a relatively sunny room.
If you want the newest model, then check out LG’s C4 OLED series. The 2024 release is also on sale, with the 55-inch version down to $1,297 from $2,000 — the same 35 percent discount we recently saw on Prime Day. The C4 TVs offer nearly 1,000 nits of brightness and a maximum refresh rate of 144Hz. This model can also connect wirelessly with LG soundbars, forgoing the need for messy cables.
This article originally appeared on Engadget at https://www.engadget.com/deals/lgs-c3-oled-tvs-are-more-than-600-off-before-black-friday-135916937.html?src=rss
London’s W1 is somewhere to go if you’ve got too much money to spend. Within minutes of each other, you can visit the city’s priciest private doctor, buy a Steinway and pick up a pair of designer glasses that cost more than my mortgage. Wigmore Street is also where the ultra rich go to buy a kitchen that Thorstein Veblen would weep at the sight of. And it’s the new home of Moley Robotics, a company selling luxury kitchens and the robot arm that’ll kinda/sorta do all of the cooking for you, too.
Moley is the brainchild of Dr. Mark Oleynik and is one part kitchen showroom and one part robot lab. It’s a spartan space with three demo kitchens, a wide dining table and some display units showing you the different types of artisan marble you can have for your countertop. The point of interest is the working X-AiR robot just behind the front window that acts as a lure for would-be consumers. It’s got its own cooktop, shelves, oils and utensils and, with the proper help, can even whip up a meal.
Photo by Daniel Cooper / Engadget
Oleynik explained he wanted to create something to help people eat better food with less reliance on preservatives. His dislike of reheated and processed food sent him looking for alternatives, which led him to finding a way to automate fresh cooking. If you’re coming back late from work, the obvious temptations are microwave meals or delivery food. He believes people would much rather have healthy recipes where they just prep the raw ingredients and let the robot do the rest. The focus on health extends to the database of potential meals, many of which have been created by the SHA Wellness Clinic.
Moley has its own in-house chef, James Taylor, who adapts each recipe so it can be made by a one-armed robot. The company says it hopes to add two or three new recipes each month, and that if you have a family dish you’d love to see automated, you can send it in. Oleynik said the movements are mapped onto the robot after watching a human chef prepare the same meal, and that, once it has learned what to do, the robot is far less error-prone than its human counterpart.
The initial demonstration of Moley’s vision (above) used a two-armed chef running on overhead tracks, the version that earned the company so many plaudits. Unfortunately, Oleynik admitted the cost for such a robot would likely have reached north of £250,000 (around $330,000), which is probably too rich even for the sort of people who frequent Wigmore Street for their kitchen appliances. To reduce the price, the company stripped the project down from a mobile, two-armed version to a single arm. The robot that Moley is actually selling is bought off-the-shelf from Universal Robots, an industrial robotics company.
The robot
The one-armed version that’s currently up for pre-order is known as the X-AiR, and it’s what sits in the front of Moley’s showroom. If you want one for yourself, you’ll need to buy a new countertop, two custom shelving units, a cooktop, a control tablet and the robot itself. The prices are in the “if you have to ask, you can’t afford it” range, but the price to get in the door is £80,000 (around $105,000). So far, Moley hasn’t installed a single robot, though it expects the process to begin in the next three to six months. But there are people who have already laid down cash to get one of these in their homes, along with the kitchen that goes around it.
X-AiR has no built-in vision or sensing technology enabling it to perceive or engage with its environment. The system does come with a camera, embedded in one of the shelves, but I understand it’s more for technical support than to aid cooking. Instead, the robot arm moves around its space from memory, knowing where all of the ingredients, oils and tools should be. The saucepans are held in place over the hobs on the cooktop to keep the environment as controlled as possible.
I was present to witness Moley’s now standard demonstration using an SHA Clinic recipe for Asian Tofu Saute. Staff members had pre-prepared the ingredients and placed them in the pots necessary for the robot to grab. In order to start the process, the user needs to tell the system which ingredients are in which sections. There’s even a little diagram of the shelf layout, so you can tap “Bean Sprouts” and tap that the pot with them is seated in position A1, for instance. Once you’ve done that, you can set the machine going and theoretically leave it be until it’s time to eat.
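As a rough illustration of that setup step, the mapping is essentially a slot-to-ingredient table that has to be complete before the robot can start. This is a hypothetical sketch, not Moley’s actual software; the slot names and ingredient list are invented for the example.

```python
# Hypothetical sketch of the shelf-assignment step described above.
# Moley hasn't published its data format; slot names and ingredients are invented.

RECIPE_INGREDIENTS = {"bean sprouts", "tofu", "leek", "soy sauce"}

def ready_to_cook(slot_assignments: dict) -> bool:
    """Return True only when every ingredient the recipe needs has been
    assigned to a shelf slot (e.g. pot 'A1'), so the arm can find it."""
    assigned = set(slot_assignments.values())
    return RECIPE_INGREDIENTS.issubset(assigned)

# The user taps each ingredient and the slot its pot sits in:
slots = {"A1": "bean sprouts", "A2": "tofu", "B1": "leek", "B2": "soy sauce"}
```

Because the arm works purely from memory, a check like this is all that stands between it and grabbing from an empty or wrong pot, which is why the app makes you confirm every position before starting.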
The system calls out every instruction from the recipe, so it’s easy to follow along. In the video, you should be able to see why it’s an interesting thing to watch as the arm begins its ballet to cook your food. It almost theatrically turns on the cooktop before pouring a liberal quantity of oil into the pan to begin warming. After that, it adds the ingredients as and when commanded to, stirring the mixture in between. The stirring is more of a back-and-forth pushing of the mix, which is obviously less thorough than a human would be. After each stir, the robot scrapes its spatula on the side of the pan before returning it to its hook.
There are similar touches when the robot adds the next ingredient from its dedicated bin, double-tapping the pot on the side to ensure everything falls out. I noticed, however, that a few ingredients were still attached to the spatula and the pots when they were returned to the shelf. This is the big issue with a robot that lacks any sort of vision to perceive its local environment. During my demonstration, a few strips of leek clung to the spatula and fell off onto the cooktop itself while it was in motion. It was quickly wiped away, but I couldn’t help but wonder what would have happened if it had landed a millimeter closer to the burner and started burning.
What it can’t do
I’m much happier tending to a pan and actually cooking than I am peeling carrots and trying to dice onions. The obvious question, then, is why Moley sought to automate the ostensibly fun part of cooking rather than the bit people dislike. Oleynik said it might be possible in a far-flung future, but there are just too many variables to make a carrot-peeling robot work. Not to mention, he added, the safety risks inherent in giving a robot a bladed instrument to wield.
Moley’s first-generation robots are also limited by the volume of food they can cook in a single session. Depending on the meal, they can make between eight and ten portions, enough for a dinner party but nothing more extravagant. Nor can the robots make much of any adjustment if you don’t have exactly the right ingredients ready for use. You can remove any you don’t have, naturally, but there’s no ability to improvise beyond that, or to vary the program to take into account seasonal differences in ingredient quality.
The food
When I was told the robot was making me tofu, I had to work hard to keep myself standing upright. If they could have seen my soul, they’d have watched my shoulders droop so hard they fell through the floor, through the basement, and into the subway line below. Friends, I cannot stand tofu and grimace my way through it whenever my vegan chums insist we go to a meat-free restaurant. Even when they insist I’m eating “really good” tofu, it just tastes like stringy matter, devoid of any inherent flavor, as I try to mash it in my mouth. So bear that in mind when I say that the tofu the robot cooked me was actually delicious. It had a nice texture and meshed beautifully with the vegetables.
The future
Oleynik believes his robots will find a variety of niches to fill, first with money-rich, time-poor folks in London and beyond. The internet tells me that a private chef would set you back around £300 a day, so you’d burn through that £80,000 in less than a year. Naturally, it’s likely anyone who can drop £80,000 on a cooking robot can probably afford to buy their ingredients pre-prepared, so they could just dump them in the bins and set things going.
After that, Oleynik believes the technology could be used to prepare fresh meals for business and first-class airline passengers. Or in small kitchens where one employee supervises a production line of robots all making fresh dishes. His vision stretches to any situation where there may be a desire for fresh-cooked food, but the economics of a trained chef won’t allow it.
He cited the example of a hotel with 24/7 room service, where people are paid to wait around on the off-chance someone wants food. Or service stations in remote areas where there’s potential demand for meals but no need to hire a professional chef. He also pointed to care homes, where there’s a similar conflict between the desire to produce good food and limited budgets.
Of course, it’s not clear, given there would need to be a human preparing the raw ingredients and dishing up, how much labor is being saved. And anyone who is involved with food would likely need to be trained and paid accordingly, which may eliminate any potential savings. But Oleynik is certain that a business can expect to see a return on its investment within its first year of service.
As for the price, Oleynik believes the technology will mature to the point that the cost falls quite far. He gestured to one of the demo kitchens in the showroom, which had a Miele-branded oven and fridge, saying each model cost £5,000 (around $6,500). He hopes he’ll be able to sell a cooking robot for £10,000 to the sort of people who don’t blink when spending £5,000 on an oven and another £5,000 on a fridge. But, if nothing else, it’s entirely in keeping with everything else you can buy on Wigmore Street.
This article originally appeared on Engadget at https://www.engadget.com/home/kitchen-tech/a-105000-robot-arm-nobody-needs-cooked-me-a-delicious-lunch-140050065.html?src=rss
Be My Eyes, the accessibility app for mobile devices that puts blind and low-vision people on a live video call with a sighted guide, will help Microsoft train its AI. Be My Eyes will provide anonymized video data to improve scene understanding in Microsoft’s accessibility-focused AI models.
The data sets Be My Eyes gives Microsoft will include “unique objects, lighting and framing that realistically represents the lived experience of the blind and low vision community.” The goal is to make Microsoft’s AI more inclusive for people with vision disabilities.
The companies say all personal info has been scrubbed from the metadata. The provided data won’t be used for advertising or any purpose other than training Microsoft’s AI models.
Although this is Be My Eyes’ first such data partnership, it’s worked with Microsoft before by incorporating its Be My AI tool into Microsoft’s Disability Answer Desk. As its name suggests, Be My AI is the company’s GPT-4-powered spin on an assistance product. In that case, it helps people with vision disabilities navigate Office, Windows and Xbox.
Be My Eyes also struck a deal with Hilton earlier this month. In that case, dedicated hotel staff help blind and low-vision lodgers do things like adjust their thermostats, make coffee and raise or lower their blinds. A previous 2023 partnership between the two companies helped train the Be My AI model.
This article originally appeared on Engadget at https://www.engadget.com/ai/microsoft-recruits-accessibility-app-to-make-its-ai-more-useful-to-blind-and-low-vision-users-130006439.html?src=rss
Just in time for the 2024 US elections, the call screening and fraud detection company Hiya has launched a free Chrome extension to spot deepfake voices. The aptly named Hiya Deepfake Voice Detector “listens” to voices played in video or audio streams and assigns an authenticity score, telling you whether it’s likely real or fake.
Hiya tells Engadget that third-party testers have validated the extension as over 99 percent accurate. The company says that even covers AI-generated voices the detection model hasn’t been trained on, and claims it can spot voices created by new synthesis models as soon as they launch.
We played around with the extension ahead of launch, and it seems to work well. I pulled up a YouTube video about the blues pioneer Howlin’ Wolf that I suspected used AI narration, and it assigned it a 1/100 authenticity score, declaring it likely a deepfake. Suspicions confirmed.
Hiya threw a well-earned jab at social media companies for making such a tool necessary. “It’s clear social media sites have a huge responsibility to alert users when the content they are consuming has a high chance of being an AI deepfake,” Hiya President Kush Parikh wrote in a press release. “The onus is currently on the individual to be vigilant to the risks and use tools like our Deepfake Voice Detector to check if they are concerned content is being altered. That’s a big ask, so we’re pleased to be able to support them with a solution that helps put some of the power back in their hands.”
The extension only needs to listen to a few seconds of a voice to spit out a result. It works on a credit system to prevent Hiya’s servers from getting slammed by excessive requests. You’ll get 20 credits daily, which may or may not cover the flood of manipulative AI content you’ll come across on social media in the coming weeks.
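The daily credit scheme described above is a simple form of rate limiting. Here’s a minimal sketch of how such an allowance could work, assuming a flat daily quota that resets at midnight; this is an illustration of the idea, not Hiya’s actual implementation.

```python
# Minimal sketch of a daily credit allowance, as described above.
# Illustrative only; Hiya's real system presumably tracks credits server-side.
from datetime import date

class DailyCredits:
    def __init__(self, limit: int = 20):
        self.limit = limit   # credits granted per day (20 in Hiya's case)
        self.used = 0
        self.day = date.today()

    def try_spend(self) -> bool:
        """Spend one credit if any remain today; the allowance refills daily."""
        today = date.today()
        if today != self.day:           # new day: reset the counter
            self.day, self.used = today, 0
        if self.used >= self.limit:
            return False                # out of credits until tomorrow
        self.used += 1
        return True
```

A scheme like this caps how many detection requests any one user can fire at the servers in a day, which is presumably why Hiya chose it over an unlimited free tier.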
This article originally appeared on Engadget at https://www.engadget.com/ai/a-new-chrome-extension-can-reliably-detect-ai-generated-voices-130059842.html?src=rss
Uber is reportedly exploring the idea of purchasing Expedia, one of the largest travel booking companies in the world, according to the Financial Times. Expedia, which is valued at $20 billion and which reported its highest-ever annual revenue in 2023, would be the company’s biggest acquisition if the deal does indeed go through. The FT says it’s very early days, however, and Uber hasn’t even made a formal offer for the travel company yet. It’s still studying the implications of acquiring Expedia and has, over the past few months, worked with advisers to figure out whether the deal is feasible and how it would be structured.
The company’s CEO, Dara Khosrowshahi, may have to sit out deal discussions, seeing as he was CEO of Expedia before he was hired by the ride-hailing service in 2017. He’s still on Expedia’s board of directors, as well. It doesn’t sound like Khosrowshahi was the one who suggested the potential purchase, though — in its report, the FT said the idea was “broached by a third party.”
Uber has had plans to become a wider travel booking platform for a while now. Khosrowshahi said he wanted Uber to be the “Amazon of transportation” from the time he joined the company. Since then, the ride-hailing service has added train, bus and flight bookings in some markets, and it has also made several large acquisitions. It purchased online food delivery service Postmates for $2.65 billion and alcohol delivery service Drizly for $1.1 billion before shutting the latter down three years later. The company also teamed up with Waymo and Cruise to offer autonomous rides in certain markets. As the FT notes, Uber became profitable for the first time in 2023 due to renewed demand for rides and food delivery and could be in a good position to acquire a company as big as Expedia.
This article originally appeared on Engadget at https://www.engadget.com/big-tech/uber-is-reportedly-exploring-an-expedia-takeover-120038754.html?src=rss