Nio’s soon-to-arrive ET7 is practically tailor-made to challenge Tesla’s Model S, and now the company appears to have a (partial) answer to the Model 3. Electrek says Nio has introduced the ET5, a more affordable “mid-size” electric sedan. It starts at RMB 328,000 (about $51,450), or well under the roughly $70,000 of the ET7, but offers similarly grandiose range figures. Nio claims the base 75kWh battery offers over 341 miles of range using China’s test cycle, while the highest-end 150kWh “Ultralong Range” pack is supposedly good for more than 620 miles. You’ll likely pay significantly more for the privilege and may not see that range in real life, but the numbers could still tempt you away from higher-end Model 3s if long-distance driving is crucial.
You can expect the usual heapings of technology. The ET5 will have built-in support for autonomous driving features as they’re approved, and drivers get a “digital cockpit” thanks to Nreal-developed augmented reality glasses that can project a virtual screen equivalent to 201 inches at a 20-foot viewing distance. Nio has teamed with Nolo to make VR glasses, too, although it’s safe to say you won’t wear those while you’re driving.
Deliveries are expected to start in September 2022. That’s a long way off, but Nio appears to be on track with its EV plans as it expects to deliver the ET7 on time (if only just) starting March 28th.
This launch also dovetails with Nio’s tentative steps outside of China. The brand only expanded to Norway in 2021, but it aims to begin sales in Denmark, Germany, the Netherlands and Sweden in 2022. You should see the badge in 25 countries and regions by 2025. While Nio still won’t count as an automotive heavyweight by that stage, it could easily put pressure on other EV makers within a few years.
Dragonlance is one of Dungeons & Dragons’ most iconic settings, brought to life by designers Margaret Weis and Tracy and Laura Hickman. In late 2020, it seemed as though the creators’ relationship with Wizards of the Coast had soured to the point of a lawsuit from the creators concerning breach of contract regarding a…
Current Toyota drivers might not be thrilled about having to subscribe just to remote-start their cars from their key fobs, but what about new buyers? There’s mixed news. The automaker told Roadshow in a statement that remote starting won’t be available on key fobs for new vehicles. You’ll have to use the brand’s mobile app, in other words. With that said, you might not mind the cost.
You may never have to pay for the feature at all. While 2018 to 2020 model year vehicles were already known to be limited to a three-year Connected Services trial, some 2020 model year and newer vehicles include a 10-year trial. There’s a real possibility you’ll have moved on to another car by the time the freebie expires.
This still won’t please anyone who prefers the simplicity of a fob, or owners who intend to keep their vehicles for a long time. You may have to pay extra just to keep the functionality your car had for a large part of its lifespan. We wouldn’t count on Toyota backtracking, mind you. Like many companies, Toyota is turning to services to provide a steadier revenue stream than it would get through sales alone. Remote starting isn’t likely to represent a windfall when it will only collect $80 per driver per year a decade from now, but it hints at where Toyota’s strategy is going.
Two years ago, Bleach fans lost their minds when it was revealed that the anime adaptation of Tite Kubo’s supernatural adventure manga would be making a comeback. The beloved anime concluded all the way back in 2012 with the “Lost Substitute Shinigami” arc, just a few years before the manga would close things out in…
In his new book, SuperSight: What Augmented Reality Means for Our Lives, Our Work, and the Way We Imagine the Future, author David Rose delves into the current state of the art of augmented reality, discussing how the technology is already transforming myriad industries — from food service to medicine to education to construction and architecture — and what it might accomplish in the near future. In the excerpt below, Rose takes a look at two companies leveraging computer vision and generative adversarial networks to reimagine existing properties as 21st century electrified smart homes.
We should all be using solar panels. Period. The average cost for a sustainable energy system has fallen about 70% in the last decade, from $5.86/watt to $1.50/watt, so it’s a financial no-brainer. For no money down, you can finance an installation and start saving around a hundred dollars a month from the very first month, and even more if you live in the sun-saturated South.
So why aren’t we? It’s complicated! Math, logistics, taxes, and aesthetics all play a role. Many homeowners fear panels will make their houses shiny and reflective like the Tin Man from The Wizard of Oz. The process of figuring out how many panels you need, and in what size, requires learning to “talk solar” in unfamiliar units like kilowatt-hours. And change always comes with risk, whether actual or just perceived.
The pro-climate mission of Boston-based company EnergySage is to get people to electrify their homes. That means solar panels on your roof, an electric car, a home battery system, automatic blinds, and a smart thermostat that precools or preheats as you drive home. And they’ve partnered with us at Continuum to get potential customers more comfortable with the idea by showing them what an electrified version of their home might look like. Using publicly available Google satellite imagery, we size solar panels, digitally overlay them on clients’ roofs, and then show them what their pad would look like from both the street and their neighbor’s fence. We then pair those images with data from Project Sunroof, a Google project that helps you work out the solar savings potential of your roof. Once you’ve seen the beautiful pictures of your electrified home and realized how much you’re going to save over the years—and you have the visual and financial data in hand—it’s a simple decision to go forward and make that change.
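That sizing step is ultimately simple arithmetic once you know the usable roof area, the household's monthly electricity use, and the local sun hours. As a rough illustration (not EnergySage's or Project Sunroof's actual method), a back-of-the-envelope version in Python might look like the sketch below; every constant (panel wattage, panel footprint, derate factor, electricity price) is an assumption.

```python
# A rough sketch of the panel-sizing and savings math described above.
# All constants are illustrative assumptions, not EnergySage or
# Project Sunroof figures.

def size_solar_system(usable_roof_sqft: float,
                      monthly_kwh_usage: float,
                      peak_sun_hours_per_day: float = 4.5,   # local climate
                      panel_watts: int = 400,                # per-panel rating
                      panel_sqft: float = 20.0,              # per-panel footprint
                      price_per_kwh: float = 0.16) -> dict:  # utility rate, USD
    """Estimate panel count, system size, and rough monthly bill savings."""
    panels_that_fit = int(usable_roof_sqft // panel_sqft)

    # Monthly energy per panel: kW * sun-hours * ~30 days, derated ~20%
    # for inverter losses, dirt, temperature, and wiring.
    kwh_per_panel_month = (panel_watts / 1000) * peak_sun_hours_per_day * 30 * 0.8

    # Don't propose more panels than the household's usage justifies.
    panels = min(panels_that_fit, round(monthly_kwh_usage / kwh_per_panel_month))
    monthly_kwh = panels * kwh_per_panel_month
    return {
        "panels": panels,
        "system_kw": panels * panel_watts / 1000,
        "monthly_kwh": round(monthly_kwh),
        "monthly_savings_usd": round(min(monthly_kwh, monthly_kwh_usage) * price_per_kwh, 2),
    }

# Example: a 600 sq ft usable roof and a 900 kWh/month household.
print(size_solar_system(usable_roof_sqft=600, monthly_kwh_usage=900))
```

On those assumed numbers the estimate lands in the same ballpark as the "hundred dollars a month" figure quoted earlier, which is really all the visualization needs to communicate.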
Other home improvement projects will benefit from a similar SuperSight-envisioned approach. Let’s consider landscaping: another complicated, potentially expensive project with its own disorienting language, risks, and desperate need for pre-project visualization.
I met landscape designer Julie Moir-Messervy at an MIT pitch competition and was immediately intrigued with her mission: to give homeowners the confidence and tools they need to change their barren yard into a collection of outdoor living spaces. Her company, HomeOutside, helps people see new possibilities for their backyards using AI and computer vision. Once they’ve visualized their yard in a compelling way, the company makes it easy for them to make that vision a reality by hiring the landscape installer, getting materials delivered, and even helping spread the payments out over time.
Landscaping isn’t just good for property values; greenscapes filter airborne pollutants that trigger asthma, help people recuperate faster from illness, reduce summer temperatures, and even lower crime. Proper native landscaping powers a dynamic system that helps out the bees and birds, who in turn pollinate trees and reseed plants. Shade trees to the southwest can reduce the need for air-conditioning, and hedges to the northeast cut down on winter winds—and heating bills. More trees mean more carbon capture—a ton over the lifetime of each tree—as they literally suck the bad stuff we produce out of the air while reducing runoff and erosion.
But “most people don’t do anything in their yards because they don’t know where to start,” Julie told me. “They don’t know which plants to select and how to arrange them, or don’t know how to install a landscape design and care for it over time.” I was so inspired to work on the problem that I accepted a position on her board and got to work.
HomeOutside is training a generative adversarial network (GAN) to automatically compose beautiful and sustainable landscape designs, based on the thousands of designs (think of these as recipes) the firm has developed for clients over the last twenty-plus years. The company uses Google Earth Engine and photogrammetry to start with a 3D view of any address (US only, currently). The GAN architecture then uses one network (the Generator) to make a new design, and another network (the Discriminator) to judge or score the work. These two networks continue their iterative game, generating then scoring, until the discriminator judges that the landscape has a good composition: shade trees, natural pollinators, grass for playing, hardscapes/decks and furniture for gathering places, plant diversity, and so forth.
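Rose doesn't publish HomeOutside's actual model, but the generate-then-score loop he describes is the standard GAN recipe. The sketch below is a minimal, hypothetical PyTorch version in which a "design" is reduced to a vector of yard-composition fractions (shade trees, pollinator beds, lawn, hardscape and so on) and random toy data stands in for the firm's expert designs.

```python
# Minimal, hypothetical sketch of the generate-then-score GAN loop described
# above. A "design" is reduced to a 6-number composition vector (fractions of
# the yard given to shade trees, pollinator beds, lawn, hardscape, and so on);
# random toy data stands in for the firm's real expert designs.
import torch
import torch.nn as nn

DESIGN_DIM, NOISE_DIM, BATCH = 6, 16, 64

generator = nn.Sequential(            # noise -> candidate design
    nn.Linear(NOISE_DIM, 64), nn.ReLU(),
    nn.Linear(64, DESIGN_DIM), nn.Softmax(dim=1),   # fractions sum to 1
)
discriminator = nn.Sequential(        # design -> "expert-composed?" score
    nn.Linear(DESIGN_DIM, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1),
)

# Toy stand-in for thousands of expert-made designs (each row sums to 1).
expert_designs = torch.rand(1000, DESIGN_DIM)
expert_designs /= expert_designs.sum(dim=1, keepdim=True)

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(2000):
    real = expert_designs[torch.randint(0, len(expert_designs), (BATCH,))]
    fake = generator(torch.randn(BATCH, NOISE_DIM))

    # Discriminator turn: score expert designs high, generated ones low.
    d_loss = (loss_fn(discriminator(real), torch.ones(BATCH, 1)) +
              loss_fn(discriminator(fake.detach()), torch.zeros(BATCH, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator turn: produce designs the discriminator scores as expert-like.
    g_loss = loss_fn(discriminator(fake), torch.ones(BATCH, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# Propose one composition for a new yard.
print(generator(torch.randn(1, NOISE_DIM)))
```

In the real system the discriminator's sense of "good composition" would come from the firm's labeled expert designs and criteria like shade coverage and plant diversity rather than from random vectors, but the adversarial loop is the same.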
Companies that sell plants, furniture, lighting, and hardscapes are obviously interested in this type of “imagination engine” technology, because it bridges the conceptual gap between the current state of someone’s garden and what could be—thus motivating many more people to make the dream real. It’s not just great for the homeowners and outdoor retailers, either—it’s great for the environment, too. But what the company’s environmentally focused investors find most captivating about this project is the opportunity to change the landscape of entire neighborhoods at scale. What if we could create a new national park across millions of backyards that stitch together places for birds and bees? Every acre of forest absorbs about 2.5 tons of carbon a year. What if we turned neighborhoods into significant carbon sequestration zones?
I helped Julie and her team develop HomeOutside’s grand plan to proactively redesign seventy million front yards, then work with Home Depot, Lowe’s, Wayfair, IKEA, and garden centers to email their customers a 3D redesign of their yard. Customers simply go outside their home, open their phone, and, through the app’s use of spatial world anchors, walk through an immersive animated landscape superimposed on their current yard. A time-lapse view from sunrise to sunset shows why the edible garden is placed where it is. The winter visualization explains the choice of new fir trees between their yard and the neighbor’s. Spring flowers bloom with a cacophony of color.
Will people be alarmed by the idea of an algorithm proactively redesigning their yard, with new shade trees and naturally pollinating shrubs? It’s not as if your front yard is private now, thanks to Google Street View. And if you are selling your home, you might decide against hiring the landscapers and just choose to post images of HomeOutside’s makeover version instead to maximize your curb appeal.
Once this visioning technology is commonplace, lots of different fields will start taking advantage of it. Home Depot, for example, recently invested in a startup called Hover, which, after digitizing your home in 3D, visualizes and prices new paint, siding, and roofing materials. SuperSight will soon show the actual paint crew up on their ladders, finishing the last few brush strokes, so you get that delightful experience of a job just finished. Volkswagen might put a new Passat in your driveway, complete with the kayaks and mountain bikes it knows you love on top. And the company trying to sell you home and car insurance? They’ll project a disaster scenario: solar panels fallen off, the shade tree hit by lightning, and your new Passat pummeled in a hail storm. Better buy the insurance before you repaint.
How will we interact with these types of immersive designs? With our SuperSight glasses on, will we point and place trees, or paint flowers from a palette of choices, like a 3D version of Photoshop? Will we select each plant from a vast menu of options for infinite control and customization, or will we just tell the system what we like so it learns our preferences, then proposes a single solution we’ll love? I believe in the happy medium: that we’ll largely prefer to see several “expertly composed” options and choose from among them, much as we do today when working with an architect, interior designer, or wedding planner.
Experts are usually so good at what they do that it’s often a mistake to over-specify particular details. For example, you shouldn’t tell an architect that you want a window exactly here, or an interior designer that you want this particular chair in a specific color in this corner. Instead, you express your opinions at a higher level of abstraction (“I want the room to feel more connected to the environment”) or through describing a required function (“We want a vegetable garden”), and let them do the detailed work.
The same expert-guided interaction model will dominate our relationships with SuperSight AIs. For landscaping, we might ask for a more formal French garden with rectilinear layouts and exotic colorful plants, or a curvaceous organic design that prioritizes privacy from our neighbors. We might indicate a preference for an open space for play, or for a filled-in scheme with more space for a productive garden. And as we express these higher-level interests, our 3D landscape design will dynamically recalculate to match our preferences. With SuperSight glasses on, we’ll be able to test our hunches faster by seeing reconfigurations immediately and in context, superimposed on our real home.
The jury is still out on whether HomeOutside will be able to use this technology to convince millions of homeowners to invest significantly in a sustainable landscape. The testing is promising, though; customers are delighted to see their yards reimagined and restaged. In the next five years, HomeOutside plans to use Google Earth and Street View imagery in a generative AI tool to automatically redesign tens of millions of landscapes, with sustainable plants, shade trees, natural pollinators, and bird-friendly berries. If it succeeds, a million homeowners will plant at least 3 million new shade trees, like oaks and beeches, that will each capture 48 pounds of carbon a year as they mature. That’s on the order of 14 billion pounds of carbon sequestered over those trees’ lifespans.
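For scale, that claim is straightforward to check. Assuming for illustration a roughly 100-year productive lifespan per tree (an assumed figure, not one given in the text), the arithmetic looks like this:

```python
# Back-of-the-envelope check of the tree-planting claim above.
trees = 3_000_000          # new shade trees (from the text)
lbs_per_year = 48          # carbon captured per mature tree per year (from the text)
lifespan_years = 100       # assumed productive lifespan, for illustration only

total_lbs = trees * lbs_per_year * lifespan_years
print(f"{total_lbs:,} lb  (~{total_lbs / 2000:,.0f} US tons)")
# -> 14,400,000,000 lb (~7,200,000 US tons): on the order of 14 billion pounds
```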
As one of the HomeOutside advisors summed up, “You are building the equivalent of a new national park—the National Park of us! Visualization tools like HomeOutside can persuade homeowners to reshape the American landscape.”
That’s the ultimate potential power of SuperSight: to help people envision and imagine a future that benefits themselves and the planet.
Adidas’ first NFT effort appears to have been a success. The Block has learned the collaboration with Bored Ape Yacht Club, Gmoney and Punks has raked in more than $23 million in Ethereum between a $15.5 million “Early Access” phase and $7.5 million in a general sale. All told, the drop minted nearly 30,000 NFTs despite a hiccup that led Adidas to pause early transactions.
The haul may not sound large for such a well-known brand, which posted the equivalent of $538.4 million in profit during its latest quarter, but $23 million from a single limited-run digital release is still significant. More sales like this could pad Adidas’ bottom line while giving NFTs the hype levels previously reserved for sneakers.
There’s a good chance you’ll see more NFT releases as a result, not to mention greater involvement in metaverses. Not that Adidas necessarily has much choice. Nike bought RTFKT precisely to deepen its involvement with NFTs and metaverse collectibles like shoes, and Adidas risks ceding ground if it doesn’t counter one of its most obvious rivals.