Is this the return of Google Glass? Magic Leap and Google team up could be bad news for Meta and Apple

We already knew Google was making at least a tentative return to the world of extended reality (XR, a catchall for VR, AR, and MR) with the announcement that it’s helping to make the Samsung XR headset. But it could also be looking for another try at one of its most high-profile flops: Google Glass.

This speculation follows an announcement from Magic Leap that it is partnering with Google to “bring Magic Leap Augmented Reality (AR) expertise and optics leadership together with Google’s technology platforms and collaborate on Extended Reality (XR) solutions and experiences.”

This is hardly a Google Glass confirmation, but it follows a few rumors that Google wants to have another crack at AR specs – including what might have been an accidental leak from its own Google I/O presentation. It also comes after Meta teased its AR glasses project, and with Apple testing the waters with the Vision Pro it would seem the entire industry is chasing an AR trend.


Even though this partnership is now official, we shouldn’t get our hopes up that Google Glass 2 is coming soon. LG and Meta announced plans to partner on XR technology, only for rumors to emerge weeks later that they had already parted ways – rumors LG refused to dismiss.

Much like Meta and LG reportedly butted heads in several ways, Google and Magic Leap could also disagree on how best to create an AR device which could lead to their relationship breaking down. 

What could a ‘Google Glass 2’ look like? 

Assuming this partnership does bear fruit, what do we expect to see from Google Glass 2 – or whatever Google wants to call it?

Well design-wise we imagine it’ll look a lot more like a typical pair of specs. While Google Glass’ space-age design might have charmed some, it’s not at all subtle. The obvious camera freaked people out, and it painted a target on wearers as they were clearly wearing expensive tech that would-be thieves could rip from them. And when the battery dies, they’re useless.

Modern smart and AR glasses have some signs they’re more than meets the eye – like thicker arms, and a light that makes it clear when the camera is in use – but in general you wouldn’t know the specs were anything but regular shades unless you’re well-versed in tech. With prescription or shaded lenses, they’re also something you can wear all the time even when they run out of charge.


The Ray-Ban Meta Smart Glasses (Image credit: Meta)

As for features, AR is the obvious inclusion. To us, this means a HUD overlay showing things like directions pointing you toward your destination, and apps that let you interact with virtual objects as if they’re in the real world.

But the other big feature will most likely be AI. We’ve already seen how the Ray-Ban Meta smart glasses can leverage their camera and AI to help you identify objects, find recipes, translate signs, and generally answer questions you might have simply by talking to your specs. Google also has a generative AI – Gemini – and while its recent attempts at AI search haven’t been the best, we’d be shocked if this tech wasn’t incorporated into Google Glass 2.

We’ll have to wait and see what Google’s next AR device has in store for us if and when it launches. You can be sure we’ll be ready to give you the lowdown as soon as we have it.


TechRadar – All the latest technology news


Adobe’s new photo editor looks even more powerful than Google’s Magic Editor

Adobe MAX 2023 is less than a week away, and to promote the event, the company recently published a video teasing its new “object-aware editing engine” called Project Stardust.

According to the trailer, the feature can identify individual objects in a photograph and instantly separate them into their own layers. Those objects can then be moved around on-screen or deleted. Selection can be done either manually or automatically via the Remove Distractions tool. The software appears to understand the difference between the main subjects of an image and the background people you want to get rid of.

What’s interesting is that moving or deleting something doesn’t leave behind a hole: the empty space is most likely filled in by a generative AI model. Plus, you can clean up any leftover evidence of a deleted item. In its sample image, Adobe erases a suitcase held by a female model and then edits her hand so that she’s holding a bouquet of flowers instead.

Project Stardust editing (Image credit: Adobe)

Project Stardust generative AI (Image credit: Adobe)

The same tech can also be used to change articles of clothing in pictures. A yellow down jacket can be turned into a black leather jacket or a pair of khakis into black jeans. To do this, users will have to highlight the piece of clothing and then enter what they want to see into a text prompt. 

Stardust replacement tool (Image credit: Adobe)

AI editor

Functionally, Project Stardust operates similarly to Google’s Magic Editor, a generative AI tool present on the Pixel 8 series. That tool lets users highlight objects in a photograph and reposition them however they please. It, too, can fill gaps in images by creating new pixels. However, Stardust feels much more capable. The Pixel 8 Pro’s Magic Eraser can fill in gaps, but neither it nor Magic Editor can generate content. Additionally, Google’s version requires manual input whereas Adobe’s software doesn’t need it.

Seeing these two side by side, we can’t help but wonder if Stardust is actually powered by Google’s AI tech. Very recently, the two companies announced they were entering a partnership, which includes a free three-month trial of Photoshop on the web for people who buy a Chromebook Plus device. Perhaps this partnership runs a lot deeper than free Photoshop, considering how similar Stardust is to Magic Editor.

Impending reveal

We should mention that Stardust isn't perfect. If you look at the trailer, you'll notice some errors, like random holes in the leather jacket and strange warping around the flower model's hands. But maybe what we're seeing is Stardust at an early stage of development.

There is still a lot we don’t know, like whether it’s a standalone app or will be housed in, say, Photoshop, and whether Stardust is releasing in beta first or arriving as a final version. All will presumably be answered on October 10 when Adobe MAX 2023 kicks off. What’s more, the company will be showing other “AI features” coming to “Firefly, Creative Cloud, Express, and more.”

Be sure to check out TechRadar’s list of the best Photoshop courses online for 2023 if you’re thinking of learning the software, but don’t know where to start. 
