These new AI smart glasses are like getting a second pair of ChatGPT-powered eyes

The Ray-Ban Meta glasses have a new rival for the title of best smart glasses, with the new Solos AirGo Vision letting you quiz ChatGPT about the objects and people you're looking at.

Unlike previous Solos glasses, the AirGo Vision boasts a built-in camera and support for OpenAI's latest GPT-4o model. Together, these let the glasses identify what you're looking at and respond to voice prompts. For example, you could simply ask, “what am I looking at?” or give the AirGo Vision a more specific request like “give me directions to the Eiffel Tower.”
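
To make that flow concrete, here's a minimal sketch of the kind of multimodal request such glasses could send, assuming OpenAI's official Python client; capture_frame() is a hypothetical stand-in for the glasses' camera, not a real Solos API.

```python
# Minimal sketch: pair a voice prompt with a camera frame and send both
# to GPT-4o. capture_frame() is hypothetical; the OpenAI client is real.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_about_view(prompt: str, jpeg_bytes: bytes) -> str:
    b64 = base64.b64encode(jpeg_bytes).decode("utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

# e.g. answer = ask_about_view("What am I looking at?", capture_frame())
```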

Another neat feature of the new Solos glasses is their modular frame design, which means you can change some parts – for example, the camera or lenses – to help them suit different situations. These additional frames start from $89 (around £70 / AU$135).

If talking to a pair of camera-equipped smart glasses feels a little too creepy, you can also use the camera simply to take holiday snaps. The AirGo Vision also features built-in speakers, which it uses to answer your questions or play music.

While there's no official price or release date for the full version of the AirGo Vision, Solos will release a version without the camera for $249 (around £200 / AU$375) in July. That means we can expect a camera-equipped pair to cost at least as much as the Ray-Ban Meta glasses, which will set you back $299 / £299 / AU$449.

How good are AI-powered smart glasses?

While we haven't yet tried the Solos AirGo Vision, it's fair to say that smart glasses with AI assistants are a work in progress.

TechRadar's Senior Staff Writer Hamish Hector recently tried Meta AI's 'Look and Ask' feature on his Ray-Ban smart glasses and found the experience to be mixed. He said that “the AI is – when it works – fairly handy,” but that “it wasn’t 100% perfect, struggling at times due to its camera limitations and an overload of information.”

The smart glasses failed in some tests, like identifying trees, but their ability to quickly summarize a confusing, information-packed sign about the area’s parking restrictions showed how useful they can be in some situations.

As always with AI-powered responses, you'll want to corroborate any answers to filter out errors and so-called hallucinations. But there's undoubtedly potential in the concept, particularly for travelers or anyone who is visually impaired.

The Solos AirGo Vision's support for OpenAI's latest GPT-4o model should make for an interesting comparison with the Ray-Ban Meta smart glasses when the camera-equipped version lands. Until then, you can check out our guide to the best smart glasses you can buy right now.


Xreal’s cheap smartphone-like gadget solves the biggest issues with AR smart glasses

Xreal has unveiled its latest gadget at the Augmented World Expo (AWE) 2024 XR industry showcase, but it’s not a pair of AR smart glasses – it’s a smartphone-like device called the Xreal Beam Pro.

The Beam Pro is a successor to the original Beam adapter, and it solves many of the issues I’ve had with this generation of AR specs – and it’s surprisingly affordable.

The upgraded Beam Pro is a whole new Xreal glasses attachment that’s compatible with Xreal Air and Xreal Air 2 models – including the Pro and Ultra. Simply connect it to your specs with a USB-C to USB-C cable and they’ll turn into bona fide spatial computers running Android 14 and NebulaOS (Xreal’s home-grown operating system).

You can control what you see on the Xreal glasses using the Beam Pro’s 6.5-inch 2K LCD touchscreen. You can also use the screen just as you would a smartphone’s to manage Play Store apps and tweak your settings, and thanks to the dual 50MP cameras on its rear you can capture spatial images and 3D video at 1080p and 60fps.

Xreal Air glasses connected to the Xreal Beam Pro being used to watch TV (Image credit: Xreal)

Best of all, it comes in at just $199 / £189 for the 6GB RAM / 128GB storage model, while the 8GB RAM / 256GB storage model will set you back $249 / £239. Preorders are live right now at XREAL.com for the United States, United Kingdom, China, Japan, Germany, France, Italy and the Netherlands; availability and pricing for Australia are TBC.

If you’re picking up the Beam Pro and a pair of Xreal glasses you can save on bundles, and those of you with a 1st-gen Beam can save $50 / £40 on a Beam Pro if you preorder one by July 10, 2024.

The AR accessory we’ve needed 

I’ve not yet had the chance to try out the Xreal Beam Pro, but it sounds like the AR add-on we’ve been needing for a while, solving a bunch of issues I’ve run into when testing this generation of the tech.

First, finding compatible gadgets can be confusing. There are all sorts of compatibility quirks, thanks to hardware and software nuances that are hard to parse if you aren’t techy. The original Beam was billed as a potential solution, but it still suffered from compatibility issues because it couldn’t be used without a smartphone. Pick up the Beam Pro and a pair of Xreal Airs, though, and you know you’ll have everything you need.

Second, it solves the battery annoyances I’ve had, thanks to its two USB-C ports. The Xreal glasses (and other wired AR specs) can burn through your phone’s charge, and there’s normally no way to charge your phone and use the glasses at the same time. That’s not an issue with the Beam Pro, as you can use it and connect it to power simultaneously.

Girl wearing Meta Quest 3 headset interacting with a jungle playset

The Meta Quest 3 has some competition (Image credit: Meta)

Finally, it seems like good value for money. Without any bundle discounts, an Xreal glasses and Beam Pro setup costs between $498 and $648 / £528 and £638, which is only a little more expensive than a Meta Quest 3.

While AR isn't prohibitively expensive, it can feel like you're getting relatively little bang for your buck compared to XR devices such as VR headsets because of the aforementioned compatibility and complexity issues.

The Beam Pro gives you a simple plug-and-play option that’s a pocket TV and 3D camera in one, and it doesn’t require any other tech – just a subscription or two to the best streaming services.

I’ll obviously need to try the Beam Pro out to verify Xreal’s bold promises, but if you’ve been waiting for AR tech to start feeling worthwhile, this is your notice to start paying attention to the space – and maybe even finally dive in.


Ray-Ban Meta smart glasses get new Amazon Music and mental health update

In a sign that they could follow the roughly monthly Meta Quest 3 software update schedule we’re used to, the Ray-Ban Meta smart glasses have received several new tools – including Calm content and improved Instagram integration – just 29 days after their massive April 23 upgrade.

While none of these improvements appears to widen access to the still-in-beta, still-US-and-Canada-exclusive Meta AI, they do include some helpful hands-free features that users can enjoy right now.

The first is a set of new Meta AI prompts that allow you to enjoy guided meditation, mindfulness exercises, and self-care content through your smart specs by simply saying “Hey Meta, play the Daily Calm.” New Calm users will also be able to access a three-month free subscription through prompts in the Meta View app.

Beyond this, your smart specs can now directly stream tunes from Amazon Music using voice-only controls (joining Apple Music, which added hands-free support in April) – you just need to connect your account via the Meta View app. There are new Instagram Story sharing options, too.

Simply say something like, “Hey Meta, post a photo to Instagram” to snap a pic that’ll be shared automatically to your connected Instagram account.

As the Meta blog post sharing details of the update explains, these new Ray-Ban Meta smart glasses features are rolling out gradually. So if you don’t see the update in the Meta View app, don’t panic – you should get the update soon enough.

Three new styles

The Skyler Ray-Ban Meta smart glasses with pink lenses

The Skyler in Shiny Chalky Gray (above) are one of three new versions of the Ray-Ban Meta smart glasses (Image credit: Meta / Ray-Ban)

If you don’t like waiting for software changes, there are also some hardware updates – which are available now.

The internal specs are exactly the same as the original models’, but Meta and Ray-Ban have launched three new styles, which are available in the US, Canada, Australia, and “throughout Europe.” They are:

  • Skyler in Shiny Chalky Gray with Gradient Cinnamon Pink Lenses
  • Skyler in Shiny Black with Transitions® Cerulean Blue Lenses
  • Headliner Low Bridge Fit in Shiny Black with Polar G15 Lenses

Hopefully this monthly software schedule will continue – and if it does, those of us outside the US and Canada might not have to wait too much longer for Meta AI to hit our devices in a future update.


A huge Meta AI update finally arrives on Ray-Ban Meta smart glasses… for some

After months of waiting, the moment is here: Meta AI features have arrived on the Ray-Ban Meta smart glasses for everyone – well, everyone in the US and Canada, for now.

Unfortunately, that regional exclusivity is not the only caveat. Another big one is that while the Meta AI tools are no longer locked behind an exclusive beta, Meta notes in its blog post announcement that they are still beta features – so you’ll likely run into some reliability and accuracy problems.

But while the update isn’t quite as complete as we’d have liked, it’s still a major leap forward for Meta’s smart glasses – they finally deliver on the impressive AI promises Meta CEO Mark Zuckerberg made when they were revealed at Meta Connect 2023 in September last year.

What can Meta AI do?

A video call shot on the Ray-Ban Meta smart glasses (Image credit: Ray-Ban / Meta)

The main Meta AI feature you’ll want to take advantage of is ‘Look and Ask.’ To activate it, simply start a phrase with “Hey Meta, look and …” then ask the glasses a question about something you can see.

You could try “… tell me about this animal,” or “…tell me about this building,” or even “…tell me what I can make for dinner with these ingredients.”

The glasses will then use your command, alongside an image captured by the camera, to search for an answer – drawing on the data Meta AI has been trained on, plus information it gathers from Google and Bing.
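
Meta hasn’t published how this works under the hood, but the flow described above maps onto a familiar voice-plus-vision pattern. Here’s a rough, purely illustrative sketch – every function in it is a hypothetical stub standing in for components Meta hasn’t documented, not its actual API:

```python
# Illustrative only: a generic voice-plus-vision loop in the shape of
# 'Look and Ask'. Every function here is a hypothetical stub, not Meta's API.

def speech_to_text(audio: bytes) -> str:
    return "look and tell me about this building"  # stub transcription

def capture_camera_frame() -> bytes:
    return b"<jpeg bytes>"  # stub photo from the glasses' camera

def multimodal_answer(command: str, frame: bytes) -> str:
    return "That's Coit Tower in San Francisco."  # stub VLM + web-search answer

def text_answer(command: str) -> str:
    return "Here's what I found."  # stub text-only answer

def speak(text: str) -> None:
    print(f"[speaker] {text}")  # stub text-to-speech over the built-in speakers

def handle_command(audio: bytes) -> None:
    command = speech_to_text(audio)
    if command.startswith("look and"):
        # 'Look and Ask': pair the spoken command with a camera capture
        speak(multimodal_answer(command, capture_camera_frame()))
    else:
        speak(text_answer(command))

handle_command(b"<mic audio>")  # -> [speaker] That's Coit Tower in San Francisco.
```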

As with all AI responses, we’d recommend taking what the Meta AI says with a pinch of salt. AI assistants are prone to hallucinating – which in the AI context you can read simply as “getting stuff completely wrong” – and this Meta model is no different. It will get stuff right too, but don’t take its advice as gospel.

Ray-Ban Meta Smart Glasses covered in water droplets (Image credit: Meta)

Beyond Look and Ask, you can use the Meta AI assistant much like Google Assistant or Siri on your phone. This means starting video calls (above), sending texts and images, or playing music, all with just voice commands.

Just be prepared to get some attention as you walk around talking to your smart glasses – we got some odd looks when we were testing a different pair of specs the other day.


Meta AR glasses: everything we know about the AI-powered AR smart glasses

After a handful of rumors and speculation suggested Meta was working on a pair of AR glasses, the company unceremoniously confirmed that Meta AR glasses are on the way – doing so via a short section at the end of a blog post celebrating the 10th anniversary of Reality Labs (the division behind its AR/VR tech).

While not much is known about them, the glasses were described as a product merging Meta’s XR hardware with its developing Meta AI software to “deliver the best of both worlds” in a sleek wearable package.

We’ve collected all the leaks, rumors, and some of our informed speculation in this one place so you can get up to speed on everything you need to know about the teased Meta AR glasses. Let’s get into it.

Meta AR glasses: Price

We’ll keep this section brief, as right now it’s hard to predict how much a pair of Meta AR glasses might cost – we know so little about them, and no leakers have given a ballpark estimate either.

Current smart glasses like the Ray-Ban Meta Smart Glasses or the Xreal Air 2 AR smart glasses will set you back between $300 and $500 / £300 and £500 / AU$450 and AU$800; Meta’s teased specs, however, sound more advanced than what we have currently.

Lance Ulanoff showing off Google Glass

Meta’s glasses could cost as much as Google Glass (Image credit: Future)

As such, the Meta AR glasses might cost nearer $1,500 (around £1,200 / AU$2,300) – which is what the Google Glass smart glasses launched at.

A higher price seems more likely given the AR glasses’ novelty, and the fact that Meta would need to create small yet powerful hardware to cram into them – a combination that typically leads to higher prices.

We’ll have to wait and see what gets leaked and officially revealed in the future.

Meta AR glasses: Release date

Unlike price, several leaks have pointed to when we might get our hands – or I suppose eyeballs – on Meta’s AR glasses. Unfortunately, we might be waiting until 2027.

That’s according to a leaked Meta internal roadmap shared by The Verge back in March 2023. The document explained that a precursor pair of specs with a display would apparently arrive in 2025, with ‘proper’ AR smart glasses due in 2027.

Ray-Ban Meta Smart Glasses close up with the camera flashing (Image credit: Meta)

In February 2024, Business Insider cited unnamed sources who said a pair of true AR glasses could be shown off at this year’s Meta Connect conference. However, that doesn’t mean they’ll launch sooner than 2027. While Connect does highlight soon-to-release Meta tech, the company takes the opportunity to show off stuff coming further down the pipeline too. So its demo of Project Orion (as those who claim to be in the know call it) could be one of those ‘you’ll get this when it’s ready’ teasers.

Obviously, leaks should be taken with a pinch of salt. Meta could bring the release of its specs forward, or push it back, depending on a multitude of technological factors – we won’t know until Meta officially announces more details. But the fact that it has teased the specs suggests their release is at least a matter of when, not if.

Meta AR glasses: Specs and features

We haven't heard anything about the hardware you’ll find in Meta’s AR glasses, but we have a few ideas of what we’ll probably see from them based on Meta’s existing tech and partnerships.

Meta and LG recently confirmed that they’ll be partnering to bring OLED panels to Meta’s headsets, and we expect they’ll bring OLED screens to the AR glasses too. OLED displays already appear in other AR smart glasses, so it would make sense for Meta to follow suit.

Additionally, we anticipate that Meta’s AR glasses will use a Qualcomm Snapdragon chipset, just like Meta’s Ray-Ban smart glasses. Currently, that’s the AR1 Gen 1, though considering Meta’s AR specs aren’t due until 2027, it seems more likely they’d be powered by a next-gen chipset – either an AR2 Gen 1 or an AR1 Gen 2.

A Meta Quest 3 player sucking up Stay Puft Marshmallow Men from Ghostbusters in mixed reality using virtual tech extending from their controllers

The AR glasses could let you bust ghosts wherever you go (Image credit: Meta)

As for features, Meta’s already teased the two standouts: AR and AI abilities.

What this means in practical terms is yet to be seen, but imagine virtual activities like setting up an AR Beat Saber jam wherever you go, an interactive HUD when you’re navigating from one place to another, or interactive elements that you and other users can see and manipulate together – either for work or play.

AI-wise, Meta is giving us a sneak peek of what’s coming via its current smart glasses. That is, you can ask Meta AI a variety of questions and for advice, just as you can with other generative AI – but in a more conversational way, because you use your voice.

It also has a unique ability, Look and Ask, which is like a combination of ChatGPT and Google Lens. The specs snap a picture of what’s in front of you to inform your question, letting you ask them to translate a sign you can see, suggest a recipe using the ingredients in your fridge, or name a plant so you can find out how best to care for it.

The AI features are currently in beta but are set to launch properly soon. And while they seem a little imperfect right now, we’ll likely only see them get better in the coming years – meaning we could see something very impressive by 2027 when the AR specs are expected to arrive.

Meta AR glasses: What we want to see

A slick Ray-Ban-like design 

The design of the Ray-Ban Meta Smart Glasses is great (Image credit: Meta)

While Meta’s smart specs aren't amazing in every way – more on that down below – they are practically perfect in the design department. The classic Ray-Ban shape is sleek, they’re lightweight, super comfy to wear all day, and the charging case is not only practical, it's gorgeous.

While it’s likely Ray-Ban and Meta will continue their partnership to develop future smart glasses – and by extension the teased AR glasses – there’s no guarantee. But if Meta’s reading this, we really hope that you keep working with Ray-Ban so that your future glasses have the same high-quality look and feel that we’ve come to adore.

If the partnership does end, we'd like Meta to at least take cues from what Ray-Ban has taught it to keep the design game on point.

Swappable lenses 

Orange Ray-Ban Meta Smart Glasses in front of a wall of colorful lenses including green, blue, yellow and pink

We want to change our lenses, Meta! (Image credit: Meta)

While we’ll happily rave about Meta’s smart glasses design, we’ll admit there’s one flaw we hope future models (like the AR glasses) improve on: they need easily swappable lenses.

While a handsome pair of shades will be faultless for your summer vacations, they won’t serve you well in dark and dreary winters. If we could easily change our Meta glasses from sunglasses to clear lenses as needed then we’d wear them a lot more frequently – as it stands, they’re left gathering dust most months because it just isn’t the right weather.

As the glasses get smarter, more useful, and pricier (as we expect will be the case with the AR glasses) they need to be a gadget we can wear all year round, not just when the sun's out.

Speakers you can (quietly) rave to

JBL Soundgear Sense

These open-ear headphones are amazing – take notes, Meta (Image credit: Future)

Hardware-wise, the main upgrade we want to see in Meta’s AR glasses is better speakers. Currently, the speakers housed in each arm of the Ray-Ban Meta Smart Glasses are pretty darn disappointing – they leak a fair amount of noise, the bass is practically nonexistent, and the overall sonic performance is put to shame by even basic over-ear headphones.

We know it can be a struggle to get the balance right with open-ear designs. But we’ve been spoiled by open-ear options like the JBL SoundGear Sense, which have an astounding ability to deliver great sound while letting you hear the real world clearly (we often forget we’re wearing them) – so we’ve come to expect a lot, and are disappointed when gadgets don’t deliver.

The camera could also get some improvements, but we expect the AR glasses won’t be as content-creation-focused as Meta’s existing smart glasses – so we’re less concerned about this aspect than about their audio capabilities.


Meta’s Smart Glasses will get a sci-fi upgrade soon, but they’re still not smart enough

There's a certain allure to smart glasses that bulky mixed-reality headsets lack. Meta's Ray-Ban Smart Glasses (formerly Stories), for instance, are a perfect illustration of how you can build smarts into a wearable without making the wearer look ridiculous. The question is, can you still end up being ridiculous while wearing them?

Ray-Ban Meta Smart Glasses' big upcoming Meta AI update will let you talk to your stylish frames, querying them about the food you're consuming, the buildings you're facing, and the animals you encounter. The update is set to transform the wearable from just another pair of voice-enabled glasses into an always-on-your-face assistant.

The update isn't public yet and will only apply to the Ray-Ban Meta Smart Glasses, not the Ray-Ban Stories predecessors, which lack Qualcomm's new AR1 Gen 1 chip. This week, however, Meta gave a couple of tech reporters at The New York Times early access to the Meta AI integration, and they came away somewhat impressed.

I must admit, I found the walkthrough more intriguing than I expected.

Even though they didn't tear the glasses apart, or get into the nitty gritty tech details I crave, the real-world experience depicts Meta AI as a fascinating and possibly useful work in progress.

Answers and questions

In the story, the authors use the Ray-Ban smart glasses to ask Meta AI to identify a variety of animals, objects, and landmarks, with varying success. In the confines of their homes, they spoke at full volume and asked Meta AI, “What am I looking at?” They also enabled transcription so we could see what they asked and the responses Meta AI provided.

It was, in their experience, quite good at identifying their dogs' breed. However, when they took the smart glasses to the zoo, Meta AI struggled to identify far-away animals. In fact, Meta AI got a lot wrong. To be fair, this is a beta, and I wouldn't expect the large language model (Llama 2) to get everything right. At least it's not hallucinating (“that's a unicorn!”), just getting things wrong.

The story features a lot of photos taken with the Ray-Ban Meta Smart Glasses, along with the queries and Meta AI's responses. Of course, that's not really how the interaction played out. As the authors note, they were speaking to Meta AI wherever they went and then heard the responses spoken back to them. This is all well and good when you're at home, but just weird when you're alone at a zoo, talking to yourself.

The creep factor

This, for me, remains the fundamental flaw in many of these wearables. Whether you wear Ray-Ban Meta Smart Glasses or Amazon Echo Frames, you'll still look as if you're talking to yourself. For a decent experience, you may need to engage in a lengthy “conversation” with Meta AI to get the information you need. Again, if you're doing this at home, letting Meta AI walk you through a detailed recipe, that's fine. Using Meta AI as a tour guide in the middle of, say, your local Whole Foods might label you as a bit of an oddball.

We do talk to our best phones and even our best smartwatches, but I think that when people see you holding your phone or smartwatch near your face, they understand what's going on.

The New York Times' authors noted how they found themselves whispering to their smart glasses, but they still got looks.

I don't know a way around this issue and wonder if this will be the primary reason people swear off what is arguably a very good-looking pair of glasses (or sunglasses) even if they could offer the passive smart technology we need.

So, I'm of two minds. I don't want to be seen as a weirdo talking to my glasses, but I can appreciate having intelligence there and ready to go; no need to pull my phone out, raise my wrist, or even tap a smart lapel pin. I just say, “Hey Meta” and the smart glasses wake up, ready to help.

Perhaps the tipping point here will be when Meta can integrate very subtle AR screens into the frames to add some much-needed visual guidance. Plus, access to visuals might cut down on the conversation, and I would appreciate that.


Want to skip to the good bit of a video? YouTube is testing a smart AI feature for that

I’ve been increasingly driven to distraction by YouTube’s ever-more-aggressive delivery of adverts before, during, and after videos, which makes it a challenge to even get to the bits of a video I want to see without some earnest voice encouraging me to trade stocks or go to Dubai. Until now I’ve been too cheap to subscribe to YouTube Premium – but that may soon change.

That’s because YouTube is apparently testing an AI-powered feature that analyzes patterns in viewer behavior to let you skip to the most popular parts of a video with just a double tap on a touchscreen.

“The way it works is, if a viewer is double tapping to skip ahead on an eligible segment, we’ll show a jump ahead button that will take them to the next point in the video that we think they’re aiming for,” YouTube creator-centric channel Creator Insider explained. “This feature will also be available to creators while watching their own videos.”

Currently, such a double-tap action skips a YouTube video forward by a few seconds, which I don’t find hugely useful. And while YouTube displays a form of wave pattern on the video timeline to show the most popular parts of the video, it’s not the easiest thing to use, and can feel rather unintuitive.
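
To illustrate the idea, here’s a toy sketch of how a ‘jump ahead’ might pick its target using that kind of per-second replay data – the numbers and threshold are invented, and this is our guess at the general pattern, not YouTube’s actual algorithm.

```python
# Toy sketch: given per-second replay counts (the data behind the
# 'most replayed' wave pattern), jump from the current position to the
# next local peak in viewer attention. Data and threshold are made up.

def next_popular_moment(replays: list[int], position: int, min_lift: float = 1.5) -> int:
    """Return the next second whose replay count is a local peak well above average."""
    avg = sum(replays) / len(replays)
    for t in range(position + 1, len(replays) - 1):
        is_peak = replays[t] >= replays[t - 1] and replays[t] >= replays[t + 1]
        if is_peak and replays[t] >= min_lift * avg:
            return t
    return len(replays) - 1  # nothing notable ahead: go to the end

# Replay counts for a 10-second clip, with a spike around second 6:
heatmap = [10, 12, 11, 9, 14, 30, 55, 40, 12, 8]
print(next_popular_moment(heatmap, position=2))  # -> 6
```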

So being able to easily tap to get to the most popular part of a video, at least according to an AI, could be a boon for impatient people like me. The only wrinkle is that this feature is being tested exclusively with YouTube Premium users, and is currently limited to the US.

But such features do tend to get a wider global rollout once they come out of the testing phase, meaning there’s scope for Brits like me to get access to some smart double-tap video skipping – that is, if I do finally decide to bite the bullet and pay for YouTube Premium.


Google Drive could add a smart new way to keep your files organized

Finding your way around your Google Drive files could be about to get a lot easier: there's evidence that you'll soon be able to categorize your files into different groups, like banking and work, to keep them better organized.

This is according to hidden code spotted in the Google Drive app by TheSpAndroid (via Android Police). Apps often lay the coding groundwork for future features before those features go live and are announced to users.

According to the app’s code, the categories you’ll be able to make use of are Auto, Banking, Expenses, Home, IDs, Insurance, Medical, Pets, School, Taxes, Travel, and Work. From this leak, it doesn’t seem as though custom labels will be allowed, but those 12 categories cover the business of modern life pretty well.

As Android Police points out, these categories are similar to the labeling system that companies can use in Google Workspace. However, this should be available to individual users too, across Android, iOS, and the web.

How it'll work

Google Drive category feature leak

How the upcoming feature might look (Image credit: TheSpAndroid)

Here’s how it’s going to work: from the Home tab in the Android app, you’ll be able to tap the three dots next to a file, then choose from the categories list. A file can be in multiple categories, potentially making the feature more useful than the current folders system.
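
That last point is the key difference between categories and folders: a folder keeps a file in exactly one place, while categories form a many-to-many mapping. Here’s a minimal sketch of the idea – the category names come from the leak, but the data model is our illustration, not Google’s implementation.

```python
# Minimal sketch of categories as a many-to-many mapping: one file can
# carry several labels at once, unlike a single folder location.
# Category names are from the leak; the model itself is illustrative.
from collections import defaultdict

categories: dict[str, set[str]] = defaultdict(set)

def tag(file_id: str, *labels: str) -> None:
    for label in labels:
        categories[label].add(file_id)

# One receipt can live under several categories at once:
tag("vet-invoice.pdf", "Pets", "Expenses", "Taxes")

print(categories["Taxes"])  # {'vet-invoice.pdf'}
print(categories["Pets"])   # {'vet-invoice.pdf'}
```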

We don't get any indication here about when the switch might be flipped to give users access to file categories: the report on TheSpAndroid says “it won't come very soon”, so presumably there's still work to do before it's ready for the public.

Given Google's recent and very committed push into artificial intelligence features, it's possible that some kind of AI processing might be involved in categorizing files for you as well (or at least in suggesting categories based on a file's name or contents). Suggested categories do appear in the screens produced by the hidden code.

We now know that Google I/O 2024 is getting underway on May 14 this year, so in between all the Android 15 and Pixel 8a news we might get an announcement or two regarding new Google Drive features – and of course we'll bring you all the news from the event.


Meta’s Ray-Ban smart glasses are becoming AI-powered tour guides

While Meta’s most recognizable hardware is its Quest VR headsets, the smart glasses it created in collaboration with Ray-Ban are proving popular thanks to their sleek design and unique AI tools – tools that are getting an upgrade to turn them into a wearable tourist guide.

In a post on Threads – Meta’s Twitter-like Instagram spinoff – Meta CTO Andrew Bosworth showed off a new Look and Ask feature that can recognize landmarks and tell you facts about them. Bosworth demonstrated it using examples from San Francisco such as the Golden Gate Bridge, the Painted Ladies, and Coit Tower.

As with other Look and Ask prompts, you give a command like “Look and tell me a cool fact about this bridge.” The Ray-Ban Meta Smart Glasses then use their built-in camera to scan the scene in front of you, and cross-reference the image with info in Meta AI’s knowledge database (which includes access to the Bing search engine).

The specs then respond with the cool fact you requested – in this case explaining that the Golden Gate Bridge (which it recognized in the photo it took) is painted “International Orange” so that it would be more visible in foggy conditions.

Screenshots from Threads showing the Ray-Ban Meta Smart Glasses being used to give the user information about San Francisco landmarks (Image credit: Andrew Bosworth / Threads)

Bosworth added in a follow-up message that other improvements are being rolled out, including new voice commands so you can share your latest Meta AI interaction on WhatsApp and Messenger. 

Down the line, Bosworth says you’ll also be able to change the speed of Meta AI readouts in the voice settings menu to have them go faster or slower.

Still not for everyone 

One huge caveat is that – much like the glasses’ other Look and Ask AI features – this new landmark recognition feature is still only in beta. As such, it might not always be the most accurate – so take its tourist guidance with a pinch of salt.

Orange Ray-Ban Meta Smart Glasses (Image credit: Meta)

The good news is that Meta has at least opened up the waitlist to join the beta, so more of us can try these experimental features. Go to the official page, input your glasses’ serial number, and wait to be contacted – though this option is only available if you’re based in the US.

In his post, Bosworth did say that the team is working to “make this available to more people,” but neither he nor Meta has given a precise timeline for when the impressive AI features will be more widely available.


Oppo’s new AI-powered AR smart glasses give us a glimpse of the next tech revolution


  • Oppo has shown off its Air Glass 3 AR glasses at MWC 2024
  • They’re powered by its AndesGPT AI model and can answer questions
  • They’re just a prototype, but the tech might not be far from launching

While there’s a slight weirdness to the Meta Ray-Ban Smart Glasses – they are a wearable camera, after all – the onboard AI is pretty neat, even if some of its best features are still in beta. So it’s unsurprising that other companies are looking to launch their own AI-powered specs, with Oppo the latest to do so, unveiling its new Air Glass 3 at MWC 2024.

In a demo video, Oppo shows how the specs could seemingly revolutionize someone’s working day. When they boot up, the Air Glass 3’s 1,000-nit displays show the user a breakdown of their schedule, and while the user is making a coffee ahead of a meeting they get a message saying that it’s started early.

While in the meeting, the specs pick up on a question that’s been asked, and Oppo’s AndesGPT AI model (which runs on a connected smartphone) provides some possible answers. Later, it uses the design details that have been discussed to create an image of a possible prototype, which the wearer then brings to life.

After a good day’s work, they can kick back to some of their favorite tunes played through the glasses’ built-in speakers. All of this is crammed into a 50g design.

Now, the big caveat here is that the Air Glass 3 AR glasses are just a prototype. What’s more, neither of the previous Air Glass models was released outside of China – so there’s a good chance the Air Glass 3 won’t be either.

But what Oppo is showing off isn’t far from being mimicked by its rivals, and a lot of it is already possible with tech that you can go out and buy today – including those Meta Ray-Ban Smart Glasses.

The future is now

The Ray-Ban Meta Smart Glasses already have an AI that can answer questions like a voice-controlled ChatGPT.

They can also scan the environment around you using the camera to get context for questions – for example, “what meal can I make with these ingredients?” – via their 'Look and Ask' feature. These tools are currently in beta, but the tech is working and the AI features will hopefully be more widely available soon.

They can also alert you to texts and calls that you’re getting and play music, just like the Oppo Air Glass 3 concept.

Orange Ray-Ban Meta Smart Glasses in front of a wall of colorful lenses including green, blue, yellow and pink

The Ray-Ban Meta glasses ooze style and have neat AI tools (Image credit: Meta)

Then there’s the likes of the Xreal Air 2. While their AR display is a little more distracting than the screen found on the Oppo Air Glass 3, they are a consumer product that isn’t mind-blowingly expensive to buy – just $399 / £399 for the base model.

If you combine these two glasses then you’re already very close to Oppo’s concept; you’d just need to clean up the design a little, and probably splash out a little more as I expect lenses with built-in displays won’t come cheap.

The only thing I can’t see happening soon is the AI creating a working prototype design for you. It might be able to provide some inspiration for a designer to work from, but reliably creating a fully functional model seems more than a little beyond existing AI image generation tools’ capabilities.

While the Oppo Air Glass 3 certainly looks like a promising glimpse of the future, we’ll have to see what it’s actually capable of if and when it launches outside China.
