Meta AR glasses: everything we know about the AI-powered AR smart glasses

After a handful of rumors and speculation suggested Meta was working on a pair of AR glasses, it unceremoniously confirmed that Meta AR glasses are on the way – doing so via a short section at the end of a blog post celebrating the 10th anniversary of Reality Labs (the division behind its AR/VR tech).

While not much is known about them, the glasses were described as a product merging Meta’s XR hardware with its developing Meta AI software to “deliver the best of both worlds” in a sleek wearable package.

We’ve collected all the leaks, rumors, and some of our informed speculation in this one place so you can get up to speed on everything you need to know about the teased Meta AR glasses. Let’s get into it.

Meta AR glasses: Price

We’ll keep this section brief as right now it’s hard to predict how much a pair of Meta AR glasses might cost because we know so little about them – and no leakers have given a ballpark estimate either.

Current smart glasses like the Ray-Ban Meta Smart Glasses or the Xreal Air 2 AR smart glasses will set you back between $300 and $500 / £300 and £500 / AU$450 and AU$800; Meta’s teased specs, however, sound more advanced than what we have currently.

[Image: Lance Ulanoff showing off Google Glass – Meta’s glasses could cost as much as Google Glass (Image credit: Future)]

As such, the Meta AR glasses might cost nearer $1,500 (around £1,200 / AU$2,300) – which is what the Google Glass smart glasses launched at.

A higher price seems more likely given the AR glasses’ novelty, and the fact that Meta would need to cram small yet powerful hardware into them – a combo that typically leads to higher prices.

We’ll have to wait and see what gets leaked and officially revealed in the future.

Meta AR glasses: Release date

Unlike price, several leaks have pointed to when we might get our hands – or I suppose eyeballs – on Meta’s AR glasses. Unfortunately, we might be waiting until 2027.

That’s according to a leaked Meta internal roadmap shared by The Verge back in March 2023. The document explained that a precursor pair of specs with a display will apparently arrive in 2025, with ‘proper’ AR smart glasses due in 2027.

[Image: Ray-Ban Meta Smart Glasses close-up with the camera flashing (Image credit: Meta)]

In February 2024 Business Insider cited unnamed sources who said a pair of true AR glasses could be shown off at this year’s Meta Connect conference. However, that doesn’t mean they’ll launch sooner than 2027. While Connect does highlight soon-to-release Meta tech, the company takes the opportunity to show off stuff coming further down the pipeline too. So, its demo of Project Orion (as those who claim to be in the know call it) could be one of those ‘you’ll get this when it’s ready’ kind of teasers.

Obviously, leaks should be taken with a pinch of salt. Meta could have brought the release of its specs forward, or pushed it back, depending on a multitude of technological factors – we won’t know until Meta officially announces more details. That said, the fact that Meta has teased the specs suggests their release is a matter of when, not if.

Meta AR glasses: Specs and features

We haven't heard anything about the hardware you’ll find in Meta’s AR glasses, but we have a few ideas of what we’ll probably see from them based on Meta’s existing tech and partnerships.

Meta and LG recently confirmed that they’ll be partnering to bring OLED panels to Meta’s headsets, and we expect OLED screens will make their way to the AR glasses too. OLED displays already appear in other AR smart glasses, so it would make sense if Meta followed suit.

Additionally, we anticipate that Meta’s AR glasses will use a Qualcomm Snapdragon chipset just like Meta’s Ray-Ban smart glasses. Currently, that’s the AR1 Gen 1, though considering Meta’s AR specs aren’t due until 2027 it seems more likely they’d be powered by a next-gen chipset – either an AR2 Gen 1 or an AR1 Gen 2.

[Image: A Meta Quest 3 player sucking up Stay Puft Marshmallow Men from Ghostbusters in mixed reality – The AR glasses could let you bust ghosts wherever you go (Image credit: Meta)]

As for features, Meta’s already teased the two standouts: AR and AI abilities.

What this means in practical terms is yet to be seen, but imagine virtual activities like setting up an AR Beat Saber jam wherever you go, an interactive HUD that guides you from one place to another, or interactive elements that you and other users can see and manipulate together – either for work or play.

AI-wise, Meta is giving us a sneak peek of what's coming via its current smart glasses. That is, you can speak to Meta AI to ask it a variety of questions and for advice, just as you can with other generative AIs, but in a more conversational way because you use your voice.

It also has a unique ability called Look and Ask, which is like a combination of ChatGPT and Google Lens. The specs snap a picture of what’s in front of you to inform your question, letting you ask them to translate a sign you can see, suggest a recipe using the ingredients in your fridge, or name a plant so you can find out how best to care for it.
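
Meta hasn’t said how Look and Ask is wired together under the hood, but the behavior it describes maps onto a simple capture-then-query pipeline. Here’s a minimal Python sketch of that idea – every function here is a hypothetical stand-in, not Meta’s actual API:

```python
# A minimal sketch of a "Look and Ask"-style flow, assuming a capture-then-query
# pipeline. capture_photo() and vision_model() are hypothetical stand-ins for
# the glasses' camera and a multimodal model -- not Meta's actual API.

def capture_photo() -> bytes:
    """Stand-in for the glasses' camera; returns raw image bytes."""
    return b"<jpeg bytes>"

def vision_model(image: bytes, prompt: str) -> str:
    """Stand-in for a multimodal model call (image + text in, text out)."""
    return f"(answer to {prompt!r}, using a {len(image)}-byte photo for context)"

def look_and_ask(question: str) -> str:
    """One Look and Ask turn: snap a photo, query the model, return the answer."""
    photo = capture_photo()
    return vision_model(photo, f"Using this photo as context: {question}")

print(look_and_ask("Translate this sign into English"))
print(look_and_ask("What recipe can I make with these ingredients?"))
```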

The AI features are currently in beta but are set to launch properly soon. And while they seem a little imperfect right now, we’ll likely only see them get better in the coming years – meaning we could see something very impressive by 2027 when the AR specs are expected to arrive.

Meta AR glasses: What we want to see

A slick Ray-Ban-like design 

[Image: Ray-Ban Meta Smart Glasses – The design of the Ray-Ban Meta Smart Glasses is great (Image credit: Meta)]

While Meta’s smart specs aren't amazing in every way – more on that down below – they are practically perfect in the design department. The classic Ray-Ban shape is sleek, they’re lightweight, super comfy to wear all day, and the charging case is not only practical, it's gorgeous.

While it’s likely Ray-Ban and Meta will continue their partnership to develop future smart glasses – and by extension the teased AR glasses – there’s no guarantee. But if Meta’s reading this, we really hope that you keep working with Ray-Ban so that your future glasses have the same high-quality look and feel that we’ve come to adore.

If the partnership does end, we'd like Meta to at least take cues from what Ray-Ban has taught it to keep the design game on point.

Swappable lenses 

[Image: Orange Ray-Ban Meta Smart Glasses in front of a wall of colorful lenses – We want to change our lenses, Meta! (Image credit: Meta)]

While we will rave about the design of Meta’s smart glasses, we’ll admit there’s one flaw we hope future models (like the AR glasses) improve on: they need easily swappable lenses.

While a handsome pair of shades will be faultless for your summer vacations, they won’t serve you well in dark and dreary winters. If we could easily change our Meta glasses from sunglasses to clear lenses as needed then we’d wear them a lot more frequently – as it stands, they’re left gathering dust most months because it just isn’t the right weather.

As the glasses get smarter, more useful, and pricier (as we expect will be the case with the AR glasses) they need to be a gadget we can wear all year round, not just when the sun's out.

Speakers you can (quietly) rave to

[Image: JBL Soundgear Sense – These open-ear headphones are amazing; take notes, Meta (Image credit: Future)]

Hardware-wise, the main upgrade we want to see in Meta’s AR glasses is better speakers. Currently, the speakers housed in each arm of the Ray-Ban Meta Smart Glasses are pretty darn disappointing – they leak a fair amount of noise, the bass is practically nonexistent, and the overall sonic performance is put to shame by even basic over-ear headphones.

We know it’s a struggle to get the balance right with open-ear designs. But we’ve been spoiled by open-ear options like the JBL SoundGear Sense – which have an astounding ability to deliver great sound while letting you hear the real world clearly (we often forget we’re wearing them) – so we’ve come to expect a lot, and we’re disappointed when gadgets don’t deliver.

The camera could also get some improvements, but we expect the AR glasses won’t be as content creation-focused as Meta’s existing smart glasses – so we’re less concerned about this aspect getting an upgrade compared to their audio capabilities.


Meta’s Smart Glasses will get a sci-fi upgrade soon, but they’re still not smart enough

There's a certain allure to smart glasses that bulky mixed-reality headsets lack. Meta's Ray-Ban Smart Glasses (formerly Stories), for instance, are a perfect illustration of how you can build smarts into a wearable without making the wearer look ridiculous. The question is, can you still end up being ridiculous while wearing them?

Ray-Ban Meta Smart Glasses' big upcoming Meta AI update will let you talk to your stylish frames, querying them about the food you're consuming, the buildings you're facing, and the animals you encounter. The update is set to transform the wearable from just another pair of voice-enabled glasses into an always-on-your-face assistant.

The update isn't public yet, and will only apply to the Ray-Ban Meta Smart Glasses, not the Ray-Ban Stories predecessors, which lack Qualcomm's new AR1 Gen 1 chip. This week, however, Meta gave a couple of tech reporters at The New York Times early access to the Meta AI integration, and they came away somewhat impressed.

I must admit, I found the walkthrough more intriguing than I expected.

Even though they didn't tear the glasses apart or get into the nitty-gritty tech details I crave, their real-world experience depicts Meta AI as a fascinating and possibly useful work in progress.

Answers and questions

In the story, the authors use the Ray-Ban smart glasses to ask Meta AI to identify a variety of animals, objects, and landmarks, with varying success. In the confines of their homes, they spoke at full voice and asked Meta AI, “What am I looking at?” They also enabled transcription so we could see what they asked and the responses Meta AI provided.

It was, in their experience, quite good at identifying their dogs' breed. However, when they took the smart glasses to the zoo, Meta AI struggled to identify far-away animals. In fact, Meta AI got a lot wrong. To be fair, this is a beta, and I wouldn't expect the large language model (Llama 2) to get everything right. At least it's not hallucinating (“that's a unicorn!”), just getting things wrong.

The story features a lot of photos taken with the Ray-Ban Meta Smart Glasses, along with the queries and Meta AI's responses. Of course, that's not really what was happening. As the authors note, they were speaking to Meta AI wherever they went and then heard the responses spoken back to them. This is all well and good when you're at home, but just weird when you're alone at a zoo talking to yourself.

The creep factor

This, for me, remains the fundamental flaw in many of these wearables. Whether you wear Ray-Ban Smart Glasses or Amazon Echo Frames, you'll still look as if you're talking to yourself. For a decent experience, you may engage in a lengthy “conversation” with Meta AI to get the information you need. Again, if you're doing this at home, letting Meta AI help you through a detailed recipe, that's fine. Using Meta AI as a tour guide when you're in the middle of, say, your local Whole Foods might label you as a bit of an oddball.

We do talk to our phones and even our smartwatches, but I think that when people see you holding your phone or smartwatch near your face, they understand what's going on.

The New York Times' authors noted how they found themselves whispering to their smart glasses, but they still got looks.

I don't know a way around this issue and wonder if this will be the primary reason people swear off what is arguably a very good-looking pair of glasses (or sunglasses) even if they could offer the passive smart technology we need.

So, I'm of two minds. I don't want to be seen as a weirdo talking to my glasses, but I can appreciate having intelligence there and ready to go; no need to pull my phone out, raise my wrist, or even tap a smart lapel pin. I just say, “Hey Meta” and the smart glasses wake up, ready to help.

Perhaps the tipping point here will be when Meta can integrate very subtle AR screens into the frames that add some much-needed visual guidance. Plus, the access to visuals might cut down on the conversation, and I would appreciate that.


Want to skip to the good bit of a video? YouTube is testing a smart AI feature for that

I’ve been increasingly driven to distraction by YouTube’s ever-more-aggressive delivery of adverts before, during and after videos, which is making it a challenge to even get to the bits of a video that I want to see without having some earnest voice encourage me to trade stocks or go to Dubai. Until now I’ve been too cheap to subscribe to YouTube Premium – but that may soon change. 

That’s because YouTube is apparently testing an AI-powered recommendation system that will analyze patterns in viewer behavior to cleverly skip to the most popular parts of a video with just a double tap on a touchscreen. 

“The way it works is, if a viewer is double tapping to skip ahead on an eligible segment, we’ll show a jump ahead button that will take them to the next point in the video that we think they’re aiming for,” YouTube creator-centric channel Creator Insider explained. “This feature will also be available to creators while watching their own videos.”
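
YouTube hasn’t explained how it picks the point “we think they’re aiming for,” but you can picture a rough approximation: track how often viewers watch or skip to each second of a video, and on a double tap, jump to the busiest moment ahead of the playhead. Here’s a toy Python sketch of that guess – illustrative only, not YouTube’s algorithm:

```python
# Toy sketch only: pick the "jump ahead" target as the highest replay-heat
# second at least `min_gap` seconds past the current playhead. The real
# system presumably models viewer behavior far more carefully.

def next_jump_point(heat: list[float], position: int, min_gap: int = 3) -> int:
    """Return the index of the busiest second beyond position + min_gap."""
    window = heat[position + min_gap:]
    if not window:
        return len(heat) - 1  # nothing ahead to jump to: go to the end
    offset = max(range(len(window)), key=window.__getitem__)
    return position + min_gap + offset

replay_heat = [1, 1, 2, 9, 2, 1, 1, 8, 14, 3, 2]  # fake per-second watch counts
print(next_jump_point(replay_heat, position=2))    # -> 8, the busiest moment ahead
```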

Currently, such a double-tap action skips a YouTube video forward by a few seconds, which I don’t find hugely useful. And while YouTube overlays a form of wave pattern on the video timeline to show the most popular parts of a video, it’s not the easiest thing to use, and can sometimes feel rather unintuitive.

So being able to easily tap to get to the most popular part of a video, at least according to an AI, could be a boon for impatient people like me. The only wrinkle is that this feature is only being tested for YouTube Premium users, and is currently limited to the US.

But such features do tend to get a larger global rollout once they come out of the testing phase, meaning there’s scope for Brits like myself to have access to some smart double-tap video skipping – that’s if I do finally decide to bite the bullet and pay for YouTube Premium.


Google Drive could add a smart new way to keep your files organized

Finding your way around your Google Drive files could be about to get a lot easier: there's evidence that you'll soon be able to categorize your files into different groups, like banking and work, to keep them better organized.

This is according to hidden code spotted in the Google Drive app by TheSpAndroid (via Android Police). Apps often lay the coding groundwork for future features, before those features go live and are announced to users.

As per the app, the categories you'll be able to make use of are Auto, Banking, Expenses, Home, IDs, Insurance, Medical, Pets, School, Taxes, Travel, and Work. From this leak, it doesn't seem as though custom labels will be allowed, but those 12 categories cover the business of modern life pretty well.

As Android Police points out, these categories are similar to the labeling system that companies can use in Google Workspace. However, this should be available to individual users too, across Android, iOS, and the web.

How it'll work

[Image: Google Drive category feature leak – How the upcoming feature might look (Image credit: TheSpAndroid)]

Here's how it's going to work: From the Home tab in the Android app, you'll be able to tap the three dots next to a file, then choose from the categories list. A file can be in multiple categories, potentially making the feature more useful than the current folders system.
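
To see why categories could beat folders, here’s a minimal Python sketch (our illustration, not Google’s implementation) of the many-to-many relationship the hidden code describes – one file, several categories:

```python
# A minimal sketch (not Google's implementation) of why labels beat folders:
# a file lives in exactly one folder, but can carry many category labels.

CATEGORIES = {"Auto", "Banking", "Expenses", "Home", "IDs", "Insurance",
              "Medical", "Pets", "School", "Taxes", "Travel", "Work"}

labels: dict[str, set[str]] = {}     # file name -> set of categories

def add_label(file: str, category: str) -> None:
    """Attach a category to a file, validating against the leaked list."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    labels.setdefault(file, set()).add(category)

def files_in(category: str) -> list[str]:
    """All files carrying a given category label."""
    return [f for f, cats in labels.items() if category in cats]

add_label("vet-invoice.pdf", "Pets")
add_label("vet-invoice.pdf", "Expenses")   # same file, two categories
print(files_in("Expenses"))                # ['vet-invoice.pdf']
```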

We don't get any indication here about when the switch might be flipped to give users access to file categories: the report on TheSpAndroid says “it won't come very soon”, so presumably there's still work to do before it's ready for the public.

Given Google's recent and very committed push into artificial intelligence features, it's possible that some kind of AI processing might be involved as well, in categorizing files for you (or at least suggesting categories based on a file name or its contents). Suggested categories do appear in the screens produced by the hidden code.

We now know that Google I/O 2024 is getting underway on May 14 this year, so in between all the Android 15 and Pixel 8a news we might get an announcement or two regarding new Google Drive features – and of course we'll bring you all the news from the event.


Meta’s Ray-Ban smart glasses are becoming AI-powered tour guides

While Meta’s most recognizable hardware is its Quest VR headsets, its smart glasses created in collaboration with Ray-Ban are proving to be popular thanks to their sleek design and unique AI tools – tools that are getting an upgrade to turn them into a wearable tourist guide.

In a post on Threads – Meta’s Twitter-like Instagram spinoff – Meta CTO Andrew Bosworth showed off a new Look and Ask feature that can recognize landmarks and tell you facts about them. Bosworth demonstrated it using examples from San Francisco such as the Golden Gate Bridge, the Painted Ladies, and Coit Tower.

As with other Look and Ask prompts, you give a command like “Look and tell me a cool fact about this bridge.” The Ray-Ban Meta Smart Glasses then use their in-built camera to scan the scene in front of you, and cross-reference the image with info in the Meta AI’s knowledge database (which includes access to the Bing search engine). 

The specs then respond with the cool fact you requested – in this case explaining the Golden Gate Bridge (which it recognized in the photo it took) is painted “International Orange” so that it would be more visible in foggy conditions.

[Image: Screenshots from Threads showing the Ray-Ban Meta Smart Glasses giving the user information about San Francisco landmarks (Image credit: Andrew Bosworth / Threads)]

Bosworth added in a follow-up message that other improvements are being rolled out, including new voice commands so you can share your latest Meta AI interaction on WhatsApp and Messenger. 

Down the line, Bosworth says you’ll also be able to change the speed of Meta AI readouts in the voice settings menu to have them go faster or slower.

Still not for everyone 

One huge caveat is that – much like the glasses’ other Look and Ask AI features – this new landmark recognition feature is still only in beta. As such, it might not always be the most accurate – so take its tourist guidance with a pinch of salt.

[Image: Orange Ray-Ban Meta Smart Glasses (Image credit: Meta)]

The good news is Meta has at least opened up its waitlist to join the beta so more of us can try these experimental features. Go to the official page, input your glasses serial number, and wait to get contacted – though this option is only available if you’re based in the US.

In his post Bosworth did say that the team is working to “make this available to more people,” but neither he nor Meta has given a precise timeline for when the impressive AI features will be more widely available.


Oppo’s new AI-powered AR smart glasses give us a glimpse of the next tech revolution


  • Oppo has shown off its Air Glass 3 AR glasses at MWC 2024
  • They’re powered by its AndesGPT AI model and can answer questions
  • They’re just a prototype, but the tech might not be far from launching

While there’s a slight weirdness to the Meta Ray-Ban Smart Glasses – they are a wearable camera, after all – the onboard AI is pretty neat, even if some of its best features are still in beta. So it’s unsurprising that other companies are looking to launch their own AI-powered specs, with Oppo being the latest in unveiling its new Air Glass 3 at MWC 2024.

In a demo video, Oppo shows how the specs have seemingly revolutionized someone's working day. When they boot up, the Air Glass 3's 1,000-nit displays show the user a breakdown of their schedule, and while making a coffee ahead of a meeting they get a message saying that it's started early.

While in the meeting the specs pick up on a question that’s been asked, and Oppo's AndesGPT AI model (which runs on a connected smartphone) is able to provide some possible answers. Later it uses the design details that have been discussed to create an image of a possible prototype design which the wearer then brings to life.

After a good day’s work they can kick back to some of their favorite tunes that play through the glasses’ in-built speakers. All of this is crammed into a 50g design. 

Now, the big caveat here is that the Air Glass 3 AR glasses are just a prototype. What’s more, neither of the previous Air Glass models was released outside of China – so it’s more than likely the Air Glass 3 won’t be either.

But what Oppo is showing off isn’t far from being mimicked by its rivals, and a lot of it is pretty much possible in tech that you can go out and buy today – including those Meta Ray-Ban Smart Glasses.

The future is now

The Ray-Ban Meta Smart Glasses already have an AI that can answer questions, like a voice-controlled ChatGPT.

They can also scan the environment around you using the camera to get context for questions – for example, “what meal can I make with these ingredients?” – via their 'Look and Ask' feature. These tools are currently in beta, but the tech is working and the AI features will hopefully be more widely available soon.

They can also alert you to texts and calls that you’re getting and play music, just like the Oppo Air Glass 3 concept.

[Image: Orange Ray-Ban Meta Smart Glasses – The Ray-Ban Meta glasses ooze style and have neat AI tools (Image credit: Meta)]

Then there’s the likes of the Xreal Air 2. While their AR display is a little more distracting than the screen found on the Oppo Air Glass 3, they are a consumer product that isn’t mind-blowingly expensive to buy – just $399 / £399 for the base model.

If you combine these two glasses then you’re already very close to Oppo’s concept; you’d just need to clean up the design a little, and probably splash out a little more as I expect lenses with built-in displays won’t come cheap.

The only thing I can’t see happening soon is the AI creating a working prototype product design for you. It might be able to provide some inspiration for a designer to work off, but reliably creating a fully functional model seems more than a little beyond existing AI image generation tools' capabilities.

While the Oppo Air Glass 3 certainly look like a promising glimpse of the future, we'll have to see what they're actually capable of if and when they launch outside China.


These new smart glasses can teach people about the world thanks to generative AI

It was only a matter of time before someone added generative AI to an AR headset, and start-up Brilliant Labs has taken the plunge with its recently revealed Frame smart glasses.

Looking like a pair of Where’s Waldo glasses (or Where’s Wally to our UK readers), the Frame houses a multimodal digital assistant called Noa. It consists of multiple AI models from other brands working in unison to help users learn about the world around them, just by looking at something and issuing a command.

Let’s say you want to know more about the nutritional value of a raspberry. Thanks to OpenAI tech, you can command Noa to perform a “visual analysis” of the subject, with the read-out appearing on the outer AR lens. Additionally, it can offer real-time language translation via Whisper AI.

The Frame can also search the internet via its Perplexity AI model, and search results will even provide price tags for potential purchases. In a recent VentureBeat article, Brilliant Labs claims Noa can provide instantaneous price checks for clothes just by scanning the piece, or fish out home listings for new houses on the market – all you have to do is look at the house in question. It can even generate images on the fly through Stable Diffusion, according to ZDNET.
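
Brilliant Labs hasn’t published Noa’s internals, but the description amounts to a router that hands each command off to a different model. Here’s an illustrative Python sketch of that idea, with hypothetical stand-ins for the OpenAI, Whisper, Perplexity, and Stable Diffusion integrations:

```python
# Illustrative sketch only: the handler names are hypothetical stand-ins,
# not Brilliant Labs' code. The point is the routing pattern: one assistant,
# many specialized models behind it.

def visual_analysis(req: str) -> str:  return f"[vision model] {req}"
def translate(req: str) -> str:        return f"[translation model] {req}"
def web_search(req: str) -> str:       return f"[search model] {req}"
def generate_image(req: str) -> str:   return f"[image model] {req}"

ROUTES = {
    "visual analysis": visual_analysis,  # e.g. an OpenAI-style vision model
    "translate": translate,              # e.g. a Whisper-style speech model
    "search": web_search,                # e.g. a Perplexity-style search model
    "imagine": generate_image,           # e.g. a Stable Diffusion-style generator
}

def noa(command: str, request: str) -> str:
    """Dispatch a spoken command to the matching model, if one exists."""
    handler = ROUTES.get(command)
    return handler(request) if handler else "Sorry, I can't do that yet."

print(noa("visual analysis", "nutritional value of this raspberry"))
```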

Evolving assistant

Going back to VentureBeat, their report offers a deeper insight into how Noa works. 

The digital assistant is always on, constantly taking in information from its environment. And it’ll apparently “adopt a unique personality” over time. The publication explains that upon activating for the first time, Noa appears as an “egg” on the display. Owners will have to answer a series of questions, and upon finishing, the egg hatches into a character avatar whose personality reflects the user. As the Frame is used, Noa analyzes the interactions between it and the user, evolving to become better at tackling tasks.

[Image: Brilliant Labs Frame exploded view (Image credit: Brilliant Labs)]

An exploded view of the Frame can be found on Brilliant Labs’ official website, providing interesting insight into how the tech works. On-screen content is projected by a micro-OLED onto a “geometric prism” in the lens – 9To5Google points out this is reminiscent of how Google Glass worked. On the nose bridge is the Frame’s camera, sitting on a PCBA (printed circuit board assembly).

At the end of the stems, you have the batteries inside two big hubs. Brilliant Labs states the frames can last a whole day, and to charge them, you’ll have to plug in the Mister Power dongle, inadvertently turning the glasses into a high-tech Groucho Marx impersonation.

[Image: Brilliant Labs Frame with Mister Power (Image credit: Brilliant Labs)]

Availability

Currently open for pre-order, the Frame will run you $350 a pair. It’ll be available in three colors: Smokey Black, Cool Gray, and the transparent H20. You can opt for prescription lenses, though doing so will bump the price tag to $448. There’s a chance Brilliant Labs won’t have your exact prescription; it recommends selecting the option that most closely matches your actual prescription. Shipping is free, and the first batch rolls out April 15.

It appears all of the AI features are subject to a daily usage cap, though Brilliant Labs has plans to launch a subscription service lifting the limit. We reached out to the company for clarification, and asked several other questions – like exactly how the Frame receives input – and will update this story when we hear back.

Until then, check out TechRadar's list of the best VR headsets for 2024.


ChatGPT could become a smart personal assistant helping with everything from work to vacation planning

Now that ChatGPT has had a go at composing poetry, writing emails, and coding apps, it's turning its attention to more complex tasks and real-world applications, according to a new report – essentially, being able to do a lot of your computing for you.

This comes from The Information (via Android Authority), which says that ChatGPT developer OpenAI is working on “agent software” that will act almost like a personal assistant. It would be able to carry out clicks and key presses as it works inside applications from web browsers to spreadsheets.

We've seen something similar with the Rabbit R1, although that device hasn't yet shipped. You teach an AI how to calculate a figure in a spreadsheet, or format a document, or edit an image, and then it can do the job for you in the future.

Another type of agent in development will take on online tasks, according to the sources speaking to The Information: These agents are going to be able to research topics for you on the web, or take care of hotel and flight bookings, for example. The idea is to create a “supersmart personal assistant” that anyone can use.
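
The Information’s description boils down to the classic agent loop: observe the screen, decide on a UI action, perform it, and repeat until the goal is met. Here’s a bare-bones Python sketch of that loop – purely illustrative, not OpenAI’s code, with a hard-coded decide() standing in for the model:

```python
# A minimal sketch of the observe -> decide -> act agent loop such software
# implies; purely illustrative, not OpenAI's implementation.

from dataclasses import dataclass

@dataclass
class Action:
    kind: str    # e.g. "click", "type", "done"
    target: str  # e.g. a button label or text to enter

def decide(goal: str, screen: str) -> Action:
    """Stand-in for the model: map the current screen to the next UI action."""
    if "Confirm booking" in screen:
        return Action("click", "Confirm booking")
    return Action("done", "")

def run_agent(goal: str, screen: str, max_steps: int = 10) -> None:
    for _ in range(max_steps):
        action = decide(goal, screen)
        if action.kind == "done":
            break
        print(f"{action.kind} -> {action.target}")  # here: perform the click/keypress
        screen = "Booking confirmed"                # observe the new screen state

run_agent("book a hotel in Lisbon", screen="Hotel page: Confirm booking")
```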

Our AI agent future?

[Image: The Google Gemini logo on a laptop screen – Google is continuing work on its own AI (Image credit: Google)]

As the report acknowledges, this will certainly raise one or two concerns about letting automated bots loose on people's personal computers: OpenAI is going to have to do a lot of work to reassure users that its AI agents are safe and secure.

While many of us will be used to deploying macros to automate tasks, or asking Google Assistant or Siri to do something for us, this is another level up. Your boss isn't likely to be too impressed if you blame a miscalculation in the next quarter's financial forecast on the AI agent you hired to do the job.

It also remains to be seen just how much automation people want when it comes to these tasks: Booking vacations involves a lot of decisions, from the position of your seats on an airplane to having breakfast included, which AI would have to make on your behalf.

There's no timescale on any of this, but it sounds like OpenAI is working hard to get its agents ready as soon as possible. Google just announced a major upgrade to its own AI tools, while Apple is planning to reveal its own take on generative AI at some point later this year, quite possibly with iOS 18.


Ray-Ban Meta smart glasses finally get the AI camera feature we were promised, but there’s a catch

When the Ray-Ban Meta smart glasses launched they did so without many of the impressive AI features we were promised. Now Meta is finally rolling out these capabilities to users, but they’re still in the testing phase and only available in the US.

During the Meta Connect 2023 announcement, we were told the follow-up to the Ray-Ban Stories smart glasses would get some improvements we expected – namely a slightly better camera and speakers – but also some unexpected AI integration.

Unfortunately, when we actually got to test the specs out, their AI features boiled down to very basic commands. You can instruct them to take a picture, record a video, or contact someone through Messenger or WhatsApp. In the US you could also chat to a basic conversational AI – like ChatGPT – though this was still nothing to write home about.

While the glasses’ design is near-perfect, the speakers and camera weren’t impressive enough to make up for the lackluster AI. So overall, in our Ray-Ban Meta Smart Glasses review, we didn’t look too favorably on the specs.

[Image: The Ray-Ban Meta Smart Glasses Collection – Press the button or ask the AI to take a picture (Image credit: Meta)]

Our perception could be about to change drastically, however, as two major promised features are on their way: Look and Ask, and Bing integration.

Look and Ask is essentially a wearable voice-controlled Google Lens with a few AI-powered upgrades. While wearing the smart glasses you can say “Hey Meta, look and…” followed by a question about what you can see. The AI will then use the camera to scan your environment so it can provide a detailed answer to your query. On the official FAQ possible questions you can ask include “What can I make with these ingredients?” or “How much water do these flowers need?” or “Translate this sign into English.” 

To help the Meta glasses provide better information when you’re using the conversational and Look and Ask features, the specs can now also access the internet via Bing. This should mean they can source more up-to-date data, letting them answer questions about sports matches that are currently happening, or provide real-time info on which nearby restaurants are the best rated, among other things.

Still not perfect

[Image: Orange Ray-Ban Meta Smart Glasses in front of a wall of colorful lenses (Image credit: Meta)]

It all sounds very science fiction, but unfortunately these almost magical capabilities come with a catch. For now, the new features – just like the existing conversational AI – are in beta testing. 

So the glasses might have trouble with some of your queries and provide inaccurate answers, or not be able to find an answer at all. What’s more, as Meta explains in its FAQ, any AI-processed pictures you take while part of the beta will be stored by Meta and used to train its AI – so your Look and Ask snaps aren’t private.

Lastly, the Meta Ray-Ban smart glasses beta is only available in the US. So if, like me, you live somewhere else, you won’t be able to try these features out – and probably won’t until 2024.

If you are in the US and happy with the terms of Meta’s Privacy Policy, you can sign up for the Early Access program and start testing these new tools. For everyone else, hopefully these features won’t be in beta for long, or at least won’t stay US-exclusive – otherwise we’ll be left continuing to wonder why we spent $299 / £299 / AU$449 on smart specs that aren’t all that much better than dumb Ray-Ban Wayfarers at half the cost.


Samsung Glasses could be the name of a new pair of Samsung smart specs

Rumors of some kind of Samsung smart glasses have been swirling for years at this point, but it looks as though the wait for an actual device might soon be over: Samsung has filed to register “Samsung Glasses” as a trademark in the UK.

This comes from UploadVR (via Android Central), and the filing comes with a description of the categories the product covers: virtual reality headsets, augmented reality headsets, headphones, smartphones, and smart glasses.

That covers a lot of ground. Virtual reality or VR means fully enclosed digital experiences, augmented reality or AR means looking at the real world with digital graphics overlaid on top, mixed reality or MR is enhanced AR where the digital elements and real elements interact, and extended reality or XR is used to mean VR, AR and MR all together.

Exactly which category the Samsung Glasses might fall into remains to be seen, but we know that the company is working on several different products offering these technologies, after previously being responsible for the Samsung Gear VR.

What to expect

Samsung itself has confirmed that it has an XR headset in the pipeline to rival the Apple Vision Pro, but it's not expected to appear until later in 2024, so that Samsung has time to get features such as display sharpness as good as they can be.

The term “glasses” really doesn't sound like a headset, anyway. Could it be that Samsung is also working on a pair of AR specs? We've seen suggestions of this in previous years, though no confirmation from Samsung itself.

Or, we might be talking about more basic smart glasses: able to take photos and videos, an on-board smart assistant, but no fancy augmented reality. See our Ray-Ban Meta Smart Glasses review for Meta's recent entry in this product category.

Right now it's not clear exactly what to expect – but it looks very much like Samsung will soon launch a device that you can wear on your face. Its next big launch event should be for the Samsung Galaxy S24 phone, sometime in January.
