These new AI smart glasses are like getting a second pair of ChatGPT-powered eyes

The Ray-Ban Meta glasses have a new rival for the title of best smart glasses, with the new Solos AirGo Visions letting you quiz ChatGPT about the objects and people you're looking at.

Unlike previous Solos glasses, the AirGo Visions boast a built-in camera and support for OpenAI's latest GPT-4o model. These let the glasses identify what you're looking at and respond to voice prompts. For example, you could simply ask, “What am I looking at?” or give the AirGo Visions a more specific request like “give me directions to the Eiffel Tower.”

Another neat feature of the new Solos glasses is their modular frame design, which means you can change some parts – for example, the camera or lenses – to suit different situations. These additional frames start from $89 (around £70 / AU$135).

If talking to a pair of camera-equipped smart glasses feels a little too creepy, you can also use the camera simply to take holiday snaps. The AirGo Visions also feature built-in speakers for answering your questions or playing music.

While there's no official price or release date for the full version of the AirGo Visions, Solos will release a version without the camera for $249 (around £200 / AU$375) in July. That means we can expect a camera-equipped pair to cost at least as much as the Ray-Ban Meta glasses, which will set you back $299 / £299 / AU$449.

How good are AI-powered smart glasses?

While we haven't yet tried the Solos AirGo Visions, it's fair to say that smart glasses with AI assistants are a work in progress. 

TechRadar's Senior Staff Writer Hamish Hector recently tried Meta AI's 'Look and Ask' feature on his Ray-Ban smart glasses and found the experience to be mixed. He said that “the AI is – when it works – fairly handy,” but that “it wasn’t 100% perfect, struggling at times due to its camera limitations and an overload of information.”

The smart glasses failed in some tests, like identifying trees, but their ability to quickly summarize a confusing, information-packed sign about the area’s parking restrictions showed how useful they can be in some situations.

As always, with any AI-powered responses, you'll want to corroborate any answers to filter out errors and so-called hallucinations. But there's undoubtedly some potential in the concept, particularly for travelers or anyone who is visually impaired.

The Solos AirGo Visions' support for OpenAI's latest GPT-4o model should make for an interesting comparison with the Ray-Ban Meta smart glasses when the camera-equipped version lands. Until then, you can check out our guide to the best smart glasses you can buy right now.


Xreal’s cheap smartphone-like gadget solves the biggest issues with AR smart glasses

Xreal has unveiled its latest gadget at the Augmented World Expo (AWE) 2024 XR industry showcase, but it’s not a pair of AR smart glasses – it’s a smartphone-like device called the Xreal Beam Pro.

The Beam Pro is a successor to the original Beam adapter, and it solves many of the issues I’ve had with this generation of AR specs – and it’s surprisingly affordable.

The upgraded Beam Pro is a whole new Xreal glasses attachment that’s compatible with the Xreal Air and Xreal Air 2 models – including the Pro and Ultra. Simply connect it to your specs with a USB-C to USB-C cable and they’ll turn into bona fide spatial computers that run on Android 14 and NebulaOS (Xreal’s home-grown operating system).

You can control what you see on the Xreal glasses using the Beam Pro’s 6.5-inch LCD 2K touchscreen. You can also use the screen just like you would on a smartphone to manage Play Store apps and tweak your settings, and thanks to the dual 50MP cameras on its rear you can capture spatial images and 3D video at 1080p and 60fps.

Xreal Air glasses connected to the Xreal Beam Pro being used to watch TV

(Image credit: Xreal)

Best of all, it comes in at just $199 / £189 for the 6GB RAM / 128GB storage model, while the 8GB RAM / 256GB storage model will set you back $249 / £239. Preorders are live right now at XREAL.com for the United States, United Kingdom, China, Japan, Germany, France, Italy, and the Netherlands; availability and pricing for Australia are TBC.

If you’re picking up the Beam Pro and a pair of Xreal glasses you can save on bundles, and those of you with a 1st-gen Beam can save $50 / £40 on a Beam Pro if you preorder one by July 10, 2024.

The AR accessory we’ve needed 

I’ve not yet had the chance to try out the Xreal Beam Pro, but it sounds like the AR add-on we’ve needed for a while, solving a bunch of issues I’ve had with the tech while testing it out.

Firstly, finding compatible gadgets can be a headache. There are all sorts of compatibility issues thanks to hardware and software nuances that are confusing if you aren’t techy. The original Beam was billed as a potential solution, but it still suffered from compatibility issues because it couldn’t be used without a smartphone. Pick up the Beam Pro and a pair of Xreal Airs, though, and you know you’ll have everything you need.

Second, it solves the battery annoyances I’ve had, thanks to its two USB-C ports. The Xreal glasses (and other wired AR specs) can burn through your phone’s charge, and there’s no way to charge your phone while using the glasses. That’s not an issue with the Beam Pro, as you can connect it to power while you use it.

Girl wearing Meta Quest 3 headset interacting with a jungle playset

The Meta Quest 3 has some competition (Image credit: Meta)

Finally, it seems like good value for money. Without any bundle discounts, an AR glasses and Beam Pro setup costs between $498 and $648 / £528 and £638, which is a little more expensive than a Meta Quest 3.

While AR isn't prohibitively expensive, it can feel like you're getting relatively little bang for your buck compared to XR devices such as VR headsets because of the aforementioned compatibility and complexity issues.

The Beam Pro gives you a simple plug-and-play option that doubles as a pocket TV and 3D camera, and it doesn’t require other tech – just some subscriptions to the best streaming services.

I’ll obviously need to try the Beam Pro out to verify Xreal’s bold promises, but if you’ve been waiting for AR tech to start feeling worthwhile, this is your notice to start paying attention to the space – and maybe even finally dive in.


Ray-Ban Meta smart glasses get new Amazon Music and mental health update

In a sign that they could follow the roughly monthly Meta Quest 3 software update schedule we’re used to, the Ray-Ban Meta smart glasses have received several new tools – including Calm integration and improved Instagram sharing – just 29 days after their massive April 23 upgrade.

While these improvements don’t appear to include wider access to the still-in-beta, still-US-and-Canada-exclusive Meta AI, they do include some helpful hands-free features that users can enjoy right now.

First up are new Meta AI prompts that let you enjoy guided meditation, mindfulness exercises, and self-care content through your smart specs by simply saying “Hey Meta, play the Daily Calm.” New Calm users will also be able to access a three-month free subscription through prompts in the Meta View app.

Beyond this, your smart specs can now stream tunes directly from Amazon Music using voice-only controls (joining Apple Music, which added hands-free support in April) – you just need to connect your account via the Meta View app. There are new Instagram Story sharing options, too.

Simply say something like, “Hey Meta, post a photo to Instagram” to snap a pic that’ll be shared automatically to your connected Instagram account.

As the Meta blog post detailing the update explains, these new Ray-Ban Meta smart glasses features are rolling out gradually. So if you don’t see them in the Meta View app, don’t panic – you should get the update soon enough.

Three new styles

The Skyler Ray-Ban Meta smart glasses with pink lenses

The Skyler in Shiny Chalky Gray (above) is one of three new versions of the Ray-Ban Meta smart glasses (Image credit: Meta / Ray-Ban)

If you don’t like waiting for software changes, there are also some hardware updates – which are available now.

The internal specs are exactly the same as the original models’, but Meta and Ray-Ban have launched three new styles, which are available in the US, Canada, Australia, and “throughout Europe.” They are:

  • Skyler in Shiny Chalky Gray with Gradient Cinnamon Pink Lenses
  • Skyler in Shiny Black with Transitions® Cerulean Blue Lenses
  • Headliner Low Bridge Fit in Shiny Black with Polar G15 Lenses

Hopefully this monthly software schedule will continue – and if it does, maybe those of us outside the US and Canada won’t have to wait too much longer for Meta AI to hit our devices in a future update.


A huge Meta AI update finally arrives on Ray-Ban Meta smart glasses… for some

After months of waiting, the moment is here: Meta AI features have arrived on the Ray-Ban Meta smart glasses for everyone – well, everyone in the US and Canada, for now.

That regional exclusivity isn't the only caveat, unfortunately. Another big one is that while the Meta AI tools are no longer locked behind an exclusive beta, Meta notes in its blog post announcement that they are still beta features – suggesting you'll likely run into some reliability and accuracy problems.

But while the update isn't quite as complete as we'd have liked, it's still a major leap forward for Meta's smart glasses – one that finally has them delivering on the impressive AI promises Meta CEO Mark Zuckerberg made when they were revealed at Meta Connect 2023 last September.

What can Meta AI do?

A video call shot on the Ray-Ban Meta smart glasses

(Image credit: Ray-Ban / Meta)

The main Meta AI feature you’ll want to take advantage of is ‘Look and Ask.’ To activate it simply start a phrase with “Hey Meta, look and …” then ask the glasses a question about something you can see. 

You could try “… tell me about this animal,” or “…tell me about this building,” or even “…tell me what I can make for dinner with these ingredients.”

The glasses will then use your command alongside an image captured by the camera to search for an answer in their database – which includes data Meta AI has been trained on, and information it has gathered from Google and Bing.

As with all AI responses, we’d recommend taking what the Meta AI says with a pinch of salt. AI assistants are prone to hallucinating – which in the AI context you can read simply as “getting stuff completely wrong” – and this Meta model is no different. It will get stuff right too, but don’t take its advice as gospel.

Ray-Ban Meta Smart Glasses covered in water droplets

(Image credit: Meta)

Beyond Look and Ask, you can use the Meta AI assistant much like Google Assistant or Siri on your phone. This means starting video calls (above), sending texts and images, or playing music, all with just voice commands.

Just be prepared to get some attention as you walk around talking to your smart glasses – we got some odd looks when we were testing a different pair of specs the other day.


Ray-Ban’s Meta glasses now let you listen to Apple Music with voice controls for maximum nerd points

The Ray-Ban Meta smart glasses are still waiting on their big AI update – which is set to bring features like ‘Look and Ask’ out of the exclusive beta and bring them to everyone – but while we wait, a useful upgrade has just rolled out to the specs.

The big feature for many will be native Apple Music controls (via 9to5Mac). Previously you could play Apple Music through the Ray-Ban Meta glasses by using the app on your phone and the touch controls on the glasses’ arms, but this update lets you use the Meta AI voice controls to play songs, playlists, albums, and stations from your music library for a hands-free experience.

The update also brings new touch controls: touch and hold the side of the glasses and Apple Music will automatically play tracks based on your listening history.

The Apple Music app icon against a red background on an iPhone.

(Image credit: Brett Jordan / Unsplash)

Beyond Apple Music integration, the new update also lets you use the glasses as a video source for WhatsApp and Messenger calls. This improves on the existing integration that lets you send messages, plus images or videos captured with the glasses, to contacts in these apps using Meta AI.

You can also access a new command – “Hey Meta, what song is this?” – to have your glasses tell you what song is playing through your smart specs. This isn’t quite as useful as recognizing tracks playing in public as you walk around, but it could be handy if you like collecting playlists of new and unfamiliar artists.

To update your glasses to the latest version, open the Meta View app, go to Settings, open the Your Glasses menu, then tap Updates. You’ll also want to have your glasses to hand, and make sure they’re turned on and connected to your phone via Bluetooth. If you can’t see the update – and the app says your glasses aren’t already on version 4.0 – then check the Play Store or App Store to see if the Meta View app itself needs an update.


Meta AR glasses: everything we know about the AI-powered AR smart glasses

After a handful of rumors and speculation suggested Meta was working on a pair of AR glasses, it unceremoniously confirmed that Meta AR glasses are on the way – doing so via a short section at the end of a blog post celebrating the 10th anniversary of Reality Labs (the division behind its AR/VR tech).

While not much is known about them, the glasses were described as a product merging Meta’s XR hardware with its developing Meta AI software to “deliver the best of both worlds” in a sleek wearable package.

We’ve collected all the leaks, rumors, and some of our informed speculation in this one place so you can get up to speed on everything you need to know about the teased Meta AR glasses. Let’s get into it.

Meta AR glasses: Price

We’ll keep this section brief as right now it’s hard to predict how much a pair of Meta AR glasses might cost because we know so little about them – and no leakers have given a ballpark estimate either.

Current smart glasses like the Ray-Ban Meta Smart Glasses or the Xreal Air 2 AR smart glasses will set you back between $300 and $500 / £300 and £500 / AU$450 and AU$800; Meta’s teased specs, however, sound more advanced than what we have currently.

Lance Ulanoff showing off Google Glass

Meta’s glasses could cost as much as Google Glass (Image credit: Future)

As such, the Meta AR glasses might cost nearer $1,500 (around £1,200 / AU$2,300) – which is what the Google Glass smart glasses launched at.

A higher price seems more likely given the AR glasses’ novelty, and the fact that Meta would need to create small yet powerful hardware to cram into them – a combo that typically leads to higher prices.

We’ll have to wait and see what gets leaked and officially revealed in the future.

Meta AR glasses: Release date

Unlike price, several leaks have pointed to when we might get our hands – or I suppose eyeballs – on Meta’s AR glasses. Unfortunately, we might be waiting until 2027.

That’s according to a leaked Meta internal roadmap shared by The Verge back in March 2023. The document explained that a precursor pair of specs with a display would apparently arrive in 2025, with ‘proper’ AR smart glasses due in 2027.

RayBan Meta Smart Glasses close up with the camera flashing

(Image credit: Meta)

In February 2024, Business Insider cited unnamed sources who said a pair of true AR glasses could be shown off at this year’s Meta Connect conference. However, that doesn’t mean they’ll launch sooner than 2027. While Connect does highlight soon-to-release Meta tech, the company also takes the opportunity to show off stuff coming further down the pipeline. So its demo of Project Orion (as those who claim to be in the know call it) could be one of those ‘you’ll get this when it’s ready’ teasers.

Obviously, leaks should be taken with a pinch of salt. Meta could have brought the release of its specs forward, or pushed it back, depending on a multitude of technological factors – we won’t know until Meta officially announces more details. But the fact that it has teased the specs suggests their release is at least a matter of when, not if.

Meta AR glasses: Specs and features

We haven't heard anything about the hardware you’ll find in Meta’s AR glasses, but we have a few ideas of what we’ll probably see from them based on Meta’s existing tech and partnerships.

Meta and LG recently confirmed that they’ll be partnering to bring OLED panels to Meta’s headsets, and we expect they’ll bring OLED screens to its AR glasses too. OLED displays appear in other AR smart glasses so it would make sense if Meta followed suit.

Additionally, we anticipate that Meta’s AR glasses will use a Qualcomm Snapdragon chipset just like Meta’s Ray-Ban smart glasses. Currently, that’s the AR1 Gen 1, though considering Meta’s AR specs aren’t due until 2027 it seems more likely they’d be powered by a next-gen chipset – either an AR2 Gen 1 or an AR1 Gen 2.

A Meta Quest 3 player sucking up Stay Puft Marshmallow Men from Ghostbusters in mixed reality using virtual tech extending from their controllers

The AR glasses could let you bust ghosts wherever you go (Image credit: Meta)

As for features, Meta’s already teased the two standouts: AR and AI abilities.

What this means in practical terms is yet to be seen, but imagine virtual activities like being able to set up an AR Beat Saber jam wherever you go, an interactive HUD when you’re navigating from one place to another, or interactive elements that you and other users can see and manipulate together – either for work or play.

AI-wise, Meta is giving us a sneak peek of what's coming via its current smart glasses. That is, you can speak to its Meta AI to ask it a variety of questions and for advice, just as you can with other generative AI, but in a more conversational way because you use your voice.

It also has a unique ability, Look and Ask, which is like a combination of ChatGPT and Google Lens. The specs snap a picture of what’s in front of you to inform your question, letting you ask them to translate a sign you can see, suggest a recipe using the ingredients in your fridge, or name a plant so you can find out how best to care for it.

The AI features are currently in beta but are set to launch properly soon. And while they seem a little imperfect right now, we’ll likely only see them get better in the coming years – meaning we could see something very impressive by 2027 when the AR specs are expected to arrive.

Meta AR glasses: What we want to see

A slick Ray-Ban-like design 

RayBan Meta Smart Glasses

The design of the Ray-Ban Meta Smart Glasses is great (Image credit: Meta)

While Meta’s smart specs aren't amazing in every way – more on that down below – they are practically perfect in the design department. The classic Ray-Ban shape is sleek, they’re lightweight, super comfy to wear all day, and the charging case is not only practical, it's gorgeous.

While it’s likely Ray-Ban and Meta will continue their partnership to develop future smart glasses – and by extension the teased AR glasses – there’s no guarantee. But if Meta’s reading this, we really hope that you keep working with Ray-Ban so that your future glasses have the same high-quality look and feel that we’ve come to adore.

If the partnership does end, we'd like Meta to at least take cues from what Ray-Ban has taught it to keep the design game on point.

Swappable lenses 

Orange RayBan Meta Smart Glasses in front of a wall of colorful lenses including green, blue, yellow and pink

We want to change our lenses Meta! (Image credit: Meta)

While we will rave about Meta’s smart glasses’ design, we’ll admit there’s one flaw that we hope future models (like the AR glasses) improve on: they need easily swappable lenses.

While a handsome pair of shades will be faultless for your summer vacations, they won’t serve you well in dark and dreary winters. If we could easily change our Meta glasses from sunglasses to clear lenses as needed then we’d wear them a lot more frequently – as it stands, they’re left gathering dust most months because it just isn’t the right weather.

As the glasses get smarter, more useful, and pricier (as we expect will be the case with the AR glasses) they need to be a gadget we can wear all year round, not just when the sun's out.

Speakers you can (quietly) rave to

JBL Soundgear Sense

These open-ear headphones are amazing – Meta, take notes (Image credit: Future)

Hardware-wise, the main upgrade we want to see in Meta’s AR glasses is better speakers. Currently, the speakers housed in each arm of the Ray-Ban Meta Smart Glasses are pretty darn disappointing – they can leak a fair amount of noise, the bass is practically nonexistent, and the overall sonic performance is put to shame by even basic over-ear headphones.

We know it can be a struggle to get the balance right with open-ear designs. But when we’ve been spoiled by open-ear options like the JBL SoundGear Sense – which have an astounding ability to deliver great sound while letting you hear the real world clearly (we often forget we’re wearing them) – we’ve come to expect a lot, and we’re disappointed when gadgets don’t deliver.

The camera could also get some improvements, but we expect the AR glasses won’t be as content creation-focused as Meta’s existing smart glasses – so we’re less concerned about this aspect getting an upgrade compared to their audio capabilities.


Meta teases its next big hardware release: its first AR glasses, and we’re excited

Meta’s Reality Labs division – the team behind its VR hardware and software efforts – has turned 10 years old, and to celebrate the company has released a blog post outlining its decade-long history. However, while a trip down memory lane is fun, the most interesting part came right at the end, as Meta teased its next major new hardware release: its first-ever pair of AR glasses.

According to the blog post, these specs would merge the currently distinct product pathways Meta’s Reality Labs has developed – specifically, melding its AR and VR hardware (such as the Meta Quest 3) with the form factor and AI capabilities of its Ray-Ban Meta Smart Glasses to, as Meta puts it, “deliver the best of both worlds.”

Importantly for all you Quest fans out there, Meta adds that its AR glasses wouldn’t replace its mixed-reality headsets. Instead, it sees them being the smartphones to the headsets’ laptop/desktop computers – suggesting that the glasses will offer solid performance in a sleek form factor, but with less oomph than you’d get from a headset.

Before we get too excited, though, Meta hasn’t said when these AR specs will be released – and unfortunately they might still be a few years away.

When might we see Meta’s AR glasses?

A report from The Verge back in March 2023 shared an apparent Meta Reality Labs roadmap that suggested the company wanted to release a pair of smart glasses with a display in 2025, followed by a pair of 'proper' AR smart glasses in 2027.

The Meta Quest 3 dangling down as a user looks towards a sunny window while holding it

We’re ready for Meta’s next big hardware release (Image credit: Meta)

However, while we may have to wait some time to put these things on our heads, we might get a look at them in the next year or so.

A later report that dropped in February this year, this time via Business Insider, cited unnamed sources who said a pair of true AR glasses would be demoed at this year’s Meta Connect conference. Dubbed 'Orion' by those who claim to be in the know, the specs would combine Meta’s XR (a catchall for VR, AR, and MR) and AI efforts – which is exactly what Meta described in its recent blog post.

As always, we should take rumors with a pinch of salt, but given that this latest teaser came via Meta itself it’s somewhat safe to assume that Meta AR glasses are a matter of when, not if. And boy are we excited.

We want Meta AR glasses, and we want ‘em now 

Currently Meta has two main hardware lines: its VR headsets and its smart glasses. And while it’s rumored to be working on new entries to both – such as a budget Meta Quest 3 Lite, a high-end Meta Quest Pro 2, and the aforementioned third-generation Ray-Ban glasses with a screen – these AR glasses would be its first big new hardware line since it launched the Ray-Ban Stories in 2021.

And the picture Meta has painted of its AR glasses is sublime.

Firstly, while Meta’s current Ray-Ban smart glasses aren’t yet the smartest, a lot of major AI upgrades are currently in beta – and should be launching properly soon.

Ray-Ban meta glasses up close

The Ray-Ban Meta Smart Glasses are set to get way better with AI (Image credit: Future / Philip Berne)

Its Look and Ask feature combines the intelligence of ChatGPT – or in this instance Meta’s in-house Meta AI – with the image-analysis abilities of an app like Google Lens. This apparently lets you identify animals, discover facts about landmarks, and plan a meal based on the ingredients you have – it all sounds very sci-fi, and actually useful, unlike some AI applications.

Now take those AI abilities and combine them with Meta’s first-class Quest platform, which is home to the best software and developers working in the XR space.

While many apps likely couldn’t be ported to the new system due to hardware restrictions – as the glasses might not offer controllers, will probably be AR-only, and might be too small to offer as powerful a chipset or as much RAM as its Quest hardware – we hope that plenty will make their way over. And Meta’s existing partners would plausibly develop all-new AR software to take advantage of the new system.

Based on the many Quest 3 games and apps we’ve tried, even if just a few of the best make their way to the specs, they’d help make Meta’s new product feel instantly useful – a factor that’s a must for any new gadget.

Lastly, we’d hopefully see Meta’s glasses adopt the single-best Ray-Ban Meta Smart Glasses feature: their design. These things are gorgeous, comfortable, and their charging case is the perfect combination of fashion and function. 

A closeup of the RayBan Meta Smart Glasses

We couldn’t ask for better-looking smart specs than these (Image credit: Meta)

Give us everything we have already design-wise, and throw in interchangeable lenses so we aren’t stuck with sunglasses all year round – which in the UK, where I'm based, are only usable for about two weeks a year – and the AR glasses could be perfect.

We’ll just have to wait and see what Meta shows off, either at this year’s Meta Connect or in the future – and as soon as they're ready for prime time, we’ll certainly be ready to test them.


Meta’s Smart Glasses will get a sci-fi upgrade soon, but they’re still not smart enough

There's a certain allure to smart glasses that bulky mixed-reality headsets lack. Meta's Ray-Ban Smart Glasses (formerly Stories), for instance, are a perfect illustration of how you can build smarts into a wearable without making the wearer look ridiculous. The question is, can you still end up being ridiculous while wearing them?

Ray-Ban Meta Smart Glasses' big upcoming Meta AI update will let you talk to your stylish frames, querying them about the food you're consuming, the buildings you're facing, and the animals you encounter. The update is set to transform the wearable from just another pair of voice-enabled glasses into an always-on-your-face assistant.

The update isn't public yet, and it will only apply to the Ray-Ban Meta Smart Glasses, not their Ray-Ban Stories predecessors, which don't feature Qualcomm's new AR1 Gen 1 chip. This week, however, Meta gave a couple of tech reporters at The New York Times early access to the Meta AI integration, and they came away somewhat impressed.

I must admit, I found the walkthrough more intriguing than I expected.

Even though they didn't tear the glasses apart, or get into the nitty gritty tech details I crave, the real-world experience depicts Meta AI as a fascinating and possibly useful work in progress.

Answers and questions

In the story, the authors use the Ray-Ban smart glasses to ask Meta AI to identify a variety of animals, objects, and landmarks, with varying success. In the confines of their homes, they spoke at full volume and asked Meta AI, “What am I looking at?” They also enabled transcription so we could see what they asked and the responses Meta AI provided.

It was, in their experience, quite good at identifying their dogs' breed. However, when they took the smart glasses to the zoo, Meta AI struggled to identify far-away animals. In fact, Meta AI got a lot wrong. To be fair, this is a beta, and I wouldn't expect the large language model (Llama 2) to get everything right. At least it's not hallucinating (“that's a unicorn!”), just getting things wrong.

The story features a lot of photos taken with the Ray-Ban Meta Smart Glasses, along with the queries and Meta AI's responses. Of course, that's not really how the interaction played out. As the authors note, they were speaking to Meta AI wherever they went and then heard the responses spoken back to them. This is all well and good when you're at home, but just weird when you're alone at a zoo talking to yourself.

The creep factor

This, for me, remains the fundamental flaw in many of these wearables. Whether you wear Ray-Ban Smart Glasses or Amazon Echo Frames, you'll still look as if you're talking to yourself. For a decent experience, you may have to engage in a lengthy “conversation” with Meta AI to get the information you need. Again, if you're doing this at home, letting Meta AI help you through a detailed recipe, that's fine. Using Meta AI as a tour guide when you're in the middle of, say, your local Whole Foods might label you as a bit of an oddball.

We do talk to our best phones and even our best smartwatches, but I think that when people see you holding your phone or smartwatch near your face, they understand what's going on.

The New York Times' authors noted how they found themselves whispering to their smart glasses, but they still got looks.

I don't know a way around this issue and wonder if this will be the primary reason people swear off what is arguably a very good-looking pair of glasses (or sunglasses) even if they could offer the passive smart technology we need.

So, I'm of two minds. I don't want to be seen as a weirdo talking to my glasses, but I can appreciate having intelligence there and ready to go; no need to pull my phone out, raise my wrist, or even tap a smart lapel pin. I just say, “Hey Meta” and the smart glasses wake up, ready to help.

Perhaps the tipping point here will be when Meta can integrate very subtle AR screens into the frames that add some much-needed visual guidance. Plus, the access to visuals might cut down on the conversation, and I would appreciate that.


Leaked Apple roadmap hints at iPhone SE 4, foldable iPhone, and AR glasses launch dates

We've heard plenty of rumors about the iPhone SE 4, the foldable iPhone, and the Apple AR glasses, and now a leaked roadmap has given us a better idea of when we might actually see these devices get launched.

The document, apparently from finance company Samsung Securities, was leaked by well-known tipster @Tech_Reve (via Wccftech). It offers an overview of what's on the way from Apple for the next few years, up until 2027.

It's in 2027 that we'll apparently get the augmented reality glasses. We've not heard much about the specs in recent months, with the Apple Vision Pro taking most of the attention when it comes to AR and VR (or mixed reality, if you prefer). We're also, it seems, getting a cheaper Vision Pro sometime in 2026.

A foldable 20-inch iPad is slated to arrive in 2027, with the foldable 8-inch iPhone turning up a year before. That's somewhat in opposition to recent rumors that said the foldable iPad would turn up first – though considering a foldable iPhone would be about the size of an iPad mini anyway, there may be some confusion over which product is which.

Coming soon


There's also a mention of the long-rumored OLED MacBook in 2026, and then, looking at next year, the iPhone SE 4 makes an appearance. That matches up with a rumor from last month that pointed to an early 2025 launch for the mid-ranger – with a switch to a more modern design and an OLED display also being talked about.

As for the rest of this year, it looks very much as though we'll get an 11-inch iPad Pro and a 12.9-inch iPad Pro, both running OLED screens. Most tipsters have predicted a 2024 launch for these tablets, and they could show up any day now (though you might have to get your orders in quickly for the 11-inch version).

The usual caveats about leaks and rumors apply: these dates might not be completely accurate, and even if they are, Apple's plans can always change. That said, this roadmap does match up nicely with other bits of information that have leaked out.

If Apple does indeed launch new iPads in the near future, the next big announcements to expect will be about iOS 18, artificial intelligence, and Apple's other software. That will be at WWDC (the Worldwide Developers Conference) 2024, happening sometime in June.


Meta’s Ray-Ban smart glasses are becoming AI-powered tour guides

While Meta’s most recognizable hardware is its Quest VR headsets, its smart glasses created in collaboration with Ray-Ban are proving to be popular thanks to their sleek design and unique AI tools – tools that are getting an upgrade to turn them into a wearable tourist guide.

In a post on Threads – Meta’s Twitter-like Instagram spinoff – Meta CTO Andrew Bosworth showed off a new Look and Ask feature that can recognize landmarks and tell you facts about them. Bosworth demonstrated it using examples from San Francisco such as the Golden Gate Bridge, the Painted Ladies, and Coit Tower.

As with other Look and Ask prompts, you give a command like “Hey Meta, look and tell me a cool fact about this bridge.” The Ray-Ban Meta Smart Glasses then use their in-built camera to scan the scene in front of you, and cross-reference the image with info in Meta AI’s knowledge database (which includes access to the Bing search engine).

The specs then respond with the cool fact you requested – in this case explaining that the Golden Gate Bridge (which it recognized in the photo it took) is painted “International Orange” so that it would be more visible in foggy conditions.

Screenshots from Threads showing the Meta Ray-Ban Smart Glasses being used to give the user information about San Francisco landmarks

(Image credit: Andrew Bosworth / Threads)

Bosworth added in a follow-up message that other improvements are being rolled out, including new voice commands so you can share your latest Meta AI interaction on WhatsApp and Messenger. 

Down the line, Bosworth says you’ll also be able to change the speed of Meta AI readouts in the voice settings menu to have them go faster or slower.

Still not for everyone 

One huge caveat is that – much like the glasses’ other Look and Ask AI features – this new landmark recognition feature is still only in beta. As such, it might not always be the most accurate – so take its tourist guidance with a pinch of salt.

Orange RayBan Meta Smart Glasses

(Image credit: Meta)

The good news is that Meta has at least opened up its beta waitlist, so more of us can try these experimental features. Go to the official page, input your glasses’ serial number, and wait to be contacted – though this option is only available if you’re based in the US.

In his post Bosworth did say that the team is working to “make this available to more people,” but neither he nor Meta has given a precise timeline for when the impressive AI features will be more widely available.
