Meta AR glasses: everything we know about the AI-powered AR smart glasses

After a handful of rumors and speculation suggested Meta was working on a pair of AR glasses, it unceremoniously confirmed that Meta AR glasses are on the way – doing so via a short section at the end of a blog post celebrating the 10th anniversary of Reality Labs (the division behind its AR/VR tech).

While not much is known about them, the glasses were described as a product merging Meta’s XR hardware with its developing Meta AI software to “deliver the best of both worlds” in a sleek wearable package.

We’ve collected all the leaks, rumors, and some of our informed speculation in this one place so you can get up to speed on everything you need to know about the teased Meta AR glasses. Let’s get into it.

Meta AR glasses: Price

We’ll keep this section brief as right now it’s hard to predict how much a pair of Meta AR glasses might cost because we know so little about them – and no leakers have given a ballpark estimate either.

Current smart glasses like the Ray-Ban Meta Smart Glasses or the Xreal Air 2 AR smart glasses will set you back between $300 and $500 / £300 and £500 / AU$450 and AU$800; Meta’s teased specs, however, sound more advanced than what we have currently.

Lance Ulanoff showing off Google Glass

Meta’s glasses could cost as much as Google Glass (Image credit: Future)

As such, the Meta AR glasses might cost nearer $1,500 (around £1,200 / AU$2,300) – which is what the Google Glass smart glasses launched at.

A higher price seems more likely given the AR glasses’ novelty, and the fact that Meta would need to create small yet powerful hardware to cram into them – a combo that typically leads to higher prices.

We’ll have to wait and see what gets leaked and officially revealed in the future.

Meta AR glasses: Release date

Unlike price, several leaks have pointed to when we might get our hands – or I suppose eyeballs – on Meta’s AR glasses. Unfortunately, we might be waiting until 2027.

That’s according to a leaked Meta internal roadmap shared by The Verge back in March 2023. The document explained that a precursor pair of specs with a display would apparently arrive in 2025, with ‘proper’ AR smart glasses due in 2027.

RayBan Meta Smart Glasses close up with the camera flashing

(Image credit: Meta)

In February 2024 Business Insider cited unnamed sources who said a pair of true AR glasses could be shown off at this year’s Meta Connect conference. However, that doesn’t mean they’ll launch sooner than 2027. While Connect does highlight soon-to-release Meta tech, the company also takes the opportunity to show off stuff coming further down the pipeline. So its demo of Project Orion (as those who claim to be in the know call it) could be one of those ‘you’ll get this when it’s ready’ kind of teasers.

Obviously, leaks should be taken with a pinch of salt. Meta could have brought the release of its specs forward, or pushed it back, depending on a multitude of technological factors – we won’t know until Meta officially announces more details. That said, the fact that it has teased the specs suggests their release is at least a matter of when, not if.

Meta AR glasses: Specs and features

We haven't heard anything about the hardware you’ll find in Meta’s AR glasses, but we have a few ideas of what we’ll probably see from them based on Meta’s existing tech and partnerships.

Meta and LG recently confirmed that they’ll be partnering to bring OLED panels to Meta’s headsets, and we expect that partnership will extend to OLED screens in its AR glasses too. OLED displays already appear in other AR smart glasses, so it would make sense if Meta followed suit.

Additionally, we anticipate that Meta’s AR glasses will use a Qualcomm Snapdragon chipset, just like Meta’s Ray-Ban smart glasses. Currently, that’s the AR1 Gen 1, though considering Meta’s AR specs aren’t due until 2027, it seems more likely they’d be powered by a next-gen chipset – either an AR2 Gen 1 or an AR1 Gen 2.

A Meta Quest 3 player sucking up Stay Puft Marshmallow Men from Ghostbusters in mixed reality using virtual tech extending from their controllers

The AR glasses could let you bust ghosts wherever you go (Image credit: Meta)

As for features, Meta’s already teased the two standouts: AR and AI abilities.

What this means in actual terms is yet to be seen, but imagine virtual activities like being able to set up an AR Beat Saber jam wherever you go, an interactive HUD when you’re navigating from one place to another, or interactive elements that you and other users can see and manipulate together – either for work or play.

AI-wise, Meta is giving us a sneak peek of what's coming via its current smart glasses. That is, you can speak to Meta AI to ask it a variety of questions and for advice, just as you can with other generative AI tools, but in a more conversational way because you’re using your voice.

It also has a unique ability, Look and Ask, which is like a combination of ChatGPT and Google Lens. This lets the specs snap a picture of what’s in front of you to inform your question, allowing you to ask it to translate a sign you can see, suggest a recipe using the ingredients in your fridge, or name a plant so you can find out how best to care for it.

The AI features are currently in beta but are set to launch properly soon. And while they seem a little imperfect right now, we’ll likely only see them get better in the coming years – meaning we could see something very impressive by 2027 when the AR specs are expected to arrive.

Meta AR glasses: What we want to see

A slick Ray-Ban-like design 

RayBan Meta Smart Glasses

The design of the Ray-Ban Meta Smart Glasses is great (Image credit: Meta)

While Meta’s smart specs aren't amazing in every way – more on that down below – they are practically perfect in the design department. The classic Ray-Ban shape is sleek, they’re lightweight, super comfy to wear all day, and the charging case is not only practical, it's gorgeous.

While it’s likely Ray-Ban and Meta will continue their partnership to develop future smart glasses – and by extension the teased AR glasses – there’s no guarantee. But if Meta’s reading this, we really hope that you keep working with Ray-Ban so that your future glasses have the same high-quality look and feel that we’ve come to adore.

If the partnership does end, we'd like Meta to at least take cues from what Ray-Ban has taught it to keep the design game on point.

Swappable lenses 

Orange RayBan Meta Smart Glasses in front of a wall of colorful lenses including green, blue, yellow and pink

We want to change our lenses Meta! (Image credit: Meta)

While we will rave about Meta’s smart glasses design, we’ll admit there’s one flaw that we hope future models (like the AR glasses) improve on: they need easily swappable lenses.

While a handsome pair of shades will be faultless for your summer vacations, they won’t serve you well in dark and dreary winters. If we could easily change our Meta glasses from sunglasses to clear lenses as needed then we’d wear them a lot more frequently – as it stands, they’re left gathering dust most months because it just isn’t the right weather.

As the glasses get smarter, more useful, and pricier (as we expect will be the case with the AR glasses) they need to be a gadget we can wear all year round, not just when the sun's out.

Speakers you can (quietly) rave to

JBL Soundgear Sense

These open ear headphones are amazing, Meta take notes (Image credit: Future)

Hardware-wise, the main upgrade we want to see in Meta’s AR glasses is better speakers. Currently, the speakers housed in each arm of the Ray-Ban Meta Smart Glasses are pretty darn disappointing – they can leak a fair amount of noise, the bass is practically nonexistent, and the overall sonic performance is put to shame by even basic over-ear headphones.

We know open-ear designs are a struggle to get right. But when we’ve been spoiled by open-ear options like the JBL SoundGear Sense – which have an astounding ability to deliver great sound while letting you hear the real world clearly (we often forget we’re wearing them) – we’ve come to expect a lot and are disappointed when gadgets don’t deliver.

The camera could also get some improvements, but we expect the AR glasses won’t be as content creation-focused as Meta’s existing smart glasses – so we’re less concerned about this aspect getting an upgrade compared to their audio capabilities.


Meta Quest Pro 2: everything we know about the Apple Vision Pro competitor

Meta’s Quest 3 may be less than a year old, but Meta appears to be working on a few follow-ups. Leaks and rumors point to the existence of a Meta Quest 3 Lite – a cheaper version of the Meta Quest 3 – and a Meta Quest Pro 2 – a follow-up to the high-end Meta Quest Pro.

The original Meta Quest Pro doesn’t seem to have been all that popular – evidenced by the fact its price was permanently cut by a third less than six months after its launch – but the Apple Vision Pro seems to have fueled a renaissance of high-end standalone VR hardware. This means we’re getting a Samsung XR headset (developed in partnership with Google), and most likely a Meta Quest Pro 2 of some kind.

While one leak suggested the Meta Quest Pro 2 had been delayed – after Meta cancelled a project that the leak suggested was set to be the next Quest Pro – there’s more than a little evidence that the device is on the way. Here’s all of the evidence, as well as everything you need to know about the Meta Quest Pro 2 – including some of our insight, and the features we’d most like to see it get.

Meta Quest Pro 2: Price

Because the Meta Quest Pro 2 hasn’t been announced we don’t know exactly how much it’ll cost, but we expect it’ll be at least as pricey as the original, which launched at $1,499.99 / £1,499.99 / AU$2,449.99.

The Meta Quest Pro being worn by a person in an active stance

(Image credit: Meta)

The Meta Quest Pro was permanently discounted to $999.99 / £999.99 / AU$1,729.99 five months after it launched, but we expect this was Meta attempting to give the Quest Pro a much-needed sales boost rather than an indication of the headset’s actual cost. So we expect the Quest Pro 2 will cost considerably more than that.

What’s more, given that the device is expected to be more of an Apple Vision Pro competitor – a headset that costs $3,500 (around £2,800 / AU$5,350) – with powerful specs, LG-made OLED panels, and possibly next-gen mixed reality capabilities, there’s a good chance it could cost more than its predecessor.

As such, we’re expecting it to come in nearer $2,000 / £2,000 / AU$3,000. Over time, and as more leaks about the hardware come out, we should start to get a better idea of its price – though as always we won’t know for certain how much it’ll cost until Meta says something officially.

Meta Quest Pro 2: Release date

The Meta Quest 3 on a notebook surrounded by pens and school supplies on a desk

The Meta Quest 3 (Image credit: Meta)

Meta hasn’t announced the Quest Pro 2 yet – or even teased it. Given its usual release schedule, this means the earliest we’re likely to see a Pro model is October 2025; that’s because Meta would tease the device at this year’s Meta Connect in September/October 2024, and then launch it at the following year’s event, as it did with the original Quest Pro and Quest 3.

But there are a few reasons we could see it launch sooner or later. On the later release date side of things we have the rumored Meta Quest 3 Lite – a cheaper version of the Meta Quest 3. Meta may want to push this affordable model out the gate sooner rather than later, meaning that it might need to take a release slot that could have been used by the Quest Pro 2.

Alternatively, Meta may want to push a high-end model out ASAP so as not to let the Apple Vision Pro and others like the Samsung XR headset corner the high-end VR market. If this is the case, it could forgo its usual tease-then-release strategy and just release the headset later this year – or tease it at Connect 2024 then launch it in early 2025, rather than a year later in late 2025 as it usually would.

This speculation all assumes a Meta Quest Pro 2 is even on the way – though Meta has strongly suggested that another Pro model would come in the future; we’ll just have to wait and see what’s up its sleeve.

Meta Quest Pro 2: Specs

Based on LG and Meta’s announcement of their official partnership to bring OLED displays to Meta VR headsets in the future, it’s likely that the Meta Quest Pro 2 will feature OLED screens. While these kinds of displays are typically pricey, the Quest Pro 2 is expected to be a high-end model (with a high price tag), and boasting OLED panels would put it on par with other high-end XR products like the Apple Vision Pro.

Key Snapdragon XR2 Plus Gen 2 specs, including support for 4.3K displays, 8x better AI performance, and 2.5x better GPU performance

(Image credit: Qualcomm)

It also seems likely the Meta Quest Pro 2 will boast a Snapdragon XR2 Plus Gen 2 chipset – the successor to the Gen 1 used by the Quest Pro. If it launches further in the future than we expect, it might instead boast a currently unannounced Gen 3 model.

While rumors haven’t teased any other specs, we also assume the device would feature full-color mixed reality like Meta’s Quest 3 and Quest Pro – though ideally the passthrough would be higher quality than either of these devices (or at least, better than the Quest Pro’s rather poor mixed reality).

Beyond this, we predict the device would have specs at least as good as its predecessor. By that we mean we expect the base Quest Pro 2 would come with 12GB of RAM, 256GB of storage and a two-hour minimum battery life.

Meta Quest Pro 2: What we want to see

We’ve already highlighted in depth what we want to see from the Meta Quest Pro 2 – namely it should ditch eye-tracking and replace it with four different features. But we’ll recap some of those points here, and make a few new ones of things we want to see from the Quest Pro 2.

Vastly better mixed-reality passthrough, more entertainment apps, and 4K OLED displays would go a long way to making the Meta Quest Pro 2 feel a lot more like a Vision Pro competitor – so we hope to see them on the Quest Pro 2.

Eye-tracking could also help, but Meta really needs to prove it’s worthwhile. So far every instance of the tech feels like an expensive tech demo for a feature that’s neat, but not all that useful.

The Meta Quest Pro being worn by Hamish Hector, his cheeks are puffed up

What we want from the next Quest Pro (Image credit: Meta)

Ignoring specs and design for a second, our most important hope is that the Quest Pro 2 isn’t as prohibitively expensive as the Apple Vision Pro. While the Vision Pro is great, $3,500 is too much even for a high-end VR headset when you consider the realities of how, and how often, the device will be used. Ideally, the Quest Pro 2 would be at most $2,000 / £2,000 / AU$3,000, though until we know more about its specs we won’t know how realistic our request is.

Lastly, we hope the device is light, perhaps with a removable battery pack like the one seen in the HTC Vive XR Elite. This would allow someone who wants to work at their desk or sit back and watch a film in VR to wear a much lighter device for an extended period of time (provided they’re near a power source). Alternatively, they could plug the battery in and enjoy a typical standalone VR experience – to us this would be a win-win.


Meta Quest 3 Lite: everything we know about the rumored cheap VR headset

Based on the leaks and rumors, it seems increasingly likely that Meta is working on a cheaper version of the Meta Quest 3 – expected to be called the Meta Quest 3 Lite or Meta Quest 3s.

It’s not yet been confirmed, but the gadget is expected to be a more affordable version of the Quest 3 – at a price closer to the Quest 2 – that would see Meta fully phase out its last-gen VR hardware. The trade-off is that the device wouldn’t have all the capabilities of the Quest 3 – likely sporting lower-resolution displays, less RAM, a worse chipset, or dropping mixed reality support (though that last point seems unlikely).

While we’re not convinced the gadget will look exactly like what’s been rumored so far, as the saying goes: where there’s smoke there’s fire. The fact that several independent leaks have emerged suggests Meta is working on something.

We’ve collected the latest news and rumors here so this page can serve as your one-stop shop for all things Meta Quest 3 Lite. As we learn more about the device we’ll be sure to update the page and keep you in the loop with all the latest information.

Meta Quest 3 Lite: Latest news

We’ve seen not one, but two distinct Meta Quest 3 Lite leaks – one render called the Meta Quest 3 Lite and one with more details that the leaker called the Quest 3s.

The Oculus Quest 2 was also at a record low price ($200 / £200) as part of this year's Amazon Spring Sale, following a permanent price cut to $249.99 / £249.99 / AU$439.99 earlier this year. This could be a sign Meta and retailers are trying to shift stock ahead of the last-gen device being phased out before a Quest 3 Lite release.

Oculus Quest 2 on a white background

Is the Quest 3 Lite the true Quest 2 replacement? (Image credit: Shutterstock / Boumen Japet)

Meta Quest 3 Lite: Price

As the Meta Quest 3 Lite isn’t yet official – meaning Meta itself hasn’t confirmed (or denied) its existence – we can’t say for certain how much it’ll cost or when it will be released.

But based on rumors and previous Meta hardware releases, we can make some reasoned predictions on what the gadget might cost and when we could see it in action.

Price-wise, we can reasonably expect it’ll cost around the same as Meta’s last-gen headset, given the Lite is billed as a super-affordable model meant to fully replace the Oculus Quest 2. It’ll certainly cost less than the Meta Quest 3.

This would likely see it released at around $299 / £299 / AU$479, which is where the Quest 2 started life. Honestly, we’d be more than a little disappointed if it was more expensive.

A man using his Zenni customized Meta Quest 3 headset

The Meta Quest 3 could soon have a sibling (Image credit: Zenni)

Meta Quest 3 Lite: Release date

As for the Quest 3 Lite’s release date, Meta usually likes to release new hardware in October. However, it might decide to mix things up with this budget-friendly gadget to avoid confusing it with its mainline Quest and Quest Pro lines.

We predict the Quest 3 Lite will be announced and released as part of this year’s Meta Quest Gaming Showcase, which should be around June based on previous years. 

If Meta sticks to its usual hardware release schedule, though, then a launch after this year’s Meta Connect – which we expect will land in September or October – could be on the cards.

Of course, this assumes the Meta Quest 3 Lite even launches at all.

The Meta Quest 3 in action

The Meta Quest 3 Lite will likely look a little different to the Quest 3 (Image credit: Meta)

Meta Quest 3 Lite: Specs and design

So far we haven’t heard many specs for the Meta Quest 3 Lite. The main leaks so far have been renders showing off its possible design.


These leaks suggest it’ll be bulkier than the Quest 3, likely because the Lite would adopt the Fresnel lens system used by the Quest 2. This makes some sense, as Fresnel lenses are cheaper, partly because the alternative pancake lenses require brighter displays. However, considering pancake lenses lead not only to a slimmer headset design but also better image quality (and we’ve seen cheap headsets like the Pico 4 use pancake lenses), we’d be surprised if Meta didn't use them in the Lite.

One of the leaks went into more detailed specs, suggesting it’ll have 128GB or 256GB of storage (instead of the 128GB or 512GB in the Quest 3) and 1,832 x 1,920 pixel displays (one per eye). Something seems off about the leak, though, in terms of the assets shared and the included info that could help identify the leaker (which seems like a bad idea for anyone trying to avoid the wrath of Meta’s well-funded legal team).  


As such, color us skeptical when it comes to the details highlighted in the post.

Meta Quest 3 Lite: Software

Assuming the Meta Quest 3 Lite has the same or similar mixed-reality capabilities as the Meta Quest 3, we expect it’ll have access to all of the same software – which is to say, everything available on the Quest platform’s Store (and many other games and apps available through sideloading via third-party digital storefronts).

If it has significantly worse specs – such as the Quest 2’s Snapdragon XR2 Gen 1 chipset – there may be some software that launches in the future that would be exclusive to the full Quest 3. But we expect the Quest 3 Lite would use a Snapdragon XR2 Gen 2 so, hopefully, this won’t be an issue.

We’ll have to wait and see what Meta announces.

Girl wearing Meta Quest 3 headset interacting with a jungle playset

The Meta Quest 3 Lite needs to have mixed reality (Image credit: Meta)

Meta Quest 3 Lite: What we want to see

As for what we want to see from the Quest 3 Lite VR headset – acknowledging that its lower price will necessitate lower specs than the Meta Quest 3 proper – our ideal setup would boast the same Snapdragon XR2 Gen 2 chipset and 8GB of RAM as the Quest 3, though 6GB of RAM like the Quest 2 is, admittedly, a lot more likely. 

Storage options would start at 64GB – as frankly, you don’t need a lot of storage space for VR apps, especially if you’re willing to download and delete them as necessary – and the displays would be a lower resolution than the Quest 3. A leak suggested the 1,832 x 1,920 pixels per eye option, and considering this is what’s used by the Quest 2 it does make some sense.

Pancake lenses seem like an easy win from a design and image-quality perspective (especially if Meta opts for poorer displays), and mixed-reality passthrough that’s at least as high-quality as the Quest 3 is also a must.

Beyond this, one rogue cost-cutting measure could see Meta scrap or change its Quest 3 controllers. However, given how much developers have emphasized to us the importance of VR handsets having a standard design, and the fact that many Quest titles don’t support hand-tracking, this might be a step too far.


Tired of Windows File Explorer? This app makes it way easier to navigate everything on your PC

If you think that Windows 11’s File Explorer could be better, you’re not alone – and there’s a popular third-party alternative, the Files app. The Files app (which, despite its name, has no relation to Microsoft’s own File Explorer) just got an upgrade that makes it an even better tool for navigating your file systems, with the latest version of the app allowing users to navigate big folders more easily.

The Files app’s 3.2 update brings user interface (UI) improvements like a list view layout for files and folders, the ability to edit the album covers of media files via folder properties, and support for higher-quality thumbnails. Alongside the UI changes, users can also expect plenty of fixes and general polish.

According to Windows Central, the Files app’s occasional instability while handling large folders was one of the biggest user complaints, and this update addresses that, too. The app should now behave better when users point it at bigger folders.

A young woman is working on a laptop in a relaxed office space.

(Image credit: Getty Images)

How the Files app measures up as a file explorer 

Windows Central does state that it doesn’t think the Files app is quite ready to completely replace the default Windows File Explorer, but that “it can be a powerful and useful companion app.” It offers unique features that File Explorer itself doesn’t, and, to many users, it has a sleeker look. The app is available for both Windows 10 and Windows 11, but its performance can vary from system to system. Windows Central writes of its own investigation into the Files app’s performance, and it does report that the app has issues with performance and stability on some PCs. You can check the full changelog of what Files version 3.2 delivers if you’d like to know more.

Many users would like to see Windows’ old File Explorer include many of the Files app’s features, and maybe Microsoft is watching. It recently released its own proprietary PC Cleaner app, a system cleaner tool that offers lots of the tools of popular paid third-party system cleaners for free. Also, Microsoft has been on the receiving end of some heat from industry professionals and competitors, as well as from regulators in the European Union following the recent introduction of the Digital Markets Act (DMA). Offering tools like PC Cleaner and a souped-up File Explorer could be a way for it to win back some user trust and goodwill.

The existence of third-party apps like this is good for users twofold: it can motivate first-party developers to improve their products faster, and it gives users more choice over how they use their devices. The Files app looks like it sees regular updates and improvements, and it sounds like it could well be worth users’ while, given that it has no malware issues – provided you get good performance once it’s installed.

If you’d like to try out Files for yourself, bear in mind that it isn’t free: the app comes with a one-time charge of $8.99 / £7.49, although thankfully there aren’t any subscription fees. You can download it directly from the Microsoft Store.


ChatGPT could become a smart personal assistant helping with everything from work to vacation planning

Now that ChatGPT has had a go at composing poetry, writing emails, and coding apps, it's turning its attention to more complex tasks and real-world applications, according to a new report – essentially, being able to do a lot of your computing for you.

This comes from The Information (via Android Authority), which says that ChatGPT developer OpenAI is working on “agent software” that will act almost like a personal assistant. It would be able to carry out clicks and key presses as it works inside applications from web browsers to spreadsheets.

We've seen something similar with the Rabbit R1, although that device hasn't yet shipped. You teach an AI how to calculate a figure in a spreadsheet, or format a document, or edit an image, and then it can do the job for you in the future.

Another type of agent in development will take on online tasks, according to the sources speaking to The Information: These agents are going to be able to research topics for you on the web, or take care of hotel and flight bookings, for example. The idea is to create a “supersmart personal assistant” that anyone can use.

Our AI agent future?

The Google Gemini logo on a laptop screen that's on an orange background

Google is continuing work on its own AI (Image credit: Google)

As the report acknowledges, this will certainly raise one or two concerns about letting automated bots loose on people's personal computers: OpenAI is going to have to do a lot of work to reassure users that its AI agents are safe and secure.

While many of us will be used to deploying macros to automate tasks, or asking Google Assistant or Siri to do something for us, this is another level up. Your boss isn't likely to be too impressed if you blame a miscalculation in the next quarter's financial forecast on the AI agent you hired to do the job.

It also remains to be seen just how much automation people want when it comes to these tasks: Booking vacations involves a lot of decisions, from the position of your seats on an airplane to having breakfast included, which AI would have to make on your behalf.

There's no timescale on any of this, but it sounds like OpenAI is working hard to get its agents ready as soon as possible. Google just announced a major upgrade to its own AI tools, while Apple is planning to reveal its own take on generative AI at some point later this year, quite possibly with iOS 18.


Two days with Vision Pro: Apple’s almost convinced me to part with $3,500 by transforming everything I do

Whatever you've heard or read about Apple's new Apple Vision Pro mixed reality headset, nothing quite prepares you for seeing it in person, putting it on, and experiencing for the first time Apple's vision for spatial computing. You realize quite quickly that this is more than a marketing term; it's a new approach to the digital experience.

I'm still getting a feel for the glass, aluminum, and fabric system, but I thought I'd start by sharing my first hours with the $3,499 (to start), US-only mixed reality headset. It was mostly smooth sailing with one early, albeit tiny, bump in the road.

Apple Vision Pro box

Apple Vision Pro box (Image credit: Future)

A package arrives

January 30th 4:30 PM:

The box arrives! It's large because Apple sent me both the 1TB Apple Vision Pro ($3,899) and a carrying case ($199). Inside is a tall white box that reminds me of oversized iPhone packaging. I mean, it is different, but also oddly familiar – at least on the outside.

The carrying case looks like it might be more at home on the moon. A covering I initially took for packaging is the case's Apollo-mission space-suit-like material. I quickly put the case aside so I could get to the business of unboxing the fruits of Apple's first new product category in almost a decade.

While it's not remotely cramped, there is a lot in the Vision Pro box. First is the spatial computer itself, nestled comfortably inside with its Solo Knit Band already attached. Every accessory is wrapped in Apple-ly cardboard. There's the Dual Loop Band, which can replace the Solo Knit Band and potentially offer more support for the 1.3lb. headset. The bands are easy to swap but I'm determined to try wearing the Vision Pro with the default gear (though in most of my previous brief demos, I preferred the Dual Loop and wish Apple had created a hybrid that combines the Solo Knit with a top loop band).

There's an extra Light Seal Cushion. They come in a few sizes, but I have to use the thicker one because I'll be wearing the Vision Pro with my optional custom Zeiss lens inserts (an extra $149).

There's a cover to protect the Vision Pro's lustrous glass front, and a cleaning cloth to wipe away the smudges that instantly appear when you pick it up.

There's the battery which is attached to a cable that runs to a proprietary power port on the Vision Pro. While some might think it odd that Apple didn't simply go with a USB-C charge port, I think that would stick too far out from the headset and look more awkward than the battery-power solution Apple cooked up. 

There's also a USB-C cable and power adapter to charge the battery. 

What comes in the Apple Vision Pro box

What comes in the box. (Image credit: Future)

Unboxing Vision Pro

5:00 PM ET

I unbox the Vision Pro during a TikTok live stream. While doing so, I realize that Apple still has my Zeiss lens inserts. Without them, the visuals in the headset will be blurry. I decide to plug in the battery to charge it up while I wait for the Zeiss lenses to arrive.

In the meantime, I examine the Vision Pro and practice swapping the Solo Knit for the Dual Loop Band. It's an easy process because, like almost everything else on the Vision Pro, the bands are held in place mostly by magnets or magnetized posts. Things easily pop off. I noticed that if I picked up the wrong part of the Vision Pro, the whole light seal would pop off. Again, super easy to put back on.

I pop one light seal foam off and put the thinner one on to see how it looks and feels. The difference between the two is barely perceptible.

6:00 PM ET

Time to take some photos of the Vision Pro

Apple Vision Pro Review (Image credit: Future)

7:15PM ET

My custom Zeiss lenses arrive. Now the fun begins. To get started, I connect the power cable to the side of the Vision Pro. It's a push-and-turn operation, similar to how you might mount a lens on a DSLR. It's easy (very little with the Vision Pro isn't easy). Next, I insert my lenses, which are clearly marked left and right and, like everything else, snap in with strong magnets. These lenses are not going anywhere.

Apple Vision Pro Review (Image credit: Future)

Setup is familiar

Vision Pro starts by teaching you about using Vision Pro (there's also a nice booklet-sized, color manual to help you get started). It explains the eye tracking and subtle gestures you use to control the device. I think Apple did a good job here. 

There are a few steps to go through to get set up, including making sure the pupillary distance is right (just a press of the digital crown), scanning my Zeiss lens code, scanning a code with my phone to get the headset paired with my iPhone and set up with my Apple ID details, scanning the fronts and backs of my hands, and staring at a circle of dots (three sets) while pinching my thumb and index finger, which calibrates the system.

The headset also asks if I want to set up Optic ID, which registers my iris for some security and commerce functions but, though I try multiple times, I can't get it to work.

I start by using the Solo Knit Band, which means the headset is fairly tight on my face. However, the back of the band is, at least initially, more comfortable than the Dual Loop Band.

As with any VR or mixed reality headset, there are prominent safety reminders including, Stay Aware of Your Surroundings, Use in Safe Areas, and Take Frequent Breaks.

It's during the setup that I learn that Vision Pro is not intended for kids, or at least anyone under 13.

Meet my Persona

My Vision Pro Persona

My Vision Pro Persona (Image credit: Future)

You can't get around creating a Persona, which is a digital representation of you that will be used in things like FaceTime and Zoom calls, so you don't have to appear on camera wearing the headset and looking ridiculous (I did this once or twice).

Vision Pro guides me to take off the headset, and then use the system's 3D cameras to capture my face (left side, right side, top, bottom), as well as a couple of expressions. It takes less than a minute for Vision Pro to build my Persona (the system is still in beta, by the way).

I decide to slide the battery pack into my front pocket.

With questions about transferring existing data, keeping the device up to date, sharing audio recordings with Apple, and Apple Pay and card setup, this is a lot like setting up an iPhone. You go through virtually all the same steps.

I make a FaceTime call to my wife in the other room. Her reaction to my digital persona is not exactly enthusiastic. She calls it disturbing. My son says it reminds him of one of those AI avatars in sci-fi movies that can only answer questions they've been pre-programmed to answer (see iRobot for reference). I ask my wife to grab some screenshots and send them to me (see above).

I think it did a decent job, though Apple appears to have shaved my goatee and fixed my teeth, the latter of which I do not mind.

7:35PM ET

The visuals are still pretty astounding. The home screen floats in my home office with icons sharp enough to touch (I like how some interface elements look like frosted glass – such an Apple thing to do). I use Siri to open Safari. The expert integration of Siri throughout the system is a nice revelation. Imagine if it had worked this well when Apple launched it on the iPhone 4s.

7:50PM ET

Had to take a break because it was hurting my forehead.

The right fit and an endless desktop

Apple Vision Pro home screen

The home screen that you reach by pressing the Digital Crown. (Image credit: Future)

8:10PM ET

Switched to Dual Loop Band. Now that I got the adjustment right, I think it's more comfortable.

I want to play Wordle, as I do every night, but to do so, I must use Vision Pro's Safari instead of the Chrome browser I usually use on my Mac. This means I have to sign into my NY Times account again, which gives me a nice opportunity to use the virtual keyboard. It lets you type on an AR keyboard in the air using your fingers. It's pretty cool, though without tactile feedback, typos proliferate.

My two-factor authentication uses my iPhone, which I naturally cannot unlock with FaceID but, fortunately, my PIN works fine. I never have to take off the headset to see my phone or anything else, for that matter. The passthrough is good enough that I can always see whatever I need to see.

Apple Vision Pro Review

Apple Vision Pro with the Dual Loop Band. (Image credit: Future)

I've been typing on my MacBook Pro M3 and get ready to expand my desktop into augmented reality. Using the control panel, I access the Mac Virtual Display. Vision Pro immediately finds my MacBook and once I select it, the Mac Screen goes dark and a giant virtual MacBook desktop appears floating in front of me. No more looking down at a laptop screen! Of course, I still have to occasionally look at my hands to type. Later when I switch to my real desktop it feels incredibly cramped.

I'm a bit torn about the control panel system. You access it by looking up at a tiny green arrow near the top of your viewport. The Control Center, which is one level down, looks like the one that you'd find on the iPhone but with some Vision Pro-specific touches. I just feel like that little arrow is one of the rare, non-obvious interface bits in the Vision Pro system.

Adding Mac Virtual Desktop to my Vision Pro interface

Adding Mac Virtual Desktop to my Vision Pro interface. (Image credit: Future)

Immersive landscapes and the real feel

8:30PM ET

Have not solved Wordle, which is not designed for this interface but the gaze and pinch system of letter selection works well enough. Itching to have some more immersive fun.

I try the moon environment, which virtually puts you on the surface of the moon. I spin the digital crown to make the environment fully immersive and then realize that, by doing so, I can no longer see my keyboard – just my hands floating above the dusty, gray surface of the moon.

I take a break from typing and get ready to sample the 3D version of Avatar: Way of Water….Oh, wait, I have to pay for that. Never mind.

I choose Prehistoric Planet: Immersive, which is just wild. The visuals here are stunning. This is what I imagined when I first started thinking about virtual reality. Having a realistic dinosaur just centimeters from your face changes you.

Vision Pro control panel

Vision Pro control panel. (Image credit: Future)

Perfect for panoramas and meeting EyeSight

8:40PM ET

I switch back to Wordle to give it another shot. I'm enjoying moving things around my endless virtual desktop. 

Do some screen recording, which shows the view inside the Vision Pro, and then I switch to checking out my own panoramic photos. There is simply no better platform for viewing all these photos than the Vision Pro. I have almost 150 panoramic images in my library and I can finally see them in all their vivid detail and beauty. In a photo of a lovely rainbow cresting over my neighborhood, I spot colors I previously missed.

The spatial videography that I captured on my iPhone 15 Pro Max looks great.

I leave my home office and walk into the living room. It's easy enough to use the digital crown to dial back the immersion so I can see where I'm going. I sit down on the couch next to my wife and, as I start to talk to her, she appears slowly, breaking through the immersive landscape as if coming through a fog. On her side, she can see my “eyes” on the Vision Pro's front display. I could almost hear the air quotes in her voice. She did not love the look of Vision Pro EyeSight, which creates a simulacrum of my eyes and their movements based on what the internal cameras can see.

Vision Pro EyeSight in action

The view of my Vision Pro EyeSight in action. (Image credit: Future)

The home movie house

9:05PM ET

I discover that I can use my MacBook mouse across all the apps floating in my virtual desktop; it doesn't matter if they're native to macOS or visionOS.

While the Vision Pro works with virtually all iOS and iPadOS apps, I wanted to see what the platform could do with apps that were built for it. There are, at the moment, about 20 such apps. I install a half-dozen free ones.

I load up Disney Plus and am even able to copy and paste a password from the MacBook Pro into the Vision Pro Disney Plus app. I love how smoothly the different platforms work together.

It takes a beat to download an environment like the Avengers Tower.

9:30PM ET

The degree to which I enjoy watching 3D movies with the Vision Pro surprises me. Watching Doctor Strange Multiverse of Madness in the darkened Avengers Tower environment takes me back to being in a real movie theater. Even though the headset has some heft, I'm noticing it less and less. I'm sure I can handle a two-hour movie in this thing. Where is my popcorn?

As I type this, I realize that my pocket is warm. The battery does generate some heat while in use. Also, I see I'm down to 37% power. Doubtful I'll make it through this whole movie.

Battery life

9:45PM ET

Down to 20% battery life. Movies seem to drain the battery fast.

Found a game called Loona. There's an adorable blue character. When I look at her (it?) and pinch my fingers she hiccups and giggles. It's intoxicating. Loona turns out to be a calming puzzle game that I manipulate by pinching and dragging pieces into place.

I switch back to the movie. What a wonderful experience.

10:05PM ET

Vision Pro ran out of power. The battery is warm. Time to recharge and catch some shuteye.

Apple Vision Pro carrying case (Image credit: Future)

January 31, 7AM ET

My goal is to work, play, and learn about the headset all day long. Instead of running solely off battery power, I'm keeping the battery plugged into a wall outlet. This has the unfortunate side effect of doubling the number of wires running near my body. Not a big deal but I can't just get up and walk away from my desk.

Just realized I never finished Wordle. Oh well, there goes that streak.

While I've viewed a lot of spatial imagery through the headset, both in demos with Apple, and during my first day with Vision Pro, I'd never taken a spatial photo or video with the device.

I press the dedicated button on the upper left side of the headset and it asks about location tracking (I set it to While Using the App), then lets me toggle between spatial photos and video with a gesture. I take a spatial photo, which is pretty straightforward, but when I take a video, there's on-screen visual guidance that seeks to keep the view straight and fixed in one position.

The 3D spatial photo of my hand is so good it's creepy.

The 3D spatial video, despite the somewhat annoying visual guidance, looks excellent.

Apple Vision Pro Review (Image credit: Future)

Showing your work

7:30AM ET

Noticing that some of the interface text nearest to me and at the bottom of the field of view is broken into two images. Not sure if something has gone wrong with the calibration.

The system just asked me to move the Vision Pro slightly to the left on my head. It's constantly tracking my eyes, so perhaps it noticed the eye-tracking was slightly off. That may have solved my little parallax issue.

Been experimenting with capture. I don't know how to just record my Persona in action, besides having someone else screen-record my call. I try doing it by screen recording the view of my Persona in Settings but the recording also captures all my real-world head movements, making the video unwatchable.

I did just discover that the easiest way to capture a screenshot of your Vision Pro environment is to simply ask Siri to grab a screenshot of the desktop. It works perfectly every time.

7:53AM ET

I experience my first app crash. The App Store stopped responding and then it disappeared. Can't seem to get my virtual keyboard to appear at all in the App Store or Safari.

Answering questions

8:06AM ET

Pull the headset off for a short break, not because I'm uncomfortable but because I want to let the rest of my face breathe.

8:20AM ET

Back in it and the keyboard malfunction appears to have solved itself. Realize that if I make my Virtual Mac Desktop too large and put it too high on the Vision Pro desktop, I'm craning my neck to read what's at the top. Making adjustments.

I haven't spent much time in environments, but I think I prefer them dialed in at about 50% when working. At 100% I can't see my physical keyboard, and the atmospheric audio is maybe a bit too much for the workday.

Someone asks me on Threads if there's a lot of light leakage. I tell them little, if any. I notice just a bit around my nose but, especially in passthrough mode, your real-world view blends seamlessly with the augmented one. It's quite something.

My wife asks me if I feel disoriented when I remove the headset. I don't. Perhaps that's because I'm often using it with the real-world view intact. Still, I think it has a lot to do with the quality of the visuals and the eye-tracking capabilities.

Heading into video meetings that my Vision Pro persona does not support.

Using the Apple Vision Pro virtual keyboard

Using the Apple Vision Pro virtual keyboard. (Image credit: Future)

Ready to game

10:00AM ET

I want to tie off this initial test run with a game. Apple provided an Xbox controller that I should be able to hook up to the Vision Pro and play some Apple Arcade Games.

Turns out there are a lot of simple mini-games designed explicitly for the Vision Pro. I end up playing What the Golf, which takes me a little while to master. Later I connect the controller and use it to play Asphalt 8: Airborne Plus. I find that I prefer these virtual gaming screens as large as possible and often with the Environment immersion turned to 100. I do think gamers who can afford it will come to love the Vision Pro.

Apple Vision Pro Review

Asphalt 8 in Vision Pro. (Image credit: Future)

10:45AM ET

I end up playing for just 15 minutes before getting back to work. I launch Photoshop on my MacBook Pro and try editing photos on the big screen. It's generally a good experience though I do wonder if I'm seeing the most accurate colors on the Vision Pro Virtual Mac Display.

As I'm working, an iMessage alert comes through. I pinch the floating iMessage icon and it launches iMessage, where I can read the message in the app. I could use the virtual keyboard to type my reply, but it's not good for more than a few words of typing. I want to use the MacBook's keyboard, but since the app isn't running on the Mac, I can't. So I switch to iMessage on the Mac for full control and the ability to type on a physical keyboard.

Initial thoughts

Apple Vision Pro

Wearing Apple Vision Pro. (Image credit: Future)

What did I learn from the first two days with Apple Vision Pro? It delivers on its promises. It's versatile and powerful. The eye and gesture tracking is almost faultless. I only had to occasionally remind myself that a hand hanging down at my side would not be seen by the system cameras.

While I'd struggled to find a comfortable fit in some of my demo experiences, the time and space to select my best fit with the Dual Loop Band resulted in long-term comfort. I wore it for an hour or more at a time without any pain or discomfort.

It's as good at fun and content consumption as it is at work. I especially appreciated the Mac virtual display integration, something I now believe could transform my work life. I've always wanted a bigger desktop and now I have an almost limitless one.

For all that, I still don't know if I would spend $3,500 on it. The reality is that I don't even spend that much on my computers (if I can help it). Is a device that's equal parts work machine and entertainment room worth those extra bucks? Maybe. To be fair, it's early days and I may have a more concrete opinion when I finish my review.


Samsung XR/VR headset – everything we know so far and what we want to see

We know for certain that a new Samsung XR/VR headset is in the works, with the device being made in partnership with Google. But many of the XR product’s details (XR, or extended reality, is a catchall for virtual, augmented, and mixed reality) are still shrouded in mystery.

This so-called Apple Vision Pro rival (an XR headset from Apple) will likely have impressive specs – Qualcomm has confirmed its new Snapdragon XR2 Plus Gen 2 chip will be in the headset, and Samsung Display-made screens will probably be featured. It'll also likely have an equally premium price tag. Unfortunately, until Samsung says anything officially, we won’t know exactly how much it will cost, or when it will be released.

But using the few tidbits of official info, as well as our industry knowledge and the rumors out there, we can make some educated guesses that can clue you into the Samsung XR/VR headset’s potential price, release date, and specs – and we’ve got them down below. We’ve also highlighted a few of the features we’d like to see when it’s eventually unveiled to the public.

Samsung XR/VR headset: Price

The Samsung Gear VR headset on a red desk

The Samsung Gear VR – you needed a phone to operate it (Image credit: Samsung)

We won’t know how much Samsung and Google’s new VR headset will cost until the device is officially announced, but most rumors point to it boasting premium specs – so expect a premium price.

Some early reports suggested Samsung was looking at something in the $1,000 / £1,000 / AU$1,500 range (just like the Meta Quest Pro), though it may have changed its plans. After the Apple Vision Pro reveal, it’s believed Samsung delayed the device, most likely to make it a better Vision Pro rival in Samsung’s eyes – the Vision Pro is impressive, as you can find out from our hands-on Apple Vision Pro review.

If that’s the case, the VR gadget might not only more closely match the Vision Pro’s specs, but also adopt the Vision Pro’s $3,499 (about £2,725 / AU$5,230) starting price, or something close to it.

Samsung XR/VR headset: Release date

Much like its price, we don’t know anything concrete about the incoming Samsung VR headset's release date yet. But a few signs point to a 2024 announcement – if not a 2024 release.

Firstly, there was the teaser Samsung revealed in February 2023, when it said it was partnering with Google to develop an XR headset. It didn’t set a date for when we’d hear more, but Samsung likely wouldn’t make this teasing announcement if the project was still a long way from finished. Usually, a fuller reveal happens a year or so after the teaser – so around February 2024.

There was a rumor that Samsung’s VR headset project was delayed after the Vision Pro announcement, though the source maintained that the headset would still arrive in 2024 – just mid-to-late 2024, rather than February.

Three people on stage at Samsung Unpacked 2023 teasing Samsung's future of XR

The Samsung Unpacked 2023 XR headset teaser (Image credit: Samsung)

Then there’s the Snapdragon XR2 Plus Gen 2 chipset announcement. Qualcomm was keen to highlight Samsung and Google as partners that would be putting the chipset to use. 

It would be odd to highlight these partners if their headset was still a year or so from launching. Those partners may have preferred to wait for a later next-gen chip if the XR/VR headset was due in 2025 or later. So this would again point to a 2024 reveal, if not a precise date this year.

Lastly, there have also been suggestions that the Samsung VR headset might arrive alongside the Galaxy Z Flip 6 – Samsung's folding phone that's also due to arrive in 2024.

Samsung XR/VR headset: Specs

A lot of the new Samsung VR headset’s specs are still a mystery. We can assume it’ll use Samsung-made displays (it would be wild if Samsung used screens from one of its competitors) but the type of display tech (for example, QLED, OLED or LCD), resolution, and size are still unknown.

We also don’t know what size battery it’ll have, how much storage it’ll offer, or how much RAM it’ll pack. Nor do we know what design it will adopt – will it look like the Vision Pro with an external display, like the Meta Quest 3 or Quest Pro, or like something all-new?

Key Snapdragon XR2 Plus Gen 2 specs, including support for 4.3K displays, 8x better AI performance, and 2.5x better GPU performance

(Image credit: Qualcomm)

But we do know one thing. It’ll run (as we predicted) on a brand-new Snapdragon XR2 Plus Gen 2 chip from Qualcomm – an updated version of the chipset used by the Meta Quest Pro, and slightly more powerful than the XR2 Gen 2 found in the Meta Quest 3.

The upshot is that this platform can now support two displays at 4.3K resolution running at up to 90fps. It can also manage over 12 separate camera inputs that VR headsets will rely on for tracking – including controllers, objects in the space, and face movements – and it has more advanced AI capabilities, 2.5x better GPU performance, and Wi-Fi 7 (as well as 6 and 6E).

What we want to see from the new Samsung XR/VR headset

1. Samsung’s XR/VR headset to run on the Quest OS 

Girl wearing Meta Quest 3 headset interacting with a jungle playset

We’d love to see the best Quest apps on Samsung’s VR headset (Image credit: Meta)

This is very much a pipe dream. With Google and Samsung already collaborating on the project it’s unlikely they’d want to bring in a third party – especially if this headset is intended to compete with Apple and Meta hardware.

But the Quest platform is just so good; by far the best we’ve seen on standalone VR headsets. It’s clean, feature-packed, and home to the best library of VR games and apps out there. The only platform that maybe beats it is Steam, but that’s only for people who want to be tethered to a PC rig.

By partnering with Meta, Samsung’s headset would get all of these benefits, and Meta would have the opportunity to establish its OS as the Windows or Android of the spatial computing space – which might help its Reality Labs division to generate some much-needed revenue by licensing the platform to other headset manufacturers.

2. A (relatively) affordable price tag

Oculus Quest 2 on a white background

The Quest 2 is the king of VR headsets, because it’s affordable  (Image credit: Shutterstock / Boumen Japet)

There’s only been one successful mainstream VR headset so far: the Oculus Quest 2. The Meta-made device has accounted for the vast, vast majority of VR headset sales over the past few years (eclipsing the total lifetime sales of all previous Oculus VR headsets combined in just five months) and that’s down to one thing: it’s so darn cheap.

Other factors (like a pandemic that forced everyone inside) probably helped a bit. But fundamentally, getting a solid VR headset for $299 / £299 / AU$479 is a very attractive offer. It could be better specs-wise, but it’s more than good enough and offers significantly more bang for your buck than the PC-VR rigs and alternative standalone headsets that set you back over $1,000 when you factor in everything you need.

Meta’s Quest Pro – the first headset it launched after the Quest 2, and one with a much more premium $999 / £999 / AU$1,729 price (it launched at $1,500 / £1,500 / AU$2,450) – has seemingly sold significantly worse. We don’t have exact figures, but using the Steam Hardware Survey figures for December 2023 we can see that while 37.87% of Steam VR players use a Quest 2 (making it the most popular option, and more than double the next headset), only 0.44% use a Quest Pro – that’s about 86 times fewer.

The Apple Vision Pro headset on a grey background

The Apple Vision Pro is too pricey (Image credit: Apple)

So by making its headset affordable, Samsung would likely be in a win-win situation. We win because its headset isn’t ridiculously pricey like the $ 3,499 (around £2,800 / AU$ 5,300) Apple Vision Pro. Samsung wins because its headset has the best chance of selling super well.

We’ll have to wait and see what’s announced by Samsung, but we suspect we’ll be disappointed on the price front – a factor that could keep this device from becoming one of the best VR headsets out there.

3. Controllers and space for glasses 

We’ve combined two smaller points into one for this last ‘what we want to see’.

Hand tracking is neat, but ideally it’ll just be an optional feature on the upcoming Samsung VR headset rather than the only way to operate it – which is the case with the Vision Pro. 

Most VR apps are designed with controllers in mind, and because most headsets now come with handsets that have similar button layouts it’s a lot easier to port software to different systems. 

Meta Quest 3 controllers floating in a turquoise colored void.

The Meta Quest 3’s controllers are excellent – copy these, Samsung (Image credit: Meta)

There are still challenges, but if your control scheme doesn’t need to be reinvented, developers have told us that’s a massive time-saver. So having controllers with this standard layout could help Samsung get a solid library of games and apps on its system by making it easier for developers to bring their software to it.

We’d also like it to be easy for glasses wearers to use the new Samsung VR headset. The Vision Pro’s prescription lenses solution is needlessly pricey when headsets like the Quest 2 and Quest 3 have a free in-built solution for the problem – an optional spacer that slightly extends the headset so it sits further from your face, leaving room for specs.

Ideally, Samsung’s VR headset will offer a similarly free and easy solution for glasses wearers.


What is Google Bard? Everything you need to know about the ChatGPT rival

Google finally joined the AI race earlier this year, launching a ChatGPT rival called Bard – an “experimental conversational AI service”. Google Bard is an AI-powered chatbot that acts as a poet, a crude mathematician and even a decent conversationalist.

The chatbot is similar to ChatGPT in many ways. It's able to answer complex questions about the universe and give you a deep dive into a range of topics in a conversational, easygoing way. The bot, however, differs from its rival in one crucial respect: it's connected to the web for free, so – according to Google – it gives “fresh, high-quality responses”.

Google Bard is powered by PaLM 2. Like ChatGPT, it's built on a type of machine learning model called a 'large language model' that's been trained on a vast dataset and is capable of understanding human language as it's written.

Who can access Google Bard?

Bard was announced in February 2023 and rolled out for early access the following month. Initially, a limited number of users in the UK and US were granted access from a waitlist. However, at Google I/O – an event where the tech giant dives into updates across its product lines – Bard was made open to the public.

It’s now available in more than 180 countries around the world, including the US and all member states of the European Union. As of July 2023, Bard works with more than 40 languages. You need a Google account to use it, but access to all of Bard’s features is entirely free. Unlike OpenAI’s ChatGPT, there is no paid tier.

The Google Bard chatbot answering a question on a computer screen

(Image credit: Google)

Opening up chatbots for public testing brings great benefits that Google says it's “excited” about, but also risks that explain why the search giant has been so cautious to release Bard into the wild. The meteoric rise of ChatGPT has, though, seemingly forced its hand and expedited the public launch of Bard.

So what exactly will Google's Bard do for you and how will it compare with ChatGPT, which Microsoft appears to be building into its own search engine, Bing? Here's everything you need to know about it.

What is Google Bard?

Like ChatGPT, Bard is an experimental AI chatbot that's built on deep learning algorithms called 'large language models' – initially one called LaMDA, and now PaLM 2. 

To begin with, Bard was released on a “lightweight model version” of LaMDA. Google says this allowed it to scale the chatbot to more people, as this “much smaller model requires significantly less computing power”.

The Google Bard chatbot answering a question on a phone screen

(Image credit: Google)

At I/O 2023, Google launched PaLM 2, its next-gen language model trained on a wider dataset spanning multiple languages. The model is faster and more efficient than LaMDA, and comes in four sizes to suit the needs of different devices and functions.

Google is already training its next language model, Gemini, which we think is one of its most exciting projects of the next 25 years. Built to be multi-modal, Gemini is touted to deliver yet more advancements in the arena of generative chatbots, including features such as memory.

What can Google Bard do?

In short, Bard is a next-gen development of Google Search that could change the way we use search engines and look for information on the web.

Google says that Bard can be “an outlet for creativity” or “a launchpad for curiosity, helping you to explain new discoveries from NASA’s James Webb Space Telescope to a 9-year-old, or learn more about the best strikers in football right now, and then get drills to build your skills”.

Unlike traditional Google Search, Bard draws on information from the web to help it answer more open-ended questions in impressive depth. For example, rather than standard questions like “how many keys does a piano have?”, Bard will be able to give lengthy answers to a more general query like “is the piano or guitar easier to learn?”

The Google Bard chatbot answering a question on a computer screen

An example of the kind of prompt that Google’s Bard will give you an in-depth answer to. (Image credit: Google)

We initially found Bard to fall short in terms of features and performance compared to its competitors. But since its public deployment earlier this year, Google Bard’s toolkit has come on in leaps and bounds. 

It can generate code in more than 20 programming languages, help you solve text-based math equations and visualize information by generating charts, either from information you provide or tables it includes in its responses. It’s not foolproof, but it’s certainly a lot more versatile than it was at launch.

Further updates have introduced the ability to listen to Bard’s responses, change their tone using five options (simple, long, short, professional or casual), pin and rename conversations, and even share conversations via a public link. Like ChatGPT, Bard’s responses now appear in real-time, too, so you don’t have to wait for the complete answer to start reading it.

Google Bard marketing image

(Image credit: Google)

Improved citations are meant to address the issue of misinformation and plagiarism. Bard will annotate a line of code or text that needs a citation, then underline the cited part and link to the source material. You can also easily double-check its answers by hitting the ‘Google It’ shortcut.

It works with images as well: you can upload pictures with Google Lens and see Google Search image results in Bard’s responses.

Bard has also been integrated into a range of Google apps and services, allowing you to deploy its abilities without leaving what you’re working on. It can work directly with English text in Gmail, Docs and Drive, for example, so you can summarize your writing in situ.

Similarly, it can interact with info from the likes of Maps and even YouTube. As of November, Bard now has the limited ability to understand the contents of certain YouTube videos, making it quicker and easier for you to extract the information you need.

What will Google Bard do in future?

A huge new feature coming soon is the ability for Google Bard to create generative images from text. This feature, a collaborative effort between Google and Adobe, will be backed by the Content Authenticity Initiative’s open-source Content Credentials technology, which will bring transparency to images generated through this integration.

The whole project is made possible by Adobe Firefly, a family of creative generative AI models that will make use of Bard's conversational AI service to power text-to-image capabilities. Users can then take these AI-generated images and further edit them in Adobe Express.

Otherwise, expect to see Bard support more languages and integrations with greater accuracy and efficiency, as Google continues to train its ability to generate responses.

Google Bard vs ChatGPT: what’s the difference?

Fundamentally, the chatbot is based on similar technology to ChatGPT, with even more tools and features coming that will close the gap between Google Bard and ChatGPT.

Both Bard and ChatGPT are chatbots that are built on 'large language models', which are machine learning algorithms that have a wide range of talents including text generation, translation, and answering prompts based on the vast datasets that they've been trained on.
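
Neither PaLM 2 nor GPT-4 is something you can download and run yourself, but the basic idea of a large language model – predicting a plausible continuation of a prompt, token by token – can be illustrated with a small open model. Here’s a minimal, purely illustrative sketch using the Hugging Face transformers library and GPT-2; it has nothing to do with Google’s or OpenAI’s production systems.

```python
# Minimal illustration of what a 'large language model' does: given a prompt, it
# predicts a plausible continuation based on patterns in its training data.
# GPT-2 is a small open model used here purely for demonstration - Bard (PaLM 2)
# and ChatGPT (GPT-4) are far larger models accessed via their own services.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Is the piano or the guitar easier to learn? In short,"
result = generator(prompt, max_new_tokens=40, do_sample=True, temperature=0.8)

print(result[0]["generated_text"])
```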

A laptop screen showing the landing page for ChatGPT Plus

(Image credit: OpenAI)

The two chatbots (Google describes Bard as an “experimental conversational AI service”) are also fine-tuned using human interactions to guide them towards desirable responses. 

One difference between the two, though, is that the free version of ChatGPT isn't connected to the internet – unless you use a third-party plugin. That means it has a very limited knowledge of facts or events after January 2022. 

If you want ChatGPT to search the web for answers in real time, you currently need to join the waitlist for ChatGPT Plus, a paid tier which costs $ 20 a month. Besides the more advanced GPT-4 model, subscribers can use Browse with Bing. OpenAI has said that all users will get access “soon”, but hasn't indicated a specific date.

Bard, on the other hand, is free to use and features web connectivity as standard. As well as the product integrations mentioned above, Google is also working on Search Generative Experience, which builds Bard directly into Google Search.

Does Google Bard only do text answers?

Google's Bard initially only answered text prompts with its own written replies, similar to ChatGPT. But one of the biggest recent changes to Bard is its multimodal functionality. This allows the chatbot to answer user prompts and questions with both text and images.

Users can work in the other direction, too: Bard integrates with Google Lens, so you can upload images and have the chatbot respond in text. Multimodal functionality was hinted at for both GPT-4 and Bing Chat, and now Google Bard users can actually use it. And of course, there's also Bard's upcoming AI image generator, which will be powered by Adobe Firefly.


11 new AI projects announced at Adobe MAX 2023 – here’s why they could change everything

Adobe is currently holding its MAX 2023 event, showing off what it has in store for the next year or so. One of the focal points of the conference is a series of 11 “Projects” that have the potential to become “important elements” of Adobe products in the future.

Recently, the company provided a sneak peek at one of these elements, Project Stardust, which can separate objects in a photograph into individual layers for easy editing. Users will be able to move objects around or delete them, and from there have a generative AI create something to take their place. The other 10 work similarly, harnessing AI technology to power their editing and creative capabilities. The group is split into three main categories. 

Photos

Alongside Stardust in the Photos category, you have Project See Through, a tool that removes reflections in a photograph. Adobe states that glass reflections can be really annoying since they can obscure subjects. Instead of having to go through a multi-step editing process in Photoshop, See Through does it all for you quickly.

Image 1 of 2

Adobe Project See Through before

(Image credit: Adobe)
Image 2 of 2

Adobe Project See Through after

(Image credit: Adobe)

Video & Audio

Similar to how Stardust can remove objects in images, Project Fast Fill can remove them in videos thanks to the company’s Generative Fill tech. It can also add or change content via “Firefly-powered text prompts.” In the example shown to us, Fast Fill can add a tie to a man whose suit doesn't have one, or alter the latte art in a cup of coffee from a heart to a flower. 

Next, Project Res Up can bump up the resolution of a clip via diffusion-based upsampling technology. Third is Scene Change, which can swap out the background of a video from, say, an office building to a jungle. For audio, there’s Project Dub Dub Dub, a software tool claimed to be able to translate speech from one language to another “while preserving the voice of the original speaker”. 

3D & Design

For the last category, these five are all about helping users create – even if they’re not the best artists. 

Project Draw & Delight can turn your doodle into a polished drawing, using a text prompt to guide it. Glyph Ease “makes customized lettering more accessible” by instantly applying specific design elements to a word in Illustrator. All you have to do is provide a rough outline of what you want the AI to add.

Image 1 of 2

Project Draw & Delight before

(Image credit: Adobe)
Image 2 of 2

Project Draw & Delight after

(Image credit: Adobe)

The trio of 3D imaging tools is more situational, but impressive nonetheless.

Project Poseable’s AI can morph a 3D model to match “poses from photos of real people.” So if you upload a picture of someone striking a karate pose, the model will do the same. Project Primrose lets artists quickly alter the texture of a rendered piece of clothing. And finally, we have Neo, which helps creators build 3D objects using “2D tools and methods.”

To reiterate what we said earlier, these projects are prototypes at the time of this writing. There’s no guarantee any of these will become a new feature in Photoshop or any other Adobe product. However, there are some we believe have the potential for an eventual release. 

Stardust, Res Up and Draw & Delight appear to be the most “complete”, with fewer visible flaws than some of the others. Certain projects need more time in the oven, in our opinion – the voice from Dub Dub Dub, for example, sounds stilted and robotic rather than natural.

Be sure to check out TechRadar’s list of the best AI art generators of the year if you’re looking for ways to bolster content generation. 


Meta Quest 3: price, design, and everything we know so far

The Oculus Quest 3, now officially known as Meta Quest 3, has been announced, and we won't have to wait much longer to get our hands on the highly-anticipated VR headset. 

Meta's more budget-friendly Quest 3 headset, separate from the pricier, premium Meta Quest Pro, will be more expensive than the Oculus Quest 2. That is to be expected for a new 2023 headset, and to soften the blow Meta has lowered the price of the Quest 2, so you can at least try VR on a tighter budget, even if it isn't the latest and greatest version of the tech. And the Quest 3 will be the greatest, with Meta calling it its “most powerful headset” yet.

There's plenty of new and official information on the Meta Quest 3 now, and it's certainly shaping up to be a contender when it comes to topping our list of the best VR headsets you can buy. For now, read on to learn everything we know about the upcoming headset.

Meta Quest 3: What you need to know

  • What is the Meta Quest 3? Meta’s follow-up VR headset to the Quest 2
  • Meta Quest 3 release date: “Fall” 2023, but most likely September or October
  • Meta Quest 3 specs: We don’t have all the details, but Meta says it’s its “most powerful headset” yet
  • Meta Quest 3 design: Similar to the Quest 2 but slimmer and the controllers lack tracking rings

Meta Quest 3 price

So far, Meta has only confirmed the price of the Quest 3's 128GB model, which will retail for $ 499 / £499. The official announcement page also makes mention of “an additional storage option for those who want some extra space.”

We're not sure exactly how much extra storage this upgraded model will offer, but it's very likely to total 256GB. No price has been announced for it either, but expect something in the region of $ 599 / £599, or potentially up to $ 699 / £699.

Meta Quest 3 release date

The Quest 3 has been confirmed to launch in “Fall 2023,” which means, barring any delays, we should see it arrive somewhere between September and December 2023. Meta's official page for the Quest 3 states that further details will be revealed at the Meta Connect event happening on September 27.

Given the release schedule of the Meta Quest Pro, which was detailed at Meta Connect 2022 and launched soon after, we expect the Meta Quest 3 to arrive shortly after Connect, in September or October. We'll have to wait and see what Meta decides, though.

Oculus Quest 3 specs and features

Meta Quest 3 floating next to its handsets

(Image credit: Meta )

We now have official information, straight from Meta's mouth, about the specs and features we can expect for Meta Quest 3. It was always a safe bet that the headset would continue to be a standalone device, and that's certainly remained true. You won't need a PC or external device to play the best VR games out there.

The most immediate improvement for Quest 3 is a high-fidelity color passthrough feature, which should allow you to view your surroundings via a high-quality camera. Not only will this help you plan out your playing space, but it should also help augmented and mixed reality experiences become even more immersive.

Quest 3 has also been confirmed to sport a 40% slimmer optic profile than the last-gen Quest 2. That'll reduce the weight of the device and should allow for comfier play sessions overall. Similarly, its Touch Plus controllers have been reworked with a more ergonomic design. Other improvements in this area include enhanced hand tracking and controller haptic feedback, similar to the DualSense wireless controller for PS5.

Meta Quest 3's front exploding outwards, revealing all of its internal parts

(Image credit: Meta )

It's been speculated that the Quest 3 will adopt uOLED displays (an upgraded version of OLED), though we've also seen conflicting reports that instead hint at OLED and mini-LED displays. What analysts seem to agree on is that some kind of visual enhancement will come to the Quest 3 – so expect improved display quality and higher resolutions.

So far, Meta's own details remain vague on this front. We know that Quest 3 will feature a higher resolution display than Quest 2, paired with pancake lenses for greater image clarity and an overall reduction in device weight. These lenses should also improve the display of motion, hopefully reducing motion sickness and the dreaded image ghosting effect that plagues many a VR headset, even the PSVR 2.

Lastly, Meta has confirmed that the Quest 3 will be powered by the latest Snapdragon chipset from Qualcomm. In Meta's own words, the new chipset “delivers more than twice the graphical performance as the previous generation Snapdragon GPU in Quest 2.” We should expect a pretty significant leap in visual quality, then.

Oculus Quest 3 – what we’d like to see

In our Oculus Quest 2 review, it was hard to find fault with a VR headset that proved immersive, comfortable and easy to use. And yet, while it clearly leads the pack in the VR market, it still suffers from some of the pitfalls that affect the technology as a whole. Here’s a list of updates we want to see on the Oculus Quest 3:

Improved motion sickness prevention
One of those technological pitfalls, and perhaps an unavoidable one, is the motion sickness that can often ensue when using any VR headset. Depending on your tolerance for whirring and blurring, the Quest 2 can be one helluva dizziness-inducer. While there isn’t yet a clear path to making any VR headset immune to user dizziness, it’s nonetheless something we’d like to see improved on the Oculus Quest 3.

A better fit
The same goes for the fit of the device. While the Quest 2 is indeed a comfortable weight when on the head, it can still be a little claustrophobic to achieve a good, tight fit. Again, it’s a problem encountered by almost all VR headsets, and a base-level issue that the next generation of hardware should at least attempt to better address. Those aforementioned design rumors suggest the new Oculus device could solve some of these issues.

Improved Oculus Store
Other improvements we’d like to see include a more effective in-VR Oculus Store. While the equivalent store on browser and in the app makes it easy to discover new releases and search for upcoming games, the store inside the headset itself seems to roll the dice on what apps are shown with no way to quickly navigate to new content. This makes it difficult to pre-order games and discover new titles to purchase when using the device, which is a pivotal part of ensuring the headset maintains replayability.

A virtual hand pointing at the Quest store menu using the Quest's hand-tracking

(Image credit: Oculus / Facebook)

A neighborhood-like social space 
While the Quest 2 has a competent party invitation system to get you game-to-game with your friends, there isn’t a social space to engage with others in-between. It would be interesting to see the Quest 3 introduce a virtual social space, in the same vein as NBA 2K’s neighborhood area, to share some downtime with others. What’s with the multi-person furniture in the current home environment if there’s nobody to share it with? Luckily, Meta's new metaverse project – ambitious as it seems – suggests virtual social spaces will be at the forefront of all future Quest headsets.

Improved media sharing
Sharing screenshots and videos on Oculus devices has never been easy, and it’s an issue that the Quest 2 has tried to address with a few video updates. The process could still be more streamlined, so we'd like to see the Quest 3 make the whole deal more accessible. 1080p video, app integration, proper audio syncing – that’d all be golden.
