Meta can’t stop making the Meta Quest 3’s mixed reality better with updates

June is here, and like clockwork the latest update for your Meta Quest 3 headset is ready to roll out. 

The standout upgrade in v66 is (again) to the VR headset’s mixed reality – after it was the main focus of Horizon OS v64 and got some subtle tweaks in v65 too.

We aren’t complaining though, as this improvement looks set to make the image quality even better, with reduced image distortion overall and less of the warping effect that can appear around moving objects. The upshot is that you should find it easier to interact with real-world objects while in mixed reality, and the overlay that displays your virtual hands should align more closely with where your actual hands are.

If you want to see a side-by-side, Meta has handily released a video showcasing the improvements to mixed reality.

If you’re using your hands instead of controllers, Meta is also adding new wrist buttons.

Should you choose to enable this option in the experimental settings menu, you’ll be able to tap on your right or left wrist to use the Meta or Menu buttons respectively.

According to Meta, wrist buttons will make it a lot easier to open a menu from within a game or app – either the in-game pause screen, or the system-level menu should you want to change to a different experience, take a screenshot or adjust your headset’s settings. We’ll have to try them out for ourselves, but they certainly sound like an improvement, and a similar feature could bring even more button controls to the hand-tracking experience.

A gif showing a person pinching their fingers to open the Quest menu

You’ll no longer need to pinch to open menus (Image credit: Meta)

Lastly, Meta is making it easier to enjoy background audio – so if you start audio or a video in the Browser, it’ll keep playing when you minimize the app – as well as making a few changes to Parental Supervision features. Namely, from June 27, children aged 10 to 12 who are supervised by the same parent account will automatically be able to see each other in the Family Center.

As Meta warns, however, the update is rolling out gradually – and because this month’s passthrough change is so big, it says it will be sending out updates even more slowly than usual. What’s more, some people who update to v66 might not get all the improvements right away.

So if you don’t see the option to update right away, or any passthrough improvements once you've installed v66 on your Meta Quest 3, don’t fret. You will get the upgrade eventually.

You might also like

TechRadar – All the latest technology news

Read More

Meta’s recent Quest 3 update includes a secret AI upgrade for mixed reality

Meta’s VR headsets recently received update v64, which according to Meta added several improvements to their software – such as better-quality mixed-reality passthrough in the case of the Meta Quest 3 (though I didn’t see a massive difference after installing the update on my headset).

It’s now been discovered (first by Twitter user @Squashi9) that the update also included another upgrade for Meta’s hardware, with Space Scan, the Quest 3’s room scanning feature, getting a major buff thanks to AI.

The Quest 3’s Space Scan is different to its regular boundary scan, which sets up your safe play space for VR. Instead, Space Scan maps out your room for mixed-reality experiences, marking out walls, floors, and ceilings so that experiences are correctly calibrated.

You also have the option to add and label furniture, but you had to do this part manually until update v64 rolled out. Now, when you do a room scan your Quest 3 will automatically highlight and label furniture – and based on my tests it works flawlessly.

Annoyingly, the headset wouldn’t let me take screenshots of the process, so you’ll have to trust me when I say that every piece of furniture was not only picked up by the scan and correctly marked out, it was also labelled accurately – it even picked up on my windows and doors, which I wasn’t expecting.

The only mistake I spotted was that a chair I have in my living room was designated a 'couch', though this seems to be more an issue with Meta’s lack of more specific labels than with Space Scan’s ability to detect what type of object each item of furniture is.


This feature isn’t a complete surprise, as Reality Labs showed a version of it off on Threads in March. What is surprising, however, is how quickly it’s been rolled out after being unveiled – though I’m not complaining, considering how well it works and how easy it makes scanning your room. 

So what? 

Adding furniture data is useful for MR and VR apps. Tables can be used by apps like Horizon Workrooms as designated desks, while sitting down on or getting up from a designated couch can switch your VR experience between seated and standing modes.

Meanwhile, some apps can use the detected doors, windows, walls, and furniture such as a bookshelf to adjust how mixed-reality experiences interact with your space.
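As an illustration, here’s a minimal sketch of the kind of logic an app could run over a scan’s semantic labels. The label strings and the helper function are invented for this example – real MR engines expose their own scene-anchor APIs with their own label sets:

```python
# Hypothetical example: reacting to the semantic labels a room scan reports.
# SEATED_LABELS, WORK_SURFACE_LABELS, and configure_experience are invented
# for illustration; they are not Meta's actual scene API.

SEATED_LABELS = {"COUCH", "CHAIR"}
WORK_SURFACE_LABELS = {"TABLE", "DESK"}

def configure_experience(scene_labels):
    """Choose app behaviour based on the furniture found in a scan."""
    labels = set(scene_labels)
    return {
        # Offer a seated mode if the scan found somewhere to sit
        "mode": "seated" if labels & SEATED_LABELS else "standing",
        # Anchor a virtual workspace to a detected table or desk
        "virtual_desk": bool(labels & WORK_SURFACE_LABELS),
    }

print(configure_experience(["COUCH", "TABLE", "WINDOW"]))
```

Run against a scan that found a couch and a table, this picks a seated mode with a virtual desk enabled; a scan with no recognized furniture falls back to standing defaults.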

With Meta making it less tedious to add these data points, app developers have more of a reason to take furniture into account when designing VR and MR experiences, which should lead to them feeling more immersive.

This also gives Meta a leg up over the Apple Vision Pro, as it’s not yet able to create a room scan that’s as detailed as the one found on Meta’s hardware – though until software starts to take real advantage of this feature it’s not that big a deal.

We’ll have to wait and see what comes of this improvement, but if you’ve already made a space scan or two on your Quest 3 you might want to redo them, as the new scans should be a lot more accurate.


The latest Meta Quest 3 update brings mixed reality improvements

Update v64 is here for your Meta Quest 3 and other Quest hardware, and it adds a big quality-of-life improvement for fans of mixed reality: better passthrough.

According to Meta’s official blog, the update has made the Quest 3’s passthrough “higher-fidelity than before”, resulting in your headset being better at adjusting color, exposure, contrast, and dynamic range to best suit your real-world environment. 

These updates should make it easier to read text on screens, and to use the headset in darker rooms.

We’ve tested the improved feature out, and it does seem easier to read text on real-world screens, and Meta’s claim that it’s less grainy in low light seems to ring true as well. That said, in general the passthrough doesn’t seem massively improved – so while it is better, don’t go in expecting ultra-crisp, Apple Vision Pro-levels of mixed-reality passthrough quality. 

The experimental menu on the Meta Quest 3 showing the new External Mic Support feature toggle

(Image credit: Meta)

Can you hear me now?

Beyond upgraded passthrough, update v64 brings with it a few additional refinements to your VR headset’s software.

The first is that your Quest device can now support an external microphone – with the feature appearing in the experimental settings menu. Once you’ve toggled it on you’ll be able to plug in an external microphone via the USB-C port to capture audio for VR content creation or in-game chat instead of using the Quest 3’s built-in mic.

Following the feature’s addition for Oculus Quest 2 users last month, Meta Quest 3 users can now use their headsets while lying down too.

What’s more, Continuous Casting has been added. Previously, if you removed your headset while casting to your phone the session would end, and Meta admitted that more often than not users would rather keep the cast running than have to restart it every time they took their headset off (say, to take a drink or talk to someone). So now if you remove your headset while casting the session won’t be cut short – just make sure you stop casting manually from your phone when you’re done.


Apple Vision Pro blasts out of mixed reality and into real stores – here’s how to sign up for a demo

It felt almost odd to be standing in the rain outside of Apple's glassy Fifth Avenue flagship store on Groundhog Day and not be wearing my Apple Vision Pro. I'd barely removed the mixed reality headset in my first two days of testing the Vision Pro and the real world felt a bit flat. Until, that is, Apple CEO Tim Cook opened the swinging glass doors and opened the proverbial floodgates to new and soon-to-be-new Apple Vision Pro owners.

It is something of a tradition for Cook to usher in every new product at Apple's Central Park-adjacent location, but this moment was different, maybe bigger. It has been almost a decade since Apple launched a new product category (see the Apple Watch), and so expectations were high.

The crowd gathered outside was not what I'd call iPhone size – the miserable weather might have been a factor there – but there were dozens of people somewhat evenly split between media and customers.

A cluster of blue-shirted Apple employees poured out of the store, which featured the giant white outline of a Vision Pro on the storefront, and started clapping and cheering (I'd heard them practicing cheers and getting amped up from inside the store), doing their best to substitute any enthusiasm the crowd might've been lacking. This, too, is tradition and I find it almost endearing but also just a tiny bit cringe-worthy. It's just a gadget – a very expensive one – after all.

At precisely 8AM ET, Cook appeared behind the glass doors (someone had previously double-checked and triple-checked that the doors were not locked so Cook didn't have to bend down and release a latch). He swung open the door and gave a big wave.

Soon customers who had preordered the $3,499 (to start) spatial reality computer were filing into the store (many pausing to take a selfie with Cook), while I waited outside, getting drenched and wondering if the Vision Pro is waterproof (it's not).

Apple Vision Pro store launch gallery (Image credit: Future / Lance Ulanoff): Tim Cook acknowledges the crowd; Cook pops out and waves; Tim Cook was in his element; waiting for the launch; first guy in line.

Inside the store, which sits below ground level, the floor was packed. Vision Pros were lined up on stands similar to what I'd seen at launch. Below each one was an iPad, describing the experience you were about to have. Some people were seated on wooden benches near the back of the store, wearing Vision Pro headsets and gesturing to control the interfaces.

Oddly, though, not a lot of people were trying Vision Pros, but that was probably because Tim Cook was still in the room.

The scrum around him was dense, so much so that I noticed some nervous-looking Apple employees trying to gently clear a path and give the Apple leader some air. Cook, ever the gracious southern gentleman, smiled for countless photos with fans. He even signed a few things.

I stepped forward and Cook's eyes caught mine. He smiled broadly and said hello. We shook hands and I congratulated him on a successful launch. Then I gave him my brief assessment of the product: “It's incredible.” He brightened even further, “I know!” he shouted back over the din.

Apple Vision Pro store launch gallery (Image credit: Future / Lance Ulanoff): some of the Vision Pros on stands; people in the back wearing them; Tim Cook is surrounded; hi, Mr. Cook.

There wasn't much more to say, really, and I left him to get sucked back into the crowd while I took another look at the Vision Pro sales setup. In the meantime, customers were leaving with large Vision Pro boxes they'd pre-ordered. Thousands of the mixed reality headsets are in stores and arriving at people's homes (in the US only). This will be their first experience with Vision Pro.

The good news is, as I told someone else today, there is no learning curve. The setup is full of hand-holding and using the system generally only requires your gaze and very simple gestures.

There will be comments about the weight and getting the right, comfortable fit on your head, and some may be frustrated with the battery pack and that they have to keep Vision Pro plugged in if they want to use it for more than two hours at a time.

Still, the excitement I saw at the store this morning and in Tim Cook's eyes may be warranted. This is not your father's mixed reality.

Booking your demo

For the next few days, all demos will be first come, first served in stores. However, if you can wait until after Feb 5, you can book your in-store demo by visiting the Apple Store site, navigating to the Vision Pro section, and selecting “Book a demo.” Apple will guide you to sign in with your Apple ID. You must also be at least 13 years old to go through the experience.

Demos take about 30 minutes. An Apple specialist will guide you through the setup process, which is fairly straightforward.

You'll choose a store near you, a date, and an available time. If you wear glasses, Apple should be able to measure your lenses and fit temporary ones so the demonstration looks right for you (you'll be buying your own Zeiss inserts if you buy a headset).

After that, you can go home and figure out how to save up $3,500.



Apple tells developers NOT to use “virtual reality” when talking about Vision Pro

The Vision Pro will go on sale next month, and we’ve just learned that Apple has requested that app developers for visionOS (the operating system that runs on the headset) don’t refer to visionOS apps as “AR” or “VR”.

We first heard about Apple’s newest innovation in June 2023, when it was marketed as a spatial computer that combines digital content with the user’s physical surroundings. It’s also equipped with some serious Apple graphics hardware and visionOS, which Apple calls the “world’s first spatial computing system”.

At first glance, the Vision Pro certainly appears similar to existing virtual reality (VR) and augmented reality (AR) headsets, so it’s interesting that Apple is at pains to ensure it isn’t mistaken for one. The de facto ban on AR and VR references (as well as extended reality (XR) and mixed reality (MR)) was spotted in the guidelines of the new update to Xcode (Apple’s suite of developer tools) that followed the announcement that Vision Pro devices will be in stores in early February.

Vision Pro

(Image credit: Apple)

Apple lays down the law

This recommendation is pretty explicitly laid out on a new Apple Developer page which goes through what a developer needs to do to prepare their app for submission to the App Store. 

Apple also insists that developers use the “visionOS” branding with a lowercase “v” (similar to how it brands macOS, its operating system for desktops and laptops), and use the device’s full name, “Apple Vision Pro,” when referring to it. These rules aren’t as unexpected as Apple’s more notable instruction to avoid VR and AR, however. According to Apple, visionOS apps will not be considered VR, XR, or MR apps but “spatial computing apps”.

It’s an interesting move for a number of reasons: coining a new term can confuse people, and users will have to build familiarity with it and actually use it for it to stick, but it also lets Apple differentiate itself from the pack of AR/VR devices out there.

It’s also a pivot from messaging that until now has relied on existing terms like augmented reality and virtual reality. Most of Apple’s current marketing refers to the Vision Pro as a “spatial computing” platform, but at the Worldwide Developers Conference (WWDC) in 2023, Apple’s annual event for its platform developers, CEO Tim Cook introduced the Vision Pro as an “entirely new AR platform.” Materially, this is mainly a marketing and branding move as Apple becomes more confident in its customers’ understanding of what the Vision Pro actually is. 9to5Mac reports that Apple engineers referred to visionOS as xrOS in the lead-up to the device’s official announcement.

Apple Vision Pro VR headset

(Image credit: Future / Lance Ulanoff)

Apple charts its own course

The pointed effort to distinguish itself from its competitors is an understandable move from Apple considering that some other tech giants have already attempted to dominate this space. 

Meta, Facebook and Instagram’s parent company, is one of the most notable examples – you might have a not-so-distant memory of a certain “metaverse”. The metaverse’s reception was lukewarm at best, even at its peak, and Apple is making a bold attempt to forge its own association in people’s minds, with Apple’s VP of global marketing Greg Joswiak dismissing the word “metaverse” as one he’ll “never use”, according to 9to5Mac.

I enjoy watching Apple make bolder moves into existing markets because it’s often when we’ve seen new industry standards emerge, which is always exciting – no matter whether you want to call it AR, VR, or spatial computing. 


Your New Reality Is Not Secure

The threat landscape has simply become too vast for cybersecurity executives to rest assured. We discuss the answers to five key questions to ensure that your new reality of remote work is more secure.
