Your Oculus Quest 2 just got better hand-tracking to boost your virtual boxing skills

Meta has released the v56 update for the Meta Quest Pro and the Oculus Quest 2, which introduces a bunch of upgrades for two of the best VR headsets out there.

With the new Quest update, Meta is rolling out Hand Tracking 2.2, which it says brings responsiveness more in line with what users experience with controllers. According to Meta, Hand Tracking 2.2 will reduce the latency experienced by a typical user by 40%, with a 75% latency reduction for fast hand movements.
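To put those percentages in context, here's a quick back-of-the-envelope sketch; the baseline figure is purely hypothetical, since Meta hasn't shared absolute latency numbers:

```python
# Illustrative only: Meta hasn't published absolute latency figures,
# so the 100 ms baseline below is a made-up placeholder.
baseline_ms = 100.0

typical_ms = baseline_ms * (1 - 0.40)  # 40% reduction for typical use
fast_ms = baseline_ms * (1 - 0.75)     # 75% reduction for fast movements

print(f"typical: {typical_ms:.0f} ms, fast movements: {fast_ms:.0f} ms")
# typical: 60 ms, fast movements: 25 ms
```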

Meta recommends downloading the Move Fast demo app from App Lab to get a feel for what these improvements mean in practice. It's a simple fitness trainer in which you punch, chop and block incoming attacks while looking out over a lake dotted with cherry blossom trees. Meta has said we can expect more hand-tracking improvements when the Meta Quest 3 launches later this year. It remains to be seen whether these upgrades can keep up with the hand tracking Apple is expected to deliver with its Apple Vision Pro headset.

Another important improvement is coming just for Meta Quest Pro owners. One of the premium headset’s best upgrades over the Quest 2 is its display, which offers local dimming. This allows the screen to achieve deeper black levels and improved contrast, something that can help a lot with immersion, as dark spaces actually look dark without becoming impossible to see. However, local dimming isn’t available in every app, so with v56 Meta is launching a local dimming experimental setting (which can be found in the Experimental menu in your headset’s Settings).

The feature is off by default, but if you turn it on you should see the benefits of local dimming in a load more titles – that is, unless a developer chooses to opt out. Just note that as with other experimental settings, you may find it isn’t quite perfect or causes some problems.
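If you're curious what local dimming actually involves: the backlight is split into zones, and each zone is driven only as bright as its brightest pixel demands, so mostly-dark regions can go nearly black. A rough sketch of that idea (illustrative only, not Meta's implementation):

```python
import numpy as np

def zone_backlight_levels(frame: np.ndarray, zones: tuple) -> np.ndarray:
    """Dim each backlight zone to the brightest pixel it must display.

    frame: 2D array of pixel luminance values in [0, 1].
    zones: (rows, cols) grid of independently dimmable zones.
    """
    rows, cols = zones
    h, w = frame.shape
    levels = np.zeros(zones)
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            levels[r, c] = block.max()  # dark zones stay near zero
    return levels

# A frame that's black except for one bright corner:
frame = np.zeros((64, 64))
frame[:8, :8] = 1.0
print(zone_backlight_levels(frame, (4, 4)))  # only the top-left zone lights up
```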

Quest 2 users aren't missing out on visual upgrades entirely though, as Meta recently announced that a Quest Super Resolution upscaling tool is coming to help developers make their games look and run better.
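Meta hasn't detailed how Quest Super Resolution works under the hood, but spatial upscalers generally follow the same broad recipe: render at a lower resolution, enlarge, then sharpen to recover edge detail. A toy version of that recipe (a sketch of the general technique, not Meta's algorithm):

```python
import numpy as np

def upscale_and_sharpen(img: np.ndarray, scale: int = 2) -> np.ndarray:
    """Enlarge a low-res image, then apply an unsharp mask for crispness."""
    up = img.repeat(scale, axis=0).repeat(scale, axis=1)  # naive enlarge
    # Unsharp mask: amplify the difference between the image and a blur of it.
    blur = (np.roll(up, 1, axis=0) + np.roll(up, -1, axis=0) +
            np.roll(up, 1, axis=1) + np.roll(up, -1, axis=1)) / 4
    return np.clip(up + 0.5 * (up - blur), 0.0, 1.0)

low_res = np.random.default_rng(0).random((240, 320))  # stand-in render target
print(upscale_and_sharpen(low_res).shape)  # (480, 640)
```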

This month Meta is also improving the accessibility of Quest titles by introducing button mapping and live captions. Live captions can be enabled in your Quest headset’s Settings, under the Hearing section of the Accessibility menu. Once turned on, you’ll see live subtitles while using the Meta Quest TV app, Explore, and the in-headset Quest Store. In the same Accessibility menu, go to the Mobility section and you’ll find an option to remap your Quest controllers – you can swap any buttons you want on the handsets to create a completely custom layout.
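Conceptually, a remap like this is just a lookup table between physical buttons and the inputs games receive. A minimal sketch (hypothetical button names, not Meta's settings schema):

```python
# Hypothetical remap: physical button -> button the game actually sees.
remap = {
    "A": "B", "B": "A",                    # swap A and B
    "X": "Y", "Y": "X",                    # swap X and Y
    "trigger": "grip", "grip": "trigger",
}

def resolve(physical_button: str) -> str:
    """Translate a physical press into its remapped logical button."""
    return remap.get(physical_button, physical_button)

assert resolve("A") == "B"
assert resolve("menu") == "menu"  # unmapped buttons pass through unchanged
```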

These accessibility settings won’t revolutionize your headset overnight, but they’re a great first step. Hopefully, we’ll see Meta introduce captioning to more apps and services, and perhaps it’ll launch custom accessible controllers like Sony’s PS5 Access controller and Microsoft’s Xbox Adaptive Controller.

New ways to stay connected 

Beyond these major upgrades, Meta is rolling out a handful of smaller improvements as part of update v56.

First, when you leave your headset charging on standby between play sessions, it can smartly wake up and install updates whenever it detects that your installed software is out of date. This should cut down on occasions when you go to play a game only to find you have to wait while your headset installs a patch.
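The behaviour described is essentially a three-condition check, along the lines of the following sketch (our reading of the feature, not Meta's code):

```python
def should_auto_update(is_charging: bool, on_standby: bool,
                       update_available: bool) -> bool:
    """Wake for an update only when docked, idle, and out of date."""
    return is_charging and on_standby and update_available

# Charging overnight with a patch pending: install now, not at play time.
assert should_auto_update(True, True, True)
# Mid-session or unplugged: never interrupt.
assert not should_auto_update(True, False, True)
assert not should_auto_update(False, True, True)
```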

Second is the new Chats and Parties feature. Whenever you start a call in VR, a chat thread is also created with all of the call’s members, so you can keep in contact later; you can also now start a call from a chat thread (whether it’s a one-on-one chat or a group chat).

Third, and finally, Meta is making it easier to stream your VR gameplay to Facebook, and while you play you’ll be able to see a live chat so you can keep in contact with your viewers. While the platform isn’t many people’s first choice, this hopefully opens the door to easier real-time streaming to more popular platforms like YouTube and Twitch.


Meta Builder Bot concept happily builds virtual worlds based on voice description

The Metaverse, that immersive virtual world where Meta (née Facebook) imagines we'll work, play, and interact with friends and family, is also where we may someday build entire worlds with nothing but our voice.

During an online AI development update delivered, in part, by Meta/Facebook Founder and CEO Mark Zuckerberg on Wednesday (February 23), the company offered a glimpse of Builder Bot, an AI concept that allows the user to build entire virtual experiences using their voice.

Standing in what looked like a stripped-down version of Facebook's Horizon Worlds, Zuckerberg's and a co-worker's avatars asked a virtual bot to add an island, some furniture, clouds, a catamaran, and even a boombox that could play real music in the environment. In the demonstration, the command phrasing was natural and the 3D virtual imagery appeared instantly, though it did look a bit like the graphics you'd find in Nintendo's Animal Crossing: New Horizons.

The development of Builder Bot is part of a larger AI initiative called Project CAIRaoke, an end-to-end neural model for building on-device assistants.
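Conceptually, the demo maps natural-language commands onto scene edits. Builder Bot itself relies on a neural model; the toy keyword matcher below only illustrates the command-to-scene idea:

```python
# Toy command-to-scene mapping; purely illustrative keyword matching,
# nothing like Meta's actual neural approach.
scene = []
KNOWN_OBJECTS = ("island", "furniture", "clouds", "catamaran", "boombox")

def handle_command(command: str) -> None:
    """Add every known object mentioned in the command to the scene."""
    for obj in KNOWN_OBJECTS:
        if obj in command.lower():
            scene.append(obj)

handle_command("Let's go to an island")
handle_command("Add some clouds and a catamaran")
print(scene)  # ['island', 'clouds', 'catamaran']
```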

Meta's Builder Bot concept

Mark Zuckerberg’s legless avatar and Builder Bot. (Image credit: Future)

Zuckerberg explained that current technology is not yet equipped to help us explore an immersive version of the internet that will ultimately live in the Metaverse. While that will require updates across a whole range of hardware and software, Meta believes AI is the key to unlocking advances that will lead to, as Zuckerberg put it, “a new generation of assistants that will help us explore new worlds”.

“When we’re wearing [smart] glasses, it will be the first time an AI system will be able to see the world from our perspective,” he added. A key goal here is for the AI Meta is developing to see the world as we do and, more importantly, to learn about it as we do.

It's unclear if Builder Bot will ever become a true part of the burgeoning Metaverse, but its skill with real-time language processing and understanding how parts of the environment should go together is clearly informed by the work Meta is doing.

Mark Zuckerberg talks AI translation (Image credit: Future)

Zuckerberg outlined a handful of other related AI projects, all of which will eventually feed into a Metaverse that can be accessed and used by anyone in the world.

These include “No Language Left Behind,” which, unlike traditional systems that often route through English as an intermediate step, can translate directly from the source language to the target language. There's also the very Star Trek-like “Universal Speech Translator”, which would provide instantaneous speech-to-speech translation across all languages, including those that are primarily spoken and lack a standard writing system.

“AI is going to deliver that in our lifetimes,” said Zuckerberg.
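The pivot-versus-direct distinction is easiest to see side by side. The stub function below is a hypothetical stand-in for a trained model; the sketch only shows how the two pipelines differ in shape:

```python
# Hypothetical stand-in for a trained translation model.
def translate_model(text: str, src: str, tgt: str) -> str:
    return f"[{src}->{tgt}] {text}"

def pivot_translate(text: str, src: str, tgt: str) -> str:
    """Traditional route: two hops through English, compounding errors."""
    english = translate_model(text, src, "en")
    return translate_model(english, "en", tgt)

def direct_translate(text: str, src: str, tgt: str) -> str:
    """No Language Left Behind's goal: a single source-to-target hop."""
    return translate_model(text, src, tgt)

print(pivot_translate("bonjour", "fr", "sw"))   # [en->sw] [fr->en] bonjour
print(direct_translate("bonjour", "fr", "sw"))  # [fr->sw] bonjour
```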

Mark Zuckerberg talks image abstraction (Image credit: Future)

Meta is also investing heavily in self-supervised learning (SSL) to build human-like cognition into AI systems. Instead of being trained on huge sets of labeled examples, the system is fed raw data with parts hidden and asked to predict the missing pieces. Eventually, the AI learns how to build abstract representations.
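In practice that usually means a masked-prediction objective: hide part of the raw input and train the model to reconstruct it, with no human labels involved. A bare-bones sketch of the setup (the general technique, not Meta's models):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_masked_example(data: np.ndarray, mask_fraction: float = 0.3):
    """Hide a random fraction of the input; the hidden values become targets.

    A model would be trained to predict `targets` from `visible` alone --
    the supervision signal comes from the data itself, not human labels.
    """
    mask = rng.random(data.shape) < mask_fraction
    visible = np.where(mask, 0.0, data)   # masked positions zeroed out
    targets = data[mask]                  # what the model must reconstruct
    return visible, mask, targets

signal = np.sin(np.linspace(0, 2 * np.pi, 100))
visible, mask, targets = make_masked_example(signal)
print(f"{mask.sum()} of {signal.size} values hidden for the model to predict")
```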

An AI that can understand abstraction could complete an image just from a few pieces of visual information, or generate the next frame of a video it's never seen. It could also build a visually pleasing virtual world with only your words to guide it.

For those full-on freaked out by Meta's Metaverse ambitions, Zuckerberg said that the company is building the Metaverse for everyone, and that it's “committed to build openly and responsibly” while protecting privacy and preventing harm.

It's unlikely anyone will take his word for it, but we look forward to watching the Metaverse's development.


Huawei could drop another foldable at its MWC 2020 virtual conference on February 24

Huawei is going to host a virtual press conference on February 24 from Barcelona, Spain, after the GSMA decided to cancel MWC 2020 as many companies dropped out of the event due to the spread of COVID-19 (coronavirus) in many countries across the world.

Huawei will be streaming a pre-recorded launch session where it is expected to unveil multiple products across the PC, laptop, wearables, audio and TV categories. However, according to a recent report by Gizmochina, Huawei is also expected to announce its next foldable phone during the event. It's not yet clear what this new foldable will be called, or whether it will really be unveiled on that date.

Now, we think this foldable phone could be the Huawei Mate Xs, which the company first teased back in October 2019. Significant updates in the Mate Xs are said to include the Kirin 990 5G chipset, an improved hinge design and a more durable screen.

The Mate Xs is confirmed to feature a similar design to the Mate X, with the same Leica-branded camera array. It's worth noting, though, that while the Huawei Mate X debuted in 2019, its availability is still limited to China.

With the US trade ban on Huawei currently in effect, it's unlikely that any phone it launches would have access to Google's suite of applications, or even the Play Store. As for the other products, we believe some of them could be powered by HarmonyOS.

The Mate Xs was previously scheduled to launch during MWC 2020, but with the event scrapped, Huawei seems to be going ahead with a virtual launch anyway. We'll know more about the upcoming phone as we near the conference date of February 24.
