Vision Pro spatial Personas are like Apple’s version of the metaverse without the Meta

While the initial hype over Apple Vision Pro may have died down, Apple is still busy developing and rolling out fresh updates, including a new one that lets multiple Personas work and play together.

Apple briefly demonstrated this capability when it introduced the Vision Pro and gave me my first test drive last year, but now spatial Personas are live on Vision Pro mixed-reality headsets.

To understand "spatial Personas," you need to start with the Personas part. You capture these somewhat uncanny-valley 3D representations of yourself using the Vision Pro's spatial (or 3D) cameras. The headset uses that data to build a 3D likeness that can mimic your face, head, upper torso, and hand movements, and that can be used in FaceTime and other video calls (if supported).

Spatial Personas do two key things: they let you put two (or more) avatars in one space, and they let those avatars interact with different screens, or the same one, in a spatially aware way. All of this still happens within the confines of a FaceTime call, where Vision Pro users will see a new "spatial Persona" button.

To enable this feature, you'll need the visionOS 1.1 update, and you may need to reboot the mixed-reality headset. After that, you can tap the spatial icon at any time during a FaceTime Persona call to enable the feature.

Almost together

Apple Vision Pro spatial Personas

(Image credit: Apple)

Spatial Personas support collaborative work and communal viewing experiences when combined with Apple's SharePlay.

This will let you "sit side-by-side" (Personas don't have butts, legs, or feet, so "sitting" is an assumed experience) to watch the same movie or TV show. In an Environment (you spin the Vision Pro's Digital Crown until your real world disappears in favor of a selected environment, like Yosemite), you can also play multiplayer games. Most Vision Pro owners might choose "Game Room," which positions the spatial avatars around a game table. A spatial Persona call can become a real group activity, with up to five spatial Personas participating at once.

Vision Pro also supports spatial audio, which means the audio for the Persona on your right will sound like it's coming from the right. Working in this fashion could end up feeling like everyone is in the room with you, even though they're obviously not.
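Apple doesn't document how it spatializes Persona audio (it presumably uses full head-tracked spatial audio), but the basic idea of positional stereo can be sketched with a constant-power pan law. This is a minimal illustration, not Apple's actual algorithm, and the angle convention is an assumption:

```python
import math

def stereo_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo panning.

    azimuth_deg: source direction, -90 (hard left) to +90 (hard right).
    Returns (left_gain, right_gain); the gains always satisfy
    L^2 + R^2 = 1, so perceived loudness stays constant as the
    source moves around you.
    """
    # Map -90..+90 degrees onto 0..pi/2 for the pan law.
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

# A Persona sitting to your right is heard mostly in the right channel.
left, right = stereo_gains(60.0)
```

A real spatializer would use head-related transfer functions (HRTFs) rather than simple channel gains, but the gain-per-position principle is the same.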

Currently, any app that supports SharePlay can work with spatial Personas, but not every app will allow for single-screen collaboration. If you share a window rather than the app itself, other Personas will be able to see your app window but not interact with it.

Being there

Apple Vision Pro spatial Personas

Freeform lets multiple Vision Pro spatial Personas work on the same app. (Image credit: Apple)

While your spatial Persona will appear in other people's spaces during the FaceTime call, you'll remain in control of your viewing experience: you can still move your windows and Persona to suit your needs without disrupting what others see in the shared experience.

A video Apple shared shows two spatial Personas positioned on either side of a Freeform app window, which is, in and of itself, somewhat remarkable. But things take a surprising turn when each of them reaches out with their Persona hands to control the app with gestures. That feels like a game-changer to me.

In some ways, this seems like a much more limited form of Meta CEO Mark Zuckerberg's metaverse ideal, where we live, work, and play together in virtual reality. In this case, we collaborate and play in mixed reality while using still somewhat uncanny-valley avatars. To be fair, Apple has already vastly improved the look of these things. They're still a bit jarring, but less so than when I first set mine up in February.

I haven't had a chance to try the new feature, but seeing those two floating Personas reaching out and controlling an app floating in a single Vision Pro space is impressive. It's also a reminder that it's still early days for Vision Pro and Apple's vision of our spatial computing future. When it comes to utility, the pricey hardware clearly has quite a bit of road ahead of it.

You might also like

TechRadar – All the latest technology news

Read More

The next Apple Pencil could work on the Vision Pro for spatial sketching

A rumor claims the Apple Vision Pro headset will one day support a future model of the Apple Pencil. This news comes from MacRumors, which got its information from an anonymous "source familiar with the matter," so we'll take it with a grain of salt. Details on the update are scarce, as you can imagine, but if it is indeed real, it could quite literally turn the world into your personal canvas.

The report states the upcoming Apple Pencil could be used on flat surfaces, like a desk, "complete with pressure and tilt sensitivity" to accurately display your artistic vision in one of the headset's illustration apps. Support for a stylus would require a software upgrade, "but it is unclear which version" of visionOS will see the patch. MacRumors points out that the first beta for visionOS 1.2 could come out this week with Apple Pencil support. However, nothing can be said with total confidence; we can only surmise that testing is currently ongoing internally.

No word on when the update will roll out, if at all, and it’s entirely possible this will never see the light of day. However, MacRumors seems to believe we could see something during the expected reveal of visionOS 2 at WWDC 2024 this June.

It is worth mentioning that an Apple Pencil refresh is supposed to come out alongside new iPad models very soon. Whether or not this refresh and a Vision Pro update are one and the same remains to be seen.

Analysis: Picking up the digital pen

Assuming this is all true (and fingers crossed that it is), an Apple Pencil on the Vision Pro would do wonders for achieving precise control. The hands-free control scheme is one of the main selling points for the headset. You don’t need special controllers to navigate the user interface. Thanks to an array of cameras and sensors, owners can simply use their eyes and hands to command the software. This method of navigation is fine for most things, but when it comes to drawing, it turns into a nightmare.

TechRadar’s Editor At Large Lance Ulanoff dealt with this firsthand when he tried to illustrate on the Vision Pro. He ended up calling the whole experience “insanely frustrating and difficult.” The main problem is that the gaze controls clash with the hand gestures. If your eyes move between a reference image and the digital canvas, the art piece falls apart because the headset prioritizes what you’re looking at. Then there are other problems, like the numerous bugs affecting the current slate of art apps.

The hope is that a future Apple Pencil will help keep the canvas steady, so there isn’t this weird back-and-forth between the two control methods.

If you're looking to pick up illustration as a hobby, check out TechRadar's list of the best free drawing software for 2024.


The Meta Quest 3 yoinks Vision Pro’s spatial video to help you relive your memories

Just as the Vision Pro launches, Meta has started rolling out software update v62 to its Meta Quest 3, Quest Pro, and Quest 2. The new software’s headline feature is that it’s now a lot easier to watch your spatial video recordings on Quest hardware – stealing the Vision Pro’s best feature.

You’ve always been able to view 3D spatial video (or stereoscopic video, as most people call it) on Quest hardware. And using a slightly awkward workaround, you could convert spatial video recordings made on an iPhone 15 Pro into a Quest-compatible format to watch them in 3D without needing Apple’s $3,500 Vision Pro. But, as we predicted it would, Meta has made this conversion process a lot simpler with v62.

Now you can simply upload the captured footage through the Meta Quest mobile app and Meta will automatically convert and send it to your headset – even giving the videos the same cloudy border you’d see on the Vision Pro.

You can find the recordings, and a few Meta-made demo videos, in the spatial videos section of the Files menu on your Quest headset.

iPhone 15 Pro review front flat angled handheld

You need an iPhone 15 Pro or Pro Max to record 3D video (Image credit: Future | Alex Walker-Todd)

Spatial video has been a standout feature referenced in nearly every review in our Apple Vision Pro review roundup – with our own Lance Ulanoff calling it an “immersive trip” after one of his demos with the Apple headset. So it’s almost certainly not a coincidence that Meta has announced it’s nabbed the feature literally as the Vision Pro is launching.

Admittedly, Quest spatial video isn’t identical to the Vision Pro version, as you need an iPhone 15 Pro to record it – on the Vision Pro you can use the iPhone or the headset itself – but over time there’s one potential advantage Meta’s system could have: non-exclusivity.

Given that other smartphone manufacturers are expected to launch headsets of their own in the coming year or so – such as the already teased Samsung XR headset created in partnership with Google – it’s likely the ability to record 3D video will come to non-iPhones too. 

If this happens, you’d likely be able to use whichever brand of phone you like to record 3D videos that you can then convert and watch on your Quest hardware through the Meta Quest app. Given Apple’s typical walled-garden approach, you’ll likely always need an iPhone to capture 3D video for the Vision Pro and Apple’s future headsets – and Samsung, Google, and other smartphone makers may impose some kind of walled garden of their own to lock you into their hardware.

A gif showing a person pinching their fingers to open the Quest menu

(Image credit: Meta)

Other v62 improvements 

It’s not just spatial video coming in the next Quest operating system update.

Meta has added support for a wider array of controllers – including the PS5 DualSense controller and PS4 DualShock – that you can use to play games through apps like the Xbox Cloud Gaming (Beta) or the Meta Quest Browser.

Facebook Livestreaming, after being added in update v56, is now available to all Meta Quest users. So now everyone can share their VR adventures with their Facebook friends in real-time by selecting “Go Live” from the Camera icon on the Universal Menu while in VR (provided your Facebook and Meta accounts are linked through the Accounts Center). 

If you prefer YouTube streaming, it’s now possible to see your chat while streaming without taking the headset off provided you’re using OBS software.

Lastly, Meta is improving its hand-tracking controls so you can quickly access the Universal Menu by looking at your palm and doing a short pinch. Doing a long pinch will recenter your display. You can always go back to the older Quick Actions Menu by going into your Settings, searching for Expanded Quick Actions, and turning it back on.


Seeing your own spatial video on Vision Pro is an immersive trip – and I highly recommend it

Every experience I have with Apple's Vision Pro mixed reality headset is the same as the last and yet also quite different. I liken it to peeling an onion: I think I understand the feel and texture of it, but each time I notice new gradations and even flavors that remind me that I still don't fully know Apple's cutting-edge wearable technology.

For my third go-around wearing the Vision Pro, I had the somewhat unique experience of viewing my own content through the powerful and pricey ($3,499 when it ships next year) headset.

A few weeks ago, Apple dropped a beta for iOS 17.2, which added Spatial Video capture to the iPhone 15 Pro and iPhone 15 Pro Max (the full version landed this week). It's a landscape-only mode video format that uses the 48MP main and 12MP Ultrawide cameras to create a stereo video image. I started capturing videos in that format almost immediately, but with the caveat that not every video is worthy of this more immersive experience (you can't be too far away from your subject, and keeping the phone level and steady helps). Still, I had a solid nine clips that I brought with me for my second and by far more personal Vision Pro Spatial Video experience.

I tried, during this third Vision Pro trial, to pay more attention to some of the headset's setup and initialization details. As I've mentioned previously, the Vision Pro is one of Apple's more bespoke hardware experiences. If you wear glasses, you will need to pay extra for a pair of custom-made Zeiss lens inserts – I provided my prescription details in advance of this test run. It's not clear how long consumers might have to wait for their own inserts (could Apple have an express optician service in the back of each Apple Store? Doubtful).

Not everyone will need those lenses, or have to endure that extra cost and wait. If you don't wear glasses, you're ahead of people like me, and likewise if you're a contact lens wearer.

Man using Apple Vision Pro

Not me wearing the Vision Pro, because Apple still won’t allow me to photograph myself wearing them. That said, pressing the digital crown is part of the initial setup process (Image credit: Apple)

Getting the custom experience right

Still, there are other customizations that I didn't pay attention to until now. The face cushion that rests against your face and connects magnetically to the main part of the Vision Pro comes in a few different curve styles to accommodate the differing contours of a range of typical human faces. I don't know how many different options Apple will offer.

One thing that's critical for a comfortable AR and VR experience is matching your eyes' pupillary distance – the distance between the centers of your pupils. This was the first time I paid attention to one of the first steps in my Vision Pro setup. After I long-pressed the headset's Digital Crown, a pair of large green shapes appeared before my eyes. They measured my eyes' positions inside the Vision Pro, and then the dual micro-OLED displays, with their combined 23 million pixels of imagery, moved to match that spacing. If you listen carefully, you might be able to hear the mechanics doing their job.
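What that measurement accomplishes can be sketched in a few lines. The default spacing below is an illustrative assumption (Apple doesn't publish the mechanism's specifics); the idea is simply to center each display on the corresponding pupil:

```python
# Hypothetical default spacing for illustration only; 63 mm is a
# commonly cited average adult interpupillary distance (IPD).
DEFAULT_SPACING_MM = 63.0

def ipd(left_pupil_x_mm: float, right_pupil_x_mm: float) -> float:
    """Interpupillary distance from the measured pupil-center positions."""
    return abs(right_pupil_x_mm - left_pupil_x_mm)

def per_display_shift(measured_ipd_mm: float,
                      default_spacing_mm: float = DEFAULT_SPACING_MM) -> float:
    """How far each display must move outward (+) or inward (-) from the
    default position so both screens end up centered on the pupils."""
    return (measured_ipd_mm - default_spacing_mm) / 2.0

# A wearer with a 67 mm IPD: each display motors 2 mm outward.
shift = per_display_shift(67.0)
```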

I also noted how the Vision Pro runs me through three distinct sets of eye-tracking tests, where I looked at a ring of dots and, for each one, pinched my index finger and thumb together to select them. It might feel tedious to do this three times (okay, it did) but it's a critical step that ensures the Vision Pro's primary interaction paradigm works perfectly every time.

Now, at my third wearing, I've become quite an expert at the looking and pinching thing. A gold star for me.

The Apple Vision Pro headset on a grey background

This cushion is magnetic, and detaches so you can get one that better fits your face. The band also detaches when you pull on a small, bright orange tab (Image credit: Apple)

Spatial computing is kind of familiar

Las Vegas panorama

Can you find me in this photo? (Image credit: Lance Ulanoff)

We AirDropped my spatial video and panorama shots from a nearby phone. It was nice to see how smoothly AirDrop works on the Vision Pro – I saw that someone was trying to AirDrop the content and simply looked at 'Accept' and then pinched my thumb and finger. Within seconds, the content was in my Photos library (spatial video gets its own icon).

When Apple's panorama photography was new in iOS 6, I took a lot of panoramic photos. I was tickled by the torn humans who moved too fast in the shot, and the ability to have someone appear twice in one trick panoramic photo. Apple has mostly cleared up the first issue – I noticed that fewer of my recent panos feature people with two heads. These days, though, I take very few panos and only had four decent ones to try with the Vision Pro.

Even with just a few samples, though, I was startled by the quality and immersive nature of the images. My favorite by far was the photo I took earlier this year from my CES 2023 hotel room with an iPhone 14 Pro. Taking these shots is something of a ritual. I like to see what the view and weather are like in Las Vegas, and usually share something on social media to remind people that I'm back at CES.

It would not be an exaggeration to say that this one shot, taken from fairly high up at the Planet Hollywood Hotel, was a revelation – not just because the vista, which wrapped almost all the way around my head, was gorgeous, but because, for the first time, I noticed on the far-right side of the image a complete reflection of me taking the photo. It's a detail I never noticed when looking at the pano on my phone, and there's something incredibly weird about unexpectedly spotting yourself in an immersive environment like that.

A vista from Antigua was similarly engaging. The clarity and detail overall, which is a credit to iPhone 14 Pro and iPhone 15 Pro Max photography, is impressive. I viewed most of my panos in immersive mode, but could, by using a pinch-and-push gesture with both hands, put the panoramic image back in a windowed view.

Spatial view

Train spatial video

I promise you, this is much cooler when viewed on the Vision Pro (Image credit: Lance Ulanoff)

In preparation for my spatial video experience, I shot videos of Thanksgiving dinner, Dickensian carollers, walks in the park, model trains, and interactions with a friend's four-year-old.

Each of these videos hit me a little differently, and all of them in immersive mode shared a few key features. You can view spatial video on the Vision Pro in a window, but I preferred the immersive style, which erases the borders and delivers each video in almost a cloud. Instead of hard edges, each 3D video fades away at the borders, so there's no clear delineation between the real world and the one floating in front of your face. This does reduce the field of view a bit, especially the vertical height and depth – when I viewed the spatial videos on my iPhone (on which they look like regular, flat videos), I could see everything I captured from edge to edge, while in immersive mode on the Vision Pro, some of the details got lost to the top and bottom of the ether.

With my model train videos, the 3D spatial video effect reminded me of the possibly apocryphal tale of early cinema audiences who, upon seeing a film of an oncoming train, ran screaming from the theater. I wouldn't say my video was that intense, but my model train did look like it was about to ride right into my lap.

I enjoyed every video, and while I did not feel as if I was inside any of them, each one felt more real, and whatever emotions I had watching them were heightened. I suspect that when consumers start experiencing the Vision Pro and spatial videos for themselves they might be surprised at the level of emotion they experience from family videos – it can be quite intense.

It was yet another short and seated experience, and I'm sure I didn't press the endurance of the Vision Pro's external two-hour battery pack. I did notice that if I were about to, say, work a full day, watch multiple two-hour movies, or go through a vast library of spatial videos, I could plug a power-adapter-connected cable right into the battery pack's available USB-C port.

I still don't know if the Apple Vision Pro is for everyone, but the more I use it, and the more I learn about it, the more I'm convinced that Apple is set to trigger a seismic shift in our computing experience. Not everyone will end up buying Vision Pro, but most of us will feel its impact.


I tried the iPhone 15 Pro’s new spatial video feature, and it will be the Vision Pro’s killer app

I’ve had exactly two Apple Vision Pro experiences: one six months ago, on the day Apple announced its mixed reality headset, and the other just a few hours ago. And where with the first experience I felt like I was swimming across the surface of the headset’s capabilities, today I feel like I’m qualified as a Vision Pro diver. I mean, how else am I expected to feel after not only experiencing spatial video on the Vision Pro, but also shooting this form of video for the headset with a standard iPhone 15 Pro?

By now, you probably know that iOS 17.2, which Apple released today as a public beta, will be the first time most of us gain access to spatial video. Granted, initially it will only be half the experience. With the update, your iPhone 15 Pro and iPhone 15 Pro Max gain a new videography option that you can toggle under Camera Formats in Settings. Once the Vision Pro ships, sometime next year, the format will turn on automatically for Vision Pro owners who have connected the mixed-reality device to their iCloud accounts.

I got a sneak peek not only at the new iPhone 15 Pro capabilities, but at what the life-like content looks like viewed on a $3,499 Apple Vision Pro headset – and I now realize that spatial video could be the Vision Pro’s killer app.

A critical iPhone design tweak

Apple Vision Pro spatial video

(Image credit: Apple)

To understand how Apple has been playing the long game with its product development, you need only look at your iPhone 15 Pro or iPhone 15 Pro Max, where you’ll find a subtle design and functional change that you likely missed, but which is obviously all about the still-unreleased Vision Pro. Apple designed the iPhone 15 Pro and Pro Max with the Vision Pro's spatial needs in mind: the 13mm ultrawide camera moved from its iPhone 14 Pro position, diagonally opposite the 48MP main camera, to the spot vertically aligned with the main camera – the spot the telephoto camera occupied on the 14 Pro – while the telephoto camera moved into the ultrawide's old slot.

By repositioning these two lenses, Apple makes it possible to shoot stereoscopic or spatial video, but only when you hold the iPhone 15 Pro and iPhone 15 Pro Max in landscape mode.

It is not, I learned, just a matter of recording video through both lenses at once and shooting slightly different angles of the same scene to create the virtual 3D effect. Since the 13mm ultrawide camera shoots a much larger frame, Apple’s computational photography must crop and scale the ultrawide video to match the frames coming from the main camera.
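Apple hasn't detailed its pipeline, but the crop step follows from pinhole-camera geometry: field of view scales inversely with focal length, so matching the main camera means keeping roughly the central 13/24 of the ultrawide frame, then rescaling it to the output resolution. A sketch under those assumptions (24mm is the iPhone's advertised main-camera equivalent focal length):

```python
def matched_crop(frame_w: int, frame_h: int,
                 wide_focal_mm: float = 13.0,
                 main_focal_mm: float = 24.0) -> tuple[int, int, int, int]:
    """Centered crop window (x, y, w, h) that trims an ultrawide frame
    down to the main camera's narrower field of view.

    Pinhole assumption: the visible extent scales with 1/focal_length,
    so the crop is frame_size * (wide_focal / main_focal). The cropped
    region would then be rescaled to the shared output resolution
    (e.g. 1920x1080) so both streams line up frame for frame.
    """
    crop_w = round(frame_w * wide_focal_mm / main_focal_mm)
    crop_h = round(frame_h * wide_focal_mm / main_focal_mm)
    x = (frame_w - crop_w) // 2
    y = (frame_h - crop_h) // 2
    return x, y, crop_w, crop_h

# For a hypothetical 4K ultrawide capture, keep the central ~54%:
window = matched_crop(3840, 2160)  # → (880, 495, 2080, 1170)
```

Real camera matching also corrects for lens distortion and per-unit calibration, which this sketch ignores.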

To simplify matters, Apple is only capturing two 1080p/30fps video streams in HEVC (High Efficiency Video Coding) format. Owing to the dual streams, the file size is a bit larger: about 130MB for one minute of video.
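That quoted figure implies a combined bitrate you can sanity-check with a little arithmetic (treating MB as 10^6 bytes):

```python
MB_PER_MINUTE = 130  # Apple's quoted figure for one minute of spatial video

bits_per_minute = MB_PER_MINUTE * 8 * 1_000_000
combined_mbps = bits_per_minute / 60 / 1_000_000  # both streams together
per_stream_mbps = combined_mbps / 2               # each 1080p/30fps stream

# ~17.3 Mbit/s combined, ~8.7 Mbit/s per stream: a plausible HEVC
# bitrate for 1080p30, consistent with the "dual stream" explanation.
```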

Even though these spatial files are ostensibly a new media format, they will appear like any other 2D video file on your iPhone or Mac. However, there will be limits. You can trim one of these videos, but you can’t apply any other edits, lest you break the perfect synchronization between the two streams.

The shoot

Apple Vision Pro spatial video

Spatial video capture arrives on the iPhone 15 Pro and 15 Pro Max with the iOS 17.2 public beta update, which anyone can download today (you have to change your settings to accept beta updates). Note that you’ll only be shooting horizontal spatial video (Image credit: Apple)

For my test, I used a standard iPhone 15 Pro running the iOS 17.2 developer beta. We had already enabled Spatial Video for Apple Vision Pro under Settings > Camera > Formats. In the camera app's video capture mode, I could select a tiny icon that, naturally, looks just like the Vision Pro to shoot in spatial video mode.

When I selected that, the phone guided me to rotate the phone 90 degrees so it was in landscape orientation (the Vision Pro icon rotates as soon as you tap it). I also noticed that the image level tool, which is optional for all formats, is on by default when you use spatial video. This is because spatial videos are best when shot level. In fact, shooting them in situations where you know you might not be able to keep the phone level, like an action shot, could be a bad idea. Mostly this is about what it will feel like to watch the finished product in the Vision Pro headset – lots of movement in a 3D video a few centimeters from your face might induce discomfort.

Similarly, I found that it’s best to keep between three and eight feet from your subject, so they don’t end up appearing like giants in the final spatial video.
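That distance guidance follows from stereo geometry: the 3D effect comes from disparity between the two streams, which falls off with distance from the subject. Here's a sketch using the classic pinhole-stereo relation; the baseline and focal-length numbers are illustrative guesses, not published specs:

```python
def disparity_px(baseline_m: float, focal_px: float, depth_m: float) -> float:
    """Stereo disparity in pixels for a point at depth_m metres.

    disparity = focal_px * baseline_m / depth_m, the standard
    pinhole-stereo relation: disparity (and hence perceived depth
    separation) shrinks in inverse proportion to distance.
    """
    return focal_px * baseline_m / depth_m

BASELINE_M = 0.015  # assumed ~15 mm spacing between the two iPhone lenses
FOCAL_PX = 1500.0   # assumed focal length of the matched 1080p streams

near = disparity_px(BASELINE_M, FOCAL_PX, 1.0)   # ~3 ft: strong depth cue
far = disparity_px(BASELINE_M, FOCAL_PX, 10.0)   # ~33 ft: almost flat
```

Too close and the disparity (and apparent scale) becomes exaggerated; too far and it vanishes, which is roughly why a three-to-eight-foot range works best.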

I shot a couple of short spatial videos of a woman preparing sushi. I tried to put the sushi in the foreground and her in the background to give the scene some depth. Nothing about shooting the video felt different from any others I’ve shot, though I probably overthought it a bit as I was trying to create a pair of interesting spatial videos.

Even though the iPhone is jumping through a bunch of computational hoops to create Spatial Video out of what you shoot, you should be able to play the video back instantly. We handed over our phones and then, a few minutes later, we were ready to view our videos in the Vision Pro.

Hello, my old friend

Apple Vision Pro spatial video

(Image credit: Apple)

While I was worried that after all these months, I wouldn’t remember how to use the Vision Pro, it really only took me a moment or two to reorient myself to its collection of gaze, gesture, and Digital Crown-based controls. It remains a stunningly intuitive piece of bleeding-edge tech. I still needed to hand over my glasses for a prescription measurement so we could make sure Apple inserted the right Zeiss lenses (you don’t wear glasses when using the headset). It’s a reminder that, unlike an iPhone, the Vision Pro will be a somewhat bespoke experience.

For this second wear session, I did not have the optional over-the-head strap, which meant that, for the first time, I felt the full weight of the headgear. I did my best to adjust the headband using a control knob near the back of the headset while being careful not to over-tighten it, but I’m not sure I ever found that sweet spot (note to self: get the extra headband when you do finally get to review one of these headsets).

There were some new controls since I last tried the Vision Pro – for example, I could now resize windows by looking over at the edge of a window and then by virtually pinching and pulling the white curve that appears right below it. I got this on the second try, and then it became second nature.

I finally got a good look at the Vision Pro Photos app, which was easy to navigate using my gaze and finger taps – you pinch and pull with either hand to swipe through photos and galleries. I usually kept my hands in or near my lap when performing these gestures. I looked at photos shot with the iPhone 15 Pro at 24MP and 48MP. It was fun to zoom into those photos, so they filled my field of view, and then pinch and drag to move around the images and see some of the exquisite detail in, for instance, the lace on a red dress.

I got a look at some incredible panorama shots, including one from Monument Valley in Arizona and another from Iceland, which featured a frozen waterfall, and which virtually wrapped all the way around me. As I noted in my original Vision Pro experience, there’s finally a reason to take panoramic photos with your iPhone.

Head spatial

Apple Vision Pro spatial video

This spatial video scene was one of the most effective. Those bubbles appeared to float right by my face (Image credit: Apple)

Inside the Vision Pro Photos app is a new media category called Spatial. This is where I viewed some canned spatial videos and, finally, the pair of spatial videos I shot on the iPhone 15 Pro. There was the campfire scene I saw during my WWDC 2023 experience, a birthday celebration, an intimate scene of a family camping, another of a family cooking in a kitchen, and, my favorite, a mother and child playing with bubbles.

You can view these spatial videos in a window or full-screen, where the edges blend with either your passthrough view or your immersive environment – a 360-degree wraparound image that replaces your real world (a new environment is Joshua Tree). In the bubble video, the bubbles appeared to be floating both in the scene and closer to my face; I had the impulse to reach out and touch them.

In the kitchen scene, where the family is sitting around a kitchen island eating and the father is in the background cooking, the 3D effect initially makes the father look like a tiny man. When he turned and moved closer to his family, the odd effect disappeared.

It’s not clear how spatial video shot on the iPhone 15 Pro handles focal points, and whether it defaults to a long depth of field or uses something different for the 3D effect. You can set the focus point by tapping your iPhone's screen during a spatial video shoot, but you can't change it in editing.

My two short videos were impressive, if I do say so myself. During the shoot, I did my best to put one piece of sushi the chef held up to me in the foreground, and in the final result, I got exactly the effect I was hoping for. The depth is interesting, and not overbearing or jarring. Instead, the scene looks exactly as I remember it, complete with that lifelike depth. That’s not possible with traditional videography.

What I did not do was stand up and move closer to the spatial videos. Equally, these are not videos you can step into and move around. You're still only grabbing two slightly different videos to create the illusion of depth.

In case you’re wondering, the audio is captured too, and this sounded perfectly normal. I didn't notice any sort of spatial effect, but these videos were not shot with audio sources that spanned the distance of a room.

Apple Vision Pro example

In this sample provided by Apple, you can see how the candle smoke appears to float toward you – it’s a trippier effect when you’re wearing the Vision Pro headset (Image credit: Apple)

What’s next?

Because you’ll have spatial video shooting capabilities when you install the iOS 17.2 public beta, you could be shooting a lot of spatial video between now and when Apple finally starts selling the Vision Pro to consumers. These videos will look perfectly normal – but imagine having a library of spatial videos to swipe through when you do finally buy the Vision Pro. That, and the fact that your panoramas will look stunning on the device, may finally be the reason you buy Apple's headset.

Naturally, the big stumbling block here is price. Apple plans on charging $3,499 (around £2,800 / AU$5,300) for the Vision Pro, not including the head strap accessory, which, as mentioned, you’ll probably need. That means that while millions may own iPhone 15 Pros and be able to shoot spatial video, precious few will be able to watch them on a Vision Pro.

Perhaps Apple will make the Vision Pro part of one of its financing plans, so that people can pay it off with a monthly fee. There might also be discounts if you buy an iPhone 15 Pro. Maybe not. Whatever Apple does, spatial video may make the most compelling case yet for, if not owning a Vision Pro, then at least wishing you did.


Your iPhone 15 Pro can now capture spatial video for the Vision Pro

When Apple unveiled the Vision Pro headset, it explained that you’d be able to capture so-called spatial videos for the device using an iPhone 15 Pro or iPhone 15 Pro Max. Yet you’ve not been able to do that with either of the best iPhones currently available – until now.

That’s because Apple has just launched iOS 17.2 beta 2, and with it comes the ability to record 3D spatial videos. That means you can start prepping videos for the headset using just your iPhone and its main and ultrawide cameras; no fancy equipment necessary.

Of course, you can’t actually see these videos in their intended 3D environment yet, because the Vision Pro hasn’t launched – it’s not expected until some time in early 2024.

But what you can do is start filming videos ready to be used in 3D apps built using Apple’s frameworks, like RealityKit. So, if you’ve got your heart set on building a Vision Pro app that integrates 3D video, you can get started more or less right away.

A taste of things to come

Apple Vision Pro spatial video

(Image credit: Apple)

To enable spatial video capture on an iPhone, you’ll obviously need to be running iOS 17.2 beta 2. Once you are, open the Settings app and select Camera, then enable the Spatial Video for Apple Vision Pro toggle.

Now, the next time you open the Camera app, there will be a Spatial option for video recording. Just start filming and your iPhone will do all the necessary legwork to make the video 3D-enabled.

As spotted by 9to5Mac, Apple says video captured in this way will be recorded at 1080p and 30fps, and that a minute of spatial footage filmed this way will take up around 130MB of space. Better make sure you have plenty of free storage before you start.

When the Vision Pro eventually makes it onto shelves, you’ll also be able to capture videos using the headset itself, too. For now, though, you’re limited to a recent high-end iPhone, but it seems to be a taste of something greater.
