Google has announced that big changes are coming to Google Drive, its cloud-based file storage platform. This latest series of tweaks to the popular cloud storage service comes mainly in the form of a shiny new landing page, but there’s an extra treat in store for iOS users.
The new homepage (aptly named the ‘Home’ view) will become the default landing page for every Drive user when it rolls out over the next couple of months – though you’ll be able to swap back to the old view if you prefer. In a blog post explaining the changes, Google says that Home will be “streamlined” compared to the standard My Drive landing page, designed to make it “easier and faster for you to find files that matter most”.
To that end, the Home screen will include personalized suggestions that use AI to learn which files and folders you access regularly (or documents that are tied to upcoming events in your Google calendar). It’ll also include new ‘search chips’ that make filtering your files easier, and will employ Google’s Material Design 3 guidelines for a (hopefully) more modern and user-friendly look.
That’s not all, folks
Google isn’t stopping there, either. A long-awaited Drive feature is finally coming to iPhone and iPad: the document scanner, which uses your device’s camera to take high-quality scans of physical documents and convert them to PDFs, with the ability to scan multiple pages in succession to produce multi-page documents.
The feature has been available to Drive users on Android for a while now, so it’s good to see that Google isn’t planning on leaving iPhone owners out in the cold. The document scanner (which was recently upgraded for Android users) will also use machine learning to suggest names for your scanned documents – recognizing a receipt from a store, for example, and giving it an appropriate filename.
The scanner feature is rolling out to iOS and iPadOS users now, so if you’ve got an Apple device you can expect to have it soon if you don’t already. The updated Google Drive homepage will be arriving at a slower pace, with early access starting now and a wide release for personal users from January 15 next year.
I’m personally a little dubious about an AI-powered homepage for Drive – ‘suggested content’ in the software I use has rarely been useful in my experience, AI-assisted or not. But thankfully Google has already confirmed that users will get an instant pop-up asking if they’d like to swap their default view back to the old My Drive page, so it’s not like this change is being forced on us.
You can now get Windows on your Apple iPhone, iPad, or Mac – sort of – with Microsoft’s latest innovation for its operating system, although this currently comes with a sizable catch (more on that later).
The application allows you to stream a Windows 11 desktop from a remote PC to your Apple device (or indeed another Windows device, or anything with a browser). Or alternatively you can stream a Windows 365 instance, or other options like Azure Virtual Desktop.
The Windows App is essentially a hub to facilitate streaming whichever instances you want to a given device. It packs support for multi-monitor setups, plus device redirection, which lets you use hardware like printers, webcams, and speakers hooked up to the device the app is running on.
The Windows App is currently in preview – so expect potential flakiness – and available for iOS, iPadOS, macOS, and Windows itself. Those streaming a Windows desktop instance via a web browser don’t have to install any software at all.
Analysis: A business move, but that could change
So, that catch we mentioned: as well as being in beta, the Windows App only works with Microsoft business accounts, not personal accounts – yet.
But as The Verge, which picked up on the app’s release, points out, the login screen on the Windows version of the Windows App seemingly has an option to use a personal Microsoft account – it just doesn’t work yet.
That’s not exactly surprising given that this is a beta, which is the other caveat here – not everything will necessarily work properly yet. Still, it’s more than a hopeful suggestion that consumers will eventually be able to use the app and stream a remote PC to their Apple (or other) device, come release.
Of course, another omission here is the lack of Android support, and presumably that’s something else that will be in the pipeline.
I’ve had exactly two Apple Vision Pro experiences: one six months ago, on the day Apple announced its mixed reality headset, and the other just a few hours ago. And where with the first experience I felt like I was swimming across the surface of the headset’s capabilities, today I feel like I’m qualified as a Vision Pro diver. I mean, how else am I expected to feel after not only experiencing spatial video on the Vision Pro, but also shooting this form of video for the headset with a standard iPhone 15 Pro?
By now, you probably know that iOS 17.2, which Apple released today as a public beta, will be the first time most of us will gain experience with spatial video. Granted, initially it will only be half the experience. Your iPhone 15 Pro and iPhone 15 Pro Max will, with the iOS 17.2 update, add a new videography option that you can toggle under Camera Formats in Settings. Once the Vision Pro ships, sometime next year, the format will turn on automatically for Vision Pro owners who have connected the mixed reality device to their iCloud accounts.
I got a sneak peek at not only the new iPhone 15 Pro capabilities, but at what the lifelike content looks like viewed on a $3,499 Apple Vision Pro headset – and I now realize that spatial video could be the Vision Pro’s killer app.
A critical iPhone design tweak
To understand how Apple has been playing the long game with its product development, you need only look at your iPhone 15 Pro or iPhone 15 Pro Max, where you’ll find a subtle design and functional change that you likely missed, but which is obviously all about the still-unreleased Vision Pro. It turns out Apple designed the iPhone 15 Pro and Pro Max with the Vision Pro's spatial needs in mind. The 13mm ultrawide camera has moved from its iPhone 14 Pro position, diagonally opposite the 48MP main camera, to the spot vertically below the main camera – a slot occupied on the 14 Pro by the telephoto camera, which in turn moves to the ultrawide's old position.
By repositioning these two lenses, Apple makes it possible to shoot stereoscopic or spatial video, but only when you hold the iPhone 15 Pro and iPhone 15 Pro Max in landscape mode.
It is not, I learned, just a matter of recording video through both lenses at once and shooting slightly different angles of the same scene to create the virtual 3D effect. Since the 13mm ultrawide camera shoots a much larger frame, Apple’s computational photography must crop and scale the ultrawide video to match the frames coming from the main camera.
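Apple hasn’t published the details of that pipeline, but the general idea is easy to sketch. Below is a minimal, hypothetical Core Image example of the crop-and-scale step: the wider 13mm frame is centre-cropped to roughly the main camera’s field of view and scaled back up to matching pixel dimensions. The overlapFraction parameter is an assumption standing in for the real ratio between the two lenses’ fields of view.

```swift
import CoreGraphics
import CoreImage

// Hypothetical sketch of the crop-and-scale step – Apple's actual pipeline
// is not public. The ultrawide frame is centre-cropped to the region that
// overlaps the main camera's narrower field of view, then scaled back up
// to the main camera's pixel width.
func matchUltrawide(_ ultrawide: CIImage,
                    toFrameOf main: CIImage,
                    overlapFraction: CGFloat) -> CIImage {
    let full = ultrawide.extent
    let cropSize = CGSize(width: full.width * overlapFraction,
                          height: full.height * overlapFraction)
    let cropRect = CGRect(x: full.midX - cropSize.width / 2,
                          y: full.midY - cropSize.height / 2,
                          width: cropSize.width,
                          height: cropSize.height)
    let cropped = ultrawide.cropped(to: cropRect)
    let scale = main.extent.width / cropped.extent.width
    return cropped.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
}
```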
To simplify matters, Apple is only capturing two 1080p/30fps video streams in HEVC (high-efficiency video coding) format. Owing to the dual stream, the file size is a bit larger, creating a 130MB file for about one minute of video.
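For what it’s worth, those numbers hang together. A quick back-of-the-envelope calculation, using only the figures quoted above, puts the combined bitrate at roughly 17 Mbit/s, or around 8.7 Mbit/s per 1080p stream – plausible territory for HEVC:

```swift
import Foundation

// Back-of-the-envelope check using only the figures quoted above:
// ~130MB for about one minute of dual-stream 1080p/30fps HEVC video.
let fileSizeMB = 130.0
let durationSeconds = 60.0
let totalMbps = fileSizeMB * 8.0 / durationSeconds   // ≈ 17.3 Mbit/s combined
let perStreamMbps = totalMbps / 2.0                  // ≈ 8.7 Mbit/s per stream
print(String(format: "≈%.1f Mbit/s total, ≈%.1f Mbit/s per stream",
             totalMbps, perStreamMbps))
```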
Even though these spatial files are ostensibly a new media format, they will appear like any other 2D video file on your iPhone or Mac. However, there will be limits. You can trim one of these videos, but you can’t apply any other edits, lest you break the perfect synchronization between the two streams.
For my test, I used a standard iPhone 15 Pro running the iOS 17.2 developer beta. We had already enabled Spatial Video for Apple Vision Pro in Settings under Camera > Formats. In the camera app's video capture mode, I could select a tiny icon that, naturally, looks just like the Vision Pro to shoot in Spatial Video mode.
When I selected that, the phone guided me to rotate the phone 90 degrees so it was in landscape orientation (the Vision Pro icon rotates as soon as you tap it). I also noticed that the image level tool, which is optional for all formats, is on by default when you use spatial video. This is because spatial videos are best when shot level. In fact, shooting them in situations where you know you might not be able to keep the phone level, like an action shot, could be a bad idea. Mostly this is about what it will feel like to watch the finished product in the Vision Pro headset – lots of movement in a 3D video a few centimeters from your face might induce discomfort.
Similarly, I found that it’s best to keep between three and eight feet from your subject, so they don’t end up appearing like giants in the final spatial video.
I shot a couple of short spatial videos of a woman preparing sushi. I tried to put the sushi in the foreground and her in the background to give the scene some depth. Nothing about shooting the video felt different from any others I’ve shot, though I probably overthought it a bit as I was trying to create a pair of interesting spatial videos.
Even though the iPhone is jumping through a bunch of computational hoops to create Spatial Video out of what you shoot, you should be able to play the video back instantly. We handed over our phones and then, a few minutes later, we were ready to view our videos in the Vision Pro.
Hello, my old friend
While I was worried that after all these months, I wouldn’t remember how to use the Vision Pro, it really only took me a moment or two to reorient myself to its collection of gaze, gesture, and Digital Crown-based controls. It remains a stunningly intuitive piece of bleeding-edge tech. I still needed to hand over my glasses for a prescription measurement so we could make sure Apple inserted the right Zeiss lenses (you don’t wear glasses when using the headset). It’s a reminder that, unlike an iPhone, the Vision Pro will be a somewhat bespoke experience.
For this second wear session, I did not have the optional over-the-head strap, which meant that, for the first time, I felt the full weight of the headgear. I did my best to adjust the headband using a control knob near the back of the headset while being careful not to over-tighten it, but I’m not sure I ever found that sweet spot (note to self: get the extra headband when you do finally get to review one of these headsets).
There were some new controls since I last tried the Vision Pro – for example, I could now resize windows by looking at the edge of a window and then virtually pinching and pulling the white curve that appears right below it. I got this on the second try, and then it became second nature.
I finally got a good look at the Vision Pro Photos app, which was easy to navigate using my gaze and finger taps – you pinch and pull with either hand to swipe through photos and galleries. I usually kept my hands in or near my lap when performing these gestures. I looked at photos shot with the iPhone 15 Pro at 24MP and 48MP. It was fun to zoom into those photos so they filled my field of view, and then pinch and drag to move around the images and see some of the exquisite detail in, for instance, the lace on a red dress.
I got a look at some incredible panorama shots, including one from Monument Valley in Arizona and another from Iceland, which featured a frozen waterfall, and which virtually wrapped all the way around me. As I noted in my original Vision Pro experience, there’s finally a reason to take panoramic photos with your iPhone.
Inside the Vision Pro Photos app is a new media category called Spatial. This is where I viewed some canned spatial videos and, finally, the pair of spatial videos I shot on the iPhone 15 Pro. There was the campfire scene I saw during my WWDC 2023 experience, a birthday celebration, an intimate scene of a family camping, another of a family cooking in a kitchen, and, my favorite, a mother and child playing with bubbles.
You can view these spatial videos in a window or full-screen, where the edges blend with either your passthrough view or an immersive environment (Joshua Tree is a new one) that replaces your real world with a 360-degree wraparound image. In the bubble video, the bubbles appeared to be floating both in the scene and closer to my face; I had the impulse to reach out and touch them.
In the kitchen scene, where the family is sitting around a kitchen island eating and the father is in the background cooking, the 3D effect initially makes the father look like a tiny man. When he turned and moved closer to his family, the odd effect disappeared.
It’s not clear how spatial video shot on the iPhone 15 Pro handles focal points, or whether it defaults to a long depth of field or uses something different for the 3D effect. You can set the focus point by tapping your iPhone's screen during a spatial video shoot, but you can't change it in editing.
My two short videos were impressive, if I do say so myself. During the shoot, I did my best to put one piece of sushi the chef held up to me in the foreground, and in the final result, I got exactly the effect I was hoping for. The depth is interesting, and not overbearing or jarring. Instead, the scene looks exactly as I remember it, complete with that lifelike depth. That’s not possible with traditional videography.
What I did not do was stand up and move closer to the spatial videos. Equally, these are not videos you can step into and move around. You're still only grabbing two slightly different videos to create the illusion of depth.
In case you’re wondering, the audio is captured too, and this sounded perfectly normal. I didn't notice any sort of spatial effect, but these videos were not shot with audio sources that spanned the distance of a room.
Because you’ll have spatial video shooting capabilities when you install the iOS 17.2 public beta, you could be shooting a lot of spatial video between now and when Apple finally starts selling the Vision Pro to consumers. These videos will look perfectly normal – but imagine having a library of spatial video to swipe through when you do finally buy the Vision Pro. That, and the fact that your panoramas will look stunning on the device, may finally be the reason you buy Apple's headset.
Naturally, the big stumbling block here is price. Apple plans on charging $3,499 (around £2,800 / AU$5,300) for the Vision Pro, not including the head strap accessory, which, as mentioned, you’ll probably need. That means that while millions may own iPhone 15 Pros and be able to shoot spatial video, precious few will be able to watch them on a Vision Pro.
Perhaps Apple will make the Vision Pro part of one of its financing plans, so that people can pay it off with a monthly fee. There might also be discounts if you buy an iPhone 15 Pro. Maybe not. Whatever Apple does, spatial video may make the most compelling case yet for, if not owning a Vision Pro, then at least wishing you did.
Apple has just launched iOS 17.2 beta 2, and with it comes the ability to record 3D spatial videos. That means you can start prepping videos for the headset using just your iPhone and its main and ultrawide cameras; no fancy equipment necessary.
Of course, you can’t actually see these videos in their intended 3D environment yet, because the Vision Pro hasn’t launched – it’s not expected until some time in early 2024.
But what you can do is start filming videos ready to be used in 3D apps built using Apple’s frameworks, like RealityKit. So, if you’ve got your heart set on building a Vision Pro app that integrates 3D video, you can get started more or less right away.
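As a rough illustration of what that integration might look like, here’s a minimal RealityKit sketch that plays a video file on an entity using the existing VideoMaterial API. Note that this is ordinary 2D playback only – the dedicated stereoscopic playback path on the Vision Pro may well use different, headset-specific APIs that Apple hasn’t fully documented yet.

```swift
import AVFoundation
import RealityKit

// Minimal sketch: play a video on a RealityKit entity via VideoMaterial.
// This is plain 2D playback; the Vision Pro's stereoscopic playback path
// may rely on separate, visionOS-specific APIs.
func makeVideoEntity(url: URL) -> ModelEntity {
    let player = AVPlayer(url: url)  // e.g. a clip shot on an iPhone 15 Pro
    let material = VideoMaterial(avPlayer: player)
    let entity = ModelEntity(
        mesh: .generatePlane(width: 1.6, height: 0.9), // 16:9 plane, in metres
        materials: [material]
    )
    player.play()
    return entity
}
```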
A taste of things to come
To enable spatial video capture on an iPhone, you’ll obviously need to be running iOS 17.2 beta 2. Once you are, open the Settings app and select Camera, then enable the Spatial Video for Apple Vision Pro toggle.
Now, the next time you open the Camera app, there will be a Spatial option for video recording. Just start filming and your iPhone will do all the necessary legwork to make the video 3D-enabled.
As spotted by 9to5Mac, Apple says video captured in this way will be recorded at 1080p and 30fps, and that a minute of spatial footage filmed this way will take up around 130MB of space. Better make sure you have plenty of free storage before you start.
When the Vision Pro eventually makes it onto shelves, you’ll also be able to capture videos using the headset itself. For now, though, you’re limited to a recent high-end iPhone – but it’s a taste of something greater.
Microsoft has given Windows 11’s desktop email app, Outlook, a major revamp with the addition of Apple iCloud functionality for people who use iPhones or other Apple devices, plus other features. This upgrade is available to all Windows 11 users and you can add your iCloud account to your Outlook app by doing the following:
1. Click the cog icon in your Outlook menu, which should open your Email accounts setting. This is where you can see all of the accounts that are connected to your Outlook and manage them.
2. Select Add account and sign into your Apple iCloud account. This should connect your iCloud account.
The Outlook app supported Apple’s email service before Windows 11’s launch, but according to Windows Latest, Microsoft is in the process of deploying a new Outlook app in place of the old one. Reception from users has apparently been lukewarm, but Microsoft is adding lots of new features with every new version.
One of the biggest complaints users have with the renewed Outlook app is that it launches in a web wrapper. The old app was a fully functional UWP app, with both online and offline support, whereas the new app only got offline support very recently. User complaints persist, and Microsoft is continuing to develop the app in the hope of improving both the experience and users’ opinion of it.
The latest in a string of new developments
This development follows shortly after Microsoft also added compatibility with Gmail, Google Calendar, and contacts to Outlook. iCloud support is also now available to all Windows 11 users, and Microsoft is reportedly working on extending offline support for more parts of the Outlook app, including events and Calendar.
One feature that users have to look forward to as part of Microsoft’s new Outlook is being able to RSVP to meetings. Windows Latest spotted this as an upcoming update in the Microsoft 365 roadmap, which details what Microsoft has in store for various Microsoft 365 apps. This will help users receive information about the nature of any specific meeting and better decide if they would like to attend. This development is expected to debut in March 2024.
Another feature that has been added will help users understand their meetings and schedules. Microsoft explained on its Tech Community blog that users will be able to track declined meetings better in the Outlook calendar. This will be useful for many users, especially those who have overlapping or densely-packed meetings, and want to better understand what they are and aren’t attending.
How to turn on visibility for declined meetings
The above is now available in the most up-to-date version of Outlook, but it’s disabled by default. You can enable it through the following steps:
1. Open the Outlook app.
2. Go to: Settings > Calendar > Events & Invitations > Save declined events
3. Tick the ‘Show declined events in your calendar’ box.
This should turn on the feature, and declined meetings should begin to appear in your calendar.
In order for a meeting to be classified as declined, you will have to have declined the meeting in all Outlook clients and Teams, with the exception of the original Windows Outlook client.
It’s going to take a little more to win over Windows users, it seems, but these are some solid steps. As far as we know, they’re available to all Windows 11 users with a valid copy of Outlook; if you don’t have these features yet, you may need to update your Outlook app. It remains to be confirmed whether this extends to free users who use Outlook online.
The Apple Vision Pro may not have been in attendance during the recent iPhone 15 launch event hosted at Apple Park, but it got a shout-out and a couple of upgrades that you might have missed.
Unfortunately, these improvements aren’t coming to the headset directly; instead, they’re perks exclusive to people who own multiple premium Apple products – specifically the new iPhone 15 Pro (or the iPhone 15 Pro Max) and the updated AirPods Pro 2 with USB-C port.
Starting with the iPhone 15, the Pro model’s cameras now have the ability to record Spatial Video. This immersive format allows you to use the Vision Pro to relive 3D recreations of memories you film and was first shown off in the Vision Pro reveal trailer at WWDC 2023.
While kinda cool – it feels like a step towards hologram recordings from sci-fi – the feature also felt rather dystopian when unveiled on the Vision Pro. Specifically, to record Spatial Video of a special moment on the Vision Pro itself, you’d need to separate yourself from that moment, covering your eyes with the VR headset to boot up the camera and start recording. It also wouldn’t let you relive any memories that happened while you didn’t have the $3,499 (around £2,800 / AU$5,300) headset on hand – and given the price of the Vision Pro, it doesn’t strike us as something you’d want to carry with you everywhere.
The iPhone 15 Pro solves both of these issues. While recording spatial video on your iPhone you can still be present in the moment and experience it for real as it happens – not just through a recording – and you’ll almost always have your phone on you to be able to capture memories as they happen.
The feature won’t be live when the new iPhones launch, but Apple noted that Spatial Video recording would be coming in the near future (we expect it will arrive before or just as the Vision Pro releases).
As for the AirPods Pro 2: for a start, the revamped buds have a new IP54 dust- and water-resistance rating (the previous iteration just had an IPX4 rating, suggesting its dust resistance wasn’t tested). More importantly, just for Vision Pro users, these earbuds will support a “groundbreaking wireless audio protocol” that unlocks 20-bit, 48kHz Lossless Audio for the Apple headset. This means you can enjoy your Apple Vision Pro experiences in private and with high-end audio (higher quality than you can get from your AirPods Pro connected to even the latest iPhones) by slipping in a pair of USB-C AirPods Pro 2.
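To put that lossless spec in perspective, a quick calculation (assuming uncompressed stereo PCM framing, which Apple hasn’t confirmed) suggests why a new wireless protocol is needed: 20-bit samples at 48kHz across two channels work out to roughly 1.9 Mbit/s, well above what standard Bluetooth audio codecs typically carry.

```swift
// Rough bandwidth estimate for 20-bit / 48kHz lossless stereo audio.
// Assumes uncompressed PCM framing, which Apple has not confirmed.
let bitsPerSample = 20.0
let sampleRate = 48_000.0
let channels = 2.0
let megabitsPerSecond = bitsPerSample * sampleRate * channels / 1_000_000
print(megabitsPerSecond)  // 1.92 – well above typical Bluetooth codec bitrates
```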
Weirdly, this upgrade seems to be exclusive to the new AirPods: older AirPods Pro 2, charged via a Lightning cable, don’t seem to offer this high-end audio quality with the Vision Pro. So if you only recently bought a pair of AirPods Pro 2, and the Vision Pro is something you’re remotely interested in, you might want to return them and pick up the USB-C model instead when it releases on September 22.
We know that the Apple Vision Pro isn't going to be available to buy until 2024, but we're learning a little bit more about the specs of the device through leaks from early testers – including how much on-board storage the augmented reality headset might pack.
According to iPhoneSoft (via 9to5Mac), the Vision Pro is going to offer users 1TB of integrated storage as a default option, with 2TB or 4TB a possibility for those who need it (and who have bigger budgets to spend).
Alternatively, it might be that 256GB is offered as the amount of storage on the starting price Vision Pro headset, and that 512GB and 1TB configurations are the ones made available for those who want to spend more.
This information is supposedly from someone who has been given an early look at the AR device, and noticed the storage space listed on one of the settings screens. It's more than the standard iPhone 15 model is expected to have – if it sticks with the iPhone 14 configurations, it will be available with up to 512GB of storage.
Plenty of unknowns
It does make sense for a device like this to offer lots of room for apps and files, and it might go some way to explaining the hefty starting price of $3,499 (about £2,750 / AU$5,485). Watch this space for more Vision Pro revelations as the launch date gets closer.
While the Apple Vision Pro is now official, there's still a lot we don't know about it – and it may be that we won't find out everything until we actually have the headset in our hands and are able to test it fully.
There have been rumors that two more Vision Pro headsets are in the pipeline, and that some features – such as making group calls using augmented reality avatars – will be held back until those later generations of the device go on sale.
We're also hearing that Apple might not be planning to make a huge number of these headsets, so availability could be a problem. Right now it does feel like a high-end, experimental device rather than something aimed at the mass market.
ChatGPT’s iPhone app is going to offer users access to the internet via Bing – but only if you’re willing to pay for a premium ChatGPT subscription.
OpenAI’s large language models (LLMs) that power its ChatGPT bot have taken the tech world by storm this year, with GPT-3 and now GPT-4 integration being introduced to a bevy of products including Spotify, Bing, and even Mercedes cars.
Some integrations go both ways too, with Bing being added to ChatGPT’s web version back in May. Now Bing has come to ChatGPT’s iPhone app for people who pay $20 a month for ChatGPT Plus (roughly £16 / AU$30).
This looks set to be a major upgrade for the iPhone app. One significant drawback of ChatGPT and the GPT-4 LLM is that its data is only accurate up to around September 2021 – so if you ask the LLM questions about events that happened in 2022 or 2023 it probably won’t know what you’re talking about, and it may hallucinate (read: make something up). Giving GPT-4 access to Bing enables the chatbot to find answers to questions that fall outside of its stored data.
Don’t expect this Bing integration to be an instant enhancement to ChatGPT’s iPhone version, mind. For one thing, the feature is only in beta, so it may have a few issues that OpenAI still needs to patch out. For another, while the internet is home to more recent data that could help boost the AI’s reliability, it’s also home to inaccurate info, so ChatGPT’s answers will likely still feature errors.
How to use ChatGPT with Bing on iPhone
To use ChatGPT’s new Bing powers on your iPhone you’ll need to sign up for ChatGPT Plus. You’ll then want to download the latest version of the iOS app (v1.2023.173).
Once your update has been completed, make sure you’re signed in to the app and then tap the menu button (the three dots) at the top-right corner of the screen. Then tap Settings, then New Features. In this sub-menu you should see an option to enable Browsing; select GPT-4 as your model, and make sure to select 'Browse with Bing'.
For now, there’s no Android app for ChatGPT, though one is apparently coming soon. Microsoft has also said that Bing integration will be available to non-paying ChatGPT users, but for now, it’s only for Plus subscribers. If you want to enjoy an AI-powered Bing experience for free (and on Android or iOS), you’ll need to download the Bing app and use its Bing Chat feature.
Apple is striking another blow in the name of privacy by adding Link Tracking Protection to select iOS 17 apps, which will make it easier for users to keep their data private while browsing the web on their iPhone.
Link tracking identifiers are a collection of numbers and letters typically appended to the end of a link as a way for websites to keep track of users as they move around the web. They’re most often used for ad tracking, as well as a substitute for third-party cookies. When Apple’s iOS 17 rolls out, Link Tracking Protection will strip these identifiers automatically, closing a loophole that lets advertisers bypass privacy features that typically target cookies. It'll be enabled in Mail, Messages, and Safari's Private Browsing mode.
It is worth noting that you can still remove link-tracking identifiers by yourself by selecting and deleting the offending characters. However, with the identifiers often being quite long, many users simply wouldn’t bother, and Apple is just streamlining the otherwise tedious process.
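For the curious, stripping these identifiers programmatically is straightforward. Apple hasn’t published the exact list of parameters Link Tracking Protection removes, so the parameter names below are common examples rather than Apple’s list – a minimal sketch of the same idea:

```swift
import Foundation

// Common tracking parameters – illustrative examples only; Apple hasn't
// published the exact list Link Tracking Protection strips.
let trackingParams: Set<String> = ["utm_source", "utm_medium", "utm_campaign",
                                   "gclid", "fbclid", "mc_eid"]

func strippingTrackers(from url: URL) -> URL {
    guard var components = URLComponents(url: url, resolvingAgainstBaseURL: false)
    else { return url }
    components.queryItems = components.queryItems?.filter {
        !trackingParams.contains($0.name.lowercased())
    }
    if components.queryItems?.isEmpty == true { components.queryItems = nil }
    return components.url ?? url
}

// Example: the gclid identifier is removed; the rest of the URL survives.
let tracked = URL(string: "https://example.com/article?id=42&gclid=abc123")!
print(strippingTrackers(from: tracked))  // https://example.com/article?id=42
```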
“Privacy is designed into every new Apple product and feature from the beginning,” Craig Federighi, Apple’s senior vice president of Software Engineering, said in a press release announcing the feature. “We are focused on keeping our users in the driver’s seat when it comes to their data by continuing to provide industry-leading privacy features and the best data security in the world.
“This approach is evident in a number of features on our platforms, like the major updates to Safari Private Browsing, as well as the expansion of Lockdown Mode.”
A more private iOS
Compared to rival operating system Android, iOS has always been the more privacy and security-focused option, and iOS 17 furthers that. Apple put a stop to AirDrop flashing with Communication Safety, preemptively halted explicit contact posters, and beefed up Safari’s Private Browsing with an additional level of authentication. These are all subtle yet meaningful improvements – something that could be said for the iOS 17 update as a whole.
Apple is currently testing iOS 17 for developers, with a public preview slated for July. A full release of iOS 17 is expected to start rolling out to the best iPhones around September, to coincide with the launch of the iPhone 15.
Microsoft’s Bing chatbot is now more readily accessible for iOS users thanks to a new widget, plus the AI has been bolstered to perform more responsively when using voice input on an iPhone.
Windows Central spotted that Microsoft has implemented a Bing Chat widget that can be added to the Home screen, allowing you to initiate a session with the chatbot with a simple tap. That’s a handy ability indeed for regular users of Bing AI on iOS devices.
For instructions on how to add a widget to the iPhone Home screen, check here.
In the Bing blog post announcing this new feature for iOS, Microsoft also tells us that it has made progress on another front for iPhone owners – namely better performance for the voice input button in the Bing mobile app (for iOS, and Android as well). When you tap the button, it should now instantly indicate that it’s listening.
Analysis: Catching up with Android
The widget is a very useful touch in terms of convenience for regular users on the iPhone, and it brings the Bing Chat experience up to parity with the Android version (which already had this feature).
Overall, Microsoft’s setting a pretty fast pace of development with its Bing AI, with considerable progress being made on a weekly basis across both the mobile and desktop incarnations of the chatbot.
Regarding the latter, we’ve just seen that Microsoft has brought voice input to desktop PCs (previously this had been a mobile-only feature). The idea is to make for a more natural chatting experience with the Bing chatbot, allowing you to speak to the AI, and have it reply via spoken words, too.