Rumored Apple and Meta collaboration might make the iPhone 16 a better AI phone

Apple may be augmenting its new Apple Intelligence features with artificial intelligence (AI) models built by Meta, according to a Wall Street Journal report. The two tech giants are reportedly discussing incorporating Meta's generative AI services into iOS 18 and the next generation of iPhone models. 

The WSJ report cites conversations Apple has begun with most of the big names in AI, including Google (Gemini), Anthropic (Claude), and OpenAI (ChatGPT). Plans for Apple Intelligence to include free ChatGPT access and GPT-4o integration were mentioned among the deluge of announcements at WWDC this year. If a Meta collaboration is underway, that arrangement is clearly non-exclusive. 

Apple's interest in Meta's Llama 2 and Llama 3 large language models makes sense on both ends of any deal. Meta would get to bring its AI to the massive global network of iPhone users, while Apple could cite Meta's AI features as another selling point for the iPhone. And while both Meta and Apple have deals with OpenAI and its main backer, Microsoft, an alliance between the two could help build a competitive alternative, even as OpenAI and ChatGPT remain what people first point to when they think of generative AI. 

Mutually beneficial

For Apple as a hardware platform, it's especially good to widen the available AI model choices. That way, Apple can pitch iPhones as an AI hub, switching among models depending on what people want the AI to do. Apple explicitly pointed toward that goal at WWDC this year when announcing the deal with OpenAI to provide ChatGPT on Apple products.

“We wanted to start with the best, and we think ChatGPT from OpenAI and their new 4o model represents the best choice for our users today,” Apple's senior vice president of Software Engineering Craig Federighi explained at the event. “We think ultimately, people are going to have a preference perhaps for certain models that they want to use – maybe one that's great for creative writing, or one they prefer for coding, and so we want to enable users ultimately to bring the model of their choice and we’re going to look forward to doing integrations with models, like Google Gemini for instance, in the future.”

Any speculation on how Apple Intelligence will change thanks to Meta is premature, but the fact it's happening at all might surprise some. Meta's advertising income took a beating after Apple changed its policies to give users more control over their data in 2021. Requiring user permission before tracking data across other apps and websites cost Meta billions of dollars and prompted Meta to release a method for advertisers to avoid Apple's service fee for boosting ad posts. The stakes of those business battles are apparently no match for Apple and Meta's anticipated AI earnings, and both now seem happy to let bygones be bygones. 


TechRadar – All the latest technology news


macOS Sequoia’s wildest update – iPhone Mirroring – might be more useful than you think

When Apple introduced macOS Sequoia and its new iPhone Mirroring capability, I didn't get it. Now, though, after seeing it in action and considering some non-obvious use cases, I may be ready to reconsider.

Apple unveiled the latest AI-infused version of macOS during its WWDC 2024 keynote, which also saw major updates to iOS, iPadOS, visionOS, tvOS, and watchOS. It also served as the launch platform for Apple Intelligence, an Apple-built and -branded version of artificial intelligence. I get that Apple's been building AI PCs for a while (ever since the M1 chip, its Macs have included an on-board neural engine), and there are many features to look forward to, including a better Siri, powerful photo editing, and smart writing help, but I found myself fixating elsewhere.

Apple was putting the iPhone on your Mac, or, rather, an iPhone screen floating in the middle of the lovely macOS Sequoia desktop. In a way, this is the most significant redesign of the new platform. It puts an entirely different OS – a mobile one, no less – on top of a laptop or desktop. 

Wow. And also, why?

I admit that I had a hard time conceiving what utility you could gain from having a second, live interface on an already busy desktop. Apple has said in the past that it builds features, in some cases, based on user requests. Who had ever asked for this?

After the keynote, I had the chance to take a deeper dive, which helped me better understand this seemingly unholy marriage and why, in some cases, it might make perfect sense.

Making it so

WWDC 2024

(Image credit: Future / Lance Ulanoff)

Apple built a new app to connect your iOS 18-running iPhone to your macOS Sequoia Mac. In a demo I saw, it took one click to make it happen. Behind the scenes, the two systems establish a secure Bluetooth and Wi-Fi connection. On the iPhone, there's a message that mirroring is live. On the Mac, well, there's the iPhone screen, complete with the Dynamic Island cutout (a strange choice if you ask me – why virtualize dead space?).

I was honestly shocked at the level of iPhone functionality Apple could bring to the Mac desktop.

You can use the Mac trackpad to swipe through iPhone apps.

You can click to launch apps and run them inside the iPhone screen on your Mac desktop.

Pinch and zoom on the Mac trackpad works as expected with the iPhone apps.

There's even full drag-and-drop capability between the two interfaces. So you could take a video from the GoPro app on your mirrored iPhone screen and drag and drop it into another app, like Final Cut Pro on the Mac.

Essentially, you are reaching through one big screen to get to another, smaller one – on a different platform – that is sitting locked beside your desktop. It's strange and cool, but is it necessary?

WWDC 2024

(Image credit: Future / Lance Ulanoff)

Not everything makes sense. You can search through your mirrored phone screen, but why not just search on your desktop?

You can use the mirrored iPhone screen in landscape mode and play games. However, there's no obvious way to warn someone that a game relying on the iPhone's gyroscope is a bad idea here.

I do like the attention to detail: while the mirrored iPhone screen can look exactly like the screen on the phone, you can also click to reveal a slightly larger frame with controls for the mirrored screen.

It's not the kind of mirroring that locks you in. To end the connection, you just pick up and unlock the phone.

Even seeing all this, though, I wondered how people might use iPhone Mirroring. There's the opportunity to play some games that aren't available on Mac. Multi-player word game fans might like that if they get a notification, they can open the mirrored phone screen, make a move, and then return to work.

When macOS Sequoia ships later this fall, you'll even be able to resize the mirrored iPhone window, which I guess could be useful for landscape games.

Notifications from your phone sound redundant, especially for those of us in the iCloud ecosystem, where all our Apple products get the same iMessages. But the system is smart enough to know it shouldn't repeat notifications on both screens, and you'll have the option to decide which iPhone notifications appear on your Mac. 

Some notifications only appear on your iPhone, and others appear in both places, but you can't always act on them on the Mac. This new feature might bridge that gap. A fellow journalist mentioned that iPhone Mirroring would finally give him a way to jump from a baby cam notification he saw on his Mac – where there is no cam app – to the live feed on the iPhone. This finally struck me as truly useful.

Is that enough of a reason to have your iPhone screen pasted on your Mac desktop? I don't know. It might take up too much real estate on my MacBook Air 13-inch, but it would be kind of cool on a 27-inch iMac, if I had one.


‘Apple Intelligence’ is reportedly coming to your iPhone in iOS 18 – here’s what to expect

We’ve learned from Tim Cook’s comments and countless reports that Apple is working on AI features for all of its devices and platforms, and we’re almost certain that the technology giant will unveil them during the opening keynote of WWDC 2024.

Now, though, we have an idea of how Apple will be branding the AI features – and no, it won’t be “artificial intelligence” or “Absolutely Incredible,” as Greg Joswiak, Apple’s SVP of Marketing, teased in a post on X (formerly Twitter).

According to Bloomberg’s Mark Gurman, it will be called “Apple Intelligence,” which certainly has a nice ring to it. It’ll apparently be the central place to opt in to the new features built into iOS 18, iPadOS 18, and macOS 15. As predicted, it will likely be all about integrating AI functionality into current apps and services – ones that people could use daily and that provide real value. 

As Gurman notes, “the company is less focused on whiz-bang technology — like image and video generation — and instead concentrating on features with broad appeal.”

WWDC 2024

(Image credit: Apple)

These will likely include summarization powers for navigating a crowded inbox or getting the gist of a webpage on the fly. Similar to Samsung’s Galaxy AI or Google’s Gemini feature set, they will extend to summarizing notes, automatically transcribing voice recordings, and even providing a simple digest of notifications.

In Messages, suggested replies should get an upgrade, and Siri will seemingly get the equivalent of a new brain, hopefully making it much more useful. The upgrade could integrate a large language model to let the virtual assistant control functions and features within apps and handle multi-step queries. Bloomberg’s latest reporting also notes that Apple will partner with OpenAI and that its tools will be used to power some features.

The report notes that “Apple Intelligence” features will be entirely opt-in and not turned on by default – additionally, they may be labeled as “a beta version.” This suggests that Apple plans to improve them over time and potentially add more features. 

It seems you’ll need a Mac or iPad with an M-series chip or newer. For the iPhone, it will reportedly be supported on forthcoming models introduced in 2024 as well as the iPhone 15 Pro and 15 Pro Max. The A17 Pro or later will likely be the requirement, but it will be interesting to see if it’s needed for all features or just specific elements. 

Apple iPad Pro 13-inch (2024)

(Image credit: Future)

Like other services, the processing will happen either on device or via cloud-based computing; the latter would be a change for Apple, which has always focused on users' privacy and security. To that point, the report notes that WWDC will focus on what “precautions” Apple is taking, such as “security features on the chips that it’s using in its data centers,” and that user profiles based on customer data will not be built.

Either way, it’s clear that privacy will be front and center, and Apple will use it to differentiate itself from competitors. That could also push more folks to actually opt in to Apple Intelligence, and that, paired with genuinely useful features, could help turn the tide here. After all, useful upgrades to the applications and tools we use daily can speed up workflows and make tasks easier.

We’ll have to wait and see what Apple unveils at WWDC 2024’s kickoff and how it positions AI, err, Apple Intelligence. You can see the five things we expect Apple to unveil, including a round-up of all of our news leading up to the kick-off at 10am PT / 1pm ET / 6pm BST on June 10 (3am AEST, June 11).

If you’ve been waiting for a Calculator app for iPad, it seems this is the year. VisionOS 2.0 will also bring more environments to Vision Pro, and we'll also apparently get new Mac, iPhone, and iPad wallpapers, a dedicated app for managing passwords, and the ability to create emojis on the fly.


Report: Apple may make it even easier to manage passwords on the iPhone and Mac

Yes, Apple’s long offered an easy way to save usernames and passwords as part of iCloud Keychain and even allowed syncing them across devices for easy sign-ins. However, Apple may soon devote an entire application across platforms to password management.

According to Bloomberg’s Mark Gurman, Apple will show off a new app called Passwords at WWDC 2024. The app will be a part of, and available on, “iOS 18, iPadOS 18, and macOS 15.” Like iCloud Keychain, the dedicated app can “generate passwords and keep track of them.” 

Reportedly, on iOS and iPadOS, password management will move out of Settings and into a dedicated app, while on the Mac it will leave Safari's settings for the same standalone treatment. As you might suspect, iCloud Keychain will still be the backbone of this experience.

A teaser image for Apple WWDC 2024. (Image credit: Apple)

The application will be a one-stop shop for password management across Apple devices and be divided into categories. It will support standard logins and passwords, saved Wi-Fi networks, and passkeys that utilize Touch ID or Face ID. Releasing a dedicated password management app will put Apple more directly in competition with third-party services like LastPass, Aura, and 1Password.

Like those, Apple’s Passwords solution will likely support the import of passwords, possibly encouraging folks to opt for the one built into the heart of the platforms.

The latest report also notes that Passwords will work on Vision Pro – we’ll note the $3,500 spatial computer already supports iCloud Keychain for auto-filling – and on Windows computers. It’s unclear whether the latter will be a formal application or a web experience. Either way, the app will offer the ability to autofill logins and passwords on these devices.

As someone who uses iCloud Keychain, I already appreciate that it works across platforms, and a dedicated application might make it easier to manage them. And if I am searching for a password, it'll be a bit easier to open an app and authenticate versus diving into settings. It will be neat to see if any new features are rolled out, potentially an ability to auto-update logins or passwords after a set amount of time. That's all speculative, though.

Apple will likely unveil the Passwords app alongside a trove of other news at its opening keynote for WWDC 2024. Much of the focus will likely be on integrating artificial intelligence (AI) and machine learning (ML) announcements for all of Apple’s platforms – iOS, iPadOS, macOS, watchOS, tvOS, and visionOS.

We’ll know for sure on June 10, 2024, and you can see the other rumored announcements ahead of the Worldwide Developers Conference here.


Windows 11 and Android users will finally get a feature iPhone users have had for ages – the easy copying of text from phone photos over to a PC

A Microsoft Phone Link update may be in the works to make exchanging text between your phone and your PC a lot easier. In short, you’ll be able to select text in photos synced from your Android phone. 

Phone Link is an app on your PC (also called Link to Windows on your phone) that allows you to sync your calls, messages, notifications, and images from your Android device onto your PC. It’s similar to how you’re able to sync much of your iPhone and its apps to your MacBook, so you can respond to messages and access photos you might need without having to pick up your phone. 

The feature will use optical character recognition (OCR) to spot text within images and highlight it, so you’ll be able to copy the text over to a word processor, email, or text box. This is great news for those of us who hate having to type out important details and would prefer a simpler route. For now, though, the feature is only available through Microsoft’s preview channel. 

Windows Central gave the new feature a go and showcased a simple layout within Phone Link that highlights all the available text in the image, with the option to copy the text to your clipboard in Windows. If you feel like this all sounds familiar, you may remember Microsoft actually started testing this feature out in the Snipping Tool, where your transferred photo would open in the app rather than with Phone Link. 

Welcome to the club 

Apple users like myself may be tempted to turn their noses up at an update like this, but it’s still a welcome change that I’m sure will benefit a lot of people. However, from what we can tell, the OCR isn’t 100% accurate, so you will have to double-check the pasted text before you send it off. 

If you’re just looking to paste written notes or basic information, the new feature will probably work just fine for you. However, if you want to paste over longer or more important blocks of text, using cross-device copy and paste may be better (assuming the text isn’t solely confined to an image file). 

So far, the feature is still locked behind the Windows Insider Preview Build, Microsoft’s hub for testing potential new features and changes. While we normally say that we have to take the Preview Build changes with a pinch of salt (not all features make a wide release) we’re fairly confident that this Phone Link update will come to fruition. 

If you want to try it out yourself, you’ll have to make sure you’re part of the Insider Preview Build channel (which is free to join), where you’ll be able to not only play around with the new Phone Link update but also see other features Microsoft has in the works. 


Apple’s next accessibility features let you control your iPhone and iPad with just your eyes

Ahead of Global Accessibility Awareness Day on May 16, 2024, Apple unveiled a number of new accessibility features for the iPhone, iPad, Mac, and Vision Pro. Eye Tracking leads a long list of new functionality that will let you control your iPhone and iPad by moving your eyes. 

Eye Tracking, Music Haptics, Vocal Shortcuts, and Vehicle Motion Cues will arrive on eligible Apple gadgets later this year. These new accessibility features will most likely be released with iOS 18, iPadOS 18, visionOS 2, and the next version of macOS. 

These new accessibility features have become a yearly drop for Apple. The curtain is normally lifted a few weeks before WWDC, aka the Worldwide Developers Conference, which kicks off on June 10, 2024. That should be the event where we see Apple show off its next generation of main operating systems and AI chops. 

Eye Tracking looks seriously impressive 

Eye Tracking demoed on an iPad.

(Image credit: Apple)

Eye Tracking looks seriously impressive and is a key way to make the iPhone and iPad even more accessible. As noted in the release and captured in a video, you can navigate iPadOS – as well as iOS – open apps, and even control elements, all with just your eyes. It uses the front-facing camera, artificial intelligence, and local machine learning throughout the experience. 

You can look around the interface and use “Dwell Control” to engage with a button or element. Gestures will also be handled through just eye movement. This means that you can first look at Safari, Phone, or another app, hold that view, and it will open. 

Most critically, all setup and usage data is kept local on the device, so you’ll be set with just your iPhone. You won’t need an accessory to use eye tracking. It’s designed for people with physical disabilities and builds upon other accessible ways to control an iPhone or iPad.

Vocal Shortcuts, Music Haptics, and Live Captions on Vision Pro

Apple's new Vocal Shortcuts for iPhone and iPad.

(Image credit: Apple)

Another new accessibility feature is Vocal Shortcuts, designed for iPad and iPhone users with ALS (amyotrophic lateral sclerosis), cerebral palsy, stroke, or “acquired or progressive conditions that affect speech.” This will let you set up a custom sound that Siri can learn and identify to launch a specific shortcut or run through a task. It lives alongside Listen for Atypical Speech, designed for the same users, which opens up speech recognition to a wider set of people. 

These two features build upon some introduced within iOS 17, so it’s great to see Apple continue to innovate. With Atypical Speech, specifically, Apple is using artificial intelligence to learn and recognize different types of speech. 

Music Haptics on the iPhone is designed to let users who are deaf or hard of hearing experience music. The built-in Taptic Engine, which powers the iPhone’s haptics, will play different vibrations, like taps and textures, that resemble a song's audio. At launch, it will work across “millions of songs” within Apple Music, and there will be an open API for developers to make music from other sources accessible.

Additionally, Apple previewed a few other features and updates. Vehicle Motion Cues will be available on iPhone and iPad, placing animated dots on the screen that change as vehicle motion is detected. It's designed to help reduce motion sickness without blocking whatever you're viewing on the screen.

A look at Live Captions in visionOS on Apple Vision Pro

(Image credit: Apple)

One major addition arriving for visionOS – aka the software that powers Apple Vision Pro – will be Live Captions across the entire system. This will allow captions for spoken dialogue in FaceTime conversations, and for audio from apps, to appear right in front of you. Apple’s release notes that it was designed for users who are deaf or hard of hearing, but like all accessibility features, it can be found in Settings.

Since this is Live Captions on an Apple Vision Pro, you can move the window containing the captions around and adjust its size like any other window. Vision accessibility within visionOS will also gain reduced transparency, smart inverting, and dim flashing light functionality.

Regarding when these will ship, Apple notes in the release that the “new accessibility features [are] coming later this year.” We’ll keep a close eye on this and imagine that they will ship with the next generation of OSes like iOS 18 and iPadOS 18, meaning folks with a developer account may be able to test these features in forthcoming beta releases.

Considering that a few of these features are powered by on-device machine learning and artificial intelligence, aiding with accessibility features is just one way that Apple believes AI has the potential to make an impact. We’ll likely hear the technology giant share more of its thoughts around AI and consumer-ready features at WWDC 2024.


Your iPhone may soon be able to transcribe recordings and even summarize notes

It’s no secret that Apple is working on generative AI. No one knows what it’ll all entail, but a new leak from AppleInsider offers some insight. The publication recently spoke to “people familiar with the matter,” claiming Apple is working on an “AI-powered summarization [tool] and greatly enhanced audio transcription” for multiple operating systems. 

The report states these should bring “significant improvements” to staple iOS apps like Notes and Voice Memos. The latter is slated to “be among the first to receive upgraded capabilities,” namely the aforementioned transcriptions. They’ll take up a big portion of the app’s interface, replacing the graphical representation of audio recordings. AppleInsider states it functions similarly to Live Voicemail on iPhone, with a speech bubble triggering the transcription and the text appearing right on the screen.

New summarizing tools

Alongside Voice Memos, the Notes app will apparently also get some substantial upgrades. It’ll gain the ability to record audio and provide transcriptions, just like Voice Memos. Unique to Notes, though, is the summarization tool, which will provide “a basic text summary” of all the important points in a given note.

Safari and Messages will also receive their own summarization features, although they’ll function differently. The browser will get a tool that creates short breakdowns for web pages, while in Messages, the AI provides a recap of all your texts. It’s unknown if the Safari update will be exclusive to iPhone or if the macOS version will get the same capability, but there’s a chance it could.

Apple, at the time of this writing, is reportedly testing these features for an upcoming release on iOS 18 later this year. According to the report, there are plans to update the corresponding apps with the launch of macOS 15 and iPadOS 18; both of which are expected to come out in 2024.

Extra power needed

It’s important to mention that there are conflicting reports on how these AI models will work. AppleInsider claims certain tools will run “entirely on-device” to protect user privacy. However, a Bloomberg report says some of the AI features in iOS 18 will instead be powered by cloud servers equipped with Apple's M2 Ultra chip, the same hardware found in 2023’s Mac Studio.

The reason for the cloud support is that “complicated jobs” like summarizing articles require extra computing power. iPhones, by themselves, may not have the ability to run everything internally.

Regardless of how the company implements its software, it could help Apple catch up to its AI rivals. Samsung’s Galaxy S24 has many of these AI features already. Plus, Microsoft’s OneNote app can summarize information thanks to Copilot. Of course, take all these details with a grain of salt. Apple could always change things at the last minute.

Be sure to check out TechRadar's list of the best iPhones for 2024 to see which ones “reign supreme”.


Classic PlayStation and Saturn games may be coming to your iPhone next

With the advent of increased third-party support on iOS, video game emulators have rushed to the App Store to fill the gap. The first bunch has been primarily for old Commodore 64 and Game Boy titles. However, this could soon change, as we may see an emulator capable of running Sony PlayStation and Sega Saturn games. The app in question is called Provenance EMU. In an email to news site iMore, project lead Joseph Mattiello said his team is working on launching the software on the App Store.

Provenance, if you’re not familiar, can run titles from a variety of consoles, including famous ones such as the Super Nintendo as well as more obscure machines. It’s unknown when the emulator will make its debut. Mattiello states his team also needs to make some quality-of-life fixes first, and he wants to “investigate” the new rules. The report doesn’t explain what he’s referring to, but Mattiello may be talking about the recent changes Apple made to the App Review Guidelines. Lines were added in early April stating “developers are responsible for all the software inside their apps” and that emulators need to “comply with all applicable laws.” 


Please note the use of emulators may be in violation of the game developer and publisher terms and conditions as well as applicable intellectual property laws. These will vary so please check these. Emulators should only ever be used with your own purchased game copy. TechRadar does not condone or encourage the illegal downloading of games or actions infringing copyright. 

This could put third-party developers under deep scrutiny from gaming publishers. Nintendo, for example, is not afraid to sic its lawyers on developers it claims are violating the law. Look at what happened with Yuzu.

Game emulation currently exists in a legal gray area. Despite this, emulators have been allowed to exist, but one wrong move could bring the hammer down. So, Mattiello wants to ensure his team won’t be stepping on any landmines at launch. If all goes well, we could see a new era of mobile gaming – one where the titles aren’t just sidescrollers with sprites, but games featuring fleshed-out 3D models and environments.

What to play

We don’t recommend downloading random game ROMs off the internet. Not only could they violate intellectual property laws, but they can also harbor malware; these digital libraries aren’t the most secure. 

So if and when Provenance is released on the App Store, what can people play? At the moment, it seems users will have to try out homebrew games – titles made by independent developers for these classic consoles and their emulators. 

iMore recommends PSX Place, a website where hobbyists come together to share their homebrewed PlayStation games, and points to other resources, including one that hosts a fan adaptation of Twin Peaks. For Game Boy-style titles, Homebrew Hub has tons of fan-made projects. Personally, we would love to see publishers like Sony and Nintendo release their games on iOS. That way, people could enjoy the classics without skirting the law.

For those looking to upgrade, check out TechRadar's guide for the best iPhone for 2024.


YouTube TV’s sports-friendly Multiview mode is rolling out to iPad and iPhone

YouTube TV’s Multiview feature is reportedly rolling out to iOS devices, giving iPhone owners a new, more immersive way to watch sports.

News of this update comes from multiple users on the YouTube TV subreddit claiming that they had just received the option on their smartphones. One person even shared a short video of their iPhone playing four different basketball games at once – well, one’s a commercial, but you can tell it’s basketball from the ESPN banner. 

We don’t know the full capabilities of Multiview on YouTube TV for iOS. According to 9to5Google, it can be activated from the app’s Home tab; however, it “only works with select games,” and it doesn't have all of the same features as the smart TV version. 

Multiview on iOS apparently can’t show sports scores alongside a broadcast, nor does it have the Last Channel Shortcut to hop between recently viewed channels. There is a gap in capability, but regardless of what it can’t do, Multiview on mobile is still very useful to have, especially now during March Madness.


It appears this isn’t a limited rollout, as a company representative told Reddit users the feature will appear in a patch that will be available on all iOS devices. You’ll need version 8.11 of the app installed to see the option. 

The feature is also coming to iPadOS, as another user claims to have the patch on their 12.9-inch iPad Pro. Admittedly, it’s difficult to watch four sports games on an iPhone, since the small screen shrinks each window considerably, but iPad owners should have a better viewing experience.

An Android version is apparently in the works; however, it won’t be out for a while. The same representative said that the update will arrive within “the coming months,” although it may arrive sooner than expected: one user claims to have received a notification after opening the YouTube app on their Android phone informing them of Multiview. But when they checked, it wasn’t actually there. 

We reached out to Google to confirm whether the iOS release will reach everyone or just a select few. We'll update this story if we learn anything new. 

Until then, check out TechRadar's list of the best iPhone for 2024 if you're looking to upgrade.


Leaked Apple roadmap hints at iPhone SE 4, foldable iPhone, and AR glasses launch dates

We've heard plenty of rumors about the iPhone SE 4, the foldable iPhone, and the Apple AR glasses, and now a leaked roadmap has given us a better idea of when we might actually see these devices get launched.

The document, apparently from finance company Samsung Securities, was leaked by well-known tipster @Tech_Reve (via Wccftech). It offers an overview of what's on the way from Apple for the next few years, up until 2027.

It's in 2027 that we'll apparently get the augmented reality glasses. We've not heard much about the specs in recent months, with the Apple Vision Pro taking most of the attention when it comes to AR and VR (or mixed reality, if you prefer). We're also, it seems, getting a cheaper Vision Pro sometime in 2026.

A foldable 20-inch iPad is slated to arrive in 2027, with the foldable 8-inch iPhone turning up a year before. That's somewhat in opposition to recent rumors that said the foldable iPad would turn up first – though considering a foldable iPhone would be about the size of an iPad mini anyway, there may be some confusion over which product is which.

Coming soon

There's also a mention of the long-rumored OLED MacBook in 2026, and then looking at next year, we've got the iPhone SE 4 mentioned. That matches up with a rumor from last month that pointed to an early 2025 launch for the mid-ranger – with a switch to a more modern design and an OLED display also being talked about.

As for the rest of this year, it looks very much as though we'll get an 11-inch iPad Pro and a 12.9-inch iPad Pro, both running OLED screens. Most tipsters have predicted a 2024 launch for these tablets, and they could show up any day now (though you might have to get your orders in quickly for the 11-inch version).

The usual caveats about leaks and rumors apply: these dates might not be completely accurate, and even if they are, Apple's plans can always change. That said, this roadmap does match up nicely with other bits of information that have leaked out.

If Apple does indeed launch new iPads in the near future, the next big announcements to expect will be about iOS 18, artificial intelligence, and Apple's other software. That will be at WWDC (the Worldwide Developers Conference) 2024, happening sometime in June.
