Yes, Apple Vision Pro is being returned to stores – but this could actually be a good thing

We’ve officially passed the two-week return window for the Apple Vision Pro, which allowed people who purchased the headset on launch day to hand it back. Social media buzz has suggested that the Vision Pro was being returned in droves. However, inside sources suggest this may not be the case – and offer an interesting insight into who is returning their headset, and why. 

In our Apple Vision Pro review, we touched on the positives and negatives of using the device and rounded up our top three reasons why users may end up returning the headset. As Apple’s first attempt at a mixed-reality headset, the product was always going to be polarizing. It lacks the familiarity that established Apple products like a new iPhone or MacBook can count on at this point.

Not to mention the fact that the Apple Vision Pro is expensive. Retailing at $3,499 / £2,788 / AU$6,349, it’s easy to imagine more than a few returns are down to buyer's remorse – I know I would slink back to the Apple Store at the first sign of discomfort or annoyance (or after a look at my bank account, frankly), especially if I couldn’t get my prescription lenses sorted out for the headset or simply found it uncomfortable to wear.

In fact, AppleInsider reached out to sources within Apple’s retail chain for more information on the headset returns, and noted that discomfort is probably one of the biggest drivers. “Most of our returns, by far, are within a day or two. They're the folks that get sick using it,” one source told AppleInsider’s Mike Wuerthele. “The pukers, the folks that get denied by prescription-filling, that kind of thing. They know real quick.”

Influencer investments – gotta get that content!

The other group that seems to be behind a lot of the returns is influencers and YouTubers. Again, the Vision Pro is a product many people want to get their hands on, so it makes sense that online tech ‘gurus’ would want to jump on the trend at launch.

Apple’s two-week return window is more than enough time to milk the headset for as much content as possible, then hand it back and get your money back too. If you’re a tech content creator, it’s easier to look at the Vision Pro as a short-term investment rather than a personal splurge.

“It's just the f***ing YouTubers so far,” one retail employee told Wuerthele. 

According to AppleInsider's sources, however, the return process isn’t as simple as just boxing the headset up and dropping it off. Each return is accompanied by a detailed, lengthy survey that will allow users to go in-depth on their reason for return and their experience with the product. This is great news in the long run because it could mean any future iterations of the Apple Vision Pro will be designed and built with this feedback in mind – and the Vision Pro is already arguably a public beta for what will presumably eventually become the ‘Apple Vision’.

Beyond AppleInsider's coverage, prolific Apple leaker and Bloomberg writer Mark Gurman has (unsurprisingly) weighed in on the discussion surrounding Vision Pro returns. He reported much the same: some people find it uncomfortable or say it induces sickness, while for others it's simply too much money.

Gurman spoke to a Los Angeles professional who bought and returned the headset, who said 'I loved it. It was bananas,' but then went on to explain that he simply hadn't found himself using it that often, and that the price was just too much: “If the price had been $1,500 to $2,000, I would have kept it just to watch movies, but at essentially four grand, I’ll wait for version two.”

If users are returning it because they’re not using it as much as they thought they would, because certain aspects make them feel nauseous, or because the headset is just really uncomfortable on their head, Apple can take this feedback on board and carry it forward. These are common criticisms of VR headsets in general, to be fair – perhaps some people just aren’t built for this type of product?


TikTok is now on Apple Vision Pro, ready to take over your view and eat up your gestures

TikTok has had a big impact on the world of music since it was launched back in 2016, and now it’s set to make its presence felt in the world of VR with a new native app for the Apple Vision Pro. Is there anything that TikTok can’t do?

In January, Ahmad Zahran, Product Leader at TikTok, revealed that a Vision Pro app was in the works, saying his team had “designed a new TikTok experience for the Apple Vision Pro”. The reimagined interface takes you out of TikTok in Safari – which used to be the only way to access the platform on the Vision Pro – and into a native app designed for the Vision Pro’s visionOS platform that takes full advantage of the headset’s visual layout.

Similar to the design of its iOS and Android apps, TikTok for visionOS has a vertical layout and includes the usual ‘Like’, ‘Comment’, ‘Share’, and ‘Favorite’ icons. What sets TikTok’s visionOS app apart from its iOS and Android versions is its expanded interface designed for the Vision Pro’s widescreen view.

TikTok user interface on Apple Vision Pro

(Image credit: TikTok)

When you tap the icons in the navigation bar, they appear as floating panes to the right of your ‘For You’ page without interrupting the main video display, giving you a better view of comment sections and creator profiles. Better yet, the app is also compatible with the Vision Pro’s Shared Space feature, allowing you to move TikTok to a different spot in your headset view so that you can open other apps alongside it.

If you really want to reap the benefits of using TikTok in the Vision Pro, you can immerse yourself even further by viewing content in the headset’s integrated virtual environments – so you could enjoy your favorite clips on the surface of the Moon if that’s your thing. 

If you thought TikTok was ubiquitous and immersive now, just wait – it’s already far too easy to get lost in the endless feed on your phone, never mind having it take over the majority of your central view in a headset.

There is one thing missing from the TikTok Vision Pro app: the ability to capture and create new videos. 

TikTok has also beaten Netflix and YouTube to the punch by arriving on the Vision Pro. Netflix has no plans to launch a Vision Pro app right now, and while Google hasn’t released an official YouTube app either, third-party developer Christian Selig recently released Juno – an app that lets you browse YouTube videos on Apple’s ‘latest and greatest device’.



The 3 reasons people are sending back their Apple Vision Pro headsets

Apple Vision Pro owners are announcing they’re returning their headsets because they’re disappointed by the experience offered by the $3,500 mixed-reality gadget.

We’ve highlighted the positives and negatives of using the device in our Apple Vision Pro review, but if you’re still on the fence then the reasons people are giving for returning could help you decide if the headset is the right fit for you.

It also might be worth starting to keep an eye on the Apple Store’s refurbished section. While it’ll likely be a while before the Vision Pro appears – and it’ll probably still be fairly pricey – you might be able to buy one of these returned Vision Pros for a discount in the future. 

As an aside, we’ve been impressed with Apple’s refurbished tech; the checks and replacements it makes mean you’re basically getting a new gadget at a lower price, so it’s worth checking its refurb store for the Vision Pro or any other piece of Apple tech you’re after before buying new – provided you aren’t after something super recent.

Anyway, let’s get into why the Vision Pro headset is being returned.

Two Apple Vision Pros on stands with people taking pictures

Why is the Vision Pro’s popularity waning? (Image credit: Future / Lance Ulanoff)

The end of a (trial) era

There are individual reasons people will be looking to return the Apple Vision Pro, and we’ll get to those, but the main reason you’ll be seeing social media post after social media post on the topic right now is because of Apple’s returns policy.

When you buy a new Apple product from its store you have 14 days to send it back and get a full refund. The Apple Vision Pro launched on February 2, so at the time of writing we’re right at that two-week mark.

If someone has decided the experience isn’t good enough to justify parting with $3,500 – or more if they bought a model with more storage – then it’s getting to the point where they either have to live with that subpar experience or send the device back.

Apple Vision Pro on a stand showing the Solo Knit band

(Image credit: Future / Lance Ulanoff)

Comfort is king 

As for the specific Apple Vision Pro faults, a lot of people’s problems come down to comfort.

When you’re spending this much on the Vision Pro, you’ll probably feel the need to use it a lot to justify your purchase. But as we heard from some of the early test events that media were invited to, the device can be uncomfortable to wear for long stretches – especially when using the Solo Knit Band, which offers zero over-the-head support.

On top of complaints that it’s too heavy, people have said it can cause motion sickness and eye strain. These issues exist with other VR headsets too – especially among people who are new to VR – but the Vision Pro may exacerbate them because, again, people are probably immersing themselves for very long stretches to feel like they’re getting the most out of the headset.

That’s not only in terms of bang for their buck but also for productivity and watching films – the two main Vision Pro uses. Blockbusters can stretch on for two hours or longer, and typical work shifts are eight hours. Even if you’re just sitting looking at virtual windows, that’s a very long time for new users to spend in VR without long breaks.

Apple Vision Pro apps floating in front of a snowy background

What’s the Vision Pro’s killer app? (Image credit: Future)

What does it do? 

The other frequently cited issue we’ve seen on social media is the underwhelming software ecosystem.

The Vision Pro does have a lot of apps at its disposal (over 1,000 at the time of writing) and some really neat features. But as many reviewers – such as The Verge – have pointed out, the majority of those programs are ported over from iPadOS.

There are some bespoke spatial apps and improvements have been made to make the iPad programs feel more interactive in mixed reality, but when people think of VR software they imagine epic immersive gaming like Asgard’s Wrath 2, fitness apps like Supernatural, or educational adventures like Out of Scale from Kurzgesagt.

The Vision Pro doesn’t have a good answer (or in some cases any answer at all) to these apps that you can find on rival platforms, and unfortunately for Apple, this is something that will take time to change. And if it seems like all you’re getting are iPad apps, why not save a lot of money and just buy an iPad – or even an iPad Pro?

Given that people have to decide right now whether to keep the device or send it back for a refund, it’s a lot safer to assume the software problems will persist until the next headset or two launches than to pray some killer exclusive apps are on the horizon and risk wasting $3,500.

Two people sit at a desk with a Mac Studio, a Studio Display, and a Vision Pro headset in front of them.

Don’t like the Vision Pro? You can send it back (Image credit: Apple)

More to the story? 

It’s worth taking the posts you see with a pinch or two of salt – and remembering that most people who bought a Vision Pro are probably keeping it.

Apple tech has a lot of devout fans and haters who will engage with every single post they see about people returning the Vision Pro, either because it affirms their negative view or because they feel the need to defend the $2.8 trillion company. No matter how someone chooses to respond to a post, their interaction will boost engagement and amplify the voice of what is very likely a minority of Vision Pro users sending the headset back.

We also wouldn’t be surprised if a chunk of people returning the headset always planned to send it back for a refund, and are just giving whatever excuse they can that isn’t “because I can’t actually afford it.”

Apple’s Vision Pro has, as many expected, created a buzz online with post after post going viral – be they someone giving their hands-on impressions, or finding a weird way to use it like that person who walked their robot dog down the street while sporting the Apple headset. There’s also just a certain level of perceived internet clout that comes from being able to show off that you own and have used a $ 3,500 device.

Once you’ve soaked up that early hype and boosted traffic to your socials, do you want to be left with a $3,500 hole in your wallet? Or would you rather get the boosted attention and not have to spend a dime?

That’s not to say there aren’t some genuine issues with the Vision Pro, but don’t let all these reports necessarily put you off if you’ve tried it yourself, love it, and want to own one. As these posts have made clear, you have just under 14 days to use it at home before you’re locked out of a full refund should you decide the Vision Pro isn’t for you after all.


Apple could be working on a new AI tool that animates your images based on text prompts

Apple may be working on a new artificial intelligence tool that will let you create basic animations from your photos using a simple text prompt. If the tool comes to fruition, you’ll be able to turn any static image into a brief animation just by typing in what you want it to look like. 

According to 9to5Mac, Apple researchers have published a paper that details procedures for manipulating image graphics using text commands. The tool, dubbed Keyframer, would use natural-language text to tell the proposed AI system how to manipulate and animate a given image.

Say you have a photo of the view from your window, with trees in the background and even cars driving past. From what the paper suggests, you’ll be able to type commands such as ‘make the leaves move as if windy’ into the Keyframer tool, which will then animate the specified part of your photo.

You may recognize the name ‘keyframe’ if you’re an Apple user, as it’s already part of Apple’s Live Photos feature – which lets you scrub through a Live Photo clip and select which frame (the keyframe) you want to be the actual still image for the photo.

Better late than never? 

Apple has been notably slow to jump onto the AI bandwagon, but that’s not exactly surprising. The company is known to play the long game and let others iron out the kinks before it makes its move, as we’ve seen with its recent foray into mixed reality with the Apple Vision Pro (this is also why I have hope for a foldable iPhone coming soon).

I’m quite excited for the Keyframer tool if it does come to fruition, because it’ll put basic animation tools into the hands of iPhone users who might not know where to even start with animation, let alone how to make their photos move.

Overall, the direction Apple seems to be taking with its AI tools is a positive one. The Keyframer tool comes right off the back of Apple’s AI-powered image editing tool, which again reinforces a focus on improving the user experience rather than just putting out things that mirror the competition from companies like OpenAI, Microsoft, and Google.

I’m personally glad to see that Apple’s dive into the world of artificial intelligence isn’t just another AI chatbot like ChatGPT or Google Gemini, but rather a set of tools that offer unique new features for iOS and macOS products. While this project is still in its very early stages, I’m pretty hyped about the idea of making funny little clips of my cat being silly, or creating moving memories of my friends, with just a few word prompts.

As for when we’ll get our hands on Keyframer, unfortunately there’s no release date in sight just yet – but based on previous feature launches, Apple willingly revealing details at this stage indicates that it’s probably not too far off, and more importantly isn’t likely to get tossed aside. After all, Apple isn’t Google.


Opera: new DMA rules a chance “to put pressure” on Apple to open up for all

It looks like Apple's closed ecosystem is slowly opening up – in the EU, at least. On January 25, 2024, the Big Tech giant revealed changes to its App Store and business model to comply with new requirements under the Digital Markets Act (DMA), which come into force in March.

Apple's announcement was met with controversy, though. Many commentators, including Meta founder Mark Zuckerberg and music app behemoth Spotify, deemed it a farce. According to VPN provider Proton VPN, “Apple is trying to profit off the DMA.” Echoing such concerns, browser maker Mozilla sees this as “another example of Apple creating barriers to prevent true browser competition on iOS.”

Developers at Opera are more optimistic about Apple's new iOS browser rules, and decided to celebrate by launching an AI-powered alternative to Safari. I talked with Jona Bolin, Product Manager for Opera's iOS browser, to understand what all of this means for users in and out of Europe.

An opportunity to get more control

“I think it's great that they are changing the regulations,” Bolin told me. “For us, it's an opportunity to have high control.” 

He went on to explain that while distribution is a major factor for other developers, the fact that Opera browser is a free service means that it won't be affected as much by new fees and payment requirements.

“Even though, we would have to develop two different apps,” Bolin told me, adding that the real challenge will instead be encouraging users to migrate from one app to the other.

That's because while Apple is opening up to third-party browser engines for the first time – until now, only Safari's WebKit engine was allowed on iOS – it has only done so for apps distributed in the EU. This ultimately means twice as much work for browser developers.

Despite this burden, Bolin expects Apple's changes to make it easier for the team to implement the same level of features across Opera's range of apps. “Out of the box, we would get high security and a better process from where we can build on top of,” he added. 


The Norwegian browser maker has already announced plans to bring its AI-centric browser, Opera One, to iOS to give users a better AI-powered alternative to Safari. This is expected to be released in the next few months.

Outside the EU, both the UK and the US are weighing legislation that echoes the DMA's effort to ensure fair competition within the tech market and protect people's digital rights.

Bolin hopes that new DMA requirements in the EU could then be only the first step “to put pressure” on the big tech giant to open up its ecosystem for all.

He said: “I think more countries need to move forward and then maybe Apple will also change. We also believe that [the DMA] can be a good test run, so maybe Apple would realize that it's also working on their side. We hope that in the future they will bring it to other markets—we believe that it will happen, eventually.” 


Apple working on a new AI-powered editing tool and you can try out the demo now

Apple says it plans to introduce generative AI features to iPhones later this year. It’s unknown what they are; however, a recently published research paper indicates one of them may be a new type of editing software that can alter images via text prompts.

It’s called MGIE, or MLLM-Guided (multimodal large language model) Image Editing. The tech is the result of a collaboration between Apple and researchers from the University of California, Santa Barbara. The paper states MGIE is capable of “Photoshop-style [modifications]” ranging from simple tweaks like cropping to more complex edits such as removing objects from a picture. This is made possible by the MLLM, a type of AI capable of processing both “text and images” at the same time.

In its report, VentureBeat explains that MLLMs show “remarkable capabilities in cross-modal understanding”, although they have not been widely implemented in image editing software despite their supposed efficacy.

Public demonstration

The way MGIE works is pretty straightforward. You upload an image to the AI engine and give it clear, concise instructions on the changes you want it to make. VentureBeat says people will need to “provide explicit guidance”. As an example, you can upload a picture of a bright, sunny day and tell MGIE to “make the sky more blue.” It’ll proceed to saturate the color of the sky a bit, but it may not be as vivid as you would like. You’ll have to guide it further to get the results you want. 

MGIE is currently available on GitHub as an open-source project. The researchers are offering “code, data, [pre-trained models]”, as well as a notebook teaching people how to use the AI for editing tasks. There’s also a web demo available to the public on the collaborative tech platform Hugging Face. With access to this demo, we decided to take Apple’s AI out for a spin.
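
If you’d rather drive the demo from a script than click around the web page, Gradio-hosted Spaces like this one can usually be called with the gradio_client Python package. The snippet below is only a rough sketch of that general pattern, not the researchers’ own tooling: the Space name, endpoint name, and argument order are placeholder assumptions, so you’d need to swap in the real values listed on the demo page’s ‘Use via API’ link.

```python
# Minimal sketch: calling a Gradio-hosted image-editing demo from Python.
# The Space ID, endpoint name, and argument order below are assumptions;
# check the demo's "Use via API" panel for the real values before running.
from gradio_client import Client, handle_file

SPACE_ID = "some-owner/mgie-demo"  # hypothetical placeholder, not the real Space name

client = Client(SPACE_ID)

# Most image-editing Spaces take an input image plus a text instruction and
# return a path to the edited image; treat this exact signature as an assumption.
result = client.predict(
    handle_file("cat.jpg"),  # the photo to edit (older clients accept a plain file path)
    "add a purple background with lightning strikes",  # the text instruction
    api_name="/predict",  # Gradio's default endpoint name; it may differ per Space
)

print("Edited image written to:", result)
```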

Cat picture with a new background on MGIE (Image credit: Cédric VT/Unsplash/Apple)

Cat picture with a lightning background on MGIE (Image credit: Cédric VT/Unsplash/Apple)

Cat picture on MGIE (Image credit: Cédric VT/Unsplash/Apple)

In our test, we uploaded a picture of a cat that we got from Unsplash and then instructed MGIE to make several changes. In our experience, it did okay. In one instance, we told it to change the background from blue to red; MGIE instead made the background a darker shade of blue with static-like texturing. In another, we prompted the engine to add a purple background with lightning strikes, and it created something much more dynamic.

Inclusion in future iPhones

At the time of this writing, you may experience long queue times while attempting to generate content. If the demo doesn’t work, the Hugging Face page has a link to the same AI hosted over on Gradio, which is the one we used. There doesn't appear to be any difference between the two.

Now the question is: will this technology come to a future iPhone or iOS 18? Maybe. As alluded to at the beginning, company CEO Tim Cook told investors AI tools are coming to its devices later in the year but didn’t give any specifics. Personally, we can see MGIE morphing into the iPhone’s version of Google’s Magic Editor, a feature that can completely alter the contents of a picture. If you read the research paper on arXiv, that certainly seems to be the path Apple is taking with its AI.

MGIE is still a work in progress, and its outputs are not perfect. One of the sample images shows the kitten turning into a monstrosity. But we do expect the bugs to be worked out down the line. If you prefer a more hands-on approach, check out TechRadar's guide to the best photo editors for 2024.


The Apple Vision Pro is compatible with Intel Apple Macs – even if the performance may not be the same

The Apple Vision Pro has finally launched, and if you were thinking you might have to upgrade your Mac or MacBook to use the new headset (piling another expensive purchase onto an already very pricey device), there’s some good news: the Vision Pro appears to be compatible with Intel-based Macs, potentially opening the door for users with older models.

A support page on the official Apple website, explaining how to use the headset with a Mac as a display, reveals that support for this feature is not limited to Apple Silicon Macs (such as recent MacBooks with M1, M2, or M3 chips). The post explains that if you happen to be using a Mac with an Intel processor, you can still use the Vision Pro as a workspace; however, you’ll be working with the resolution capped at 3K rather than the 4K you’d get with an Apple Silicon-powered Mac.

You’ll still be able to resize the Virtual Display window and use the computer's keyboard and trackpad. That being said, if you’re looking to take advantage of the Virtual Display feature, your Mac will need to be running macOS 14 Sonoma or newer, so if you’re planning on giving it a go you may have to upgrade your operating system first. Very old Macs and MacBooks may not be compatible with macOS Sonoma, which means you won’t be able to use the Vision Pro as an additional screen with those machines.

Cool, but not very useful.

While I am glad to see support for older Macs, I’m not sure I see the point. Of course, Intel-based Macs are still good computers despite their age, but with the cost of the Apple Vision Pro, you could buy yourself an M3 iMac and have plenty of cash to spare. 

I’m sure plenty of people have an older iMac collecting dust at home and would like to give it a go, but again, the Apple Vision Pro isn’t exactly a product you buy on a whim. I wouldn’t really encourage anyone to buy the headset if they work exclusively on an Intel Mac, since they won’t get the full 4K experience. You’d be better off upgrading your device to a new MacBook, Mac mini, or iMac and buying a Vision Pro later… if at all.

There’s also no guarantee that this support on the Intel Macs will last forever – now that the M3 iMac has launched I wouldn’t be surprised if we started to see support for newer accessories or features being limited. So, if you are in the position to try out the Vision Pro with your older Mac, I suggest you get on it soon and decide if you like the pairing enough to justify upgrading to an Apple Silicon Mac – because you might have to in the future. 

Via 9to5Mac


Don’t forget your Vision Pro passcode – if you do you’ll have to send your headset back to Apple

There are a few big features that the Apple Vision Pro is missing – such as support for Bluetooth mice and location tracking via the Apple Find My network – but perhaps the strangest omission is the ability to reset the device yourself if you forget your passcode.

During the Vision Pro set-up process you’ll be asked to enter a six-digit passcode, just as you would when setting up an iPhone or iPad. You can also optionally set up an Optic ID login method, but just as with Face ID on your other Apple gadgets there will be times when you’ll be forced to enter your passcode – for example after your headset has restarted.

If you ever forget your iPad or iPhone passcode you can unlock your Apple device by connecting it to your Mac or PC and wiping the data on it, and on the Apple Watch you can use the Digital Crown or your connected iPhone to do the same thing. Yes, you’ll delete all your data, but a blank gadget is better than a gadget you’re forever locked out of.

However, while the Apple Vision Pro also has a setting that allows you to erase all your content – including the passcode – it’s only accessible via the Settings app. If you're locked out of your headset because you’ve forgotten your passcode there’s currently no at-home way to get into your Vision Pro. 

Instead, as reported by Bloomberg ($/£), you’ll need to either take your headset back to your local Apple Store, or ship it back to Apple to have it reset if there isn’t a physical store near you.

Apple Vision Pro battery pack

Locked out? Send it back to Apple, or say hello to your new paperweight (Image credit: Apple)

Is there a workaround? 

Unfortunately, the only workaround to this problem available to most people is to not forget your passcode in the first place.

We’ve seen reports that users with the Developer Strap – a dongle that adds a USB-C port to the Vision Pro so that it can be connected to a Mac computer – could erase the Vision Pro’s content and passcode using a Mac. However, the Developer Strap costs $300 and is only available to officially registered developers, so most people won’t have access to it – and we’ve not been able to confirm that this method works, so there’s a chance the dongle wouldn’t even help you if you had one.

We expect that Apple will launch some kind of alternative way to erase your Vision Pro passcode in due course, especially once the gadget is made available outside the US and sending your headset back becomes even more inconvenient. But for now you might want to make a note of your passcode, taking the usual precautions to keep that note secure.


YouTube has arrived on the Apple Vision Pro, though it’s not thanks to Google

There's been a lot of chatter this week about just how many apps are available inside the Apple Vision Pro, and it seems third-party developers are taking up the challenge of filling in any notable gaps in the app selection.

As per MacRumors, developer Christian Selig has released a dedicated YouTube app for the Vision Pro, called Juno for YouTube. Notably, it's the only YouTube client on the headset, as Google hasn't released an official app.

Costing $4.99, the app comes with a number of useful features, including options to resize and reposition the playback window, as well as to dim the area surrounding the video for that virtual movie theater feeling inside mixed reality.

As we already know, Google has specifically said it doesn't currently have plans to develop a YouTube app for the Vision Pro. For the time being, the only official way to get YouTube on the Apple headset is to load it up in Safari.

There might be an app for that

Juno for YouTube app

It’s a better experience than the YouTube website (Image credit: Juno for YouTube)

Initial worries over app availability on the Vision Pro were somewhat assuaged as the device went on sale, with news that more than 600 apps are on the way soon (though the current selection is much smaller).

We've already seen Adobe make the leap into mixed reality with its Firefly AI app, which you can use to create AI-generated images from any text prompt – with the end results floating in front of your eyes.

However, there are notable holdouts, including Netflix and Spotify, as well as Google. While YouTube does allow developers some access to its inner workings, that's not the case with Netflix or Spotify, so don't expect third-party clients for them.

Clearly the limited number of people who actually have an Apple Vision Pro is making software developers think twice about whether or not to support the hardware – but based on our time with the headset, it's likely to get more popular very quickly.


Apple says AI features are coming to your iPhone ‘later this year’ – here’s what to expect

For the past year or two, the world has watched as a string of incredible artificial intelligence (AI) tools have appeared, and everyone has been wondering one thing: when will Apple join the party? Now, we finally have an answer.

On a recent earnings call (via The Verge), Apple CEO Tim Cook revealed that AI tools are coming to the company’s devices as soon as “later this year.” Cook then added: “I think there’s a huge opportunity for Apple with generative AI.” While the Apple chief didn’t reveal any specifics, the small amount he did discuss has already been enough to get tongues wagging and speculation running riot.

It’s no surprise that Apple is working on generative AI tools – Cook admitted as much back in August 2023, when he explained that Apple has been developing its own generative AI “for years.” But the latest admission is the first time we’ve seen anyone put a launch date on things, even if it is a very rough date.

Given that this is a software update (and a big one at that), it seems likely that Apple has its Worldwide Developers Conference (WWDC) in mind. The company will use this June event to unveil its upcoming operating systems and software upgrades (like iOS 18). And with its audience mostly made up of developers, it makes sense for Apple to tease something like generative AI that could give devs a new tool in their iOS arsenal.

As well as that, industry analyst Jeff Pu has previously claimed that iOS 18 will be one of Apple’s biggest software updates ever precisely because of its inclusion of generative AI, so Cook’s statements seem to confirm Pu’s claim. That means there could be a lot to look forward to at WWDC – and some big new features coming to your iPhone.

What's en route?

The most likely upgrade that Cook is referring to is a rebooted version of Apple's Siri voice assistant. Bloomberg's reliable Apple commentator Mark Gurman recently predicted that iOS 18 will be “one of the biggest iOS updates – if not the biggest – in the company's history” and that this will be largely tied to a “big upgrade to Siri”.

According to another respected leaker, Revegnus, Apple is building a proprietary LLM (large language model) to “completely revamp Siri into the ultimate virtual assistant”. It's about time – while Siri was impressive when it landed over a decade ago, it has since plateaued. So we can expect a much more conversational, and more powerful, voice assistant by the end of 2024.

Close-up of the Siri interface

(Image credit: Shutterstock / Tada Images)

But what else might benefit from the generative AI that Apple's been working on? Messages, Apple Music and Pages are all expected to receive significant AI-based improvements later this year, with some of Apple's rivals recently giving us hints of what to expect. Google Messages will soon get added Bard powers for texting help, while Spotify has already shown that the future of streaming is AI-powered DJs.  

Lastly, there's photography and video, but it seems likely that Apple will tread more carefully than Samsung and Google here. The Galaxy S24 cameras are all about AI skills, which are something of a mixed bag. While Instant Slow-Mo (which generates extra frames of video to turn standard 4K/60p video into slow motion clips) is very clever and useful, Generative Edit opens the floodgates to digital fakery (even with its watermarks).

It'll be fascinating to see how Apple treads this line across all aspects of the iPhone. But one other key iPhone feature, privacy, could also put the brakes on Apple getting too carried away with generative AI… 

Why Apple is taking its time

Siri

(Image credit: Unsplash [Omid Armin])

Apple has been consistently criticized for not launching its own generative AI, especially as arch-rival Microsoft has been so decisive in spreading its Copilot AI to almost every aspect of Windows and its own apps.

But there’s a likely reason for Apple’s sluggishness, and it comes down to user privacy. Apple takes a strong stance on this, and often touts its devices’ privacy-protecting capabilities as one of their main benefits. AI tools are known to sweep up user data and have a reputation for privacy compromises, so it’s no surprise that Apple has taken its time here, presumably to ensure its AI is as pro-privacy as possible.

As well as that, Apple doesn’t usually rush into a new market before it is ready, instead preferring to wait a little longer before blowing its rivals away with something it thinks is far superior. We saw that with the original iPhone, for example, and also with the Apple Vision Pro, and it seems that generative AI is just the latest thing to get this treatment from Apple.

Whether Apple’s own AI actually is better than the likes of ChatGPT and Copilot remains to be seen, but it looks like we’ll find out sooner rather than later.
