You can now talk to ChatGPT like Siri for free, but it won’t reveal OpenAI’s secrets

ChatGPT has conveniently distracted us from OpenAI's boardroom drama, which has just seen Sam Altman return to the company as CEO, by making its Voice chat feature available to all free users.

The AI chatbot got its impressively conversational voice powers in September, but the feature was limited to the paid Plus and Enterprise tiers. Now OpenAI, which could use a shiny object to take eyes away from its recent meltdown, has made 'ChatGPT with voice' available to all users.

To use it, you just need the latest version of ChatGPT's iOS or Android app. Tap on the headphones icon at the bottom of the screen and you can start quizzing the chatbot about anything you like – as long as your question isn't about recent events, like OpenAI's CEO merry-go-round.

That's because the GPT-3.5 model that's available to free users has only been trained on data going up to January 2022. So when you ask it, for example, 'Why was Sam Altman fired from OpenAI?', it answers that there are “no public reports or indications” of this happening. How convenient.

Still, if you're looking for a voice assistant that's a bit chattier and more knowledgeable than the likes of Apple's Siri, then the ChatGPT voice function is a fun new tool (assuming the service hasn't gone down, like it did at around 2pm PT / 10pm GMT yesterday).

You can choose from five different voices, and your chats (but not the audio clips) are saved just like your text-based conversations. It also auto-detects languages, though you can set a preferred language in the Settings menu.
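
For the curious, the same building blocks OpenAI sells to developers – Whisper transcription, a chat model, and text-to-speech – can be chained into a rough approximation of that voice round trip. The sketch below is purely illustrative (it's not how the ChatGPT app itself is built); it assumes you have an OpenAI API key in an OPENAI_API_KEY environment variable, and the audio file names are placeholders.

```python
# Illustrative sketch only: a speech -> chat -> speech round trip built on
# OpenAI's public API. This is not how the ChatGPT app itself is implemented.
import os
import requests

API_KEY = os.environ["OPENAI_API_KEY"]  # assumes you have an API key set
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

# 1) Transcribe a spoken question (question.m4a is a placeholder file name).
with open("question.m4a", "rb") as audio:
    transcript = requests.post(
        "https://api.openai.com/v1/audio/transcriptions",
        headers=HEADERS,
        files={"file": audio},
        data={"model": "whisper-1"},
    ).json()["text"]

# 2) Ask the GPT-3.5 model that free users get.
reply = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers=HEADERS,
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": transcript}],
    },
).json()["choices"][0]["message"]["content"]

# 3) Turn the answer back into audio with one of OpenAI's preset voices.
speech = requests.post(
    "https://api.openai.com/v1/audio/speech",
    headers=HEADERS,
    json={"model": "tts-1", "voice": "nova", "input": reply},
)
with open("answer.mp3", "wb") as out:
    out.write(speech.content)
```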

A Siri replacement?

A phone on a pink background showing ChatGPT's voice feature (Image credit: Future)

Given it's now possible to use ChatGPT with Siri, the arrival of voice powers on the chatbot's free version is a potentially big deal. That's particularly the case for owners of the iPhone 15 Pro, who can map ChatGPT to the new Action button (by going to Settings > Action Button > Shortcut).

Siri and ChatGPT still have notable differences though. For example, Siri is deeply integrated with the iPhone, allowing it to perform actions like setting timers and controlling your phone's volume.

But ChatGPT's depth of knowledge and more conversational style is arguably better when it comes to general knowledge questions – as long as you're aware of its propensity to hallucinate.

It's certainly a fun, free feature to play with and will no doubt take some of the attention off OpenAI's Succession-like boardroom tussles, which could ultimately have a big impact on how the AI chatbot tussle plays out in 2024.


Apple is secretly spending big on its ChatGPT rival to reinvent Siri and AppleCare

Apple is apparently going hard on developing AI, according to a new report that says it’s investing millions of dollars every day in multiple AI projects to rival the likes of ChatGPT.

According to those in the know (via The Verge, citing a paywalled report at The Information), Apple has teams working on conversational AI (read: chatbots), image-generating AIs, and 'multimodal AI', which would be a hybrid of the others – able to produce video, images, and text in response to queries.

These AI models would have a variety of uses, including supporting AppleCare users as well as boosting Siri's capabilities.

Currently, the most sophisticated large language model (LLM) Apple has produced is known as Ajax GPT. It reportedly has more than 200 billion parameters and is claimed to be more powerful than OpenAI's GPT-3.5 – the model ChatGPT used when it first became available to the general public in 2022, though OpenAI has since upgraded the service to GPT-4.

As with all rumors, we should take these reports with a pinch of salt. For now, Apple is remaining tight-lipped about its AI plans, and much like we saw with its Vision Pro VR headset plans, it won’t reveal anything official until it’s ready – if it even has anything to reveal.

The idea of Apple developing its own alternative to ChatGPT isn’t exactly far-fetched though – everyone and their dog in the tech space is working on AI at the moment, with Google, Microsoft, X (formerly Twitter), and Meta just a few of those with public AI aspirations.

Siri can reportedly expect a few upgrades, but when? (Image credit: Shutterstock / Tada Images)

Don't expect to see Apple AI soon

We should bear in mind that polish is everything for Apple; it doesn't release new products until it feels it's got everything right, and chatbots are notoriously the antithesis of this philosophy. So much so that AI developers have a term – 'to hallucinate' – for when AI chatbots are incorrect, incoherent, or make information up, because they do it embarrassingly frequently. Even ChatGPT and the best ChatGPT alternatives are prone to hallucinating multiple times in a session, even when you aren't purposefully trying to befuddle them.

We wouldn't be too surprised if some Apple bots started to trickle out soon, though – even as early as next month. Something like its AppleCare AI assistant would presumably have the fairly simple task of matching user complaints to a set of common troubleshooting solutions, patching you through to a human or sending you to a real-world Apple Store if it gets stumped. But something like its Ajax GPT? We'll be lucky to see it in 2024 – and even then, probably not without training wheels.

If given as much freedom as ChatGPT, Ajax could embarrass Apple and erode the brand's reputation for delivering finely tuned, glitch-free products out of the box. The only way we'll see Ajax soon is if AI takes a serious leap forward in terms of reliability – which is unlikely to happen quickly – or if Apple puts a boatload of limitations on its AI to ensure it avoids making errors or wading into controversial topics. Such a chatbot would likely still be fine, but depending on how restricted it is, Ajax may struggle to be taken seriously as a ChatGPT rival.

Given that Apple's September event falls on September 12 – where we're expecting it to reveal the iPhone 15 handsets, among other products – there's a slim chance we could hear something about its AI then. But we wouldn't recommend holding your breath for anything more than a few Siri updates.

Instead, we'd recommend keeping your eye on WWDC (the company's annual developer conference) over the next few years to find out what AI chatbot plans Apple has up its sleeve. Just don't be disappointed if we're waiting until 2025 or beyond for an official update.


Apple rumored to be announcing major Siri updates at WWDC 2023

Apple's Worldwide Developers Conference for this year – WWDC 2023 – gets underway tomorrow, June 5. We've already heard plenty of rumors about what to expect, and it would seem that Apple's digital assistant Siri is in line for some major updates too.

According to well-known Apple tipster Mark Gurman (via MacRumors), there's a possibility that Apple will announce that the “hey Siri” phrase used to trigger Siri on iPhones and other devices is being shortened to simply “Siri”.

While this might not sound major from a user perspective, it has apparently required a significant amount of engineering work: accurately recognizing a single word rather than two words is a lot trickier, and Apple's AI engines have been updated to cope.


Listen up

Gurman first suggested this update was on the way back in November, though at the time it wasn't clear exactly when “hey Siri” would become simply “Siri”. Deeper Siri integrations with third-party apps and a better understanding of context have also been rumored.

At the moment, Google Assistant still requires a "hey Google" wake-up command, though you can disable it for certain quick commands, and there has been talk of further changes here. As for Amazon Alexa, just an "Alexa" command is enough to get started.

We're expecting a whole host of software and hardware announcements at WWDC 2023 this year, including all the news about iOS 17 and a big reveal for the Apple VR headset – and of course you'll be able to read all about it here on TechRadar.


Analysis: expect yet more AI

Amidst the flurry of generative AI updates we've had in recent months, it's easy to forget that digital assistants like Siri have been around for many years now, with AI models leveraged to recognize and interpret voice commands from users.

At Google I/O 2023, Google seemed keen to remind everyone that it has a lot of artificial intelligence tools to show off, and the company has since been busy pushing more AI into more of its products – such as Google Messages.

We can probably expect the same from Apple at WWDC 2023: a look back at the AI that it's already been using, and a look forward to new innovations on the way. Siri, based on tech Apple acquired in 2010, is likely to play a big part in those new innovations.

AI is a hot topic at the moment, and we know that Apple isn't going to want to miss out or fall behind, whether that's with Siri or any of its other software: Google, OpenAI, Microsoft and others have set the pace, and Apple needs to catch up.


This Siri replacement may be the closest thing to having ChatGPT on your iPhone

Siri’s 11-year reign may be drawing to a close as a new generative AI rival called Perplexity has just landed on the Apple App Store.

Note that Perplexity is not based on ChatGPT and runs on its own AI, though it functions in a similar manner. You ask it a question – who designed the Golden Gate Bridge, say – and it churns out a response in a few seconds. From there, you can type in a follow-up question, use the voice command feature to ask verbally, or pick one from the generated selection. Responses can then be shared via a messaging app or social media platform, and it'll even save "your thread history so you can pick up where you left off," according to the App Store listing. Unlike ChatGPT, though, Perplexity actually provides the sources it used to generate its answers: underneath each response you'll see citations from Wikipedia, local news sites, and national outlets such as NPR, each accompanied by a brief piece of text providing context.

To use Perplexity, you'll need an iPhone running iOS 16 or later to install the app, or an M1 Mac running macOS 13.0 or later.

As for an Android version, it doesn’t appear the developers behind Perplexity have any plans to make one at the time of this writing. We reached out to the team via the official PerplexityAI Discord channel, but haven’t heard back yet. The closest thing you can do is use the AI through a web browser on your Android device or Windows computer. There’s also a Chrome extension available if you prefer more direct access.

Perplexity on an iPhone (Image credit: PerplexityAI/Twitter)

Siri's future

With the introduction of Perplexity on iPhone, you essentially have some form of ChatGPT-like AI on every Apple platform – and we don't mean accessing generative AI through a web browser. Mac computers have MacGPT, a native desktop app offering quick access that saw a big update recently, and a couple of weeks ago the Apple Watch got Petey, an easily accessible on-device assistant to replace Siri.

So we can't help but wonder what's cooking behind the scenes at Apple. The company has been suspiciously quiet, allowing the likes of ChatGPT to roam its platform unfettered as third-party apps. It's unlike Apple to leave a new tech-industry trend like this alone, so it must be up to something. The closest indication we currently have is Apple enacting some new App Store rules for generative AI: according to a report by CNBC, it recently rejected an update to the BlueMail app over concerns that the software's ChatGPT feature didn't include a filter to protect minors from inappropriate content.

The date for Apple’s Worldwide Developer Conference 2023 has been set for June 5. We’re looking forward to seeing what changes, if any, Siri may get. It'll be interesting to see if the long-standing virtual assistant will get a revamp allowing it to go toe-to-toe with its newfound rivals or be thrown out for a brand new AI model. 

Until then, be sure to check out TechRadar's list of the best virtual assistants for 2023.


ChatGPT lands on the Apple Watch and Siri should be worried

Just a few months after its integration into Bing, ChatGPT has made the leap to the Apple Watch in the form of a third-party app.

It's called Petey – AI Assistant, and it was created by developer Hidde van der Ploeg (listed as Modum B.V. on the App Store). Originally it was known as watchGPT, but due to trademark issues with the acronym "GPT", the name had to be changed. Judging by a demo video posted by the developer on Twitter, Petey functions similarly to Siri: you open the app, ask it a question, and it answers in just a few seconds via text-to-speech. To continue an inquiry, you swipe down on the watch face, then tap Reply. Unlike Apple's own Siri, Petey can provide fairly complex answers, such as step-by-step instructions for catching a fish.

One of the problems with voice assistants like Siri is that they are fairly rigid in what they can do. You have to ask those AIs specific questions in a certain manner to get a response. ChatGPT, on the other hand, is more flexible in what it can do, from writing business letters to even drafting Christmas stories. It’s hard to say exactly how capable Petey is, but at the very least, it appears you won’t have to struggle with it as much.

A work in progress

Petey is a work in progress, with new features constantly being added. Right now you have a handful to work with. For starters, you can share responses with other people "via text, email, or social media", although the App Store listing doesn't specify which platforms.

The app can be set as a complication on the Apple Watch's face for quick access. Support for multiple languages is growing as well, bringing the total to 14 – Petey now supports German, Italian, and Japanese, to name a few. And if you prefer, Petey comes with a tiny on-screen keyboard so you can type in your questions, though you're probably better off using your voice.

As for future updates, there are several things in the works. From what is known, van der Ploeg is working on adding a History tool so you can go back to a previous question, making vocal inputs the default setting, and improving the app’s overall performance so you can ask it multiple questions.

There are a couple of caveats, however. One: the app isn't free – you'll have to purchase it for $4.99 (about £4, over AU$7, and almost €5) on the App Store. Two: to use Petey, you must have an Apple Watch running watchOS 9 or later, so make sure you update your device if you haven't already. We should mention the software does not collect user data, so rest assured, your privacy is safe.

Users with an Android smartwatch will be out of luck, unfortunately. When asked about a potential Android version, van der Ploeg said there won't be one as his “skillset wouldn't allow [for] that”.


ChatGPT will bury Siri for good if Apple doesn’t move fast

Back in 2011, Apple introduced its Siri voice assistant with the iPhone 4S and iOS 5, and since then Siri has made its way onto the whole range of Apple's products.

Technically speaking, Siri's original developers should get the credit here – many people still don't know that it was actually a third-party iOS app for just a few months before Apple acquired it, shutting down any plans to bring the software to competitor operating systems like Android and BlackBerry.

In response, we saw a veritable uprising from the world's biggest tech companies to try and compete. Microsoft introduced Cortana in 2014, Amazon's Alexa joined the fray later that year, Google followed with Google Assistant in 2016, and more recently we've seen newcomers like Samsung's Bixby and Baidu's DuerOS pop up.

As these applications have battled it out, however, a new player has entered the field with an entirely different skill set that could disrupt the voice assistant space completely: ChatGPT.

Spot the difference

Original comic sourced from @pedro_bilohh on Twitter, edited masterfully by yours truly. (Image credit: Future / Twitter @pedro_bilohh)

So let's get the differences out of the way first. ChatGPT is an incredibly powerful chatbot with a human-like vocabulary, bolstered by near-unfettered access to information. Siri and other voice assistants, by contrast, are programmed to be more binary, with a set range of requests and responses they can understand.

If you were to ask ChatGPT for assistance with writing or problem-solving, or even for some more unusual use cases, you're likely to be surprised and delighted by its capabilities. Powered by the same technology, Bing can also comprehend more challenging questions, even if you ask it about love.

ChatGPT was created by OpenAI, a company which – as its name suggests – lets other organizations build on its technology, in contrast to the closed, proprietary tech found in Siri. This means app developers can easily add ChatGPT to all kinds of interesting and exciting apps.
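
To illustrate the point – and this is only a minimal sketch against OpenAI's public chat completions endpoint, not any particular app's code – a developer with an API key (assumed here to live in an OPENAI_API_KEY environment variable) can get a ChatGPT-style answer with a single HTTP request:

```python
# Minimal sketch: how a third-party app might call OpenAI's chat API.
import os
import requests

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "user", "content": "Suggest a name for a weather app."}
        ],
    },
)
print(response.json()["choices"][0]["message"]["content"])
```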

Siri, however, wouldn’t be able to do the same. It’s great for task assistance, especially when boosted by shortcuts, and for quickly navigating tasks hands-free on your phone. 

However, it’s frustratingly limited in scope beyond this and struggles to deal with more complex requests even in comparison to Alexa, despite Apple's efforts to enhance it over the years. Plus, I still have a bone to pick with how rubbish its voice recognition can be. 

Get with the program, Apple

Microsoft is now stealing the lead in the innovation race with Bing, despite some early teething issues, but Google is hot on its tail. Now, while these are both progressions in the search engine space, it’s only a matter of time before eyes turn to voice assistants.

Siri is used for search, after all – but despite having many years to iterate, search remains one of its most frustrating, clunky features.

Already, keen users are creating ways to embed ChatGPT's more advanced conversational processing into Siri. It's far from perfect, and there are some natural, very warranted security and privacy concerns around ChatGPT, but this eagerness to bolster Siri's capabilities shows the potential for Apple to capitalize on.

So, why is Apple dragging its feet? 

I wonder if Apple's fall detection encompasses lagging behind in the innovation race? (Image credit: Apple)

Playing the long game 

For me, there's only one reason a giant like Apple wouldn't move with pace to recapture the voice assistant market. Like the great tactician Cruella de Vil, Apple may just be biding its time before striking.

“You come to realize, you’ve seen her kind of eyes watching you from underneath a rock”

Disney’s 101 Dalmatians

Apple has made a slew of AI acquisitions in recent years that we haven't yet seen amount to much, and despite a few small rumors indicating something might be coming, the tech giant has been characteristically reserved since the big Bing and Bard blowup.

My take is that Apple was always planning to release something, but I find it hard to believe ChatGPT and Bing didn't somewhat blindside it. After all, even Google seemed a little pressed to get Bard up and running quickly in response. As a result, Apple was faced with two choices: rush to join the race, or wait and see how the chips fall. Seemingly, it chose the latter.

Now, while Apple wouldn’t stand to lose much by biding its time, it could win big if it comes out with a Siri far more capable than anything else on the market – and if we look at the wider Apple ecosystem and progress elsewhere in the tech space, it seems likely that the company is hoping to kick off with a bang. 

Take the smart home space, for example. This year will see Matter, the software standard driving smart home interoperability, begin to really make an impact in people's homes. If Siri can get the jump on Alexa and Google Assistant, a more conversational and customizable Siri could rocket Apple Home into the lead. Think Disney's Smart House, but without the murderous vibes.

We're just a few months out from WWDC, Apple's developer conference in California, where the company usually showcases its latest software updates (and launches the odd piece of hardware as well). By the time it rolls around, the dust will have settled somewhat on Google and Microsoft's forays into AI. So long as Amazon doesn't step in with its own major Alexa overhaul, this could be the perfect opportunity for Apple to sweep in and steal the limelight – potentially, even, with a much more thoroughly thought-out AI.

Time will tell – but one thing is certain: slow and steady may win the race, but not if you never leave the starting line.


Microsoft’s CEO calls Alexa and Siri ‘dumb’ – but ChatGPT isn’t much smarter

In an interview with the Financial Times a few weeks ago, Microsoft’s CEO Satya Nadella dismissed voice assistants, such as Alexa and Siri, as “dumb as a rock”.

This might seem a little rich coming from the CEO of a company that launched (and then abandoned) the unloved Cortana voice assistant, but I actually agree. However, unlike Nadella, I'm not so sure that the new wave of AI chatbots are where the future really lies – or at least not yet. 

Sure, they appear to be smarter than the first bunch of voice assistants, including Amazon's Alexa, Apple's Siri and Google's (less charmingly named) Assistant, but that's not saying a lot. I was initially really impressed with these assistants, particularly Alexa, to the extent that I put aside my misgivings about how much information Amazon already collected about me, and filled my home with Echo devices of all shapes and sizes.

That all seems a long time ago now, though. Despite all the promise those voice-activated digital assistants had when they launched, I can't help but feel they’ve turned into little more than hands-free light switches and timers for when I’m cooking. They even made me temporarily forget how to use a real light switch. Seriously.

That's it. I don't even use Alexa to play music any more – partly because none of the Echo devices I have come with remotely decent speakers, and partly because Alexa seems to have developed a strange habit: when I ask for a song, it more often than not plays a random alternative take or live version rather than the studio version I was after. All very frustrating, especially if you're a Bob Dylan fan.

Even as a light switch, I’ve found it increasingly unreliable. I now often have to repeat myself several times before Alexa understands my request and complies. That’s if I’m lucky. Sometimes it listens, then just does nothing.

It’s become more of an inconvenience and annoyance – the exact opposite of what these virtual assistants were supposed to be. To be fair to Nadella, he told the Financial Times that “Whether it’s Cortana or Alexa or Google Assistant or Siri, all these just don’t work. We had a product that was supposed to be the new front-end to a lot of [information] that didn’t work.”

We’re not alone in getting disillusioned with voice assistants. As the Financial Times reports, Adam Cheyer, co-creator of Apple's Siri, says that “the previous capabilities have just been too awkward… No one knows what they can do or can’t do. They don’t know what they can say or can’t say.”

It also seems like the companies behind the voice assistants are losing interest. Not only did Microsoft unceremoniously dump Cortana after years of trying to get Windows 10 and Windows 11 users to embrace (or at least tolerate) it, but Amazon has also cut a large number of jobs recently, and there are reports that the teams involved with Alexa and Echo devices have been particularly hard hit.

Two wrongs don’t make a right

It may be easy to suggest that Nadella’s dismissal of voice assistants is down to sour grapes, as Microsoft’s Cortana was the least popular out of the ‘big four’ – which also includes Alexa, Google Assistant, and Siri (sorry, Samsung, but no one likes Bixby either) – but I certainly agree with him. The shine has worn off.

However, it's increasingly looking like Microsoft thinks that artificial intelligence chatbots, most notably ChatGPT, could solve these problems – and it's here where I'm going to have to disagree, at least for now.

Microsoft is a big investor in OpenAI, the company behind ChatGPT, and when it announced it was bringing the power of ChatGPT to its Bing search engine, it managed something rare: it got people excited about Bing.

Suddenly, people were keen to try out a search engine which had for so long been neglected in favor of Google. This surge in interest, plus widespread coverage in the press, has deepened Microsoft's love affair with ChatGPT.

Having an AI bot that can converse with humans in a life-like way, and use huge amounts of stored data in its own libraries and on the internet to answer questions, seems like the natural evolution of voice assistants.

And, one day it might be. However, the technology has so far not lived up to expectations. People using ChatGPT or the version included in Bing have found the chatbot can give incorrect information, and it can also behave strangely, especially if you challenge it when it replies with the wrong answer. A similar issue emerged with Google’s rival Bard AI, which returned an incorrect answer to a question during the launch event. This quickly became quite embarrassing for Microsoft and Google, and it proved to a lot of us that AI bots are not quite ready for the limelight.

Can’t live up to the hype, sometimes unreliable and even a bit frustrating? That certainly sounds familiar, so if Microsoft and other companies don’t want history repeating, they’d do well to think twice before rushing to implement AI bots in voice assistants.


Siri is yet to regain crucial accessibility features that disappeared with iOS 15

Apple has still not reinstated functionality that was removed with iOS 15, which previously let Siri perform various tasks such as checking and playing voicemails.

The commands in question are valuable on the accessibility front: visually impaired or legally blind users – and indeed other Apple device owners – could use them to keep on top of their voicemail, as mentioned, or to instruct Siri to send an email, check recent calls, or review their call history.

MacRumors highlighted the functionality which went missing back in September, just after iOS 15 was released, and the site reminds us that these features haven’t been restored to Siri, despite growing complaints around this.

An inaccessible Siri for some

One reader told MacRumors: “For many fully blind people (like my blind mom) this makes their phone almost unusable, because they can’t ask Siri who has called, and they can’t ask Siri if they have voicemail. (Their official ‘workaround’ for voicemail, in fact, is calling the old-school carrier voicemail number, to check your voicemail over the phone.)”

Another message on Apple’s support forum reads: “My cousin who is legally blind is also experiencing this issue. He uses the read missed call and open voicemail when doctors call and leave messages. It allows him the opportunity to hear who called and he can call back. It has been a wonderful feature for him and I’m hoping that a fix will happen so that he can be able to use the feature again.”

There are also complaints on Reddit like this: “My dad is blind and uses Siri to operate a lot of his iPhone. This week we’ve noticed when he asks Siri to play voicemail or missed calls she says she can’t help with that.”

We've reached out to Apple for comment and will update this story once we hear back.


Analysis: Reasons why are unclear – but keep the feedback flowing on this

It’s unclear whether Apple intentionally removed the support for these features with iOS 15 – and if so, why that was done – or if it’s wrapped up in some bug – and if it’s the latter, whether a fix is coming soon. But whatever the case, this represents a worrying step backwards in terms of accessibility for iOS devices, to say the least.

While Apple itself hasn’t commented directly, as MacRumors notes – we’ve also reached out to the company to try and ascertain what’s going on here, and will update this story if we hear back – one message on the Apple help forum claims to have had a reply from the Accessibility Support team. They were advised to again contact Accessibility Support and “have your Apple ID added to the official engineering issue as an ‘affected user’ so that you receive the mass emails about the status of this fix”.

Further advice was to submit feedback on the missing features, as the amount of feedback gathered "does affect engineering's prioritization of this issue".

There may be hope yet, then, for some kind of return of these capabilities to Siri in the near future. Fingers crossed.
