Microsoft’s CEO calls Alexa and Siri ‘dumb’ – but ChatGPT isn’t much smarter

In an interview with the Financial Times a few weeks ago, Microsoft’s CEO Satya Nadella dismissed voice assistants, such as Alexa and Siri, as “dumb as a rock”.

This might seem a little rich coming from the CEO of a company that launched (and then abandoned) the unloved Cortana voice assistant, but I actually agree. However, unlike Nadella, I'm not so sure that the new wave of AI chatbots is where the future really lies – or at least not yet.

Sure, they appear to be smarter than the first bunch of voice assistants, including Amazon's Alexa, Apple's Siri and Google's (less charmingly named) Assistant, but that's not saying a lot. I was initially really impressed with these assistants, particularly Alexa, to the extent that I put aside my misgivings about how much information Amazon already collected about me, and filled my home with Echo devices of all shapes and sizes.

That all seems a long time ago now, though. Despite all the promise those voice-activated digital assistants had when they launched, I can't help but feel they’ve turned into little more than hands-free light switches and timers for when I’m cooking. They even made me temporarily forget how to use a real light switch. Seriously.

That’s it. I don’t even use Alexa to play music any more. That’s partly because none of the Echo devices I own has remotely decent speakers, and partly because Alexa seems to have developed a strange habit: when I ask for a song, it more often than not plays a random alternative take or live version rather than the studio recording I was after. All very frustrating, especially if you're a Bob Dylan fan.

Even as a light switch, I’ve found it increasingly unreliable. I now often have to repeat myself several times before Alexa understands my request and complies. That’s if I’m lucky. Sometimes it listens, then just does nothing.

It’s become more of an inconvenience and annoyance – the exact opposite of what these virtual assistants were supposed to be. To be fair to Nadella, he told the Financial Times that “Whether it’s Cortana or Alexa or Google Assistant or Siri, all these just don’t work. We had a product that was supposed to be the new front-end to a lot of [information] that didn’t work.”

We’re not alone in getting disillusioned with voice assistants. As the Financial Times reports, Adam Cheyer, co-creator of Apple's Siri, says that “the previous capabilities have just been too awkward… No one knows what they can do or can’t do. They don’t know what they can say or can’t say.”

It also seems like the companies behind the voice assistants are losing interest. Not only did Microsoft unceremoniously dump Cortana after years of trying to get Windows 10 and Windows 11 users to embrace (or at least tolerate) it, but Amazon has also cut a large number of jobs recently, and there are reports that the teams working on Alexa and Echo devices have been particularly hard hit.

Two wrongs don’t make a right

It may be easy to suggest that Nadella’s dismissal of voice assistants is down to sour grapes, as Microsoft’s Cortana was the least popular out of the ‘big four’ – which also includes Alexa, Google Assistant, and Siri (sorry, Samsung, but no one likes Bixby either) – but I certainly agree with him. The shine has worn off.

However, it’s increasingly looking like Microsoft thinks that artificial intelligence chatbots, most notably ChatGPT, could solve these problems – and it’s here that I’m going to have to disagree, at least for now.

Microsoft is a big investor in OpenAI, the company behind ChatGPT, and when it announced it was bringing the power of ChatGPT to its Bing search engine, it managed something rare: it got people excited about Bing.

Suddenly, people were keen to try out a search engine that had for so long been neglected in favor of Google. This surge in interest, plus widespread coverage in the press, has deepened Microsoft’s love affair with ChatGPT.

Having an AI bot that can converse with humans in a life-like way, and use huge amounts of stored data in its own libraries and on the internet to answer questions, seems like the natural evolution of voice assistants.

And, one day it might be. However, the technology has so far not lived up to expectations. People using ChatGPT or the version included in Bing have found the chatbot can give incorrect information, and it can also behave strangely, especially if you challenge it when it replies with the wrong answer. A similar issue emerged with Google’s rival Bard AI, which returned an incorrect answer to a question during the launch event. This quickly became quite embarrassing for Microsoft and Google, and it proved to a lot of us that AI bots are not quite ready for the limelight.

Can’t live up to the hype, sometimes unreliable and even a bit frustrating? That certainly sounds familiar, so if Microsoft and other companies don’t want history repeating, they’d do well to think twice before rushing to implement AI bots in voice assistants.


Bing chatbot just got smarter – and it’s about to get different AI personalities

Microsoft has deployed a new version of its Bing chatbot (v96) featuring improvements to make the AI smarter in a couple of key areas – and a big change has been flagged up as imminent, too.

Mikhail Parakhin, who heads up the Advertising and Web Services division at Microsoft, shared this info on Twitter (via MS Power User).

So, what’s new with v96? Parakhin explains that users of the ChatGPT-powered Bing will now experience a ‘significant’ reduction in the number of times that the AI simply refuses to reply to a query.

There will also be “reduced instances of hallucination in answers” apparently, which is industry lingo meaning that the chatbot will produce fewer mistakes and inaccuracies when responding to users. In short, we should see less misinformation being imparted by the chatbot, and there have been some worrying instances of that occurring recently.

The other major news Parakhin delivers is that the so-called tri-toggle, known more formally as the Bing Chat Mode selector – featuring three settings to switch between different personalities of the AI Bing – is set to go live in the “next couple of days”, we’re told.


Analysis: Long and winding road ahead

The ability to switch between a trio of personalities is the big change for the Bing chatbot, and to hear that it’s imminent is exciting stuff for those who have been engaging with the AI thus far.

As detailed previously, the three personalities on offer are labeled Precise, Balanced, and Creative. Creative is set to provide a chattier experience, Precise will offer a shorter, more typical ‘search result’ style of delivery, and Balanced sits somewhere between the two. So, if you don’t like how the AI is responding to you, at least there will be options to alter its behavior.

Various versions of the Chat Mode selector have been tested, as you would imagine, and the final design has now been picked. It is being honed ahead of release, which, as noted, should happen later this week, though we’re guessing there’ll be plenty of further fine-tuning to be done post-release.

Certainly that seems likely if the overall Bing AI experience so far is anything to go by: the whole project is, of course, still in its early stages, and Microsoft is chopping and changing things – sometimes in huge ways – seemingly without much caution.

The tuning in v96 to ensure Bing doesn’t get confused and simply refuse to reply should help make the AI a more pleasant virtual entity to interact with, and hopefully the same will be true of the ability to switch personalities.

At the very least, the Creative personality should inject some much-needed character back into the chatbot, which is what many folks want – because if the AI behaves pretty much like a search engine, then the project seems a bit dry and frankly in danger of being judged as pointless. After all, the entire drive of this initiative is to make Bing something different rather than just a traditional search experience.

It’s going to be a long road of tweaking for the Bing AI, no doubt, and the next step after the personalities go live will likely be to lift that chat limit (which was imposed shortly after launch) to something a bit higher, allowing for more prolonged conversations, if not the full-on rambles initially witnessed that got the chatbot into hot water for the oddities it produced…
