Microsoft’s ChatGPT-powered Bing gets a personality makeover you may not like

Microsoft has applied some further fine-tuning to its Bing AI, upping some chat limits and making changes to one of the chatbot’s personalities.

Chat limits have now been extended to allow up to 15 sessions per day, with the maximum length of a session pushed out to 10 queries (meaning your daily ceiling is now 150 queries in total).

Microsoft has slowly but surely been raising those chat limits ever since it heavily restricted the AI (to five queries per session and 50 queries daily) in the wake of the chatbot’s behavior going seriously awry in longer sessions shortly after launch.

But the more interesting change, as revealed by Yusuf Mehdi, Corporate VP & Consumer Chief Marketing Officer at Microsoft, is an optimization of the ‘Balanced’ personality for better performance.

As you may be aware, there are three personalities available for Microsoft’s AI. Balanced is the middle setting for the chatbot, a halfway house between Precise and Creative, which remain unchanged.

Precise offers more concise and business-like answers, more akin to a standard search, whereas Creative gives the AI more free rein in its replies – with Balanced lying in-between the two as a compromise option.


Analysis: Precise, A Bit Less Precise, and Creative?

As with any compromise, deciding exactly where to draw the line can be a tricky affair. However, it seems that Microsoft is shifting that line to a more conservative position with this latest change.

With Balanced now giving “shorter, quicker responses,” that clearly sounds more in line with the Precise setting than with Creative, which is where the AI is allowed more freedom to ramble – and frankly, to be more interesting and human-like.

Therefore, moving the Balanced dial towards the conservative end of the spectrum could be viewed as making the Bing AI a bit more straitlaced and, well, boring.

The whole point of having the three personalities is to give users a choice over how the AI responds, so that if they’re not happy with their interactions with the ChatGPT-powered entity, they can switch things around. But now it feels like there’s slightly less choice, with what amounts to a ‘very conservative’ setting, a ‘somewhat conservative’ option, and a ‘freer rein’ choice.

Why has Microsoft moved in this direction? Our guess is that folks who want a more human-like chat experience are already using Creative and probably wouldn’t dream of dipping a toe into Balanced anyway. Perhaps few people are using Balanced at all, so tuning it towards Precise may tempt Precise users into trying the middle option – whereas those on Creative will most likely stick with it, as they want the AI to be as interesting and open as is inhumanly (ahem) possible.

Whatever the case, we can expect further tuning, and likely other personality choices, down the line. We may even get a mode whereby the Bing AI can impersonate celebrities, if leaks are on the money. That would likely help push user numbers even higher, with a good few folks already signed up to test-drive the chatbot.

Via MS Power User

Fed up with the Bing AI chatbot’s attitude? Now you can change its personality

Microsoft’s Bing chatbot is now offering a choice of personalities to all users, with the rollout of the Bing Chat Mode selector complete.

This news was shared on Twitter by Mikhail Parakhin, head of Microsoft’s Advertising and Web Services division, as spotted by MS Power User.

At the time of Parakhin’s tweet, 90% of Bing chatbot users had the tri-toggle chat selector that lets you switch between three different personalities for the AI (Precise, Balanced, or Creative).

The remaining control group (10%) then had the selector rolled out to them over the course of yesterday, so everyone should have it by now. That’s good news for anyone who wants more options when it comes to the chatbot’s responses to their queries.

Earlier this week, we saw other work on the AI to reduce so-called ‘hallucinations’ (where the chatbot gives inaccurate info, or simply makes a mistake). There was also tinkering to ensure that instances where Bing fails to respond to a query at all happen less often.

While that’s all good, on the latter count a fresh stumbling block appears to have been introduced with the latest version of the chatbot (the one with the personality selector): namely, a ‘something went wrong’ error message when querying the ChatGPT-powered AI.

In the replies to Parakhin’s tweet, there are a few complaints along these lines, so hopefully this is something Microsoft is already investigating.


Analysis: Creative for the win? Maybe for now…

Doubtless there will be plenty of experimentation with the chat modes to determine exactly how these three personalities are different.

Thus far, the ‘Creative’ setting seems to be getting the most positive feedback, and it’s likely the one many Bing users are plumping for, simply because this is where the AI has the most free rein and so seems more human-like – unlike ‘Precise’ mode, which reads more like a straight answer to a search query (arguably somewhat defeating the point of having an AI carry out your searches in the first place).

‘Balanced’ is a middle road between the two, so that may tempt fans of compromise, naturally.

Initial feedback indicates that in Creative mode Bing gives more detailed answers, not just adding a more personal touch but seemingly fleshing out replies in greater depth. That’s going to be useful, and it’s likely to make this the most popular choice – especially as this setting is where you’ll get the more interesting, occasionally eccentric, or even outlandish responses.

Microsoft may need to work on making the Balanced setting a more compelling choice, particularly if it sees traffic heavily skewed towards the Creative option.

That said, Creative’s popularity is likely tied in part to how new the AI is, attracting people who are curious and just want to mess around with the chatbot to see what they can get Bing to say. Those kinds of users will doubtless get bored of toying with the AI before too long, giving a different picture of personality usage once the dust settles a bit.

At any rate, tweaking Bing’s personalities is something that’ll doubtless happen on an ongoing basis, and we may even get more options aside from these initial three eventually. Come on, Microsoft, we all want to see ‘Angry’ Bing in action, or maybe a ‘Disillusioned’ chatbot (or how about an ‘Apocalypse Survivor’ setting?). No?
