Google Search can help you with your math homework thanks to a new update

Google is updating its search engine and Lens tool with new features to help students visualize and solve tricky math problems.

We’re not talking about basic arithmetic, either. The upgraded Google Search can now tackle more complex forms of math like calculus and trigonometry. All you have to do is type the equation or integral into the search bar at the top, or take a picture of your homework with Lens. You’ll then see a series of step-by-step instructions explaining how to solve it, with the answer at the bottom. Geometry is also supported, though the company recommends using Google Lens for those problems since they can involve diagrams. You won’t be able to draw shapes into the search bar, so uploading a photo of the problem is your best bet.
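To give you an idea of the kind of walkthrough to expect (this example is our own, not Google’s actual output), a simple calculus problem might be broken down along these lines:

∫ 2x dx
= 2 ∫ x dx (pull the constant out front)
= 2 · (x² / 2) + C (power rule: ∫ xⁿ dx = xⁿ⁺¹ / (n + 1))
= x² + C

Google’s solver lays out similar intermediate steps, with the final answer at the bottom.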

Google Search's new math tutor on mobile

(Image credit: Google)

Additionally, you can type in word problems for physics questions. Google Search will highlight the “known and unknown values” and then show you the correct formula to use for that particular problem. As an example, if you need to find the average acceleration of a cyclist going down a hill, it’ll point you to the specific kinematic formula needed.
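For the curious (the numbers below are our own illustration, not Google’s), the relevant kinematic formula defines average acceleration as the change in velocity divided by the elapsed time:

a = (v_final − v_initial) / t

So a cyclist who speeds up from 2 m/s to 10 m/s over 4 seconds has an average acceleration of (10 − 2) / 4 = 2 m/s². That’s the sort of calculation Google Search now sets up for you by pairing the highlighted values with the right formula.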

Google Search helping with physics

(Image credit: Google)

The math update is currently live on desktop and in the mobile app. Google says you can type the phrase “math solver” into the search bar to try out the new experience on desktop. However, when we did, nothing popped up; it’s possible the phrase refers to a future expansion, but we’re not sure. Either way, feel free to type the math problem directly into the search bar. You don’t need to bring up anything else.

Advancement in science

Alongside the mathematics help, Google is rolling out interactive 3D models for certain fields of science, such as physics, biology, and chemistry. The diagrams let you zoom in on an object as well as provide definitions of what you’re looking at.

At the time of writing, the update doesn’t appear to be widely available. We saw interactive 3D models for basic concepts like the individual parts of a cell and elements of the periodic table, but nothing for specific types of cells or molecules. You can look up a model of an oxygen atom, but not a carbon dioxide molecule, for instance. What’s more, nothing had a definition; it was just the model.

We reached out to Google to ask whether the update is getting a global release or will only be available in a few countries like the United States. We’ll update this story if we hear back.

Undoubtedly, this will help students advance in their courses. But don't forget about the hardware. If you're in the market for a computer, be sure to check out TechRadar's list of the best student laptops for 2023.


Bing Chat is acting like a sulky teenager, refusing to do its homework and throwing tantrums – what gives?

The last few weeks have brought some trouble for Microsoft’s flagship chatbot, Bing Chat, powered by OpenAI’s GPT-4 tech. People who have made use of Microsoft Edge’s ‘Compose’ box, which has Bing Chat integrated into it, have reported that it’s become less helpful in answering questions, or has fallen short when asked to assist with queries.

Windows Latest investigated these claims and found an increase in the following response: “I’m sorry, but I prefer not to continue this conversation. I’m still learning, so I appreciate your understanding and patience.” 

When Mayank Parmar of Windows Latest told Bing that “Bard is better than you,” Bing Chat seemingly picked up on the adversarial tone and quickly brought the conversation to an end. 

After Bing Chat closed off the conversation, it provided three response suggestions: “I’m sorry, I didn’t mean to offend you”, “Why don’t you want to continue?” and “What can you do for me?” Because these were provided after Bing Chat ended the conversation, they couldn’t be clicked.

What's Microsoft got to say about it?

Like me, you may find this behavior whimsical and funny, but also a little concerning. Windows Latest contacted Microsoft to see if it could provide some insight into this behavior from Bing Chat. Microsoft replied that it is making an active effort to observe feedback closely and address any concerns that come up, and emphasized that Bing Chat is still in an ongoing preview stage with plenty of development to go.

A Microsoft spokesperson told Parmar over email: “We actively monitor user feedback and reported concerns, and as we get more insights… we will be able to apply those learnings to further improve the experience over time.” 

Asking Bing Chat to write 

Looking at Reddit posts on the subject, Windows Latest found a user in one comment thread describing how they bumped up against a similar problem with Bing Chat’s “Compose” tool, which is integrated into the Edge browser. The tool lets users try different tone, format, and length options for Bing’s generated responses.

In Windows Latest’s demo, the Compose tool also refused a request to simply write a tongue twister, and then started spouting excuses about humor being subjective and not wanting to generate harmful content. Puzzling. 

Another Reddit user asked Bing Chat to proofread an email in a language that wasn’t native to them. Bing responded a bit like an angry teenager, telling the user to “figure it out” and giving them a list of alternative tools. The user finally got Bing to do what they asked only after downvoting its responses and making multiple follow-up attempts.

One theory that’s emerged to explain this odd behavior is that Microsoft is actively tweaking Bing Chat behind the scenes and that’s manifesting in real time. 

A third Reddit user observed that “It’s hard to fathom this behavior. At its core… AI is simply a tool. Whether you create a tongue-twister or decide to publish or delete content, the onus falls on you.” They went on to say that it’s hard to understand why Bing Chat is making seemingly subjective calls like this, and that it could leave other users confused about what the tool is supposed to do.

I tried it for myself. First, in the Chat feature, I asked for a maxim for the day that I could use as a mantra, and Bing obliged. It returned: “Here’s a maxim for you: ‘The only way to do great work is to love what you do.’ – Steve Jobs.” Checks out.

Bing Chat replying to a request to provide a maxim for the day.

(Image credit: Future)

Next, in the Compose feature, I asked for an enthusiastically toned draft of an email requesting to join my local garden club. Again, Bing helped me out.

User asking Microsoft's Bing Chat to write a letter requesting to join the local gardening club.

(Image credit: Future)

Bing Chat's Compose feature writing a letter to join the local gardening club for the user.

(Image credit: Future)

As far as I can tell, Bing Chat and its AI are working as intended, though Windows Latest did provide screenshots of its trials as well. It’s intriguing behavior, and I can see why Microsoft would be keen to remedy things as quickly as possible.

Text generation is Bing Chat’s primary function, and if it straight up refuses to do that, or starts being unhelpful to users, it rather diminishes the point of the tool. Hopefully, things are on the mend for Bing Chat, and users will find that their experience has improved. Rooting for you, Bing.


Google denies that Bard AI copied ChatGPT’s homework

Google’s Bard AI has found itself at the center of controversy again, this time over allegations that the Bing rival was trained using data pulled from OpenAI’s ChatGPT.

As you may be aware, ChatGPT is the power behind the throne of Bing AI, and the accusation of nefarious activities behind the scenes comes from a report by The Information.

We’re told that Jacob Devlin, a software engineer at Google – an ex-engineer, we might add, having departed the firm over this affair – claims that Google used ChatGPT data (scraped from the ShareGPT website, apparently) to develop Bard.

Devlin notes that he warned Google against doing so, as this clearly went against OpenAI’s terms of service.

According to the report, Google ceased using the mentioned data after the warnings from Devlin (who left Google to join OpenAI, we’re informed).

Google denies any of this, though. A company spokesperson, Chris Pappas, told The Verge: “Bard is not trained on any data from ShareGPT or ChatGPT.”


Analysis: A denial amid some desperation

There we have it, then: a flat denial from Google, in no uncertain terms, that anything underhand was going on data-wise with Bard. And to be fair, there’s certainly no evidence that Bard’s answers are remotely like the ones given by ChatGPT. (Devlin had warned that training on the scraped data could make Bard’s answers resemble ChatGPT’s, making it obvious enough what had gone on.)

We suppose the trouble with this episode is that it very much feels like Google rushed Bard to release, dropping clangers while doing so, as it was forced to play catch-up with Microsoft’s Bing AI. Given that the latter is already driving search engine adoption for Bing at this early stage, it could be easy enough for some to believe that Google is getting a bit desperate with its tactics behind the scenes.

Whether or not the tale about poached data is true (we’ll take Google’s word that it isn’t), the report makes another interesting revelation: Google’s Brain AI group is now working with AI firm DeepMind, both of which sit under the umbrella of parent company Alphabet.

DeepMind has seemingly been recruited into the mix to swiftly hone and power up Bard, which is notable because the two AI outfits are big rivals, now very much forced to collaborate.

This again sketches a picture of a rather desperate scramble to get Bard steadier on its feet, while Microsoft’s Bing AI keeps getting updated with new features at a fair old rate of knots. (Although fresh rumblings about one of the potential next ‘features’ for the Bing chatbot have us very concerned, it has to be said).

You may also recall alarm bells being rung on the privacy front when Bard itself made an apparent revelation that it used internal Gmail data for training, again prompting Google to tell us that this is not the case and that the bot got things wrong. Bard getting things wrong, of course, is very much part of a bigger issue.
