Amazon tests a new AI assistant to answer your questions while you shop

Amazon is reportedly testing a new AI assistant on its mobile app that can answer customer questions about specific products.

The feature appears to have been first spotted by e-commerce research firm Marketplace Pulse. According to the firm, the AI can be found under the “Looking for specific info?” section on product pages. The large language model (LLM) powering the feature draws on seller-provided listing details and user reviews to answer questions. For example, you can ask whether a particular workout shirt is good for running or fits well on a tall person. Marketplace Pulse says its main purpose is to save people the trouble of reading individual reviews by summarizing all the available information into a succinct block of text.
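Amazon hasn’t shared any implementation details, but features like this are commonly built by grounding the model’s answer in the listing text and a sample of reviews, rather than letting it answer from general knowledge. Here’s a minimal sketch of that pattern in Python; the ask_llm() call is a hypothetical stand-in for whichever model actually backs the feature:

    # Illustrative only: the grounding pattern Marketplace Pulse describes,
    # where answers come from listing details and reviews, not general knowledge.
    # ask_llm() is a hypothetical stand-in for the model behind the feature.

    def build_prompt(question: str, listing: str, reviews: list[str]) -> str:
        review_block = "\n".join(f"- {r}" for r in reviews[:50])  # cap context size
        return (
            "Answer the shopper's question using ONLY the product listing and "
            "customer reviews below. If they don't contain the answer, say so.\n\n"
            f"Listing:\n{listing}\n\nReviews:\n{review_block}\n\n"
            f"Question: {question}\nAnswer:"
        )

    def answer_question(question: str, listing: str, reviews: list[str]) -> str:
        return ask_llm(build_prompt(question, listing, reviews))  # hypothetical LLM call

Constraining the model to the supplied text like this is what makes review summarization work, though, as noted below, it doesn’t fully prevent hallucination.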

[Image: Amazon AI assistant (Image credit: Marketplace Pulse/Amazon)]

Because it’s still in the early stages, the AI assistant is limited in what it can do. You can’t command it to compare two items or “find alternatives.” Although it can’t recommend specific products, Amazon’s chatbot can make soft suggestions. In another example, Marketplace Pulse asked the assistant if e-bikes are good for romantic dates. The AI said “not really” and recommended buying a tandem bike instead.

Quirks and unintended features

There are several quirks affecting the chatbot. Unsurprisingly, it’s “prone [to] hallucinating wrong information” about an item. Marketplace Pulse even claims it outright refused to “answer basic questions”. What’s more, the assistant can apparently answer prompts that “Amazon didn’t build it for.”

It can generate Python code, write jokes about a product, and answer in languages besides English. CNBC, which had access to the test, reportedly got it to describe items “in the style of Yoda from Star Wars.” Despite these abilities, you can’t hold a regular conversation with the AI as you can with ChatGPT.

[Image: Amazon’s AI assistant quirks (Image credit: Marketplace Pulse/Amazon)]

It’s unknown how widespread the test is; we didn’t have access to it on our phones. Amazon hasn’t said anything official so far, but we’ve reached out to the company for more information about the AI. We’ve also asked Marketplace Pulse whether it knows if the assistant is available broadly or only to a select group. We’ll update this story if we hear back.

Alexa upgrade

Amazon’s AI ambitions don’t stop there: a report from Business Insider reveals the tech giant is currently working on a revamped, paid version of Alexa. The upgrade, called Alexa Plus, is said to offer a “more conversational and personalized” experience akin to ChatGPT.

The team is aiming to launch Alexa Plus on June 30, according to the report. Unfortunately, development is not going smoothly. A source with intimate knowledge of the project told Business Insider the revamp is “falling short of expectations”. The AI is reportedly hallucinating false information, and the team is having a hard time getting the tech to work properly. The project is also said to be causing internal friction, with some arguing that people won’t want to pay for yet another Amazon service.

As things stand, it seems Alexa Plus might miss the June 30 deadline.

If you want your own digital sidekick, check out TechRadar's list of the best AI-powered virtual assistants.


We may finally know when Apple’s Vision Pro will launch – but big questions remain

If you’re an Apple fan and your New Year’s resolution is to save money this January, we’ve got some bad news: a new rumor says Apple’s Vision Pro headset will go on sale in just a few weeks’ time. However, and perhaps fortunately for your finances, there are some serious questions hanging over this rumor.

The mooted January launch date comes from Wall Street Insights, a news outlet for Chinese investors (via MacRumors). According to a machine-translated version of the report, “Apple Vision Pro is expected to be launched in the United States on January 27, 2024.”

The report adds that “Supply chain information shows that Sony is currently the first supplier of silicon-based OLEDs for the first-generation Vision Pro, and the second supplier is from a Chinese company, which will be the key to whether Vision Pro can expand its production capacity.”

With the supposed launch date just 25 days away, it might not be long before we see Apple’s most significant new product in years. Yet, despite the apparent certainty in the report, there are reasons to be skeptical about its accuracy.

Date uncertainty

For one thing, January 27 is a Saturday, an unlikely day for an Apple product launch. It could be that Wall Street Insights is referring to January 27 in China, which, thanks to time-zone differences, aligns with Friday, January 26 in the United States. That’s a much more probable release date, as it avoids the weekend, when many of the outlets that would cover the Vision Pro run reduced coverage. Yet the report specifically mentions the date in the US, so questions remain.

Moving past the specific date, an early 2024 launch date has been put forward by a number of reputable Apple analysts. Ming-Chi Kuo, for example, has suggested a late January or early February timeframe, while Bloomberg reporter Mark Gurman has zeroed in on February as the release month.

Either way, it’s clear that the Vision Pro is almost upon us. Apple has reportedly been training retail staff how to use the device, which implies that the company is almost ready to pull the trigger.

We’ll see how accurate the Wall Street Insights report is in a few weeks’ time. Regardless of whether or not it has the correct date, we’re undoubtedly on the brink of seeing Apple’s most anticipated new product in recent memory.


Google working on an AI assistant that could answer ‘impossible’ questions about you

Google is reportedly developing an AI assistant that will analyze personal photos and files, as well as Search results, with the goal of telling “your life story”.

The news comes from CNBC, which saw documents revealing that the tech giant recently held an “internal summit” where company execs and employees presented Project Ellman. According to the piece, the AI will offer a “bird’s-eye view” of someone’s life by grabbing files from your Google Account and using written biographies and adjacent content to understand context. This process includes sifting through those files to pinpoint important moments. Google employees claimed Project Ellman could deduce the day a user was born, who their parents are, and whether they have any siblings.

It doesn’t stop there: apparently, the AI can also highlight chapters in your life, like the years you spent at college or living in a certain city. Ellman can even learn your eating habits. If, for example, you upload a bunch of photos of pizza and pasta, the AI can infer that you’re probably a big fan of Italian food. The tech isn’t restricted to one person, either, as it can identify friends and family, plus social events you’ve attended.
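CNBC’s report doesn’t explain the mechanics, but the pizza-and-pasta inference could be as simple as tallying labels across a photo library. A toy sketch, assuming an image classifier has already tagged each photo (all labels and data here are made up):

    from collections import Counter

    # Hypothetical classifier output: one list of labels per uploaded photo
    photo_labels = [["pizza"], ["pasta", "wine"], ["pizza", "friends"], ["sushi"]]

    CUISINES = {"pizza": "Italian", "pasta": "Italian", "sushi": "Japanese"}

    counts = Counter(
        CUISINES[label]
        for labels in photo_labels
        for label in labels
        if label in CUISINES
    )
    print(counts.most_common(1)[0][0])  # prints: Italian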

Based on the report’s description alone, Project Ellman sounds very reminiscent of Memories in Google Photos, albeit on a much wider scale.

Personal chatbot

CNBC states the presentation continued with a demonstration of Ellman Chat, described as being like ChatGPT but with the ability to “answer previously impossible questions”. Judging by the examples given, the questions aren’t necessarily impossible, just tricky, especially if you’re a forgetful person. For instance, you can ask the chatbot when your brother last visited, or for suggestions on where you could move based on the pictures you upload.
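To see why such questions are tricky rather than impossible, remember that Google Photos already stores timestamps and face tags. A hypothetical sketch of the “last visit” lookup over that kind of metadata (the data structure is made up for illustration):

    from datetime import date

    # Hypothetical photo metadata: (date taken, people tagged in the shot)
    photos = [
        (date(2023, 6, 2), {"mom", "brother"}),
        (date(2023, 11, 18), {"brother"}),
        (date(2023, 12, 25), {"mom"}),
    ]

    last_visit = max(d for d, people in photos if "brother" in people)
    print(last_visit)  # prints: 2023-11-18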

Then we get to what may be one of Project Ellman’s secret purposes. By analyzing the screenshots users upload, the tech can make all sorts of predictions, from products you might buy and interests you may have to future travel plans. The presenters also pitched the idea that it could learn which websites you frequent.

Project Ellman may know you better than you know yourself.

Analysis: All about you

We don’t think we have to tell you just how creepy all this sounds. We’re talking about an AI diving deep into your files, scrounging for every bit of data it can grab. Where is all that information going? 

Gemini, Google’s new large language model (LLM), is implied to be the model that’ll power Project Ellman because it’s multimodal; in other words, it can accept multiple forms of input besides text. Generative AIs need a constant stream of content to stay up to date, and it seems Google might be pole-vaulting over privacy boundaries in search of more data to feed Gemini and keep it growing.
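For context, “multimodal” means the model accepts images (and more) alongside text in a single request. A minimal sketch using Google’s public google-generativeai Python SDK (this only demonstrates multimodal input in general; it says nothing about how Ellman itself would be built):

    import google.generativeai as genai
    import PIL.Image

    genai.configure(api_key="YOUR_API_KEY")

    # gemini-pro-vision accepts a mixed list of images and text in one request
    model = genai.GenerativeModel("gemini-pro-vision")
    photo = PIL.Image.open("dinner.jpg")
    response = model.generate_content([photo, "What kind of food is in this photo?"])
    print(response.text)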

Granted, there’s no guarantee Ellman will ever see the light of day. A Google spokesperson told CNBC this is all an “early internal exploration”, adding that if there are ever plans for a release, developers will take the time to ensure it’s helpful to people while keeping user privacy at the forefront.

We urge you to take this statement with a grain of salt. Despite its supposed best efforts, Google has a storied history of privacy issues and regularly gets into trouble for them; just look at the Wikipedia page on the topic, which is huge.

Hopefully, this is all overblown and the tech giant doesn’t launch a digital vacuum cleaner sucking up everything.

If you're looking for ways to start protecting your data, check out TechRadar's list of the best ad blockers for 2023.


5 Questions with CIO Herman Brown

Herman Brown is the Chief Information Officer at the SF District Attorney’s Office, where he also serves as CISO. He has private-sector experience to match his public-sector experience. He re…
