Google is giving Android users hands-free navigation and a way to talk with emojis

Google is rolling out several new accessibility-focused features to its platforms, including Android and ChromeOS, timed to Global Accessibility Awareness Day on May 16. Leading the long list is the arrival of Project Gameface on Android.

If you’re unfamiliar, Gameface is software that lets people use “head movement and facial gestures” to navigate a computer UI. Until now, the software was used to help people with disabilities play video games, among other things. But with its arrival on Android, those same groups now have a new way to control their smartphones.

The company states that Gameface supports 52 different facial gestures that can be mapped to specific functions. For example, looking to the left can be used to select items on the screen, while raising your eyebrows can send you back to the home screen. The individual controls depend on how people set up Gameface.

Project Gameface on Android

(Image credit: Google)

Also, it’ll be possible to adjust the sensitivity of a function to establish “how prominent your gesture has to be in order to” register an input. A slightly open mouth can be attached to one action, while a wider open mouth can trigger another. A live camera feed of yourself sits in the bottom corner; Google states its team added the view so users can make sure they’re making accurate facial gestures.
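To make the sensitivity mechanic concrete, here’s a minimal Python sketch of how per-gesture thresholds could work. The gesture names, scores, and actions below are hypothetical illustrations, not Gameface’s actual API; assume a face-tracking model reports each gesture’s prominence as a score from 0.0 to 1.0.

```python
# Hypothetical bindings: (gesture, sensitivity threshold, action).
# A higher threshold means a more pronounced version of the same gesture,
# which can be mapped to a different action.
bindings = [
    ("mouth_open", 0.2, "select"),
    ("mouth_open", 0.7, "go_home"),
    ("brow_raise", 0.5, "back"),
]

def pick_action(gesture_scores, bindings):
    """Return the action for the most demanding binding whose gesture
    score clears its threshold, or None if nothing registers."""
    triggered = [
        (threshold, action)
        for gesture, threshold, action in bindings
        if gesture_scores.get(gesture, 0.0) >= threshold
    ]
    # Prefer the highest-threshold match, so a wide open mouth wins
    # over the slight-open-mouth binding it also satisfies.
    return max(triggered)[1] if triggered else None

print(pick_action({"mouth_open": 0.3}, bindings))  # select  (slight open mouth)
print(pick_action({"mouth_open": 0.9}, bindings))  # go_home (wide open mouth)
print(pick_action({"mouth_open": 0.1}, bindings))  # None    (below sensitivity)
```

Lowering a binding’s threshold in this sketch corresponds to turning up the sensitivity slider Google describes: a subtler gesture is then enough to register the input.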

Project Gameface is open source and available for download on GitHub, complete with instructions on how to set it up. Do note that it requires the Android Studio developer tool to configure, so you may need someone to help you out.

Notable features

The rest of the features in the update may not be as individually impactful as Gameface, but together they become greater than the sum of their parts. Google’s Lookout app is receiving a new Find mode to help blind people locate real-world objects across seven different categories. It can tell where the tables are in a restaurant or where the door to the bathroom is. Users hold their smartphone in front of them, and through the rear camera, Lookout’s AI tells them the “direction and distance” of an item or exit. Keep in mind that Find mode is in beta, so it may be a little buggy.

Google Maps is seeing a similar upgrade, and it’ll soon provide more details about the area around you. The app will tell you the names of nearby places and how far you need to go to reach your destination.

Lookout app's new Find mode

(Image credit: Google)

Next, Android’s Look to Speak is adding a text-free mode. This mode lets you communicate with the app’s speech function by selecting emojis, symbols, and images. For example, a hand-waving emoji can be used to say “Hello.”

Chromebooks are set to receive their own accessibility patch, too. Google is giving owners a way to increase the size of the mouse cursor, and the screen magnifier tool will follow along with the words as you read them. 

Those are all the major updates coming to Google’s platforms; however, it’s just the tip of the iceberg. Other small upgrades include Google Maps on desktop pointing out wheelchair-accessible entrances. Everything mentioned here is already live except for the Chromebook changes, which will roll out in the coming weeks.

Google isn't the only tech giant celebrating Global Accessibility Awareness Day. Apple recently revealed multiple accessibility features for its hardware, including Eye Tracking, Vocal Shortcuts, and Vehicle Motion Cues; however, they aren't arriving until later this year. It's unknown exactly when they'll come out, but they'll most likely arrive as part of iOS 18, visionOS 2, “and the next version of macOS.”

While we have you, check out TechRadar's list of the best Android phones for 2024.

TechRadar – All the latest technology news

Read More

Best Buy is giving its customer assistance an AI boost – but with a human touch

Best Buy is taking the plunge and incorporating AI-powered shopping tools for its customers, announcing today on its website that it’s partnered with Google Cloud and the consulting firm Accenture to bring users AI-powered customer assistance. The retailer claims that this move will enable it to give customers “even more personalized, best-in-class tech support experiences.”

Customers can expect a self-service support option when they visit and shop on BestBuy.com, when using Best Buy’s app, or when they call Best Buy’s customer support line (presumably through a conventional automated selection system). When customers make use of one of these, they’ll be able to interact with Best Buy’s new AI-powered virtual assistant, which it expects to debut in late summer 2024. 

These new customer support tools are part of Best Buy’s efforts to offer customers the most tech-forward ways of getting the assistance they need. The retailer explains that it’s making use of Google Cloud’s AI capabilities, including Vertex AI (a Google Cloud machine learning platform) and Google’s new Gemini generative AI models.

Inside a Best Buy store, an everyday scene at the customer service section with people milling around

(Image credit: Shutterstock/Icatnews)

What the generative AI will help Best Buy do

The retailer explains that the virtual assistant will enable customers to troubleshoot product issues easily, manage their order deliveries and scheduling (including the ability to make changes), manage subscriptions they have from Best Buy such as software and Geek Squad, and navigate their My Best Buy memberships (Best Buy’s customer loyalty program). 

Many people, myself included, find it very frustrating when trying to interact with automated customer service tools, and thankfully it looks like Best Buy is at least somewhat aware of this. It writes: “We also know that sometimes customers prefer speaking with an actual person to get the support they need.”

It follows this up by explaining that Best Buy customer care agents will be equipped with a suite of tools aided by generative AI to assist them when they’re dealing with customers over the phone. Best Buy details that these tools are designed to help agents assess real-time conversations with customers and suggest recommendations that might be useful in the moment. The tools will also summarize conversations, collecting and using information gathered during the call to hopefully reduce the chances of individual customer service issues recurring, as well as detect the sentiment expressed by the customer.

A close up on a woman working at a computer, wearing a headset and smiling

(Image credit: Shutterstock/OPOLJA)

The wider implications of this change

There are legions of AI-powered assistance tools being developed for employees everywhere at this point, with Best Buy also discussing an assistant that makes it easier for employees to find product guides and company resources. The retailer states that its aim in developing tools like these is to be able to help customers more efficiently.

We’ve seen implementations of similar practices by other, smaller retailers, but Best Buy is one of the first companies of this scale to adopt an AI-first approach. While many companies already use automated customer service tools in some form, Best Buy is joining a limited cohort that makes such explicit use of AI-assisted customer service technologies.

I’ve had both positive and negative experiences when dealing with automated customer service, and when I’m particularly stressed out, I don’t see the addition of machine learning as much of a consolation. I am glad that employees will also see a boost behind the scenes with additional tools to help them help customers, and that it sounds like customers will still be able to speak to an actual person – I just hope it’s not too difficult to get through to a human, and that Best Buy will be open to feedback about its new strategy.

My gut reaction is that this is a bold move that could be met unenthusiastically by customers, but I appreciate that Best Buy is being forthright about it. If it works, we could see it spread to more retailers big and small, and generative-AI-aided assistance might be well on its way to becoming the industry norm. If not, hopefully, retailers will be wise enough to listen to customer sentiment and understand that there are still some jobs that you simply need a human for.

Microsoft is giving Windows Copilot an upgrade with Power Automate, promising to banish boring tasks thanks to AI

Microsoft has revealed a new plug-in for Copilot, its artificial intelligence (AI) assistant, named Power Automate that will enable users to (as the name suggests) automate repetitive and tedious tasks, such as creating and manipulating entries in Excel, handling PDFs, and file management. 

This development is part of a bigger Copilot update package that will see several new capabilities being added to the digital AI assistant.

Microsoft gives the following examples of tasks this new Copilot plug-in could automate: 

  • Write an email to my team wishing everyone a happy weekend.
  • List the top 5 highest mountains in the world in an Excel file.
  • Rename all PDF files in a folder to add the word final at the end.
  • Move all Word documents to another folder.
  • I need to split a PDF by the first page. Can you help?

Who can get the Power Automate plug-in and how

As of now, it seems like this plug-in is only available to some users with access to Windows 11 Preview Build 26058, available to Windows Insiders in the Canary and Dev Channels of the Windows Insider Program. The Windows Insider Program is a Microsoft-run community for Windows enthusiasts and professionals where users can get early access to upcoming versions of Windows, features, and more, and provide feedback to Microsoft developers to improve these before a wider rollout.

Hopefully, the Power Automate plug-in for Copilot will prove a hit with testers – and if it is, we should hopefully see it rolled out to all Windows 11 users soon.

As per the blog post announcing the Copilot update, this is the first release of the plug-in, which is part of Microsoft’s Power Platform, a comprehensive suite of tools designed to help users make their workflows more efficient and versatile – including Power Automate. To be able to use this plug-in, you’ll need to download Power Automate for Desktop from the Microsoft Store (or make sure you have the latest version of Power Automate). 

There are multiple options for using Power Automate: a free plan, suitable for personal use or smaller projects, and premium plans that offer packages with more advanced features. From what we can tell, the ability to enable the Power Automate plug-in for Copilot will be available to all users, free and premium, but Microsoft might change this.

Once you’ve made sure you have the latest version of Power Automate downloaded, you’ll also need to be signed into Copilot for Windows with a Microsoft account. Then you’ll need to add the plug-in to Copilot. To do this, go to the plug-in section in the Copilot app for Windows and turn on the Power Automate plug-in, which should now be visible. Once enabled, you should be able to ask it to perform a task like one of the above examples and see how Copilot copes for yourself.

Once you try the plug-in for yourself, if you have any thoughts about it, you can share them with Microsoft directly at [email protected]

Copilot in Windows

(Image credit: Microsoft)

Hopefully, a sign of more to come

The language Microsoft is using about the plug-in implies that it will see improvements in the future to enable it and, therefore, Copilot to carry out more tasks. Upgrades like this are steps in the right direction if they’re as effective as they sound. 

This could address one of the biggest complaints people have about Copilot since it was launched. Microsoft presented it as a Swiss Army Knife-like digital assistant with all kinds of AI capabilities, and, at least for now, it’s not anywhere near that. While we admire Microsoft’s AI ambitions, the company did make big promises, and many users are growing impatient. 

I guess we’ll just have to keep watching to see whether Copilot lives up to Microsoft’s messaging, or whether it’ll go the way of Microsoft’s other digital assistants, like Cortana and Clippy.

Microsoft is giving two Windows 11 apps nifty extra powers – and one of them is AI-related (surprise, surprise)

Microsoft is trying out some interesting new changes in testing for Windows 11, including bolstering a pair of core apps for the OS – with one of them getting supercharged by AI.

Those two apps are Notepad and Snipping Tool, with new versions rolling out to testers who are in the Dev and Canary channels.

The big one is Notepad, which is getting an infusion of AI in the form of an ‘Explain with Copilot’ option. This allows you to select any written content in Notepad and, via the right-click menu (or the Ctrl + E shortcut), summon Copilot to explain more about the selected text.

As Microsoft notes: “You can ask Copilot in Windows to help explain log files, code segments, or any selected content directly from within Notepad.”

Windows 11 Notepad Copilot Panel

(Image credit: Microsoft)

This feature should be available to all testers in those earlier Windows Insider channels in version 11.2401.25.0 of Notepad, though Microsoft observes that some folks may not see it right away (this is labeled as a ‘known issue’, so it’s seemingly a bug with the deployment).

What’s going on with Snipping Tool? Well, a previously leaked feature is now present in version 11.2401.32.0 in testing, namely the ability to annotate screenshots with shapes and arrows.

That’s pretty handy for composing screen grabs for the likes of instructional step-by-steps where you’ll be pointing out bits to the person following the guide.

Elsewhere in Windows 11 testing, the Beta channel has a new preview version, but there’s not all that much going on here. Build 22635.3140 does make a small but impactful change, though, for Copilot, moving the icon for the AI in the taskbar to the far right-hand side (into the system tray).

Microsoft observes that it makes more sense for the Copilot button to be on the right of the taskbar, given that the panel for the AI opens on the right, so it’ll be directly above the icon. It’s worth remembering that regarding the Copilot panel, Microsoft just made it larger, apparently as a result of feedback from users of the AI.


Analysis: Cowriter MIA?

Regarding that Beta channel tweak for the Copilot icon, that seems a fair enough adjustment to make. Although that said, rumor has it the next update for Windows 11 – which will be Moment 5 arriving later this month in theory – will allow for the ability to undock the AI so it isn’t anchored to the right side of the desktop. Still, that remains speculation for now, and even then there will be those folks who don’t undock Copilot, anyway.

As mentioned, the big testing move here is the new Notepad ability, and it’s no surprise to see more Windows 11 apps getting AI chops. The integration with Copilot here is on a pretty basic level, mind, compared to previous rumors about a fully-featured Cowriter assistant along the lines of the existing Cocreator in Paint. Still, it’s possible this is an initial move, and that a more in-depth Cowriter function could still turn up in the future at some point.

That said, Notepad is not supposed to be a complex app – the idea is it’s a lightweight and streamlined piece of software – so maybe further AI powers won’t be coming to the client.

Tiny11 is now even smaller, giving you Windows 11 23H2 but without the clutter

There’s a new version of Tiny11, the super-streamlined take on Windows 11, which is now even more compact – and it’s based on the latest release of Microsoft’s desktop OS. And yes, you can add Copilot into the mix if you wish (we’ll come back to that).

The new Tiny11 2311 is based on Windows 11 23H2, the recently released upgrade for the operating system, and comes with major improvements, including an installation that’s 20% smaller than the previous release of Tiny11.

The developer, NTDEV, tells us that the new version “fixes most, if not all of the nagging issues with previous releases of Tiny11.”

One of the most important fixes is that the OS now works properly with Windows Update. Previously, there were problems with implementing the monthly cumulative updates for Windows 11, but that’s no longer the case, and you can keep your Tiny11 fully up to speed with all the latest introductions from Microsoft.

As mentioned at the outset, while Copilot isn’t in the new Tiny11 by default – this project is all about streamlining, of course – if you want the AI assistant in your Tiny11 installation, you can have it.

As the dev notes: “You just have to install Edge using Winget and voila, you have Copilot on Tiny11! It’s all about choice!”


Analysis: Important advances

This is very impressive: a reduction in the footprint of an already small Windows 11 installation by 20% is no mean feat. For those interested in super-compact sizes for Microsoft’s OS, this is obviously going to be a major boon.

The Windows Update fix is even more important, though. Cumulative updates are very necessary for keeping your OS secure, of course, as without them you won’t get the latest patches for vulnerabilities, and your PC won’t be as well protected as it should be.

The choice to get Copilot on board is welcome, too, for those who may want a decluttered Windows 11, but still fancy having the AI assistant on tap. While the desktop-based Copilot isn’t very fleshed out yet – at all – and not many folks will take up this option, more choice is always good (and naturally the AI will be improved considerably going forward).

Note that there’s a version of this bloat-banishing OS for Windows 10, which as you might imagine is called Tiny10 (which can run on very low-spec old hardware, it should be noted).

Via Windows Central

Check your emails, Oculus Quest 2 owners – Meta might be giving you a free upgrade

If you own an Oculus Quest 2 VR headset, go and check your emails, as Meta might be trying to give you a free Elite Strap accessory for it.

User u/claimingmarrow7 took to Reddit to show off an email they claim to have received from Meta. In it, they’re told they’ve been sent a unique code they can redeem for a Quest 2 Elite Strap with “no strings attached” – all they have to do is take advantage of the promotion before it expires on August 4, 2023.

It’s not currently clear if this is an offer exclusive to select users like claimingmarrow7, or if all Quest 2 owners will be sent similar emails in the near future. We’ve reached out to Meta for clarification, but while we wait for a response we’d recommend looking in the inbox for your Quest account’s registered email (and the spam folder too) to see if you’ve also got a code for a free VR accessory.

The Meta Quest 2 headset next to a plastic Elite Strap

(Image credit: Meta)

What is the Elite Strap?

The Elite Strap is an optional Quest 2 upgrade that replaces the original elastic strap with a plastic one that tightens using a fit wheel. This mechanism gives the headset a much more secure fit on your head and is generally more comfortable than the regular strap.

It doesn’t come cheap, however, with the strap usually costing $59.99 / £59.99 / AU$89.99 – so getting one for free is a solid deal.

Just note that this offer appears to be for the regular Elite Strap rather than the version with a battery. The upgraded (but more expensive) Elite Strap with battery model not only provides an extra hour or two of battery life – effectively doubling your Quest 2’s usage time – but further improves the Quest 2’s comfort, as the battery serves as a counterweight to the usually front-heavy design of the headset.

If you aren’t lucky enough to get a code for a free Elite Strap from Meta and are looking to buy your own, the Elite Strap with battery option is the one we’d recommend – it’s definitely worth the higher cost for people who use their headset a lot.

Looking for a bigger upgrade to your Quest 2? Check out our picks for the best VR headset to see what other options are out there for you to try out.

Uber is giving riders the ability to view the awful truth about their star ratings

Uber is ending the mystery of your ride-sharing rating score. If three drivers gave you 2 stars, you’ll now be able to see it. You won't, however, see which drivers gave you which scores.

After more than a decade in business, whipping out a smartphone and summoning an Uber driver has become second nature for many people, as has the sometimes-awkward process of giving the driver a rating after the trip. However, Uber’s star rating system is a two-way street as drivers also rate passengers. Until now, though, riders could only see an average of their scores. In a company blog post on Wednesday, Uber announced that riders could view a breakdown of their star scores in the app.

“Now, we’re making it easier than ever to see exactly how your rating is calculated, and for the first time, we’re showing you the good (and the bad) ratings you received,” the company wrote.

Understanding how Uber drivers perceive and rate you could be either comforting (“Oh, they notice I’m always on time and in the right place”) or demoralizing (“I guess drivers don’t like me eating my foot-long heroes in their back seats”). At the very least, it might lead to a more egalitarian relationship between riders and drivers.

If you’re ready for this experience, here’s how to access your detailed ratings in the Uber app. 

Start by heading to Settings in your Uber app. There, select Privacy and then the Privacy Center. Swipe over to the section titled “Would you like to see a summary of how you use Uber?” In addition to a breakdown of ratings drivers have given you, this menu shows general statistics, such as how many trips you’ve taken, how long you’ve been a member, and more. You can also view information on individual trips you’ve taken and how you paid for them. 

The feature is available to all riders and is part of a transparency push by the ridesharing company. Though Uber keeps your data for as long as you have an active account, the ratings you see in the app only reflect an average of your last 500 trips. If you’re looking at your score and wondering how you can bump up your average rating, Uber has some tips, most of which are common sense:

  • Don’t leave trash in the car when you leave
  • Buckle up
  • Don’t make the driver wait for you
  • Don’t be a jerk
  • Don’t slam the door
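The rolling-window mechanic Uber describes – your visible score reflects only your last 500 rated trips – can be sketched in a few lines of Python. The trip counts below are hypothetical, and this is an illustration of the arithmetic, not Uber’s actual implementation.

```python
from collections import deque

WINDOW = 500  # Uber averages only the most recent 500 rated trips

def add_rating(history, stars):
    """Record a new driver rating; the deque's maxlen drops the oldest
    rating automatically once the window is full."""
    history.append(stars)

def rating_summary(history):
    """Return (breakdown by star value, average) over the rating window."""
    breakdown = {s: history.count(s) for s in range(1, 6)}
    average = round(sum(history) / len(history), 2) if history else None
    return breakdown, average

history = deque(maxlen=WINDOW)
for stars in [5] * 490 + [4] * 7 + [2] * 3:   # 500 hypothetical trips
    add_rating(history, stars)

breakdown, average = rating_summary(history)
print(breakdown)  # {1: 0, 2: 3, 3: 0, 4: 7, 5: 490}
print(average)    # 4.97

add_rating(history, 5)  # trip 501: the oldest rating falls out of the window
```

This also shows why old ratings eventually stop counting against you: once a low score is more than 500 trips in the past, it leaves the window entirely.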

Uber also outlined the cities where riders get the best and worst scores from their drivers. Seattle and Washington, DC were among the worst, but New York City ranked the lowest. At the other end of the spectrum, riders in Nashville, St. Louis, and San Antonio earned better scores.

Wherever you live, this information could help improve the shared experience for both you and your next Uber driver.
