Google Chrome gets 4 new mobile features to boost your search game

A Google Chrome update is revamping the way you search on mobile so you can find the information you’re looking for quicker than before. In total, four new features are being introduced.

Starting from the top, Chrome will now show relevant search suggestions whenever you tap the address bar on certain websites. The example given by Google is to imagine yourself “reading an article about Japan as you plan for an upcoming trip.” Upon tapping the URL of said article, a section called Related To This Page will appear below giving “suggestions for other searches” from local tourist attractions to restaurants. This feature will be available on both iOS and Android.

System exclusive

What won’t be coming to iOS (at least initially) is a list displaying all of the trending Google searches for a day. You’ll be able to see the list by tapping the address bar on a freshly opened tab. The company says this will hit Android phones first. Later this year, Chrome on iOS will get the same thing, although an exact date wasn’t given.

Third in the Chrome update is an upgrade to Touch to Search that appears to be exclusive to Android. Moving forward, whenever you highlight text on a website, a carousel of related topics will appear at the bottom of the page so you can quickly learn more about the topic at hand. There is a chance you won’t see the carousel, as Touch to Search may be deactivated. Detailed instructions on how to activate the tool can be found on the Chrome Help website.

And finally, “typing in the Chrome address bar” on the iOS app will now display 10 suggestions instead of six. The Android app has had this feature for a while now. This is just Google updating the iPhone version so it’s on par.

Potential desktop changes

The company says all four updates are currently making their way to all users so keep an eye out for the patch when it arrives. 

As for Chrome on desktop, officially there’s nothing new. However, a report from The Verge reveals the browser’s download tray is in fact seeing some changes. A ring animation will now appear displaying the progress of a download. Plus, the tray will list every file “you downloaded within the previous 24 hours” alongside options to pause, resume, retry, or cancel the download.

It’s unknown when the desktop changes will be released. As we said, Google hasn’t said a word about it. We asked the company for more information regarding the download tray upgrade as well as clarification on some of the mobile features. We wanted to know if it plans on extending the Touch to Search carousel to iOS among other things. This story will be updated at a later time.


Android’s Nearby Share boost means it’s almost a match for Apple’s AirDrop

Nearby Share on Android has received a major upgrade, giving you the ability to send entire folders to other devices.

This feature was recently discovered by industry insider and tech journalist Mishaal Rahman who shared his findings on X (or Twitter, if you prefer the older, less obtuse name). Rahman states you’re able to transfer folders from one Android phone to another as well as to Chromebooks and Windows PCs via the Files by Google app. He says that all you have to do is long-press any folder within Google Files and then select the Nearby Share icon on-screen. From there, you will see all of the connected devices which can accept the transfer. Pretty simple stuff.


There are some limitations to be aware of. Tom’s Guide states in their report, “Nearby Share has a 1,000-file limit”, so folders can’t be too big. Another piece from Android Police reveals the upgrade is exclusive to Google Files as it doesn’t seem to work properly with Samsung’s own file manager. Files will still be shared on Samsung's app, but it won’t retain the folder structure, according to Rahman.

What’s interesting is there’s a good chance you already have this feature if your device has Google Files. Rahman says that Nail Sadykov, another notable industry insider, claims “the earliest he saw someone mention it was back in May” of this year. It’s just that no one knew about it until very recently. Apparently, Google didn’t give anyone the heads-up.

So, if you have Google Files on your phone and haven’t updated it in a while, we recommend downloading the patch to get the boosted Nearby Share.

Closing the gap

Admittedly, it’s a small update, but an important one, as it allows Nearby Share to close the gap with Apple’s AirDrop a bit. Android users will save a lot of time since they won’t be forced to transfer files one by one – a convenience iPhone owners have enjoyed for many years now. It’s hard to say exactly when AirDrop first gained the ability to send folders to Macs; the oldest instance we could find was one of our how-to guides from 2015.

However, Nearby Share still has a long way to go before it can be considered a proper rival to AirDrop. For iOS 17, Apple plans on further enhancing its wireless file transfer tool by introducing new features like Contact Posters for friends plus improved security for unsolicited images.

If you’re looking for other file management options besides Google Files, be sure to check out TechRadar’s list of the best file transfer software for 2023.


Google Assistant gets AI boost – but will it make it smarter?

The AI chatbot race is far from over, despite ChatGPT’s current dominance, and Google is not showing any signs of letting up. In fact, reports suggest Google is preparing to “supercharge” Assistant, its virtual personal assistant, by integrating generative AI features similar to the ones found in OpenAI’s ChatGPT and Google’s own generative AI chatbot, Bard.

Google has begun development on a new version of Google Assistant for mobile, as stated in an internal email circulated to employees and reported by Axios. This is allegedly going to take place through a reorganization of its present Assistant team, which will see a reduction of “a small number of roles”.

The exact number of employees that are expected to be let go has not been specified, though Axios has claimed that Google has already laid off “dozens” of employees. We have contacted Google to find out more.

Google Assistant

(Image credit: Google)

The newer, shinier, and AI-connected Google Assistant

As reported by The Verge, Google is looking to capitalize on the momentum of the rapid development of large language models (LLMs) like ChatGPT to  “supercharge Assistant and make it even better,” according to Google spokesperson Jennifer Rodstrom. 

Google is placing a big bet on this Google Assistant gambit; the company is “deeply committed to Assistant” and its role in the future, according to Peeyush Ranjan, Google Assistant’s vice president, and Duke Dukellis, Google product director, in the email obtained by Axios.

This step in Google’s AI efforts follows Bard’s recent big update, which added the ability to respond to user queries by “talking” (presumably meaning that it will reply using a generated voice, much like Google Assistant does), support for visual prompts, availability in more countries, and support for over 40 languages. 

Google has not yet revealed what particular features it’s focusing on for Assistant, but there are plenty of ways it could improve its virtual assistant such as being able to respond in a more human-like manner using chatbot-like tech.

Making sure customer data remains safe and protected

Google Assistant is already in many people’s homes thanks to it being included in many devices, such as Android smartphones and Google Nest smart speakers (find out how the Google Nest currently compares here), so Google has an extensive number of users to test with. “We’re committed to giving them high quality experiences,” Rodstrom told The Verge. 

Of course, this does raise concerns about the privacy and security of its customers, as Google is likely to try and implement changes of this type to its smart home products, and some people may not be comfortable with giving the search giant even more access to their private lives. 

There is also a major concern (which, to be fair, also applies to other chatbots such as ChatGPT): the accuracy of information.

google home

(Image credit: Google)

Tackling the issue of bad information and final thoughts

Google could tackle accuracy and misinformation concerns by linking the generative AI being developed for Google Assistant devices to Google Search, as Bard is not intended to serve as an information source.

In a recent interview, Google UK executive Debbie Weinstein emphasized that users should double-check the information provided by Bard using Google Search (as reported by The Indian Express). 

If we’re talking hands-free Assistant devices, I assume that development is happening to add mechanisms of this sort. Otherwise, users would have to carry out a whole interrogation routine with their Assistant devices, which could interrupt the flow of using the device quickly and intuitively.

It’s an enticing idea – the home assistant that can fold your laundry and tell you bedtime stories – and steps like these feel like pushes in that direction. It all comes at a cost, though, and the more tech saturates our lives, the more we expose ourselves to those who wish to use it for ill-intentioned purposes. 

This is going to be a huge issue for many people – and it should be – and Google should make just as much of an effort to secure its users’ data as it does to do magic tricks with it. That said, many Google device users and Android users will be looking forward to a more intelligent Google Assistant, as many report that they don’t get much sense from it at the moment. We’ll see if Google can deliver on its proposed steps (hopefully) forward.

Hopefully, these upgrades to both Bard and Google Assistant will make them, well, more intelligent. Putting security and privacy aside (only for a brief moment), this has real potential to make users' home devices, like Nest devices, more advanced in their ability to react to your questions and requests with relevant information and tailor responses using your personal information (responsibly, we hope).


Your Oculus Quest 2 just got better hand-tracking to boost your virtual boxing skills

Meta has released the v56 update for the Meta Quest Pro and the Oculus Quest 2, which introduces a bunch of upgrades for two of the best VR headsets out there.

With the new Quest update, Meta is rolling out Hand Tracking 2.2, which it says aims to bring responsiveness more in line with what users experience with controllers. According to Meta, Hand Tracking 2.2 will reduce the latency experienced by a typical user by 40%, with a 75% latency reduction for fast hand movements. 

Meta recommends that you download the Move Fast demo app from App Lab to get a feel for what these improvements mean in practice. It looks like a simple fitness trainer in which you have to punch, chop and block incoming blocks while looking out over a lake decorated with cherry blossom trees. Meta has said we can expect more hand-tracking improvements when the Meta Quest 3 launches later this year. It's yet to be seen if these upgrades can keep up with the hand-tracking Apple is expected to launch with its Apple Vision Pro headset.

Another important improvement is coming just for Meta Quest Pro owners. One of the premium headset’s best upgrades over the Quest 2 is its display, which offers local dimming. This allows screens to achieve deeper black levels and improved contrast, something which can help a lot with immersion, as dark spaces actually look dark without becoming impossible to see. However, local dimming isn’t available in every app, so with v56 Meta is launching a local dimming experimental setting (which can be found in the Experimental menu in your headset’s Settings).

The feature is off by default, but if you turn it on you should see the benefits of local dimming in a load more titles – that is, unless a developer chooses to opt out. Just note that as with other experimental settings, you may find it isn’t quite perfect or causes some problems.

Quest 2 users aren't missing out on visual upgrades entirely though, as Meta recently announced that a Quest Super Resolution upscaling tool is coming to help developers make their games look and run better.

This month Meta is also improving the accessibility of Quest titles by introducing button mapping and live captions. Live captions will appear in your Quest headset’s Settings, under the Hearing section of the Accessibility menu. Once turned on, you’ll see live subtitles while using the Meta Quest TV app, Explore, and the in-headset Quest Store. In the same Accessibility menu, go to the Mobility section and you’ll find an option to remap your Quest controllers – you can swap any buttons you want on the handsets to create a completely custom layout.

These accessibility settings won’t revolutionize your headset overnight, but they’re a great first step. Hopefully, we’ll see Meta introduce captioning to more apps and services, and perhaps it’ll launch custom accessible controllers like the ones Sony and Microsoft offer – the PS5 Access controller and the Xbox Adaptive Controller.

New ways to stay connected 

Beyond these major upgrades, Meta is rolling out a handful of smaller improvements as part of update v56.

First, when you leave your headset charging on standby between play sessions it can smartly wake up and install updates whenever it detects that your installed software is out of date. This should help to reduce instances of you going to play a game only to find that you need to wait for ages while your headset installs a patch.

Second is the new Chats and Parties feature. Whenever you start a call in VR, a chat thread is also created with all of the call’s members, so you can keep in contact later; you can also now start a call from a chat thread (whether it’s a one-on-one chat or a group chat).

Third, and finally, Meta is making it easier to stream your VR gameplay to Facebook, and while you play you’ll be able to see a live chat, so you can keep in contact with your viewers. While the platform isn’t many people’s first choice, it hopefully opens the door for easier real-time live streaming to more popular platforms like YouTube and Twitch.


Google Bard just got a super-useful Google Lens boost – here’s how to use it

Google Bard is getting update after update as of late, with the newest one being the incorporation of Google Lens – which will allow users to upload images alongside prompts to give Bard additional context.

Google seems to be making quite a point of expanding Bard’s capabilities and giving the chatbot a serious push into the artificial intelligence arena, either by integrating it into other Google products and services or simply improving the standalone chatbot itself.

This latest integration brings Google Lens into the picture, allowing you to upload images to Bard so it can identify objects and scenes, provide image descriptions, and search the web for pictures of what you might be looking for.


Screenshot of Bard

(Image credit: Future)

Asking Google Bard to show me a kitten

(Image credit: Future)

For example, I asked Bard to show me a photo of a kitten using a scratching post, and it pulled up a photo (accurately cited!) of exactly what I asked for, with a little bit of extra information on why and how cats use scratching posts. I also showed Bard a photo from my phone gallery, and it accurately described the scene and some tidbits of interesting information about rainbows.

Depending on what you ask Bard to do with the image provided, Bard can provide a variety of helpful responses. Since the AI-powered chatbot is mostly a conversational tool, adding as much context as you possibly can will consistently get you the best results, and you can refine its responses with additional prompts as needed. 

If you want to give Bard's new capabilities a try, just head over to the chatbot, click the little icon on the left side of the text box where you would normally type out your prompt, and add any photo you desire to your conversation. 

Alongside the image update, you can now pin conversation threads, get Bard to read responses out loud in over 40 languages, and get access to easier sharing methods. You can check out the Bard update page for a more detailed explanation of all the new additions.


The latest Oculus Quest 2 update comes with a serious performance boost

The latest software update for the Oculus Quest 2 and Meta Quest Pro is here, and it’s bringing some serious performance upgrades to both of Meta’s VR headsets.

Meta teased this update following the Meta Quest 3 announcement where a press release revealed that the Quest Pro and Quest 2 would see their CPU speed rise by up to 26% each. What’s more, the Quest 2’s GPU will, according to Meta, improve by up to 19%, while the Quest Pro’s GPU will improve by 11%.

These hardware upgrades are achievable via a software update because Meta’s new update allows the CPU and GPU in each headset to run at a higher clock speed. Previously, both headsets ran underclocked systems – read: maximum performance was being held back – in order to prevent the headsets from getting too hot and causing discomfort for the player. Clearly, Meta decided that it was a bit too conservative with its underclocked approach, so now it's releasing a bit more power.

On top of its faster processing, Meta has announced that the Quest Pro is getting a boost to its eye-tracking accuracy. While the update post doesn’t go into much detail, we can’t help but feel like this is Meta’s first step to helping the Quest Pro catch up with Apple’s newly unveiled Vision Pro headset – which threatens to usurp Meta’s spot at the top of the best VR headsets list.

The Apple Vision Pro headset on a stand at the Apple headquarters

What will Meta learn from the Apple Vision Pro? (Image credit: Future)

One innovation Apple’s headset has is that it uses eye-tracking to make hand-tracking navigation more accurate. Rather than awkwardly pointing at an app you just have to look at it and then pinch your fingers.

The Quest Pro’s improved eye-tracking accuracy could allow Meta to implement a similar system to the Apple Vision Pro – and help make its eye-tracking technology more useful.

More minor changes

Beyond these performance boosts, the Meta Quest v55 update brings a few minor software improvements.

Now when using Direct Touch hand tracking, you’ll be able to swipe notifications away or tap on them like buttons, just as you can with other menu items. If this doesn’t make interacting with your headset feel enough like using a smartphone, Meta has also said that the full Messenger app will now launch on the Quest platform – allowing you to call and message any of your contacts through the app, not just the people who use VR.

Two new virtual environments will be made available too. The Futurescape – which was featured in the 2023 Meta Quest Gaming Showcase – combines futurism with nature, while the Great Sand Sea is a vast desert world that’s an exclusive space for people who have preordered Asgard’s Wrath 2. To change your current environment to either of these options, you’ll need to go into your Quest headset’s Settings and find the Personalization menu. You should see the option to change your environment to either one of these new spaces or the previously released virtual homes. 

Check out our interview with one of the developers to find out how Asgard's Wrath 2 will bring out the best of the Oculus Quest 2.


Our favorite free video editing software gets unexpected performance boost from new macOS Sonoma

One of the big announcements at Apple’s WWDC 2023 was macOS Sonoma (we looked it up; it means “Valley of the Moon”). 

Apple claims the new operating system has a sharp focus on productivity and creativity. It says “the Mac experience is better than ever.” To prove it, the company revealed screensavers, iPhone widgets running on Macs, a gaming mode, and fresh video conferencing features. 

But the new macOS has another surprising feature for users of our pick for best free video editing software.  

The final cut 

Beyond WWDC’s bombshell reveal – yes, Snoopy is an Apple fan now – the event served up more than enough meat to keep users happy. There’s a new 15-inch MacBook Air on the way, said to be the “world’s thinnest.” The watchOS 10 beta countdown has started. And the Vision Pro is dividing opinion. Is the VR headset the future or will it lose you friends?

The reveal of the new Mac operating system, meanwhile, feels quieter somehow. Muted. Perhaps new PDF editor functionalities and a host of “significant” updates to the Safari browser aren’t as eye-catching as a pair of futuristic AR/VR ski goggles.  

However, Craig Federighi, Apple’s senior vice president of Software Engineering, said, “macOS is the heart of the Mac, and with Sonoma, we’re making it even more delightful and productive to use.” 

What he didn’t say, but the company later revealed, is that Sonoma adds an extra bonus for video editors. 

Designed for remote and hybrid in-studio workflows, the operating system brings a high-performance mode to the Screen Sharing app. Taking advantage of the media engine in Apple silicon, users are promised responsive remote access with low-latency audio, high frame rates, and support for up to two virtual displays. 

According to Apple, “This mode empowers pros to securely access their content creation workflows from anywhere – whether editing in Final Cut Pro or DaVinci Resolve, or animating complex 3D assets in Maya.” It also enables remote colour workflows that previously demanded the best video editing Macs and video editing PCs. 

It seems Final Cut Pro is getting a lot of attention lately. May saw the launch of Final Cut Pro for iPad – how did it take so long? – and now the app is getting better support in the operating system. What next? Perhaps that open letter from film & TV professionals pleading for improved support really did focus minds at Apple Park.  


Windows 11 update has system-wide live captions to help boost its accessibility aims

Despite Windows 11 being sequestered behind hardware requirements such as TPM, Microsoft is doing its best to make its latest OS as accessible as possible for the deaf and hard of hearing communities, with all-new system-wide live captions. 

Available today (April 5) after Microsoft's event, the brand new live captions feature allows users who may be deaf or hard of hearing – or those who just like subtitles – to easily access captions across all audio experiences and apps in Windows. 

Live Captions will also work on web-based audio, allowing users to view auto-generated captions on websites and streaming services that might not otherwise support captions, or whose captions aren't the best. 

Unfortunately, it is currently unclear if Microsoft will be bringing the live captions feature to Windows 10, in order to let as many users as possible utilize this useful accessibility feature. 


Analysis: an accessibility win that is not accessible for everyone

There is no denying that more accessibility options are a good thing, whether you use them yourself or not. However, Microsoft deserves as much criticism as praise for this new feature because, for now, it’s keeping it exclusive to Windows 11. 

With Windows 11’s growth recently being shown to have dramatically stalled in March, it makes sense that Microsoft’s latest OS may need some more killer features to tempt users into upgrading from Windows 10. However, holding accessibility features to ransom certainly is not the way to do it. 

While holding this feature to ransom would be bad enough if upgrading were a simple one-click process, Windows 11 does not make things that easy, as it infamously requires TPM 2.0, a feature that many computers manufactured before 2017 do not have.

Mercifully, captioning services are becoming more and more common across web pages and streaming services – you can even listen to all of our articles, for instance. However, these services all have their potential problems and require individual setup, so it's far from a perfect solution. 

With Microsoft having only just announced this new feature for Windows 11 during its hybrid work event, we can only hope that it is not too long before the tech giant sees sense and brings this feature to older versions of Windows to benefit all users, rather than just those on Windows 11.


Windows 11 is getting new Focus tools to boost productivity

Focus Assist has been a tentpole feature in Windows 11, and thanks to a new update announced at Microsoft's event, it's being improved further from today (April 5) with the introduction of an integrated focus timer.

First introduced in Windows 10, Microsoft’s Focus Assist tools have been a useful ally in the war against notifications vying to steal your attention away from work, games, and media.

Working similarly to your phone's alert slider and settings, Focus Assist allows you to filter out some or all of the notifications and alerts that could pop up and steal your attention away from whatever you are meant to be doing, so long as you remember to turn it on in the first place.

Increasing your Focus further

Thankfully that should not be as much of a problem anymore, as Windows 11 will soon be getting an integrated focus timer that is able to slot dedicated blocks of ‘focus time’ into your schedule based on your calendar and working habits for the week.

Previously this useful Focus tool had only been available to customers of Microsoft’s paid Viva Insights program, which helps users build better working habits, so it is great to see the feature roll out to all Windows 11 users who want to be more productive.

Until now, users looking to maximize their productivity and get the most out of Focus Assist have had to rely on setting up automatic timers or conditions upon which Focus Assist would activate, such as when a second monitor is connected or when you are playing a game.
