Get ready to learn what the future of Windows 11 looks like at Microsoft’s March 21 event

We’ve begun getting hints of what Microsoft is gearing up to announce for Windows 11 at its March event, and now we’ve got new pieces of the puzzle. We’re expecting information about a new NPU-powered feature for the Paint app, and about a feature that’s being referred to internally at Microsoft as ‘AI Explorer’.

Microsoft has put up an official page announcing a special digital event named “New Era of Work” which will take place on March 21, starting at 9 PM PDT. On this page, users are met with the tagline “Advancing the new era of work with Copilot” and a description of the event that encourages users to “Tune in here for the latest in scaling AI in your environment with Copilot, Windows, and Surface.”

It sounds like we’re going to get an idea of what the next iteration of Windows Copilot, Microsoft’s new flagship digital AI assistant, will look like and what it’ll be able to do. It also looks like we might see Microsoft’s vision for what AI integration and features will look like for future versions of Windows and Surface products. 

A screenshot of the page announcing Microsoft's digital event.

(Image credit: Microsoft)

What we already know and expect

While we’ll have to wait until the event to see exactly what Microsoft wants to tell us about, speculation from Windows Latest suggests that one feature we’ll learn about is a Paint app tool powered by the NPUs (Neural Processing Units) found in new-generation machines. NPUs are dedicated processing components that accelerate certain kinds of workloads, particularly AI tasks.

This follows earlier reports indicating that the Paint app was getting an NPU-driven feature, possibly new image editing and rendering tools that make use of PCs’ NPUs. Another possible feature that Windows Latest spotted was “LiveCanvas,” which may let users draw real-time sketches aided by AI. 

Earlier this week, we also reported on a new ‘AI Explorer’ feature, apparently in testing at Microsoft. This revamped version, which has been described as an “advanced Copilot,” looks like it could be similar to the Windows Timeline feature, but improved by AI. The present version of Windows Copilot requires an internet connection, but rumors suggest that this could change. 

This is what we currently understand about how the feature will work: it will record previous actions users perform, transform them into ‘searchable moments,’ and allow users to search these records, as well as retract them. Windows Latest also reinforces the news that most existing PCs running Windows 11 won’t be able to use AI Explorer, as it’s designed to use the newest available NPUs, which are intended to handle and assist higher-level computation tasks. The NPU would enable AI Explorer to work natively on Windows 11 devices, and users will be able to interact with it using natural language.

Using natural language means that users can ask AI Explorer to carry out tasks simply and easily, letting them access past conversations, files, and folders with simple commands, and they’ll be able to do this across most Windows features and apps. AI Explorer will be able to search user history and find information relevant to whatever topic is in the user’s request. We don’t know whether it’ll pull this information exclusively from user data or from other sources like the internet as well, and we hope this will be clarified on March 21. 

Person working on laptop in kitchen

(Image credit: Getty Images)

What else we might see and what this might mean

In addition to an NPU-powered Paint app feature and AI Explorer, it looks like we can expect other AI-powered features to debut, including an Automatic Super Resolution feature. This has popped up in Windows 11 24H2 preview builds, and it’s said to leverage PCs’ AI capabilities to improve users’ visual experience. This will reportedly be done using DirectML, an API that also makes use of PCs’ NPUs, and should bring frame-rate improvements in games and apps.

March 21 is shaping up to bring an exciting presentation, although it’s worth remembering that all of these new features will require an NPU. Only the newest Windows devices come equipped with one, which will leave the overwhelming majority of Windows devices and users in the dust. My guess is that Microsoft is banking on the appeal of these new AI-driven features to convince users to upgrade to new models, and judging by the current state of apps and services like Windows Copilot, that appeal is still yet to be proven in practice.

TechRadar – All the latest technology news

Read More

Google Search can help people learn English with new language tutor tool

Duolingo may have a new rival on its hands, as Google Search on Android will soon begin helping people in certain countries practice and improve their spoken English.

Over the next few days or so, the company will be rolling out an “interactive speaking” tool to users in Argentina, Colombia, India, Indonesia, Mexico, and Venezuela. It provides practice sessions in which students are asked questions and must respond verbally “using a provided vocabulary word” in their answer. For example, Google Search might ask “what do you do for fun?” with the vocab word being “Play”. Students can respond by saying “I play video games in my free time” or “I like to play sports with friends”. Above the question, a little animation of a cartoon character will interact with the student.

Each session lasts about three to five minutes, after which the tool delivers “personalized feedback” as well as the “option to sign up for daily reminders” to continue lessons so you don't fall behind.

English tutor on Google Search

(Image credit: Future)

According to the announcement post, the feature can be accessed through a small window under Google Translate on the search engine; tapping it activates the lesson. Once done, you’ll be taken to a “Speak” section where you can see a calendar of how many times per week you’ve practiced, the total number of words practiced, and the classes you’re a part of. 

You can try out multiple courses at once. Plus you can pause them whenever you want if you’re short on time. Google states that since this will be on Android phones, people can learn “at their own pace, anytime, anywhere.” 

Focusing on context

Something we found particularly interesting is the type of feedback students will receive because it focuses heavily on context. 

There’s semantic feedback, telling users whether their “response was relevant to the question” at hand and whether it could be understood by the other person (or in this case, the AI). It’ll also teach you ways to improve your grammar by pointing out missing words. Below the feedback is a series of sample answers “at varying levels of language complexity”, meant to show students alternate ways of responding to a question. You don’t always have to say the same thing – that’s the idea Google wants to get across.

Google's personalized feedback

(Image credit: Future)

Additionally, the search engine will provide “contextual translation” if someone is having a tough time understanding a phrase. You can tap on any word in a sentence to see what it means in a particular context.

Future expansion

We highly recommend reading through the post on the company’s Research blog, as it explains the technology behind this feature – it's rather interesting. The feature is powered by several machine learning models, including LaMDA, the same AI behind Google Bard.

Google does have plans to expand its language tutor to “more countries and languages in the future,” although there’s no word on exactly when that expansion will arrive. We reached out asking for more details on its future availability, and we also wanted to know whether the tutor will ever arrive on desktop or iOS – at this time, it remains exclusive to Android. We will update this story when we learn more.

While we have you, be sure to check out TechRadar's list of the best language learning apps for 2023.

Google Classroom is using AI to help children learn in a whole new way

Google has announced a new feature for its online learning platform that will provide students with a more personal learning experience through interactive lessons and real-time feedback.

With practice sets in Google Classroom, educators will be able to transform their teaching content into interactive assignments, while an autograding tool will help them save time so that they can focus on the needs of their students instead of being bogged down with paperwork. At the same time, practice sets can help teachers figure out which concepts require more instruction time and determine which students need extra support.

As students complete practice sets, they get real-time feedback so that they can know whether or not they're on the right track. For instance, if a student is struggling to solve a problem, they can get hints through both visual explainers and videos. Then when they get the answer correct, practice sets use fun animations and confetti to celebrate their success.

According to a new blog post, Google is currently in the process of testing out practice sets with some schools ahead of the feature's beta launch in the coming months. Once practice sets become available in Google Classroom, any educator with the Teaching and Learning Upgrade or educational institution using Google Workspace for Education Plus will be able to test them out.

Adaptive learning technology

The concept of adaptive learning has been around for decades and refers to a type of learning where students receive customized resources and activities to address their unique learning needs.

Now though, thanks to recent AI advances in language models and video understanding, Google is working to incorporate adaptive learning technology into Google Classroom through practice sets. Adaptive learning technology also saves teachers time and provides data to help them understand the learning processes and patterns of their students. 

In a separate blog post, Google explained that a teacher testing out practice sets likened the new feature to having a teaching assistant in the classroom at all times. This is because the technology provides students with one-on-one attention and validation so that they know right away whether or not they got a problem correct. Practice sets also helped drive both student motivation and engagement.

Now that Google is adding AI capabilities to Google Classroom, expect the search giant to add even more automation to its online learning platform going forward.
