The Apple Vision Pro comes with a Guest Mode dilemma – you can share the love but can’t keep the settings

Apple’s newly launched Vision Pro includes a Guest User mode that lets owners share their shiny new device with family and friends without giving them access to personal information and data. It appears to be frustratingly limited, though: if you’re hoping for something like the guest modes we’ve become accustomed to on other devices, you’ll need to think again.

While friends and family will be able to experience the magic of the Vision Pro on a user’s device, according to 9to5Mac the device won’t store any of their settings. That will no doubt disappoint anyone who bought the headset hoping to share it with a group, such as the rest of their family. Apple also says Guest Mode will let you “share specific apps and experiences with family and friends,” which suggests sharing may not extend to all apps.

So, guests get only limited settings and app capabilities, none of their settings are stored between sessions, and the Vision Pro won’t even save guest calibration data. Anyone who wants to use someone else’s Vision Pro will have to go through the process of calibrating eye tracking, scanning their hands, and pairing ZEISS Optical Inserts every single time.

An Apple Store staff member shows a customer how to use a Vision Pro headset.

(Image credit: Apple)

Possible concerns ahead for the Vision Pro

This isn’t down to a technical limitation, either – Apple chose to make it this way. If a friend or family member just wants to give the headset a quick spin, that’s not so bad. But with a $3,500 price tag, some people probably bought it hoping to share it with the people they live with.

This Guest Mode makes that tough, and is likely to put users and guests off sharing the device regularly. As far as we know, that’s how things stand for now – you get one main user account plus the built-in Guest Mode, with no option to create separate accounts (guest or otherwise).

While not totally unheard of for Apple, I can imagine this being disappointing news for some Vision Pro owners. The iPad, for example, has no guest-specific sharing features, but that doesn’t really hinder passing an iPad around, so a guest mode wouldn’t add as much there. The Vision Pro is a different story: to use it at all, you have to calibrate it to your face and eyes.

The Vision Pro arrives in US stores on February 2, and reviewers have already started posting their first impressions of the device. I can see this becoming a real drawback that users get vocal about – but would that convince Apple to change the guest mode? Because this is a bold first-gen launch for Apple, users are willing to give it a chance and let its vision develop. Hopefully Apple doesn’t burn through that goodwill.

TechRadar – All the latest technology news

Microsoft Edge on Android could soon get extensions to tempt you away from Chrome

Will browser extension support be enough to tempt Android users away from Google Chrome to the welcoming arms of Microsoft Edge? We might soon find out, as it looks like Edge is now prepping extension support for its mobile app.

This comes from some digging into the Edge for Android code by tipster @Leopeva64 (via 9to5Google). For now the functionality is hidden behind a flag in the early testing versions of the app, but it could reach the main app as early as March.

Certain extensions – for switching to dark mode, for blocking ads, and for changing the speed of media playback – are already showing up on a rudimentary extensions page, which is another sign that the feature is launching soon.

From the screenshots that have been posted so far, it looks as though a new Extensions button will be added to the menu that pops up when you tap the three horizontal lines, down in the lower-right corner of the Edge for Android interface.


Extended features

Firefox extensions

Firefox recently added extension support to its Android app (Image credit: Future)

You may well know how useful third-party extensions can be on a desktop browser, adding all kinds of additional tools and features to your browser of choice – from changing the way tabs are arranged, to letting you annotate webpages, to managing website volume.

While there are a huge number of extensions available for Chrome on the desktop, Chrome and other browsers have typically shied away from adding extension support on mobile, for a host of different reasons: the screens are smaller, there are fewer system resources available, the interface is simpler, and so on.

Now though, the situation is changing. Firefox recently reintroduced extension support in its Android app, and now it looks as though Edge will follow suit – in an attempt to try and chip away at Chrome's market share. Chrome is the default on around two-thirds of mobile devices worldwide, though that includes iPhones as well as Android devices.

You won't be able to use all the existing Edge extensions on Android – clearly not all of them will work, and the developers will have to adapt them for the different platform – but watch this space for these add-ons arriving on Microsoft's browser.


YouTube TV could soon get some big upgrades for sports fans

Improvements are coming to YouTube TV to ensure sports fans can watch multiple games with little to no interruption.

The annoying thing about watching sports online is that there can sometimes be broadcast delays, resulting in laggy streams. Back in December 2023, YouTube introduced a way to temporarily reduce latency for up to 48 hours at a time, ensuring interference or fluctuating internet speeds don’t cause streams to freeze. Now, according to CordCuttersNews, the latency reduction option can be enabled permanently.

Screenshots in a 9to5Google report show that the text mentioning the 48-hour time limit is no longer present on the Decreased Delay setting. What’s more, it applies to all channels on the service. The publication states that enabling the tool only takes effect after closing and reopening the app on Android TV. Interestingly, Decreased Delay is still labeled as an experimental feature, so there could be some performance issues, and it’s possible YouTube will patch it at a later time. Nothing’s confirmed right now.

Activating Decreased Delay is simple. On the YouTube TV app, select the three-dot menu then go to Broadcast Delay. The “Decrease” and “Default” options will be underneath that setting. The official YouTube TV Help page explains the former is best for minimizing playback interruptions while the latter is more for reducing “live spoilers.” 

Build your own stream

The second improvement is an update for Multiview. This feature was first released back in March 2023, giving users a way to stream up to four sports games at the same time. Back then, people were forced to pick from preset options. However, thanks to the new Build a Multiview tool, you can choose the four games you want to watch. 

Build a Multiview was initially discovered by a Reddit user who stumbled across the option one day on YouTube TV. They claim they were able to pick out a group from all of the games that were on at the time; not just from a specific sport. 

There is a catch: Build a Multiview is only seeing a limited release. Google told CordCuttersNews they’re currently testing the feature, so only a select few have access. But there are plans for a wider release. It’ll be available on “all devices that support multiview.” A full list of these devices can be found on the YouTube Help website. They include video game consoles, recent smart TVs, and streaming dongles like the third-generation Fire TV Stick.

Super Bowl 2024 kicks off on Sunday, February 11, and these updates could not have come at a better time. If you’re looking for a new TV to watch the big game, check out TechRadar’s list of the best smart TVs for 2024.


Two days with Vision Pro: Apple’s almost convinced me to part with $3,500 by transforming everything I do

Whatever you've heard or read about Apple's new Apple Vision Pro mixed reality headset, nothing quite prepares you for seeing it in person, putting it on, and experiencing for the first time Apple's vision for spatial computing. You realize quite quickly that this is more than a marketing term, it's a new approach to the digital experience. 

I'm still getting a feel for the glass, aluminum, and fabric system, but I thought I'd start by sharing my first hours with the $3,499 (to start), US-only mixed reality headset. It was mostly smooth sailing, with one early, albeit tiny, bump in the road.

Apple Vision Pro box

Apple Vision Pro box (Image credit: Future)

A package arrives

January 30, 4:30 PM ET

The box arrives! It's large because Apple sent me both the 1TB Apple Vision Pro ($3,899) and a carrying case ($199). Inside is a tall white box that reminds me of oversized iPhone packaging. It is different, but also oddly familiar – at least on the outside.

The carrying case looks like it might be more at home on the moon. A covering I initially took for packaging is the case's Apollo-mission space-suit-like material. I quickly put the case aside so I could get to the business of unboxing the fruits of Apple's first new product category in almost a decade.

While it's not remotely cramped, there is a lot in the Vision Pro box. First is the spatial computer itself, nestled comfortably inside with its Solo Knit Band already attached. Every accessory is wrapped in Apple-y cardboard. There's the Dual Loop Band, which can replace the Solo Knit Band and potentially offer more support for the 1.3lb headset. The bands are easy to swap, but I'm determined to try wearing the Vision Pro with the default gear (though in most of my previous brief demos I preferred the Dual Loop, and I wish Apple had created a hybrid that combines the Solo Knit with a top loop band).

There's an extra Light Seal Cushion. They come in a few sizes, but I have to use the thicker one because I'll be wearing the Vision Pro with my optional custom Zeiss lens inserts (an extra $149).

There's a cover to protect the Vision Pro's lustrous glass front, and a cleaning cloth to wipe away the smudges that instantly appear when you pick it up.

There's the battery, which is attached to a cable that runs to a proprietary power port on the Vision Pro. While some might think it odd that Apple didn't simply go with a USB-C charge port, I think that would stick out too far from the headset and look more awkward than the battery-power solution Apple cooked up.

There's also a USB-C cable and power adapter to charge the battery. 

What comes in the Apple Vision Pro box

What comes in the box. (Image credit: Future)

Unboxing Vision Pro

5:00 PM ET

I unbox the Vision Pro during a TikTok live stream. While doing so, I realize that Apple still has my Zeiss lens inserts. Without them, the visuals in the headset will be blurry. I decide to plug in the battery to charge it up while I wait for the Zeiss lenses to arrive.

In the meantime, I examine the Vision Pro and practice swapping the Solo Knit for the Dual Loop Band. It's an easy process because, like almost everything else on the Vision Pro, the bands are held in place mostly by magnets or magnetized posts. Things easily pop off. I notice that if I pick up the wrong part of the Vision Pro, the whole light seal pops off. Again, it's super easy to put back on.

I pop one light seal foam off and put the thinner one on to see how it looks and feels. The difference between the two is barely perceptible.

6:00 PM ET

Time to take some photos of the Vision Pro

Apple Vision Pro Review – gallery of 11 images. (Image credit: Future)

7:15 PM ET

My custom Zeiss lenses arrive. Now the fun begins. To get started, I connect the power cable to the side of the Vision Pro. It's a push-and-turn operation, similar to how you might mount a lens on a DSLR. It's easy (very little with the Vision Pro isn't easy). Next, I insert my lenses, which are clearly marked left and right and, like everything else, snap in with strong magnets. These lenses are not going anywhere.

Apple Vision Pro Review – gallery of 2 images. (Image credit: Future)

Setup is familiar

Vision Pro starts by teaching you how to use it (there's also a nice booklet-sized color manual to help you get started). It explains the eye tracking and subtle gestures you use to control the device. I think Apple did a good job here.

There are a few steps to go through to get set up: making sure the pupillary distance is right (just a press of the Digital Crown), scanning my Zeiss lens code, scanning a code with my iPhone to pair the headset and set it up with my Apple ID details, scanning the fronts and backs of my hands, and staring at a circle of dots (three sets) while pinching my thumb and index finger, which calibrates the system.

The headset also asks if I want to set up Optic ID, which registers my iris for some security and commerce functions, but, though I try multiple times, I can't get it to work.

I start with the Solo Knit Band, which means the headset is fairly tight on my face. However, the back of the band is, at least initially, more comfortable than the Dual Loop.

As with any VR or mixed reality headset, there are prominent safety reminders, including: Stay Aware of Your Surroundings, Use in Safe Areas, and Take Frequent Breaks.

It's during the setup that I learn that Vision Pro is not intended for kids, or at least anyone under 13.

Meet my Persona

My Vision Pro Persona

My Vision Pro Persona (Image credit: Future)

You can't get around creating a Persona, which is a digital representation of you that will be used in things like FaceTime and Zoom calls, so you don't have to appear on camera wearing the headset and looking ridiculous (I did this once or twice).

Vision Pro guides me to take off the headset, and then use the system's 3D cameras to capture my face (left side, right side, top, bottom), as well as a couple of expressions. It takes less than a minute for Vision Pro to build my Persona (the system is still in beta, by the way).

I decide to slide the battery pack into my front pocket.

With questions about transferring existing data, keeping the device up to date, sharing audio recordings with Apple, and Apple Pay and Card setup, this is a lot like setting up an iPhone. You go through virtually all the same steps.

I make a FaceTime call to my wife in the other room. Her reaction to my digital Persona is not exactly enthusiastic. She calls it disturbing. My son says it reminds him of one of those AI avatars in sci-fi movies that can only answer questions they've been pre-programmed to answer (see I, Robot for reference). I ask my wife to grab some screenshots and send them to me (see above).

I think it did a decent job, though Apple appears to have shaved my goatee and fixed my teeth, the latter of which I do not mind.

7:35 PM ET

The visuals are still pretty astounding. The home screen floats in my home office with icons sharp enough to touch (I like how some interface elements look like frosted glass – such an Apple thing to do). I use Siri to open Safari. The expert integration of Siri throughout the system is a nice revelation. Imagine if it had worked this well when Apple launched it on the iPhone 4s.

7:50 PM ET

I had to take a break because the headset was hurting my forehead.

The right fit and an endless desktop

Apple Vision Pro home screen

The home screen that you reach by pressing the Digital Crown. (Image credit: Future)

8:10 PM ET

Switched to the Dual Loop Band. Now that I've got the adjustment right, I think it's more comfortable.

I want to play Wordle, as I do every night, but to do so, I must use Vision Pro's Safari instead of the Chrome browser I usually use on my Mac. This means I have to sign into my NY Times account again, which gives me a nice opportunity to use the virtual keyboard. It lets you type on an AR keyboard in the air using your fingers. It's pretty cool, though without tactile feedback, typos proliferate.

My two-factor authentication uses my iPhone, which I naturally cannot unlock with FaceID but, fortunately, my PIN works fine. I never have to take off the headset to see my phone or anything else, for that matter. The passthrough is good enough that I can always see whatever I need to see.

Apple Vision Pro Review

Apple Vision Pro with the Dual Loop Band. (Image credit: Future)

I've been typing on my MacBook Pro M3 and get ready to expand my desktop into augmented reality. Using the control panel, I access the Mac Virtual Display. Vision Pro immediately finds my MacBook and, once I select it, the Mac's screen goes dark and a giant virtual MacBook desktop appears floating in front of me. No more looking down at a laptop screen! Of course, I still have to occasionally look at my hands to type. Later, when I switch back to my real desktop, it feels incredibly cramped.

I'm a bit torn about the control panel system. You access it by looking up at a tiny green arrow near the top of your viewport. The Control Center, which is one level down, looks like the one that you'd find on the iPhone but with some Vision Pro-specific touches. I just feel like that little arrow is one of the rare, non-obvious interface bits in the Vision Pro system.

Adding Mac Virtual Desktop to my Vision Pro interface

Adding Mac Virtual Desktop to my Vision Pro interface. (Image credit: Future)

Immersive landscapes and the real feel

8:30 PM ET

Have not solved Wordle, which is not designed for this interface, but the gaze-and-pinch system of letter selection works well enough. Itching to have some more immersive fun.

I try the moon environment, which virtually puts you on the surface of the moon. I spin the Digital Crown to make the environment fully immersive and then realize that by doing so, I can no longer see my keyboard – just my hands floating above the dusty, gray surface of the moon.

I take a break from typing and get ready to sample the 3D version of Avatar: Way of Water….Oh, wait, I have to pay for that. Never mind.

I choose Prehistoric Planet: Immersive, which is just wild. The visuals here are stunning. This is what I imagined when I first started thinking about virtual reality. Having a realistic dinosaur just centimeters from your face changes you.

Vision Pro control panel

Vision Pro control panel. (Image credit: Future)

Perfect for panoramas and meeting EyeSight

8:40 PM ET

I switch back to Wordle to give it another shot. I'm enjoying moving things around my endless virtual desktop. 

I do some screen recording, which captures the view inside the Vision Pro, and then switch to checking out my own panoramic photos. There is simply no better platform for viewing these photos than the Vision Pro. I have almost 150 panoramic images in my library, and I can finally see them in all their vivid detail and beauty. In a photo of a lovely rainbow cresting over my neighborhood, I spot colors I previously missed.

The spatial videography that I captured on my iPhone 15 Pro Max looks great.

I leave my home office and walk into the living room. It's easy enough to use the Digital Crown to dial back the immersion so I can see where I'm going. I sit down on the couch next to my wife and, as I start to talk to her, she appears slowly, breaking through the immersive landscape as if coming through a fog. On her side, she can see my “eyes” on the Vision Pro's front display. I could almost hear the air quotes in her voice. She did not love the look of the Vision Pro's EyeSight, which creates a simulacrum of my eyes and their movements based on what the internal cameras can see.

Vision Pro EyeSight in action

The view of my Vision Pro EyeSight in action. (Image credit: Future)

The home movie house

9:05 PM ET

I discover that I can use my MacBook mouse across all the apps floating in my virtual desktop; it doesn't matter if they're native to macOS or visionOS.

While the Vision Pro works with virtually all iOS and iPadOS apps, I wanted to see what the platform could do with apps that were built for it. There are, at the moment, about 20 such apps. I install a half-dozen free ones.

I load up Disney Plus and am even able to copy and paste a password from the MacBook Pro into the Vision Pro Disney Plus app. I love how smoothly the different platforms work together.

It takes a beat to download an environment like the Avengers Tower.

9:30 PM ET

The degree to which I enjoy watching 3D movies with the Vision Pro surprises me. Watching Doctor Strange in the Multiverse of Madness in the darkened Avengers Tower environment takes me back to being in a real movie theater. Even though the headset has some heft, I'm noticing it less and less. I'm sure I can handle a two-hour movie in this thing. Where is my popcorn?

As I type this, I realize that my pocket is warm. The battery does generate some heat while in use. Also, I see I'm down to 37% power. Doubtful I'll make it through this whole movie.

Battery life

9:45 PM ET

Down to 20% battery life. Movies seem to drain the battery fast.

Found a game called Loona. There's an adorable blue character. When I look at her (it?) and pinch my fingers she hiccups and giggles. It's intoxicating. Loona turns out to be a calming puzzle game that I manipulate by pinching and dragging pieces into place.

I switch back to the movie. What a wonderful experience.

10:05 PM ET

Vision Pro ran out of power. The battery is warm. Time to recharge and catch some shuteye.

Apple Vision Pro carrying case – gallery of 3 images. (Image credit: Future)

January 31, 7AM ET

My goal is to work, play, and learn about the headset all day long. Instead of running solely off battery power, I'm keeping the battery plugged into a wall outlet. This has the unfortunate side effect of doubling the number of wires running near my body. Not a big deal but I can't just get up and walk away from my desk.

Just realized I never finished Wordle. Oh well, there goes that streak.

While I've viewed a lot of spatial imagery through the headset, both in demos with Apple, and during my first day with Vision Pro, I'd never taken a spatial photo or video with the device.

I press the dedicated button on the upper left side of the headset; it asks about location tracking (I set it to While Using the App), and then lets me toggle between spatial photo and video capture with a gesture. Taking a spatial photo is pretty straightforward, but when I shoot video, on-screen visual guidance tries to keep the view straight and fixed in one position.

The 3D spatial photo of my hand is so good it's creepy.

The 3D spatial video, despite the somewhat annoying visual guidance, looks excellent.

Apple Vision Pro Review – gallery of 2 images. (Image credit: Future)

Showing your work

7:30AM ET

Noticing that some of the interface text nearest to me and at the bottom of the field of view is broken into two images. Not sure if something has gone wrong with the calibration.

The system just asked me to move the Vision Pro slightly to the left on my head. It's constantly tracking my eyes, so perhaps it noticed the eye-tracking was slightly off. That may have solved my little parallax issue.

Been experimenting with capture. I don't know how to just record my Persona in action, besides having someone else screen-record my call. I try doing it by screen recording the view of my Persona in Settings but the recording also captures all my real-world head movements, making the video unwatchable.

I did just discover that the easiest way to capture a screenshot of your Vision Pro environment is to simply ask Siri to grab a screenshot of the desktop. It works perfectly every time.

7:53AM ET

I experience my first app crash. The App Store stopped responding and then it disappeared. Can't seem to get my virtual keyboard to appear at all in the App Store or Safari.

Answering questions

8:06AM ET

Pull the headset off for a short break, not because I'm uncomfortable but because I want to let the rest of my face breathe.

8:20AM ET

Back in it and the keyboard malfunction appears to have solved itself. Realize that if I make my Virtual Mac Desktop too large and put it too high on the Vision Pro desktop, I'm craning my neck to read what's at the top. Making adjustments.

I haven't spent much time in environments but I think I prefer them dialed in about 50% when working. 100% and I can't see my physical keyboard and the atmospheric audio is maybe a bit too much for the workday.

Someone on Threads asks me if there's a lot of light leakage. I tell them little, if any. I notice just a bit around my nose but, especially in passthrough mode, the real world blends seamlessly with the augmented one. It's quite something.

My wife asks me if I feel disoriented when I remove the headset. I don't. Perhaps that's because I'm often using it with the real-world view intact. Still, I think it has a lot to do with the visual quality and eye-tracking capabilities.

Heading into video meetings on platforms that don't support my Vision Pro Persona.

Using the Apple Vision Pro virtual keyboard

Using the Apple Vision Pro virtual keyboard. (Image credit: Future)

Ready to game

10:00AM ET

I want to tie off this initial test run with a game. Apple provided an Xbox controller that I should be able to hook up to the Vision Pro to play some Apple Arcade games.

Turns out there are a lot of simple mini-games designed explicitly for the Vision Pro. I end up playing What the Golf, which takes me a little while to master. Later, I connect the controller and use it to play Asphalt 8: Airborne Plus. I find that I prefer these virtual gaming screens as large as possible, often with the Environment immersion turned to 100%. I do think gamers who can afford it will come to love the Vision Pro.

Apple Vision Pro Review

Asphalt 8 in Vision Pro. (Image credit: Future)

10:45AM ET

I end up playing for just 15 minutes before getting back to work. I launch Photoshop on my MacBook Pro and try editing photos on the big screen. It's generally a good experience though I do wonder if I'm seeing the most accurate colors on the Vision Pro Virtual Mac Display.

As I'm working, an iMessage alert comes through. I pinch the floating iMessage icon and it launches the app, where I can read the message. I could use the virtual keyboard to type my reply, but it's not good for more than a few words. I want to use the MacBook's keyboard, but since the app is running on the Vision Pro rather than the Mac, I can't. So I switch to iMessage on the Mac for full control and the ability to type on a physical keyboard.

Initial thoughts

Apple Vision Pro

Wearing Apple Vision Pro. (Image credit: Future)

What did I learn from the first two days with Apple Vision Pro? It delivers on its promises. It's versatile and powerful. The eye and gesture tracking is almost faultless. I only had to occasionally remind myself that a hand hanging down at my side would not be seen by the system cameras.

While I'd struggled to find a comfortable fit in some of my demo experiences, the time and space to select my best fit with the Dual Loop Band resulted in long-term comfort. I wore it for an hour or more at a time without any pain or discomfort.

It's as good at fun and content consumption as it is at work. I especially appreciated the Mac virtual display integration, something I now believe could transform my work life. I've always wanted a bigger desktop and now I have an almost limitless one.

For all that, I still don't know if I would spend $3,500 on it. The reality is that I don't even spend that much on my computers (if I can help it). Is a device that's equal parts work machine and entertainment room worth those extra bucks? Maybe. To be fair, it's early days, and I may have a more concrete opinion when I finish my review.


Wondershare Filmora 13 releases update with a better video editing experience for users at all levels

Creating content and sharing our lives online has become the norm, but not everybody can just sit down at their computer and put together high-quality video footage. Editing can be complicated even for advanced users. With Wondershare Filmora, it doesn’t have to be. Filmora 13.1.0, the latest update to the video editing suite from Wondershare, was designed to make content creation accessible to all, regardless of skill level. Ease of use doesn’t mean lacking in functionality, though, and Filmora is packed with useful features to give your videos an extra kick. 

AI Music Generator and Text-to-Speech 

Wondershare Filmora 13.1.0 update

(Image credit: Wondershare)

Sometimes we just want to create and share videos about our day-to-day lives, but we want to make those videos more interesting with background music. If you’ve taken an incredible vacation and want to share video footage of your adventure, you’re going to need music to accompany that, even if you’re just planning to share the footage with family and friends. However, finding the right music for your videos can be time-consuming. 

Filmora offers a solution with its AI Music Generator tools, which can help you create soundtracks that fit your vibe and are safe to commercialize. With Filmora, you can easily make those shareable moments in your life look and sound good without worry. Filmora’s latest slate of enhancements makes it even easier, allowing you to use Text-to-Speech to add voice-overs to your vlogs with natural-sounding tones categorized by scene type. 

Vlogs are not the only content that can benefit from these new features, either. Many of us have taken our educational endeavors online in recent years; teachers and professors have had to find new ways to engage their students via video, becoming content creators in the process. Soundtracks created with Filmora’s AI Music Generator can help set the tone for your lectures, and Text-to-Speech can translate your lesson into clear, natural-sounding audio that is easy for your students to understand and easy for you to create.

Special effects for everybody 

Some stories are too good not to be told, but not everybody has the backing of a major motion picture studio at their disposal. Filmora 13.1.0 features improved professional-caliber tools that let you create short films and music videos with ease, regardless of skill level (or production team).

Special effects have traditionally been thought of as an extremely skill-dependent part of content creation and cinematography. Filmora demystifies them. With just a few clicks of your mouse, your video’s action sequences can be taken up a notch with realistic motion blur customized to suit your specific needs. Want to draw extra attention to a particular element in a scene? Filmora features a Lens Zoom Effect that simulates camera zoom, giving you the creative freedom to home in on part of a scene and further enhance your storytelling. Get ready for your close-up: a well-timed zoom-in can set the scene and change the tone of your video. 

With the ability to digitally zoom also comes the option for digital magnification. The Magnifying Glass Tool in Filmora makes it easy for you, as an editor, to examine a scene in your video by getting up close and personal with it. Zoom in, make adjustments, correct your footage as necessary, and then return the frame to its proper size with the corrections intact. That’s professional-quality editing with no more effort than a few clicks of your mouse.

Create with the power of the cloud 

Whether you’re creating with the power of a production team or you’re a personal creator looking to share your life, one thing remains true: video content is a resource hog. If you’re working on projects that involve others, you may find that harnessing the power of the creative cloud can streamline the process and make it more accessible for everybody involved. 

Filmora 13 features improvements to its Cloud Resource Management and Beautification tools, making it easier to migrate custom LUTs to cloud storage. Seamless synchronization lets you and your collaborators color-grade assets across multiple devices, streamlining remote work and improving your workflow. Media files can even be imported directly from cloud storage. If you produce episodic content where consistent color grading matters, the cloud-based custom LUT feature of Filmora 13 can streamline that process by letting you enhance and color grade your footage with the power of the cloud.

Everyday editing at a professional scale 

With Filmora from Wondershare, creators of all skill levels can produce professional-quality videos and content with ease. From the DIY homemaker making short content for YouTube to full-scale production teams working on episodic content, Filmora’s suite of tools can help you put out the best content with less work. Wondershare continues to improve Filmora with each upgrade so that you can spend less time editing and more time creating.  

TechRadar – All the latest technology news

Read More

Did we just catch our first glimpse of Windows 12? If so, we won’t get the new OS until 2025

We might have just caught our first glimpse of Windows 12, although we can’t be sure about that – but what we do know is that Microsoft is making a big change with test builds of Windows.

XenoPanther on X (formerly Twitter) noticed that the internal Canary versions of Windows 11 – those in the earliest testing channel, in other words – were just forked with a new build 27547 coming into play.


The most recent Canary channel build is version 26040 (which comes with a new Voice Clarity feature to improve video chats), as you may be aware if you follow these preview releases.

So, now we have builds in the 26XXX range and also the 27XXX range, prompting the obvious question: Is the latter Windows 12 in its first test phase? Let’s discuss that in more depth next.


Analysis: I’m giving her all she’s got, Captain!

As Zac Bowden, the well-known Microsoft leaker (of Windows Central fame) points out, the likelihood here is that the next release of Windows is the 26XXX branch, which is currently rumored (by Bowden) to be Windows 11 24H2 coming later this year.


That means the 27XXX preview versions could be the next incarnation of Windows after that, the one arriving in 2025 (and these builds probably won’t go into testing with Windows Insiders for some time yet). Hence the (tentative) conclusion that this might be Windows 12, or an all-new Windows, whatever it may be called.

(Although we should further note that technically, Windows 11 24H2 will be all-new. Not the front end, mind, but the underlying foundations – it will be built on a new platform known as Germanium, which will offer considerable performance and security benefits deep under the hood.)

At any rate, this pretty much underlines the idea that Windows 12 (or next-gen Windows, whatever the final name) is not coming this year, and will probably arrive next year. After all, Windows 10 gets ditched in 2025, so it makes some sense that a new OS comes in as one shuffles out the exit door (in October 2025 to be precise).

As we’ve discussed before, one of the dangers of bringing in Windows 12 this year is that the move would fragment the desktop user base into three camps, which is clumsy and a headache for organizing updates. So that scenario is neatly avoided if Windows 12 doesn’t turn up until 2025.

As a side note, Microsoft has codenames for its OS development semesters, and the next one should have been Arsenic – but because that name was perceived as “scary and violent,” Bowden tells us, the software giant has avoided it and is instead using the codename Dilithium. Which is pretty cool for Star Trek fans (maybe Duranium will be next in line when another unsuitable real-world element pops up).

Via Neowin, Deskmodder


OpenAI quietly slips in update for ChatGPT that allows users to tag their own custom-crafted chatbots

OpenAI is continuing to cement its status as the leading force in generative AI, adding a nifty little feature with little fanfare: the ability to tag a custom-created GPT bot with an ‘@’ in the prompt. 

In November 2023, OpenAI introduced custom ChatGPT-powered chatbots, called GPTs, designed to help users have specific types of conversations. Customers subscribed to OpenAI’s premium ChatGPT Plus service could build their own GPT-powered chatbot for their own purposes using OpenAI’s easy-to-use GPT-building interface. Users could then help train and improve their own GPTs over time, making them “smarter” and better at accomplishing the tasks asked of them. 

Also, earlier this year, OpenAI debuted the GPT Store, which allowed users to create their own GPT bots for specific categories like education, productivity, and “just for fun,” and then make them available to other users. Once they’re on the GPT Store, the AI chatbots become searchable, can compete and rank in leaderboards against GPTs created by other users, and their creators will eventually even be able to earn money from them. 

Surprising new feature

It seems OpenAI has now made it easier to switch to a custom GPT chatbot, with an eagle-eyed ChatGPT fan, @danshipper, spotting that you can summon a GPT with an ‘@’ while chatting with ChatGPT.


Cybernews suggests that it’ll make switching between these different custom GPT personas more fluid and easier to use. OpenAI hasn’t publicized this new development yet, and it seems like this change specifically applies to ChatGPT Plus subscribers. 

This would somewhat mimic existing functionalities of apps like Discord and Slack, and could prove popular with ChatGPT users who wanted to make their own personal chatbot ecosystems populated by custom GPT chatbots that can be interacted with in a similar manner to those apps.

However, it’s interesting that OpenAI hasn’t announced or even mentioned this update, leaving users to discover it by themselves. It’s a distinctive approach to introducing new features for sure. 


iCloud Down: What’s happening and when will it return?

Apple's iCloud service is encountering disruptions across at least one of its major services, with users expressing their frustration on X (formerly Twitter) and some TechRadar staff being locked out as well.

So what's going on and when will iCloud be back to full service? We've reached out to Apple for answers and are covering the outage so you can find out when you'll be back to business as normal on Apple's popular cloud service.

An Apple iCloud error message

(Image credit: Future / Lance Ulanoff)

Apple iCloud services are down for at least some of TechRadar's US staff, with widespread reports online from frustrated users who cannot access Apple's iCloud email server.

Users have taken to X (formerly Twitter) to express their frustration with the iCloud outage, with Downdetector reporting at least 1,499 reports of trouble as of 4:06PM EST.


A screenshot of downdetector showing an Apple iCloud outage

(Image credit: Downdetector)

The major service that appears to be hit is iCloud Mail, which Apple reports as a total outage, with partial outages being reported for some other apps.


Elon Musk’s Neuralink has performed its first human brain implant, and we’re a step closer to having phones inside our heads

Neuralink, Elon Musk's brain interface company, achieved a significant milestone this week, with Musk declaring on X (formerly Twitter), “The first human received an implant from Neuralink yesterday and is recovering well.”

Driven by concerns that AI might soon outpace (or outthink) humans, Musk first proposed the idea of a brain-to-computer interface, then called Neural Lace, back in 2016, envisioning an implant that could overcome limitations inherent in human-to-computer interactions. Musk claimed that an interface that could read brain signals and deliver them directly to digital systems would massively outpace our typical keyboard and mouse interactions.

Four years later, Musk demonstrated early clinical trials with an uncooperative pig, and in 2021 the company installed the device in a monkey that used the interface to control a game of Pong.

It was, in a sense, all fun and games – until this week, and Musk's claim of a human trial and the introduction of some new branding.

Neuralink's first product is now called 'Telepathy', which, according to another Musk tweet, “Enables control of your phone or computer, and through them almost any device, just by thinking.”

As expected, these brain implants are not, at least for now, intended for everyone. Back in 2020, Musk explained that the intention is “to solve important spine and brain problems with a seamlessly implanted device.” Musk noted this week that “Initial users will be those who have lost the use of their limbs. Imagine if Stephen Hawking could communicate faster than a speed typist or auctioneer. That is the goal.”

Neuralink devices like Telepathy are bio-safe implants comprising small disk-like devices (roughly the thickness of four coins stuck together) with ultra-fine wires trailing out of them that connect to various parts of the brain. The filaments read neural spikes, and a computer interface interprets them to understand the subject's intentions and translate them into action on, say, a phone or a desktop computer. In this first trial, Musk noted that “Initial results show promising neuron spike detection,” but he didn't elaborate on whether the patient was able to control anything with his mind.

Musk didn't describe the surgical implantation process. Back in 2020, though, Neuralink introduced its Link surgery robot, which it promised would implant the Neuralink devices with minimal pain, blood, and, we're guessing, trauma. Considering that the implant is under the skin and skull, and sits on the brain, we're not sure how that's possible. It's also unclear if Neuralink used Link to install 'Telepathy.'

The new branding is not that far-fetched. While most people think of telepathy as people transmitting thoughts to one another, the definition is “the communication of thoughts or ideas by means other than the known senses.”

A phone in your head

Still, Musk has a habit of using hyperbole when describing Neuralink. During one early demonstration, he only half-jokingly said “It’s sort of like if your phone went in your brain.” He also later added that, “In the future, you will be able to save and replay memories.”

With the first Neuralink Telepathy device successfully installed, however, Musk appears to be somewhat more circumspect. There was no press conference, or parading of the patient before the reporters. All we have are these few tweets, and scant details about a brain implant that Musk hopes will help humans stay ahead of rapidly advancing AIs.

It's worth noting that for all of Musk's bluster and sometimes objectionable rhetoric, he was more right than he knew about where the state of AI would be by 2024. Back in 2016, there was no ChatGPT, Google Bard, or Microsoft Copilot. We didn't have AI in Windows or Photoshop's Firefly, realistic AI images and videos, or convincing AI deepfakes. Concerns about AIs taking jobs are now real, and the idea of humans falling behind artificial intelligence sounds less like a sci-fi fantasy and more like our future.

Do those fears mean we're now more likely to sign up for our brain implants? Musk is betting on it.
