WhatsApp is making it easier to transfer chat logs from your old phone to a new one just by scanning a QR code.
Meta CEO Mark Zuckerberg made the initial announcement on his Instagram channel, where he said this method lets you move your data privately without ever having to leave your devices. In the video he posted, you first open the QR code on the older device, then scan that code with the newer phone. Give it about 10 seconds to finish up and you’re done. Other reports state the Chat Transfer tool can be found under the Chats section in the Settings menu.
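WhatsApp hasn’t published the protocol behind this feature, but the general shape of the flow described above – a one-time secret shown as a QR code on the old phone and proven by the new phone before any chats move – can be sketched in outline. Everything below is an illustrative assumption, not WhatsApp’s actual implementation; the function names are hypothetical.

```python
# Illustrative sketch only: WhatsApp's real transfer protocol is not public.
# The QR code carries a one-time secret shown on-screen (never sent over the
# internet), and the new phone proves possession of it before data transfers.
import hashlib
import hmac
import secrets


def old_phone_generate() -> str:
    """Old phone: mint the one-time pairing secret encoded in the QR code."""
    return secrets.token_urlsafe(32)


def new_phone_respond(scanned_secret: str, challenge: bytes) -> bytes:
    """New phone: answer a challenge using the secret it just scanned."""
    return hmac.new(scanned_secret.encode(), challenge, hashlib.sha256).digest()


def old_phone_verify(secret: str, challenge: bytes, response: bytes) -> bool:
    """Old phone: only start the transfer if the response proves possession."""
    expected = hmac.new(secret.encode(), challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

The point of the challenge–response step is that the secret itself never crosses the network, which is consistent with Zuckerberg’s claim that your data never leaves your devices.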
📱📲 Now you can transfer your full chat history seamlessly, quickly and securely across the same operating systems without ever having to leave the app. Out today 👀 pic.twitter.com/UqNpyw8bCC
June 30, 2023
Compared to the old method of backing up your history to either Google Drive or iCloud, this is a lot more straightforward. You’re effectively cutting out the middleman, plus you don’t have to worry about hitting storage limits if your WhatsApp account has several gigabytes’ worth of media saved on it.
As great as this new feature may be, it appears there is a catch. The Verge claims the QR code chat log transfer “only works between devices running the same operating system, so Android to Android or iOS to iOS.” If you want to move your data from, say, a Samsung Galaxy phone to an iPhone or vice versa, you’ll have to head over to WhatsApp’s Help Center for instructions on how to do so.
We asked Meta to confirm whether this is the case, and we’ll update this story when we hear back.
Availability
Meta is currently rolling out the Chat Transfer tool in waves to all its users, so be sure to keep an eye out for the update once it arrives. There’s no word yet on whether there are plans to add a similar feature to the desktop version of WhatsApp.
If any of this sounds familiar to you, that’s because WABetaInfo first revealed the update back in early May, when it was only available to beta testers. The publication has since shown off other interesting changes coming to WhatsApp. For instance, a WhatsApp beta on Android from late May introduces screen-sharing for video calls, which you can activate right after installation and try out with others. There are also plans to introduce multi-account support to the platform, giving people a way to swap between profiles on the same smartphone.
In Holopresence land you can be two places at once. One is sitting on a director’s chair in front of a green screen, sweating under a half dozen stage lights. The other is half a world away on a semi-translucent screen, addressing an audience who almost believes you’re sitting right there with them.
I walked across midtown Manhattan in the soaking rain to see ARHT Media’s Holopresence experience in person earlier this week. (And with water dripping off my hat and coat, I found myself wishing I’d done this meeting as a hologram.)
To be clear, what ARHT provides is not, technically, a hologram. It’s a canny projection system that employs mostly off-the-shelf technology, a proprietary screen, and special software to make people believe someone is sitting in front of you, as opposed to – in my case – Toronto.
He was never really there
ARHT Media is a Toronto, Canada, telepresence company that just opened its first Holopresence studio in a WeWork building in midtown Manhattan. They invited me for a look.
As I walked into the WeWork space, basically a vast, mostly unfurnished office floor, I was greeted by ARHT Media SVP Terry Davis and company CEO Larry O’Reilly, who was standing off to the side looking at his phone. O’Reilly looked a little odd, as though he was standing before a bright light that I couldn’t see. Then he abruptly dematerialized and was gone — my first experience with this Holopresence technology.
I wanted to try this for myself, but before anyone could transform me into a Holopresence, Davis walked me through the technology's fundamentals.
“We’re a projection system,” Davis told me. Gesturing toward the cube-like setup in a semi-darkened space on the far side of a cavernous WeWork room, where O’Reilly had “stood” just moments ago, Davis explained that the entire system is portable and “breaks down into a couple of duffle bags. We go anywhere in the world.”
The cube that the “virtual you” beams in and out of consists of poles, black curtains on the back and sides, and a special screen stretched across the front. Unlike a standard movie screen, this one is a nylon-like mesh with a high-gain reflective coating. “It’s transparent and reflective at the same time,” explained Davis.
Aside from ARHT’s matrixed software (handling multi-channel communication for various holopresences in real-time), the screen is the company’s only other piece of proprietary technology. Still, it is effective.
Behind the screen, I noted a few props, including a pair of plants and some floor lighting. These and the distance to the back curtain create the illusion of a depth of field behind a Holopresence. “You have to have a certain degree of depth of field in order for your brain and eyes to perceive that parallax,” said Davis.
A world of Holograms
ARHT is by no means the only company creating virtual people for events, concerts, panels, exhibits, and families. There’s Epic HoloPortl, for example. It has white, booth-like boxes, called PORTLs, in which people appear to materialize. The effect is arresting. Davis, while not wanting to criticize Epic HoloPortls, called them “white coffins with no depth of field.”
He also noted that his product can accommodate multiple people from multiple locations on one screen, while PORTL fits one in a box.
Plus there’s the portability factor. A Holopresence system, which would include the screen, curtain, poles, an off-the-shelf projector (they were using a Panasonic DLP for my demonstration), and microphones and speakers, can fit in a large bag. It’s not clear how portable the PORTL boxes are.
Still, on the other side of a Holopresence presentation is someone sitting in front of a green, black, or white screen. They’re mic’d up, facing a camera, and, in my case, hunkered down under substantial lighting, meaning that for a live Holopresence event, there are always two sides to the technology equation.
Davis told me that the technology they use to create these hologram-like presences is not much different than what we’ve seen with virtual Michael Jackson in concert or Tupac Shakur at Coachella. In those instances, the image was projected from the ground onto a reflective surface, which bounced it off a giant screen. Holopresence’s projector sits outside the curtained area, facing the screen.
Lance Ulanoff and ARHT CEO Larry O’Reilly (beaming in) (Image credit: Future)
Most of ARHT Media’s clients are businesses, enterprises, and billionaires (there was an Antarctic yacht cruise where people like Malcolm Gladwell beamed in to talk to a select audience). Davis described multiple panels where they beamed people in from around the world. Back at each of their studios, panelists are surrounded by screens that stand in place of other panelists. If someone is seated to the left of you, that’s where the screen will be. They even try to accommodate height differences. If the speaker on the left is much shorter than you or, say, on a different level on the stage, they adjust the screen height accordingly. A feed of the audience is usually placed in front of the speaker. What they see is holo-panelists looking back and forth at other holo-panelists.
To accommodate large panels or events with large audiences, ARHT offers a range of screen sizes that can be as small as 5 feet and as large as, well, a stage.
ARHT does have some consumer impact. During COVID travel restrictions, the company helped a bridesmaid in England virtually attend a wedding in America. In New Jersey’s Hall of Fame, the company has built a kiosk where visitors can “speak” to life-sized video versions of Bon Jovi and Little Steven.
Still, ARHT is not priced for your average consumer. A single-person Holopresence can run you $15,000. For more people on the screen, it could cost as much as $30,000.
Beaming in
Lance Ulanoff in ARHT studio becoming a Holopresence (Image credit: Future)
After a power outage at the Toronto headquarters (no amount of tech magic can overcome a lack of electricity), we finally got ARHT’s CEO back for a quick virtual chat. The roughly 6ft-tall O’Reilly looked solid. As we talked and he reiterated many of the points Davis and I covered, I found myself focusing on the image quality. Dead-on, it was perfect. From O’Reilly’s white hair down to his shoes, he appeared to be standing before me (on a slightly raised stage). I shifted to the left and right and found the effect holding up pretty well. Davis claims the projection doesn’t flatten out until you’re between 120 and 140 degrees off-axis. I’d argue the viewport is a bit narrower.
As we conversed, though, I experienced another key part of ARHT’s Holopresence secret sauce: low latency. The conversation between the two of us was free-flowing. Even when we did a counting test (we counted to ten with each of us alternating numbers), there was, perhaps, a sub-second delay.
To achieve this effect, ARHT uses low packet bursting transmission to create a smooth, conversational experience between people in Hong Kong and Australia or a reporter in New York City and a CEO in Toronto.
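ARHT hasn’t disclosed the details of its transmission scheme, but the counting test above is essentially a round-trip latency measurement. As an illustration only, here is how that sub-second delay could be measured in code, using a local UDP echo server as a stand-in for the remote studio; everything here is a hypothetical sketch, not ARHT’s system.

```python
# Hedged sketch: measures round-trip time the way the counting test does,
# with a local UDP echo server standing in for the far end of the call.
import socket
import threading
import time


def _echo_once(server: socket.socket) -> None:
    """Far end: bounce a single packet straight back to the sender."""
    data, addr = server.recvfrom(1024)
    server.sendto(data, addr)


def measure_rtt_ms() -> float:
    """Send one packet, wait for the echo, and return the delay in ms."""
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))  # OS picks a free port
    threading.Thread(target=_echo_once, args=(server,), daemon=True).start()

    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.perf_counter()
    client.sendto(b"ping", server.getsockname())
    client.recvfrom(1024)  # block until the echo arrives
    rtt_ms = (time.perf_counter() - start) * 1000.0

    client.close()
    server.close()
    return rtt_ms
```

On a loopback connection this will report well under a millisecond; over a real intercontinental link, keeping the figure comfortably under a second is the kind of target the conversational experience described above implies.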
(Image credit: Future)
One thing I noted throughout the demo were the references to Star Trek transporter technology. There was even a screen in the space showing a loop from the original Star Trek series where the team beams down to an alien planet. When you start a Holopresence experience, people “beam in” with a very Star Trek-like graphic flourish and sound effect. I asked O’Reilly if he's a Star Trek fan and what he thought about the connection. He didn’t answer directly and instead pointed out how the sound and graphics are completely customizable.
Finally, it was my turn. I sat in the green screen space and tried to look like I wasn’t about to experience a lifetime dream of mine. My beam-in moment was, initially, a little underwhelming. I couldn’t see myself; the Holopresence space was across the room.
When it was over, I walked over, and Davis replayed my big moment. Seeing myself teleport into the room like a bald Captain Kirk was everything I hoped it would be.
In a blog post, the company revealed that members of the Targeted Release early access program can now use a small number of Microsoft Teams apps from within its Outlook email service and Office.com.
“With this enhancement, apps built for Teams not only run everywhere Teams runs, but also in more of the places that users spend their time in Microsoft 365,” Microsoft explained.
Microsoft Teams apps
Since the start of the pandemic, Microsoft has fought to establish Teams as the central hub for working. The objective was to give workers access to all the tools they need in one place, by integrating a variety of first- and third-party services into the platform.
And the strategy appears to have paid dividends. The latest data suggests Teams has racked up more than 270 million monthly active users (MAUs), up from fewer than 50 million daily active users before the pandemic began.
However, Microsoft has now shifted its approach to focus more closely on creating fluid experiences that streamline the transition between various Microsoft 365 services. The introduction of Teams applications to other Microsoft platforms can be seen as part of this process.
At first, there will only be a handful of Microsoft Teams apps available outside the regular client; some from Microsoft itself (e.g. Power BI) and some from third-party vendors like Zoho and Mural. And these apps will also only be available to a small number of users in preview.
However, the company has promised to double down on the program, with a raft of Teams apps migrating to other Microsoft spaces in the months to come.
Microsoft says IT administrators can control which “enhanced Teams apps” are available to employees from within the regular Teams admin center.
Taking photos in iOS has always been a relatively simple affair, just by using the Camera app by Apple. But third-party developers have gone further to make the iPhone camera work harder for you and the photos you take every day.
This is what Obscura has been doing since its launch in 2015. Developed by the Obscura team of Ben Rice McCarthy, Adam K. Schmidt and Sara Lovic, the third version of the app launched this week (February 17) for $9.99 / £9.99 / AU$10.99.
This new version brings a redesigned gallery view, video capture, and refined layouts for controlling exposure settings and the multiple lenses of the iPhone models, alongside controller support. This allows anyone with an Xbox or PlayStation controller to take a photo through Obscura 3.
Having used the update for a month, it’s a significant improvement over Obscura 2. The new gallery view brings your albums front and center, giving you a quick overview of what you want to select.
There’s also the ability to rate your photos on a scale, rather than simply favoriting them as in Apple’s Photos app. Here, you could take a selection of photos, say of different candidate locations for a wedding venue, and rate them in order. It makes sorting photos much easier, as it could help you decide on certain locations or products for those important situations.
(Image credit: TechRadar)
It’s the gestures that help make Obscura 3 shine – especially the exposure gesture. As you’re taking a photo, you can press the exposure icon on the bottom-left of the app to change how light or dark you need the image to be. But if you use your thumb to slide up and down on the icon, you can more accurately choose the exposure point instead.
These little touches are found across the app in this third version. While you can’t currently change the default camera app in iOS, Obscura 3 makes a compelling case for why the option should be there for pro users.
A chat with Obscura’s developer
Speaking with McCarthy after the launch of Obscura 3, I asked them whether the pandemic inspired the development of the new update, in regards to features and what users were asking for. “Not particularly. In an ideal world we would have taken a trip to somewhere exciting to take incredible marketing photos of rainforests or glaciers,” McCarthy clarifies. “But for the most part, the production of Obscura 3 wasn’t all that different to Obscura 2.”
With every major update to an app, there’s always the question of what the main objective was for the newest version. We asked McCarthy what the aim was for Obscura this time. “Am I allowed to say everything? Because we really did throw it all out and start from scratch,” McCarthy continues. “There are obvious changes like the new camera interface, but everything has been rewritten and improved, like the gesture to close the camera, the photo capture pipeline, the filters to support P3 color, I could go on all day.
“If I had to choose just one though, I’d probably say the Image Detail view,” McCarthy reveals. “There’s an astonishing amount of complexity to it. It was honestly pretty janky in Obscura 2. It now has better support for RAW files, depth data, video (for the first time!), and is much smoother at handling changes to the photo library while you’re browsing. The triage features are also really neat if you care about keeping your library organized.”
(Image credit: TechRadar)
We wanted to mention the Exposure wheel, which we found very intuitive for allowing certain amounts of light in. We asked McCarthy how this came to be, and why it’s arriving in this update.
“Conceptually, the Exposure and Focus dials were planned from the very start. In fact, I had built a very rudimentary version of them in Obscura 1, but it wasn’t great,” McCarthy explains.
“We played around with the functionality quite a bit. Should the dial have values displayed around the ring? Should the sensitivity vary as it expands? How sensitive should the haptic feedback be? But everything we added made it feel less intuitive and more distracting. In the end, the simpler it was, the more natural and like using a physical camera it felt.”
With the new gallery view being a tentpole feature of Obscura 3, we asked McCarthy whether there was going to be an option for opening the app and having the gallery appear first.
“I had thought of making that an option for the forthcoming iPad version, which is well suited to browsing and editing photos, but I hadn’t really considered it for the phone,” McCarthy explains. “But if we build that functionality anyway, I don’t see why we wouldn’t add it to the phone.”
(Image credit: TechRadar)
In-app purchases, or IAPs, let users buy extra features within an app. Previous Obscura versions used them to sell additional filters, but Obscura 3 has none.
We asked McCarthy what the reasons for this were, and whether IAPs have had their day, especially for photography apps.
“There were a few reasons behind this decision, benefitting both us and the user. The first is that we wanted to avoid the feeling of upselling, especially when the user is in the middle of taking photos,” McCarthy explains.
“Secondly, the StoreKit API has also been a pain to work with in the past and was the source of more support email than any other part of the app. And thirdly, having IAPs for filters necessitated having example photos for the product pages, and those added quite a considerable amount to Obscura’s download size (O2 was about 70MB and O3 is down to about 5MB, though the sample photos weren’t the only factor).”
The gallery view also shows promise for other Apple platforms, such as macOS, an operating system that doesn’t have an Obscura app. We wondered whether this is something up for consideration.
“I’m certainly not promising anything right now, but I have tried building it for the Mac using Catalyst and it mostly runs without issue,” McCarthy reveals. “The real work would be in making it feel more at home on macOS, so I guess we’ll have to see if we can find the time to make it happen.”
(Image credit: TechRadar)
A surprising feature was the integration of controller support in Obscura 3. You could use a DualSense controller to take a photo if needed. We asked whether this was always intended and if there are further plans to expand this in the future.
“As I was working on the Apple Watch companion app, it occurred to me that it would be nice if there was an alternative for people who don’t have one. And I had a spare PS4 controller (in theory for playing more games on iOS, though I rarely use it) and I realized that could be a decent alternative,” McCarthy reveals.
“There’s not much functionality there right now, but we have plenty of ideas on improving this feature that just didn’t make the cut for launch.”
Finally, widgets are still being heavily used on iPhone and iPad devices, where you can place bite-sized information on your home screen without launching the app. For Obscura, this seems to be a natural step, especially for rated photos and shortcuts for launching different modes of the app.
We asked McCarthy whether this was something that they were considering. “Definitely. As soon as the launch chaos is over we’re going to start work on widgets, and we already have a few planned,” McCarthy continues. “Having access via the lock screen is a big bonus that widgets also provide. And given that we may never see an option to set third-party camera apps as the default, we have to take what opportunities we can get.”
Google, Intel, Nvidia, AMD. These and other companies made it a mission to mention Apple in some way at this year’s CES.
While Apple wasn’t actually at the event in Las Vegas, it still felt like everyone was talking about the company.
If you watched Intel and Nvidia’s live streams, you would have heard about products claimed to be faster than Apple’s M1 Max chip, for example (although further research appears to have poured cold water on these claims from both companies already).
It’s only when you scratch the surface that you find that Apple has already been regarded as the winner at CES this year in a variety of categories, without being there in any official capacity.
Apple was in the eye of many at CES 2022
Intel was quick to compare its newest Alder Lake chips with the M1 Max, currently available in the 14-inch and 16-inch MacBook Pro models. The press release directly states that the Intel Core i9 chip is faster than the M1 Max, but when you consider the heat this CPU dissipates compared to Apple’s chips, there’s more than just speed on which Apple beats the competition.
(Image credit: Intel)
The Core i9 can draw up to 115 watts of power, while the M1 Max requests 60 watts in regular use, and it usually doesn’t exceed 90 watts when macOS demands more power from the chip.
But this is just one example from Intel, as the company also showcased Apple Watch and iMessage integration with upcoming Evo PCs through Screenovate, an app that was recently acquired by the company in December.
Google also announced an effort to mimic the connectivity that Apple’s ecosystem of devices boasts. Soon, you’ll be able to pair multiple devices through an upcoming feature called Fast Pair, allowing your Android phone to unlock your Chromebook, or your Pixel Buds to swap between your phone and laptop with no issue.
However, this isn’t just about Google and Intel being inspired by Apple’s software features.
But wait, there’s more
Find My is Apple’s location service, which lets compatible products be tracked through the Find My app. It’s the same technology that powers AirTags, and the company is now allowing other manufacturers to use it. Targus was one example at CES, integrating Find My into a backpack.
Belkin also announced earbuds that would feature Find My, alongside a mount for the iPhone 12 and iPhone 13 series that will be able to track your face, ideal for video calls or for creating the next viral TikTok video.
There are also more efforts by companies such as Eve to integrate Apple’s HomeKit, a framework for managing your home devices through the Home app, so you can control lights, your heating, and soon window blinds around your home when needed.
These are just some examples of what was announced at CES 2022, and they show how Apple was everywhere at the event without being present itself. The only time someone from Apple has appeared at CES was in 2020, when Jane Horvath, Apple's senior director of privacy, took to the stage in a privacy roundtable.
And, the year before, the company decided to put up a banner in front of CES promoting the fact that your information is stored on your iPhone only.
(Image credit: Future)
This just proves that Apple doesn’t need to be at CES – companies such as Belkin with its products and Intel with its charts do the hard work for the company regardless. While there’s still doubt over whether events like this are needed in a post-pandemic world, the underlying theme was that companies are suiting up for a battle that has, for the most part, already been won by Apple.
Check out all of TechRadar’s CES 2022 coverage with reviews, reaction, and analysis of the best new tech we’ve seen, from 8K TVs and foldable displays to new phones, laptops and smart home gadgets.
Microsoft is doubling down on efforts to drive the adoption of collaboration platform Teams in conference room settings.
As per two new entries to the company’s product roadmap, users will soon be able to perform a wider range of actions via Microsoft Teams panels, the touchscreen devices mounted outside of meeting rooms.
For example, Microsoft Teams users will be able to “check out” of conference rooms if their meetings end earlier than expected, as a courtesy to colleagues who might need the space. In a similar vein, it will also be possible to extend existing reservations via Microsoft Teams panels, provided doing so does not create a clash.
Microsoft Teams in the office
Both updates are still currently under development, but should roll out to all Microsoft Teams panel users by the end of May.
With the majority of experts predicting a widespread transition to hybrid working once the pandemic recedes, Microsoft is eager to position its collaboration and video conferencing platform as the go-to choice for in-office deployments too.
The company faces stiff competition from the likes of Zoom, which has its own range of conference room solutions, but Microsoft is putting in the legwork with a series of updates of its own.
In addition to the new functionality set to land in the spring, Microsoft recently announced a feature to help companies monitor and limit the capacity of meeting rooms, to facilitate social distancing (if it’s still required in future).
The desired room capacity is set by the administrator and overcrowding is detected using cameras with people-counting functionality. If a room is over capacity, an alert will appear on the in-room display and Microsoft Teams panel outside.
Microsoft will hope that features such as these, in addition to a roster of intelligent meeting room hardware (displays, webcams, microphones etc.), will help build on the strong foundation established during the pandemic.
Thousands of new domains are registered every day by businesses and individuals building websites, but new research from Palo Alto Networks has revealed that cybercriminals often register malicious domains years before they intend to actually use them.
The cybersecurity firm's Unit 42 first began its research into dormant malicious domains after it was revealed that the threat actors behind 2019's SolarWinds hack used them in their attack. To identify strategically aged domains and monitor their activity, Palo Alto Networks launched a cloud-based detector in September of 2021.
According to the findings of the firm's researchers, 22.3 percent of strategically aged domains pose some form of danger: a small portion are outright malicious (3.8%), the majority are suspicious (19%) and some are unsafe for work environments (2%).
The reason cybercriminals and other threat actors let a domain age is to create a “clean record” so that their domain will be less likely to be blocked. Newly registered domains (NRDs), on the other hand, are often flagged as suspicious by security systems because so many of them turn out to be malicious. Even so, according to Palo Alto Networks, strategically aged domains are three times more likely to be malicious than NRDs.
Detecting malicious domains lying dormant
When a sudden spike in traffic is detected, it's often the case that a strategically aged domain is actually malicious. This is because normal websites typically see their traffic grow gradually from when they're created as more people visit a site after learning about it through word of mouth or advertising.
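The spike heuristic described above can be sketched in a few lines. This is a minimal illustration, not Unit 42's actual detector: the function name, thresholds, and window sizes are all assumptions chosen for the example.

```python
def looks_like_activation(daily_visits, ratio=10, min_history=30):
    """Flag a domain whose traffic jumps suddenly after a long quiet period.

    daily_visits: visit counts per day, oldest first. The ratio and
    history thresholds are illustrative, not real detector parameters.
    """
    if len(daily_visits) <= min_history:
        return False  # not enough history to judge a trend
    history, today = daily_visits[:-1], daily_visits[-1]
    baseline = sum(history) / len(history)
    # A legitimate site grows gradually via word of mouth or ads; a
    # dormant malicious domain sits near zero, then spikes when the
    # campaign goes live.
    return baseline < 5 and today > max(ratio * max(baseline, 1), 50)
```

A domain with months of near-zero traffic followed by hundreds of hits in a day would trip this check, while a site whose visits climb steadily would not.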
At the same time, domains that aren't intended for legitimate purposes often have incomplete, cloned or questionable content and usually lack WHOIS registrant details as well. Another sign that a domain was registered and intended to be used at a later time in malicious campaigns is DGA subdomain generation.
For those unfamiliar, a domain generation algorithm (DGA) is a method of generating domain names and IP addresses that serve as command and control (C2) communication points, used to evade detection and block lists. Just by examining sites using DGAs, Palo Alto Networks' cloud-based detector was able to identify two suspicious domains each day.
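To make the idea concrete, here is a toy DGA sketch. Real malware families use a wide variety of schemes; this hash-based version is just one common pattern, and the seed and label length are arbitrary assumptions for illustration.

```python
import hashlib
from datetime import date

def dga_domains(seed, day, count=3, tld=".com"):
    """Toy domain generation algorithm, for illustration only.

    Malware and its operator run the same algorithm with a shared
    seed, so both sides derive the same rendezvous domains each day
    without hard-coding any of them into the binary.
    """
    domains = []
    for i in range(count):
        material = f"{seed}:{day.isoformat()}:{i}".encode()
        digest = hashlib.sha256(material).hexdigest()
        # Take 12 hex characters as a pseudo-random domain label.
        domains.append(digest[:12] + tld)
    return domains

print(dga_domains("example-seed", date(2021, 7, 1)))
```

Because the output changes every day, defenders cannot simply blocklist a fixed set of C2 domains, which is why detectors look for the algorithmic pattern itself rather than individual names.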
During its investigation, the cybersecurity firm discovered a Pegasus spying campaign that used two C2 domains registered in 2019 that finally became active two years later in July of 2021. Palo Alto Networks' researchers also found phishing campaigns that used DGA subdomains as well as wildcard DNS abuse.