Microsoft has given Windows 11’s desktop email app, Outlook, a major revamp, most notably adding Apple iCloud support for people who use iPhones or other Apple devices. The upgrade is available to all Windows 11 users, and you can add your iCloud account to the Outlook app as follows:
1. Click the cog icon in your Outlook menu, which should open your Email accounts setting. This is where you can see all of the accounts that are connected to your Outlook and manage them.
2. Select Add account and sign into your Apple iCloud account. This should connect your iCloud account.
The Outlook app supported Apple’s email service even before Windows 11’s launch, but according to Windows Latest, Microsoft is in the process of deploying a new Outlook app in place of the old one. Reception from users has apparently been lukewarm, but Microsoft is adding lots of new features with every new version.
One of the biggest complaints users have with the renewed Outlook app is that it launches in a web wrapper. The old app was a fully functional UWP app with both online and offline support, whereas the new app only gained offline support very recently. User complaints persist, and Microsoft is continuing to develop the app in the hope of improving users’ experience and their opinion of the new app.
The latest in a string of new developments
This development follows shortly after Microsoft also added compatibility with Gmail, Google Calendar, and contacts to Outlook. iCloud support is also now available to all Windows 11 users, and Microsoft is reportedly working on extending offline support for more parts of the Outlook app, including events and Calendar.
One feature that users have to look forward to as part of Microsoft’s new Outlook is being able to RSVP to meetings. Windows Latest spotted this as an upcoming update in the Microsoft 365 roadmap, which details what Microsoft has in store for various Microsoft 365 apps. This will help users receive information about the nature of any specific meeting and better decide if they would like to attend. This development is expected to debut in March 2024.
Another feature that has been added will help users understand their meetings and schedules. Microsoft explained on its Tech Community blog that users will be able to track declined meetings better in the Outlook calendar. This will be useful for many users, especially those who have overlapping or densely-packed meetings, and want to better understand what they are and aren’t attending.
How to turn on visibility for declined meetings
The above is now available in the most up-to-date version of Outlook, but it’s disabled by default. You can enable it through the following steps:
1. Open the Outlook app.
2. Go to: Settings > Calendar > Events & Invitations > Save declined events
3. Tick the Show declined events in your calendar box.
This should turn on the feature and declined meetings should begin to be displayed in your Calendar.
Declined meetings are recognized across all Outlook clients and Teams, with the exception of the classic Windows Outlook client.
It seems it’s going to take a little more to win over Windows users, but these look like solid steps. As far as we know, the features are available to all Windows 11 users with a valid copy of Outlook; if you don’t have them yet, you may need to update your Outlook app. It remains to be confirmed whether this extends to free users of Outlook on the web.
Adobe is currently holding its MAX 2023 event showing off what it has in store for the next year or so. One of the focal points of the conference was a series of 11 “Projects” that have the potential to become “important elements” of Adobe products in the future.
Recently, the company provided a sneak peek at one of these elements called Project Stardust, which has the ability to separate objects in a photograph into individual layers for easy editing. Users will have the ability to move objects around or delete them. From there, you can have a generative AI create something to take its place. The other 10 perform similarly as they harness AI technology to power their robust editing and creative capabilities. The group is split into three main categories.
Alongside Stardust in the Photos category, you have Project See Through, a tool that removes reflections in a photograph. Adobe states that glass reflections can be really annoying since they can obscure subjects. Instead of having to go through a multi-step process of editing the image on Photoshop, See Through does it all for you quickly.
Video & Audio
Similar to how Stardust can remove objects in images, Project Fast Fill can remove them in videos thanks to the company’s Generative Fill tech. It can also add or change content via “Firefly-powered text prompts.” In the example shown to us, Fast Fill can add a tie to a man whose suit doesn’t have one, or alter the latte art in a cup of coffee from a heart to a flower.
Next, Project Res Up can bump up the resolution of a clip via diffusion-based upsampling technology. Scene Change is third and it can swap out the background of a video from, say, an office building to a jungle. For audio, there’s Project Dub Dub Dub, a software tool claimed to be able to translate speech from one language to another “while preserving the voice of the original speaker”.
3D & Design
For the last category, these five are all about helping users create – even if they’re not the best artists.
Project Draw & Delight can turn your doodle into a polished drawing utilizing a text prompt to guide it. Glyph Ease “makes customized lettering more accessible” by instantly applying specific design elements to a word in Illustrator. All you have to do is provide a rough outline of what you want the AI to add.
The trio of 3D imaging tools is more situational, but impressive nonetheless.
Project Poseable’s AI can morph a 3D model to match “poses from photos of real people.” So if you upload a picture of someone striking a karate pose, the model will do the same. Project Primrose lets artists quickly alter the texture of a rendered piece of clothing. And finally, we have Neo, which helps creators build 3D objects using “2D tools and methods”.
To reiterate what we said earlier, these projects are prototypes at the time of this writing. There’s no guarantee any of these will become a new feature in Photoshop or any other Adobe product. However, there are some we believe have the potential for an eventual release.
Stardust, Res Up, and Draw & Delight appear to be the most “complete” – there aren’t as many visible flaws as with some of the others. Certain projects need more time in the oven, in our opinion; the voice from Dub Dub Dub, for example, sounds stilted and robotic rather than natural.
Microsoft tells us that its Bing search engine has hit new highs, now boasting 100 million daily active users.
The news came via a post on the Microsoft Bing blog site observing the difference that the ChatGPT-powered AI has made to Bing traffic in its first month of existence, pushing numbers up over 100 million for the first time ever.
The company notes that there are over a million users of the new Bing AI (in preview), and that this helped boost the figures of the search engine to realize the fresh milestone.
Microsoft then goes on to say that a third of the “millions of active users” of the Bing AI are new to Bing, which shows that the chatbot is driving folks to the search engine, although there seems to be an element of confusion here.
Is it a ‘million plus’ users of the AI, or ‘millions’? While technically those aren’t contradictory statements, we don’t see why Microsoft didn’t just call it ‘millions’ in both cases as that clearly sounds like more. Like a couple of million, or a few, even? Insert shoulder shrug here.
At any rate, Microsoft also acknowledged the obvious truth that Bing is still a long old way off the pace compared to Google: “This [100 million] is a surprisingly notable figure, and yet we are fully aware we remain a small, low, single digit share player. That said, it feels good to be at the dance!”
Analysis: Can this growth be sustained?
Let’s face it: Bing AI isn’t just a chatbot. It’s a vehicle to help Bing challenge Google, and one that Microsoft hopes will be swiftly moving up through the gears to gain momentum.
This isn’t just about pushing for market share with the Bing search engine either, it’s also a line of attack for the Edge browser, as we’ve already seen with the taskbar implementation of the Bing AI in Windows 11: an icon that linked through to the Bing page, opening it up in Edge.
(Note that after the disappointment around this, and the way Microsoft made it seem like the AI was integrated into the taskbar, rather than just being a link, the Bing icon has vanished from the search box for now, though we’re told it will be returning periodically).
Anyway, we can see that Microsoft’s plan is working thus far, with the Bing AI preview successfully recruiting regular users to add to the ranks of Bing searchers – and a good dollop of them – but will this state of affairs last?
And doubtless there’s still entertainment to be had prodding the AI, trying to engage with it using different angles – humor inevitably being one of them – and generally messing with the chatbot. That won’t last, though.
Don’t get us wrong, there will be serious users of the Bing bot out there, of course, but we’d imagine a sizeable chunk of the early attraction comes from the curious or mischievous.
And in that respect, initial figures are not really a yardstick of how much impact the ‘new Bing’, as Microsoft calls it, will make. If growth is sustained, and the AI is meaningfully honed and improved over the next few months, we can come back and talk about a new wave of adoption for Bing.
Until then, we remain skeptical, and our overall feeling is that Microsoft has opened the doors too early on this one. We’re not sure the AI is going to be well tuned enough to seriously impress in the way it should for some time yet, but it’s easy to see why Microsoft was keen to launch. It needs all the weaponry it can muster in the battle against Google (and Chrome for that matter), and the latter company is forging ahead with its own AI tech (Bard).
There’s no shortage of challenges you’re going to face once you decide to create a website. Whether you want to boost your business with a beautiful site or kick off that blog you’ve always wanted to create – one is sure. You’re going to need a solid web hosting service – it’ll be your site’s forever home, hopefully.
The good news is that there’s a perfect web hosting solution somewhere out there, regardless of your level of technical know-how and available budget. However, finding the right one for you and your business can take some time and careful consideration.
It’s not all about the cost, you know? A solution fit for a fabulous WordPress blog probably won’t fulfil the needs of an online store with an ever-growing number of visitors.
Before picking out the perfect solution for your online project, you should make sure that it provides the right amount of resources, as well as the ability to scale up or down. A seemingly small thing like this can set your site up for success right from the start.
So, to make things simpler for you, we’ll go through the three most common types of web hosting and tell you everything you should know before opting for any of them.
The three common types of web hosting
Whether you’re starting out or wish to switch your web hosting solution for a superior one, you’ll want to get to know the most common web hosting types and what they mean. After that, you can be sure you’ve made the best decision for the future of your website.
Most people start their journey into cyberspace with shared hosting. It’s not only simple to start with but also considerably cheaper than other solutions out there. However, it’s also the least powerful option, because a single server is shared among several users. As a result, the resources – storage space, bandwidth, CPUs, and RAM included – are shared too.
Once your site begins to grow and outgrow your shared hosting solution, you’ll probably want to upgrade to a virtual private server (VPS) hosting solution. Without breaking the bank, it will let you get rid of the primary drawback of shared hosting – shared resources. So, with a VPS solution, you’ll still share a physical server with other sites, but you will get a set of resources that are dedicated to your site, and your site only.
If you want to step up your game from a VPS solution and don’t mind paying a pretty penny for it – you should consider dedicated server hosting. As the name suggests, you’ll get your own physical server with dedicated resources, and you won’t have to share them with anyone else.
It’s no secret that shared hosting is popular for its cost-effectiveness and ease of use. It’s the cheapest of these three options, so if you can spare a mere $5 per month – you can afford it. Shared hosting is also exceedingly easy to use, so even if you’ve never made or managed a site before, you’ll pick everything up in no time. There’s no need for technical know-how – the provider’s support team will walk you through every step of the journey.
Acting as a bridge between shared hosting and dedicated server hosting, VPS offers powerful performance, high uptime, superb long-term scalability, enhanced security, customizability, and control over your server space. However, the ease of use will vary depending on whether you’re using a managed or unmanaged service.
While dedicated server hosting will cost you considerably more, it can get you everything a VPS provides plus complete control over your solution. With full root access, you can perform direct server customizations without any restrictions – alter your hardware specifications, add advanced security tools, install applications across your server, and much more. Also, a dedicated solution comes with dedicated, round-the-clock support staff.
Who should use each type of web hosting?
Being the simplest solution of the three, shared hosting is the best choice for small sites and blogs that don’t get too many visitors. It’s also a solid solution for young entrepreneurs who lack a big budget and technical know-how but don’t mind starting small.
A VPS solution offers a fine balance between resources and budget, and it’s aimed at those who have outgrown shared hosting. It’s perfect for those running high-traffic sites for small to mid-sized businesses including ecommerce platforms, SaaS providers, software engineers, and so forth.
Dedicated servers are “state-of-the-art” web hosting solutions geared towards mid-sized to big businesses that employ over 500 people and process massive amounts of data every day. For instance, if you’re running a booming ecommerce store with hundreds of transactions per hour, you’ll want to consider this type of solution.
The benefits of each type of web hosting
The three primary advantages of choosing a shared hosting solution are its budget-friendliness (pricing starts at $5 per month), its beginner-friendly approach (simple setup, built-in control panel, and site-building tools), and solid customer support (expect live chat support and access to a well-stocked knowledge base).
In comparison with shared hosting, VPS will get you more powerful performance, higher reliability, and the ability to scale up/down your server with ease. It also gives you more control over your server and a superb level of customizability.
The main benefit of utilizing a dedicated server solution is having dedicated resources that can keep up and promote the growth of your business. Also, your business site/sites should benefit from increased speed, improved SEO, and superior security.
While fully managed dedicated servers are pretty popular with large enterprises, you can also opt for a partially managed or unmanaged server and save up some money.
Things to avoid when choosing a web hosting service
Since we’ve already shared our tips and tricks on how to choose a web hosting service, now we’re going to uncover the mistakes you should avoid making when searching for your solution.
If you want to go with shared hosting, don’t settle for a free hosting service just to save up some money at the start. It will cripple your site with seriously slow speed, unreliable uptime, and non-existent customer support. Before you know it, most of your potential customers will lose trust in your brand and your business will go bust.
Likewise, don’t purchase a shared or VPS hosting solution before trying it out with a free trial – if one is provided. If not, check whether there is a money-back guarantee. Plus, don’t forget to see what the small print says – this is true for all three types of hosting.
If you’re thinking about purchasing a self-managed VPS solution even though you aren’t particularly tech-savvy – don’t do it. It’s harder than you think, and it’ll take plenty of time before you get the hang of it.
Also, don’t fail to check out the company behind the solution, including its track record and any history of security incidents. Take some time to read reviews of your potential web hosting provider – professional reviews and customer testimonials alike. And if a provider has a bad track record, avoid it like the plague.
This is particularly important when picking out a dedicated server solution as with this type of hosting one should never make compromises in terms of security.
On a final note, don’t forget to consider the needs of your online project before picking out a web hosting package for it. What type of site do you wish to create? Will you be creating a single site or several? How do you plan to build your site? How much traffic do you expect to receive each month? What additional features do you want? And how much money are you willing to spend each month?
Once you’ve answered all these questions, you’ll be a couple of steps closer to choosing the best web hosting solution for your business.
In an effort to further secure open source software, GitHub has announced that the GitHub Advisory Database is now open to community contributions.
While the company has its own teams of security researchers that carefully review all changes and help keep security advisories up to date, community members often have additional insights and intelligence on CVEs but lack a place to share this knowledge.
This is why GitHub is publishing the full contents of its Advisory Database to a new public repository to make it easier for the community to leverage this data. At the same time, the company has built a new user interface for security researchers, academics and enthusiasts to make contributions.
All of the data in the GitHub Advisory Database is licensed under a Creative Commons license – and has been since the database was first created – to ensure that it remains free and usable by the community.
Contributing to a security advisory
In order to provide a community contribution to a security advisory, GitHub users first need to navigate to the advisory they wish to contribute to and submit their research through the “suggest improvements for this vulnerability” workflow. Here they can suggest changes or provide more context on packages, affected versions, impacted ecosystems and more.
The form will then walk users through opening a pull request that details their suggested changes. Once this is done, security researchers from the GitHub Security Lab as well as the maintainer of the project who filed the CVE will be able to review the request. Contributors will also get public credit on their GitHub profile once their contribution has been merged.
To further interoperability, advisories in the GitHub Advisory Database repository use the Open Source Vulnerabilities (OSV) format. Oliver Chang, a software engineer on Google’s Open Source Security Team, provided further details on the OSV format in a blog post, saying:
“In order for vulnerability management in open source to scale, security advisories need to be broadly accessible and easily contributed to by all. OSV provides that capability.”
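For context, an OSV advisory is simply a structured JSON document. The sketch below builds a minimal record in Python – the field names follow the public OSV schema, but the advisory ID, package name, and version numbers are invented placeholders for illustration, not a real advisory:

```python
import json

# A minimal OSV-format advisory record. Field names follow the public
# OSV schema; the ID, alias, package, and versions are made-up examples.
advisory = {
    "schema_version": "1.4.0",
    "id": "GHSA-xxxx-xxxx-xxxx",       # placeholder GitHub advisory ID
    "modified": "2022-02-22T00:00:00Z",
    "aliases": ["CVE-2022-00000"],      # hypothetical CVE alias
    "summary": "Example: prototype pollution in a parsing library",
    "affected": [
        {
            "package": {"ecosystem": "npm", "name": "example-parser"},
            "ranges": [
                {
                    "type": "SEMVER",
                    # vulnerable from the first release until 1.2.3
                    "events": [{"introduced": "0"}, {"fixed": "1.2.3"}],
                }
            ],
        }
    ],
    "references": [
        {
            "type": "ADVISORY",
            "url": "https://github.com/advisories/GHSA-xxxx-xxxx-xxxx",
        }
    ],
}

# Serialize to the JSON form that would live in the advisory repository.
print(json.dumps(advisory, indent=2))
```

Because every advisory is machine-readable in this shared shape, a suggested improvement (say, narrowing the affected version range) is just a small diff to one JSON file – which is exactly what the pull-request workflow produces.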
We'll likely hear more on this change to the GitHub Advisory Database once security researchers, academics, and enthusiasts begin making their own contributions to the company's database.
Rumors of the Sony WH-1000XM4 – the successors to the best headphones we've ever tested – have been ramping up in recent months, coming to a head when a Walmart listing that appears to describe all the specs of the new headphones was leaked.
While the existence of the XM4s is yet to be confirmed by Sony, the Walmart listing revealed the kind of changes (or lack thereof) we can expect over the Sony WH-1000XM3.
Now, normally we would expect a brand to make some significant changes when bringing out a successor to its last pair of headphones. In this instance, however, we’re relieved that Sony hasn’t tinkered too much with its class-leading noise-cancelling headphones.
Instead, it looks like the company is making some very considered tweaks to the XM3, which could genuinely improve the user experience without detracting from a winning design. As they say, ‘if it ain’t broke, don’t fix it’.
What’s new with the Sony WH-1000XM4?
According to the leaked listing, one of these tweaks is the ability to connect more than one audio source to the headphones at once; multipoint pairing is something users of the XM3s have been calling out for, and it will allow you to pair the WH-1000XM4s with your laptop and your smartphone at the same time.
There should also be a slight improvement to the sound quality. A feature known as “Edge-AI” can restore the detail lost in highly compressed audio files by upscaling them in real time, which Sony says delivers sound “as close as possible to that of a wired connection.”
Wireless headphones have long been considered inferior to their wired counterparts in terms of audio fidelity, so this tweak will likely appease audiophiles who haven’t yet committed to cutting the cord.
Connectivity in general should be improved too, as Sony makes the leap from Bluetooth 4.2 to Bluetooth 5, which brings faster pairing times, connectivity over longer distances, and stronger pairing in high traffic areas.
Anyone who has used the Sony WH-1000XM3s to make phone calls should notice an improvement in the sound quality as well, with a feature called Precise Voice Pickup that uses the headphones’ five microphones and advanced audio signal processing to make your voice sound clearer.
The noise cancellation that made the Sony WH-1000XM3s so popular is also due an upgrade. According to the leaked listing, a feature called Adaptive Sound Control will “learn to recognize locations you frequently visit, such as your workplace or your favorite cafe.”
“In addition it automatically detects what you’re up to – for example, walking, waiting, or traveling – and then adjusts ambient sound settings to best suit the situation,” says Sony. This is a feature that’s already been brought to the XM3s via a firmware update, so we've had a bit of a preview already.
These are all smart tweaks to already-great features. So what’s staying the same with the Sony WH-1000XM4?
What’s staying the same?
Aside from these little tweaks and upgrades, the new XM4s seem to be very similar to their predecessors.
It looks like there won’t be any material changes to the design of the Sony WH-1000XM4s, which we think is a great thing. We loved how comfortable the XM3s felt, with big padded earcups and a soft headband.
They also looked great, with a sleek, minimalist build that appeals to a wide range of people, and we liked the touchpad controls – another feature that will be making a return.
The sound quality shouldn’t change substantially either, aside from that AI upscaling feature that will help to curb the data loss from highly compressed files. Judging from the leaked listing, the XM4s will use the same 40mm drivers as their predecessors and support for Sony’s LDAC transmission technology – and as the XM3s are among the best-sounding headphones on the planet, we’re happy to see that the audio profile hasn’t been tweaked too much.
Some may be disappointed to find that there’s no improvement to battery life – but with 30 hours of juice, the Sony WH-1000XM3 weren’t exactly short-lived. Plus, with a return of USB-C fast charging, the XM4s shouldn’t take too long to top up.
A considered approach
Sony has a history of making careful tweaks to its products with each upgrade, and it’s something we’ve seen with the brand’s noise-cancelling 1000X range before.
It’s a great way of instilling a sense of trust in the products, and it makes us feel confident that each new upgrade will bring genuinely useful updates, rather than skin-deep design changes that don’t really improve the experience of using the headphones.
Sony wouldn’t be able to be subtle with its upgrades to the 1000X series if the original product wasn’t so good – and in a market where every company is trying to outdo one another with headline-grabbing features like gesture controls and built-in AI (like the TicPods Pro 2), it’s a risky move to let the sound, feel, and look of the headphones speak for themselves. That's especially true with the first-ever Apple over-ear headphones looking like they're going to launch in a matter of weeks and shake up the headphones market.
Trends (or gimmicks, if you prefer) like virtual 3D audio, bone conduction, and crazy form factors (see: the Bose Frames) may come and go – but we don’t think there will ever be a time when people won’t want a great-sounding pair of noise-cancelling headphones that do their job with minimal fuss.
Hopefully, that’s exactly what the Sony WH-1000XM4 will do when they’re finally released – and with this recent leak, it’s only a matter of time before we can get our hands on them and find out for ourselves.
In a blog post that came seemingly out of nowhere, Sony finally revealed the new DualSense PS5 controller. It marks a radical departure from the DualShock 4, but the redesigned pad will share one thing in common with its predecessor – developers will continue to ignore almost all of its unique features.
And that’s a shame, as the DualSense is stuffed full of exciting and potentially game-changing technology. Sony wants to tingle your fingertips and massage your palms in a variety of interesting ways using haptic feedback and adaptive triggers – and I’m all for it.
We’ve seen the tech used effectively in VR controllers, but if you’re new to haptic feedback it basically means you’ll feel more of what you see on screen – the sludginess as you drive a car through mud or the tension of pulling back a bow string as you shoot an arrow, for example.
The problem is – and I hate to admit this – that these features will largely be ignored by everyone but Sony’s first-party studios. History has shown us time and time again that even if you design a console entirely around a distinctive input device (hello, Nintendo Wii), third-party developers will still find a way to ignore 95% of a controller’s special qualities.
Let’s take a look at the DualShock 4 as our primary suspect. It’s got a lovely light bar which can change color to reflect what’s happening in a game, such as flashing white if you’re using a torch, or turning red if your health is low. How many games use it in this way, though? The answer is: barely any.
Next up, the DualShock 4 touchpad. If you ever needed a more concrete example of developer apathy in full effect, it’s that battery-draining touchpad. We saw Killzone: Shadowfall, a PS4 launch title, use the touchpad in some interesting ways – as did Infamous: Second Son. But how many other games can you name that transform the experience in any meaningful way using this feature? Probably no more than a handful, because basically every game just uses it as an oversized map button. Brilliant.
What about the PS4 accelerometer? A feature that’s been around since the SixAxis controller, which launched with the PlayStation 3. When did you last play a video game that used the accelerometer for something other than a silly gimmick? Yeah, didn’t think so.
But hold on. Maybe it’s because those features were rather superfluous. I mean, come on, a flashing light that you can’t even see most of the time? Who cares! Members of the court, may I present to you exhibit B: HD Rumble on Nintendo Switch.
The masters of cramming quirky technology down gamers’ throats, Nintendo always tries to introduce some bizarre new input system into its consoles. With the Nintendo Switch it was no different. We were promised the sensation of feeling ice cubes in a controller – because of course we were. Despite the technology genuinely wowing in games like 1-2-Switch, it’s basically been ignored, even by Nintendo itself, and hasn’t come close to reaching the potential we were promised.
Still in denial? Okay, let’s wrap this up with one more sorry example. You might not know this, but the Xbox One controller has impulse triggers. And they’re freaking awesome and never, ever get used.
Do yourself a favor and play any of the Forza Motorsport games on Xbox One and you’ll experience a fingertip-defining moment that will make every other racing game seem a little sad in comparison. The triggers rumble and respond according to where your tires are on the track, so you can physically feel the sensation of a wheel locking up, moving over gravel and responding to torque. It’s so damn good, but clearly not a priority for any developers.
One feature fits all
So why does this worrying trend constantly happen? Truth be told, it all comes down to time and money. Video games are extremely expensive to make, and require a lot of resources to do so. There’s no monetary benefit to developers spending the extra time to code for features that are specific for one console. Occasionally it can happen, but it’s an anomaly.
The odds are stacked against the DualSense controller, then. There’s no doubt that we’ll see some truly awe-inspiring moments from Sony’s first-party studios (firing Aloy’s bow in Horizon: Zero Dawn 2 is a given for the adaptive triggers), but try not to feel too disappointed if half the time these features come as a pleasant surprise, rather than a new standard moving forward.