NVIDIA Instant NeRFs need just a few images to make 3D scenes

NVIDIA sees AI as a way to put new tools into the hands of gamers and creators alike. NVIDIA Instant NeRF is one such tool: it leverages the power of NVIDIA’s GPUs to make generating complex 3D scenes and objects orders of magnitude easier.

In effect, NVIDIA Instant NeRF takes a series of 2D images, works out how they overlap, and uses that knowledge to build an entire 3D scene. A NeRF (Neural Radiance Field) isn’t a new idea, but creating one used to be slow. By applying machine learning techniques and specialized hardware to the process, NVIDIA made it quick enough to be almost instant, hence the name.
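
To make the idea concrete, here’s a toy sketch (emphatically not NVIDIA’s code) of the volume rendering step at the heart of any NeRF: a field maps 3D points to color and density, and samples along each camera ray are alpha-composited into a single pixel. The `toy_field` function below is an invented stand-in for the trained neural network.

```python
import numpy as np

# Toy illustration of NeRF-style volume rendering. Real NeRFs replace
# toy_field with a trained neural network; everything here is invented
# purely to show the shape of the computation.

def toy_field(points):
    """Map 3D sample points to (RGB color, density); stands in for the MLP."""
    density = np.exp(-np.linalg.norm(points, axis=-1))  # denser near the origin
    color = 0.5 + 0.5 * np.tanh(points)                 # arbitrary RGB in [0, 1]
    return color, density

def render_ray(origin, direction, num_samples=64, near=0.1, far=4.0):
    """Alpha-composite samples along one camera ray into a single pixel."""
    t = np.linspace(near, far, num_samples)             # depths along the ray
    points = origin + t[:, None] * direction            # 3D sample positions
    color, density = toy_field(points)
    delta = t[1] - t[0]                                 # spacing between samples
    alpha = 1.0 - np.exp(-density * delta)              # opacity of each sample
    transmittance = np.cumprod(np.concatenate(([1.0], 1.0 - alpha[:-1])))
    weights = transmittance * alpha                     # contribution per sample
    return (weights[:, None] * color).sum(axis=0)       # final RGB value

pixel = render_ray(np.array([0.0, 0.0, -2.0]), np.array([0.0, 0.0, 1.0]))
print(pixel)
```

Training inverts this picture: the network’s weights are adjusted until rays rendered this way reproduce the input photos, and Instant NeRF’s contribution is making that optimization fast.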

Being able to snap a series of photos, or even record a video of a scene, and then turn it into a freely explorable 3D environment offers a new realm of creative possibility for artists. It also provides a quick way to turn a real-world object into a 3D model.

Some artists are already realizing the potential of Instant NeRF. In a series of artist showcases, NVIDIA highlights how creators use it to share historic artworks, capture memories, and let viewers immerse themselves fully in a scene without being beholden to the original composition.

Karen X. Cheng explores the potential of this tool in her creation, Through the Looking Glass, which uses NVIDIA Instant NeRF to create the 3D scene through which her camera ventures, eventually slipping through a mirror into an inverted world. 

Hugues Bruyère uses Instant NeRF in his creation, Zeus, to present a historic sculpture from the Royal Ontario Museum in a new way. This gives those who may never have a chance to see it in person the ability to view it from all angles nonetheless.

Instant NeRF of inside NVIDIA HQ

(Image credit: NVIDIA)

With tools like Instant NeRF, it’s clear that NVIDIA’s latest hardware has much more than just gamers in mind. With more and more dedicated AI power built into each chip, NVIDIA RTX GPUs are bringing new levels of AI performance to the table that can serve gamers and creators alike. 

The same Tensor Cores that make it possible to infer what a 4K frame in a game would look like using a 1080p frame as a reference are also making it possible to infer what a fully fleshed out 3D scene would look like using a series of 2D images. And NVIDIA’s latest GPUs put those tools right into your hands. 

Instant NeRF isn’t something you just get to hear about. It’s actually a tool you can try for yourself. Developers can dive right in with this guide, and less technical users can grab a simpler Windows installer here which even includes a demo photo set. Since Instant NeRF runs on RTX GPUs, it’s widely available, though the latest RTX 40 Series and RTX Ada GPUs can turn out results even faster. 

The ability of NVIDIA’s hardware to accelerate AI is key to powering a new generation of AI PCs. Instant NeRF is just one of many examples of how NVIDIA’s GPUs are enabling new capabilities or dramatically speeding up existing tools. To help you explore the latest developments in AI and present them in an easy-to-understand format, NVIDIA has introduced the AI Decoded blog series. You can also see all the ways NVIDIA is boosting AI performance at NVIDIA’s RTX for AI page. 

Nvidia finally catches up to AMD and drops a new app that promises better gaming and creator experiences

Nvidia has announced plans to bring together the features of the Nvidia Control Panel, GeForce Experience, and RTX Experience apps in a single piece of software. On February 22, Nvidia explained on its website that this new unified app is available as a public beta, meaning it’s still subject to change and refinement, but you can download it and try it for yourself right now.

The app is made specifically to improve the experience of gamers and creators currently using machines equipped with Nvidia GPUs by making it easier to find and use functions that formerly lived in separate programs. 

Users with suitable Nvidia GPUs can expect a number of significant improvements from this new centralized app. Settings for optimizing your gaming experience (tweaking graphical settings based on your hardware) and tools for downloading and installing new drivers can now be found in one easy interface.

It’ll be easier to understand and keep track of driver updates, such as new features and fixes for bugs, with clear descriptions. While in-game, users should see a redesigned overlay that makes it easier to access features and tools like filters, recording tools, monitoring tools, and more. Speaking of filters, Nvidia is introducing new AI Freestyle Filters which can enhance users’ visuals and allow them to customize the aesthetics of their games. As well as all of these upgrades, users can easily view and navigate bundles, redeem rewards, get new game content, view current GeForce NOW offers, and more.

Screenshot of the webpage where users can download the Nvidia app beta

(Image credit: Future)

Nvidia's vision

It certainly seems like Nvidia has worked hard to create a more streamlined app that makes it easier to use your RTX-equipped PC. It’s specifically intended to simplify tasks like keeping your PC updated with the latest Nvidia drivers and quickly discovering and installing other Nvidia apps, including Nvidia Broadcast, GeForce NOW, and more. The Nvidia team also claims in its announcement that the new centralized app will perform better on RTX-GPU-equipped PCs than its separate predecessors, thanks to shorter installation times, a more responsive user interface (UI), and a smaller disk footprint than its predecessors (combined, I assume).

This isn’t the end of the new Nvidia app’s development, and it seems some legacy features didn’t make the cut because they see less use, including 360/Stereo photo modes and streaming directly to YouTube and Twitch. Clearly, Nvidia felt these more niche features weren’t worth including in the new app, and anyone who wants to keep using them can stick with the older apps (for now, at least). The new app is focused on improving performance and making it easier to install and integrate new features into users’ systems.

An Nvidia GeForce RTX 2060 slotted into a PC with its fans showing

(Image credit: Future)

By combining its apps into one easy-to-use piece of software, Nvidia is finally catching up to AMD in one area where Team Red has the advantage: software. AMD’s Radeon Adrenalin app already offers many of these features, as well as others, like a built-in browser and HDMI link assurance that monitors for and automatically detects HDMI connectivity issues – all in a single interface.

Finally, AMD doesn’t require users to make an account to use its app. We don’t expect Nvidia to fully catch up to AMD’s app just yet (though it would be nice not to have to sign in), but this is definitely a push in the right direction, and hopefully users will get a lot of use out of the new app.

Nvidia explains how its ACE software will bring ChatGPT-like AI to non-player characters in games

Earlier this year at Computex 2023, Nvidia revealed a new technology during its keynote presentation: Nvidia ACE, a ‘custom AI model foundry’ that promised to inject chatbot-esque intelligence into non-player characters in games.

Now, Nvidia has more to say about ACE: namely, NVIDIA NeMo SteerLM, a new technique that will make it easier than ever before for game developers to make characters that act and sound more realistic and organic.

We’ve heard about NeMo before, back when Nvidia revealed its ‘NeMo Guardrails’ software for making sure that large language model (LLM) chatbots such as the ever-present ChatGPT are more “accurate, appropriate, on topic and secure”. NeMo SteerLM acts in a similar but more creative way, allowing game devs to ‘steer’ AI behavior in certain directions with simple sliders; for example, making a character more humorous, or more aggressive and rude.
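
Here’s a rough sketch of what slider-style steering could look like from a developer’s side, assuming an attribute-annotated prompt format; the attribute names, the 0–9 scale, and the prompt layout below are invented for illustration and are not Nvidia’s actual API:

```python
# Hypothetical sketch of attribute-conditioned prompting in the spirit of
# SteerLM: slider values are serialized into the prompt so that a model
# trained on such annotations shifts its style accordingly. The format and
# attribute names are assumptions, not Nvidia's real interface.

def build_steered_prompt(player_line, attributes):
    """Prepend slider values (0 = none, 9 = maximum) to the player's line."""
    knobs = ",".join(f"{name}:{level}" for name, level in attributes.items())
    return f"<attributes {knobs}> Player says: {player_line}\nNPC replies:"

prompt = build_steered_prompt(
    "Have you seen anything strange around the docks?",
    {"humor": 8, "aggression": 1, "verbosity": 4},
)
print(prompt)
```

A game could re-issue the same dialogue with different slider values as a character’s mood changes, which is the appeal of steering over retraining a model for every personality.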

I was a bit critical of NeMo Guardrails back when it was originally unveiled, since it raises the question of exactly who programs acceptable behaviors into AI models. In publicly accessible real-world chatbot tools, programmer bias could lead to AI-generated responses that offend some while appearing innocuous to others. But for fictional characters, I’m willing to believe that NeMo has huge potential. Imagine a game world where every character can truly react dynamically and organically to the player’s words and actions – the possibilities are endless!

The problems with LLMs in games

Of course, it’s not quite as simple as that. While SteerLM does promise to make the process of implementing AI-powered NPCs a lot more straightforward, there are still issues surrounding the use of LLMs in games in general. Early access title Vaudeville shows that AI-driven narrative games have a long way to go, and that’s not even the whole picture.

LLM chatbots such as ChatGPT and Bing AI have proven in the past that they’re not infallible when it comes to remaining on-topic and appropriate. Indeed, when I embarked on a quest to break ChatGPT, I was able to make it say things my editor sadly informed me were not fit for publication. While tools such as Nvidia’s Guardrails can help, they’re not perfect – and as AI models continue to evolve and advance, it may become harder than ever to keep them playing nice.

Even beyond the potential dangers of introducing actual AI models into games – let alone ones with SteerLM’s ‘toxicity’ slider, which on paper sounds like a lawsuit waiting to happen – a major stumbling block to implementing tools like this could actually be hardware-related.

Screenshot of 'Jin the ramen shop owner', an AI-generated non-player character.

Nvidia’s Computex demo of ‘Jin the ramen shop owner’ was technologically impressive but raises a lot of questions about AI in games. (Image credit: Nvidia)

If a game uses local hardware acceleration to power its SteerLM-enhanced NPCs, performance will hinge on how well the player’s hardware handles AI workloads. That introduces an entirely new headache for game devs and gamers alike: inconsistency in game quality dependent not on anything the developers can control, but on the hardware used by the player.
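
For illustration, here’s one way such hardware-dependent tiering might look in practice; the tier names and VRAM thresholds are invented for this sketch, and it assumes PyTorch simply as a convenient way to query the GPU:

```python
import torch

# Hypothetical sketch: pick an NPC-dialogue tier from the player's GPU.
# The tier names and VRAM cutoffs are invented; the point is that game
# quality would hinge on hardware the developer doesn't control.

def pick_npc_dialogue_tier():
    if not torch.cuda.is_available():
        return "scripted"      # no capable GPU: fall back to canned dialogue
    vram_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
    if vram_gb >= 16:
        return "large-llm"     # run a full local language model
    if vram_gb >= 8:
        return "small-llm"     # run a quantized or distilled model
    return "scripted"

print(pick_npc_dialogue_tier())
```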

According to the Steam Hardware Survey, the majority of PC gamers are still using RTX 20-series or older GPUs. Hell, the current top spot is occupied by the budget GTX 1650, a graphics card that lacks the Tensor cores RTX GPUs use to carry out high-end machine-learning processes. The 1650 isn’t incapable of running AI-related tasks, but it’s never going to keep up with the likes of the mighty RTX 4090.

I’m picturing a horrible future for PC gaming, where your graphics card determines not just the visual fidelity of the games you play, but the quality of the game itself. For those lucky enough to own, say, an RTX 5000 GPU, incredibly lifelike NPC dialogue and behavior could be at your fingertips. Smarter enemies, more helpful companions, dynamic and compelling villains. For the rest of us, get used to dumb and dumber character AI as game devs begin to rely more heavily on LLM-managed NPCs.

Perhaps this will never happen. I certainly hope so, anyway. There’s also the possibility of tools like SteerLM being implemented in a way that doesn’t require local hardware acceleration; that would be great! Gamers should never have to shell out for the very best graphics cards just to get the full experience from a game – but I’ll be honest, my trust in the industry has been sufficiently battered over the last few years that I’m braced for the worst.

Nvidia fixes a weird GPU driver bug that tanked CPU performance

Nvidia’s GPU driver was recently found to have a bug that was spiking processor usage after quitting out of a game, but the good news is this problem has been fixed in a freshly released hotfix.

Neowin reports that the affected GeForce driver version, 531.18, now has a hotfix (531.26), and it cures two issues including the gremlin that was eating CPU resources.

This was an odd bug which saw an Nvidia Container process hang around after you’d stopped playing a game and exited. Going into Task Manager, gamers were seeing CPU resources being eaten up to the tune of 10% or even 15%, causing some slowdown to the host gaming PC.
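
If you’d rather not eyeball Task Manager, a few lines of Python using the third-party psutil package can flag the same symptom; this is a general runaway-process check, not an Nvidia tool, and the 10% threshold simply mirrors the figures reported for this bug:

```python
import time
import psutil  # third-party: pip install psutil

# Flag any process using 10%+ CPU, the symptom of the stuck
# "NVIDIA Container" process described above.

procs = list(psutil.process_iter(["name"]))
for p in procs:
    p.cpu_percent(None)        # prime the counters; the first call returns 0.0
time.sleep(1.0)                # measure over a one-second window
for p in procs:
    try:
        usage = p.cpu_percent(None)
        if usage >= 10.0:
            print(f"{p.info['name']} (pid {p.pid}) is using {usage:.0f}% CPU")
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass                   # the process exited or is off-limits
```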

If you didn’t open Task Manager and notice this process, then manually close it, your machine could run rather sluggishly and you’d have no idea why.

Still, the cure has arrived, and if you were holding off on version 531.18 because of this bug, you can now update to the 531.26 hotfix instead.

Analysis: Notebook crashing blues also fixed

This fix has been deployed quickly, which is good to see. Nvidia chose the route of a hotfix because that can be pushed out immediately to those with GeForce graphics cards, rather than having to wait for a cure bundled with the next version of Team Green’s graphics driver.

The hotfix also comes packing a resolution for a second problem: a random crash (stop error) affecting some laptops with GeForce GTX 10 Series or MX250 / MX350 mobile GPUs.

Both of these are quite nasty little glitches, so it’s good to see Nvidia stamp them out swiftly. Indeed, the lingering Nvidia Container bug caused enough noticeable slowdown that slightly more paranoid types may even have wondered whether malware was to blame, since sudden system slowdown or unresponsiveness can be a symptom of infection – in this case, they’d have worried unduly.

Little-known Japanese CPU threatens to make Nvidia, Intel and AMD obsolete in HPC market

Sandia National Laboratories has announced it will be the first of the US Department of Energy’s labs to deploy the Fujitsu A64FX, the only ARM-based processor designed from the ground up for HPC projects and supercomputers.

Fujitsu is known primarily for its business laptops, tablets and desktops, but is a behemoth in its own right when it comes to processors, having been in the business for well over half a century.

Launched in 2019, the A64FX has 48 cores, runs at 2.2GHz, offers a theoretical peak of 3.38 TFLOPS (double precision), and packs 32GB of HBM2 memory in the same package as the CPU.
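
That 3.38 TFLOPS figure checks out against the chip’s published layout, assuming two 512-bit SVE pipelines per core, each completing a fused multiply-add (two FLOPs) across eight double-precision lanes every cycle:

```python
# Back-of-the-envelope check of the quoted 3.38 TFLOPS peak.
cores = 48
clock_ghz = 2.2
sve_pipes_per_core = 2        # two 512-bit SVE units per core
fp64_lanes = 512 // 64        # eight 64-bit lanes per 512-bit vector
flops_per_fma = 2             # a fused multiply-add counts as two FLOPs

peak_gflops = cores * clock_ghz * sve_pipes_per_core * fp64_lanes * flops_per_fma
print(f"{peak_gflops / 1000:.2f} TFLOPS")   # -> 3.38 TFLOPS
```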

What makes it ideal for the HPC market is that it provides far higher bandwidth between the CPU and its memory – up to 1TB/s. Moving data to and from the CPU is by far the biggest obstacle to what researchers refer to as exascale computing.
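
To see why bandwidth, rather than raw compute, is the bottleneck, consider a simple streaming kernel such as a[i] = b[i] + s * c[i]: it moves 24 bytes per element for just two FLOPs, so even at 1TB/s it tops out far below the chip’s compute peak. A quick back-of-the-envelope calculation:

```python
# Roofline-style estimate for a bandwidth-bound streaming kernel
# (a[i] = b[i] + s * c[i]) at the A64FX's quoted 1 TB/s.
bandwidth_bytes_per_s = 1e12    # 1 TB/s between CPU and memory
bytes_per_element = 24          # two 8-byte reads plus one 8-byte write
flops_per_element = 2           # one multiply and one add

bound_gflops = bandwidth_bytes_per_s / bytes_per_element * flops_per_element / 1e9
print(f"{bound_gflops:.0f} GFLOPS")   # ~83 GFLOPS, a fraction of the 3,379 peak
```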

What makes the A64FX even more exciting is that Fujitsu wants the technology to trickle down to hyperscalers and major cloud computing giants so that the masses can benefit too.

Given it’s based on the ARM architecture, it can run Linux distributions out of the box (and already has), and even Microsoft Windows.

It is considered a general-purpose CPU, but it surpasses even GPUs from Nvidia and AMD on the all-important metric of performance per watt. Indeed, a 768-CPU prototype sits on top of the Green500 list – the leaderboard for the world’s most energy-efficient supercomputers.

The A64FX was designed expressly to power the successor to Japan’s flagship supercomputer, the K computer, which was decommissioned in August 2019.

Its replacement, the Fugaku, is expected to be 100 times faster when it launches later this year; it will run a lightweight kernel called McKernel alongside Linux and reach a staggering 400 petaflops. The aim is for it to be the first supercomputer to hit one exaflop once fully deployed, with half a million processors buzzing.
