Hello everyone, I’m new here and I have a question. My NVIDIA Background isn’t detecting CUDA even though I’ve installed all the necessary drivers and SDKs.
I’m using an RTX 5080 graphics card — I hope someone can help me with this.
Hey, can you tell us what build you are using? That TOP does not work in the 2023.10000 series for the newest 5070/5080 etc. GPUs, due to a large change in how it operates. It does work in the 2025.30000 series, but requires an extra external installation from Nvidia.
I tried it. Thanks for replying to me.
I have used it on both the 2023 and the latest 2025 versions, but it still doesn’t work. Currently, I can only use it through NVIDIA Broadcast, but I’m facing an issue where, after using it for a while, the software freezes and I can’t close it.
My laptop specifications:
CPU: Ultra 9
GPU: RTX 5080
RAM: 64GB
SSD: 2 × 1TB SSDs
In 2025.31500, can you tell me what error you see when you middle-click on the node?
Thanks, can you confirm the Nvidia Driver version you have installed from nvidia.com?
Can you also place down a Monitors DAT and show me its full contents (including the columns to the far right)?
New update: I just updated to the latest NVIDIA driver and reinstalled the new version of TouchDesigner, and it's finally working! However, it seems that using NDI via NVIDIA Broadcast gives better results: the background and the person are separated more cleanly.
But now I’ve run into another issue: when I connect through NDI while using background separation, the software freezes and can’t be closed even from Task Manager. I’ve just uninstalled Armoury Crate and similar support software from my computer, hoping that it helps.
If you know anything about this issue, please give me some advice. Sincere thanks to you and your team.
Which software freezes?
Also can you post any video or frames that show the difference? We are using the same AI models, so the results should be similar.
Sorry for the late reply. I have a few updates after using it in my project. After I updated everything, TouchDesigner has been running very stably and no longer crashes unexpectedly. However, I still think that background separation using NVIDIA Broadcast looks cleaner than when we use NVIDIA Background. I've attached a clip below, and as you can see around the edges, when I use NVIDIA Broadcast it looks smoother and much neater compared to NVIDIA Background. However, this option is limited since it can't receive a source from SDI, so I hope you can take a look; maybe something can be improved.
Link: Clip
I can confirm that there is a problem with Nvidia Background Removal in TouchDesigner.
In a recent project, we had to decide whether to use it, and it was impossible because the response in both Quality mode and Performance mode was unworkable at a professional level. The segmentation result in TouchDesigner is very poor, with numerous artifacts. We compared the results with Notch, OBS, and Nvidia Broadcast; in those cases, the result is essentially perfect and similar across all three.
Mask tests as a comparison between Touchdesigner and Notch:
TouchDesigner version: 2025.31550
Graphics card: Nvidia 5070 Ti with the latest drivers
Nvidia Broadcast Video Effects SDK: 0.7.6
Thanks for the comparison. I’ve made a task to look into it further here and we will let you know.
As a long shot, I’d be curious if you see any better results if you do a Y flip on the image before running it through the Nvidia Background TOP. A few years ago we discovered that the AI model being used is very sensitive to the vertical direction of the image and our internal pipeline was sending it inversely to what it expected. This was resulting in a similar quality reduction to what you are reporting. At that time we added an internal flip to correct for this, but I’m wondering if something has changed there.
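In case it's useful to anyone following along, here is a minimal Python sketch of that test. The node names ('videodevin1' for the camera source, 'nvbackground1' for the Nvidia Background TOP) are assumptions about your network, so adjust them to match; the same wiring can just as easily be done by hand in the network editor.

```python
# Minimal sketch of the Y-flip test. Node names below are assumptions
# about the network layout; adjust to whatever you actually use.
src  = op('videodevin1')      # camera / video source TOP
nvbg = op('nvbackground1')    # Nvidia Background TOP

# Create a Flip TOP, enable Flip Y, and wire it between the source
# and the background TOP.
flip = parent().create(flipTOP, 'flip_y_test')
flip.par.flipy = True
flip.inputConnectors[0].connect(src)
nvbg.inputConnectors[0].connect(flip)
```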
Hi Rob, I’m sending you the test with Y flip before NVBR TOP. It looks even worse.
Y flip: link
Thanks for trying that. That is what I expected since it should have been fixed a few years ago, but I was just wondering if somehow it had broken again in an update.
We’ll keep digging on our end and see what we can find.
I’ve been doing some testing using the Nvidia Broadcast tool and looking at your videos and I’m wondering if the problem is that the mask and color images are out of sync. TouchDesigner’s background removal model is running in a separate thread, which means the results can be a frame behind the original color image.
I can see the effects of that fairly clearly in @lectromind's composited video: when you step through frame by frame, you can see the outline catch up to the musician when they move.
Does this sound like the effect you guys are seeing? You can compensate for this a bit by using a Cache TOP with the Output Index set to -1 to delay the color stream by a frame before compositing.
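If you want to try that, here is a minimal Python sketch of the delay compensation, with the same caveat that the node names ('videodevin1', 'nvbackground1', 'comp1') and the exact Cache TOP parameter names are assumptions you may need to adjust for your own network:

```python
# Minimal sketch: delay the color stream by one frame so it lines up with
# the mask, which the threaded background model delivers a frame late.
# Node names are assumptions; adjust to your network.
src  = op('videodevin1')      # color source TOP
nvbg = op('nvbackground1')    # Nvidia Background TOP (mask output)
comp = op('comp1')            # Composite TOP doing the final mix

cache = parent().create(cacheTOP, 'color_delay')
cache.par.cachesize   = 2     # keep the current and previous frame
cache.par.outputindex = -1    # output the previous frame
cache.inputConnectors[0].connect(src)

# Composite the delayed color with the mask from the background TOP.
comp.inputConnectors[0].connect(cache)
comp.inputConnectors[1].connect(nvbg)
```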
We have discussed having an inline mode for the Background TOP that would bypass the threading, which would fix this issue as well. The threading model was more important for avoiding delays when we first implemented the TOP and it had to run through the CPU, but now the images all remain on the GPU, so it shouldn't have as much impact on performance.
Rob, thanks for the research.
The problem is not that the mask and the color image are out of sync. When I sent the tests, I did not include the Cache TOP as explained in the snippets.
The problem I see is directly related to the mask created by the Nvidia Background TOP: those "artifacts/noise" that appear intermittently within the mask, in both Quality mode and Performance mode, regardless of which one I use.
This is a comparison of masks without compositions or processing between NOTCH (left) and TOUCHDESIGNER (right):
My question is, if Notch/OBS/Nvidia Broadcast/TouchDesigner all use the same SDK, shouldn't the mask be exactly the same? As I said at the beginning, in the others it's the same or similar, but in TouchDesigner the resulting mask has those artifacts, especially with moving images. This result has made it difficult for us to use TouchDesigner for live music work, for example, forcing us to use Notch or to send the result directly from OBS Studio to TouchDesigner via Spout.
The idea is to be able to use TouchDesigner exclusively, and I think that if the mask processing in Nvidia Background were similar or identical to the results of the other tools tested, the tool would be very powerful.
I hope this helps.
