Some are resolvable but some aren't at the moment of testing. I don't think it hangs the same way with regular IPs when those IPs are unreachable, but maybe it does? Maybe it should also bail out sooner when mDNS names aren't resolvable? mDNS Art-Net is kind of niche, but nice to have. Could it check resolvability less frequently than once per packet? I also seem to remember that the sACN and Art-Net implementations differed in how they handled unresolvable mDNS addresses: I've had installations where switching from one protocol to the other, all else equal, fixed the hang or delay issue. For now I'll probably put this on the back burner again; next time I notice it on an installation I can collect more information and do real A/B testing.
In my testing, it is only mDNS addresses that stall (not IP addresses), and my guess is that's because of the DNS lookup needed to resolve the address. However, the stall occurs in an OS-level network call, so I can't say for certain. I'll look into how we can avoid having the network thread stall the main thread in this case, perhaps by aborting the call early if possible.
I also do seem to remember that the sACN and Art-Net implementations had a difference in how they handled those unresolvable mDNS addresses.
Hm, interesting. The code path where it stalls is shared by all network operators (including both sACN and Art-Net), but the stall does scale with the number of unique addresses in use, so perhaps that's what happened here.
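As an aside, one common way to keep a blocking hostname lookup from stalling the caller is to run the resolution on a worker thread and cache the result, polling with a short timeout each frame. This is just a hedged sketch of that pattern in plain Python (not TouchDesigner's actual implementation; `resolve_nonblocking` and the cache are hypothetical names):

```python
import socket
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout

_executor = ThreadPoolExecutor(max_workers=4)
_cache = {}  # hostname -> IP string, None (unresolvable), or a pending Future


def resolve_nonblocking(hostname, timeout=0.05):
    """Resolve `hostname` without stalling the caller longer than `timeout` s.

    Returns the cached IP once known, None while still resolving (or if
    resolution failed). The actual lookup runs on a background thread,
    so an unresolvable mDNS name can't hang the calling thread.
    """
    entry = _cache.get(hostname, "miss")
    if entry == "miss":
        # First request: kick off the lookup in the background.
        _cache[hostname] = _executor.submit(socket.gethostbyname, hostname)
        entry = _cache[hostname]
    if isinstance(entry, str) or entry is None:
        return entry  # already resolved (or known-unresolvable)
    # entry is a pending Future: poll it briefly, never block for long.
    try:
        _cache[hostname] = entry.result(timeout=timeout)
    except FutureTimeout:
        return None  # still resolving; try again on the next packet/frame
    except socket.gaierror:
        _cache[hostname] = None  # unresolvable; remember and stop retrying
    return _cache[hostname]
```

A caveat with this approach: a name that later becomes resolvable stays cached as unresolvable until the cache entry expires, so a real implementation would probably retry failed lookups on a slow interval rather than never.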
Hi
I noticed a big oddity; the simplest example is a single point with RGB channels. DMXOutPOP sends a huge number of packets to the network, about 700-1100 per second. If you enable ArtSync, the count grows much further, to 10000+. I've seen this in ArtNetominator. Build 2025.31760.
At the same time, using a simple setup with DMXOutCHOP, I get stable results when set to 30 FPS: 30 packets/s. I also verified this with the DMX Workshop program. Maybe the problem lies somewhere here.
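For anyone who wants to double-check the packet rate without ArtNetominator, a small counter on the Art-Net UDP port (6454 per the Art-Net spec) works too. A minimal sketch; the `count_packets` helper is hypothetical, purely for diagnosis:

```python
import socket
import time

ARTNET_PORT = 6454  # standard Art-Net UDP port


def count_packets(seconds=5.0, port=ARTNET_PORT):
    """Count Art-Net datagrams arriving on `port` and return packets/s."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    sock.settimeout(0.1)
    count = 0
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        try:
            data, _addr = sock.recvfrom(2048)
        except socket.timeout:
            continue
        # Every Art-Net packet begins with the ID string "Art-Net\0".
        if data.startswith(b"Art-Net\x00"):
            count += 1
    sock.close()
    return count / seconds


if __name__ == "__main__":
    print(f"~{count_packets(5.0):.0f} Art-Net packets/s")
```

Note this only sees packets that actually reach the machine it runs on (broadcast or unicast to it), so a capture tool like Wireshark is still the more complete check.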
This is a bug with 2025.31760 that will be fixed in the next official build we release. For now I recommend reverting to 2025.31550. Apologies for the inconvenience.
