I’m trying to build a setup to feed a video wall with 7 screens from a single GPU (Quadro P620 V2).
My first approach was to connect 3 screens directly to the GPU and 4 screens to an MST hub (mDP to 4x DP Multi Monitor Adapter/MST - DisplayPort & Mini DisplayPort Adapters | StarTech.com Sweden), but I only managed to feed 4 screens. The other 3 appear as connected in Windows but have no resolution and are unavailable.
Maybe it’s a GPU limitation on the maximum number of supported screens. I thought the MST hub would be seen by the GPU as a single screen, but apparently it isn’t.
Maybe it would work with a QuadHead2Go. Is the QuadHead2Go seen by the GPU as a single output/screen?
Any advice for this setup?
Not all MST hubs are created equal; some eat up two of NVIDIA’s display slots. I’d look into video wall controllers, an AJA box, or the “pro” solution, a Datapath.
You should use these: https://www.datapath.co.uk/datapath-products/video-wall-controllers/datapath-fx4/
I would suggest staying away from the QuadHead2Go. Some people like them but I’ve had many bad experiences with them in the past.
Thanks for the input, I’ll look into the video wall controllers and the other solutions.
With the FX4 solution, can we define the input/output resolution?
If I feed the FX4 from TD with 3840x1200, can I get 1920x600 (the resolution of the screens) on the outputs?
I was reading the FX4 docs and can’t find this kind of info about supported resolutions.
@matthewwachter @drmbt anyone with experience with the FX4: can we get 1920x600 on the outputs?
Another option would be an NVS 810 (https://www.pny.com/nvidia-nvs-810-for-8-dp-displays), if I can still find that product for sale.
Thanks, I’ll keep digging.
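For what it’s worth, the split arithmetic itself is straightforward: a 2x2 splitter divides the input canvas evenly, so a 3840x1200 input would yield four 1920x600 quadrants. A minimal sketch in plain Python (function name is my own, not an FX4 API):

```python
# Compute the crop region each output of a 2x2 splitter would carry,
# assuming the device divides the input canvas evenly into quadrants.

def quadrant_crops(in_w, in_h, cols=2, rows=2):
    """Return (x, y, w, h) crop rectangles, one per output, row-major."""
    w, h = in_w // cols, in_h // rows
    return [(c * w, r * h, w, h) for r in range(rows) for c in range(cols)]

# A 3840x1200 canvas split 2x2 gives four 1920x600 regions:
for crop in quadrant_crops(3840, 1200):
    print(crop)
```

Whether the FX4 actually accepts 3840x1200 in and 1920x600 out is a question for its EDID/mode tables, not this math.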
Yeah I think you can. We’ve done some pretty strange resolutions coming out of those units. I think it will let you capture the EDID and even edit it if you desire.
I have done something with 2x 1080 Tis and SLI, although I don’t think the SLI is really needed. Might be cheaper than a Datapath at this point.
This could be old information, but it’s my understanding that with two cards you don’t actually double your outputs; you just get one extra output if they aren’t Quadro cards. Someone might be able to prove that wrong, but that’s an IIRC.
Or if you are feeling super cheap, get something like this and do all your complex mapping in TD: https://www.amazon.com/dp/B0BJBM681H
@Peeet Hmmm… looks interesting. Have you used it successfully?
Just taking a quick read, there seem to be 2 things to consider:
Questionable QC, as several reviews mention it dying pretty quickly. One part of the cost of reputable brands is often the QC.
Only 1080p in and out, no 4K splitting!
Fair point. I just checked it again, and while it does TELL the computer that it is receiving 3840x2160, I think it is actually immediately downscaling that to 1920x1080, THEN splitting the 1920x1080 into four 960x540 quadrants, then upscaling each 960x540 to 1920x1080 for output on each HDMI port, which seems like more work than just splitting the 4K signal properly. As for how long it lasts, I can’t say, as I’ve only used it once, but I will say “you get what you pay for” :-/
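If I follow that chain correctly, each output pixel is ultimately sourced from a 960x540 quadrant, i.e. a quarter of the advertised detail per screen. Sketching the numbers (assuming the downscale-split-upscale behaviour described above):

```python
# Trace the resolution chain of the cheap splitter as described:
# it advertises 4K in, but (apparently) downscales to 1080p first,
# splits that into four quadrants, then upscales each back to 1080p.

def cheap_splitter_chain(in_res=(3840, 2160)):
    internal = (in_res[0] // 2, in_res[1] // 2)      # downscale 4K -> 1080p
    quadrant = (internal[0] // 2, internal[1] // 2)  # split 2x2 -> 960x540
    output = (quadrant[0] * 2, quadrant[1] * 2)      # upscale -> 1080p per port
    return internal, quadrant, output

print(cheap_splitter_chain())  # ((1920, 1080), (960, 540), (1920, 1080))
```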
That being said, there are a couple other “2x2 Video Wall Splitter” options that come and go on Amazon, and over the years I have bought a few of them. Some of the more expensive ones ($200-400 range) do actually work fine and legitimately split a 4k image into four 1080p ones, but your mileage may vary. Also don’t expect things like frame sync between those four outputs or any other “fancy” features.
If you want to try getting one of the $200-300 ones to mess around with, more power to ya, but if you’re trying to do this for real, you should have the budget for a proper Datapath or something.
My personal preference has always been to have one GPU output per display rather than using intermediate multiplication technologies like the QuadHead2Go and such. I’ve used this setup with up to 12 outputs in TouchDesigner and it has always worked quite well with multiple identical nVidia Quadro cards. Even cheap ones will do if you’re doing 2D stuff and your GPU requirements aren’t too high (P1000s have worked well for me in the past on such projects).
My only complaints when doing so have been the Windows vs TD display numbering mismatch, which can become a pain if you are using multiple Window OPs in TD (generally a big NO! NO!), and arranging the displays in the proper order and orientation in the Windows Display Settings, which can also become a pain with several displays when you have to connect/disconnect them often. Managing EDIDs through the nVidia Control Panel can sometimes help in this regard.
If using discrete GPU outputs for all your displays, I would first try arranging all the monitors in the Windows Display Settings as if they weren’t rotated by 45°, position them as they are in physical space (portrait next to landscape with another portrait offset above the landscape, etc.), and perform the final 45° rotation at render time in TD.
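That render-time rotation is just a 2D rotation about each display’s centre; in TD you’d typically do it with a Transform TOP or a camera/geometry transform, but the math sketched in plain Python (function name is mine):

```python
import math

# Rotate a point about a display's centre, e.g. to counter-rotate
# content for a physically 45-degree-rotated screen at render time.

def rotate_about(px, py, cx, cy, degrees):
    rad = math.radians(degrees)
    dx, dy = px - cx, py - cy
    return (cx + dx * math.cos(rad) - dy * math.sin(rad),
            cy + dx * math.sin(rad) + dy * math.cos(rad))

# Where a corner of a 1920x600 display lands after a 45-degree
# rotation about its centre:
print(rotate_about(1920, 600, 960, 300, 45))
```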
Just curious: why is using multiple Window OPs a big no-no? Is it OK to have multiples if they are in COMPs with cooking turned off so only one is on at a time? Sometimes I like to use Window COMPs so I can manipulate the network while watching the output on a second monitor.
Frankly, I do it all the time, especially when trying to whip something up quickly and the end result doesn’t have to look perfect, or if all the displays are showing different content and occasional frame drops are acceptable. I’ve always used nVidia Quadro cards and don’t know if this works as well with other GPUs.
However, if your displays are all part of a single image, and especially if they are next to each other, then you risk performance and/or tearing issues if you use multiple Window COMPs. Apparently it also varies with the GPU used.
So in other words, for a potentially large or demanding multi-screen project, I would try starting with a single Window COMP, because it can be painful going back afterwards and having to redo everything because the project was built on “shaky ground”.
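A single Window COMP spanning everything just needs the bounding box of all the monitor rectangles (in TD you’d read these from the Monitors DAT; here’s the idea in plain Python with illustrative rects):

```python
# Compute the bounding box a single spanning window would need to cover
# a set of monitor rectangles given as (x, y, width, height) in
# virtual-desktop coordinates. The rects below are made up for the example.

def spanning_window(monitors):
    left = min(x for x, y, w, h in monitors)
    top = min(y for x, y, w, h in monitors)
    right = max(x + w for x, y, w, h in monitors)
    bottom = max(y + h for x, y, w, h in monitors)
    return (left, top, right - left, bottom - top)

monitors = [(0, 0, 1920, 600), (1920, 0, 1920, 600), (0, 600, 1920, 600)]
print(spanning_window(monitors))  # (0, 0, 3840, 1200)
```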
See here, here and here for other arguments.
OK, got it. I will make one big single window with all the monitors in the future. Thanks for the info.