SYNC | cine camera shutter + frame playback on an LED display

hi everyone!

this is an issue i have been trying to solve for a long time with no success, and it would be great to develop a solution so we can keep using our system in touchdesigner.
the goal is to sync both the hardware and the frame content between an LED display and a cine camera (such as an Alexa, RED, Sony Venice, etc). for this first step we can consider a single output of 4k content (h265, ProRes, etc); later we will need a cluster of GPUs genlocked in both frame and hardware for 8k+ outputs.

my first setup was to use an AJA GEN10 to genlock the video processor, the LED sending card, and the camera at 1080PsF/24, and to test vsync both on the GPU (an RTX A6000, set to On, Adaptive, or Off) and on the Window COMP (vsync On or Off). i am also using nvidia hardware decode with h265. in this configuration the genlock sometimes lands in time and sometimes drifts off. i tried every combination for days and, unfortunately, it didn't work.
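for reference, this is roughly how i am switching those settings from a script while testing the combinations. the operator names, and the parameter names `vsyncmode` and `hwdecode`, are my guesses from the parameter labels, so treat this only as a sketch:

```python
# sketch only: 'window1' / 'moviefilein1' are placeholder operator names,
# and 'vsyncmode' / 'hwdecode' are assumed python names taken from the UI labels
win = op('window1')          # Window COMP driving the LED output
mov = op('moviefilein1')     # Movie File In TOP playing the 4k h265 file

win.par.vsyncmode = 'on'     # toggled between the vsync options while testing
mov.par.hwdecode = True      # keep nvidia hardware decode enabled for h265

# match TD's target cook rate to the 24 fps content (standard TD python)
project.cookRate = 24
```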

we also tried sending the reference from the AJA GEN10 to an nvidia Quadro Sync II connected to the A6000, passing it through an OptiTrack eSync 2 so the signal arrives at 48hz (not PsF). even with the GPU locked to the genlock signal we didn't see any improvement.

my second attempt was in Unreal, making the software work from the “ref in” of a DeckLink 4K SDI fed with the 1080PsF/24 genlock signal. this makes the software a slave of that clock (timestamps) and locks the output frames to the shutter of the camera, and that is exactly what i need in touchdesigner. is there any way to make the ref in of the DeckLink 4K the master of touchdesigner's fps? or is it possible to make the movie's playback steps slave to that reference?
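to be concrete about what i mean by making the movie a slave of that reference, this is the behaviour i am imagining, written as an Execute DAT frame callback. the operator names, and the idea of reading a frame counter from an Info CHOP on the capture input, are assumptions on my side, not an existing feature:

```python
# Execute DAT -> onFrameStart callback
# sketch: step the movie index once per frame received on the reference input,
# instead of following TD's internal clock.
# 'info1' is an Info CHOP pointed at the Video Device In TOP receiving the
# genlocked signal; the channel name 'captured_frames' is a guess -- use
# whatever frame counter your build actually exposes.

def onFrameStart(frame):
    received = int(op('info1')['captured_frames'].eval())  # frames delivered by the ref input
    mov = op('moviefilein1')                                # Play Mode set to 'Specify Index'
    mov.par.index = received                                # playback steps with the reference
    return
```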

if you have any other ideas or solutions, i would be pleased to hear them.

thank you! :slight_smile:

Hey, thanks for your question.
Yeah, the GEN10 (or any signal generator) controls the phase/interval of when work is done, but it doesn't control what is output. Different devices have different frame queues, so although they all process frames at the same time, the frame each one is currently processing may be offset from the others.

We do support a similar workflow to Unreal using the ‘Sync To Input Frame’ mode on the Video Device In TOP. However, that is currently only supported on AJA cards.
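Roughly, with an AJA device selected on a Video Device In TOP, enabling that mode from Python looks something like the sketch below. The parameter name is an assumption taken from the parameter label, so check the operator's parameter page for the exact python name:

```python
# Sketch only: 'syncinputframe' is an assumed python name based on the
# 'Sync To Input Frame' label on the Video Device In TOP.
vidin = op('videodevin1')            # Video Device In TOP reading the AJA input

vidin.par.syncinputframe = True      # assumed name for 'Sync To Input Frame'

# With this mode active, TouchDesigner waits for each incoming frame before
# cooking, so the project's output steps in lock with the reference signal.
```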

@novasfronteiras may I ask what output device you are using? Are you outputting frames directly from the nvidia outputs, or through a video I/O card?