Use BMD / AJA signal as reference for TouchDesigner frame timer

Hi All,

Would it be possible for TouchDesigner to use the timing from an SDI signal on a BMD/AJA (etc.) capture card as its internal frame timing? This workflow is used in other packages, such as Unreal, to allow syncing the render process to a genlocked signal. I believe they use the actual video feed and not the reference port.

Doing this would allow AR/XR shows, or anything for TV, to be rendered in perfect software sync, with the reference port on the card keeping hardware sync.

Thanks,
Scott


You will need extra hardware for that:
https://docs.derivative.ca/Hardware_Frame_Lock

These are two different things. Hardware frame lock is certainly required for output. The vast majority of Unreal AR setups with just a single output use the input signal on the capture card to time the frame and keep everything in sync, while using consumer output cards.

It’s a tried and tested solution. If you have multiple output machines, then for sure you need to frame lock between machines.

This is very interesting, but I am not sure I completely understand the concept of syncing to a genlocked signal. I guess one part is to match TD's cook rate with the input video frame rate, which should essentially mean only one frame is rendered for each incoming frame. That is easy to do.

However, syncing TD to genlock would also mean that TD's internal timing would have to shift (in terms of phase) to match the timing of the input video signal, right?
This sounds like syncing to a genlocked signal would “only” make a difference in latency: TD would cook immediately once a new input frame becomes available, since their timing would be in sync.

Please correct me if I am wrong, but if this could help video I/O latency (in the case of Blackmagic / AJA cards), I would very much be a fan of such a feature.
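
For the rate-matching half, something like this should be enough (a rough TD Python sketch; project.cookRate is TD's global cook rate, while the input rate passed in below is just a hard-coded placeholder rather than something read from the card):

```python
# Rough sketch of rate matching only -- it does nothing about phase.
# project.cookRate is TD's global cook-rate member; the detected input
# rate is a placeholder here, not auto-detected from the capture card.

def match_cook_rate_to_input(detected_input_rate):
    # e.g. 59.94 for an NTSC-derived SDI feed, 50.0 for a PAL-derived one
    if abs(project.cookRate - detected_input_rate) > 0.001:
        project.cookRate = detected_input_rate
        print('cook rate set to', detected_input_rate)

match_cook_rate_to_input(59.94)   # placeholder value
```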

Hello,
I am quite curious about the topic of genlocking in terms of video I/O cards (such as Blackmagic, AJA, Bluefish…) and would like to ask for more information on how it actually works at the moment.

I guess right now TD doesn’t sync to genlocked video I/O cards when using Video Device In/Out TOP, right? That would mean the phase of the video I/O card is different from TD’s internal cooking phase (they could be set to the same refresh rate, but that doesn’t mean they are synced). Thinking about this further, it could also mean that due to very small differences in refresh rate, the phase between TD and the video I/O card might slowly shift and, after many hours, eventually cause a frame drop.

To better illustrate what I mean, let’s imagine that the internal clocks of both TD and the video I/O card aren’t perfect (I guess they never are) and that there is an extremely small deviation. For example, TD is running at 59.999999 Hz while the video card is running at 60.000001 Hz. You start up your system that needs to be super stable and everything seems to be running just fine. However, about half a million seconds later (roughly six days in) the phase has slipped by a full frame and a frame drop happens. This could be a nightmare, right? :smile: If I am right here, it could mean that your stable system isn’t really that stable (which is a problem).
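
Back-of-the-envelope math for that example (plain Python, using the hypothetical rates above):

```python
# Two free-running clocks drift apart at |f1 - f2| frames per second,
# so a full one-frame slip takes 1 / |f1 - f2| seconds.

f_td   = 59.999999   # TD's effective cook rate (Hz)
f_card = 60.000001   # capture card clock (Hz)

seconds_to_slip_one_frame = 1.0 / abs(f_td - f_card)   # ~500,000 s
print(f'{seconds_to_slip_one_frame:,.0f} s '
      f'(~{seconds_to_slip_one_frame / 86400:.1f} days)')
```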

Maybe this video could better visualize what I am trying to say. :slightly_smiling_face:
sync_drift.zip (67.4 KB)

I guess right now TD doesn’t sync to genlocked video I/O cards when using Video Device In/Out TOP, right?

And that kinda is the question. @ben Maybe you have some insight.
What I understand is that the driver takes care of capturing and transporting the captured frames to the CPU (or GPU with GPUDirect). Touch then takes the frames from the buffer and does stuff with them. The AJA driver, for example, is able to detect the frame rate and format and brings that information into Touch. So, in a way, it is always synced: as long as Touch runs at the same frame rate as the input, it is also synced.
But as you said, we normally will not run at a perfect 60 fps, neither the camera (59.94 for NTSC) nor the computer (which was running at 60.01 Hz on some projects, for whatever reason).
And this is where the hardware comes in: sync boards, and the black burst / reference input on the SDI grabber.
Basically, with these two we are no longer taking the timing from our devices’ internal clocks; instead, everyone is listening to one conductor piping out a burst at 60 Hz. Even if we do not get a perfect 60 Hz from the source, we all still fire at the same moment and in the same timeframe, so even if we accumulate a deviation because of non-matching frame rates, all devices reset on the next burst.
This also gets more and more interesting when working in broadcast, to reduce flickering and such.
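
A toy way to picture that difference (plain Python, nothing TD-specific; the one-second step and the instant reset are deliberate simplifications):

```python
# Toy model: a device whose clock runs slightly fast either free-runs
# (phase error keeps growing) or snaps back to the reference on each
# pass (error stays bounded). Numbers are illustrative only.

def phase_error_after(local_hz, ref_hz=60.0, seconds=3600, genlocked=False):
    error = 0.0                              # accumulated offset, in frames
    for _ in range(seconds):
        error += abs(local_hz - ref_hz)      # drift gained over one second
        if genlocked:
            error = 0.0                      # the reference burst re-aligns us
    return error

print('free-running, 1 h:', phase_error_after(60.01))                  # ~36 frames
print('genlocked,    1 h:', phase_error_after(60.01, genlocked=True))  # 0.0
```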

This is something I’m thinking about a lot more after talking to Scott recently too. There are many different things in the pipeline where syncing can occur. You can sync when the content starts getting generated, which is what this RFE is about. This is different from the sync that is required for broadcast to avoid flickering though. That sync is on output, and it keeps the refresh intervals of your output device (SDI output using the genlock connector on the card, or DisplayPort output using Quadro Sync’s genlock input) in sync with the shutter intervals of the studio cameras. That can happen even if the content isn’t generated in sync with an external signal. So you can already get flicker-free output using the tools in TD, as long as it’s ok that the content itself may be off by a frame or two.

What type of sync is needed for a project is very dependent on the project and the content being shown. Slow moving content, or content only output on a single display, often doesn’t need content sync.
Right now, to achieve input/content sync with TD, the tools are the Sync In/Out CHOPs, but those only sync between TD instances, not to an external trigger.
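
For completeness, a minimal sketch of that instance-to-instance setup (TD Python; the operator type names follow TD's usual Python naming, and the network address/port are left to the parameter dialog rather than guessed here):

```python
# Sketch only: wires up the existing instance-to-instance sync tools.
# Nothing here phases TD to an external SDI / genlock signal.

# On the sending ("master") machine:
sender = op('/project1').create(syncoutCHOP, 'sync_out1')

# On each receiving ("slave") machine:
receiver = op('/project1').create(syncinCHOP, 'sync_in1')

# After pointing the Sync In CHOP at the sender's address/port in the
# parameter dialog, the receiving instance holds each frame until all
# members are ready, keeping the TD instances in frame lockstep.
```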

Thanks for the info.

I am not sure whether this relates to sync (between TD and a genlocked video I/O card), as I haven’t managed to do a proper test, but in general I have spent quite a lot of time working with Blackmagic cards and have noticed that, even with a very simple scene, I can very occasionally see a frame drop on the video output. This may or may not be related to this topic, but I thought I might share it here. It isn’t something that is easily spotted, as it usually happens once in a whole day.

The obvious conclusion would be a performance issue: some CPU/GPU peak, or many operators cooking at once. However, I sometimes feel this isn’t the case, as the scene itself is very lightweight, the hardware is powerful enough, and not a single frame is dropped for hours. Then it suddenly drops one frame out of nowhere. Having seen this happen many times over the last few years, I started to wonder if it could be related to sync between TD and the video I/O card. But as I said, so far I haven’t been able to test this properly, and it could be related to something totally different. My question is: do you think this could be happening because TD isn’t synced to the video output?
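
One way to at least catch and timestamp these rare drops could be an Execute DAT whose onFrameStart callback compares TD's frame counter against wall-clock time (a rough sketch; the 1.5x threshold is arbitrary):

```python
# Rough sketch for an Execute DAT: log whenever the frame counter jumps
# or one frame takes much longer than a single cook period in wall time.
import time

state = {'frame': None, 'wall': None}

def onFrameStart(frame):
    wall = time.perf_counter()
    if state['frame'] is not None:
        frame_gap = frame - state['frame']        # normally 1
        wall_gap = wall - state['wall']           # normally ~1 / cookRate
        if frame_gap > 1 or wall_gap > 1.5 / project.cookRate:
            print(f'possible drop at frame {frame}: '
                  f'+{frame_gap} frames, {wall_gap * 1000:.1f} ms')
    state['frame'] = frame
    state['wall'] = wall
    return
```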

Another part of my wondering goes to latency, as there is a constant struggle to minimize it as much as possible. :slightly_smiling_face: I am not sure if sync between TD and the video I/O card could help in this area, but based on what you said I feel it could make a difference, right?

I guess there is probably some reason why Unreal decided to go this way by adding an option to sync the engine to a genlocked video I/O card.

Apart from these genlock/sync related questions I would also like to ask about timecode.

Please, is it possible to get timecode information from something like a Blackmagic video I/O device (which has a camera with timecode data on its input)?

I have been searching the forum and documentation recently, but it seems that TD currently doesn’t support this. While browsing through the Blackmagic SDK I noticed support for accessing timecode, using IDeckLinkVideoInputFrame::GetTimecode.

Please, would it be possible to add timecode capture to the Video Device In TOP (that could be accessed through something like an Info CHOP)? Thanks. :slightly_smiling_face:
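
To make the ask concrete, here is purely hypothetical user-side code: none of these channel names exist in current TD builds, they just sketch how an Info CHOP on the Video Device In TOP could expose the incoming frame's timecode:

```python
# Hypothetical only -- these Info CHOP channels do not exist in current
# TD builds; they just illustrate the requested feature.

info = op('info1')   # Info CHOP pointed at op('videodevin1')

h = int(info['timecode_hours'].eval())     # hypothetical channel
m = int(info['timecode_minutes'].eval())   # hypothetical channel
s = int(info['timecode_seconds'].eval())   # hypothetical channel
f = int(info['timecode_frames'].eval())    # hypothetical channel

print(f'incoming timecode: {h:02d}:{m:02d}:{s:02d}:{f:02d}')
```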

EDIT: Moved to Reading timecode from Blackmagic / AJA devices as this isn’t particularly related to the sync mechanisms discussed above.