# UDP - FreeD Protocol

Thank you Rob. I currently don’t have a camera in the home office, but I found a calibration file for this camera from Aximmetry. It holds a dict of 8 entries, with two keys (“Zoom” and “Focus”) for each matching FocalLength. Unfortunately, I don’t understand how to use two dict keys (the FreeD CHOP channels) to do the lookup.

And according to the manual (FreeD section):

Camera Zoom:
magnification of optical zoom of this unit between 000555h (WIDE) and 000FFFh (TELE): 1365 → 4095
Camera Focus:
value of focus position of this unit between 000555h (NEAR) and 000FFFh (FAR): 1365 → 4095
Spare:
value of the iris position of this unit between 0555h (CLOSE) and 0FFFh (OPEN): 1365 → 4095
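As a small sketch, those raw encoder values can be normalized to a 0..1 range before doing any lookup (the range constants below come from the manual excerpt above):

```python
# Map a raw FreeD zoom/focus/iris value (0x555..0xFFF) to a
# normalized 0..1 position. Range per the manual excerpt above.
RAW_MIN = 0x555   # 1365 (WIDE / NEAR / CLOSE)
RAW_MAX = 0xFFF   # 4095 (TELE / FAR / OPEN)

def normalize_freed(raw):
    return (raw - RAW_MIN) / (RAW_MAX - RAW_MIN)

print(normalize_freed(0x555))  # 0.0
print(normalize_freed(0xFFF))  # 1.0
```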

Generally speaking, if you’ve got two keys and your data points lie on a regular grid (focus and zoom sampled at regular intervals), then you can use bilinear interpolation (Bilinear interpolation - Wikipedia) to blend between the data points.
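A minimal sketch of the bilinear case, assuming a regular zoom × focus grid of calibration samples (the grid values here are made up):

```python
import numpy as np

def bilinear(grid, zooms, focuses, z, f):
    """Bilinearly interpolate a value (e.g. a focal length) from a
    regular zoom x focus grid: grid[i][j] is the calibration sample
    at zooms[i], focuses[j]."""
    # find the grid cell containing (z, f), clamped to the grid edges
    i = int(np.clip(np.searchsorted(zooms, z) - 1, 0, len(zooms) - 2))
    j = int(np.clip(np.searchsorted(focuses, f) - 1, 0, len(focuses) - 2))
    # fractional position inside the cell
    tz = (z - zooms[i]) / (zooms[i + 1] - zooms[i])
    tf = (f - focuses[j]) / (focuses[j + 1] - focuses[j])
    # blend the four surrounding samples
    return (grid[i][j]         * (1 - tz) * (1 - tf) +
            grid[i + 1][j]     * tz       * (1 - tf) +
            grid[i][j + 1]     * (1 - tz) * tf +
            grid[i + 1][j + 1] * tz       * tf)

# hypothetical 2x2 grid of focal lengths at zoom 0/1 and focus 0/1
zooms = [0.0, 1.0]
focuses = [0.0, 1.0]
grid = [[10.0, 12.0],
        [40.0, 48.0]]
print(bilinear(grid, zooms, focuses, 0.5, 0.5))  # 27.5
```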

However, if your data points are more irregular (as I suspect they are), you could try Delaunay triangulation (Delaunay triangulation - Wikipedia) instead, which lets you interpolate between a set of scattered points in any number of dimensions. With 2 keys, this is basically dividing the space into triangles; for any given focus/zoom combination you can then find the 3 closest calibrations and blend between them based on how close they are to your key. Some Python code here: scipy.spatial.Delaunay — SciPy v1.10.1 Manual
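For scattered points, scipy also wraps the whole triangulate-then-blend pipeline in `scipy.interpolate.LinearNDInterpolator`; a sketch with made-up calibration samples:

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# hypothetical calibration samples: (zoom, focus) -> focal length
points = np.array([[0.0, 0.0], [1.0, 0.0],
                   [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
focal_lengths = np.array([10.0, 40.0, 12.0, 48.0, 26.0])

# LinearNDInterpolator builds the Delaunay triangulation internally
# and blends the corner values of the triangle containing the query
interp = LinearNDInterpolator(points, focal_lengths)
print(interp(0.25, 0.25))  # 18.0, halfway between the (0,0) and (0.5,0.5) samples
```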

@snaut is actually working on calibration tools that use Delaunay triangulation to interpolate calibration data in 3 dimensions (x, y, z) and might be able to point you to some useful libraries.

Hi @Achim,

Not quite sure if this is the correct approach, but it seemed logical, so for the moment I just went with scipy to build the initial Delaunay cells. The `find_simplex()` function returns the cell you are currently located in, and its return value can be used to find the cell’s corners. From there you can get the distances and compute a weighted average between them.
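As a sketch of that weighted average (with made-up sample points): the proper corner weights are the barycentric coordinates of the query point, which scipy exposes via `tri.transform`:

```python
import numpy as np
from scipy.spatial import Delaunay

# hypothetical calibration points in (zoom, focus) space and their values
pts = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 3.0]])
vals = np.array([10.0, 40.0, 12.0, 48.0])   # e.g. focal lengths

tri = Delaunay(pts)
q = np.array([0.5, 0.5])                    # query (zoom, focus)
s = int(tri.find_simplex(q.reshape(1, -1))[0])  # containing triangle

# barycentric coordinates of q inside that triangle
T = tri.transform[s]
b = T[:2].dot(q - T[2])
weights = np.append(b, 1.0 - b.sum())       # one weight per corner

corners = tri.simplices[s]                  # indices of the 3 corners
value = float((vals[corners] * weights).sum())
print(value)  # 18.0 with these sample points
```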

For a visualization using a Script SOP and a bunch of random points as input:

```python
# me - this DAT
# scriptOp - the OP which is cooking

from scipy.spatial import Delaunay
import numpy as np


# press 'Setup Parameters' in the OP to call this function to re-create the parameters
def onSetupParameters(scriptOp):
    page = scriptOp.appendCustomPage('Position')
    p = page.appendXYZ('Pos', label='Camera Position')
    return


# called whenever custom pulse parameter is pushed
def onPulse(par):
    return


def onCook(scriptOp):
    scriptOp.clear()
    inputSop = scriptOp.inputs[0]

    # copy the input points and collect their 2D positions for the triangulation
    delaunay = []
    for i in inputSop.points:
        myP = scriptOp.appendPoint()
        myP.P = i.P
        delaunay.append([i.P.x, i.P.y])

    # triangulate and find the cell containing the camera position
    tri = Delaunay(delaunay)
    allTriangles = tri.simplices
    camP = np.array([(scriptOp.par.Posx, scriptOp.par.Posy)])
    inArea = tri.find_simplex(camP)

    locatedIn = allTriangles[inArea]

    # add a point at the camera position
    camPosP = scriptOp.appendPoint()
    camPosP.P = (scriptOp.par.Posx, scriptOp.par.Posy, scriptOp.par.Posz)

    # draw a line from the camera position to each corner of the containing cell
    for i in locatedIn[0]:
        connectTo = scriptOp.points[i]
        line = scriptOp.appendPoly(2, closed=False, addPoints=False)
        line[0].point = camPosP
        line[1].point = connectTo
    return
```

cheers
Markus

Thank you @snaut and @robmc. I have to admit this is quite a bit above my head. When I purchased a cam with FreeD support, I was expecting things would be plug and play.

I appreciate your input and example, but I will need some time to figure out how this applies to the XML.

No problem. All of this virtual production tech is evolving so quickly that we’re still working on wrapping our heads around it too.

I’d recommend probably working on alignment at a single zoom level first and then you can worry about the interpolation stuff afterwards.

Let us know if you have any other questions.

Hi Achim,

I wonder if you could calculate the FOV of the lens, since you have the sensor width given in the file as well: `α = 2 * arctan[l/(2f)]`, where `l` is the sensor width and `f` is the focal length.
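As a sketch of that formula with hypothetical sensor and lens values:

```python
import math

def fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal field of view: alpha = 2 * arctan(l / (2 f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# hypothetical values: a 36 mm wide sensor at 18 mm focal length
print(fov_deg(36.0, 18.0))  # 90.0
```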

I would interpret the values for zoom and focus in the file as normalized encoder positions…

In general I’m not sure this is a 2D lookup. Theoretically, with `focus` set to infinity, you should get a focal length equivalent to the `focallength` specified on the lens. The `focus` is then a slight adjustment to that focal length following the formula `1/s1 + 1/s2 = 1/f`, where `s1` is the distance to the object and `s2` the distance to the rear nodal point of the lens, which I would interpret as the actual `focallength`. Following this article on Wikipedia:

https://en.wikipedia.org/wiki/Focal_length#In_photography
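A worked sketch of that thin-lens relation, solving for the image distance `s2` (the lens values here are hypothetical):

```python
def image_distance(s1_mm, f_mm):
    """From 1/s1 + 1/s2 = 1/f, solve for the image distance s2
    (interpreted above as the effective focal length adjustment)."""
    return 1.0 / (1.0 / f_mm - 1.0 / s1_mm)

# hypothetical: a 50 mm lens focused on an object 2000 mm away
print(image_distance(2000.0, 50.0))  # about 51.28 mm, slightly longer than f
```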

So my assumption would be that for each zoom level you should, at a minimum, have a `focallength` measurement for the closest and furthest `focus` settings. With that it’s a 2-stage lookup: first for the `zoom` level, which gives you a focus curve, and then a lookup on that curve (with only 2 measurements it would be linear).
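A minimal sketch of that 2-stage lookup, assuming a hypothetical table with two focus measurements (NEAR/FAR) per zoom level and normalized 0..1 zoom and focus positions:

```python
def lerp(a, b, t):
    return a + (b - a) * t

# hypothetical table: normalized zoom level -> focal length measured
# at the nearest and furthest focus settings (2 samples per level)
table = {
    0.0: (10.0, 11.0),   # (focal length at NEAR focus, at FAR focus)
    1.0: (40.0, 44.0),
}

def focal_length(zoom, focus, table):
    """Two-stage lookup: first locate the bracketing zoom levels, then
    do a linear focus lookup on each level's curve and blend."""
    zooms = sorted(table)
    lo = max(z for z in zooms if z <= zoom)
    hi = min(z for z in zooms if z >= zoom)
    tz = 0.0 if hi == lo else (zoom - lo) / (hi - lo)
    # stage 2: linear focus lookup on each bracketing zoom level
    f_lo = lerp(table[lo][0], table[lo][1], focus)
    f_hi = lerp(table[hi][0], table[hi][1], focus)
    # stage 1: blend between the two zoom levels
    return lerp(f_lo, f_hi, tz)

print(focal_length(0.5, 0.5, table))  # 26.25
```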

cheers
Markus

Thanks for your second file; it helped me a lot with understanding the protocol after I learned a bit more about Python! All this is rather new to me. My plan was to send out FreeD data just to test some Unreal projects without actually having a tracking system in my office. For now I decided against using TouchDesigner for this and will use Widget Designer with an extension instead.


Is this FreeD Out CHOP available to use in any of the experimental builds? User Guide | Derivative

I’m running TD 2021.15020 and don’t see it in that release.

The FreeD Out CHOP will be part of our next experimental series, but we don’t have a release date for that yet. In the meantime, we do have a custom build based on an earlier 2021 release that you’re welcome to try here: Dropbox - File Deleted - Simplify your life

Let me know if you have any issues.

Thanks robmc! This is working great. Appreciate the support and quick response!


@Prasith
Any update on translating UDP data to the FreeD protocol…?

Sorry, I’m an end user, not a programmer,
but I found this tool: (“github DOT com/max-verem/VRPN-FreeD-OpenVR”)