I’m attempting to calibrate a projector using a new openFrameworks/OpenCV library called mapamok, but I’m having trouble, and I think this question is generally applicable to anyone doing projector calibration.
My process is as follows:
Load a model of my physical scene into Mapamok and perform the calibration routine to obtain an estimate of the transformation matrix (GL_MODELVIEW) and the camera’s intrinsic field of view (i.e. the outputs of OpenCV’s calibrateCamera function).
In TouchDesigner, load the same model. Use the Matrix CHOP parameter of a Geometry COMP node to transform it with the transformation matrix derived in step 1. Set the horizontal FOV of my Camera COMP to the value derived in step 1.
I would expect the rendered scene to match my physical scene, but it doesn’t. The orientation of the model is correct, but it appears to be off by some translation and/or scale.
I believe the screen resolutions match. Are there other camera intrinsics that I’m neglecting? Or other parameters that have to be taken into account when rendering the scene from a different computer?
Any hunches? People who’ve had success with projector calibration?
It was an intrinsic parameter that I was neglecting – the principal point.
I don’t know of a way to set the principal point of a camera in TouchDesigner without writing a custom shader, so instead I set Mapamok to use the default principal point (w/2, h/2) and not solve for it. This fixed the discrepancy.
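For anyone hitting the same wall: a symmetric-frustum camera (like a default Camera COMP) implicitly assumes the principal point sits at the image center, so any solved offset shows up as exactly the kind of translation-like misalignment described above. A quick illustrative sketch in plain Python:

```python
def principal_point_shift(cx, cy, w, h):
    """Normalized offset of a calibrated principal point (cx, cy) from the
    image center (w/2, h/2). A symmetric-frustum camera implicitly assumes
    this is (0, 0); any other value appears as a translation-like shift of
    the rendered image."""
    return ((cx - w / 2) / w, (cy - h / 2) / h)
```

With the principal point forced to the default (w/2, h/2), e.g. (640, 400) at 1280x800, this comes out (0.0, 0.0) and there is no residual offset to account for.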
Check out Mapamok for your projector calibration, and maybe we’ll have a TouchDesigner version of it soon.
This is great! I’ve always wanted to modify Kyle’s ProCamToolkit to send the final calibration data via OSC to Touch, but never found the time. Did you get mapamok to compile on Windows? Are you sending calibration data via OSC or just manually transferring the values?
does this mean you are working on getting the calibration part ported to touch?
Anyway, it’s great that you got it working
We’re using mapamok on OSX 10.6.8 and transferring the values manually. I like the OSC idea though.
No immediate plans to port mapamok to TouchDesigner (though these hints about OpenCV in TD would be a good first step), but I’ll let you know if we build anything to speed up the calibration workflow.
In the meantime, my first two posts in this thread should serve as a starting point for anyone else out there using TD & Mapamok.
Has anyone else had any luck bringing Mapamok into TouchDesigner? I have the output from mapamok below. You can use the rotation matrix and translation matrix to feed the Matrix CHOP parameter of the Geometry COMP, and the FOV for the camera. This is not quite enough, though. I think there are still some rotations that need to happen on the camera. I can’t figure out what my camera rotations should be to finish calibrating the projection.
data: [ 1.9607565512738850e+003, 0., 640., 0.,
1.9607565512738850e+003, 400., 0., 0., 1. ]
fov: [ 3.6153843846913780e+001, 2.3060578544788939e+001 ]
principalPoint: [ 640., 400. ]
imageSize: [ 1280, 800 ]
data: [ 4.0487444880761092e-001, 3.1152405009210704e+000,
data: [ 1.3625383241544182e+000, -7.3408257705204967e-005,
data: [ 9.9947325880363769e-001, -2.5250114385999105e-002,
data: [ 9.9966811300484670e-001, -5.3858224138155733e-005,
data: [ 1.4419804811477661e+000, -1.1745921373367310e+000,
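A sanity check for numbers like these: the fov entries follow directly from the camera matrix, fov = 2·atan(imageSize / (2·focalLength)). In plain Python, using the intrinsics above:

```python
import math

def fov_from_intrinsics(focal_px, size_px):
    """Field of view in degrees implied by a focal length and an image
    dimension (both in pixels), assuming a centered principal point."""
    return math.degrees(2.0 * math.atan(size_px / (2.0 * focal_px)))

fov_x = fov_from_intrinsics(1960.7565512738850, 1280)  # ~36.15, matches fov[0]
fov_y = fov_from_intrinsics(1960.7565512738850, 800)   # ~23.06, matches fov[1]
```

fov_x is the horizontal value to give the Camera COMP; if what you compute here doesn’t match what you typed in, that’s the first thing to check.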
Well, I have had some success bringing this information into Touch. As originally suggested in this post, I used the rotation and translation matrix as an input to the Matrix CHOP parameter on the Geometry COMP. I used the FOV to set the camera’s horizontal field of view. One trick was I had to rotate the camera by 180 degrees on the x-axis. I am guessing this is because of differences in the coordinate systems between the two.
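That 180-degree rotation is consistent with the convention difference: rotating 180° about x is exactly the diag(1, -1, -1) flip between OpenCV’s y-down/z-forward camera and OpenGL’s y-up/z-back camera. A quick check in plain Python:

```python
import math

def rot_x(deg):
    """3x3 rotation matrix about the x-axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1.0, 0.0, 0.0],
            [0.0, c, -s],
            [0.0, s, c]]

# rot_x(180) is, up to floating-point noise, diag(1, -1, -1):
# the axis flip between the OpenCV and OpenGL camera conventions.
flip = rot_x(180.0)
```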
My current struggle is that instead of moving the geometry, I want to leave the geometry at the origin and move the camera instead (to make it easier in the future to add additional cameras). I thought I would need to take the negative of the transpose of the rotation matrix to accomplish this, but no luck so far…
Anyone have any additional insight?
Could you attach what you have so far? I’d love to take a crack at it.
As Malcolm mentioned here: viewtopic.php?f=4&t=3725
There is now a Custom Projection parameter in the Camera COMP in TouchDesigner 088 Beta so you can apply these adjustments to the camera directly now.
I’m trying to follow the process explained here and getting a bit stuck when I get to exporting the matrix data from mapamok to a Matrix CHOP in Touch. Do I find the matrix data in the calibration-advanced.yml file? I’m definitely stretching my knowledge at this point, as this is the first time I’ve used a matrix…
With the new ‘Custom Projection’ parameter, is this process now simpler? What info do you put in the Custom Projection parameter?
Any help would be appreciated!
It’s still very unclear to me. Can anyone explain which values to copy from the mapamok .yml?
Is there going to be more documentation on the new features of the custom projection setting in the CameraCOMP? How to use the GLSL input?
I’ve updated both the 077 and 088 documentation with how these parameter(s) should be used.
I tried this tonight and I wasn’t able to get it to work. I used a GLSL MAT and added the standard shader. I wasn’t able to drag it into the Custom Projection field so I typed it in. I also tried it in the ‘Display/Material’ field. I get an ‘Invalid Source’ warning on the CameraCOMP and the following in the GLSL MAT.
Program Link Results:
0(56) : error C1013: function "TDCamToProj" is already defined at 0(66)
0(60) : error C1013: function "TDSOPToProj" is already defined at 0(68)
088 Version 4580
You should specify only a DAT to the custom projection parameter, not a GLSL MAT. The DAT should contain those two functions in it (you can have other functions as well as uniforms as needed, you can pass uniforms to it using the GLSL page of the Render TOP). I’ve updated the help further to avoid this confusion.
here’s how I managed to get it to work without transforming the geo, only the camera.
take the rotation matrix from the yml file.
pre-multiply it by the following axis conversion matrix :
1 0 0
0 -1 0
0 0 -1
transpose the resulting matrix; you thus obtain the inverted rotation matrix, let’s call it i_rot.
take the translation vector from the yml file. let’s call it trans.
compute the following scalar multiplication : -1 X trans
let’s call the result snart
premultiply snart by i_rot. let’s call the result i_trans
append i_trans to i_rot as a fourth column
append 0 0 0 1 as a fourth row to the result.
transpose this, and you’ve got your extrinsic parameters. put it in a matrix CHOP in the pre-xform tab of the camera
for intrinsic parameters (the matrix CHOP/DAT in the view tab):
compute the following 4x4 matrix:
2*focalLength/imageSize.x  0  0  0
0  2*focalLength/imageSize.y  0  0
0  0  -(cam.far + cam.near)/(cam.far - cam.near)  (-2*cam.far*cam.near)/(cam.far - cam.near)
0  0  -1  0
for this to work you must check CV_CALIB_FIX_PRINCIPAL_POINT in mapamok.
hope this helps.
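The intrinsic matrix above, written out as a helper (a plain-Python sketch; cam.near/cam.far are whatever clip planes you use in TD, and depending on how your matrix CHOP/DAT is read you may need the transpose):

```python
def projection_from_intrinsics(focal_px, image_w, image_h, near, far):
    """Row-major 4x4 GL-style projection matrix built from a calibrated
    focal length in pixels, assuming the principal point is fixed at the
    image center (CV_CALIB_FIX_PRINCIPAL_POINT, as noted above)."""
    a = -(far + near) / (far - near)
    b = -2.0 * far * near / (far - near)
    return [[2.0 * focal_px / image_w, 0.0, 0.0, 0.0],
            [0.0, 2.0 * focal_px / image_h, 0.0, 0.0],
            [0.0, 0.0, a, b],
            [0.0, 0.0, -1.0, 0.0]]
```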
Thx so much for sharing your findings! It would be really great if you could upload a little sample, as I find it a little hard to follow the instructions. Many thx again
here is an example : mapamok_calibration.tox (2.17 KB)
Thank you. Much appreciated
there’s a little problem with the example I uploaded in my previous post.
as with the rotation matrix, we need to premultiply the translation vector by the cv_gl matrix.
this time I think it should be ok.
mapamok_calibration.tox (1.88 KB)
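For anyone who’d rather read the math than unpack the .tox, here is how I understand the corrected recipe as a plain-Python sketch (R is the 3x3 rotation matrix and t the translation vector read from the yml; the final transpose is for feeding a matrix CHOP in the camera’s pre-xform tab):

```python
CV_GL = [[1, 0, 0],
         [0, -1, 0],
         [0, 0, -1]]  # OpenCV -> OpenGL axis flip

def matmul(A, B):
    """Multiply a 3x3 matrix by a 3x3 matrix or by a length-3 vector."""
    if isinstance(B[0], list):
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    return [sum(A[i][k] * B[k] for k in range(3)) for i in range(3)]

def camera_prexform(R, t):
    """4x4 matrix (transposed, for the matrix CHOP in the camera's
    pre-xform tab) that moves the camera instead of the geometry."""
    R_gl = matmul(CV_GL, R)
    t_gl = matmul(CV_GL, t)  # the correction from this post: flip t too
    i_rot = [[R_gl[j][i] for j in range(3)] for i in range(3)]  # transpose = inverse
    i_trans = matmul(i_rot, [-x for x in t_gl])
    # i_rot with i_trans appended as a fourth column, bottom row 0 0 0 1
    M = [i_rot[i] + [i_trans[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]
    return [[M[j][i] for j in range(4)] for i in range(4)]  # final transpose
```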
Just to add to this post for new users finding it, you’ll want to also look here for a tool Markus made that does a lot of this for you. Requires TouchDesigner 088