I have been playing with projection mapping in TouchDesigner, and am interested in the tricky area of projector calibration.
I notice that all the big projects that use mapping seem to rely on bespoke software for their camera calibration. I'm thinking of camera obscurer and the ISAM project, both of which say in their project descriptions that they can quickly calibrate the real world against their virtual models using outside calibration software.
I was very interested in this project by Kyle McDonald for the openFrameworks platform, which looks like an excellent calibration technique, and was wondering whether it could be ported to TouchDesigner relatively easily.
Unfortunately I'm not an expert at these things, but openFrameworks uses OpenGL, as does TouchDesigner, so presumably the calibration done in the mapamok application could be exported to TouchDesigner somehow?
Presumably their calibration tool creates a shader that warps the vertices appropriately for your individual real-world setup? Would it be possible to save this shader and use it in TouchDesigner somehow?
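For what it's worth, my understanding is that mapamok's calibration isn't really a shader at all: it uses OpenCV's calibrateCamera, which produces an intrinsic matrix K plus an extrinsic rotation R and translation t, and every vertex is then projected as p = K(RX + t). A minimal pure-Python sketch of that projection (all numbers below are made-up placeholders, not values from a real .yml):

```python
# Minimal pinhole-camera projection sketch (plain Python, no TouchDesigner
# or OpenCV API). mapamok-style calibration boils down to an intrinsic
# matrix K and an extrinsic rotation R / translation t; these values are
# placeholders for illustration only.

K = [[1500.0,    0.0, 960.0],   # fx, skew, cx (pixels)
     [   0.0, 1500.0, 540.0],   # fy, cy
     [   0.0,    0.0,   1.0]]

R = [[1.0, 0.0, 0.0],           # identity rotation for the sketch
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]

t = [0.0, 0.0, 5.0]             # camera 5 units in front of the object

def project(X, K, R, t):
    """Project a 3D point X into pixel coordinates: p = K (R X + t)."""
    # camera-space point: Xc = R X + t
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # homogeneous image point: p = K Xc
    p = [sum(K[i][j] * Xc[j] for j in range(3)) for i in range(3)]
    # perspective divide
    return (p[0] / p[2], p[1] / p[2])

# A point on the optical axis lands at the principal point (cx, cy):
print(project([0.0, 0.0, 0.0], K, R, t))   # -> (960.0, 540.0)
```

So rather than exporting a shader, the portable thing is the (K, R, t) triple itself, which any OpenGL app can turn into its camera and projection matrices.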
I hope I'm not completely off the mark here, and that what I'm talking about is within the realm of how the program works! I know calibration is the hardest part of mapping, but I am trying to avoid moving the mesh vertex by vertex to map the texture onto the real-world scene.
A new feature in 088, Custom Projection on the Camera COMP, was added to make it easier to use these custom projection formulas in a global manner.
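I don't have the exact Camera COMP parameter names in front of me, but the matrix a custom projection would need can be built directly from the calibrated intrinsics. A sketch in plain Python under one common OpenGL convention (y-up, camera looking down -Z; other tools may flip signs), with placeholder values:

```python
# Sketch: building an OpenGL-style 4x4 projection matrix from calibrated
# pinhole intrinsics (fx, fy in pixels, principal point cx, cy). Generic
# pinhole-to-OpenGL math, not the Camera COMP's actual API; all values
# below are placeholders.

def projection_from_intrinsics(fx, fy, cx, cy, w, h, near, far):
    return [
        [2*fx/w, 0.0,     1 - 2*cx/w,            0.0],
        [0.0,    2*fy/h,  2*cy/h - 1,            0.0],
        [0.0,    0.0,    -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,    0.0,    -1.0,                   0.0],
    ]

def apply(P, v):
    """Multiply a 4-vector by P and do the perspective divide -> NDC."""
    out = [sum(P[i][j] * v[j] for j in range(4)) for i in range(4)]
    return [c / out[3] for c in out[:3]]

P = projection_from_intrinsics(1500, 1500, 960, 540, 1920, 1080, 0.1, 100)
ndc = apply(P, [1.0, 0.0, -10.0, 1.0])
# ndc[0] is 0.15625, i.e. pixel (0.15625+1)/2 * 1920 = 1110,
# matching the pinhole projection fx * (1/10) + cx = 1110.
```

The point is that a calibrated camera from OpenCV can be reproduced exactly in an OpenGL renderer once its intrinsics are baked into the projection matrix like this.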
The problem I'm having is that mapamok is not giving me the values I've measured in real life. The angle my projector is rotated by doesn't seem to correspond to any value in the .yml file.
Here's an implementation of mapamok for TD 088: Map an object with CamSchnappr. We've used this component in lots of our shows and it works great, but only as long as the actual geometry is pretty close to your model.
I did a workshop for beginners demonstrating the workflow using Kantan and CamSchnappr (the implementation of mapamok in TouchDesigner). Maybe it’ll answer some of your questions?
Thank you for sharing that. I'll be sure to check it out. My current workflow is to get a mesh from 123D Catch and bring it into Houdini Apprentice. I'm able to get the texture map in as well, which helps when building. Next, I snap a large polygon to the mesh, then create smaller and smaller polygons inside it to match the mesh model. Sometimes it's a polygon mess, so I delete what I've created while keeping the points and rebuild the model.
123D Catch is remarkable for what it does, certainly more capable than the Kinect for scans. But the meshes are fugly and mangled. I can see how it might be a good starting point for a rebuild, as per what maasta, I think, is suggesting. A very exciting tool.