Vive - UV painter + Vive tracking Perspective

We had a foam model of a car made for a project at 1/3 scale.

Using a UV-unwrapped model of the object and, in this case, two projectors, we use a Render Pick CHOP to get UV coordinates for painting a matte TOP that masks between the alphas of two separate PBR materials.

https://drive.google.com/file/d/1RmXZo9PagbRW5eywz6IiaN0IRRHbekW0/view?usp=sharing
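In case the masking step is unclear, here's a rough standalone sketch of the idea in plain Python (not TouchDesigner code): the picked UV coordinate stamps a brush into a UV-space mask, and that mask blends between the two materials per pixel. The mask resolution and brush radius here are made-up example values.

```python
# Sketch: paint into a UV-space mask and use it to blend two materials.
# Plain-Python stand-in for the Render Pick CHOP -> matte TOP setup;
# SIZE and RADIUS are arbitrary example values.

SIZE = 64          # mask resolution (pixels per side)
RADIUS = 0.05      # brush radius in UV units (0..1)

def make_mask(size=SIZE):
    """Start with material A everywhere (mask value 0.0)."""
    return [[0.0] * size for _ in range(size)]

def paint(mask, u, v, radius=RADIUS):
    """Stamp a hard circular brush at UV (u, v); 1.0 = material B."""
    size = len(mask)
    for y in range(size):
        for x in range(size):
            du = x / size - u
            dv = y / size - v
            if du * du + dv * dv <= radius * radius:
                mask[y][x] = 1.0
    return mask

def blend(a, b, m):
    """Per-pixel blend of two material values by the mask."""
    return a * (1.0 - m) + b * m
```

In the real network the matte TOP plays the role of `mask` and the two PBR alphas are the blend inputs; this just shows the arithmetic.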

Tracking the position of the Vive controller gives us the correct POV camera position.

https://drive.google.com/file/d/1oOlkBjQBGpPTIV1CVovNWC7pGJkdXK33/view?usp=sharing
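For anyone wiring this up outside TouchDesigner: OpenVR hands tracked poses back as row-major 3x4 matrices, so getting the camera position (and a look direction) out is just reading columns. A minimal sketch; the matrix values below are invented for illustration.

```python
# Sketch: pull camera position/direction out of an OpenVR-style 3x4
# pose matrix (row-major; rotation in the left 3x3, translation in
# the last column). The example pose values are made up.

def pose_position(m):
    """m is a 3x4 matrix as nested lists; returns (x, y, z)."""
    return (m[0][3], m[1][3], m[2][3])

def pose_forward(m):
    """Look direction: OpenVR convention is that -Z is forward."""
    return (-m[0][2], -m[1][2], -m[2][2])

# Identity rotation, controller at (1.0, 1.5, -2.0) -- example values.
pose = [
    [1.0, 0.0, 0.0,  1.0],
    [0.0, 1.0, 0.0,  1.5],
    [0.0, 0.0, 1.0, -2.0],
]
```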

Nicely done!

It’s clear you’ve got the whole system figured out. Good work!

Do you have any issues with the trackers/controllers re-aligning their world-space coordinates?

Since the Vive Lighthouse system auto-calibrates itself, I would often run into issues where it would decide, without warning, to adjust its calibration, throwing off all of the world-space coordinates and thus the whole mapping.

I have never found a way to keep the trackers/controllers from randomly changing their SteamVR calibration.

We treated the Vive world and the projection mapping world as separate. We used a Select SOP of the same model for both, then got UV coordinates from the Vive world and used them to change the texture of the object in the PM world.
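Under the hood that UV lookup is just barycentric interpolation: the pick gives you a triangle on the model plus a hit point, and the UV is the weighted mix of the three vertex UVs, so it works the same regardless of how each world is placed or scaled. A standalone sketch with a 2D hit point and made-up vertex UVs:

```python
# Sketch: interpolate a UV coordinate from a picked point on a
# triangle via barycentric weights (what a render pick effectively
# returns). 2D hit point and example UVs for simplicity.

def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    den = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w0 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / den
    w1 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / den
    return w0, w1, 1.0 - w0 - w1

def pick_uv(p, tri, uvs):
    """UV at point p: weighted mix of the three vertex UVs."""
    w0, w1, w2 = barycentric(p, *tri)
    u = w0 * uvs[0][0] + w1 * uvs[1][0] + w2 * uvs[2][0]
    v = w0 * uvs[0][1] + w1 * uvs[1][1] + w2 * uvs[2][1]
    return u, v
```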

In fact, the Vive world ran on a laptop and sent UV and Vive controller data to the projection mapping system on a desktop via a Touch Out CHOP.
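Touch Out/Touch In speak TouchDesigner's own wire format, so the sketch below is only a generic UDP stand-in for the same idea: pack the UV and controller channels into a datagram on the laptop and unpack them on the desktop. The port number and five-float channel layout are arbitrary choices for the example, not the actual Touch Out protocol.

```python
# Sketch: ship UV + controller position between machines over UDP.
# Generic stand-in, NOT the real Touch Out CHOP protocol; PORT and
# the five-float layout are example choices.

import socket
import struct

PORT = 7000
FMT = "<5f"  # u, v, controller x, y, z as little-endian float32

def pack_channels(u, v, x, y, z):
    return struct.pack(FMT, u, v, x, y, z)

def unpack_channels(data):
    return struct.unpack(FMT, data)

def send(sock, addr, u, v, x, y, z):
    """Fire one sample of all five channels at the mapping machine."""
    sock.sendto(pack_channels(u, v, x, y, z), addr)
```

On the desktop side you would bind a UDP socket, call `unpack_channels` on each datagram, and feed the values into the camera and texture networks.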

This way placement and scale could be independent for the two worlds.

I don’t recall the auto calibration causing any problems for us.