I’ve created a GitHub repository in the hope of fixing and improving my attempt at Oculus Rift support in TouchDesigner.
The repository includes:
- A Visual Studio 2012 project for modifying/compiling the DLL
- A TD 088 composition with my attempt at implementing proper rendering
- A compiled DLL if you want to get straight to the action (it’s in the TD folder with the comp)
It’s so close: the head tracking is fine and the 3D effect is there, but something’s not right. I think it has to do with the offset for each eye (i.e. the ViewAdjust matrix to be applied after the projection matrix and barrel distortion). I’ve tried the values coming out of the Rift sensor that make sense - a 0.0395 offset for each eye, in my case - but I still see some sort of de-convergence in my periphery. I don’t see it in the pre-compiled demos or when I render a scene in Cinder, so I must be missing a step or applying some parameters incorrectly.
The basic flow I’m using is:
1. Extract a 4x4 projection matrix from the sensor for the left eye
2. Flip the coordinate at M to make the matrix for the right eye
3. Create one camera for each eye, using the custom matrices generated above
4. Use a Render TOP to render each eye separately at 640x800
5. Feed the output of the Render TOPs into GLSL shaders created from example code
6. Feed the shaders incoming uniform parameters from the Rift sensor
7. Feed the output of each shader into a Crop TOP, extending the left eye to the right and the right eye to the left
8. Composite the Crop TOPs using additive blending
9. Send the final output to a Window set to the 2nd monitor, fullscreen
I’ve tried adding a Transform TOP after the Crop TOPs to change the eye separation, but none of the values from the sensor fix the problem I’m seeing.
We’re so close! Looking forward to some input on this - I hope someone else has a Rift they can test with!