I might know what’s wrong. I have a Rift and will look at your example tomorrow. Totally stoked. Thanks for doing this work. I have a stereo camera component for TD and it works well. From what you describe, you might be missing the horizontal shift necessary for an asymmetrical viewing volume.
Hmmm. I was wrong. Reading the SDK documentation, it seems the Rift does not need an asymmetrical frustum. I’ll keep testing.
The only thing I found is that the cameras weren’t being translated. So I changed the camera parameters to use the Interpupillary_distance from the riftData (±ipd/2), which I think made things more comfortable. Not sure if it solves everything; I have to test with some different geometry.
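For anyone following along, the ±ipd/2 translation is just this (a minimal sketch; the struct and function names are mine, not from the SDK or the .toe):

```cpp
#include <cassert>
#include <cmath>

// Hedged sketch of the camera fix described above: each eye camera is
// offset by half the interpupillary distance along the camera's local
// X axis. "ipd" would come from the riftData (illustrative here).
struct EyeOffsets { float left; float right; };

EyeOffsets eyeOffsets(float ipd)
{
    return { -ipd * 0.5f,   // left eye shifts half the IPD to -X
             +ipd * 0.5f }; // right eye shifts half the IPD to +X
}
```

With a typical 64mm IPD, each camera ends up 32mm off the center axis.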
Looks like there’s something not quite right with the shader. I’ll take another look tomorrow after I digest the documentation for the SDK.
I modified the shader and changed the parameter values. It seems to work now, but it still hangs TD every time I try to exit. That’s the next thing to fix.
Also no chromatic aberration fix in the shader.
I’m attaching the modified .toe
MMMRiftChopTest.12.toe (10.8 KB)
I put the System::Destroy() call after the delete instance call and now I can exit TD without a hang.
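The teardown ordering can be sketched like this (a stub, not the real OVR SDK types; the point is only that the device instance goes away before the SDK runtime does):

```cpp
#include <cstddef>

// Hedged sketch of the shutdown order that fixed the hang: delete the
// SDK device instance *before* OVR::System::Destroy() tears down the
// runtime. SensorStub stands in for the real SDK device class.
struct SensorStub {
    bool* freed;
    ~SensorStub() { *freed = true; }
};

// sdkAlive stands in for the runtime state that System::Destroy() clears.
void shutdownRift(SensorStub*& sensor, bool& sdkAlive)
{
    delete sensor;      // 1. release the device instance first
    sensor = nullptr;
    sdkAlive = false;   // 2. then call OVR::System::Destroy() (stubbed here)
}
```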
Hi mlantin - thanks for jumping in on this!
Good catch with the System::Destroy() call, though it doesn’t seem to hang on my system without it.
Your version looks even further off in my headset - is it looking right for you? I noticed that you changed some values in the shader, namely:
from: ScreenCenter-float2(0.25,0.5), ScreenCenter+float2(0.25, 0.5)
to: ScreenCenter-float2(0.5,0.5), ScreenCenter+float2(0.5, 0.5)
Can I ask why? Maybe you’re accounting for that somewhere else? The first line is verbatim from the Oculus documentation.
Hmmm. It looks right in my headset though I did notice that when the sphere is very big it starts to be uncomfortable. That might just be too much disparity. I’ll investigate.
In terms of the shader, I changed the values because the shader in the doc is designed to work on the full viewport. The way you have it structured, the shader runs on only half the image, so the sampling needs to be clamped differently.
Thanks for the explanation! Turns out I had the wrong setting loaded on my headset, it was still calibrated for a friend. Also, sometimes I have to just look at the scene for a few seconds and my brain seems to automatically adjust things. I think your method is now looking better than mine.
I’m updating the Chop and example file, will post up a new version soon.
New version pushed to github:
- Uses Oculus SDK 0.2.4
- Shuts down cleanly
- Uses mlantin’s shader/setup fixes
- Shader fix for chromatic aberration
- DLL only outputs configuration data once (optimization)
- DLL compiled in Release mode
One thing I noticed in this version, mlantin, is that the barrel distortion feels a little off in the sphere, as you mentioned. I feel like the FOV isn’t quite right, but we’re inching ever closer.
I think I fixed another issue: I moved the lens-separation math from the cameras to after the render. The camera separation is already accounted for when I flip the custom projection matrix and feed it into the cameras; what was missing was a post-render separation for display in the lenses, which I now do via a Translate TOP after the shader and before the crop.
I also added in a bunch of instanced cubes and attached a spotlight to your head.
Cool! I can’t wait to go home and test!
Okay, I found a bit of time to look at this again. It’s so much fun; I wish I could do this all day.
- I simplified the flip comp
- I simplified the distort shader by making it operate on both images at once. I’m sorry I didn’t have time to integrate the chromatic stuff you had added in
- I noticed that the scale parameters needed to be modified by the aspect ratio. That seems to make things a bit more comfortable.
- The cameras do need to be shifted. This is the view matrix they mention in the doc.
- There should be no need for an additional post-shift of the images
For myself, I made the IPD a bit smaller, since my actual IPD is 54mm. Apparently that is the minimum Oculus supports. They say in their doc that 64mm is average, but in my experience that is not true at all; people I’ve measured rarely get to 60mm.
In any case, the discomfort with far objects is because the rendered objects end up farther apart than the eye distance, so the eyes go wall-eyed (divergent). I have to investigate further.
I’m also trying to make a camera with the right viewing parameters without using the projection matrix from the stereo util… so far, major headache-inducing problems.
MMMRiftChopTest.22.toe (11.6 KB)
I have access to a Rift and have installed the 088 free version on the attached machine (at my local hackerspace). What do I need to try out what you have so far?
Will this work in the free version, or do I need to install a commercial license?
… or should I wait until you have the shader a bit happier?
best wishes and awestruck thanks!
Hey Rodney - this currently uses a Custom CPlusPlus Chop, so a Commercial or Pro license is needed, sorry.
no worries, I’ll temporarily move my pro license over to that machine when I next go there.
I’ll go back through the previous posts and see if I can set it up.
Many thanks to the Derivative team for releasing an official Oculus Rift setup!
Here’s my scene updated to work with the official method.
Having fun with this thing.
Has anyone got some shadows and other lighting tricks working with the Rift?
I am working on a large imported model that could use a nice VR viewer: it’s a map of a very complex cave system. I can’t share it until the caver dudes OK it, but it is very exciting. Can’t wait to see the 3D print from the same model.
+1 on the spelunking
Does anybody know the current turnaround on Rift orders?
Since it is still a beta… when are they promising 1080p, or is it already out?
Shipping in 3-5 days, according to their website.
They say “Stay tuned for more details” regarding the 1080p consumer version.