Hmm, it looks right in my headset, though I did notice that when the sphere is very big it starts to be uncomfortable. That might just be too much disparity; I'll investigate.
As for the shader, I changed the value because the shader in the doc is designed to work on the full viewport. The way you have it structured, the shader runs on only half the image, so the sampling needs to be clamped differently.
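To illustrate the half-image clamping idea (a minimal sketch in Python rather than GLSL; the function and parameter names are mine, not from the actual shader): when the distortion runs per eye on a side-by-side image, the sample coordinate has to be clamped to that eye's half so the warp never pulls pixels across the seam from the other eye.

```python
def clamp_to_eye(u, v, left_eye):
    """Clamp a distorted sample coordinate (u, v) to the current eye's
    half of a side-by-side image, so barrel distortion never samples
    across the center seam. Illustrative only, not the shipped shader."""
    if left_eye:
        u = min(max(u, 0.0), 0.5)   # left half: u stays in [0, 0.5]
    else:
        u = min(max(u, 0.5), 1.0)   # right half: u stays in [0.5, 1]
    v = min(max(v, 0.0), 1.0)       # v clamps to the full height
    return u, v

print(clamp_to_eye(0.62, 0.4, left_eye=True))   # (0.5, 0.4)
```

A full-viewport shader would clamp u to [0, 1] instead, which is exactly the value that had to change.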
Thanks for the explanation! It turns out I had the wrong setting loaded on my headset; it was still calibrated for a friend. Also, sometimes I just have to look at the scene for a few seconds and my brain seems to adjust automatically. I think your method now looks better than mine.
I’m updating the CHOP and example file and will post a new version soon.
- Uses Oculus SDK 0.2.4
- Shuts down cleanly
- Uses mlantin’s shader/setup fixes
- Shader fix for chromatic aberration
- DLL only outputs configuration data once (optimization)
- DLL compiled in Release mode
One thing I noticed in this version, mlantin, is that the barrel distortion feels a little off in the sphere, as you mentioned. I feel like the FOV isn’t quite right, but we’re inching ever closer.
I think I fixed another issue: I moved the lens-separation math from the cameras to after the render. The camera separation is already accounted for when I flip the custom projection matrix and feed it into the cameras. What was missing was a post-render separation for display in the lenses, which I now do via a Translate TOP after the shader and before the crop.
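A rough sketch of what that post-render shift computes, using typical DK1-era numbers (screen width 0.14976 m, lens separation 0.0635 m, 640 px per eye — check your own HMD info block; the function name is mine):

```python
def lens_offset_pixels(h_screen_size_m, lens_separation_m, eye_width_px):
    """Horizontal shift, in pixels, that recenters each eye's image under
    its physical lens. Applied after rendering, e.g. as a Translate TOP.
    Illustrative sketch; verify against your SDK's HMD info values."""
    # Lens-center offset in per-eye normalized device coords [-1, 1]:
    ndc_offset = 1.0 - 2.0 * lens_separation_m / h_screen_size_m
    # Half the eye's pixel width maps NDC units to pixels:
    return ndc_offset * eye_width_px / 2.0

shift = lens_offset_pixels(0.14976, 0.0635, 640)  # roughly 48–49 px
```

One eye shifts by `+shift`, the other by `-shift`, which is why doing it once in the camera and again after the render would double-count the separation.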
Okay, I found a bit of time to look at this again. It’s so much fun; I wish I could do this all day.
A few things:
- I simplified the flip comp.
- I simplified the distort shader by making it operate on both images at once. Sorry, I didn’t have time to integrate the chromatic stuff you had added.
- I noticed that the scale parameters needed to be modified by the aspect ratio. That seems to make things a bit more comfortable.
- The cameras do need to be shifted; this is the view matrix they mention in the doc. There should be no need for an additional post-shift of the images.
- For myself, I made the IPD a bit smaller, since my actual IPD is 54mm. Apparently that is the minimum Oculus supports. Their doc says 64mm is average, but in my experience that is not true at all; people I’ve measured rarely reach 60mm.
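On the aspect-ratio point, here is a minimal sketch of the kind of correction I mean (names and the exact convention are illustrative, not lifted from the shader): the horizontal scale is divided by the per-eye aspect ratio so the radial warp stays circular instead of elliptical on a non-square eye buffer.

```python
def distortion_scale(scale, eye_aspect):
    """Per-axis scale for the distortion shader. Dividing the horizontal
    scale by the per-eye aspect ratio keeps the barrel warp circular.
    Illustrative sketch; the real shader's convention may be inverted."""
    return (scale / eye_aspect, scale)

# DK1-style per-eye buffer: 640 x 800, so aspect = 0.8
sx, sy = distortion_scale(1.0, 640.0 / 800.0)
```

If the two axes use the same scale on a 640x800 eye buffer, the distortion center visibly stretches, which matches the discomfort I was seeing.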
In any case, the discomfort with far objects is because their left/right images end up farther apart on screen than the eye distance, forcing the eyes wall-eyed (divergent). I have to investigate further.
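The two ideas above (shift the cameras by half the IPD, and the divergence limit) can be sketched like this; the sign convention depends on your matrix setup, and all names here are mine:

```python
def eye_view_offset(ipd_m, left_eye):
    """Half-IPD horizontal translation applied to each eye's view matrix
    (shift the camera, not the image). Sign convention is illustrative
    and depends on your handedness/matrix order."""
    return (ipd_m / 2.0) if left_eye else (-ipd_m / 2.0)

def is_wall_eyed(on_screen_separation_m, ipd_m):
    """True when a point's left/right images sit farther apart on the
    physical screen than the viewer's IPD: the eyes must then diverge,
    which is the far-object discomfort described above."""
    return on_screen_separation_m > ipd_m

# With my 54mm IPD, a 70mm on-screen separation forces divergence:
offsets = (eye_view_offset(0.054, True), eye_view_offset(0.054, False))
```

This is why distant geometry should converge toward exactly IPD separation on screen (parallel gaze), never beyond it.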
I’m also trying to make a camera with the right viewing parameters without using the projection matrix from the stereo util. So far: major headache-inducing problems. MMMRiftChopTest.22.toe (11.6 KB)
I have access to a Rift and have installed the 088 free version on the attached machine (at my local hackerspace). What do I need to try out what you have so far?
Will this work in the free version, or do I need to install a commercial license?
… or should I wait until the shader is a bit happier?
Has anyone got some shadows and other lighting tricks working with the Rift?
I am working on a large imported model that could use a nice VR viewer. It’s a map of a very complex cave system. I can’t share it until the caver dudes OK it, but it is very exciting. Can’t wait to see the 3D print from the same model.
+1 on the spelunking
Does anybody know the current turnaround on Rift orders?
Since it is still a beta, when are they promising 1080p, or is that already available?