Hi, it would be nice to have an equirectangular 360 render mode (renderTOP.par.rendermode) that is actually compatible with the Environment Light COMP. This would allow for realtime light probes in the scene, and this kind of map would also be compatible with the majority of HDRIs.
I know there is a cube map to equirectangular projection, but there are two problems with it: a cube map requires a much bigger texture (half of it is unused), and it requires additional computation.
Hi @boros,
thank you for the suggestion.
As you say, you only use half of the cubemap - would a potential equirectangular solution include horizontal and vertical view angle parameters?
cheers
Markus
Yes, horizontal being the longitude and vertical being the latitude. Ideally it would enable render TOP output to be used directly as a TOP input in Environment Light COMP. This would enable usage like Reflection Probes in Unity.
It would also work well with background sphere mapping - used for a static far horizon in 3D. This way we would have a nice workflow:
3d geometry → equirectangular render → 360 sphere background + environment light
+1 for equirectangular render mode & view angle pars.
There are some large problems with direct equirectangular rendering in a rasterizing renderer. One of them, I think, isn’t solvable without geometry shaders. We can’t use geometry shaders since macOS doesn’t support them, and presumably they’ll eventually be dropped by other GPUs in the future.
The first solvable issue is: Since you are taking straight lines (polygon edges) and drawing them onto a curved projection, you need to make sure your polygons are well tessellated, otherwise the ‘curving’ will look very broken. Fisheye projection has this same issue though.
The bigger issue is at the projection edges: when a triangle overlaps the seam between the right and left edges of the image (and top/bottom). A vertex shader doesn’t know where the other vertices of the triangle are, so the vertices on the right will be placed on the right, and the vertices on the left will be placed on the left. This means a triangle that should span from 350 degrees → 10 degrees (for example) instead ends up spanning 10 → 350, stretching across almost the entire viewport.
This is solvable using a geometry shader that detects the edge cases and emits two triangles, one at the left and one at the right of the viewport.
Maybe this will be possible when mesh shaders are generally usable.
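The seam problem described above is easy to demonstrate numerically. A minimal Python sketch (not TouchDesigner code - the projection convention here, with -Z forward and u = longitude, is an assumption for illustration):

```python
import math

def equirect_uv(x, y, z):
    """Map a 3D direction to equirectangular UVs (assumed convention:
    u = longitude over [0,1), v = latitude over [0,1], -Z forward)."""
    lon = math.atan2(x, -z)                          # [-pi, pi]
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))  # [-pi/2, pi/2]
    return (lon / math.pi + 1.0) * 0.5, lat / math.pi + 0.5

def dir_from_lon(deg):
    """Horizontal direction at a given longitude (degrees)."""
    return math.sin(math.radians(deg)), 0.0, -math.cos(math.radians(deg))

# An edge whose endpoints sit 5 degrees either side of the rear seam:
u_a, _ = equirect_uv(*dir_from_lon(175.0))
u_b, _ = equirect_uv(*dir_from_lon(-175.0))

# The edge really covers only 10/360 of the longitude range, but because
# each vertex is projected independently, the rasterizer interpolates
# across |u_a - u_b| of the image width instead.
print(u_a, u_b, abs(u_a - u_b))  # ~0.986, ~0.014, ~0.97
```

A geometry (or mesh) shader can catch this case because it sees all three vertices of the triangle at once, and can emit wrapped copies on both sides of the seam.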
Thanks for the explanation. I am aware that equirectangular maps require some additional math magic to render and use, although I don’t understand why. I noticed a long time ago that using animated high-res textures for environment light (with an equirectangular map) causes significant performance drops.
I can use realtime 256x128, maybe 512x256, but nothing near 4K, which seems like the minimum reasonable resolution for 360 panoramas and sharp reflections. The Performance Monitor shows some heavy lifting happening in the Environment Light and renderer, but I never knew why.
Regarding the lack of geometry shader support - aren’t POPs supposed to be a bit like vertex / geometry shaders in node form? Sorry if I misunderstand anything.
If you are trying to do lighting with an animated env map for the Environment Map, that will be very heavy because there is a pre-processing pass that needs to occur on that environment map before it can be used in a PBR render.
POPs can help with this in theory, but you would need to take your entire scene’s geometry, allocate enough memory for another full copy of it, and then put it through a pass to process the geometry. That would be very expensive in terms of memory, and unlikely to be a big win vs. just doing a cubemap, since there are lots of GPU tools specifically made to improve the performance of cube map rendering (and more coming).
Thanks for the thorough explanation… tbh it’s been puzzling me for years, since I started doing full-geometry VJing. With your explanation in mind, I’ve done some tests and used prefiltering outside of the Environment Light COMP.
Turns out the prefilter TOP takes 30ms of GPU time on a 3090 to process a single full HD frame, unlike any other TOP I know of. Is there any way to make it realtime-ready besides serious downsampling (even as low as 256x128 takes 4.5ms)? Is it because it was never intended to be realtime and high-res? A similar technique in Unity 3D does take quite a bit of preprocessing too.
The test also showed me the importance of prefiltering for PBR shading quality. Without prefiltering, reflections look good only at the lowest roughness (high, artificial specularity).
I understand that prefiltering must add some sort of serious overhead, like mipmaps or other image stacks, but still… maybe we could have PBR reflections done an entirely different way - like native SSR or raytracing? Reflection maps are becoming a thing of the past anyway, but it would be nice to have a replacement in Touch.
Environment lighting involves sending 1000s of rays out from each point of interest into the world to see how the light out there is hitting it (for less reflective surfaces). Prefiltering is required to greatly reduce that calculation cost at render time. It’s the standard workflow for PBR lighting.
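The cost prefiltering avoids can be sketched as a brute-force Monte Carlo estimate of diffuse irradiance - many cosine-weighted samples of the environment per shaded point, which a prefiltered irradiance map collapses into a single texture fetch. A minimal Python illustration (the sample count and the toy environment function are placeholders, not anything from the renderer):

```python
import math, random

def sample_env(d):
    """Placeholder environment map: brightness = max(0, y), a bright 'sky'."""
    return max(0.0, d[1])

def irradiance(normal, n_samples=1024):
    """Brute-force Monte Carlo diffuse irradiance at one surface point:
    the per-pixel cost that a prefiltered irradiance map bakes away."""
    total = 0.0
    accepted = 0
    while accepted < n_samples:
        # uniform direction on the sphere via rejection sampling
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        r2 = d[0]**2 + d[1]**2 + d[2]**2
        if r2 == 0.0 or r2 > 1.0:
            continue
        r = math.sqrt(r2)
        d = (d[0] / r, d[1] / r, d[2] / r)
        cos_theta = sum(n * c for n, c in zip(normal, d))
        if cos_theta > 0.0:
            total += sample_env(d) * cos_theta
        accepted += 1
    # pdf of a uniform sphere direction is 1/(4*pi)
    return (total / n_samples) * 4.0 * math.pi

random.seed(0)
E = irradiance((0.0, 1.0, 0.0))
# for this toy environment the analytic answer is 2*pi/3, about 2.09
print(E)
```

Running something like this per pixel, per frame, for an animated environment map is exactly the cost the prefilter pass front-loads.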
Certainly there are other things such as raytracing, which has its own costs. Currently we do not have raytracing support in TouchDesigner.