I recently did a project in a 70-foot planetarium (more on that soon) and wrote some code to handle fulldome-style fisheye rendering. I just finished pulling it out and wrapping it into a block. The README goes into detail about how to use it, so I won’t repeat it here, but check it out on GitHub. The short version is that it uses four cameras, and four render passes, to render a circular fisheye image, which most professional dome systems accept as input.
At the installation, I had it rendering 4K × 4K at 60 fps on a PC with a Quadro M6000. It was super fun and rewarding to work that large and immersively, so if you ever get the chance, jump at it! Hopefully this code helps.
Hello, thanks for the great block. But I have an issue and I don’t know how to fix it. When an object moves and crosses the boundary between two or more cameras, the lighting seems to change: one area is always darker or lighter than the other.
I believe this happens when you compute the lighting in camera space; you instead need to tweak the shader to work in world space. The block works by actually rendering 4 different cameras and then combining the results. If you’re calculating lighting and reflections relative to each camera’s position/rotation, the cameras will always disagree slightly where they meet.
It’s been a minute since I’ve touched this code, but I believe the shaders in the sample repo should sort it out! https://github.com/cwhitney/sharkbox-FullDome/tree/master/samples/BasicSample/assets/shaders
Thanks for such a quick answer. It looks like I don’t understand the basics of GLSL, though: when I run the example project with the teapots and move the camera, I see the exact same effect.