360 VR Video Player for iOS in Cinder?

I need to play a 360 video on an iOS device in a VR mode: changing the orientation of the iOS device should change the viewport.

I imagine I should

  1. project a 360 video asset onto a sphere,
  2. place a camera in the middle of it and then
  3. rotate it according to the device orientation.

Since I have never worked with 360 videos, I am not sure how to achieve step 1.


Do you already have the videos in equirectangular format?

If so, literally just bind the texture and draw a sphere and you’re done. If not, you’ll need to either distort your textures in realtime, which is probably ill-advised on a mobile device, or generate an equirectangular video.
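In Cinder, that can be as little as the following (an untested sketch: the asset name is made up, and it assumes qtime::MovieGl is available for your target iOS version):

```cpp
// Sketch: play an equirectangular video on the inside of a sphere.
// The asset name is hypothetical; error handling is omitted for brevity.
#include "cinder/app/App.h"
#include "cinder/gl/gl.h"
#include "cinder/qtime/QuickTimeGl.h"

using namespace ci;

class Vr360App : public app::App {
  public:
    void setup() override
    {
        mMovie = qtime::MovieGl::create( app::loadAsset( "equirect.mp4" ) );
        mMovie->setLoop();
        mMovie->play();

        // geom::Sphere already provides equirectangular-friendly UVs, so the
        // stock texture shader is enough -- no custom shader required.
        mSphere = gl::Batch::create( geom::Sphere().radius( 100 ).subdivisions( 64 ),
                                     gl::getStockShader( gl::ShaderDef().texture() ) );
        mCam.setPerspective( 60, getWindowAspectRatio(), 1, 1000 );
        mCam.lookAt( vec3( 0 ), vec3( 0, 0, -1 ) ); // camera at the sphere's center
    }

    void draw() override
    {
        gl::clear();
        gl::setMatrices( mCam );
        if( gl::TextureRef tex = mMovie->getTexture() ) {
            gl::ScopedTextureBind texBind( tex );
            gl::ScopedFaceCulling cull( true, GL_FRONT ); // we are inside the sphere
            mSphere->draw();
        }
    }

    qtime::MovieGlRef mMovie;
    gl::BatchRef      mSphere;
    CameraPersp       mCam;
};

CINDER_APP( Vr360App, app::RendererGl )
```

Viewed from the inside the footage may come out mirrored; if it does, flipping the texture (or scaling the sphere by -1 on one axis) sorts it out.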

To do this, I’ve used cmft with pretty good results, though only ever with stills. I suppose you could dump your video to individual frames, pass each one to cmft (exporting in latlong format, to use their parlance), and then recombine the processed frames into a video, as sketched below. You’ll want your results to come out looking something like this
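In script form that pipeline might look roughly like the following (untested; the file names are placeholders, and cmft’s flags are from memory, so double-check them against its README):

```sh
# Hypothetical offline conversion: cubemap video -> equirectangular video.
# 1. Dump the source video to individual frames with ffmpeg.
mkdir -p frames latlong
ffmpeg -i cubemap.mp4 frames/%05d.tga

# 2. Convert each frame with cmft, exporting in its latlong layout.
for f in frames/*.tga; do
  cmft --input "$f" --filter none \
       --outputNum 1 --output0 "latlong/$(basename "$f" .tga)" \
       --output0params tga,bgra8,latlong
done

# 3. Recombine the processed frames into a video.
ffmpeg -framerate 30 -i latlong/%05d.tga -c:v libx264 -pix_fmt yuv420p equirect.mp4
```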

Edit: Here’s a quick gist using a rip of the above video to help you get started.


@lithium Wow! Your answer is amazing!

No, I don’t have a 360 video yet: I was going to generate one using C4D for simple scenes (like falling rain with an alpha channel, to be mixed with a camera video feed) and later get a physical 360 camera.

Edit: And this is the result of using MotionManager to drive the _camera.
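For anyone following along, the relevant bits look roughly like this (a sketch rather than the exact code behind that result, reusing the Vr360App class from the sketch above; remember to declare update() in the class):

```cpp
// Sketch: drive the camera orientation from the device's motion sensors
// with Cinder's MotionManager.
#include "cinder/MotionManager.h"

void Vr360App::setup()
{
    // ... movie, sphere and camera setup as above ...
    MotionManager::enable( 60.0f ); // sample the motion sensors at 60 Hz
}

void Vr360App::update()
{
    if( MotionManager::isDataAvailable() ) {
        // The device attitude as a quaternion, applied to the camera sitting
        // at the sphere's center. You may need to compensate for the current
        // interface orientation to keep the horizon level.
        mCam.setOrientation( MotionManager::getRotation() );
    }
}
```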

Aside from cubemaps, another really popular 360 video option is dual fisheye. If you’re generating the images yourself, a cubemap is probably the way to go, but if you’re using something like a Ricoh Theta or an Insta360 you can just write fragment shaders to convert it live. As is almost always the case, Paul Bourke has written about it: http://paulbourke.net/dome/dualfish2sphere/
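The core of such a shader, following Paul Bourke’s derivation, might look something like this (a sketch assuming a side-by-side layout and an ideal equidistant lens; the uniform names are made up, and a real camera needs per-lens centering and rotation corrections on top):

```glsl
#version 300 es
precision highp float;

// Sketch of a live dual-fisheye lookup, after Paul Bourke's dualfish2sphere
// notes. Assumes a 2:1 side-by-side frame (front lens on the left, back lens
// on the right) and an ideal equidistant fisheye projection.
uniform sampler2D uTex;      // the dual fisheye video frame
uniform float     uAperture; // lens aperture in radians, e.g. ~3.19 for 183 deg

in  vec3 vNormal;            // interpolated sphere normal = viewing direction
out vec4 oColor;

void main()
{
    vec3 dir = normalize( vNormal );
    bool front = dir.z >= 0.0;

    // Angle away from the lens axis, and azimuth around it.
    float theta = acos( front ? dir.z : -dir.z );
    float phi   = atan( dir.y, front ? dir.x : -dir.x );

    // Equidistant fisheye: image radius grows linearly with theta.
    float r = theta / ( 0.5 * uAperture );

    // Each fisheye circle is inscribed in one half of the 2:1 frame.
    vec2 center = front ? vec2( 0.25, 0.5 ) : vec2( 0.75, 0.5 );
    vec2 uv = center + vec2( 0.25 * r * cos( phi ), 0.5 * r * sin( phi ) );

    oColor = texture( uTex, uv );
}
```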

@sharkbox Thanks, that is a great approach too. Now if only Apple allowed us to use both front and back cameras simultaneously: add two fisheye lenses, and voilà, a realtime 360 camera on an iPhone!


You can use the sphere normal directly to map to an equirectangular texture coordinate. (Trick that Paul taught me.)

You can ignore the stereo-handling lines below, since they are only needed for stereo layouts (either top-bottom or bottom-top):
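A minimal mono version of that fragment shader might look like this (a sketch; the uniform and varying names are assumptions):

```glsl
#version 300 es
precision highp float;

// Sketch: sample an equirectangular texture straight from the sphere normal,
// so the texture coordinates are computed per fragment.
uniform sampler2D uTex;

in  vec3 vNormal;
out vec4 oColor;

const float PI = 3.14159265;

void main()
{
    vec3 n = normalize( vNormal );

    // Longitude and latitude from the normal, remapped to [0,1] UVs.
    vec2 uv = vec2( 0.5 + atan( n.z, n.x ) / ( 2.0 * PI ),
                    0.5 + asin( n.y ) / PI );

    // Stereo-handling lines (ignore for mono): a top-bottom layout would
    // remap v into one half of the texture per eye, e.g.
    //   uv.y = 0.5 * uv.y + ( isLeftEye ? 0.5 : 0.0 );

    oColor = texture( uTex, uv );
}
```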

@num3ric Thanks!

I am not sure what the advantage is compared to @lithium’s approach, which also uses sphere geometry and an equirectangular texture, but no special shader.


The advantage is that texture coordinates are calculated per pixel, instead of per vertex, which may solve issues with texture seams (e.g. when using an IcoSphere instead of a normal Sphere). If you don’t have these issues, @lithium’s approach is probably the better option (slightly better performance), but @num3ric’s shader is a great solution if you do.

-Paul