I’m quite interested in trying some of the new ARKit features in iOS 11 and thought it could be fun to link them into Cinder.
I noticed @wdlindmeier 's Metal ARKit sample, which looks really interesting (and I hope to check it out in more detail soon). I was thinking, though, of scoping out a block that could wrap the SDK in some C++ and integrate it with Cinder’s OpenGL renderer instead of Metal. Possibly just starting with accessors for vectors of vec3 tracked points and potentially a camera transform matrix, etc.
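As a starting point, a minimal sketch of what such an interface might look like is below. All names here are placeholders, not the actual block's API, and plain structs stand in for Cinder's glm-backed `vec3`/`mat4` types so the sketch is self-contained:

```cpp
#include <array>
#include <vector>

// Minimal stand-ins for Cinder's glm-backed types, so this sketch compiles on its own.
struct Vec3 { float x, y, z; };
using Mat4 = std::array<float, 16>; // column-major 4x4, as ARKit supplies it

// Hypothetical wrapper; in practice this would be backed by an Objective-C++
// implementation that receives ARFrame updates from an ARSession delegate.
class ARSessionWrapper {
public:
    // Feature points tracked in world space for the current frame.
    const std::vector<Vec3>& getFeaturePoints() const { return mFeaturePoints; }
    // Camera view and projection matrices for rendering the virtual scene.
    const Mat4& getViewMatrix() const       { return mViewMatrix; }
    const Mat4& getProjectionMatrix() const { return mProjectionMatrix; }

private:
    // The Objective-C++ side would fill these in each frame, converting
    // ARKit's simd_float4x4 matrices into the C++ types above.
    std::vector<Vec3> mFeaturePoints;
    Mat4 mViewMatrix{};
    Mat4 mProjectionMatrix{};
};
```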
Is anyone already working on this? If so, I’d love to help out. If not, and anyone has tips on wrapping native iOS features in Cinder, I’d love to hear them, as I only have limited experience with this (I had to wrap some in-app purchases in JUCE once).
Let me know!
Yeah, I think fundamentally it could be the same approach as the Metal example. Rather than converting the CVPixelBuffer into a Metal texture, you could convert it into an OpenGL texture.
If you end up creating a proper interface for ARKit, I’d love to incorporate it into my Metal example.
I’m interested in this too! I was going to try to hack something together this weekend but might not have time; happy to contribute where I can if you make something.
I have experience with Objective-C++ and I’m happy to help.
Thanks wdlindmeier and reza!
Let’s do this. I’ve just started a Cinder-ARKit block here and already have a basic interface for accessing the real-world camera matrices, etc. I’m uncovering bits of the SDK as I go, so the general architecture will probably need to be rethought.
Very open to all suggestions / collaborations and contributions.
Thanks for your work on bringing ARKit to Cinder.
I just built the sample from github. I needed the following changes:
- the bundle identifier has to be unique; org.libcinder.BasicApp is already used by the Cinder BasicApp sample
- changed Architectures (Debug) from “armv7 arm64” to “Standard architectures”
- set my signing profile instead of yours
The app runs, although the camera image is rotated by 90 degrees, even in landscape-right orientation. Do I need to change anything else?
Edit: the CaptureBasic sample exhibits similar behavior: if I start in portrait mode the orientation works nicely, otherwise it is wrong. I was not able to start the Cinder-ARKit BasicApp sample in the right orientation at all.
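For reference, a quarter-turn rotation of the camera image can be done entirely in texture coordinates. A minimal sketch of the mapping follows; `rotateUv90` is a hypothetical helper, and whether this is the right direction (or needs mirroring) depends on the interface orientation the device reports:

```cpp
#include <utility>

// Rotate a normalized texture coordinate (u, v) in [0, 1]^2 by a quarter turn.
// The direction actually needed depends on the device/interface orientation,
// so some orientations may require the inverse mapping instead.
std::pair<float, float> rotateUv90( float u, float v )
{
    return { v, 1.0f - u };
}
```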
Rotating the camera texture as in the CaptureBasic sample works, but the aspect ratio is wrong. I think this is because I’m using a 10.5" iPad Pro, where the screen resolution is 2224x1668 while the camera resolution is 1280x720, so their aspect ratios don’t match. Scaling the texture coordinates by the aspect ratio in the shader fixes this.
Ah yes, I was only able to test on an iPhone 6s, but I should have thought of this. I’m about to push a feature branch with support for different orientations and various fixes like the ones you mentioned. Please feel free to fork and send pull requests; it would be great to flesh it out a bit.
Thanks, I’ll look into the new branch soon. I still have some issues with the anchor transformation and the camera orientation.