When Apple first launched ARKit – its developer framework for Augmented Reality – it was mind blowing. To think you could see amazing virtual objects as if they were part of the real world was a game changer.
With the third iteration of ARKit, we’re starting to see some seriously impressive new features.
Key among them is “motion capture” – the ability for your iOS device to see people in a scene and track their movement.
Bringing a virtual model to life is awesome; however, it’s the depth and movement tracking that is most exciting.
Apple is now able to see a person and determine their location within a virtual space – this allows the system to place virtual objects in front of or behind them.
This makes it look like you are walking “around” objects. Before this update, people would simply be covered up by the virtual items.
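For developers, this “people occlusion” effect is something ARKit exposes through a session configuration. As a rough sketch (assuming an app already running an ARKit world-tracking session on a supported A12-or-later device), enabling it might look like this:

```swift
import ARKit

// A minimal sketch: people occlusion is switched on via a
// world-tracking configuration's "frame semantics".
func makeOcclusionConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // Not all devices support depth-based person segmentation,
    // so check before opting in.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        // ARKit estimates how far away each person is, so virtual
        // objects can render in front of – or behind – them.
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    return configuration
}
```

With that configuration running, the rendering layer handles the rest: people detected by the camera automatically hide any virtual content that sits behind them.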
Our demo showed a lot of things: the shine and reflections on the ball are remarkable.
The two-person interaction with a virtual object – brilliant.
But notice the pins behind a person: the system knows exactly where they should sit and how they should appear – that’s amazing.
ARKit is part of a complete update to iOS and the coding tools associated with it – expect to see new AR experiences come out of the woodwork by the end of the year.
Trevor Long travelled to San Jose for WWDC as a guest of Apple.