Refer to this Apple Developer example to build a demo. I am still investigating the SDK in the hope that we can use its motion-tracking functionality to create something useful.
Overall Suggestions for Real-Time Tracking
- The tracked person must start out facing the camera front-on, or anchoring may fail.
- The person must be fully visible within the camera frame.
- The person should stay near the center of the camera frame.
- To track rapid motion such as a parkour jump from video-in-video, slow down the video playback (for parkour jumps, even 0.25x was not enough).
- Ensure the tracked person is the only subject in the scene.
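The setup that these suggestions assume can be sketched as a minimal body-tracking session, along the lines of Apple's body-capture sample. This is a sketch, not the sample's exact code; `arView` is a placeholder name:

```swift
import ARKit
import RealityKit

// Minimal body-tracking setup sketch. Anchoring is established from
// the first frames, which is why the subject should start front-facing
// and fully inside the camera frame.
func startBodyTracking(on arView: ARView) {
    // Body tracking is only supported on devices with an A12-class chip or newer.
    guard ARBodyTrackingConfiguration.isSupported else {
        fatalError("Body tracking is not supported on this device")
    }
    let configuration = ARBodyTrackingConfiguration()
    arView.session.run(configuration)
}
```

Once the session is running, body poses arrive as `ARBodyAnchor` updates through the session delegate.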
Real-Time Tracking Review
- Video-in-video input is acceptable: tracking a person on a playing video screen works just the same as tracking in the real surrounding environment.
- Head-down motion is not supported!
- Rapid movement causes severe latency (e.g., gravity-driven motions such as falls and jumps).
- Is it possible to initialize an ARView that accepts a video stream as input? If so, we could capture these rapid movements, generate slowed-down video playback, and feed it back to ARKit. That would make it possible to keep a motion recording.
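Whether ARKit can consume a video stream directly is still an open question; in the meantime, the slowed-playback half of the idea can be done with AVFoundation. A sketch, with a placeholder file path:

```swift
import AVFoundation

// Play a recorded clip at reduced speed so the tracker observes
// slower motion. The path below is a placeholder.
let url = URL(fileURLWithPath: "/path/to/recorded-clip.mov")
let player = AVPlayer(url: url)

// Setting a nonzero rate starts playback at that speed.
// Note from the tests above: even 0.25x was not enough for parkour jumps.
player.rate = 0.25
```

Here the slowed clip is simply played on a screen that the tracking device's camera then observes, which is the video-in-video setup already validated above.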
Motion tracking is currently only available in beta, so improvements are expected.
For stationary subjects, ARKit reaches sufficient accuracy. However, the latency becomes annoying with rapid movement. R-CNN-based models introduce uncertainty in how many bounding-box candidates they consider per frame, so if ARKit uses this kind of technology, per-frame tracking speed can vary.
It is indicated that the NPU on the A12X has very similar specs to that of the A12, which corresponds to roughly 36 fps running VGG-16. This may not be enough, since some intervals take surprisingly long to compute, but the performance cost must be inspected through future tests.
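A back-of-the-envelope check of that 36 fps figure, assuming a 60 fps camera feed (an assumption, not a measured value):

```swift
// If VGG-16-class inference runs at ~36 fps, each inference takes
// about 1000/36 ≈ 27.8 ms, while a 60 fps camera leaves a budget of
// only 1000/60 ≈ 16.7 ms per frame.
let inferenceMs = 1000.0 / 36.0   // ≈ 27.8 ms per inference
let frameBudgetMs = 1000.0 / 60.0 // ≈ 16.7 ms per camera frame

// Since 27.8 ms > 16.7 ms, such a network could not process every
// frame at 60 fps: frames would be dropped or latency would build up,
// which is consistent with the lag observed on rapid movement.
print(inferenceMs > frameBudgetMs) // true
```

At a 30 fps feed (33.3 ms budget) the same network would just keep up, so the actual capture frame rate matters for these tests.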