I’ve seen people using the iPad Pro for some cool AR stuff, where objects stay put as the camera moves. I know it uses LiDAR for that precision. But can an iPhone with LiDAR do the same? My main worry is the GPU power needed for all that 3D space calculation. The iPad Pro has that M series chip, while the iPhone has an A series chip. Is it enough for real-time AR rendering?
If you’re asking whether an iPhone can handle it: yes. ARKit does exactly this, and real-time AR on iOS works even without LiDAR.
ARKit and RealityKit do full world tracking (camera plus motion sensors), so AR objects stay anchored in place as you move. Tracking can drop out if you move too fast or the lighting is poor, but it’s pretty reliable.
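To give you a feel for how little setup it takes, here’s a minimal RealityKit sketch (class name and setup are just placeholders, not a full app) that runs world tracking and anchors a box to the first horizontal plane. No LiDAR required:

```swift
import UIKit
import RealityKit
import ARKit

// Minimal world-tracking sketch: anchor a small box to the first
// horizontal plane ARKit detects. The box stays put as the camera moves.
class ARViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Run our own configuration instead of ARView's automatic one.
        arView.automaticallyConfigureSession = false
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arView.session.run(config)

        // RealityKit keeps this anchor fixed in world space for us.
        let anchor = AnchorEntity(plane: .horizontal)
        let box = ModelEntity(
            mesh: .generateBox(size: 0.1),
            materials: [SimpleMaterial(color: .blue, isMetallic: false)]
        )
        anchor.addChild(box)
        arView.scene.addAnchor(anchor)
    }
}
```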
That’s good to know! So it just has some limitations then? Like if you restart the app, it can’t remember where things were?
Exactly! The world origin is wherever the device is when the session starts, so after a restart your old coordinates don’t line up anymore, which makes it tricky to place things back where they were.
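If you do need placements to survive a restart, ARKit can save the session’s ARWorldMap and feed it back in as the initial map on the next launch. Rough sketch below; the function names and file URL are just placeholders, error handling is trimmed, and relocalization only works if the environment still looks similar:

```swift
import ARKit

// Rough sketch: persist the session's world map so anchors can be
// restored after an app restart.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("No world map yet: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url, options: .atomic)
        }
    }
}

// On the next launch, hand the saved map back so ARKit can relocalize
// and bring the old anchors (and their coordinates) back.
func restoreWorldMap(on session: ARSession, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```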
Honestly, the A-series chips are quite powerful; I wouldn’t count them out for AR tasks. And on iPhones that have it, LiDAR helps a lot with depth, occlusion, and faster plane detection.
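On LiDAR-equipped iPhones you opt into the depth-based features explicitly, and the capability checks make the same code fall back gracefully on non-LiDAR devices. A small sketch of just the configuration part:

```swift
import ARKit

// Sketch: enable LiDAR-backed features only where the hardware supports them.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]

    // Scene reconstruction (a live mesh of the room) needs LiDAR.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Per-frame depth maps also need LiDAR.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    return config
}
```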