I’m trying to recreate this Japanese Pocari commercial where they use AR. I found this behind-the-scenes video that shows some details about it.
It seems like they photo-scanned the outdoor scene and brought it into Unity. They also used a Quest and mentioned needing to develop their own software to place objects as far as 120 m away. Honestly, I’m a bit lost on how they pulled everything together.
I know they have a whole team, so my project won’t be as elaborate, but I’m curious whether I can do something similar with the Oculus Quest. I’m thinking I could create the assets, somehow place them with the Quest, and record that. I’m just unsure which app or workflow would be best.
Let me know what you think, and thanks for taking the time to read this.
By ‘this scene,’ do you mean the environment? Like if I want to place 3D objects in my backyard, would I need to 3D scan my backyard, then put it in Blender, add the objects, and export it as AR? And thanks for the heads-up about the lens.
I see. I’m wondering if I can skip the photo-scan step and just place the objects directly with the Quest. If I want to place things more than 10 m away, maybe I do need to photo-scan the scene and bring it into Unity, but how does the AR come into play? I might be confusing myself. I think the grid was used to sync the photo scan in Unity with the real-life camera or headset.
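One way to picture that "grid sync" step: if you know where a couple of reference points (say, grid corners) sit both in the photoscan's coordinates and in the headset's tracking space, you can solve for the rotation and translation that maps scan coordinates onto tracking coordinates. Here's a minimal pure-Python sketch of that idea, assuming flat ground (2D, yaw only) and hypothetical coordinates; a real Unity setup would do this with transforms and anchors rather than this hand-rolled math:

```python
import math

def align_scan_to_tracking(scan_a, scan_b, track_a, track_b):
    """Solve for the yaw + translation that maps 2D scan-space points
    onto tracking-space points, given two matched reference points
    (e.g. two corners of a calibration grid). Flat ground is assumed,
    so the vertical axis is ignored -- a deliberate simplification."""
    # Heading of the reference segment in each coordinate space.
    yaw_scan = math.atan2(scan_b[1] - scan_a[1], scan_b[0] - scan_a[0])
    yaw_track = math.atan2(track_b[1] - track_a[1], track_b[0] - track_a[0])
    yaw = yaw_track - yaw_scan  # rotation needed to line the spaces up

    def transform(p):
        # Rotate the point about scan_a, then translate onto track_a.
        dx, dz = p[0] - scan_a[0], p[1] - scan_a[1]
        c, s = math.cos(yaw), math.sin(yaw)
        return (track_a[0] + c * dx - s * dz,
                track_a[1] + s * dx + c * dz)
    return transform

# Example: scan space is rotated 90 degrees relative to tracking space.
to_tracking = align_scan_to_tracking((0, 0), (1, 0), (5, 5), (5, 6))
```

With this mapping, any object placed in the scanned model can be dropped at the matching real-world spot, which is presumably what the grid in the behind-the-scenes video was for.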
For large distances, you might want to re-anchor the scene. Quest’s spatial tracking is pretty good, but over larger scenes there can be drift, and angular errors get worse the farther you are from the initial anchor point.
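To put numbers on that: a small angular error grows roughly linearly with distance, so at 120 m even half a degree of heading drift shifts a virtual object sideways by about a meter. A quick back-of-the-envelope check (the drift figure here is just an illustrative assumption, not a measured Quest spec):

```python
import math

def lateral_error(distance_m, drift_deg):
    """Approximate sideways offset of a virtual object caused by a
    heading (yaw) error at the tracking origin: offset = d * tan(drift)."""
    return distance_m * math.tan(math.radians(drift_deg))

print(lateral_error(120, 0.5))   # about a meter at 120 m
print(lateral_error(10, 0.5))    # centimeters at 10 m
```

That is why periodically re-anchoring close to where the viewer actually stands keeps distant objects from visibly sliding.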