Anyone else working on an AR project like that Pocari commercial?

Hey everyone,

I’m trying to recreate this Japanese Pocari commercial where they use AR. I found this behind-the-scenes video that shows some details about it.

It seems like they photo-scanned the outdoor scene and brought it into Unity. They also used a Quest and mentioned needing to develop their own software to place objects as far as 120 m away. Honestly, I’m a bit lost on how they pulled everything together.

I know they have a whole team, so my project won’t be as elaborate, but I’m curious if I can do something similar using the Oculus Quest. I’m thinking I could create the assets, somehow place them with the Quest, and record that. I’m just unsure about what app or workflow would be best to use.

Let me know what you think, and thanks for taking the time to read this.

If you can build this scene in Blender, you should be able to bring it into WebXR or Unity. It doesn’t seem like a lot of interaction is needed.
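For the WebXR route, here’s a minimal sketch of what that could look like, assuming three.js (not mentioned above, just a common choice) and a GLB exported from Blender; the filename and 5 m offset are placeholders:

```typescript
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { ARButton } from 'three/examples/jsm/webxr/ARButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.01, 200);
const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true;                                   // hand the render loop over to WebXR
document.body.appendChild(renderer.domElement);
document.body.appendChild(ARButton.createButton(renderer));   // adds an "Enter AR" button

scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1));  // simple sky/ground lighting

// "backyard.glb" is a placeholder for whatever you export from Blender.
new GLTFLoader().load('backyard.glb', (gltf) => {
  gltf.scene.position.set(0, 0, -5);                          // drop it 5 m in front of the start pose
  scene.add(gltf.scene);
});

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```

Open that page in the Quest browser and the ARButton starts an immersive-ar session with passthrough behind your model.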

Just be careful using the Quest outside, since the lenses can focus sunlight onto the displays and burn them very quickly.

By ‘this scene,’ do you mean the environment? Like if I want to place 3D objects in my backyard, would I need to 3D scan my backyard, then put it in Blender, add the objects, and export it as AR? And thanks for the heads-up about the lens.

I mean the digital scene, like what they show around 1:25 in the behind-the-scenes video.

I don’t think the photo scan is really necessary. They probably used it for scale reference and didn’t add it to the AR scene.

Great commercial. I shared it here as well https://www.reddit.com/r/augmentedreality/comments/1fuwzc1/new_film_for_pocari_ar_app_by_basculecojp/

Nice, that’s an awesome commercial and it totally inspired me to get into AR :wink:

I think the scene scans are just for the developers to place objects accurately. I can’t figure out what role the QR codes play in all this.

I see. I’m wondering if I can skip the photo scan and just place the objects directly with the Quest. If I want to place things more than 10 m away, maybe I do need to photo scan the area and bring it into Unity. But then how does the AR part come into play? I might be confusing myself. I think the grid was used to sync the photo scan in Unity with the real-life camera or headset.

For large distances, you might want to re-anchor the scene. The Quest’s tracking is pretty accurate, but in larger scenes drift and angular error accumulate, and they get worse the further you are from the initial anchor point.
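If you go the WebXR route, a rough sketch of the idea with the WebXR Anchors module (assuming the Quest browser exposes it; `refSpace`, `object3D`, and the 20 m offset are hypothetical placeholders) could look like this:

```typescript
// Pin a distant object to an anchor instead of the session origin, and re-read
// the anchor's pose every frame so local tracking corrections apply to it.

declare const refSpace: XRReferenceSpace;  // e.g. from session.requestReferenceSpace('local')
declare const object3D: { setPoseFromTransform(t: XRRigidTransform): void };  // hypothetical helper

let anchor: XRAnchor | undefined;

function onXRFrame(_time: number, frame: XRFrame): void {
  // Create the anchor once, roughly where the distant object should sit.
  if (!anchor && frame.createAnchor) {
    const offset = new XRRigidTransform({ x: 0, y: 0, z: -20 });  // ~20 m in front of the origin
    frame.createAnchor(offset, refSpace)?.then((a) => { anchor = a; });
  }

  // Each frame, snap the object to the anchor's latest pose, so drift is
  // corrected against the tracked anchor rather than accumulating from the origin.
  if (anchor) {
    const pose = frame.getPose(anchor.anchorSpace, refSpace);
    if (pose) object3D.setPoseFromTransform(pose.transform);
  }
}
```

Spreading several anchors across a big yard and parenting each object to the nearest one is one way to keep far objects from visibly swimming.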