I’m not entirely sure whether this problem counts as AR or VR, but I’m asking here because it involves seeing the actual world. I’d like to know if there’s some kind of AR setup where you can see the world as normal, but whenever you look down at your own body, it’s replaced with the body of an AR avatar model. I looked at some posts about AR clothes modeling, and I suspect doing that much real-time computation on a whole human body is currently too expensive. I also don’t know how you’d detect all the parts of the body in a feasible way. Forgive me if I’m off-base about any of this; I was just curious whether anything like this could work.
This seems very possible now. The Quest 3 and most modern headsets already do body tracking. If you’re looking for a more open-source solution for full-body tracking, you could use OpenPose (faster variants of it are available now). With the pose data, you can rig a 3D avatar or clothing model.
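To give a rough idea of the retargeting step, here's a minimal sketch: given two pose keypoints (the kind OpenPose emits per joint, as (x, y) image coordinates), compute the rotation you'd apply to the corresponding avatar bone. The keypoint values below are made up for illustration; a real pipeline would also filter on the per-joint confidence scores OpenPose returns.

```python
import math

def bone_angle_2d(parent, child):
    """Angle (radians) of the bone running from the parent joint to the
    child joint, measured from the +x axis. This is roughly what you'd
    feed into a rig's bone rotation when retargeting pose keypoints."""
    dx = child[0] - parent[0]
    dy = child[1] - parent[1]
    return math.atan2(dy, dx)

# Hypothetical keypoints in image coordinates (x, y) -- e.g. two joints
# pulled from one person's OpenPose output.
shoulder = (320.0, 200.0)
elbow = (320.0, 300.0)  # directly below the shoulder

angle = bone_angle_2d(shoulder, elbow)
print(math.degrees(angle))  # 90.0 -- arm hangs straight down (image y grows downward)
```

You'd run this per bone each frame, smooth the angles over time to hide detector jitter, and drive the avatar skeleton from the result.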
Like this: https://www.reddit.com/r/Spectacles/s/0EWBGDxgux. That was a quick, basic demo, but Lens Studio has pretty good cloth simulation that’s supported on Spectacles. So what you want is definitely possible today, with some limitations. Lens Studio also has face and body tracking built in, so you don’t have to worry about that part.