🎊 Final Outcome
GLB objects were exported from the retargeted MeshyAI character in MotionBuilder and are served on a web app that replicates the Instagram UI. The app displays interactable 3D objects as posts: imagine Instagram adding support for displaying 3D objects in a user's post. It can be accessed here.
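The app's own code isn't included in this write-up, but as a minimal sketch, one way to embed a GLB "post" on a web page is Google's `<model-viewer>` web component (the CDN path is the one documented on modelviewer.dev; the filename `post1.glb` is hypothetical):

```html
<!-- Load Google's <model-viewer> web component -->
<script type="module"
  src="https://ajax.googleapis.com/ajax/libs/model-viewer/3.5.0/model-viewer.min.js"></script>

<!-- One "post": an interactable 3D object in place of a photo.
     camera-controls lets visitors orbit/zoom the model, and
     autoplay plays the baked mocap animation if the GLB has one. -->
<model-viewer src="post1.glb" alt="Retargeted MeshyAI character"
  camera-controls autoplay></model-viewer>
```

Dropping one of these elements into each post card is enough to make a feed of 3D posts behave like the interaction described above.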
Full presentation on my Instagram Story Highlights <MoCap> @ekezia
📚 Learning Experience
- Mistakes, bugs, and improvements are mostly discovered during the session itself, since many problems only arise with physical movement. Multiple sessions are crucial to achieving the best mocap recording. Example: wearable props such as shoes make a noticeable difference in movement.
- The most optimal pipeline for this project's purpose is to livestream the body movement and facial expression directly to UE5.6 via Live Link and record it through Sequencer. Many problems arise when using prerecorded animation / mocap data, since MetaHuman bones have different prerequisites than other software's, so the livestreaming method is preferred to avoid this issue.
- There are several methods to retarget animation. These include:
(1) Motive -> Mixamo / MeshyAI in MotionBuilder (the method I'm using for the presentation today)
(+) Easy & guaranteed to work
(-) MeshyAI results are not the best for human characters.
(2) Prerecorded Motive data -> MetaHuman in Unreal Engine 5.6
Retargeting can be done through different routes: (a) IK Rig / IK Retargeting (very complicated), (b) Animation Sequence (the head does not move along with the body), (c) the retargeting option in the right-click menu on an Animation Sequence imported from the retargeted MotionBuilder animation (the method I'm going to use)
(+) High-quality human character from MetaHuman
(-) Complicated bone hierarchy; the head and body use separate skeletons
(3) Livestreaming from Motive -> UE5.6
(+) Most ideal pipeline, as it does not require reconfiguring the animation or character
(-) Didn't get to learn it in time before the mocap session
✈️ Future Plans & Improvement
📷 MoCap
1. Facial expression to be included in the recording.
2. More accurate camera work on different scenes according to the reference Instagram post.
3. Replace the skin texture with one that resembles each model's actual skin, for example by including the tattoos and scars that the models may have.
4. Clean up animation.
5. Repeat the same process for other models, using Rokoko AI for a remote workflow.
🌎 Web App
1. Allow public authentication so visitors can like and comment on the 3D model posts.
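As a sketch of the data side of that plan, here is a minimal, framework-free model of likes and comments on a 3D post. All names here are hypothetical assumptions; the real app would persist this server-side behind the public authentication mentioned above.

```javascript
// Minimal in-memory model of a 3D post with likes and comments.
// Hypothetical sketch: user ids would come from an auth provider,
// and state would live in a backend, not in memory.
function createPost(id, glbUrl) {
  return { id, glbUrl, likes: new Set(), comments: [] };
}

// Toggle a like for a given (authenticated) user id; returns the like count.
function toggleLike(post, userId) {
  if (post.likes.has(userId)) post.likes.delete(userId);
  else post.likes.add(userId);
  return post.likes.size;
}

// Append a comment from an authenticated user; returns the comment count.
function addComment(post, userId, text) {
  post.comments.push({ userId, text, at: Date.now() });
  return post.comments.length;
}

const post = createPost('mocap-01', '/models/mocap-01.glb');
toggleLike(post, 'user-a');                      // like
toggleLike(post, 'user-b');                      // like
toggleLike(post, 'user-a');                      // unlike
addComment(post, 'user-b', 'Love the motion!');
console.log(post.likes.size, post.comments.length); // 1 1
```

Using a `Set` for likes makes the like/unlike toggle idempotent per user, mirroring how Instagram's like button behaves.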
Thoughts on MoCap
Motion capture has definitely accentuated the behind-the-scenes of a static photograph. It reorients me to see a photograph beyond its 2D space and to imagine what motion was performed to accomplish that particular photograph. I probably won't focus on motion capture in my main body of work; however, this experience has added a "missing link" to the experience of viewing a photograph. That being said, I'm still going to finish this project, as I find it worthwhile to present the potential of displaying and offering interaction with movement beyond photography and videography.
Elizabeth Kezia Widjaja © 2025 🙂