This is my short pipeline for a real-time character in Unreal, driven by mocap from OptiTrack.
The OptiTrack plugin won't retarget, so my method is to make the 3D character the same size as the real performer.
To do this, use Motive to record the performer (say, Fred) doing a T-pose, an A-pose, and a range-of-motion take to test the character.
1. Export the T-pose take as an FBX and import it into MotionBuilder, then merge in your character mesh.
2. Rotate everything to face down the Z-axis and position it at 0,0.
Scale the mesh so the height of its shoulders matches the skeleton's.
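As a quick sanity check for the scaling in step 2, the uniform scale to apply to the mesh is just the ratio of the two shoulder heights (measured in the same units). The numbers below are made-up examples, not values from the post.

```python
def mesh_scale_factor(skeleton_shoulder_height, mesh_shoulder_height):
    """Uniform scale that brings the mesh's shoulders level with the performer skeleton's."""
    return skeleton_shoulder_height / mesh_shoulder_height

# e.g. skeleton shoulders at 145 cm, mesh shoulders at 160 cm:
print(mesh_scale_factor(145.0, 160.0))  # 0.90625
```

Apply that one factor on all three axes so the mesh isn't distorted; fine-tuning of wrists, knees, etc. happens in step 3.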
3. Save this FBX and open it in your favourite 3D program, e.g. 3ds Max.
Edit the mesh so the wrists, knees, waist etc. are in the correct positions, i.e. fit the mesh to the skeleton, not the skeleton to the mesh. Export the FBX.
4. Open the FBX in Akeytsu. Remember to zero out the mesh in Akeytsu using the Freeze button. Do all the skinning and anything else you want in Akeytsu. Export as FBX.
5. Open the FBX in MotionBuilder again, and in the Schematic view unparent the mesh from the skeleton. Save the FBX.
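The unparenting in step 5 can also be done in MotionBuilder's Python console instead of dragging in the Schematic view; the core operation is just clearing the mesh's Parent. This is a hedged sketch: the file path and mesh name are placeholders, and the `pyfbsdk` calls in the comments only run inside MotionBuilder.

```python
def unparent(model):
    """Detach a model from its parent (the Schematic-view drag, done in code)."""
    model.Parent = None
    return model

# Inside MotionBuilder this would be driven by pyfbsdk, roughly:
#   from pyfbsdk import FBApplication, FBFindModelByLabelName
#   app = FBApplication()
#   app.FileOpen(r"C:/mocap/fred_skinned.fbx")    # hypothetical path
#   mesh = FBFindModelByLabelName("Fred_Mesh")    # hypothetical mesh name
#   unparent(mesh)
#   app.FileSave(r"C:/mocap/fred_unparented.fbx")
```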
6. Import the mesh into Unreal, then open the mesh and, under Asset Details, look for Miscellaneous and tick the following:
– Convert Scene
– Force Front XAxis
– Convert Scene Unit
then reimport the mesh with these settings.
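Those three checkboxes can also be set in code via Unreal's editor Python API, which is handy if you reimport often. The property names below are the `FbxAssetImportData` editor properties corresponding to the checkboxes; the asset path in the comments is a placeholder, and `import unreal` only works inside the editor, so the helper here is written against anything exposing `set_editor_property`.

```python
# The three Miscellaneous checkboxes from step 6, as editor properties:
SETTINGS = {
    "convert_scene": True,       # Convert Scene
    "force_front_x_axis": True,  # Force Front XAxis
    "convert_scene_unit": True,  # Convert Scene Unit
}

def apply_settings(import_data, settings=SETTINGS):
    """Tick each checkbox on an import-data object via set_editor_property."""
    for name, value in settings.items():
        import_data.set_editor_property(name, value)

# Inside the Unreal editor this would look roughly like:
#   import unreal
#   mesh = unreal.EditorAssetLibrary.load_asset("/Game/Characters/Fred")  # hypothetical path
#   apply_settings(mesh.get_editor_property("asset_import_data"))
#   ...then trigger Reimport on the asset.
```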
7. Set up the character in Unreal as shown in the SET UP THE CHARACTER:: section of this post.
8. Now you can play the mocap takes (T-pose, A-pose, range of motion, etc.) in Motive and see the result in Unreal.
9. Jump back to Akeytsu to tweak the character's skinning: save out from Akeytsu, reopen in MotionBuilder to unparent the mesh, save, then hit Reimport in Unreal to update the character with the new skinning.