Here is a first video showing a live connection between Noitom's Perception Axis Player and Unreal Engine 4:
How it is done:
- UE4 opens a local TCP connection to the Axis Player and retrieves the raw ASCII BVH data
- Each tick, one motion line is parsed and stored locally
- The stored rotation and translation data is then recalculated to map between the different coordinate systems and rotation orders
- The animation graph reads this information and replaces the rotation and translation for each bone
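The per-tick parsing step could be sketched like this: a BVH motion line is just a row of space-separated channel values (typically three rotations per bone, plus a translation for the root). The function name and the minimal approach here are illustrative assumptions, not the actual plugin code.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Parse one ASCII BVH motion line (space-separated channel values)
// into a flat list of floats. The BVH HIERARCHY section defines which
// value belongs to which bone and channel.
std::vector<float> ParseMotionLine(const std::string& line)
{
    std::vector<float> values;
    std::istringstream stream(line);
    float value;
    while (stream >> value)
    {
        values.push_back(value);
    }
    return values;
}
```

The resulting flat array can then be sliced per bone according to the channel layout declared in the BVH header.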
Most of the code is written in C++, especially the connection handling and the data recalculation.
But there is also some Blueprint code to start the connection and to pump the animation data into the skeleton.
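The coordinate recalculation could look roughly like this. BVH data is conventionally right-handed and Y-up (X right, Y up, Z toward the viewer), while UE4 is left-handed and Z-up (X forward, Y right, Z up). The exact axis remap depends on the skeleton's rest pose and the Axis Player's export settings, so this mapping, the `Vec3` stand-in for UE4's `FVector`, and the function name are all assumptions for illustration:

```cpp
// Minimal vector type standing in for UE4's FVector.
struct Vec3 { float X, Y, Z; };

// One possible remap from a right-handed, Y-up BVH frame to UE4's
// left-handed, Z-up frame. The axis swap plus the single sign flip
// gives the mapping a determinant of -1, which is what converts a
// right-handed frame into a left-handed one.
Vec3 BvhToUnreal(const Vec3& bvh)
{
    Vec3 ue;
    ue.X = -bvh.Z; // BVH -Z (away from the viewer) becomes UE forward
    ue.Y =  bvh.X; // BVH right stays right
    ue.Z =  bvh.Y; // BVH up becomes UE up
    return ue;
}
```

Rotations need the equivalent treatment: the BVH Euler angles (in the rotation order declared per joint in the BVH header, often ZXY) have to be re-expressed around the remapped axes, with matching sign flips.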
Tools that I used:
- Unreal Engine 4.5
- Perception Axis Player 2.0 Beta
- Microsoft Visual Studio 2013
- Blender 2.72
- MakeHuman 1.0.2
I want to thank Noitom for providing the Perception Axis Player beta version and the BVH data to us backers.
If you missed last year's Kickstarter campaign for the Perception Neuron motion capture suit, you can preorder it here: