Week 05: Motion Capture Systems That Don’t Cost $60,000
Lab Documentation
Perception Neuron
This week, our first task is to try the Perception Neuron tracking system and live-stream its data into Unreal. The hardware is much more portable than the OptiTrack system, but it is fragile and needs frequent calibration.
We helped Pratik put on the straps, connect the wires, and insert the neuron sensor blocks. The instructions on the back of each strap show where each one goes.
We first connected the hub to the computer with a cable to set it up. To run wirelessly, we then connected it to a strong Wi-Fi network, unplugged the cable from the data port, and powered the hub with a portable battery.
The avatar showed up, but it looked very odd, with several disconnected body parts. We replaced the neuron blocks that appeared grey on screen.
To calibrate, the system asks the actor to hold four poses: a seated steady pose, an A pose, a T pose, and a bent S pose.
After calibration, the avatar looked much better and we could start live streaming.
We downloaded the plugin and example mesh from the Perception Neuron website and screen-recorded how to stream the data from the Axis Neuron software into Unreal 4.20. A nice thing about Axis Neuron is that we don’t have to manually retarget every bone: adding a bone-name prefix is enough to map the incoming skeleton onto the mesh.
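To make the prefix idea concrete, here is a minimal sketch (ours, not part of the plugin) of what that prefix mapping amounts to: given the standard Neuron/BVH bone names and a prefix used by the example mesh, it builds the bone-to-bone mapping we would otherwise type in by hand. The bone list is a subset and the prefix `Neuron_` is hypothetical; check the exported BVH and your mesh’s skeleton for the real names.

```python
# Sketch: prefix-based bone mapping, as used when retargeting Axis Neuron data.
# The bone list is a subset of the standard Neuron/BVH skeleton; the prefix is hypothetical.

NEURON_BONES = [
    "Hips", "Spine", "Spine1", "Spine2", "Spine3", "Neck", "Head",
    "RightShoulder", "RightArm", "RightForeArm", "RightHand",
    "LeftShoulder", "LeftArm", "LeftForeArm", "LeftHand",
    "RightUpLeg", "RightLeg", "RightFoot",
    "LeftUpLeg", "LeftLeg", "LeftFoot",
]

def build_bone_map(prefix):
    """Map each incoming Neuron bone name to the mesh bone '<prefix><name>'."""
    return {bone: prefix + bone for bone in NEURON_BONES}

if __name__ == "__main__":
    # With a prefix, one setting replaces twenty-plus manual retarget entries.
    for src, dst in build_bone_map("Neuron_").items():
        print(src, "->", dst)
```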
During the session, the avatar drifted and became more twisted than at the beginning, especially when Pratik sat down.
To summarize, this tracking system is fun, but not yet accurate enough for our use.
Using a rigid body as a camera in OptiTrack
Here is the screen recording of how to use a rigid body as the camera in OptiTrack and stream the data to Unreal 4.19:
This is how it looks in Unreal. The statue is the actor, and we used a tripod as the camera to shoot it.
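If you wire this up outside the screen recording, the main gotcha is the coordinate change between Motive and Unreal: Motive streams rigid-body positions in meters in a Y-up, right-handed frame, while Unreal works in centimeters in a Z-up, left-handed frame. The sketch below converts one tracked position; the particular axis mapping (and especially the sign flip) is our assumption, not something the OptiTrack plugin exposes this way, so verify it against your own scene before trusting the numbers.

```python
# Sketch: converting an OptiTrack (Motive) rigid-body position into Unreal's frame.
# ASSUMPTION: Motive default streaming settings (meters, Y-up, right-handed) and the
# axis mapping below (UE X = -Motive Z, UE Y = Motive X, UE Z = Motive Y).
# Check both against your OptiTrack/Unreal setup; plugins may map axes differently.

M_TO_CM = 100.0  # Unreal units are centimeters

def motive_to_unreal(x_m, y_m, z_m):
    """Convert a Motive position (meters, Y-up) to an Unreal position (cm, Z-up)."""
    ue_x = -z_m * M_TO_CM  # assumed forward axis
    ue_y = x_m * M_TO_CM   # assumed right axis
    ue_z = y_m * M_TO_CM   # up axis
    return (ue_x, ue_y, ue_z)

if __name__ == "__main__":
    # A rigid body 1.5 m above the floor and 2 m along Motive's -Z axis.
    print(motive_to_unreal(0.0, 1.5, -2.0))  # -> (200.0, 0.0, 150.0)
```

Rotations need the same kind of axis remapping (Motive streams quaternions), which is one reason it is usually easier to let the plugin handle the conversion, as in the recording.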