PBR #8: Expanding the Mixamo/Maya/Unreal animation pipeline to include editable Noitom mocap

I finished the first pass of the film the weekend of July 4th - eight weeks since I had the idea. That was the weekend lockdown was lifted, and I was able to present it on the big TV to my fiancée and visiting parents. I also sent it to a few choice contacts for feedback. The response was positive and I got some useful notes, too.

Several times in the weeks prior, I’d wished I had a mocap suit, so I could quickly record the actions I had in my head and try them out. Being able to test performances quickly is a very important step in the process - rough animation only gets you so far, and even that is too time-consuming at this stage.

But I actually had a mocap suit. Years ago, back when I wanted to make a feature-length Batman fan-film by myself (sigh), it was obvious to me that I wouldn’t be able to animate much of it, if any. There’d just be too much; it would need to be motion-captured. So I invested in Noitom’s Perception Neuron Kickstarter (I was also exploring FaceShift at the time, for facial mocap). By the time the campaign was over and the suit arrived, a year had passed, and I’d realised what folly the Batman project was. I plugged the suit in, found it difficult to calibrate, and, not having any immediate use for it, put it on the shelf.

With lockdown lifted, I was able to collect it from storage and give it a test run. After a bit of a stock take I realised it had a broken part (the left foot), but I was happy enough with the quality of the data I was getting, so I ordered a replacement for the foot. It arrived about a week later, by which time I’d settled on the changes I wanted to make to the film.

When you’ve gotta mocap characters taking gas masks off, you’d better get a gas mask. I don’t know if it works as PPE down the shops, mind.

It is possible to connect the suit straight into Unreal, which is a nice option in theory, but I realised I didn’t want to do that. For one thing, in the conversion from the Noitom rig to Mixamo, the finger captures were going horribly wrong, so I decided not to record the fingers - it’s not a big deal to animate those by hand. But that meant I would need to transfer my captured data onto my Mixamo rigs in Maya for editing anyway, then pass those to Unreal.

This was something I probably wanted to do anyway. As I said when I discussed vehicle capture - I’m not a motion-capture performer, I’m an animator. There is very little chance my unfit body can deliver the performance I want it to. Even if I had a motion-capture suit as good as the Xsens, I would always want to augment the performance in Maya to strengthen it. So, if I’m doing that anyway, I thought I may as well animate the hands while I’m at it.

If you’re not scribbling over everything are you really accomplishing anything?

I went through the first pass of the film, noting down all the bespoke motions that I thought would allow me to tell the story better. I needed a pipeline to keep everything in check, though. Here’s how I set it up.

First, I gave all the character rigs specific default settings for mocap. For mocap I always want IK hands and feet. If I were animating them by hand I’d want FK arms unless there was a specific reason not to, but with mocap that’s not the case. Say you just need to move the character’s hand slightly to the left - same performance, just relocated slightly. That’s a nightmare if it’s set up on FK arms, but a simple Anim Layer fix on IK.
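
To give a concrete idea of that Anim Layer fix, here’s a minimal sketch in Maya Python, assuming a baked IK hand controller; the control and layer names are made up for illustration:

```python
import maya.cmds as cmds

ctrl = "hand_IK_L_ctrl"  # hypothetical IK hand controller name

# Create an additive anim layer and add just this control to it
layer = cmds.animLayer("mocapHandOffset_L")
cmds.select(ctrl, replace=True)
cmds.animLayer(layer, edit=True, addSelectedObjects=True)

# Key a small sideways offset on the layer; the baked mocap keys on the
# base layer stay untouched underneath, so the performance is preserved.
cmds.setKeyframe(ctrl, attribute="translateX", value=-2.0, animLayer=layer)
```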

I also like to have the ability to move the whole character without moving the root node. Why? Well, let’s say I want to blend a mocap performance with a Mixamo anim, in Unreal. The mocap is first, and then it blends to a cycle. The cycle is at the character’s root, so when I’m processing the mocap I want to make sure that the mocap clip ends at the root - otherwise the blend won’t work. So, with the Mixamo rigs I set all the IK controllers, including pole vectors, to Follow Body and where necessary, Follow Hand/Foot. This means that moving the waist controller takes the whole animation with it. As a result, I can change the “origin” of the animation really easily.

Years ago I wrote a small script to help with this, which I’ve called MocOr (mocap orienter, although it only does translations, but whatevs) - it’s very simple, and just moves any animated object to its parent’s root on the current frame, by shifting the entire curve. This means you scrub to the frame of the mocap you want at the origin, and hit the shelf button. I’ve included it here for anyone who’d like to try it.
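
That script isn’t reproduced here, but a rough sketch of the same idea in Maya Python - shifting the selected controller’s keyed translate curves so the current frame lands at zero - might look something like this:

```python
import maya.cmds as cmds

def move_anim_to_origin():
    """Shift the selected controller's translate curves so that, on the
    current frame, the controller sits at zero in its parent space."""
    sel = cmds.ls(selection=True)
    if not sel:
        cmds.warning("Select an animated controller first.")
        return
    ctrl = sel[0]
    frame = cmds.currentTime(query=True)

    for axis in "XYZ":
        attr = "{}.translate{}".format(ctrl, axis)
        key_times = cmds.keyframe(attr, query=True, timeChange=True)
        if not key_times:
            continue  # no animation curve on this axis
        # Whatever the value is on the chosen frame becomes the offset to remove
        offset = cmds.getAttr(attr, time=frame)
        cmds.keyframe(attr, edit=True, time=(min(key_times), max(key_times)),
                      relative=True, valueChange=-offset)

move_anim_to_origin()
```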

The first version of this sequence used animation straight from Mixamo. Also note the abrupt vehicle turns, caused by keyboard input.

It was vastly improved with bespoke motion capture. Note that I pinned Duggy’s hands to the wheel in Maya also, using constrained IK arms.

Next, I set up an anim converter file for each character. This contains a HumanIK-defined Mixamo skeleton alongside the character rig, plus a HumanIK custom rig definition. When I drop any downloaded Mixamo anim into the scene, the rig picks up the animation through the retargeting and can be baked out for editing and re-export to Unreal - so Mixamo animations become editable, which is a good start.
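
The bake-out step can be scripted, too. Here’s a hedged sketch in Maya Python, assuming HumanIK is already retargeting onto the rig and that the controls follow a hypothetical *_ctrl naming convention:

```python
import maya.cmds as cmds

# Hypothetical control naming - adjust to whatever the rig actually uses
rig_controls = cmds.ls("*_ctrl", type="transform")

start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)

# Bake the retargeted motion down onto the rig controls, so the HIK
# source can be discarded and the keys edited directly.
cmds.bakeResults(
    rig_controls,
    time=(start, end),
    simulation=True,       # evaluate frame by frame through the solver
    sampleBy=1,
    preserveOutsideKeys=False,
)
```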

However, the Noitom motion files now needed to be accounted for, so I made a Noitom-to-Mixamo converter using HumanIK as well. This outputs Mixamo-type FBX files from my recorded motion capture, which can either be sent straight to Unreal or dropped onto a rig for editing.
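
The output end of that converter boils down to baking onto the Mixamo-named skeleton and exporting an FBX. A rough sketch, assuming the standard mixamorig:Hips joint naming, the fbxmaya plug-in loaded, and a made-up output path:

```python
import maya.cmds as cmds
import maya.mel as mel

# "mixamorig:Hips" is the standard Mixamo root joint name
root = "mixamorig:Hips"
joints = [root] + (cmds.listRelatives(root, allDescendents=True, type="joint") or [])

start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)
cmds.bakeResults(joints, time=(start, end), simulation=True)

# Export just the skeleton; the path below is a placeholder
cmds.select(joints, replace=True)
mel.eval('FBXExportBakeComplexAnimation -v true;')
mel.eval('FBXExport -f "D:/mocap/converted/clip_for_unreal.fbx" -s;')
```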

Here’s a diagram giving an overview of what’s going on:

[Diagram: pipeline overview]

For the record, during this session I think I broke the Perception Neuron suit, as it hasn’t worked since. So there’s that.

Thanks for reading! If you have questions, comments or advice (I’m new to this too!), leave ‘em here! Cheers.