2D Game Animations: Exporting FBX from Blender/NIMA vs. creating .pngs into flipbooks in engine

Hello everyone,

I am brand new to Unreal Engine (and game making in general), but I have been playing with Blueprints and Paper2D. I’ve been wanting to do something with my drawings and Photoshop creations and decided a 2D game would be a fun project to start building. I know how to set up a character with basic movements/attacks and how to create flipbooks from .PNG files, but I have a few questions (sorry if these are very elementary):

1: What is the difference between exporting FBX files from a program like Blender vs. creating “keyframe” snapshots as .PNG files and assembling a flipbook for a 2D game? I may just not fully understand what an .FBX file is/does.

2: Does anyone have experience with using NIMA (or Flare) with Unreal Engine 4? I believe it is a relatively new and underutilized program, so there isn’t much help online.

2a: If so, would you recommend NIMA over Blender? And if so, how does NIMA export to Unreal? Just as C++ code? Would I just copy that code and place it in Unreal somehow?

Thank you for any and all help.

.FBX is an Autodesk file format used mostly for 3D information.
It can certainly be used to move and animate a 2D plane as well, though as far as end results go, it’s not quite the same thing as a flipbook.
As for why Blender exports FBX: essentially, the Autodesk format has become the default interchange format supported by everything (because it’s easy to write, I believe).

I think that what program you choose depends vastly on what you are trying to achieve.
If you are going for that Street Fighter feel, you need sprites rather than a 3D model or a 3D program.

If you are going for that GGX (Guilty Gear X) feel, with nice, almost vector-like graphics, it may be better to utilize Blender. Maybe. I’m still partial to doing that manually in Illustrator; animating is just a bit harder that way, but you still have your separate parts, so I can see why people would put Blender to work on it.

Thanks a lot for your reply.

When you say a Street Fighter feel, do you mean a retro feel / more “pixel” art rather than vector or “cleaner” line art?

My workflow is basically hand-drawn sketches, which I bring into Illustrator and then ink (I haven’t really gotten the feel for drawing tablets yet; I prefer paper and a flatbed scanner). Then I color the converted file in Photoshop with textures, to then use in a rigging program like Blender or NIMA (I haven’t gotten to that part yet, and I’m not sure how the deformers will treat the textured look). Then I guess I would either export the FBX files or use .png key frames from the animations for the walk cycles/actions/idle/etc.

Would that work/is that efficient? Or am I lost somewhere along the way?

When you say the manual way in illustrator, do you mean drawing every key frame for the sprites in illustrator and exporting as pngs?

Thanks again!


Pixel art for the Street Fighter feel, vector art for the Guilty Gear X feel.

The way I work with vectors is that I have body parts I can move around just as if they were a skeleton. That’s actually the same way I work with pixel art, except that with pixel art you have to touch up a lot before the final frame is really final.

Surely in 2019 there are more effective ways to do this, but I do prefer the old method.

Unreal sort of supports SVG, btw, and a static mesh is essentially an SVG as well: a path of vectors.

In the end, I think the workflow you pick depends on how you work, or what you develop while working to get things done faster.
Perhaps someone with more modern 2D experience can also provide more tips… just keep bumping the thread periodically :)

Great, thank you!! I see what you’re saying about the vector art; I wanted to make sure I was on the same page. My project is aimed at more of a GGX feel, as I want my vector art to look more like Salt and Sanctuary or Hollow Knight than Contra or old-school Street Fighter. Nothing against those games or that style; I am just geared more toward clean lines from an art perspective and want my final product to look as close to my drawings as possible (if that makes sense).

So your old-school method is appealing to me (if I cannot figure out a 2D animation program with deformers to make my sprites). To be clear, you’re saying you move your vector layers around to achieve the desired poses, for example in a walk cycle: re-adjust your “Lower_Leg”, “Upper_Leg”, and “Foot” layers for the contact, down, passing, and up poses, etc.?

As for the .svg files, you would export each of these poses/“key frames” as .svg files (instead of .png files?) and then create the flipbooks and animations in engine with the .svg files?

Thanks again, man, this has been really helpful! :)

I’m sure you can find a plugin to support SVG in flipbooks, but realistically you can have the SVG file itself animate the walk cycle via JavaScript.
The theory is simple: you set your poses up as you would in any cycle, and you use JS to transition each point between one frame and the next. It’s a bit intensive for JS (I wouldn’t recommend making a game that runs in your browser, but it’s possible).
Think of SVG as web-ready vector art that you can also use as a backup to save on file size (because it’s a few lines of text rather than an endless stream of bytes you can’t read).
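That point-transition idea can be sketched in a few lines of plain JavaScript. This is a hypothetical two-pose example — the point arrays, proportions, and frame fraction are all made up, not from any real asset:

```javascript
// Lerp between two poses, where each pose is a list of [x, y] points
// describing the same SVG polygon/path in the same order.
function lerpPose(poseA, poseB, t) {
  return poseA.map(([x, y], i) => {
    const [x2, y2] = poseB[i];
    return [x + (x2 - x) * t, y + (y2 - y) * t];
  });
}

// Turn a point list into the "points" attribute string of an SVG <polygon>.
function toPointsAttr(points) {
  return points.map(([x, y]) => `${x},${y}`).join(" ");
}

// Two made-up key poses for a lower-leg shape.
const poseA = [[0, 0], [10, 0], [10, 40]];
const poseB = [[0, 0], [10, 0], [20, 35]];

// Halfway between the two poses:
const mid = lerpPose(poseA, poseB, 0.5);

// In a browser you would assign this every animation frame, e.g.:
//   polygonEl.setAttribute("points", toPointsAttr(mid));
```

In a page you would drive `t` from `requestAnimationFrame`, looping it over the cycle; the same math is what any tweening system does under the hood.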

All that said, if you think about it, the Persona/animation system would already have all the required math and functions to make this happen if it were just re-coded and adapted to vector art… that’s probably why it is used a lot and why you make Blender rigs. It’s a quick cheat to tie into an existing implementation (and it lets you use the common animation nodes without having to touch anything in engine).

And yes, the old-school method is just that. You split your character up into parts, and create a few extras too in case you need, for instance, the toes to articulate. You actually parent them in a hierarchy very similar to how you would create a skeleton:
Pelvis > upper leg > knee > lower leg > ankle > foot > toes.
Torso > shoulder > upper arm > elbow > lower arm > wrist > hand > thumb & fingers.
Head > eyes, nose, mouth, ears.
Etc.
When you move things around, the “in between” parts usually get adjusted to fit (shoulders, knees, elbows, ankles, wrists).
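The parenting described above is the same transform composition a skeleton uses. Here is a minimal sketch in JavaScript (keeping one language with the SVG idea earlier in the thread) — the bone names, offsets, and angles are invented for illustration, not tied to any engine:

```javascript
// A tiny 2D "bone": a local offset from its parent plus a local rotation.
// The world position/rotation is found by walking up the parent chain --
// the same way parented layers (Pelvis > upper leg > ... > foot) behave.
function worldTransform(bone) {
  if (!bone.parent) {
    return { x: bone.x, y: bone.y, rot: bone.rot };
  }
  const p = worldTransform(bone.parent);
  const cos = Math.cos(p.rot), sin = Math.sin(p.rot);
  return {
    // Rotate the local offset by the parent's world rotation, then translate.
    x: p.x + bone.x * cos - bone.y * sin,
    y: p.y + bone.x * sin + bone.y * cos,
    rot: p.rot + bone.rot,
  };
}

// Made-up proportions: pelvis at the origin, leg segments hanging below it.
const pelvis   = { x: 0, y: 0,  rot: 0, parent: null };
const upperLeg = { x: 0, y: 20, rot: 0, parent: pelvis };
const lowerLeg = { x: 0, y: 18, rot: 0, parent: upperLeg };
const foot     = { x: 6, y: 4,  rot: 0, parent: lowerLeg };

// Rotating the upper leg swings everything parented below it,
// exactly like grabbing and rotating a layer group to pose a frame.
upperLeg.rot = Math.PI / 2; // 90 degrees
const footWorld = worldTransform(foot);
```

Posing a key frame is then just setting a handful of `rot` values and reading out where every part landed — which is also why a Blender rig maps onto this workflow so naturally.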
After assembly you would export to PNG with transparency at whatever target size you needed, on a consistent, aligned canvas, so that you wouldn’t have the images flickering around between frames.
You can imagine this as taking the files over to Photoshop and publishing an animated GIF of the cycle. The process was similar in pretty much anything I used at the time.

The alternative for 2D at the time was Flash. You would just move the vector art into Flash and use it to animate the cycles and parts directly. Realistically, a whole different sort of story.

As far as modern pipelines go (specific to UE4), I would really look into using straight-up SVG, to the point that I would consider rewriting the whole thing using the Persona system as the base to manipulate groups of SVG points in 2D space.
The engine is really good at moving vectors and incredibly fast on those calcs. But it doesn’t (AFAIK, mind you) support straight-up SVG usage without workarounds…