Why does Unreal display bones the way it does? & rant

Yet another strange design choice making my job as difficult and confusing as possible.

Unreal Engine seems to not actually display bones, but rather the connections between them. Why? What purpose does this serve? Do they realize it can be incredibly annoying when trying to diagnose animation problems (which are caused by their own tools)?

The highly touted, far-famed Epic Games™ IK Retargeter does not retarget IK bones; it only uses the IK Rig asset to (somewhat poorly) approximate the transforms of two sets of bones for each frame of the animation.

But look! You can map this cartoon Ogre to a twerking robot! How cool is that?

– I don’t believe this, I think their examples are all specifically crafted to showcase features that actually don’t work for the majority of use cases. I have been trying to work with a non-Mannequin humanoid skeleton for months, and it is tremendously difficult. Their tools do not help at all, in fact they make it worse. I would almost be better off authoring my own animations completely from scratch.

It seems like yes, things will generally work for you, provided you are making a game about Manny the Mannequin and his friend Quin hopping about in an untextured area with bouncing balls and so on. The moment you deviate from Manny and friends, it all breaks down, and you are struggling forward in an absolute vacuum of knowledge.

Since the legendary Epic Games™ IK Retargeter does not retarget IK bones, I have to export most animations to 3rd-party software and adjust them by hand. It turns out that patching up the Epic Games™ retargeter’s output with a specialized control rig and ABP only makes the IK placement look corrected; when I examine the skeletons in 3rd-party software, they are not actually fixed.

Anyway I am complaining. It seems Epic Games™ Unreal Engine is determined to frustrate the user in every possible way.

Here’s a picture comparison:


Notice the thigh bone (selected in both pictures). The Unreal Engine-depicted thigh bone is nonexistent; it is just empty space between the actual bone and the next. Why do they show it this way? Is there a setting to turn it off? I have been ignoring this problem for months, but I’ve had just about enough.

P.S. No, I will NOT use Metahuman
It seems like the intention of Epic Games™ and Unreal Engine is to push stupidly high-end, luxury tech-demo features while ignoring fundamental ones. For example, they’ve invested god-knows-how-much funding into a web-based character generator that produces the micro-hairs on human skin and the blood vessels in eyeballs, and fully procedural animation for the LYRA tech-demo project, but god help you if you want basic tools for working with animated characters.

Oh, and the hierarchy viewer is still busted, and has been for years.

Pretty simple as to why.
Why would you put needless data such as bone length into something that, in order to work, needs as little data as possible?

You wouldn’t. So there is no way to draw the same bone as the completely arbitrary display you get in Blender.

From reading the rest of your rant, wholeheartedly: go back to learning.

You obviously have some lacunae to be filled, which have little to do with the engine and everything to do with rigging/animation.
For one, if you knew anything about anything you’d never even attempt to correct things in engine…

Ps:
For one, turn on parenting lines in Blender.
The Epic bones will match the dotted lines.
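
Here’s a toy illustration of that point (made-up joint positions, not engine data): what the UE skeleton view draws is one line segment per parent/child joint pair, which is exactly what Blender’s dotted parenting lines show. There is no bone length or tail stored, so there is nothing else for it to draw.

```python
# Made-up joint positions for a tiny hierarchy (assumption: nothing here
# comes from the engine; it just mimics what the viewport draws).
joints = {
    "pelvis":  (0.0, 0.0, 90.0),
    "thigh_l": (0.0, 10.0, 88.0),
    "calf_l":  (0.0, 12.0, 45.0),
}
parent = {"thigh_l": "pelvis", "calf_l": "thigh_l"}

def display_segments(joints, parent):
    """One segment per joint that has a parent: parent position -> joint position."""
    return [(joints[p], joints[c]) for c, p in parent.items()]

# Each segment is the "bone" you see in UE: the gap between two joints.
segments = display_segments(joints, parent)
```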

“Go back to learning”

What do you think I have been doing? It’s not like there is a comprehensive source of knowledge on any of this. I have looked. Everything I know is gleaned from trial and error, a forum post here or there. I ask rational questions and do not get a response. I am completely on my own, trying to figure all this out. I never claimed to be an expert animator.

What I have found, is there is a severe knowledge gap with game development in general. You have your Youtube tutorials, like & subscribe, your official documentation. That’s just enough to get you started. But where is the further discussion? I have not found it. So I am left to figure it out by myself.

Then, apparently, there are experts such as yourself, to whom everything is obvious and well-known, descending from above to condescend with your secret knowledge.

And of course the Amateur Game-Developer Industrial Complex, the various Marketplaces for each game engine.

I am learning. For you to insinuate that I am not, telling me to “go back”, i.e. stop what I am doing and just leave it to the ‘experts’, is insulting, and I refuse. The complaints I have are valid. Your explanation of why the bones are displayed as relationships instead of spatial positions does not make sense. But, I appreciate you giving me the scrap of knowledge that parenting lines in Blender match the Unreal display. So thank you for that, I am exceedingly grateful.

For example, I would like you to explain why the animation sequence editor is perfect and does not need to be changed. Go ahead, try deleting the last frame of an animation this way. You have to go back several frames, right? And it will allow you to delete frames you have already deleted? Yes. I am sure there is an amazing reason beyond my pitiful comprehension for this.
Of course, it is better to cut frames and edit animations in 3rd-party software. I am aware. Also, I apologize for using Blender; I know that makes me a subhuman for not using Maya or whatever the experts use. I suppose it is my fault for not completely understanding the labyrinthine import-export process, which seems to randomly alter the transform of the object (which I have figured out how to deal with, by the way).

Unreal Engine is not bad, it is quite good. It does have problems, though.

EPIC figured out that t!ts sell and would rather milk the Fortnite cow with twerking terminators than focus on developer UX. Add shiny!

:joy: Just about every common class, like Characters, the movement component, controllers, and widgets, is like that: they appear modular but are overcooked molten spaghetti with 6 types of cheese on top just in case you want that, except you can’t remove them or use them as modular features.

Compared to other things I’ve seen (a lot), this for example:

[UE5.1] Sequencer curve editor not showing curves. Awful accessibility on widget animation creation.

The skeletal animation system is honestly surprisingly good. It suffers from UX hell, yes, but overall it’s powerful. Just don’t use UE to create animations; use tools like Blender. Use UE to blend, write logic, IK, nothing else. Move logic to C++ if you can… For reference, the ALS anim BP bloats the repo 30 MB with each change, because UAssets are a shitshow. ALS is a good reference on how to animate; there is a C++ repo on GitHub.

Can agree with this because you should always fix things at the source file, not post after import.

Because who the f uses it if not noobs? Everyone else who has a job already knows that you cannot do these things in engine.
The priority for Epic to fix whatever the issue is, when the whole engine has been trash for about 2 years (and they are painfully aware of it), is next to null, if not already null.

Sounds like you just want to be spoon-fed. In that case, coding in general and game development in particular is probably not for you.

There is none.
Do or do not, there is no try.

And probably, after you do, go learn something else and after, do it all again. In about 20 or so cycles the end result may approximate something worth writing home about.

They really aren’t.
The bones could be displayed like wet noodles, the end result is going to always be the same.
Because? Yet again:
Don’t use the engine for things that are not fixed in engine.

Whatever works. The experts aren’t going to bother learning a new pipeline to work with a specific engine, f that engine…
On the other hand, you the non expert can pick your poison - and learn multiple ones.

@MostHost_LA
I am a software engineer, and have been for a long time. Just not an animator.

@Roy_Wierer.Seda145
Yeah, I am learning. The problem I keep having is not being able to tell where my IK bones are actually located in UE. I have gotten around this by continuously updating the rotation of arrows in the character BP to match the IK bones. My original complaint was about this.

So in Blender I set up my IK bones with various constraints etc. Obviously these don’t carry over into UE but that’s fine, it was just to keep them pinned during the animation. Basically this whole to-do is my frustration with trying to port the AnimStarterPack onto my skeleton, and the IK bones in those animations aren’t animated for whatever reason.

There’s another problem in that I constrained the tail of the hand_gun_ik to point towards the tail of the hand_l bone, instead of the head of that bone, which is at the wrist. From what I am shown in UE, it looks like this is disregarded. It would be nice to have some visual confirmation in UE that my anims are exporting properly. I know now from what @MostHost_LA said, the IK bone is probably in the correct position, I just can’t see it.

UE seems to show my hand_gun_ik as being in the correct position. But when I attach the arrow to it, (or the gun), I can see it is pointing almost 90 degrees away. I am attempting to use the hand_gun_ik to orient the gun (and, with IK nodes, the arms) in such a way that it is pointing to the center of the screen. I got this to work before using virtual bones, but decided I ought to use actual IK bones instead. Is this the correct approach? I don’t know. It would probably be better to construct the Aim Offset poses to be dead-center in Blender. I might try that next.

But yeah. Lots of problems in the placement of the hand IK bones. Some of it I figured out was due to my use of additive poses, I fixed this by blending out certain bones at certain times. I fixed another issue by realizing the Foot IK pelvis offset wasn’t being applied to the hand IK bones, so I updated the control rig. None of that is the engine’s fault, it is doing exactly what it was told to do, I just didn’t know what was going on. Inexplicably (to me, at least) the IK bones still float around during animation, despite those bones being exactly positioned in all source animations. I made a temporary fix by making another control rig that constantly forces the IK bones into their correct positions.

I refuse to use ALS, it looks nice yeah but as we’ve already covered in this topic I am by no means an expert animator, and I learned a while ago that you’ve got to start small and bolting ■■■■ on will not fix anything, but just create more problems later. It is a good idea to look at it for reference, though.

Here’s what I was talking about. Apparently the hand gun IK bone is pointing off up and to the side. I don’t know for sure, exactly, because I can not see it. It could be something is wrong with the gun, or something else entirely. But without seeing the actual bone, how am I to know?

My original rant and follow-ups were rather incendiary or hot-headed and I apologize for that. I am doing my best here and have no-one to talk to about this stuff so sometimes I get mad. I was hoping that I would be able to lean heavily on UE’s animation and retargeting features to offset my own inadequacies in that area, as I am only one person, but I see that ain’t gonna happen, I am going to have to bite the bullet and just learn animation thoroughly, one step at a time, like everything else.

I’m thinking maybe you can add bone information to the screen using debug arrows: loop through all bones/sockets on a skeleton and draw a vector (arrow) for each, from its transform and length, colored by type. I don’t know if that fits your specific needs, or if you can pull all the data you want to build one. Debug data should draw at least during PIE, probably not in the editor.

You can also print rotation values relative to a parent bone to see if something is off.

Another option which might help you out is that you can attach objects to sockets on the skeletal mesh viewer, it should attach with the rotation of the parent I think.
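
The debug-arrow idea above could be sketched roughly like this: pure vector math in Python, with my own made-up data layout. In UE itself you would read each bone’s world transform from the skeletal mesh component and draw it each tick with something like DrawDebugDirectionalArrow, so treat this as a sketch of the math, not engine code.

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, qv = q[0], q[1:]
    t = tuple(2.0 * c for c in cross(qv, v))
    u = cross(qv, t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

def bone_arrow(position, quat, length, axis=(1.0, 0.0, 0.0)):
    """Start/end points of a debug arrow along the bone's local axis."""
    direction = rotate(quat, axis)
    end = tuple(position[i] + length * direction[i] for i in range(3))
    return position, end
```

Looping this over every bone (and socket) each tick, with a different color per bone type, would give exactly the kind of overlay being discussed here.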

Thanks. I think the floating IK bones have to do with motion from the base animation. Here you can see what I mean in the AimOffset:

The good news is, I figured out what was going on with the ik_hand_gun. It turns out, (big surprise), I was being dumb and not understanding the data I was working with. The “rotation” of that bone does not refer to what I thought it was referring to. I was able to calculate the value I wanted like this:


Basically, get vector from ik_hand_gun to ik_hand_l. Get the vector from ik_hand_gun to the center of the screen. Interpolate the first vector to the second, then apply that rotation to the ik_hand_gun bone. It works well enough so far. I am sure my setup is convoluted and it is far from perfect, but it’s better than what I had before.
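
If anyone wants the gist in code form, here is a hedged sketch of those steps (plain Python, my own function names, returning an axis-angle pair; in the ABP it effectively becomes a rotation-from-two-vectors result fed into a Transform (Modify) Bone node):

```python
import math

def norm(v):
    """Normalize a 3-vector (tuple)."""
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def aim_correction(bone_pos, hand_pos, target_pos, alpha=1.0):
    """Axis-angle rotation carrying the bone->hand direction toward the
    bone->target direction; alpha < 1 interpolates only part of the way."""
    a = norm(tuple(h - b for h, b in zip(hand_pos, bone_pos)))
    t = norm(tuple(g - b for g, b in zip(target_pos, bone_pos)))
    c = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, t))))
    angle = math.acos(c) * alpha
    axis = (a[1] * t[2] - a[2] * t[1],
            a[2] * t[0] - a[0] * t[2],
            a[0] * t[1] - a[1] * t[0])
    m = math.sqrt(sum(x * x for x in axis))
    if m < 1e-8:  # directions already (anti-)parallel: no usable axis
        return (0.0, 0.0, 1.0), 0.0
    return tuple(x / m for x in axis), angle
```

For the ik_hand_gun case, bone_pos would be that bone’s position, hand_pos the ik_hand_l position, and target_pos the world-space point the crosshair projects to.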

The goal with all this being to slightly modify the animation at runtime to center the guy’s gun with a crosshair.


You can see here the blue arrow which is the interpolated vector corresponding with ik_hand_gun.

I think what I am going to do is exactly what you said, just implement a sort of debug overlay with colored arrows so I can see the positions and orientations of various bones.

Good attitude.
But you also need to realize you are rendering in an engine that is very sub-par in terms of features offered compared to competitors, so what you can do gameplay-wise is more up to your C++/C# skills than to what the engine does.

Put a socket on the bone. Attach an arrow to the socket. See how it behaves.

Re the rest.

Look, IK is really not the droid you are looking for.

Make a dedicated hand object bone for your skeleton. Animate/use said bone.
Offhand gets one too.
Leave IK bones there for other (ik) stuff.

Other helpful additions are going to be a hat bone and a glasses bone.

Hand IK is used to adjust hand positioning. Not objects attached to it.

At a base level, all your animations will directly animate your objects.
Say you take off the hat, how else are you going to do so without a bone for it?
Say you do gun tricks, how would you get the gun to spin around an index finger without a bone for it?

The fact that the ik bone is used simply means that if you ever do use the bone, then throw the object out your hand, the hand will keep following the object. Instant fail / poor planning / poor understanding.

Given that, you just learned how to fish and skipped at least 2 iterations of re-doing things.

Other thing.
An Aim Offset is math, with a really bad approximation. So do not expect the gun to point to center-screen under the cursor.
The code you have running is probably a better idea.
However.
No one says the bullet needs to leave along the correct line of the barrel, and save for some slo-mo breakdown with extremely bad poses (barrel up in recoil shoots again anyway), no one will be able to see a bullet going off at an angle below 5-10 degrees of difference.
And those who do probably won’t care so long as you hit what you aim at.

It’s a game, not real life. As much as making ■■■■ realistic is nice/better/preferred etc… well, let’s all remember that people play Fortnite…

All that makes a lot of sense, and clarifies some other things I was unsure about, thanks. I have read/seen a lot of people attaching things to the ik_hand_gun and I guess… figured that’s what it was there for? But no, I think you’re right. I have looked at skeletons from other games and from what I remember, they had attachment bones. It also makes sense because of what I was trying to do to get his hands to grip properly, and that not working as expected.

I am pretty comfortable with programming in general, just not too familiar with the exact structure of UE, as I have been able to do everything with BP and haven’t needed to go into the engine code, but I expect to at some point. I have made rough implementations of the various feature areas I want, just enough to prove to myself that it could be done. It is this area of character animation that has consistently been a pain point and kept me from being able to move on further.

My minimum goals I guess – have him be able to hold and handle the gun in a satisfactory way – once I have figured that out, other handheld objects should also be similar – and have him able to interact with scene objects (turning wheel, lever, etc).

Best advice I can give you is to get a way to get the right animations into a DCC to clean up for engine use.
Right away, even if a suit is $3k.
Because otherwise you are just going to burn out doing things by hand that would normally take moments.

This hasn’t come up yet, but you may find some use in it.

Particularly if you get into a Rokoko suit or similar (lower your expectations on the output of those suits too; it’s OK for simple stuff, but the moment your avatar needs to touch its nose, well, let’s just say it ain’t happening), a tool like mine (Mr Mannequin is an alternative) can help clean up/stabilize stuff on the fly.

Don’t do in code what should be done by default - like original hand IK placement not following animations, for instance.

As a programmer, don’t bother with suits. Before you know it you need 4 suits for a scene and a dog version… Look into markerless tracking and pose estimation, both AI.

All we need now is a plugin for this beast:

DeepLabCut · GitHub

For consideration.

When Epic was considering the release of UE 4.0, it was not feature-complete, but it worked well enough to release in its then-current form and continue development. I noticed this from the start, as the overall quality was not up to par with the UDK release, which was required to be a completed application.

This was a real first, as the community was invited to take part in the process by making use of what was available and, in the process, improving features with each “preview” build. This route seems to be working, as opposed to holding back workable features; the control rig makes a good example, as, being a TA, I see missing features that go beyond the ability to simply animate in-app. If you look at MotionBuilder’s control rig or Maya’s FBIK system, they would remove the need for retargeting in UE entirely, as a control rig would allow the use of Epic animations on any rig without consideration of naming convention.

Long way of saying that where we are now is not where things are going, as feature inclusion is not even complete, which can be frustrating; however, using external apps like Blender, MB, Maya, or 3ds Max would be considered the best solution.

The way I would put it: doing animation in UE is like using a spreadsheet to write a book. It would work, but it is not ideal.

@Roy_Wierer.Seda145 @MostHost_LA

Hold on a moment, I am confused – do you guys mean ‘suit’ as in a software suite for animation, or a literal motion-capture jumpsuit?
Also DCC – Digital Content Creation?

Doing motion-capture is not something I had ever considered for this project – I was going to use (and have been using) purchasable 3rd party UE4 animations and just clean them up for my skeletons. I am trying to keep this as simple as I can in that regard. I don’t need that many animations and they don’t need to be the highest quality – just not broken or improperly implemented.

To that end BoneBreaker looks like it could help me, I have looked at it but am still not exactly sure what it does. The UE IK retargeter is mostly able to convert UE4 mannequin animations to my skeleton, I just have to do a bit of adjustment on some of them.

This is related, I guess – why do the bones of Mannequin skeletons seem to be inverted across the axis of symmetry? Am I just importing them wrong? For example, the left-side bones are all oriented oppositely to the right side when I import into Blender. Is that something Bone Breaker will help deal with?

Also, @MostHost_LA Yes, I think I see – the retargeter is merely the first step, not a complete solution. I should not be designing my in-engine animation logic to compensate for broken or improper animation sequences – this needs to be done with something like Blender. And IK is definitely not for fixing poorly retargeted animations. That honestly is a relief because I think in the long run it is less work and a better, simpler result.

I guess I need to figure out what the procedure is. For example, with my 3D modelling and texturing I have a sort of production chain which starts with me creating a plane, or ring of vertices, and ends with an LP mesh (or multiple meshes sharing one UV space), 3 texture maps, and a material instance.

I will work it out, but I am guessing for animation, it will go something like this: retarget from UE4 to my skeleton. Export. Import to blender. Animate or constrain unanimated bones, clean up rotations, etc. Export and re-import. I will try it out and see.

@FrankieV
Thank you, that is a valuable insight.

I have been using UE since around May of 2022. Before that, for 6 or so years, I was trying to create this project with the Unity engine; that was incredibly frustrating. I think probably the worst thing that happened was the rendering-pipeline bonanza, but also the proliferation of heavily experimental features, preview releases, and incompatibilities between various plugins or extensions.

For example, Unity out of the box did not have any vertex painting feature, so I got a plugin for that. It did not support volumetric effects, so I got a plugin for that. It did not support IK, so I got a plugin for that. It did not have a node editor for authoring shaders. Cloth physics, I had to get something for that. It did not have any kind of framework for behavior trees, either, which is one of the last things I was working on before I stopped. Basically every shortcoming I came up against in Unity, is not a problem with UE.

All these plugins were shortcuts to avoid my having to implement everything from the ground up, or invest tremendous amounts of time into learning a highly specialized skill. Then the fragmentation of the engine into these various versions (this plugin supports this one, that one supports that) was just a mess, and I abandoned the entire thing. I have since learned my lesson and am glad UE has all of those features I paid extra for out of the box, which is great.

I should have known better too, because I have wasted so many hours professionally with package versioning problems in every software environment I have ever worked with – .NET DLLs, python dependencies, npm.

I make all my own 3d mesh and textures, if there is something close enough in Megascans I will use that. Animations and sound I purchase (but have to know enough to integrate into my project). Any kind of mechanic or feature addition – it must be a stable, documented part of the engine or I will not use it, or if it is important enough, I will implement it myself – otherwise the entire project is compromised.

I was able to get by this whole time with a very limited in-depth knowledge of rigging, I am not attempting to specialize in animation, I just want to be able to continue with my project, essentially. My progress in this thread is realizing that I have to learn more than I expected to, as UE cannot help me with my problems. Which honestly is a good thing for the long run.

Anyway guys, sorry for so much talk, it is probably a lot to read. I really appreciate having had this discussion.

Both, since usually a suit comes with a software suite as well :stuck_out_tongue:
The fact that they are pronounced entirely differently but spelled almost the same is just a bonus.
Yes for dcc.

That’s how they are. The actual rotations are off, but they are also inconsequential, really.

If you make a rig and retarget, you’ll never have issues.
If you make animations for the marketplace, they ought to use the weird Epic skeleton.

Also, again (for the n-th time…), it’s just a visual (of needless data).
Since no data is stored for Blender to know the direction of the tip, whoever coded the importer decided to pick some arbitrary way to display stuff. Same for Unreal (though it is closer to how Maya looks, or was, anyway).

Additionally, you’ll notice that none of the bones are connected.
I would strongly suggest you make your own rig that makes sense to you, and ditch the Epic skeleton.
Even to export for selling in the marketplace, you can always just retarget the animations to the Epic skeleton anyway…

Thanks for the advice. I figured out the scale of my character rig was partially responsible for the issues I was having. It was the 100x scale problem many people have described. My theory is that was causing very small movements to be amplified. Since fixing the rig and all animations used, the ‘floating’ behavior of the ik bones is reduced.

I also figured out how to ‘fix’ IK bone placement quickly in the source animations by using Blender “actions”; it is pretty easy to transfer an animation to a skeleton with constraints that enforce correct bone placement, and then re-export. So I have been doing that, as well as clearing out all scaling operations for each animation, as it seems to improve them.
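
To put some (entirely made-up) numbers on the amplification theory: if a 100x armature scale is left unapplied, any small local-space wobble on an IK bone gets multiplied back up into world space after import, which matches the ‘floating’ I was seeing.

```python
# Toy numbers (my own, not measured from the engine): the same local wobble
# under an applied vs. unapplied armature scale.
def world_wobble(local_wobble, armature_scale):
    # World-space movement is the local offset scaled by the armature scale.
    return local_wobble * armature_scale

applied = world_wobble(0.05, 1.0)      # scale applied: tiny offset stays tiny
unapplied = world_wobble(0.05, 100.0)  # 100x scale left on: hundredfold drift
```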

That’s a Blender bug.
Scaling bones still results in weird parent rotations, 2 years after the initial report.
Apparently no one there gives 2 f about it after they got the Epic funding.

So it goes…