I'm wondering if someone could help me find the most suitable solution for my needs. I'm trying to make a CGI film in UE4, and I intend to make and animate the characters with DAZ. My intention is to buy a pair of Kinect 2 sensors for the mocap. I've heard about iPi Soft, Brekel, and MotionBuilder (too expensive for my budget). Which do you think is best in terms of being intuitive and easy to manage, configure, and calibrate? Is it possible to capture both body movement and facial morphs at once, even if they end up in separate (but synchronized) files? Maybe by running a facial program and a body-capture program at the same time.
As an off-the-shelf, inexpensive solution, performance capture, as in capturing both body and facial tracking at once, still has a way to go before it becomes available to the average Joe as a complete package at an affordable price, and being able to capture anything at all is just the starting point.
Some thoughts though.
If you want to get serious then I would suggest saving up to purchase Motion Builder, as it was designed from the bottom up to be the host app for motion capture, and the fact that you can hand-key is a bonus. You could do captures using Daz Studio, and I'll get to that in a moment, but the fidelity of the output is not there, as it does not record on the tick, so the output tends to look like 10-year-old mocap.
To be clear, you do not capture facial morphs but rather tracking data that is fed into a device driver, or API, which wires up a relationship constraint between the output of the mocap device and the target or host application.
For example:
According to their video diary, the data flow comes in from the capture device through their custom API or device driver, which contains a matched relationship to their rig in Maya, and is then pushed into Unreal 4 for real-time rendering. Once the data hits Maya it can be recorded for playback, or even recorded in real time in UE4. So yes, you can do both at the same time if you wish, as you only need a relationship constraint for whatever you want to track, but the real trick to it all is having an API or device driver that works with the host application.
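To make the "relationship constraint" idea concrete, here is a minimal Python sketch of what such a mapping layer does conceptually. Every name in it (the joint map, the rig object and its method) is hypothetical, made up for illustration, and not taken from any real driver or API:

```python
# Minimal sketch of a "relationship constraint": per-frame tracking data
# from a capture device is remapped onto a target rig before being
# recorded or streamed onward. All names here are hypothetical.

# Maps capture-device joint names to the target rig's bone names.
TRACKER_TO_RIG = {
    "SpineMid": "spine_02",
    "Head": "head",
    "ElbowLeft": "lowerarm_l",
    # ... one entry per joint you want to drive
}

def retarget_frame(tracker_frame, rig):
    """Push one frame of capture data through the joint mapping."""
    for tracker_joint, rotation in tracker_frame.items():
        bone = TRACKER_TO_RIG.get(tracker_joint)
        if bone is None:
            continue  # joint not wired up; ignore it
        rig.set_bone_rotation(bone, rotation)  # hypothetical rig API
```

The point is only that the driver holds a matched relationship between what the hardware tracks and what the rig exposes; the actual wiring is specific to each host app.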
However
Most of the work and cost involved in performance capture is not in the technology but in the development of the digital actor that all of this stuff is connected to, and for a long time digital people capable of even simple things were rather expensive compared to the hardware costs.
This is where Daz Studio comes in. More to the point, the Genesis 3 framework.
Out of the box, and for free, both the Genesis 3 male and female come with the necessary foundation of rigging, cluster shaping, and morph targets to hook up a 3rd-party device driver or API, which is the starting point for getting all of the hardware connected and working.
P.S. A digital production forum would be nice, as technology convergence is not necessarily related to games development.
Many thanks for the extensive clarification :). Anyway, I guess I'm still too much of a noob to understand it all. Do I understand correctly that the best pipeline is two Kinects with Motion Builder, and then importing into Maya? I'm a little lost with this sentence:
Is the host application Motion Builder, for instance, and the device driver a special Kinect driver for Motion Builder (not just the standard Windows ones)?
I've just asked here because UE4 is the final software that will render the CGI, so I thought you might have a clue about doing this directly with it.
PS. I can't save up to buy MB, but as I'm not intending to sell the product, just put it on YouTube, I guess there's no problem with acquiring the student version, right?
The API or device driver is what connects the hardware to the software, so that the captured tracking data has a relationship to the rigging being animated.
Maybe of interest: me messing around with a sound capture device and tweaking the relation constraint between the device and the rigging.
Warning: some potty mouth involved.
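For the curious, the sound-to-rig idea above boils down to something like the following sketch, where microphone loudness drives a jaw-open morph. The morph concept, the gain, and the smoothing factor are all illustrative assumptions, not from any real driver:

```python
import math

# Hypothetical sketch: drive a "jaw open" morph target from audio loudness.
# The gain and SMOOTHING values below are made up; tune by ear.
SMOOTHING = 0.5  # 0 = instant response, 1 = frozen

def rms(samples):
    """Root-mean-square loudness of one audio buffer."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def update_jaw(samples, previous_value):
    """Return the next jaw-open morph value in [0, 1]."""
    target = min(1.0, rms(samples) * 4.0)  # 4.0 = arbitrary gain
    return previous_value * SMOOTHING + target * (1.0 - SMOOTHING)
```

Tweaking the relation constraint is essentially tweaking numbers like that gain and smoothing until the rig's response feels right.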
Motion Builder is the rock star of mocap apps, but depending on this, that, and the other thing, getting kitted out, so to speak, is a process of choosing how to hook stuff up, and it does not always play nice. For example, I've seen Blender being used for mocap, but just as a video card needs a driver, so does the mocap hardware.
Other considerations
MoCap is not a silver bullet, and the result will always need clean-up depending on the purpose it's used for.
Some critical components:
A 3D host app that can record capture data on the tick (see the sketch after this list).
A tracking camera capable of capturing high-fidelity data. I have concerns that the Kinect would not be up to the task.
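As a rough illustration of what "recording on the tick" means, here is a sketch that resamples irregularly timed capture samples onto a fixed 30 Hz timeline by linear interpolation. It assumes samples arrive as sorted (timestamp, value) pairs, which is an assumption for illustration, not how any particular host app stores data:

```python
def resample_on_tick(samples, rate_hz=30.0):
    """Resample sorted (timestamp, value) pairs onto a fixed tick by
    linear interpolation, the way a host app records capture data on
    the tick rather than whenever the device delivers a sample."""
    ticks = []
    dt = 1.0 / rate_hz
    t = samples[0][0]
    end = samples[-1][0]
    i = 0
    while t <= end:
        # advance to the pair of samples bracketing tick time t
        while i + 1 < len(samples) and samples[i + 1][0] < t:
            i += 1
        (t0, v0), (t1, v1) = samples[i], samples[min(i + 1, len(samples) - 1)]
        alpha = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
        ticks.append((t, v0 + (v1 - v0) * alpha))
        t += dt
    return ticks
```

If the host app instead just stores samples as they arrive, playback speed drifts with the device's delivery rate, which is part of why output from apps that don't record on the tick looks like old mocap.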
The bottom line though is that MoCap is expensive no matter how cheap you try to go, as attempting AAA-grade captures on the cheap produces poor results and wasted time and money, before even considering the hidden costs. Best advice: start by focusing on building a digital actor first, with all of the working parts, as that requirement alone can take months, or more, to develop.
Sorry, I can't see the video because it says it's blocked in my country.
I've heard that the Kinect 2 is a good option for mocap. Even with a pair of first-generation Kinects you can get these results, which are good enough for what I want to achieve:
Off the shelf, yes, Genesis 3 is packaged with everything one would need to hook it up as a digital actor, with rigging as well as cluster and morph shaping.
The Kinect as a camera is 7 years old and limited to 640x480 @ 30 Hz; at the time it was a cheap USB solution for motion capture. It's a good option for body capture at 30 fps, but it does not capture at a high enough resolution for facial tracking compared to inexpensive cameras you can buy today, which can capture as high as 4K at 30 fps, or 2K at 60 fps, at a reasonable price. If you use markerless tracking, a cellphone would do the job. Overall the ability of the Kinect, even in video games designed around the product, is questionable if fast action is required.
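A quick back-of-the-envelope shows why resolution matters so much for faces. Under a simple pinhole camera model (the 60-degree field of view and 0.15 m face width below are assumed round numbers, not measured Kinect specs):

```python
import math

def pixels_on_target(target_width_m, distance_m, image_width_px, fov_deg):
    """Horizontal pixels a target spans, simple pinhole camera model."""
    view_width_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return image_width_px * target_width_m / view_width_m

# A ~0.15 m wide face at 2 m, 60 degree horizontal FOV (assumed numbers):
print(pixels_on_target(0.15, 2.0, 640, 60))   # ~42 px across at 640x480
print(pixels_on_target(0.15, 2.0, 3840, 60))  # ~249 px across at 4K
```

Forty-odd pixels across a face leaves very little for eyelids and lip corners, which is why facial tracking wants the higher-resolution cameras.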
A primer, so to speak:
More or less, an iPhone could record data at a high enough resolution for markerless tracking.
The secret sauce, though, is in the tracking software, which for the most part is still home-brewed to suit unique needs. So if you elect to go with the Genesis framework, I would start by researching the available tracking software and what hardware it works with, and then go shopping for a camera.
First: Mocap is a way for experienced animators to take their animations to the next level of realism. It is not a way to get good animations if you don't have the animators in the first place. It's not clear to me why you want to use mocap. If the goal is "make more realistic animations at a higher cost than manual animation" then that's fine. If the goal is "try to avoid having to do a bunch of work" then mocap is not going to help with that.
Second:
The specifics of any particular license may vary a little bit, but in general, a student license is intended to help students learn, not to monetize. If you don't "sell" the animations but make money from ads on a YouTube channel, then that's likely to count as "monetizing." If you are not actually a student, then you also generally cannot get student versions of software. I don't know exactly what your situation and needs are, so I can't tell you more than that.
Hi, thanks for the reply. I'm not an animator, so I'm trying to find the best solution to animate my characters (which I thought was mocap).
I’m not even intending to monetize with ads. I just want to spread my novel around, then we will see. By the way, you can see some screenshots of what I want to get:
When I speak about the Kinect, I'm referring to the Kinect 2, which is supposed to be better.
Yeah, sure, if you already have one, go for it, but I'm suggesting that in just the past two years 4K+ cameras have dropped in price so much that you could buy a ten-buck webcam that will do the job (if it can keep up with the required FPS).
The deal breaker for me, though, would be that the Kinect (Microsoft branding) has been discontinued and is no longer being sold, and since Apple bought out PrimeSense, the manufacturer of the 3D sensor, I would be worried that middleware solutions requiring PrimeSense binaries will become orphaned. Either way, if you don't have one now, better be quick, as once they are gone they are gone. Then again, there is always eBay.
Interesting, though, that Apple bought out both PrimeSense and FaceShift?? /me thinks there is some NSA hanky-panky going on.
Here's another idea: how about you set up the scenes, but without animations, and develop the movie using still shots?
You’ll still want the sound effects, pacing, modeling and lighting from the full thing.
Then you can render out the movie and tell it like a comic book.
Once you have the comic book, you could either try to learn animation yourself, or find others who know how to animate and/or apply mocap data.
If you find that you don't get to the finished comic book version, then you've saved yourself a lot of woe worrying about animations that weren't needed anyway!
And, if you do find animators to help or learn animation yourself, the comic book will help you a lot with the process of getting to the final result.
I thought about this too… But I guess only children can stand "audio books" till the end, lol. I've also considered making a comic, on paper… Furthermore, the models are temporary; I need to find someone who wants to help me finish them all and get paid for it (a symbolic price, of course).
To tell the truth, I had in mind buying a 4K GoPro for purposes other than mocap. Would it do the trick for body and facial animation?? I thought a special device was needed for tracking movements…
For facial tracking, big time, but not so much for body, as you will need at least two. What the Kinect has going for it is that it uses the PrimeSense 3D sensor, which captures a Z-depth map. More or less there is no real voodoo magic; it's all just good old-fashioned camera tracking. A fun toy to play with is Autodesk's MatchMover, and it's a free download.
Downside: you will have to convert the tracking data to animated markers.
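Conceptually, turning tracked points plus a Z-depth map into animated 3D markers is just a per-frame back-projection. A sketch, assuming you already have pixel coordinates and depth per frame; the intrinsics and sample points below are made-up numbers for illustration:

```python
def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Turn one tracked pixel (u, v) plus its depth into a 3D marker
    position using standard pinhole intrinsics (fx, fy, cx, cy)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# One marker per tracked point per frame; intrinsics here are made up.
frame_markers = [backproject(u, v, d, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
                 for (u, v, d) in [(310, 200, 1.8), (355, 198, 1.85)]]
```

Run that for every tracked point on every frame and you have the animated markers; the remaining work is exporting them in whatever format your host app imports.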
I know that this ^^ company has simple programming solutions for 3D programs. They already have prefabs for Unity and Blender, and they would probably be interested in speaking with Unreal devs.