Hi! I’m a vtuber on Twitch who uses Unreal Engine to make a 3D interactive environment. I made a video showcasing some of the stuff users can do, like watering plants, buying figurines to decorate the environment, and playing with a virtual pet system.
The virtual pet slimes can be customized with colors and equipment, and they can race against each other to win prizes. They can learn abilities like casting spells and playing songs, and everything gets saved between streams, so people can come in and visit their slimes on future streams.
Here’s a video showcasing some of the features
You can participate in it the next time I stream here:
Hello there @FrootsyCollins! Welcome back to posting with us on the forums.
As an aspiring content creator with interests that overlap with what you do, I am AMAZED! This is truly a fantastic use of UE and Twitch. VTubing is a great medium that lets some major creativity flow and be showcased in a unique way.
Does this style of pet system and overall environment make your PC lag, or cause any problems running a stream?
Aww, thank you!
My PC is fairly beefy, so it doesn’t lag much. I could definitely do more to optimize it, though. I recently implemented level streaming for the different areas, and I’ve been trying to reduce transparency where I can.
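In case it helps anyone reading along, here’s roughly what the level streaming calls look like if you do it in C++. This is just a sketch: AEnvironmentManager and the level name are placeholder names, not my actual classes or assets.

```cpp
// Sketch of loading/unloading a streamed sublevel from a manager actor.
// AEnvironmentManager and NextStreamingUUID are placeholder names.
#include "Kismet/GameplayStatics.h"

void AEnvironmentManager::LoadArea(FName AreaLevelName)
{
    FLatentActionInfo LatentInfo;
    LatentInfo.CallbackTarget = this;
    LatentInfo.UUID = NextStreamingUUID++;    // must be unique per request
    LatentInfo.Linkage = 0;
    LatentInfo.ExecutionFunction = NAME_None; // no completion callback here

    // Load the sublevel asynchronously and show it once it's ready
    UGameplayStatics::LoadStreamLevel(
        this, AreaLevelName,
        /*bMakeVisibleAfterLoad=*/true,
        /*bShouldBlockOnLoad=*/false,
        LatentInfo);
}

void AEnvironmentManager::UnloadArea(FName AreaLevelName)
{
    FLatentActionInfo LatentInfo;
    LatentInfo.CallbackTarget = this;
    LatentInfo.UUID = NextStreamingUUID++;
    LatentInfo.Linkage = 0;

    UGameplayStatics::UnloadStreamLevel(
        this, AreaLevelName, LatentInfo,
        /*bShouldBlockOnUnload=*/false);
}
```

The Blueprint Load Stream Level / Unload Stream Level nodes do the same thing if you’d rather stay in BP.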
Hello again @FrootsyCollins !
If I may ask, what are your current PC specs? If that’s a bit too forward, I do apologize! I was also wondering whether your VTuber avatar is run through UE as well.
I’ve got a 4070 Super, 64 GB of RAM, and an AMD Ryzen 7 5800X.
I used to run it with a 3060 Ti and 32 GB of RAM.
Everything in the scene is running in Unreal. The character’s face tracking is done via LiveLink Face, the hands are tracked with a Leapmotion 2, and the desktop plane is captured with NDI.
Thank you so much @FrootsyCollins!
I think it’s very cool that you’ve made all this and have it running through UE! Is LiveLink Face easy to use? I’ve heard mixed reviews on the product, but it seems to follow you pretty well, so I was wondering if you had any tips.
LiveLink Face works well for me. Some troubleshooting things to check:
-Make sure the computer and phone are going through the same router, with the computer plugged directly into the router and the phone using the Wi-Fi.
-To avoid interference, I have a dedicated router that’s only for the phone and computer, with no other devices connected.
-Motion capture software tends to generate a lot of heat. If the phone gets too hot, the frame rate LiveLink captures at can drop. There are inexpensive cooling devices that latch onto phones, and I’d recommend using one.
-In the animation blueprint of the character using LiveLink, the morph targets can be adjusted with curves that modify their sensitivity (there’s a rough code sketch after this list). For example, in real life I don’t open my mouth very much when I speak, so I use a curve to make the mouth-open threshold more sensitive and exaggerate it on the character. I do something similar to make the character smile more when my expression is at rest; animated characters can look like they’re frowning or angry when a real person is just relaxing their face and eyebrows.
-It generally helps to make the ARKit blendshapes more exaggerated. You’re not going to be super expressive during normal speech, and the morph targets will rarely hit 100%, so pushing the expressions further compensates for that. A lot of vtubers look kind of lifeless, and doing this can help. There’s a person on YouTube named Kana Fuyuko who has good guides on what the ARKit blendshapes do and tips on exaggerating them effectively. The amount of squash and stretch you want varies by art style.
-During streams I try to use animation principles to exaggerate my expressions. For instance, Disney characters tend to swing their head down and blink as they turn to look at something. I also press my mouth to one side or the other when I’m talking to create a kind of smirking expression, and I shake my head slightly to add little micro-expressions.
-I look at my preview window while I’m talking to the audience, and I set up the camera and mirror the character’s morph targets so it appears like I’m looking in a mirror. It helps me give a more interesting performance.
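Here’s the rough C++ sketch of the curve trick I mentioned above. I actually do this with curves in the animation blueprint, so this is just an illustration: the class name, the gain/bias values, and the exact curve names are assumptions (your LiveLink mapping may expose the ARKit shapes under slightly different names).

```cpp
// Sketch of exaggerating ARKit blendshape values before they drive
// morph targets. UFaceAnimInstance, JawOpenGain, and RestingSmileBias
// are placeholder names; curve names depend on your LiveLink mapping.
#include "Animation/AnimInstance.h"
#include "FaceAnimInstance.generated.h"

UCLASS()
class UFaceAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // >1.0 makes the character's mouth open wider than mine really does
    UPROPERTY(EditAnywhere, Category = "Face")
    float JawOpenGain = 1.6f;

    // Small upward bias so the resting face reads as a slight smile
    UPROPERTY(EditAnywhere, Category = "Face")
    float RestingSmileBias = 0.15f;

    // Exposed to the AnimGraph to drive the morph targets
    UPROPERTY(BlueprintReadOnly, Category = "Face")
    float JawOpen = 0.0f;

    UPROPERTY(BlueprintReadOnly, Category = "Face")
    float MouthSmileLeft = 0.0f;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        // Raw 0..1 values streamed in from LiveLink Face
        const float RawJaw   = GetCurveValue(TEXT("jawOpen"));
        const float RawSmile = GetCurveValue(TEXT("mouthSmileLeft"));

        // Boost sensitivity so normal speech reads clearly on the model,
        // clamping so the morph targets never overshoot
        JawOpen        = FMath::Clamp(RawJaw * JawOpenGain, 0.0f, 1.0f);
        MouthSmileLeft = FMath::Clamp(RawSmile + RestingSmileBias, 0.0f, 1.0f);
    }
};
```

In the AnimGraph you’d feed these values into the morph targets instead of the raw curves. Using a float curve asset instead of a flat gain gives finer control, e.g. extra sensitivity near the closed-mouth end and a gentler slope near fully open.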
Thank you so much for that @FrootsyCollins!
Being new to this kind of thing can be daunting, so being able to connect and learn a little more about it helps a lot. Which camera would you recommend for the best performance? I also want to thank you again for answering all of my questions; I just love learning about this before making choices.