Built-in Subtitles

Hi Epic!
I’ve just finished my first solo project in UE4: it’s a short first-person adventure game called What Never Was.

I’ve gotten a lot of positive comments and love for it, much of which I owe to Unreal!
But the most common negative feedback I get is that the subtitles are too small.

I’d love to be able to set the font type, size, color, and even things like z-order (subtitles are currently rendered behind UMG widgets), without having to go into code, since my project is purely Blueprint.

I’d also really love to be able to have subtitle formats tied to a variable, so that different characters could, for example, have different colors for their subtitles.

What there is in the subtitle system is really nice already, it just feels very limited.

That said, I know how hard you guys work so keep up the awesomeness! :slight_smile:

In the project settings, it is possible to set the subtitle font type and its size. Exposing the color to the project settings would be easy to do as well. Z-order isn’t possible, unfortunately. I’ll see what I can implement to expose subtitles to BPs (that way you would get the z-ordering as well). Maybe an event node that you call in UMG to register it as a callback, so that all subtitles then get routed through that BP? Let me know what you think. Cheers,

Oh, sorry I actually didn’t know that at all.
Sorry for making this thread into a support thread, but: how would I go about changing the font size, or even just the font, through Blueprints?

In a perfect scenario, I’d love to have a subtitle “material” or similar that I could set every time I set a subtitle, so that I could have different fonts for different characters.
But maybe that’s already how I should be doing it then?

That isn’t possible atm unless you write your own subtitle system in BPs.

Ah, so then that is my feedback request. The subtitle system is really easy to implement with, but not customizable at all.
Readability is a huge thing for accessibility and it would be really nice to give players the option of scaling the subs.

You can simply create your own UMG widget, where you can use custom material for Text widget.

Oh, that reminds me… subtitle data from audio files hasn’t been exposed to blueprints…
I made a pull request for that half a year ago.

If this data were available, you could do anything with subtitles using UMG and blueprints.

While that does expose the subtitle data, how would you access it from BP, as there aren’t any nodes that I can find that do that?

But it seems very user-unfriendly to have to make my own system when the one that’s implemented already has almost everything one needs.
Or am I missing something? Is it super easy to make your own UMG widget?

I assumed it’s up to the game developer: if you pass a reference to the sound/dialogue wave to blueprints/UMG, you can simply read its properties there.
It’s a change meant for Blueprint-only projects, where the blueprints already know which VO file is being played.

BTW, putting text in audio files is quite an inefficient way to localize a game if it uses a lot of text or has many language versions. But this native method should at least be made blueprint-friendly :wink:

Yes. Simply create a new widget asset and place a Text element on the canvas. Setting any visual properties and the actual text is super easy, and you can easily animate the widget and the text itself using UMG animations. Then put this Subtitles widget on the HUD, which should also be a custom widget (not that weird blueprint used as the default HUD, probably a remnant of pre-UMG times).
I never considered built-in subtitles as something useful in production because you can build fully customizable UI very quickly.

Dialogue Waves do aim to be localization friendly, as we have tools to export dialogue sheets for recording, as well as for handling complex cases (gender/rank/etc., depending on how well the Dialogue Wave is configured) on a per-culture basis (compared to Sound Waves). That said, I’d love to get feedback in this area if you have it :smiley:

We had a lot of issues when doing subtitles because we needed to sync up lines with timings in the editor, and someone had to do that manually. (To explain: we had long audio files with a lot of dialogue in them, some of it spaced out, and the lines needed to show and hide at specific times.) We were surprised that there wasn’t a way to populate the timings through an import button that takes a CSV file, for example. Or were we doing something wrong?

Okay, that’s some useful feedback. I can say that Dialogue Waves themselves are not a full dialogue system, but they are the primitive component that you’d use to build one.

In the example you gave, I imagine the idea would be that you’d cut your dialogue up and then queue them to play as needed. I say “I imagine” because I didn’t actually design Dialogue Waves, but you’re right that they don’t really seem to be designed to handle large chunks of text (like you might find in a cut-scene) out-of-the-box. Cutting up the dialogue is also not ideal as it will make it harder for your VA to perform unless you do it in post.

With regards to adding offsets to the existing subtitle data within a Dialogue Wave, you also have to be aware of the fact that those offsets will change on a per-language basis, and we currently don’t have a good way of dealing with localized meta-data like that (Sound Wave assets do have offsets on their subtitles, but they’re not localization friendly).

We do actually have a newer system (called “overlays”, see UBasicOverlays) that supports offsets and localization, and can be imported from SRT files. I think these could be a reasonable solution for complex Dialogue Wave subtitles, as you could just provide a subtitle override which is an overlays asset, which we’d then queue into the subtitles system.

In the current subtitle manager code we do have a OnSetSubtitleText function that you can bind a delegate to. This would let you override the subtitle rendering with a UMG widget, which is what Fortnite does. It is, however, marked with the comment “HACK”, so I imagine someone intended to clean it up at some point :slight_smile:

The conversation has steered a bit over my head, so please excuse me if I am not understanding at all :slight_smile:

Making my own subtitle system in UMG feels really unnecessary when the built-in subtitle system already does almost everything I want.
If it were just a bit more flexible and modifiable it’d be great. I really like putting the subtitle lines in the audio file and adjusting their timings. It works well and simply; all I’d like is to be able to choose a font per audio file and to expose the features of said font so that it can be scaled.

You’d only really be replacing the rendering part of the subtitle manager though, as everything else would still work as it currently does (with Canvas), but you’d just put a widget somewhere in your HUD that would display the subtitle strings the subtitle manager told you to.

UMG is certainly more flexible and forward facing, and would let you take advantage of things like its rich-text support to handle all the font control you mentioned.

Oh! I see. Is there a proper tutorial for this? With perhaps pictures of how to set the UMG blueprint up?
Sorry for being dense.

There isn’t, and it would involve you writing some C++ to hook into the delegate. Which version of UE4 are you on? The delegate is relatively new so it may not even exist in the version you’re using :frowning:

Ah, I’m using 4.20 and I’m working in blueprints only. Which again makes me wish that the built-in system was just a tad more flexible :smiley:
At least through this thread I’ve now learned how to increase the font size. :slight_smile:

[USER=“2003”]Jamie Dale[/USER]
To be honest, I never use Dialogue Waves because they still require creating a new asset for every dialogue line. It’s time-consuming and inconvenient if you’d like to write and test a lot of dialogue directly in the editor. The writer should be able to easily write text and test it out, and only after that should anybody bother with creating audio-related files. The Unreal way isn’t efficient for games rich in VO.

For some time I was looking for a more convenient solution, something that would offer a workflow similar to The Witcher’s engine (no engine beats it).
The best thing I found is Dialogue Plugin.
It’s basically a generic node editor with some dialogue-related properties stored in a single Data Asset, plus an example UI. I threw away its UI and dialogue logic. With this editor a writer can quickly assemble complex dialogues and test them in the editor. My opinion is that you should buy out this plugin and turn it into a truly generic node editor that generates nodes based on a provided Data Asset class, et voilà. It would be useful for assembling dialogues or custom game systems.

Alternatively: sound cue editor looks like something that could be mutated into convenient dialogue/screenplay/quest editor.

PS Thanks for sharing Fortnite hack. Excellent hack, sir!

[USER=“69”]Cheshire Dev[/USER] I think we’ve strayed pretty far from subtitles at this point, but with regards to Dialogue Waves not being a Dialogue System… that is known (I said so a few posts back).

All Dialogue Waves are is a way to say “I have line X which is spoken from person A to persons B, C, or D”, and then let you assign a Sound Wave to each combination. In English all those combinations may use the same Sound Wave. In French you may need two Sound Waves (because of gender). In Japanese you may need three Sound Waves (because of politeness). That localisation concern is encapsulated within the Dialogue Wave so that on the outside you only need to play it with the correct context, and you’ll get the correct result, accurately localised.

We also have some localisation tools that scrape Dialogue Wave assets and generate per-language recording sheets, as well as importing copy-edits to those sheets made during recording, along with iteratively importing the final WAV files and creating the appropriate Dialogue and Sound Wave assets (and linking everything together correctly using asset localisation).

If your game is dialogue heavy, then you would absolutely want a higher-level system to manage your dialogue tree/script. There’s no reason that higher-level system couldn’t use Dialogue Waves internally, and provide some kind of process to automatically create and link together the appropriate assets (in the same way our WAV importing does). In fact, the plugin you linked to does seem to have a slot in its UI to link to a Dialogue Wave asset.

I appreciate that you may find that frustrating, but if your game is very dialogue focused then a dialogue system is something you’ll be invested in, in the same way that someone writing a game that is resource management focused would be invested in having a good inventory system.

Oh, that’s pretty good insight. Now I understand the philosophy behind Dialogue Wave, its name confused me a bit.