This is just idle curiosity… Has anyone considered how feasible it would be to use UE/MetaSounds as a development environment for creating VST plugins? All of my search results for this return the use OF VST plugins inside MetaSounds rather than the reverse.
You can, theoretically, build UE projects into modules, which would mean the MetaSound object(s) could be wrapped into a .dll, and the VST SDK (and/or the JUCE framework) is all C++, so it should (from what I gather) be possible to work with them inside UE, right? Meaning you could hook up the appropriate events/handles for MIDI, audio, and automation from the host.
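To make that concrete, I'd picture the wrapper as something like a bare-bones JUCE AudioProcessor, with the MetaSound graph hidden behind it. This is only a sketch of the shell; the MetaSound parts are just comments, because whether a UE audio graph can actually be embedded there is exactly the open question:

```cpp
// Rough sketch of the plugin shell, assuming JUCE: the DLL's actual VST entry
// points are exported by JUCE's plugin wrappers, which call createPluginFilter().
#include <juce_audio_processors/juce_audio_processors.h>

class MetaSoundWrapperProcessor : public juce::AudioProcessor
{
public:
    const juce::String getName() const override        { return "MetaSoundWrapper"; }
    bool acceptsMidi() const override                   { return true; }
    bool producesMidi() const override                  { return false; }
    double getTailLengthSeconds() const override        { return 0.0; }

    void prepareToPlay (double sampleRate, int samplesPerBlock) override
    {
        // Hypothetical: initialise the embedded MetaSound graph with the
        // host's sample rate / block size here.
        juce::ignoreUnused (sampleRate, samplesPerBlock);
    }

    void releaseResources() override {}

    void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midi) override
    {
        // Hypothetical: hand `midi` to the graph and render its output into
        // `buffer`, which the host then picks up as the plugin's audio out.
        juce::ignoreUnused (midi);
        buffer.clear();
    }

    // No custom GUI in this sketch; the host would show its generic editor.
    juce::AudioProcessorEditor* createEditor() override { return nullptr; }
    bool hasEditor() const override                     { return false; }

    // Program/state boilerplate required by the base class.
    int getNumPrograms() override                       { return 1; }
    int getCurrentProgram() override                    { return 0; }
    void setCurrentProgram (int) override               {}
    const juce::String getProgramName (int) override    { return {}; }
    void changeProgramName (int, const juce::String&) override {}
    void getStateInformation (juce::MemoryBlock&) override {}
    void setStateInformation (const void*, int) override {}
};

// Called by JUCE's VST/VST3 wrapper code when the host instantiates the plugin.
juce::AudioProcessor* JUCE_CALLTYPE createPluginFilter()
{
    return new MetaSoundWrapperProcessor();
}
```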
If this is an utterly ridiculous question then I apologize; I'm not super familiar with the VST spec. But I don't believe plugins have to be built exclusively on the VST API to work, so I would think that as long as you wired up the VST SDK's required inputs and outputs, it would work.
In theory, yes, some kind of adapter should totally be possible.
There’s a question about how to test it – e.g., the UE editor is not a VST host, so the VST SDK integration would need to be tested some other way.
There’s also the problem of GUI. Unreal really wants to own the main event loop for GUI, and VST doesn’t let a plugin do this. You might be able to spawn a separate EXE of some sort, and communicate with the plugin instance, but then a bunch of VST integration like presets and in-line metering and such will become harder. Although, with VST3, at least it’s possible because they made the “plugin runs on the other side of a network from the GUI” use case work.
That might be painful at first, but I would think a really, really simple plugin could be constructed that would let you verify you had the function calls matched up right; once you did, you could use UE for the actual audio testing, and hopefully there’d be a minimum of hope-for-the-best at compile time because, e.g., whatever audio stream was being fed to UE was also being fed to the VST’s audio output. Assuming a sufficiently 1:1 correspondence between what the VST SDK expects to let in and out and what UE works with internally (as I understand it, VST parameters are doubles normalized to a 0–1 range and the audio itself is plain floating-point samples), that shouldn’t be too hard.
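For example, the kind of 1:1 correspondence I mean on the parameter side could be as small as a normalize/denormalize pair per control (just a sketch; the “cutoff” name and 20 Hz–20 kHz range are made up purely for illustration):

```cpp
// Sketch: mapping between a host-facing normalised [0,1] parameter and an
// internal, real-unit value that a MetaSound-style input would consume.
struct CutoffParam
{
    static constexpr float minHz = 20.0f;
    static constexpr float maxHz = 20000.0f;

    // Host -> plugin: normalised [0, 1] automation value to a real unit.
    static float fromNormalised (float n)  { return minHz + n * (maxHz - minHz); }

    // Plugin -> host: real unit back to [0, 1] so the host can display/automate it.
    static float toNormalised (float hz)   { return (hz - minHz) / (maxHz - minHz); }
};
```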
I mean, even in the game development context, PIE is still not the same as a shipping build; there’s always some amount of hope-for-the-best at compile time. Ultimately, for any game the last step is loading a packaged build, and for any plugin it’d be loading the VST in some host and verifying.
That bums me out! Mostly it was my familiarity with UMG and UE’s shader graphs that made me curious about this design approach. But I know very little about how VST even handles GUI, and I guess I hadn’t considered that getting UE to “speak VST SDK” would mean not only using the SDK’s i/o for audio/MIDI/host data, but also potentially reporting UI elements in a way the VST SDK understands?
I guess come to think of it, I don’t really understand much at all about what it would mean to develop a unique GUI for something that only exists inside of a different application and doesn’t run on its own.
It’s been a while since I did it, but from what I recall, the VST host gives your plugin a native window handle, that you can slap whatever graphics and event handlers you want into. There’s also the “standard” VSTGUI library, which is a thin wrapper on top of this that lets you design a GUI using widgets that are “known to work,” but you could just do your own thing. Some plugins create a new OpenGL context to put inside the window, even.
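In JUCE terms, that “host hands you a window and you do whatever you like inside it” model looks roughly like this (a sketch only; the processor it attaches to is assumed to exist elsewhere):

```cpp
// Sketch of a custom plugin editor: the host supplies the parent window, and
// JUCE (or raw VSTGUI, or your own code) just draws and handles events inside it.
#include <juce_audio_processors/juce_audio_processors.h>

class SimpleEditor : public juce::AudioProcessorEditor
{
public:
    explicit SimpleEditor (juce::AudioProcessor& p) : juce::AudioProcessorEditor (p)
    {
        setSize (400, 200);              // size of the area the host embeds
        addAndMakeVisible (gain);
        gain.setRange (0.0, 1.0);
    }

    void paint (juce::Graphics& g) override
    {
        g.fillAll (juce::Colours::darkslategrey);   // draw anything you like here
        g.setColour (juce::Colours::white);
        g.drawText ("Hello from inside the host's window", getLocalBounds(),
                    juce::Justification::centredTop);
    }

    void resized() override
    {
        gain.setBounds (getLocalBounds().reduced (20).removeFromBottom (40));
    }

private:
    juce::Slider gain;
};
```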
Also, in VST 3, the “host interaction” part is clearly separated from the “plugin processing” part, and commands for how to change things flow across as marshaled data packets – you’re not guaranteed that your DSP code runs in the same address space as your GUI code. This makes things like in-plugin VU meters annoying to implement, but it does have the benefit that GUI interaction really is separate from the DSP: the DSP could conceivably run on DSP cards or whatnot, and the GUI could conceivably run across a network. Maybe Yamaha uses this for their large location mixing consoles or something, who knows?
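For what it’s worth, on the processor side those marshaled packets show up as sample-stamped queues of normalised values. Something like this (a simplified sketch against the VST3 SDK; in a real plugin this would sit inside the processor’s process() override rather than a free helper):

```cpp
// Sketch: how marshaled parameter changes arrive on the DSP side in VST3.
#include "pluginterfaces/vst/ivstaudioprocessor.h"   // ProcessData
#include "pluginterfaces/vst/ivstparameterchanges.h" // IParameterChanges, IParamValueQueue

using namespace Steinberg;
using namespace Steinberg::Vst;

static void applyParameterChanges (ProcessData& data)
{
    if (data.inputParameterChanges == nullptr)
        return;

    const int32 numParams = data.inputParameterChanges->getParameterCount();
    for (int32 i = 0; i < numParams; ++i)
    {
        IParamValueQueue* queue = data.inputParameterChanges->getParameterData (i);
        if (queue == nullptr)
            continue;

        int32 sampleOffset = 0;
        ParamValue value = 0.0;                        // normalised 0..1 double
        const int32 lastPoint = queue->getPointCount() - 1;

        if (lastPoint >= 0 && queue->getPoint (lastPoint, sampleOffset, value) == kResultTrue)
        {
            // Map queue->getParameterId() / value onto the matching DSP parameter here.
        }
    }
}
```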
As I said, my UE experience is limited to game development, and unfortunately (well, fortunately for a game developer!) rendering is something UE handles auto-magically. Game logic you build an instruction at a time, but the engine just takes care of all of the rendering, so even if it were possible to force it to render inside a window owned by another application, I would expect that to require modifications to the engine itself, and that’s a step beyond what I’m capable of.
I did a bit more looking and see that there is (or was) a JUCE plugin for UE that exposes parts of the JUCE library to UE as functions, but it’s 7 years out of date (surely at least a few JUCE versions old, to say nothing of its utter lack of exposure to MetaSounds), so it’s not much help – though I guess it proves that, at least on the DSP side of things, the core idea isn’t utterly ridiculous.
I appreciate you weighing in, though; if nothing else, I have a more realistic sense of the scope of an “Unreal VST design” project, knowing what sort of potentially devastating hurdles would exist in getting UMG to render a functional VST UI.
Here’s hoping someone makes a UE plugin or something that simplifies some of this enough to be useful to someone comfortable building a “MetaSound Instrument / UMG UI” but with little VST development experience.
Interesting question. I had VST2 working in Unity as an audio effect, and even VSTi instruments, but that was with a GitHub plugin that allowed it. From what I know, the PC version of the Tranzient app uses a custom Unreal build to support VST plugins, interface and all. But having it in MetaSounds would be the best. No interface is needed; just being able to control the basic params of a VST2 or even VST3 plugin would do, and having metering feedback would be even better. I would love to buy this as a marketplace asset.