[Feature Request] OSC, MIDI, and Syphon

With these tools, is there any way to drive animations of meshes or lights in Unreal in real time using an MP3, then output to DMX to drive real-world lights?

Hey monsieurgustav, any word on when multicast support will be implemented?

Thank you so much for this plugin, I use it in nearly every UE4 project I do!

Glad you like it! :slight_smile:

I have no plan for multicast support yet. But it should not be a massive amount of work, so I’ll look into it.

Post an issue on GitHub:

Hi,
I want to use Unreal for a fulldome 360° video project. It’s perfect; we’re just missing the Syphon module to send the image to the warping software.
Please add it in the next release.
Thanks

Any news on the MIDI implementation?

I’ve been working with TouchDesigner for MIDI-reactive animation and was hoping to do something more game-like using physics, hence evaluating UE. However, the lack of MIDI support is a bit of a dealbreaker.

Coming from a music production background and knowing the possibilities of hardware/software integration, I feel the animation/visual/game dev community is missing out by not encouraging, or even supporting, the use of hardware control surfaces. Tweaking the multitude of available parameters one at a time using a keyboard and mouse is time-consuming and frustrating, and it turns the process of creation from an expressive, exploratory one into a technical process that works well only when the end result is known in advance.

E.g. take the basic process of selecting colors for a shader: there are several different colors to select (reflective, ambient, emissive, etc.), each on a color space defined by multiple parameters (HSV or RGBA). This might involve 50 mouse clicks to get right, with the user’s eyes having to dart back and forth between the parameter panel and the output. The vast majority of permutations will never be tested, because only one parameter is being adjusted at a time while the rest are locked in place. If these are instead mapped to faders on a control surface, millions of combinations can be cycled through in a few seconds, without needing to shift your eyes from the output. Serendipity can play a role, and the whole process is far more enjoyable.

When I am working with TD, I map basic functions like translation/scale/rotation of objects to an APC40, as well as more context-specific attributes, and discover wild permutations that I never would have come across if I had to set things one at a time. The main drawback is the time needed to set this up and map the controllers. It would be nice if the software made this easier.
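The fader-to-parameter mapping described above can be sketched in a few lines. This is a minimal, library-agnostic sketch: the CC numbers, parameter names, and the simulated message list are all hypothetical examples, and a real setup would receive the control-change messages from a MIDI library (e.g. mido) rather than a hard-coded list.

```python
# Minimal sketch: map MIDI CC faders onto normalized shader colour parameters.
# The CC numbers and parameter names below are made-up examples.

CC_TO_PARAM = {
    20: "emissive.h",   # fader 1 -> emissive hue
    21: "emissive.s",   # fader 2 -> emissive saturation
    22: "emissive.v",   # fader 3 -> emissive value
    23: "ambient.r",    # fader 4 -> ambient red
}

def normalize(cc_value: int) -> float:
    """Map a 7-bit MIDI CC value (0..127) onto a 0.0..1.0 parameter."""
    return cc_value / 127.0

def apply_cc(params: dict, control: int, value: int) -> None:
    """Update the mapped shader parameter, if any, for this CC message."""
    name = CC_TO_PARAM.get(control)
    if name is not None:
        params[name] = normalize(value)

# Simulated stream of (control, value) pairs, as a control surface would send.
params: dict = {}
for control, value in [(20, 0), (21, 64), (22, 127), (99, 50)]:
    apply_cc(params, control, value)

print(params)  # CC 99 is unmapped and is simply ignored
```

Because every mapped fader updates its parameter independently, moving several faders at once explores many parameter combinations per second, which is the whole point of the workflow described above.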

I knew a character animator who took this further and rigged up an electric piano to his skeletons so that he could “play” his animations expressively rather than script them precisely. This resulted in a very natural look, and once he had set it up, it was simple to produce long, complex animations; he didn’t even bother to save them, just learned them and played until he had the best “take”. An electric piano has 88 keys; there is velocity, aftertouch, and note-off data; there are effect oscillators, pitch bends, and volume and sustain pedals, all of which are just as adept at controlling visual and animation parameters as they are audio.
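The piano-to-rig idea can be sketched the same way: each key selects one animation channel, velocity sets its amplitude, and note-off releases it. Everything here (the note numbers, channel names, and the instant-release behaviour) is a hypothetical illustration, not the animator's actual rig, which would presumably smooth the values over time.

```python
# Hypothetical sketch: MIDI note events drive skeletal animation channels.
# The note-to-channel mapping and channel names are invented for illustration.

NOTE_TO_CHANNEL = {
    60: "left_arm.raise",   # middle C raises the left arm
    62: "head.tilt",
    64: "right_arm.raise",
}

def note_on(channels: dict, note: int, velocity: int) -> None:
    """Velocity (0..127) sets the channel amplitude: harder hit, bigger move."""
    name = NOTE_TO_CHANNEL.get(note)
    if name is not None:
        channels[name] = velocity / 127.0

def note_off(channels: dict, note: int) -> None:
    """Releasing the key returns the channel to rest."""
    name = NOTE_TO_CHANNEL.get(note)
    if name is not None:
        channels[name] = 0.0

channels: dict = {}
note_on(channels, 60, 127)   # fortissimo: full arm raise
note_on(channels, 62, 32)    # gentle head tilt
note_off(channels, 60)       # key released: arm returns to rest
print(channels)
```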

Tactility, immediacy, and the ability to control numerous parameters simultaneously are vital to the creative process in a complex environment. One day I hope video/animation/games developers will design application-specific, integrated interfaces as Ableton is doing for DAWs; in the meantime, just providing basic MIDI support so we can map our own is the bare minimum for live performance applications.

I understand the frustration; I too come from TD and came here for better processing and a globally illuminated workflow. Sadly, the OSC implementation is the only one that works, and even then it doesn’t work fully for me.

I’m blown away by the power and capabilities of these software platforms, having returned from a decade-long sabbatical from the scene. But when it comes to hardware interfaces, MIDI is still the most widely supported and available standard, and the visual side of the industry still hasn’t come up with anything better than mouse and keyboard (multi-touch screens and 3D gestures lack the precision and low latency). OSC would be great if it were backward compatible with the 95% of hardware that uses MIDI, and not dependent on third-party plugins.

So… just encouraging the UE devs to have a look at what Ableton is doing with Push (as an evolutionary step up from more generic MIDI devices) and think about how something similar might be used in a visual/games context. As a programmer, I’m not feeling entirely helpless and frustrated, just weighing up whether I should be trying to port MIDI to UE, or physics to TD.

Lmao, so I’m a musician, and I perform live on the Push. I use TD to control LEDs, projection mapping, etc. My ultimate goal is to projection-map myself while making music live on my Push, with amazingly rendered sequences playing live in the background to my music. Soon I will accomplish this and they will write about me in publications, lol. Sadly, until then I’m stuck using MIDI-to-keyboard converters, which I am playing with until a real solution comes about.

I am working on a MIDI implementation, if anyone is interested.

Hi James, just wanted to chase up and get an update about that feature request. Personally, I may go back to Mac for my job, and I don’t really like Unity. But I really want to get going on my projects with Unreal and Resolume or MadMapper, or even better, a plugin or built-in feature for video-mapping output in Unreal.

I know that your money comes from games for mobile, tablet, PC, and console, but there is so much potential with this engine for other markets.

I’m from Montreal, and I know that this studio here http://immersivedesignstudios.com/immersive/ used Unreal Engine 4 for their projects.

Well, thank you!

… great work! :)

Is this still a thing? Is there an updated version of this?

I don’t know about the MIDI plugin, but my OSC plugin is maintained and used:

AFAIK, OSC is a superset of MIDI, so it is easy to convert between them.
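As a concrete illustration of how cleanly MIDI maps onto OSC, here is a sketch that wraps a MIDI control-change message in an OSC 1.0 packet. The address pattern `/midi/cc` is a made-up convention, not something the plugin defines; the byte packing itself follows the OSC 1.0 spec (strings null-terminated and padded to 4-byte boundaries, big-endian int32 arguments).

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def midi_cc_to_osc(channel: int, control: int, value: int) -> bytes:
    """Encode a MIDI control-change as an OSC message with three int32 args."""
    address = osc_pad(b"/midi/cc")   # hypothetical address pattern
    typetags = osc_pad(b",iii")      # three 32-bit integer arguments
    args = struct.pack(">iii", channel, control, value)
    return address + typetags + args

packet = midi_cc_to_osc(channel=0, control=7, value=100)
print(len(packet))    # the total length is always a multiple of 4
print(packet[:8])     # the address pattern, before its null padding
```

Going the other way is just as mechanical: parse the address, type tags, and arguments back out and emit the corresponding MIDI bytes, which is presumably what the poster means by easy conversion.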