
[Feature Request] OSC, MIDI, and Syphon


  • started a topic [Feature Request] OSC, MIDI, and Syphon


    Hello devs,
    I'm posting on behalf of some of my fellow live visual artist and developer friends, casually called VJs. We often work in many different programming environments, with various tools, to try to realize our visions, whether simple generative graphics or projection mapping. In recent years, game engines offering real-time rendering have become an interesting tool for us to work with. For example, the Unity engine has enabled interactive projection mapping such as in these examples:

    http://vimeo.com/95460603
    https://www.youtube.com/watch?v=hwROCMIOkZo

    Well, after seeing the Unreal 4 demo, a lot of us were joking around about how nice it would be to get native support for OSC ( http://opensoundcontrol.org/ ), MIDI, and Syphon ( http://syphon.v002.info/ ) in Unreal. One friend said he would look at the Unreal plugin API, but I wanted to see if anyone had suggested it already. I searched the forums and didn't get any results, so I figured I would post a request. The real-time rendering capabilities, among everything else, could provide a wonderful new tool for a lot of us. My own background is in a visual programming environment called Quartz Composer, which is losing support from its developers and becoming increasingly outdated. I would be very interested in Unreal 4 becoming my new tool of choice.
    Thanks,
    noiseismyart
    Last edited by noiseismyart; 05-17-2014, 09:49 AM.

  • replied
    I don't know about the MIDI plugin, but my OSC plugin is maintained and used:
    https://github.com/monsieurgustav/UE4-OSC

    AFAIK, OSC is a superset of MIDI, so it is easy to convert between the two.
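To make the conversion idea concrete: a rough sketch in Python that packs a MIDI Control Change into a raw OSC 1.0 message by hand. The /midi/... address scheme here is invented for illustration and is not part of the UE4-OSC plugin; real setups each define their own namespace.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def midi_cc_to_osc(channel: int, cc: int, value: int) -> bytes:
    """Encode a MIDI Control Change (0-127) as a raw OSC message.

    The message carries one float32 argument, the CC value rescaled
    to 0.0-1.0. The address pattern is a made-up example.
    """
    address = f"/midi/{channel}/cc/{cc}".encode("ascii")
    typetag = b",f"                              # one float32 argument
    payload = struct.pack(">f", value / 127.0)   # big-endian, per OSC
    return osc_pad(address) + osc_pad(typetag) + payload

packet = midi_cc_to_osc(channel=1, cc=74, value=127)
# send over UDP with: socket(AF_INET, SOCK_DGRAM).sendto(packet, (host, 8000))
```

Going the other way (OSC to MIDI) mostly means quantizing floats back to 7-bit values, which is why the "superset" framing works in practice.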



  • replied
    Is this still a thing? Is there an updated version of this plugin?



  • replied
    ... great work ... :)



  • replied
    Hi James, just wanted to chase up and get an update on that feature request. Personally, I may go back to Mac for my job, and I don't really like Unity. But I really want to get going on my projects with Unreal and Resolume or MadMapper, or, even better, a plug-in or built-in feature for video-mapping output in Unreal.

    I know your money comes from games for mobile, tablet, PC, and console, but there is so much potential in this engine for other markets.

    I'm from Montreal, and I know a studio here, http://immersivedesignstudios.com/immersive/ , that has used Unreal Engine 4 for their projects.

    Well, thank you!



  • replied
    I am working on a MIDI implementation, if anyone is interested.

    https://www.youtube.com/watch?v=trBv67_Bj8I



  • replied
    Originally posted by DanJacobson View Post
    I'm blown away by the power and capabilities of these software platforms, having returned from a decade-long sabbatical from the scene. But when it comes to hardware interfaces, MIDI is still the most widely supported and available standard, and the visual side of the industry still hasn't come up with anything better than mouse and keyboard (multi-touch screens and 3D gestures lack the precision and low latency). OSC would be great if it were backward compatible with the 95% of hardware that uses MIDI, and not dependent on third-party plugins.

    So... just encouraging the UE devs to have a look at what Ableton is doing with Push (as an evolutionary step beyond more generic MIDI devices) and think about how something similar might be used in a visual/games context. As a programmer, I'm not feeling entirely helpless and frustrated, just weighing up whether I should be trying to port MIDI to UE, or physics to TD.

    Lmao, so I'm a musician, and I perform live on the Push. I use TD to control LEDs, projection mapping, etc. My ultimate goal is to projection-map myself while making music live on my Push, with amazingly rendered sequences running live in the background to my music. Soon I will accomplish this and they will write about me in publications, lol. Sadly, until then I'm stuck using MIDI-to-keyboard converters, which I'm playing with until a real solution comes along.



  • replied
    Originally posted by Lumli View Post
    I understand the frustration; I too come from TD and came here for better processing and a globally illuminated workflow. Sadly, the OSC implementation is the only one that works, and even then it doesn't work fully for me.
    I'm blown away by the power and capabilities of these software platforms, having returned from a decade-long sabbatical from the scene. But when it comes to hardware interfaces, MIDI is still the most widely supported and available standard, and the visual side of the industry still hasn't come up with anything better than mouse and keyboard (multi-touch screens and 3D gestures lack the precision and low latency). OSC would be great if it were backward compatible with the 95% of hardware that uses MIDI, and not dependent on third-party plugins.

    So... just encouraging the UE devs to have a look at what Ableton is doing with Push (as an evolutionary step beyond more generic MIDI devices) and think about how something similar might be used in a visual/games context. As a programmer, I'm not feeling entirely helpless and frustrated, just weighing up whether I should be trying to port MIDI to UE, or physics to TD.



  • replied
    I understand the frustration; I too come from TD and came here for better processing and a globally illuminated workflow. Sadly, the OSC implementation is the only one that works, and even then it doesn't work fully for me.



  • replied
    I've been working with TouchDesigner for MIDI-reactive animation and was hoping to do something more game-like, using physics, hence evaluating UE. However, the lack of MIDI support is a bit of a dealbreaker.

    Coming from a music production background and knowing the possibilities of hardware-software integration, I feel the animation/visual/game-dev community is missing out by not encouraging, or even supporting, the use of hardware control surfaces. Tweaking the multitude of available parameters one at a time with a keyboard and mouse is time-consuming and frustrating, and it turns the process of creation from an expressive, exploratory one into a technical process that works well only when the end result is known in advance.

    E.g., take the basic process of selecting colors for a shader: there are several different colors to select (reflective/ambient/emissive, etc.), on color spaces themselves defined by multiple parameters (HSV or RGBA). This might take 50 mouse clicks to get right, with the user's eyes darting back and forth between the parameter panel and the output. The vast majority of permutations will never be tested, because only one parameter is adjusted at a time while the rest are locked in place. If these are instead mapped to faders on a control surface, millions of combinations can be cycled through in a few seconds, with no need to shift your eyes from the output. Serendipity can play a role, and the whole process is far more enjoyable. When I work with TD, I map basic functions like translation/scale/rotation of objects to an APC40, as well as more context-specific attributes, and discover wild permutations I never would have come across if I had to set things one at a time. The main drawback is the time needed to set this up and map the controllers. It would be nice if the software made this easier.
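The fader-to-color mapping described above is simple in code. A minimal Python sketch, assuming standard 0-127 MIDI fader values, that ties three faders to the H, S, and V of a shader color (the function name and the three-fader layout are illustrative assumptions, not any tool's API):

```python
import colorsys

def faders_to_rgb(hue_cc: int, sat_cc: int, val_cc: int) -> tuple:
    """Map three MIDI fader values (0-127) onto HSV, return RGB in 0.0-1.0.

    Tying hue, saturation, and value to three physical faders lets you
    sweep the whole color space continuously instead of clicking through
    a picker one parameter at a time.
    """
    h = hue_cc / 127.0
    s = sat_cc / 127.0
    v = val_cc / 127.0
    return colorsys.hsv_to_rgb(h, s, v)

# hue fader at zero, saturation and value fully up -> pure red
r, g, b = faders_to_rgb(0, 127, 127)
```

The same pattern extends to any parameter: normalize the 7-bit CC value and feed it into whatever range the engine expects.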

    I knew a character animator who took this further and rigged up an electric piano to his skeletons so that he could "play" his animations expressively rather than script them precisely. This resulted in a very natural look, and once he had set it up, it was simple to produce long, complex animations; he didn't even bother to save them, just learned them and played until he had the best "take". An electric piano has 88 keys; there is velocity, aftertouch, and note-off data; there are effects oscillators and pitch bends, volume and sustain pedals, all of which are just as adept at controlling visual and animation parameters as they are with audio.

    Tactility, immediacy, and the ability to control numerous parameters simultaneously are vital to the creative process in a complex environment. One day I hope video/animation/game developers will design application-specific, integrated interfaces, as Ableton is doing for DAWs; in the meantime, just providing basic MIDI support so we can map our own is the bare minimum for live-performance applications.



  • replied
    Any news on the MIDI implementation?



  • replied
    Hi,
    I want to use Unreal for a fulldome 360° video project.
    It's perfect; we're just missing a Syphon module to send the image to the warping software.
    Please add it in a future release.
    Thanks,
    francois



  • replied
    Glad you like it!

    I have no plans for multicast support yet, but it shouldn't be a massive amount of work, so I'll look into it.

    Post an issue on GitHub:
    https://github.com/monsieurgustav/UE4-OSC



  • replied
    Hey monsieurgustav, any word on when multicast support will be implemented?

    Thank you so much for this plugin, I use it in nearly every UE4 project I do!



  • replied
    With these tools, is there any way to drive animations of meshes or lights in Unreal in real time using an MP3, then output DMX to drive real-world lights?
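The audio-to-light half of that question usually reduces to computing a loudness envelope per block of samples and scaling it to DMX's single-byte channel range. A minimal Python sketch of that mapping, under the assumption that decoded audio arrives as blocks of floats in -1.0..1.0 (the function and scaling here are illustrative, not any DMX library's API):

```python
import math

def rms_to_dmx(samples) -> int:
    """Map one block of audio samples (-1.0..1.0) to a DMX channel value.

    DMX512 channels take a single byte, 0-255. Scaling the block's RMS
    level gives a simple audio-reactive dimmer; a real rig would smooth
    this over successive blocks to avoid flicker.
    """
    if not samples:
        return 0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return min(255, int(rms * 255))

# a full-scale square-wave block has RMS 1.0 -> channel fully on
level = rms_to_dmx([1.0, -1.0, 1.0, -1.0])
```

The same value could drive an in-engine light intensity and a DMX fixture simultaneously, which is what makes the "Unreal plus real-world lights" idea plausible.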

