Positional Time Synth

Is it possible to trigger clips from TimeSynth from different positions in the world? So drums in one location and piano in another? The AddTimeSynthComponent node seems to have only one input for target, and trying to start two instances of TimeSynth causes out-of-sync chaos.

I really hope this is possible! Hearing the granular and modular synths play spatially around me is so cool!

I am trying a similar - well, identical - thing. Have you made any discoveries you would care to share? I am trying to split sounds running off the same TimeSynth to different buses, and have different Audio Components in different places.

Cracked it - I think: multiple TimeSynths in one Blueprint, routed to different buses. Then you can have different Audio Component Blueprints using your different buses, which you can place wherever you like.

The TimeSynth is a stereo synth source rendering through a Synth Component which has its own Audio Component. You can spatialize a TimeSynth, you cannot spatialize the audio it plays since it is not a 3D renderer. The most you could do with the TimeSynth is extend its architecture to support panning, but that feature doesn’t exist right now.
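For anyone curious what "extending its architecture to support panning" would involve: a per-source pan stage usually means applying an equal-power pan law to each voice before it hits the stereo mix. Here is a minimal sketch of that pan law in Python - purely illustrative, since the TimeSynth itself is C++ and exposes no such hook today; the function name is hypothetical.

```python
import math

def equal_power_pan(sample: float, pan: float) -> tuple:
    """Split a mono sample into stereo using an equal-power pan law.

    pan ranges from -1.0 (hard left) to +1.0 (hard right); the
    perceived loudness stays constant across the whole sweep.
    """
    theta = (pan + 1.0) * math.pi / 4.0  # map [-1, 1] -> [0, pi/2]
    return (sample * math.cos(theta), sample * math.sin(theta))

# At center pan both channels sit at ~0.707 (-3 dB), not 0.5 -
# that -3 dB dip is what keeps total power constant while panning.
left, right = equal_power_pan(1.0, 0.0)
```

A naive linear pan (left = 1 - x, right = x) dips about 6 dB in the middle, which is why the trigonometric version is the usual choice.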

@stoppedclock, you don't need to use Source Buses; TimeSynths can already be positioned in space. Just add Attenuation Settings to them.

@Dan Reynolds hi Dan! I am no coding expert, but I am very excited by - and hacking my way through - TimeSynth and other audio stuff. I have achieved most of my goals, but the last thing to confound me: I can get MIDI in and through from Ableton and back, play a synth in Unreal from Ableton, or drive Ableton audio from a MIDI file in a Blueprint - but I cannot for the life of me work out how to play a synth in Unreal with a MIDI file in a Blueprint. Thanks in advance!

There’s a free maintained plugin for that: Procedural Midi in Code Plugins - UE Marketplace

Yup - but I can't work out how to get MIDI data to play a synth within UE4!

If we can control a DAW with a Blueprint MIDI file, and control a synth in Unreal from a DAW, shouldn't it be a matter of chopping away the part that talks to the DAW in both, and then hooking the two remaining Unreal interfaces together?
Mind you, I haven't messed with the Procedural MIDI plugin in a long while. If the MIDI interfacing is handled differently, it can get tricky. There's also the whole thing with competing MIDI conventions, where different MIDI instruments and files do things like note-off differently, which again needs to be handled. The newer MIDI Device plugin (not the procedural one) is very good at handling MIDI, but it has no MIDI file read support like Procedural MIDI does.
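On the note-off point above: the most common discrepancy is that some devices and files send a real Note Off message (status 0x80-0x8F), while others reuse Note On (0x90-0x9F) with velocity 0 to mean the same thing. A synth that only checks the status byte will leave notes hanging. A small sketch of normalizing both conventions (the function name and event tuples are my own, not from any plugin):

```python
def normalize_midi_event(status: int, note: int, velocity: int):
    """Collapse the two common note-off conventions into one event type.

    Real Note Off is status 0x80-0x8F; many sources instead send
    Note On (0x90-0x9F) with velocity 0, which must also end the note.
    """
    kind = status & 0xF0      # strip the MIDI channel nibble
    channel = status & 0x0F   # channel 0-15
    if kind == 0x80 or (kind == 0x90 and velocity == 0):
        return ("note_off", channel, note)
    if kind == 0x90:
        return ("note_on", channel, note, velocity)
    return ("other", channel)
```

Feeding every incoming event through a normalizer like this before it reaches the synth voice logic sidesteps the "different files do note-off differently" problem entirely.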

Ah yes, thanks - I was reverse-engineering one of your projects, and the Procedural MIDI one, and just had gritty monophonic output. Then I finally found the polyphony setting on the Make Synth Preset node and all is good!

Ahaaa, nice!!

@dan.reynolds hi, I'm back delving into TimeSynth! Can you point me to an explanation of - or explain yourself! - "distance range" and "volume scale" in TimeSynth clips? In your TimeSynth livestream you mention a virtual band - that's not what I'm doing, but it's a technically similar concept. Any pointers appreciated; there is surprisingly little floating around about all of this!