Hi UE4 community!
Here’s a cool project that may be interesting to some - we built a realtime interactive music performance environment for a band called The Science, using Unreal Engine 4, Kinect v2, OSC and Ableton Live, and presented it at an online streaming festival last Saturday. The project was developed in ~10 days by a team of 3 people - myself, Iraisynn Attinom Studio (Greece) and SentinelAV (Australia). I would say this is our ver 0.9, with ver 1.0 soon to come.
You can watch our Saturday performance here - The Science - Touchmarks AV - YouTube
We were initially aiming at a solution along the lines of TouchDesigner or Processing/p5, but since my coding experience isn’t all that rich, and thanks to a suggestion by Klavdios (IA), I gave UE4 a quick spin - and upon realizing the scope of possibilities, we jumped in head first. We spent about 3 weeks on research; I was taking my first steps in the engine while trying out different ways to make good use of the Kinect inside UE. Klavdios tried NeoKinect’s avateering option, but we left that visual aesthetic for our next live stream. In the meantime, I was digging around the Unreal.js and node.js for Unreal plugins. Alas, JS is like German to me - I can read it, but I can neither speak nor write it. So after going through the trials of making sure those were usable, we decided to leave them for a later time.

We also started messing around with the Kinect’s IR and depth images as textures, using them to displace the vertices of a mesh plane in realtime. We left some of those styles for later use as well.
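For those curious about that displacement experiment: stripped of the actual UE4 material/Blueprint setup, the gist is simply "depth pixel in, vertex height out". Here’s a minimal plain-C++ sketch of that mapping - the frame size is the Kinect v2’s 512x424, but the grid spacing and height scale are made-up illustration values, not what we actually used:

```cpp
#include <cstdint>
#include <vector>

struct Vertex { float X, Y, Z; };

// Kinect v2 depth frames are 512x424 pixels, 16-bit values in millimetres.
constexpr int DepthW = 512;
constexpr int DepthH = 424;

// Displace a DepthW x DepthH grid of vertices so that Z follows the depth image.
void DisplacePlane(const std::vector<std::uint16_t>& DepthMM, std::vector<Vertex>& Grid,
                   float GridSpacing, float HeightScale)
{
    for (int Y = 0; Y < DepthH; ++Y)
    {
        for (int X = 0; X < DepthW; ++X)
        {
            const int Index = Y * DepthW + X;
            Vertex& V = Grid[Index];
            V.X = X * GridSpacing;
            V.Y = Y * GridSpacing;
            // A value of 0 means "no reading" from the sensor; keep those vertices flat.
            const std::uint16_t Depth = DepthMM[Index];
            V.Z = (Depth == 0) ? 0.0f : Depth * HeightScale;
        }
    }
}
```

In-engine, the same idea lives in a material’s vertex offset rather than in C++, but the math is the same.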
But enough about what’s NOT in the video.
What we ended up using for our beta performance was a Niagara module script that Tom (SentinelAV) created, which uses the depth image data from the Kinect to update the Niagara particle positions accordingly. From there we added pre-directed camera movements, as well as environments. Klavdios made lovely stylistic use of the LiDAR Point Cloud plugin for his environment, while Tom built a nice pre-life version of a deserted Earth using high-quality Quixel Megascans foliage.
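To give a rough idea of what that module does - this is not Tom’s actual Niagara graph, just the core math written out in plain C++ for readability - each particle is assigned one depth pixel, and that pixel is unprojected into a 3D position using approximate Kinect v2 depth-camera intrinsics. The intrinsic values and world scale below are ballpark assumptions:

```cpp
#include <cstdint>
#include <vector>

struct ParticlePos { float X, Y, Z; };

constexpr int DepthW = 512;
constexpr int DepthH = 424;

// Approximate Kinect v2 depth-camera intrinsics (in pixels) - ballpark figures.
constexpr float Fx = 365.0f, Fy = 365.0f;
constexpr float Cx = 256.0f, Cy = 212.0f;

// One particle per depth pixel: unproject (pixel, depth) into a camera-space point.
void UpdateParticlesFromDepth(const std::vector<std::uint16_t>& DepthMM,
                              std::vector<ParticlePos>& Particles,
                              float WorldScale)
{
    for (int Y = 0; Y < DepthH; ++Y)
    {
        for (int X = 0; X < DepthW; ++X)
        {
            const int Index = Y * DepthW + X;
            const float Z = static_cast<float>(DepthMM[Index]); // depth in millimetres
            ParticlePos& P = Particles[Index];
            if (Z <= 0.0f) { P = {0.0f, 0.0f, 0.0f}; continue; } // no reading: park the particle
            // Standard pinhole unprojection: pixel coordinates + depth -> camera-space point.
            P.X = (X - Cx) / Fx * Z * WorldScale;
            P.Y = (Y - Cy) / Fy * Z * WorldScale;
            P.Z = Z * WorldScale;
        }
    }
}
```

In practice the sampling, scaling and colour all happen inside Niagara per frame, but the mapping above is the heart of it.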
In the last few days, we’ve grown to a team of ~12 enthusiasts with the idea of expanding further on this concept. The immediate steps we are taking before our next livestream are to make use of OSC for further realtime control, camera direction and audio reactivity, and to introduce some of the art styles that didn’t make it into this one. Of course, we would also port this to our LIVE live performances (when that becomes a thing again).

However, our main goal is preparing a full-on artistic abstract game/album experience that will be playable (we’re looking towards an all-round solution - mobile, browser, console, PC, VR - but all of that is subject to change, given we are yet to tackle translatable gameplay & control schemes across platforms). We’ve also entertained the thought of doing live “in-game” events. But one step at a time. We will share some info/examples about our process in the next few months, but specific questions would help point us towards what you find interesting, so feel free to ask.
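For anyone wondering what the OSC side might look like: the plan is roughly "Ableton Live (or a bridge running next to it) sends float parameters, UE4 routes them to show controls". Here’s a loose sketch of that routing, independent of whichever OSC plugin we end up on - the addresses and parameter names are invented purely for illustration:

```cpp
#include <string>
#include <unordered_map>

// Parameters the rest of the show reads every tick (names are placeholders).
struct ShowState
{
    float ParticleEnergy = 0.0f; // could drive Niagara spawn rate / velocity
    float CameraSpeed    = 1.0f; // could scale the pre-directed camera moves
    float FogDensity     = 0.02f;
};

// Whatever OSC library receives the packets only needs to hand us an address and a float.
void HandleOscMessage(const std::string& Address, float Value, ShowState& State)
{
    // e.g. an envelope follower in Ableton sends /show/energy in the 0..1 range.
    static const std::unordered_map<std::string, float ShowState::*> Routes = {
        { "/show/energy",    &ShowState::ParticleEnergy },
        { "/show/cam_speed", &ShowState::CameraSpeed    },
        { "/show/fog",       &ShowState::FogDensity     },
    };

    const auto It = Routes.find(Address);
    if (It != Routes.end())
    {
        State.*(It->second) = Value; // route the incoming float to the matching parameter
    }
}
```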
Cheers,
Martin