[CONTRACT/UNPAID] Motion Interface Exhibition/Art Installation - Developers & Programmers Needed

Project Title:

Motion Design Interface (WIP)

Premise:

Currently we design products, buildings, and 3D assets in a relatively basic way, using GUIs as our primary input mechanism. It is hands down the most efficient approach today. Yet, is it possible to use different methods of data input, such as Mocap/AR/VR, to generate fundamentally different design responses and geometries? Bringing our body and senses into the equation might allow us to understand form and scale much better than a computer screen does.

Elon Musk’s video explains the core concept perfectly; meanwhile, sexy & complex mocap-driven simulations & interactions are becoming more and more computable.

So how can we use this tech better within the design industry? Mocap & VR are less efficient than a mouse and keyboard, but far more interactive, which makes them a great fit for opening up a game-like interactive design process to less technically skilled users (kids, stay-at-home moms, Trump, your grandma, etc.).

Description:

As part of my bigger university research & design project, I am developing an interactive installation which will allow users / the public at the upcoming exhibition to design and paint architectural forms in 3D in realtime, using a combination of Kinect (input), UE4 (processing), and a projection screen (output). In this setup, the human body itself becomes a design tool & an input method for creating digital data. The project will be exhibited in London, June – July 2018, as part of a bigger show attracting around 10,000 visitors annually.

The project will be based on Kinect v2 sensor data processing (position, velocity, RGB, & depth, using the Kinect4Unreal Plugin by Opaque Media Group) in UE4 running on a Windows machine. Unreal Engine BP and C++ scripting will be used for the GUI and scene setup. This will be followed by deploying the Volume Texture 3D Painting described here (ShaderBits Plugin by Ryan Brucks), or a similar technique for generating 3D forms at runtime.
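To make the BP/C++ split concrete, here is a minimal sketch of how I imagine the C++ side receiving hand data from the Kinect4Unreal Blueprint layer and turning it into a normalised brush position inside the Design Volume. The class, function, and property names (AInstallationBrush, UpdateHandPosition, DesignVolumeBox, MaxHandSpeed) are placeholders of mine, not part of any plugin; the actual Kinect4Unreal calls would stay in Blueprint and simply feed this function.

```cpp
// InstallationBrush.h - hypothetical actor that turns a tracked hand
// position into a normalised brush position inside the Design Volume.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "InstallationBrush.generated.h"

UCLASS()
class AInstallationBrush : public AActor
{
    GENERATED_BODY()

public:
    AInstallationBrush()
    {
        PrimaryActorTick.bCanEverTick = false;
        DesignVolumeBox = CreateDefaultSubobject<UBoxComponent>(TEXT("DesignVolume"));
        RootComponent = DesignVolumeBox;
    }

    // Called from Blueprint each frame with the hand position supplied by
    // the Kinect4Unreal nodes (world space, already smoothed in BP).
    UFUNCTION(BlueprintCallable, Category = "Brush")
    void UpdateHandPosition(FVector WorldHandPos, float HandVelocity)
    {
        // Convert the hand position into the Design Volume's local space
        // and normalise it to 0..1 so it can index a volume texture.
        const FVector Local  = DesignVolumeBox->GetComponentTransform().InverseTransformPosition(WorldHandPos);
        const FVector Extent = DesignVolumeBox->GetScaledBoxExtent();

        BrushUVW = FVector(
            FMath::Clamp(Local.X / (2.f * Extent.X) + 0.5f, 0.f, 1.f),
            FMath::Clamp(Local.Y / (2.f * Extent.Y) + 0.5f, 0.f, 1.f),
            FMath::Clamp(Local.Z / (2.f * Extent.Z) + 0.5f, 0.f, 1.f));

        // Faster hand movement paints with more intensity (the intensity
        // multiplier from the outline spec below).
        BrushIntensity = FMath::Clamp(HandVelocity / MaxHandSpeed, 0.f, 1.f);
    }

    UPROPERTY(BlueprintReadOnly, Category = "Brush")
    FVector BrushUVW = FVector::ZeroVector;

    UPROPERTY(BlueprintReadOnly, Category = "Brush")
    float BrushIntensity = 0.f;

    // Hand speed (cm/s) that maps to full brush intensity; a guessed default.
    UPROPERTY(EditAnywhere, Category = "Brush")
    float MaxHandSpeed = 200.f;

private:
    UPROPERTY(VisibleAnywhere, Category = "Brush")
    UBoxComponent* DesignVolumeBox;
};
```

A Blueprint fed by the Kinect4Unreal nodes would call UpdateHandPosition every frame, and the material/painting side would then read BrushUVW and BrushIntensity.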

Previous Work:

A brief portfolio of my recent design work is available here.

Outline Specification:

  1. Orbit-style movement around the Design Volume, using Kinect hand positions & gestures as inputs
  2. 3D Pointer Brush functionality (X, Y, Z) with intensity and size multipliers, using Kinect hand positions & gestures as inputs
  3. Painting into the 3D Volume Texture space of the Design Volume in realtime (see the addressing sketch after this list)
  4. On the user’s request or after a timeout, reset the simulation & write the generated Volume Texture to disk
  5. Convert the recorded 3D Volume Textures into point cloud / density data to analyse the generated map (this can happen outside of UE4, e.g. in Houdini)
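Items 3 and 4 hinge on addressing a pseudo-volume texture, i.e. a 3D volume stored as Z slices tiled into a 2D render target, which is the representation Ryan Brucks’ approach builds on. The helpers below are only my sketch of that addressing plus the disk write for item 4; the names (VolumePaint, BrushUVWToTiledUV, SaveVolumeToDisk) and the parameters VolumeResolution, TilesPerRow, and PaintTarget are assumptions of mine, and the tile layout would need to match whatever the ShaderBits material actually expects.

```cpp
// VolumePaintHelpers.h - hypothetical helpers for spec items 3 & 4.
#pragma once

#include "CoreMinimal.h"
#include "Misc/Paths.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"

namespace VolumePaint
{
    // Map a normalised 0..1 brush position (from the Kinect-driven brush)
    // to a UV inside a pseudo-volume render target where the Z slices are
    // tiled left-to-right, top-to-bottom in a TilesPerRow x TilesPerRow grid.
    inline FVector2D BrushUVWToTiledUV(const FVector& UVW, int32 VolumeResolution, int32 TilesPerRow)
    {
        const int32 Slice = FMath::Clamp(
            FMath::FloorToInt(UVW.Z * VolumeResolution), 0, VolumeResolution - 1);
        const int32 TileX = Slice % TilesPerRow;
        const int32 TileY = Slice / TilesPerRow;

        // Each tile occupies 1/TilesPerRow of the 2D texture in each axis.
        const float TileSize = 1.f / TilesPerRow;
        return FVector2D(
            (TileX + UVW.X) * TileSize,
            (TileY + UVW.Y) * TileSize);
    }

    // Spec item 4: dump the accumulated volume texture to disk when the
    // simulation resets, so it can be post-processed (e.g. in Houdini) later.
    inline void SaveVolumeToDisk(UObject* WorldContext, UTextureRenderTarget2D* PaintTarget, const FString& FileName)
    {
        UKismetRenderingLibrary::ExportRenderTarget(
            WorldContext, PaintTarget, FPaths::ProjectSavedDir(), FileName);
    }
}
```

The painting itself would still happen in a material (a sphere mask centred on the tiled UV, drawn into the render target with UKismetRenderingLibrary::DrawMaterialToRenderTarget), and that material setup is exactly where I would value input from someone who has used the ShaderBits approach before.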

Help needed:

I have previous experience with coding (C# in Unity, SideFX Houdini) and 3D modelling & texturing, along with a good overall understanding of computer graphics limitations. After playing around in UE4, I think I have managed to put together a realistic specification that pushes the computational capabilities of the engine (and my machine) just enough to make it a fun experiment to work on. That said, I could really use input from more experienced UE4 users who can advise me on, and help out with, building Blueprints, Blueprint Interfaces, and C++ scripting.

Outcomes:

A fun and interactive user experience allowing the general public to participate in the design process of a building via interactive media. You are free to use the generated code & assets as you wish. I will provide concept mock-ups, diagrams, and visuals during development, and extensive photo & video coverage of the final exhibition setup.

Compensation:

If you have relevant experience in Mocap data processing / 3D dynamic volume texturing, I will consider hiring you on a short-term contract basis, but I would prefer to team up with someone curious and like-minded: someone who wants to collaborate and have a sexy piece of design work in their portfolio.

Contact:

Hit me up here or at jevarch@gmail.com if you want to team up or know someone who might be interested!