Nuren: Cyberpunk VR music visualizer

Hey guys, thought I’d show you a project I’ve been working on with a couple of friends. It’s a music-synchronized real-time cyberpunk rock opera visualizer… thing. Made with UE4.

We launched a Kickstarter for it a week and a half ago, but I was busy improving it and getting the Mac and Linux demo builds ready (they’re almost ready now!), so I hadn’t yet posted here about it.


Kickstarter page:

Windows demo build (64-bit):

It works just fine on your desktop without an Oculus Rift, of course. The specs needed for desktop are pretty mild; old GPUs should be fine. There's an options menu in the demo (press Escape) that will let you adjust a bunch of settings.

If you want to run it on an Oculus Rift DK2, you'll probably want at least a GTX 760 (for an 80% sampling percentage) or a GTX 970 (for a 100% sampling percentage) to get a solid 75 FPS. Check out the readme file for some tips on getting more consistent performance out of your GPU (especially if you're on NVIDIA) if the frame rate is dropping for you.

Let me know what you think! We'll have the Mac and Linux builds up as soon as we're able to get the HMD to enable consistently and correctly on those platforms.

Remaking this thread because I gave it the wrong title the first time, and edited titles don’t seem to work with this forum software. Hope that’s ok!

I don’t like it, I LOVE IT!

Fantastic work! Keep it comin!

Some cool stuff in that video! You should consider doing a couple of technical blog posts maybe, when you have some time. I particularly like the wavy background walls/blocks. Is that a vertex shader or are you moving stuff with a script?

This is phenomenal! Great work!

Thanks! I will be making some blog posts. It's being done with static mesh components (with some special stripped-down logic to avoid some overhead) because there's currently no way to get proper velocity buffer rendering when the world position offset material pin is used. I hope to use a modified instanced static mesh component in the future, to leverage hardware instancing.

The moving mesh effects are created with a custom system I embedded into Blueprint for doing these sorts of 3D motion graphics things. I hope to show it off in the future, and eventually sell it in the code marketplace.

Thank you!

I’m a huge virt fanboy so I can’t wait for this.

I hope those robots are just placeholders though.

Thanks! Don’t worry, there’s a lot that still needs work, and we’re listening to everyone’s feedback on how we should change/improve the project. We didn’t have much time to refine the character art and get feedback before launching the Kickstarter, so I’m definitely going to bring up the character design feedback I’ve heard since then with the other guys on the team.

Hey man, can you give a rundown on how you worked the Audio Visualization plugin? I can’t seem to understand how to make it work.

Hi! Sorry, I’m still writing up a blog post on some of the tech behind this demo.

But I can give you a short summary: it doesn’t use the audio visualization plugin at all. It reads a pre-made sequence of events (created by the musician) stored on disk (as a MIDI file, at the moment) that is used to trigger events and matinee/scene changes while the song plays. The sequence timing is slaved to the audio at (or within a few milliseconds of) sample accuracy.

We’ve got a Mac build up now, if anyone wants to try it:

There are some issues with the Oculus Rift, but it should mostly work. (If you want the best performance out of the Oculus Rift on Mac hardware in general, I recommend using Boot Camp.)

Hey guys, just giving an update for anyone following this project:

We’re fully funded thanks to some unbelievable generosity (all the individual Kickstarter contributors, and Epic, a phenomenal patron). We have a few hours left to go on the Kickstarter, and we put up a few modest stretch goals that would secure us some more dev time to polish the project and add some bonus content. Nothing essential, and we’ve already hit one of them.

I also did a few live coding streams on twitch (thanks everyone who showed up!) and got a pretty funny highlight. I worked on a new feature for about 30 minutes and then tried it out for the first time (explicit language warning!):

OMG!!! It works!

Always amazing when you code something new and it works the first time.