A huge problem I’m noticing with Verse is the fact that you still heavily rely on preconfiguring devices in the workspace prior to runtime. It baffles me that this is strictly necessary for basic things, like determining whether a player is holding a certain item. Currently, you have to create a conditional_button and preconfigure it ahead of time just to use its IsHoldingItem function to determine in Verse code whether a player is holding a certain item. This is not ideal.
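To make the workaround concrete, here is a minimal sketch of what that looks like today, as best I understand the current API. The device and class names (MyConditionalButton, check_item_device) are my own; the conditional_button still has to be placed and configured in the level before the session starts:

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Sketch of the current workaround: the conditional_button is
# placed and configured in the editor ahead of time, then wired
# in here via @editable. Names here are illustrative.
check_item_device := class(creative_device):
    @editable
    MyConditionalButton : conditional_button_device = conditional_button_device{}

    CheckPlayer(Agent : agent) : void =
        # IsHoldingItem is a failable function on the device,
        # not on the agent itself
        if (MyConditionalButton.IsHoldingItem[Agent]):
            Print("Agent is holding the configured item")
```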
I suggest implementing a way to directly instantiate devices at runtime and then discard them. That way, we’d be able to write logic directly in code without having to bring in a bunch of random devices and tie them together.
For example, the IsHoldingItem function is tied directly to the conditional_button device. Instead of having to make a conditional_button and leave it floating somewhere in my game, preconfigured, I should be able to create one directly in code. And I don’t mean manually: it should happen automatically when IsHoldingItem is called directly on the player. The engine should know it needs a conditional_button to perform this check, create one itself with the parameters I give it, and then destroy the device once the code block finishes running.
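In other words, something like this purely hypothetical API (none of this exists today; the agent-level IsHoldingItem and the item_definition parameter are imagined):

```verse
# Hypothetical, NOT real API: what the request amounts to.
# The engine would spin up a temporary conditional_button behind
# the scenes, configure it from the arguments, run the check,
# and dispose of the device afterwards.
CheckHeldItem(Agent : agent, Item : item_definition) : void =
    if (Agent.IsHoldingItem[Item]):   # imagined extension on agent
        Print("Agent is holding the item")
```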
This would make programming games in UEFN and Verse a lot more intuitive.
Currently this is still not possible, correct? It would do me really well to be able to instantiate VFX creators, damage volumes, etc. at runtime, because I don’t know exactly how many I need, only that I might need as many as 10. I would much rather create them dynamically than set up 10 extras of each device that may never be used in game. If there is any way to create devices at runtime, please let me know; otherwise I definitely second this request, if possible!
Hope that helps till we can get more flexibility. Oh, and Scene Graph supports Verse code inside your assets that no longer relies on devices, so that’s something to look into. I’ll be posting a tutorial on Scene Graph hopefully this week.
Some things (like basic VFX and props) are already possible to spawn dynamically with Verse, but it’s still a little limited.
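For props specifically, the existing SpawnProp function already covers the dynamic case, if I remember its shape correctly. A rough sketch (the asset reference still has to be set up in the editor beforehand, e.g. via an @editable creative_prop_asset; the function name SpawnAndDespawn is mine):

```verse
using { /Fortnite.com/Devices }
using { /UnrealEngine.com/Temporary/SpatialMath }

# Sketch of dynamic prop spawning as it works today.
# PropAsset would come from an @editable creative_prop_asset field,
# so the asset itself is still authored ahead of time.
SpawnAndDespawn(PropAsset : creative_prop_asset) : void =
    SpawnResult := SpawnProp(PropAsset, vector3{X := 0.0, Y := 0.0, Z := 100.0}, IdentityRotation())
    if (Prop := SpawnResult(0)?):
        # Spawned props can also be removed again at runtime
        Prop.Dispose()
```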
Scene Graph is what you’re looking for to get all of this powerful behavior you’re describing.
It’s still in alpha and at a very early stage, but, for example, the goal is for every device to have proper APIs migrated to Scene Graph and/or to be converted into actual entities/prefabs. It uses entities as containers and components as logic/behavior data, allowing dynamic scene building and interaction both in the editor and during the game.
This will allow dynamic instantiation, configuration, management, and behavior not only for devices but for pretty much anything in the world/scene.
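To give a flavor of the entity/component split described above, here is a very rough sketch based on the early Scene Graph beta; the exact module path, base class, and lifecycle method names may well have changed since, so treat this as illustrative only:

```verse
using { /Verse.org/SceneGraph }
using { /Verse.org/Simulation }

# Illustrative only (alpha/beta API, subject to change): logic
# lives in a component attached to an entity, not in a device
# placed in the level.
hello_component := class(component):
    # Lifecycle hook that fires when the owning entity starts
    # simulating; name per the early beta docs.
    OnBeginSimulation<override>()<suspends> : void =
        Print("My entity started simulating")
```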
As I said, it is currently in alpha with lots of changes over time and lots of limitations, but you can check more about it here: