I would like to develop a blueprint to show animated 3D data viz, and I would like to ask whether it’s possible to:
Read data in real time (e.g. take data from a JSON file at a URL; this is to avoid making a new package with updated data every month)
Input this data to a Motion Design Actor (e.g. use scale transform on the objects of a Cloner to represent the different values)
It would be cool to go beyond viewing this on a screen and view this in VR with a Windows application or even in Mixed Reality with an Oculus Quest application.
Do you think it’s feasible? Are there any technical limitations right now? (e.g. I don’t know if Motion Design is compatible with VR or Android applications)
In case of technical limits, what kind of workaround should I look for? (e.g. can I bake the Motion Design animation instead of processing it in real time)
Hey,
Motion Design is currently only supported on Windows, Linux and macOS (see Avalanche.uplugin).
However you could still use Cloners without enabling Motion Design. For that you’d have to enable the “ClonerEffector” plugin.
5.6 introduces a Motion Design plugin called “Motion Design Data Link”. This is Motion Design’s solution for getting data from internal/external sources via the “Data Link Graph” (e.g. an HTTP request). Despite its name, this plugin does not actually require Motion Design to be enabled and should be usable outside these desktop platforms. However, do note that this plugin is experimental; if you want a non-experimental way of doing the same data gathering, you could look at the HTTP Blueprint plugin and fetch the data that way.
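Whichever plugin you use, the fetching step itself is simple: HTTP GET the JSON, parse it, pull out the numeric series you want to drive the clones. Here is a minimal sketch of that logic in Python rather than Blueprint (the URL and the `"values"` payload key are hypothetical — adapt them to your actual endpoint):

```python
import json
from urllib.request import urlopen

def fetch_json(url: str) -> dict:
    """Download and parse a JSON document (the equivalent of an HTTP GET node)."""
    with urlopen(url) as response:
        return json.load(response)

def extract_values(payload: dict, key: str = "values") -> list:
    """Pull the numeric series that will drive the clone scales.
    The 'values' key is an assumed payload shape, not a real API."""
    return [float(v) for v in payload.get(key, [])]

# e.g. values = extract_values(fetch_json("https://example.com/data.json"))
```

In Blueprint the same flow would be an HTTP GET node feeding a JSON parse, re-run on a timer so updated data is picked up without shipping a new package.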
To manipulate the scales of the cloner meshes, I’d recommend using an Effector and setting its Scale (in Offset) to drive the scale of the clones.
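Before feeding raw data into the Effector, you’ll usually want to remap the values into a sane scale range so one outlier doesn’t dwarf everything else. A small sketch of that normalization (pure math, independent of Unreal; the 0.2–2.0 scale range is just an assumption):

```python
def values_to_scales(values, min_scale=0.2, max_scale=2.0):
    """Linearly remap raw data values into [min_scale, max_scale],
    suitable for driving per-clone scale (e.g. an Effector's Offset scale)."""
    if not values:
        return []
    lo, hi = min(values), max(values)
    if hi == lo:
        # All values equal: fall back to the midpoint of the scale range.
        return [(min_scale + max_scale) / 2.0] * len(values)
    span = max_scale - min_scale
    return [min_scale + (v - lo) / (hi - lo) * span for v in values]
```

The same remap is a couple of Map Range Clamped nodes in Blueprint; the useful part is deciding the output range up front so your visualization stays readable as the data changes month to month.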