[=Gigantoad;105008]
UE4 already supports real version control. I use Perforce, so does Epic. Blueprints have visual diffing already. Granted, I work alone currently so I don’t know how much of hassle it is in a team. Seems to work for Epic though. Maybe it’s not even necessary that several people work on the same blueprint? Certainly not simultaneously?
[/]
Well, that’s not really a satisfactory answer, now is it…
Perforce is not free (as opposed to Git/Mercurial/SVN etc.), and is not exactly a common tool for version-controlling scripts…
[]
Well, they are adding scripting support right? You know, actually I’m not even sure why we’re still taking about it
[/]
Well, afaik, what Epic is currently doing is building the infrastructure for implementing external-language bindings - they are just using Lua as an example so they have something to work against, and because it's one of the easiest languages to integrate into a C++ runtime.
To extrapolate from that and say that they are implementing, or plan to implement, any particular language would be erroneous (at least at this point in time).
And again, this thread is not just a shout-out to Epic; it is mainly about swaying opinion within the UE4 user-base in that direction, opposing the notion that having JUST C++ source-access is enough.
[]
On your second point, you’ll just have to explain why you would port from blueprint to scripting. What’s the benefit? I think it would be more likely an either/or situation, because if you do end up porting blueprint to code it would be performance-critical stuff. Some door firing an event one time when the player gets close to it would hardly be worth it to port at all. If you’re doing per tick heavy math stuff you may want to port it, but would anyone really then port it to LUA or C# instead of C++? It doesn’t seem to make much sense.
[/]
The way I see it, Blueprints are not primarily a programmer tool - their main target (and Epic has stated this on many occasions) is artists, game designers, and generally people who are less technically inclined. Given that, and given that Blueprints will probably never reach the level of expressiveness that mainstream scripting languages have, it is a safe bet that the Blueprint code they generate would more often than not be best treated as little more than a "prototype" or "proof of concept" by some designer/artist. That is also a perspective that Epic personnel have publicly expressed on many occasions.
The blueprints that end up being generated might be much more convoluted to decipher than what an experienced programmer could express in code. Moreover, once there is scripting support, much of the logic will already be developed by programmers in a scripting language, and it would be more convenient for them to translate blueprints into their native language - if nothing else, for re-use and extensibility, bypassing any inter-op layer between a compiled blueprint and a written script.
Plus, as noted before, scripts would probably still be faster than compiled blueprints in many cases. That may not matter a lot of the time, and C++ would be faster still, but the script version would be a sweet spot: a performance boost that is sufficient for most cases. Heavy math, yes, that is best coded in C++, but such cases would be few and far between. This is scripting-optimization land, as has been practiced in scripting environments for many years now.
Case in point: the C-extension variant of the "simplejson" package in Python (for decoding/encoding formatted text), as an alternative to the default pure-Python one (a measurable performance boost - believe me, I've checked it extensively). What percentage of the code-base does decoding/encoding represent in this case? That depends, but if it is "the main" bottleneck in your code-base (given you properly profiled your execution and are not just "following your gut", which would be wrong in most cases), then once you solve it, converting the rest to C++ would be futile (there is a point of diminishing returns on that performance graph…)
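To make the "properly profiled" point concrete, here's a minimal sketch (plain Python standard library, nothing UE4-specific; `load_config` is just a made-up stand-in for a suspected hotspot) of measuring where time actually goes before deciding what's worth porting:

```python
import cProfile
import io
import json
import pstats

def load_config(text):
    # Stand-in for the suspected hotspot (JSON decoding).
    return json.loads(text)

def main():
    payload = json.dumps({"items": list(range(1000))})
    for _ in range(200):
        load_config(payload)

profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

# Show only the five most expensive calls by cumulative time -
# these are the only candidates worth converting to C/C++.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```

Only what floats to the top of that report is a porting candidate; everything below it sits past the point of diminishing returns.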
So, say you have a function in your code-base that does ray-trace/ray-cast computation inside a complex algorithm. You probably wouldn't even prototype that with blueprints; you'd start off with a script function. Later, this function gets used more and more throughout your script code-base, and its cost becomes noticeable at run-time. You THEN convert just that function into C++ using UE4's extensibility classes, with direct access to it from your scripts. You refactor the call-sites to use that variant instead of the scripted one you wrote initially, and voila - you benefit from both worlds and have solved your bottleneck - with minimal use of C++, just where it's appropriate.
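A minimal sketch of that swap-in pattern (the `_native_raycast` module name is hypothetical, and the geometry is a deliberately simplified ray/sphere test - not anyone's real API):

```python
def _raycast_scripted(origin, direction, sphere_center, sphere_radius):
    """Pure-script ray/sphere hit test - the initial implementation.

    Assumes `direction` is normalized; simplified, ignores hits behind
    the ray origin.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = sphere_center
    # Vector from the ray origin to the sphere center.
    lx, ly, lz = cx - ox, cy - oy, cz - oz
    # Projection of that vector onto the ray direction.
    t = lx * dx + ly * dy + lz * dz
    # Squared distance from the sphere center to the ray.
    d2 = (lx * lx + ly * ly + lz * lz) - t * t
    return d2 <= sphere_radius * sphere_radius

try:
    # Once profiling shows this is the bottleneck, ship a compiled
    # version under the same name (hypothetical extension module)...
    from _native_raycast import raycast
except ImportError:
    # ...and the scripted version keeps every call-site working
    # wherever the native build is absent.
    raycast = _raycast_scripted
```

Call-sites just use `raycast(...)` and never care which implementation is behind it - the same trick simplejson uses with its C extension.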
This is what is called "hybrid/heterogeneous development", and it often provides the best trade-offs all around.