I used the Tiny-js interpreter, so the code wasn’t converted into C++; rather, a simple JavaScript interpreter ran it. To allow the pins to match the parameters of the function, I created a custom blueprint node based on the composite node, then changed the nodes and pins that were created inside that composite node depending on what types of parameters the JavaScript function was going to take.
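To illustrate the idea of deriving pins from a JavaScript function's parameter list, here's a minimal self-contained sketch. The `PinDesc` struct and `pinsForJsFunction` name are invented for illustration; the real implementation would use the engine's pin types and the Tiny-js parse tree, not string parsing.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Hypothetical pin descriptor: a name plus a coarse type, standing in for
// the real blueprint pin types. Not actual UE4 or Tiny-js API.
struct PinDesc {
    std::string name;
    std::string type; // toy model: everything defaults to "number"
};

// Parse a parameter list like "x, y" out of a JS function header such as
// "function add(x, y)" and emit one input pin per parameter.
std::vector<PinDesc> pinsForJsFunction(const std::string& header) {
    std::vector<PinDesc> pins;
    size_t open = header.find('(');
    size_t close = header.find(')');
    if (open == std::string::npos || close == std::string::npos || close < open)
        return pins;
    std::string params = header.substr(open + 1, close - open - 1);
    std::stringstream ss(params);
    std::string param;
    while (std::getline(ss, param, ',')) {
        // Trim surrounding whitespace before using the name as a pin label.
        size_t b = param.find_first_not_of(" \t");
        size_t e = param.find_last_not_of(" \t");
        if (b == std::string::npos) continue;
        pins.push_back({param.substr(b, e - b + 1), "number"});
    }
    return pins;
}
```

The custom node would call something like this on compile and recreate its input pins from the returned list.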
There are a few things I still find that useful for, but in general I gave up on that approach quite some time ago. Lately I’ve been digging into the blueprint code and creating custom compilers for it, so that I can use blueprint as a kind of domain-specific language for some of the specialist things I need, things that would require really ugly and complex graphs in standard blueprints.
On the subject of having C++ inside a blueprint, one thing that might be possible in the future (though I don’t think Epic will implement it) is a mixed C++ and blueprint class. By that I mean you could define the blueprint in the standard way, but for certain functions (I mean whole functions, not nodes) in that class, open a text editor window and write the function in C++. This might become possible because Epic are looking into allowing blueprints to be compiled to C++, rather than just to the virtual machine bytecode they are compiled to currently.
At the moment, the way blueprints work is that when you compile them, a BlueprintGeneratedClass is created and the variables and UFunction stubs are added to it, while each node is compiled into a set of statements (not actual bytecode at this stage). The generated class and the compiled statements are then passed to a backend compiler. There are currently two of those backends. The one that is actually used compiles the statements into bytecode, which is inserted into the UFunctions in the BlueprintGeneratedClass. The other backend converts the BlueprintGeneratedClass and the statements into C++ code. Currently this is only for debugging, and in 4.5 it can only be shown in the output log. In the master branch they have added the ability to save the C++ to files, but it is still meant for debugging at the moment. However, Epic are looking into allowing the C++ code to be used in the future for faster blueprints (faster performance, that is; compiling would be slower than with current blueprints).
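The statement-list-plus-two-backends split can be sketched like this. All names here are invented stand-ins (the real code lives in the engine's KismetCompiler module); the point is just the shape: the same compiled statements feed either a bytecode emitter or a C++ text emitter.

```cpp
#include <string>
#include <vector>

// Toy stand-in for a compiled node statement.
struct Statement { std::string op; std::string arg; };

// Common backend interface: both backends consume the same statements.
struct Backend {
    virtual std::string compileFunction(const std::string& name,
                                        const std::vector<Statement>& stmts) = 0;
    virtual ~Backend() = default;
};

// Stand-in for the VM backend: packs statements into "bytecode" (here just
// a flat string) that would be stored on the function stub.
struct VmBackend : Backend {
    std::string compileFunction(const std::string& name,
                                const std::vector<Statement>& stmts) override {
        std::string bytecode;
        for (const auto& s : stmts) bytecode += s.op + "(" + s.arg + ");";
        return bytecode;
    }
};

// Stand-in for the debugging C++ backend: emits readable source text instead.
struct CppBackend : Backend {
    std::string compileFunction(const std::string& name,
                                const std::vector<Statement>& stmts) override {
        std::string out = "void " + name + "() {\n";
        for (const auto& s : stmts) out += "    " + s.op + "(" + s.arg + ");\n";
        return out + "}\n";
    }
};
```

Swapping the backend changes the output format without touching the node-to-statement compilation step, which is why the C++ dump was cheap for Epic to add.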
Now, here is what might be possible, though I don’t think Epic have any plans for it: if the blueprint was being compiled to C++, then it should in theory be possible for a function in a blueprint to be defined in C++, with that C++ code passed to the backend compiler, which would just insert it into the class it is writing. One reason I don’t think Epic will go down this route is that they have said they don’t plan to allow C++-only blueprints, meaning a blueprint would still need to be compilable to VM bytecode, which wouldn’t be possible for a blueprint that included C++ code. Another reason I don’t think they would go down this path is that it really isn’t that different from just creating a C++ class and having the blueprint inherit from it.
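The "insert it into the class it is writing" step is trivial for a text-emitting backend, which is why the idea is at least plausible. A hypothetical sketch (none of this is real engine API; `emitClass` and its inputs are invented for illustration):

```cpp
#include <map>
#include <string>

// Hypothetical: given a class name and a map of function name -> hand-written
// C++ body, splice each body verbatim into the emitted class. A real backend
// would mix these with functions translated from node statements.
std::string emitClass(const std::string& className,
                      const std::map<std::string, std::string>& cppBodies) {
    std::string out = "class " + className + " {\npublic:\n";
    for (const auto& f : cppBodies) {
        out += "    void " + f.first + "() {\n";
        out += "        " + f.second + "\n"; // verbatim user-written C++
        out += "    }\n";
    }
    return out + "};\n";
}
```

The catch, as noted above, is that a verbatim body like this has no VM-bytecode equivalent, so such a blueprint could only ever take the C++ path.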
https://forums.unrealengine.com/showthread.php?50988-Request-Blueprint-Variable-Readibility