That function doesn’t return results the way you expect. The Python TensorFlowComponent wraps callbacks so that results are always delivered via *json_input_gt_callback*, whether you use multi-threading or not. With multi-threading enabled you couldn’t receive the answer as a direct return value anyway. Listen to the *json_input_gt_callback* function, which receives the JSON results you’re looking for. See https://github.com/getnamo/tensorflow-ue4/blob/master/Content/Scripts/TensorFlowComponent.py#L101 for the Python logic handling this. If you don’t use multi-threading, you can modify that section to return the results directly.
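To make the flow concrete, here’s a minimal sketch of the callback pattern described above. Everything except the *json_input_gt_callback* name is hypothetical and simplified (the fake component, the dummy “model”, the `json_input` entry point are all stand-ins, not the plugin’s real API); the point is just that the input call returns nothing and results arrive through the callback.

```python
# Hypothetical, simplified stand-in for the plugin's TensorFlowComponent.
# Illustrates the pattern only: json_input() never returns inference
# results; they are forwarded to json_input_gt_callback instead.
import json
import threading

class FakeTensorFlowComponent:
    def __init__(self, multithreaded=False):
        self.multithreaded = multithreaded
        self.results = []

    def json_input_gt_callback(self, json_result):
        # In the real plugin, this is where your results arrive.
        self.results.append(json.loads(json_result))

    def _run_inference(self, json_args):
        data = json.loads(json_args)
        result = json.dumps({"prediction": data["x"] * 2})  # dummy "model"
        self.json_input_gt_callback(result)

    def json_input(self, json_args):
        # Returns None either way; results go to the callback.
        if self.multithreaded:
            t = threading.Thread(target=self._run_inference, args=(json_args,))
            t.start()
            t.join()  # joined here only to keep the example deterministic
        else:
            self._run_inference(json_args)

component = FakeTensorFlowComponent()
returned = component.json_input(json.dumps({"x": 21}))
print(returned)              # None - nothing comes back directly
print(component.results[0])  # {'prediction': 42} - arrives via the callback
```

In the single-threaded case you could short-circuit this and return `result` directly from `json_input`, which is the modification suggested above.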
I generally haven’t used this plugin with C++ inference; developing and calling JSON input from Blueprints is typically more amenable to ML prototyping. That said, I think a refactor is in order that would allow the TensorFlow component to be called natively, which would simplify cases like these (and allow the same API to call remote Python servers). That refactor may be a while off though, as I don’t have free open-source time in the near term.
Any logs from such crashes would be welcome as a new issue on the getnamo/SocketIOClient-Unreal GitHub repo; please include a detailed repro if possible.
Same issue with time currently, but you can try using the C API natively right now (see Neargye/hello_tf_c_api on GitHub, e.g. https://github.com/Neargye/hello_tf_c_api/blob/master/src/session_run.cpp); it just doesn’t have any UE4-specific data formats or conveniences implemented. If you get a good workflow going, consider contributing.