Local Llama and UE5 integration

Hi,

I’m running a local Llama model and have built an API around it. Now I need to integrate it into Unreal Engine 5 to power a custom offline chatbot for in-game interactions.

Has anyone done something similar, or does anyone have tips for a smooth integration and good runtime performance?

Thanks!