Announcing LLMAI Plugin - Realtime AI Text, Voice, and Function Calling!

We just dropped something pretty exciting, at least to me; hopefully others will find it useful or interesting too.

Video Demo here

LLMAI is a plugin that lets you have real-time AI conversations, with function calling, directly in Unreal Engine 5.6 at runtime.

What’s the big deal?

  • Real-time voice chat with OpenAI (yes, you can literally talk to your game!)

  • Function calling system - the AI can actually control your game, spawn objects, and change settings (rough sketch below)

  • Two complete demos including an AI-controlled arcade game that responds to voice commands

  • Production-ready with full source code and comprehensive docs

Perfect for prototyping AI NPCs, voice-controlled interfaces, or just having fun conversations with your projects. The AI can even play games for you!
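If you're curious what the function calling side looks like in practice, here's a rough C++ sketch of the general pattern: you describe a function with a JSON schema, and your handler runs whenever the model calls it. Note that `ULLMAIClient`, `RegisterFunction`, and `SpawnInFrontOfPlayer` are illustrative names for this sketch, not the plugin's actual API; see the docs for the real thing.

```cpp
// Rough sketch of the function-calling pattern. ULLMAIClient, RegisterFunction,
// and SpawnInFrontOfPlayer are illustrative names, not the plugin's actual API.
#include "Dom/JsonObject.h"

void AMyGameMode::SetupAIFunctions(ULLMAIClient* Client)
{
    // A JSON schema tells the model when and how to call the function.
    const FString SpawnSchema = TEXT(R"({
        "name": "spawn_object",
        "description": "Spawn an object in front of the player",
        "parameters": {
            "type": "object",
            "properties": {
                "object_class": { "type": "string" },
                "distance":     { "type": "number" }
            },
            "required": ["object_class"]
        }
    })");

    // When the model emits a spawn_object call, run our game logic and
    // return a JSON result string for the model to read back.
    Client->RegisterFunction(TEXT("spawn_object"), SpawnSchema,
        [this](const TSharedPtr<FJsonObject>& Args) -> FString
        {
            const FString ObjectClass = Args->GetStringField(TEXT("object_class"));
            double Distance = 500.0;
            Args->TryGetNumberField(TEXT("distance"), Distance);

            SpawnInFrontOfPlayer(ObjectClass, Distance); // our own game code
            return TEXT("{\"status\":\"spawned\"}");
        });
}
```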

Check it out: LLMAI Plugin | Fab

This version works with the OpenAI Realtime API with all of its voices. We are working on support for more AI providers, along with locally hosted integrations, so stay tuned!

Questions, feedback, or cool use cases welcome!

Would love to see what the community builds with this.

Cheers!

We’re on our way to adding LocalAI realtime API capability.

Unfortunately, the current version of LocalAI's Realtime API is basically just a stub and nowhere near complete, so we're working on updating that too! As a first step (pictured above), we have connections properly established, more error checking, and text-mode communications working (with the qwen3-4b thinking model on the backend, though of course you can configure LocalAI to your own needs).

This will allow for much greater scalability and versatility, since LocalAI can run on a client PC or on network infrastructure as needed.

Stay tuned!

LLMAI Plugin v2.0 Released — Local AI and MetaHuman Lip-Sync Support

We are pleased to announce the release of LLMAI v2.0, a major update to our Unreal Engine AI integration plugin. This release introduces two significant capabilities: full local AI operation and MetaHuman lip-sync with function calling.

Run AI Completely Locally

One of the most requested features has been the ability to run AI processing without relying on cloud services. LLMAI v2.0 delivers this through our updated LocalAI integration, which provides a Realtime API-compatible interface running entirely on your local hardware.

This means:

  • No API keys required for local operation

  • Complete data privacy — your conversations never leave your machine

  • Offline capability — develop and demo without internet connectivity

  • Reduced latency — no round-trip to cloud servers

The LocalAI distribution includes pre-configured models for speech-to-text, language processing, and text-to-speech, all optimized for real-time voice interaction.
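For anyone wondering what "Realtime API-compatible" buys you: the same WebSocket protocol the plugin speaks to OpenAI can be pointed at a local server instead. As a rough illustration (not the plugin's internals; the endpoint URL and session payload here are assumptions based on the OpenAI Realtime API shape), a connection using Unreal's built-in WebSockets module looks roughly like this:

```cpp
// Illustrative only: talking to a local Realtime-compatible endpoint with
// Unreal's WebSockets module. The URL and JSON payload are assumptions based
// on the OpenAI Realtime API shape, not the plugin's internal code.
#include "Modules/ModuleManager.h"
#include "WebSocketsModule.h"
#include "IWebSocket.h"

TSharedPtr<IWebSocket> ConnectToLocalRealtime()
{
    // Make sure the WebSockets module is loaded before using it.
    FModuleManager::Get().LoadModuleChecked("WebSockets");

    const FString Url = TEXT("ws://localhost:8080/v1/realtime"); // assumed local endpoint

    TSharedPtr<IWebSocket> Socket = FWebSocketsModule::Get().CreateWebSocket(Url);
    TWeakPtr<IWebSocket> WeakSocket = Socket;

    Socket->OnConnected().AddLambda([WeakSocket]()
    {
        // Configure the session for text-mode communication.
        if (TSharedPtr<IWebSocket> S = WeakSocket.Pin())
        {
            S->Send(TEXT(R"({"type":"session.update","session":{"modalities":["text"]}})"));
        }
    });

    Socket->OnMessage().AddLambda([](const FString& Message)
    {
        UE_LOG(LogTemp, Log, TEXT("Realtime event: %s"), *Message);
    });

    Socket->Connect();
    return Socket; // hold on to this (e.g. as a class member) so the socket stays alive
}
```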

MetaHuman Lip-Sync and Character Control

LLMAI v2.0 introduces LiveLink voice support, enabling real-time lip-sync for MetaHuman characters driven by AI-generated speech. Combined with function calling, the AI can control character animations, gestures, and movement through natural conversation.

Watch the MetaHuman Lip-Sync Demo:

The included LLMAI_LiveLink demo project showcases 16 character gestures, waypoint navigation, and integrated lip-sync — all controlled through AI function calling.
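Under the hood, the gestures follow the same tool-call dispatch idea as the rest of the plugin: the model emits a named call such as a gesture request, and game code maps it onto an animation. A minimal sketch with hypothetical names (the demo's actual setup may differ):

```cpp
// Hypothetical handler wiring an AI "play_gesture" tool call to a montage.
// AMyMetaHuman (an ACharacter subclass), HandleGestureCall, and GestureMontages
// are illustrative names, not the demo's actual code.
#include "Dom/JsonObject.h"
#include "Animation/AnimInstance.h"
#include "Animation/AnimMontage.h"

void AMyMetaHuman::HandleGestureCall(const TSharedPtr<FJsonObject>& Args)
{
    const FString GestureName = Args->GetStringField(TEXT("gesture")); // e.g. "wave"

    // GestureMontages: a UPROPERTY TMap<FString, UAnimMontage*> filled in the editor.
    if (UAnimMontage** Montage = GestureMontages.Find(GestureName))
    {
        if (UAnimInstance* Anim = GetMesh()->GetAnimInstance())
        {
            Anim->Montage_Play(*Montage);
        }
    }
}
```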

Multi-Provider Architecture

The plugin now supports multiple AI providers through a unified interface. Switch between OpenAI (cloud) and LocalAI (local) with a simple configuration change, or build applications that dynamically select providers based on availability or user preference.
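As a minimal sketch of what that dynamic selection could look like in game code (the enum, function names, and endpoints here are assumptions for illustration, not the plugin's actual settings API):

```cpp
// Illustrative provider selection; names and endpoints are assumptions,
// not the plugin's actual configuration API.
enum class EAIProvider : uint8
{
    OpenAI,   // cloud Realtime API
    LocalAI   // local Realtime-compatible server
};

FString GetRealtimeEndpoint(EAIProvider Provider)
{
    switch (Provider)
    {
    case EAIProvider::OpenAI:
        return TEXT("wss://api.openai.com/v1/realtime"); // requires an API key
    case EAIProvider::LocalAI:
    default:
        return TEXT("ws://localhost:8080/v1/realtime");  // assumed local endpoint, no key
    }
}

// Example policy: fall back to the local provider when no API key is configured.
EAIProvider ChooseProvider(const FString& ApiKey)
{
    return ApiKey.IsEmpty() ? EAIProvider::LocalAI : EAIProvider::OpenAI;
}
```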

Documentation

Complete documentation, including setup guides, API reference, and demo walkthroughs, is available at:


Changelog — Version 2.0 (January 2, 2026)

  • Added LiveLink voice support to enable lip-sync and function calling for MetaHumans, with a demo

  • Added support for the LocalAI system via a Realtime API-compatible interface, and updated the demo with provider selection

  • Added a mic gating function that allows using voice output without feedback into voice input

  • Added mic mute control in demo projects

  • Updated UI to support multiple providers (LocalAI supported)

  • Added the release version of the OpenAI Realtime API 5.0

  • Added thinking/reasoning LLM support, with a chat panel in the demo projects

  • Note: Removed support for Unreal Engine 5.5 due to major engine subsystem differences


LLMAI v2.0 requires Unreal Engine 5.6+ on Windows 64-bit. LocalAI operation in its current configuration requires an NVIDIA GPU with 20 GB+ of VRAM.