[Plugin] AIChatPlus - AI Chat Integration (OpenAI, Azure, Claude, Gemini, Ollama ...)

[AI Chat Plus] is an Unreal Engine plugin that enables communication with various AI chat services.

Currently supported services include OpenAI (ChatGPT, DALL-E), Azure OpenAI (ChatGPT, DALL-E), Claude, Google Gemini, and Ollama. The plugin is designed to make it easy for UE developers to integrate these AI chat services.

The implementation is based on asynchronous REST requests, ensuring high performance and convenience. More service providers will be supported in the future.
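For reference, the request model can be pictured as a plain asynchronous HTTP call. The sketch below uses Unreal's standard HTTP module against the public OpenAI chat completions endpoint; it is illustrative only and not the plugin's own API (the API key and model name are placeholders).

```cpp
// Illustrative only: a plain asynchronous REST call to the public OpenAI chat
// completions endpoint using Unreal's HTTP module. AIChatPlus wraps this kind
// of pattern behind its own request types; "YOUR_API_KEY" and the model name
// are placeholders.
#include "CoreMinimal.h"
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"

void SendChatCompletionExample()
{
    TSharedRef<IHttpRequest, ESPMode::ThreadSafe> Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(TEXT("https://api.openai.com/v1/chat/completions"));
    Request->SetVerb(TEXT("POST"));
    Request->SetHeader(TEXT("Content-Type"), TEXT("application/json"));
    Request->SetHeader(TEXT("Authorization"), TEXT("Bearer YOUR_API_KEY"));
    Request->SetContentAsString(TEXT(R"({"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello"}]})"));

    // The completion delegate fires later (by default on the game thread),
    // so the calling code is never blocked while the request is in flight.
    Request->OnProcessRequestComplete().BindLambda(
        [](FHttpRequestPtr Req, FHttpResponsePtr Resp, bool bSucceeded)
        {
            if (bSucceeded && Resp.IsValid())
            {
                UE_LOG(LogTemp, Log, TEXT("Chat response: %s"), *Resp->GetContentAsString());
            }
        });
    Request->ProcessRequest();
}
```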

Latest Version: v1.2.0

Coming Soon:

  • Support llama.cpp

Blueprint

Editor Chat Tool:

Links

AIChatPlus v1.3.0

Change Log

  • Support offline llama.cpp: integrates the llama.cpp library, so AI models can run fully offline!
  • llama.cpp support covers Win64, Mac, Android, and iOS.

AIChatPlus v1.3.1

  • Added a SystemTemplateViewer, which allows viewing and using hundreds of system prompt templates
  • Fixed an issue where plugins downloaded from the marketplace couldn’t find the link library for llama.cpp
  • Fixed the llama.cpp path being too long
  • Fixed incorrect linking of llama.dll after packaging on Windows
  • Fixed file path reading issues on iOS/Android
  • Fixed an incorrect name setting in Cllama

AIChatPlus v1.3.2

  • Fixed a crash when manually stopping a request in Cllama.
  • Fixed an issue where ggml.dll and llama.dll could not be found when packaging a Windows build with the plugin downloaded from the marketplace.
  • When creating a request, it now checks whether the call is on the GameThread (see the sketch below).
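A minimal sketch of that kind of thread guard, using standard Unreal utilities (illustrative only; CreateChatRequest is a placeholder for whatever actually builds the request):

```cpp
// Illustrative guard: make sure request creation happens on the game thread,
// re-dispatching with AsyncTask if it was called from another thread.
// Standard UE API; CreateChatRequest is a placeholder callable.
#include "CoreMinimal.h"
#include "Async/Async.h"

void CreateRequestOnGameThread(TUniqueFunction<void()> CreateChatRequest)
{
    if (IsInGameThread())
    {
        // Already on the game thread: safe to create the request directly.
        CreateChatRequest();
    }
    else
    {
        // Marshal the work onto the game thread instead of creating it here.
        AsyncTask(ENamedThreads::GameThread, MoveTemp(CreateChatRequest));
    }
}
```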

AIChatPlus v1.3.3

  • Support UE 5.5

AIChatPlus v1.3.4

New Feature

  • Support OpenAI vision (image input)

Bug Fix

  • Fix an error when calling OpenAI with stream=false

AIChatPlus v1.4.0

New Features

  • Cllama (llama.cpp) now supports multimodal models, which means it supports vision (image input).
  • Added tooltips for all the Blueprint types.

AIChatPlus v1.5.0

New Feature

  • Support sending audio to Gemini (see the request-body sketch below)
  • Editor tools support sending audio and recordings
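For context, Gemini accepts audio as base64-encoded inline data in a generateContent request body. The sketch below builds such a body with Unreal's JSON and Base64 utilities; the field names follow Google's public REST API, while the function itself is illustrative and not taken from the plugin (the MIME type and prompt are placeholders).

```cpp
// Illustrative only: build a Gemini generateContent body that carries audio as
// base64-encoded inline_data, using Unreal's JSON and Base64 utilities.
#include "CoreMinimal.h"
#include "Dom/JsonObject.h"
#include "Dom/JsonValue.h"
#include "Serialization/JsonWriter.h"
#include "Serialization/JsonSerializer.h"
#include "Misc/Base64.h"

FString BuildGeminiAudioBody(const TArray<uint8>& WavBytes, const FString& Prompt)
{
    // Text part of the message.
    TSharedPtr<FJsonObject> TextPart = MakeShared<FJsonObject>();
    TextPart->SetStringField(TEXT("text"), Prompt);

    // Audio part: the raw bytes are base64-encoded into "inline_data".
    TSharedPtr<FJsonObject> InlineData = MakeShared<FJsonObject>();
    InlineData->SetStringField(TEXT("mime_type"), TEXT("audio/wav"));
    InlineData->SetStringField(TEXT("data"), FBase64::Encode(WavBytes));
    TSharedPtr<FJsonObject> AudioPart = MakeShared<FJsonObject>();
    AudioPart->SetObjectField(TEXT("inline_data"), InlineData);

    // contents: [ { parts: [ text, audio ] } ]
    TSharedPtr<FJsonObject> Content = MakeShared<FJsonObject>();
    TArray<TSharedPtr<FJsonValue>> Parts;
    Parts.Add(MakeShared<FJsonValueObject>(TextPart));
    Parts.Add(MakeShared<FJsonValueObject>(AudioPart));
    Content->SetArrayField(TEXT("parts"), Parts);

    TSharedPtr<FJsonObject> Body = MakeShared<FJsonObject>();
    TArray<TSharedPtr<FJsonValue>> Contents;
    Contents.Add(MakeShared<FJsonValueObject>(Content));
    Body->SetArrayField(TEXT("contents"), Contents);

    FString Out;
    TSharedRef<TJsonWriter<>> Writer = TJsonWriterFactory<>::Create(&Out);
    FJsonSerializer::Serialize(Body.ToSharedRef(), Writer);
    return Out;
}
```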

Bug Fix

  • Fixed a bug where copying a Session failed

AIChatPlus v1.5.1

New Feature

  • Audio attachments are now only allowed when the target service is Gemini
  • Optimized how PCM data is obtained: audio data is now decompressed when the Base64 payload is generated
  • Added two request callbacks: OnMessageFinished and OnImagesFinished
  • The Gemini request method is now determined automatically based on bStream
  • Added Blueprint functions for converting Wrapper objects to their concrete types and for retrieving the response Message and Error

Bug Fix

  • Fixed an issue where Request Finish was triggered multiple times

AIChatPlus v1.6.0

New Feature

  • Upgraded llama.cpp to version b4604.
  • Cllama supports GPU backends: CUDA and Metal.
  • The Cllama chat tool supports GPU usage.
  • Support reading model files packaged in a .pak file (see the sketch below).
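One common way to make a packaged model usable by a native loader such as llama.cpp is to read it through Unreal's file layer (which resolves paths inside mounted .pak files) and, if necessary, stage it to a writable on-disk location. This is a generic sketch of that idea, not the plugin's implementation; the paths and file names are placeholders, and very large models would need a streaming copy rather than a single in-memory load.

```cpp
// Illustrative only: load a model file that ships inside a mounted .pak via
// Unreal's virtual file system, then write it to a real path that a native
// library (e.g. llama.cpp) can open with normal file I/O.
#include "CoreMinimal.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"

FString StagePackagedModelExample()
{
    // Path inside packaged content; resolved through the pak file system at runtime.
    const FString PackagedPath = FPaths::ProjectContentDir() / TEXT("Models/example-model.gguf");

    TArray<uint8> ModelBytes;
    if (!FFileHelper::LoadFileToArray(ModelBytes, *PackagedPath))
    {
        return FString(); // Model not found in the packaged content.
    }

    // Copy to a writable location outside the pak so the native loader can read it directly.
    const FString StagedPath = FPaths::ProjectSavedDir() / TEXT("Models/example-model.gguf");
    FFileHelper::SaveArrayToFile(ModelBytes, *StagedPath);
    return StagedPath;
}
```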

Bug Fix

  • Fixed an issue where Cllama crashed when reloaded during inference.
  • Fixed iOS compilation errors.

AIChatPlus v1.6.2

New Feature

  • Cllama now has a KeepContext parameter (default false); when false, the context is automatically destroyed after the chat ends.
  • Cllama now has a KeepAlive parameter, which avoids repeatedly re-reading the model from disk (see the note below).
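Purely as a hypothetical illustration of what these two options trade off (the changelog does not show the actual Cllama request API, so the struct and field names below are invented for the example):

```cpp
// Hypothetical illustration only: this struct is invented for the example and
// is not the plugin's real API; it just documents the trade-off described by
// the two changelog entries above.
struct FCllamaOptionsSketch
{
    // false (the default, per the changelog): the llama.cpp context is
    // destroyed automatically after the chat ends, freeing memory between
    // chats; true keeps it so follow-up messages can reuse the context.
    bool bKeepContext = false;

    // true: keep the loaded model in memory between requests so the model
    // file is not re-read from disk every time (the changelog does not state
    // the default for this option).
    bool bKeepAlive = true;
};
```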

Hey there, I bought AIChatPlus, and after looking through the documentation I see it has a lot of potential. However, I want to ask: does it, or will it, have the capability of assisting with Blueprints?

Yes, of course. I have been working on it, but it will take more time to finish.