Hello, everyone.
My name is Antonio and I am a researcher working on Human-Machine Interaction and Dialogue Systems. I would like to present the open-source plugin I am publishing based on the work I have been doing with UE4 over the last few years at the University of Padua and at the University of Naples Federico II (Italy). The plugin is named Framework for Advanced Natural Tools and Applications with Social Interactive Agents (FANTASIA).
FANTASIA is a re-engineered version of a set of tools I have been developing for research purposes. It aims to make it easy to integrate powerful AI services and libraries for interaction management and knowledge representation into UE4. These are exposed to Blueprints so that people can concentrate on developing interaction models and spend as little time as possible on technical issues. The first version of the plugin will include the following features (a minimal sketch of one of the underlying service calls follows the list):
Access to the Automatic Speech Recognition service provided by Microsoft Azure
Access to the Text-to-Speech service provided by Microsoft Azure
Access to Natural Language applications developed in Azure LUIS
Access to the Text-to-Speech service provided by Amazon Polly (with lipsync data support)
Access to instances of the Neo4j graph database
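To give an idea of what these integrations wrap, here is a minimal, plugin-independent sketch of a one-shot recognition call with the Azure Speech SDK for C++; the subscription key and region are placeholders, and FANTASIA exposes the equivalent functionality through Blueprint nodes rather than raw SDK calls.

```cpp
// Minimal sketch, independent of FANTASIA: one-shot speech recognition with the
// Azure Speech SDK for C++. Key and region below are placeholders.
#include <iostream>
#include <speechapi_cxx.h>

using namespace Microsoft::CognitiveServices::Speech;

int main()
{
    // Configure the service with your own subscription key and region.
    auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "westeurope");
    auto recognizer = SpeechRecognizer::FromConfig(config);

    std::cout << "Say something..." << std::endl;
    auto result = recognizer->RecognizeOnceAsync().get();

    if (result->Reason == ResultReason::RecognizedSpeech)
    {
        std::cout << "Recognized: " << result->Text << std::endl;
    }
    else
    {
        std::cout << "No speech could be recognized." << std::endl;
    }
    return 0;
}
```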
I have a lot of prototype code to restructure and include in future versions of the plugin, which will provide machine learning capabilities (both local and cloud-based), probabilistic decision systems and semantic 3D capabilities, among other things. In general, I hope this contribution will complement the efforts Epic is making towards the creation of Metahumans by providing the infrastructure needed to make them behave in a believable way, in addition to looking great, as they already do.
The plan is to publish the first public beta of the plugin on March 8th 2021. I am currently recording a small set of video tutorials showing how to use this first set of tools to close the loop in the creation of a very simple humanoid conversational agent. Meanwhile, I am setting up some social channels to keep people informed about the development process. If you like the idea, you can visit the FANTASIA website or check out the YouTube channel, which currently contains a promo video of the plugin.
I hope this contribution will help other people in academia leverage the tools provided by Epic to make UE4 a powerful environment for the development of Embodied Conversational Agents. I also hope the provided interfaces will help the general community work more easily with powerful tools that really have the potential to give rise to exciting new applications!
Hello, everyone.
After some final polishing and setting up of the social channels, the first public beta version of FANTASIA is online. You can download it from the GitHub repository here
A presentation video showing the installation process is on the YouTube channel
By the end of the week I will publish another video showing how to integrate a graph database (Neo4j) in Unreal using the Blueprint nodes provided by FANTASIA. I will also work on the documentation to provide more insight into the plugin. For some preliminary information, you can check the “Features” section on the dedicated website
Hello, everyone.
I just posted a new video tutorial showing how to access graph-based knowledge stored in Neo4j from UE4 using FANTASIA. In the next videos, I will show how to use Automatic Speech Recognition and Natural Language Processing from Microsoft Azure to guide data extraction. Here’s the link
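For readers curious about what such access looks like outside Blueprints, here is a rough sketch of sending a Cypher statement to Neo4j's HTTP transaction endpoint from Unreal C++. The endpoint path assumes Neo4j 4.x or later, the credentials are placeholders, and FANTASIA's own nodes take care of this plumbing for you.

```cpp
// Rough sketch (not FANTASIA's API): posting a Cypher query to Neo4j's HTTP
// transaction endpoint from Unreal C++. Requires the "HTTP" module in Build.cs.
// The endpoint path assumes Neo4j 4.x+; the Authorization header encodes neo4j:password.
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"

void RunCypherQuery()
{
    TSharedRef<IHttpRequest, ESPMode::ThreadSafe> Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(TEXT("http://localhost:7474/db/neo4j/tx/commit"));
    Request->SetVerb(TEXT("POST"));
    Request->SetHeader(TEXT("Content-Type"), TEXT("application/json"));
    Request->SetHeader(TEXT("Authorization"), TEXT("Basic bmVvNGo6cGFzc3dvcmQ="));
    Request->SetContentAsString(
        TEXT("{\"statements\":[{\"statement\":\"MATCH (m:Movie) RETURN m.title LIMIT 5\"}]}"));

    Request->OnProcessRequestComplete().BindLambda(
        [](FHttpRequestPtr Req, FHttpResponsePtr Resp, bool bConnectedSuccessfully)
        {
            if (bConnectedSuccessfully && Resp.IsValid())
            {
                // The response is a JSON document containing a "results" array.
                UE_LOG(LogTemp, Log, TEXT("Neo4j answered: %s"), *Resp->GetContentAsString());
            }
        });
    Request->ProcessRequest();
}
```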
Hello, everyone!
A new video tutorial is online on the YouTube channel. This one shows how to combine the Microsoft Azure Automatic Speech Recognition service with parameterized queries in Neo4j to flexibly extract data from within UE4. This is another step in the series of tutorials presenting how to build a simple Embodied Conversational Agent in UE4 using the FANTASIA plugin. Check it out and have fun!
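As a rough illustration of the "parameterized query" part (not FANTASIA's actual node interface), the request body for the Neo4j transaction endpoint can be built like this, with the recognized utterance bound as a parameter instead of being concatenated into the Cypher string:

```cpp
// Illustrative only: building the JSON body for a parameterized Cypher query,
// where a movie title recognized by Azure ASR is passed as the $title parameter.
// Requires the "Json" module in Build.cs; FANTASIA's Blueprint nodes accept
// parameters directly, so this is only what happens conceptually under the hood.
#include "Dom/JsonObject.h"
#include "Dom/JsonValue.h"
#include "Serialization/JsonSerializer.h"
#include "Serialization/JsonWriter.h"

FString BuildMovieQueryBody(const FString& RecognizedTitle)
{
    // The recognized title is bound to $title, never spliced into the query text.
    TSharedPtr<FJsonObject> Params = MakeShared<FJsonObject>();
    Params->SetStringField(TEXT("title"), RecognizedTitle);

    TSharedPtr<FJsonObject> Statement = MakeShared<FJsonObject>();
    Statement->SetStringField(TEXT("statement"),
        TEXT("MATCH (m:Movie {title: $title})<-[:ACTED_IN]-(a:Person) RETURN a.name"));
    Statement->SetObjectField(TEXT("parameters"), Params);

    TArray<TSharedPtr<FJsonValue>> Statements;
    Statements.Add(MakeShared<FJsonValueObject>(Statement));

    TSharedPtr<FJsonObject> Body = MakeShared<FJsonObject>();
    Body->SetArrayField(TEXT("statements"), Statements);

    FString Out;
    TSharedRef<TJsonWriter<>> Writer = TJsonWriterFactory<>::Create(&Out);
    FJsonSerializer::Serialize(Body.ToSharedRef(), Writer);
    return Out; // Post this with the same HTTP request pattern shown in the earlier sketch
}
```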
Hello, everyone!
A new video is online on the YouTube channel. This tutorial shows how to create a basic application that streams audio from UE4 to the Microsoft Language Understanding Intelligent Service (LUIS). This way, you can use AI to analyse speech transcriptions and extract both user intentions and named entities (like movies and actors). The video also shows how to query Neo4j using the LUIS output, so that you can easily query your data through Natural Language Understanding.
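For reference, a LUIS prediction response is plain JSON; a hedged sketch of pulling the top intent and the entity block out of it in Unreal C++ (field names follow the LUIS v3 prediction API, the intent and entity names are only examples) might look like this:

```cpp
// Hedged sketch: extracting the top intent from a LUIS v3 prediction response.
// The JSON layout ("prediction", "topIntent", "entities") follows the LUIS v3 API.
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

void ParseLuisResponse(const FString& ResponseBody)
{
    TSharedPtr<FJsonObject> Root;
    TSharedRef<TJsonReader<>> Reader = TJsonReaderFactory<>::Create(ResponseBody);
    if (!FJsonSerializer::Deserialize(Reader, Root) || !Root.IsValid())
    {
        return;
    }

    const TSharedPtr<FJsonObject>* Prediction;
    if (Root->TryGetObjectField(TEXT("prediction"), Prediction))
    {
        // e.g. "SearchMovie" for "which actors played in The Matrix?"
        const FString TopIntent = (*Prediction)->GetStringField(TEXT("topIntent"));
        UE_LOG(LogTemp, Log, TEXT("Top intent: %s"), *TopIntent);

        // Entities arrive grouped by type; a movie title found here can feed a
        // parameterized Neo4j query like the one sketched in the previous post.
        const TSharedPtr<FJsonObject>* Entities;
        if ((*Prediction)->TryGetObjectField(TEXT("entities"), Entities))
        {
            UE_LOG(LogTemp, Log, TEXT("Entity groups: %d"), (*Entities)->Values.Num());
        }
    }
}
```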
Hello, everyone!
The final video of the Basic tutorials series is online! In this video, I show how spoken commands interpreted by Microsoft LUIS generate a query to Neo4j and how a synthetic voice is obtained with Amazon Polly. Audio and lipsync data are then used to animate a Genesis 8 3D character in the Unreal Engine!
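For the lipsync part, Polly can return "speech marks" alongside the audio: with the json output format and the viseme mark type, it emits newline-delimited JSON records, each carrying a timestamp in milliseconds and a viseme code. A minimal sketch of turning that payload into events you could map to morph targets (struct and function names are illustrative, not FANTASIA's API):

```cpp
// Minimal sketch: Polly speech marks with SpeechMarkTypes ["viseme"] arrive as
// newline-delimited JSON records such as {"time":125,"type":"viseme","value":"p"}.
// Each record is converted into a timed viseme event for facial animation.
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

struct FVisemeEvent
{
    float TimeSeconds;
    FString Viseme;
};

TArray<FVisemeEvent> ParseVisemeMarks(const FString& SpeechMarksPayload)
{
    TArray<FVisemeEvent> Events;
    TArray<FString> Lines;
    SpeechMarksPayload.ParseIntoArrayLines(Lines);

    for (const FString& Line : Lines)
    {
        TSharedPtr<FJsonObject> Mark;
        TSharedRef<TJsonReader<>> Reader = TJsonReaderFactory<>::Create(Line);
        if (FJsonSerializer::Deserialize(Reader, Mark) && Mark.IsValid()
            && Mark->GetStringField(TEXT("type")) == TEXT("viseme"))
        {
            FVisemeEvent Event;
            Event.TimeSeconds = Mark->GetNumberField(TEXT("time")) / 1000.0f; // ms to s
            Event.Viseme = Mark->GetStringField(TEXT("value"));
            Events.Add(Event);
        }
    }
    return Events;
}
```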
From now on, I will be publishing advanced tutorials to cover functionalities and aspects that were not included in this first example. Also, I will be working on linking FANTASIA with Metahumans!
Hello, everyone!
Yesterday, FANTASIA was presented at the exciting NODES2021 event! We covered the movie information example from the tutorial series and a more advanced case study on conflict detection in command sequences. Stay tuned to learn more about how Neo4j and the Unreal Engine can work together!
Moreover, in his opening keynote, Neo4j’s CEO talked about Blueprints for success with graphs. Indeed, Blueprints are based on Event Graphs in UE4! See how the two naturally mix together?
Hello, everyone!
The video recording from the NODES 2021 conference is online. You can watch the talk, covering both the tutorial example and a more advanced example, at the session link
You will find the talk marked at 2:26:35 (Graph databases for Embodied Conversational Agents in the Unreal Engine 4)
FANTASIA is being used at the Federico II University of Naples as part of a research line focusing on the use of Virtual Humans for Cultural Heritage presentations. Taking professional presenters as a reference, we use neural speech synthesis together with pose and facial expression estimation to model an artificial presenter. FANTASIA will manage the Metahuman designed for this purpose using the already available modules and future ones under development, which will also be released openly. Check this out on Twitter at the following link.
In the last year, I have been testing and presenting FANTASIA in a number of different cases. It is now time to take a further step forward and present the latest module I integrated: the aGrUM library for probabilistic graphical models. This is an important feature to add to FANTASIA, as conversational agents mostly handle noisy input, so a probabilistic framework for making decisions is fundamental. In the first video of the new tutorial series, I present the basics of Bayesian Networks; in the next videos, I will show how these can be integrated in Unreal to develop AIs with probabilistic reasoning.
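To make the idea concrete, here is a tiny Bayesian Network written directly against aGrUM's C++ API: a noisy-channel style model where background noise influences whether the ASR transcription is correct, queried with lazy propagation. The variables and probabilities are purely illustrative, and header paths can differ between aGrUM versions; FANTASIA wraps this kind of model in Blueprint-accessible assets.

```cpp
// Illustrative only: a two-node Bayesian Network built with aGrUM's C++ API.
// Header locations may vary with the aGrUM version you link against.
#include <agrum/BN/BayesNet.h>
#include <agrum/BN/inference/lazyPropagation.h>
#include <agrum/tools/variables/labelizedVariable.h>
#include <iostream>

int main()
{
    gum::BayesNet<double> bn("asr_confidence");

    // Two binary variables: is there background noise, and is the ASR output correct?
    auto noise   = bn.add(gum::LabelizedVariable("Noise", "background noise", 2));
    auto correct = bn.add(gum::LabelizedVariable("ASRCorrect", "transcription is correct", 2));
    bn.addArc(noise, correct);

    // Prior over noise and conditional probability of a correct transcription.
    bn.cpt(noise).fillWith({0.7, 0.3});
    bn.cpt(correct).fillWith({0.05, 0.95,   // Noise = no
                              0.40, 0.60}); // Noise = yes

    // Posterior over ASRCorrect once we observe that the scene is noisy.
    gum::LazyPropagation<double> ie(&bn);
    ie.addEvidence(noise, 1);
    ie.makeInference();
    std::cout << ie.posterior(correct) << std::endl;
    return 0;
}
```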
Hello everyone!
The new video tutorial covering how to design and import Bayesian Networks in Unreal is out! In this video, I will show how to create a Bayesian Network in pyAgrum, how to import it as an Unreal asset and how to use it to make a Metahuman choose which questions to ask before taking a decision.
Hello, everyone!
The new Advanced Video Tutorial is online. This time, it shows how to integrate Neo4j and Bayesian Network nodes from FANTASIA with the AI system of the Unreal Engine, specifically with Behaviour Trees.
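To show where custom logic plugs into Behaviour Trees, here is a hedged sketch of a custom task node in Unreal C++. A node like this could wrap a FANTASIA call (say, a Neo4j query or a Bayesian Network update); the class name is hypothetical and the body only logs and succeeds, since the plugin's own node names are not reproduced here.

```cpp
// Hypothetical example: a custom Behaviour Tree task node in Unreal C++.
// Requires the "AIModule" dependency in Build.cs; file assumed to be MyQueryKnowledgeTask.h.
#include "BehaviorTree/BTTaskNode.h"
#include "MyQueryKnowledgeTask.generated.h"

UCLASS()
class UMyQueryKnowledgeTask : public UBTTaskNode
{
    GENERATED_BODY()

public:
    UMyQueryKnowledgeTask()
    {
        NodeName = TEXT("Query Knowledge Base");
    }

    virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) override
    {
        // A real task would trigger the knowledge query here and typically write
        // the result to the Blackboard via OwnerComp.GetBlackboardComponent().
        UE_LOG(LogTemp, Log, TEXT("Asking the knowledge base..."));
        return EBTNodeResult::Succeeded;
    }
};
```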
Hello, everyone!
Today, we start a video series dedicated to interviews with researchers who have been using FANTASIA to develop Embodied Conversational Agents. In this video, Dr. Maria Di Maro describes her PhD work on Clarification Requests using FANTASIA.
Hello, everyone.
These have been some busy months for FANTASIA. Here are some updates:
A FANTASIA demo stand was present at the ACM Multimedia conference in Lisbon: a great venue to showcase the framework to researchers involved in managing multimedia content
FANTASIA was used by students of the 5G Academy at the University of Naples Federico II to develop a proof-of-concept holographic assistant demonstrating the potential of 5G technology to support interactive applications with Pixel Streaming
FANTASIA is being used by the VISIT3D project by Logogramma (VISIT3D - Logogramma)!
The UE5 version of FANTASIA is available on GitHub. Check it out!
Such interesting work! I am a software engineer and I have recently been playing around with digital human applications. I found FANTASIA on the internet and tried it out without much prior experience with Unreal Engine.
However, even after following the installation guide on GitHub and YouTube, I am still unable to add the plugin to my project. I suspect this is because I am using a Mac instead of Windows, so I may not be able to create a .sln for my project. Is there anything I could fix, or something I have missed, so that I can set it up successfully?
Hello, Richard.
Sorry for replying this late; the last months have been very busy at the University. I am afraid that FANTASIA only works under Windows because of both the Azure and AWS integrations. You may be able to make other parts of the plugin work (Neo4j and Bayesian Networks) by extracting the code and compiling it yourself, but I have never tried that.
Hello, everyone!
We have just published an Unreal Engine 5.3 template based on FANTASIA to speed up the development of Metahuman-based Conversational Agents using the theoretical framework we are developing as part of our research. You can find out more about the general theory in this video
The template can be found on Github at the following link
The first video presenting the FANTASIA Template is online! This video will show you around the Template and explain how it is organised. The presented content is linked to the concepts explained in the previous video about making Hybrid Conversational AI with FANTASIA.
After a very busy period, it is time to get back to work on improving FANTASIA. The new version, built for Unreal Engine 5.4, is available on the GitHub page! New functionalities are lined up for release over the next weeks. The plan is to provide new tools to improve the answering speed of Embodied Conversational Agents and to support more on-premise technology, reducing dependency on third-party services.
We are also looking forward to testing the new animation tools provided by Unreal 5.4 to keep building the new animation system integrated with our approach to hybrid Conversational AI!