
Training Stream - Advanced AI - May 12th, 2015


    [LIVESTREAM] Training Stream - Advanced AI - May 12th, 2015

    Training Content Creator Ian Shadden is joined by Lead AI Programmer Mieszko Zielinski as they build on the Basics of AI stream and delve into more advanced AI! They will show you how to create complex AI behavior using Behavior Trees and the Environment Query System (EQS).

    Tuesday, May 12th @ 2:00PM-3:00PM ET - [Countdown]


    Ian Shadden - Training Content Creator
    Mieszko Zielinski - Lead AI Programmer

    Feel free to ask any questions on the topic in the thread below, and remember, while we try to give attention to all inquiries, it's not always possible to answer everyone's questions as they come up. This is especially true for off-topic requests, as it's rather likely that we don't have the appropriate person around to answer. Thanks for understanding!

    Edit: The YouTube archive is now available here

    Last edited by Ivey; 05-13-2015, 05:23 PM.
    Unreal Engine Documentation
    Bored? Follow me! @ffejnosliw

    I would be really interested to see some details about the new perception system, particularly how to extend it in C++ with more advanced senses.
    - Dev Blog - Twitter - Facebook - Google+


      I'd like to hear how Mieszko views the long-term AI roadmap for the engine: which features he intends to work on himself in the near term, which he expects the community to produce, and so on. Also, I'd like to hear about his experience with regard to cover systems and the like.


        Awesome, can't wait to hear more about the AI roadmap, so +1 to zoombapup's question.

        Also, if you could discuss the AI work done for the deer in the Kite demo and their crowd behavior, that would be stellar! =)
        W3 Studios


          Hey guys, I just want to remind everyone that this is a training stream so the main focus will be teaching you how to use the current tools. We may not have time for discussion about the roadmap, but we'll do our best.


            Hi Jeff, I think a lot of people watching the stream will be interested in the roadmap, especially if there are going to be significant changes or updates in the future that could affect how/when people implement their AI.

            Really looking forward to this one, if only I wasn't in a time zone that makes it so I'd have to get up at 4am to watch it...


              I totally understand, and we will try to answer all questions. We're just limited on time, and there is a lot to go over. It really just depends on how much time is left at the end for questions.

              Maybe we can get Mieszko on the regular stream to talk more about future plans before he heads back to Poland if we run out of time on the training stream.


                I look forward to a more advanced discussion covering EQS.

                If possible, it would be nice to cover spatial LOD for AI: basically, handling AI at scale, with persistence, over large distances. Maybe also some general thoughts on higher-level system organisation and delegation from manager classes.
                Anything like that would be awesome.


                  Will there be functionality for running an EQS query from a task, or as a decorator or something?
                  Storyteller - An immersive VR audiobook player

                  Dungeon Survival - WIP First person dungeon crawler with a focus on survival and environmental gameplay ala roguelikes


                    Originally posted by n00854180t View Post
                    Will there be functionality for running an EQS query from a task, or as a decorator or something?
                    An EQS query can be run as a Task, or it can be called from anywhere via the Blueprint node "Run EQS Query". Both are available in 4.8; in 4.7 only the Task can be used.
                    Last edited by Alexander Paschall; 05-11-2015, 11:02 AM. Reason: more info
                    Twitch /unrealalexander| Twitter @UnrealAlexander
                    How to report a bug? | Installation & Setup issues?
                    Call me to a thread by posting this: [MENTION]Alexander Paschall[/MENTION]


                      If anyone has any more questions about the AI systems (or how to use them) after the stream, I'll be around and can make some tutorial material to cover those areas. I figured it'd only be fair to do that AFTER the training stream so that everyone can get a grasp of the new tools. Maybe we can have a Twitch chat after the stream or something, I dunno. Point being, there's some cool new stuff in there that's very useful, and it'd be great if we could get people up to speed with it.

                      See you Tuesday!


                        Looking forward to this one!

                        Q1. In the stream demonstrating techniques used for the kite demo, there was a short explanation about a dynamic nav-mesh system that built tiles on the fly as needed. How far off is this from being integrated and ready to use?

                        Q2. Something I asked about on the last stream, but just wanted to ask Mieszko: any plans to implement a BT node/construct to allow choosing of tasks based on some weighted probability function? Currently it seems really hard to work around the priority-based structure when you want to introduce some randomness into the behaviour. At the moment I have a sequence which starts with a task used to make the randomized choice, followed by a selector which decorates each child with a test to see if that particular task was the one chosen. It's very ugly!


                          [QUESTION] How can we draw the debug visualization for AIPerceptionComponent? For example, I want to draw the LOS cone in PIE or simulation mode.
                          There is a Debug option under the Dominant Sense in AIPerceptionComponent. How do we use that?


                            [QUESTION] How do we set a bool to false when the AI has lost sight of the player? It always stays true once the player is detected.

                            My workaround: get the distance to the player, check it against the LoseSightRadius value, and then cast a ray to detect whether the player is visible. Not efficient, I think, but it does the job.
                            Last edited by Amaresh; 05-12-2015, 10:53 AM.



                              I know there is a state-machine-like system within Unreal Engine called Pawn Actions. It makes sense to me, but my question is: what do you think about using it to build a Task/Planner system? Have you attempted to expand on this, or can you give us an example of how you'd use it? Thanks!

                              If anyone is interested in EQS & AI Perception, I ran a stream on it a few weeks back. Of course, after this video and asking questions, I plan to create a new video with additional information. Hopefully they talk about how to properly view/debug the Perception tool.

                              Last edited by PeterLNewton; 05-12-2015, 01:49 PM.