Training Livestream - Getting Started with AI - Jan 31 - Live from Epic HQ


    #31
    Following up on the question about nav edges.

    When I use navmesh->GetPolyEdges(polys[v], ed); for each poly, it returns only the inside edges (edges that have a poly on both sides), not the outside edges of the navmesh.
    I only get the blue edges, not the pink ones (I had to find those myself).


    Am I doing something wrong?



      #32
      I guess this function is misnamed - it should be "GetPortalEdges" since it retrieves only the edges that are traversable. I'll make a note of it.
      Senior AI Programmer @ Epic Games by day, AI programmer at night!
      My slow-blog on random AI things (including UE4 AI).
      My no longer active UE4 AI blog.



        #33
        Is there a function to get all edges then? I already got what I wanted, but it's super messy. I was just wondering if there was an easier way to get all edges (or preferably just the outside ones).
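        For reference, a generic way to recover the outside edges yourself: count how many polys use each edge; any edge used by exactly one poly lies on the boundary. A standalone sketch (plain C++ with illustrative names, not the UE4 navmesh API):

        ```cpp
        #include <cstdint>
        #include <map>
        #include <utility>
        #include <vector>

        using Edge = std::pair<int32_t, int32_t>; // vertex index pair, smaller index first

        // Normalize so (a,b) and (b,a) count as the same edge.
        static Edge MakeEdge(int32_t A, int32_t B)
        {
            return A < B ? Edge{A, B} : Edge{B, A};
        }

        // Each poly is a closed loop of vertex indices.
        std::vector<Edge> FindBoundaryEdges(const std::vector<std::vector<int32_t>>& Polys)
        {
            std::map<Edge, int> UseCount;
            for (const auto& Poly : Polys)
            {
                for (size_t i = 0; i < Poly.size(); ++i)
                {
                    const size_t j = (i + 1) % Poly.size();
                    ++UseCount[MakeEdge(Poly[i], Poly[j])];
                }
            }

            std::vector<Edge> Boundary;
            for (const auto& [EdgeKey, Count] : UseCount)
            {
                if (Count == 1) // not shared with a neighbour poly -> outside edge
                {
                    Boundary.push_back(EdgeKey);
                }
            }
            return Boundary;
        }
        ```

        With the real navmesh you would feed in the vertex loops of each poly tile by tile; interior (portal) edges cancel out automatically.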



          #34
          When is this going to be available on YouTube?

          Are project files going to be released?

          Thanks



            #35
            Responses from Mieszko to more questions from the livestream:

            [QUESTION] How would you do simple sound propagation for hearing? Would you use some chokepoints/portals for finding the sound path to the AI? I used the navmesh for finding a path from the source to the AI (and fake sound propagation), but I don't know if it's a good way to do it. Thoughts?
            I’d definitely try to base such a solution on the navmesh, preferably a static one so that the information can be prebuilt with the level.
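            One way to sketch that navmesh-based approach: treat navmesh polys as graph nodes, run a shortest-path search from the sound source, and attenuate loudness by traversal distance rather than straight-line distance, so walls and detours muffle the sound. A standalone illustration (plain C++ with hypothetical names, not engine code):

            ```cpp
            #include <limits>
            #include <queue>
            #include <utility>
            #include <vector>

            // Weighted adjacency list: G[node] = list of (neighbour, traversal distance).
            using Graph = std::vector<std::vector<std::pair<int, float>>>;

            // Dijkstra: shortest traversal distance from the sound source to every node.
            std::vector<float> SoundPathDistances(const Graph& G, int Source)
            {
                std::vector<float> Dist(G.size(), std::numeric_limits<float>::infinity());
                using Item = std::pair<float, int>; // (distance so far, node)
                std::priority_queue<Item, std::vector<Item>, std::greater<Item>> Open;
                Dist[Source] = 0.0f;
                Open.push({0.0f, Source});
                while (!Open.empty())
                {
                    auto [D, N] = Open.top();
                    Open.pop();
                    if (D > Dist[N]) continue; // stale entry
                    for (auto [M, W] : G[N])
                    {
                        if (D + W < Dist[M])
                        {
                            Dist[M] = D + W;
                            Open.push({Dist[M], M});
                        }
                    }
                }
                return Dist;
            }

            // Fake propagation: linear falloff of loudness over path distance.
            float LoudnessAt(float SourceLoudness, float PathDist, float MaxRange)
            {
                return PathDist >= MaxRange ? 0.0f : SourceLoudness * (1.0f - PathDist / MaxRange);
            }
            ```

            An AI then "hears" the sound only if the loudness at its poly exceeds its hearing threshold; with a static navmesh the adjacency graph can be built once at load time.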

            [QUESTION] Are we going to see any features on the AI section of the roadmap anytime soon? <ducks>
            All of the AI roadmap is currently on hold due to other tasks assigned to the (dispersed) AI team.

            [QUESTION] Why not make a static navmesh that can be attached to a moving object?
            Our navmesh implementation simply doesn’t support it.

            [QUESTION] Any suggestions on how to implement jumping between platforms with a valid navmesh?
            The easiest approach is to manually mark up the navmesh with navlinks (for example using the NavLinkProxy), create a jump area for it, and have the PathFollowingComponent handle that. There’s a tutorial in the docs on how to do it.

            [QUESTION] Why do you need the tree? Can’t you just do it all in the AI BP?
            You can do most of this stuff in pure BP, but at some level of complexity it will become a maintenance nightmare.

            [QUESTION] What about AI in multiplayer?
            It works; check out the Paragon bots.

            [QUESTION] What do you think about multi inheritance in Blackboards? In very complex BTs with many dynamic Behaviours (very abstract/general) it will be very helpful.
            Multi-inheritance requires more caution, special-case handling, conflict resolution, etc., and practice shows you can usually do without it. Even UE4 uses single inheritance! (not counting interfaces)

            [QUESTION] When will there be a feature where a pawn gradually decreases its speed as it approaches its destination? (Right now it stops instantly at the destination.)
            It’s there, just set bUseAccelerationForPaths in your movement component to true. You might need to play with other params as well.
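            Conceptually, that flag makes the pawn obey a braking-distance limit on its desired speed. As a standalone illustration of the idea (plain C++, not engine code; the formula v = sqrt(2*a*d) is the standard kinematic braking limit, not necessarily UE4's exact implementation):

            ```cpp
            #include <algorithm>
            #include <cmath>

            // Desired speed when DistRemaining from the goal: never faster than what the
            // available Deceleration can bleed off before arrival (v^2 = 2*a*d).
            float DesiredSpeed(float MaxSpeed, float Deceleration, float DistRemaining)
            {
                const float BrakingLimit = std::sqrt(2.0f * Deceleration * DistRemaining);
                return std::min(MaxSpeed, BrakingLimit);
            }
            ```

            Far from the goal the braking limit exceeds MaxSpeed and the pawn cruises; near the goal the limit shrinks toward zero, giving a smooth stop instead of an instant one.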

            [QUESTION] Is it possible to restore the state of a behaviour tree? For example when serialising a save game?
            Nope.
            Twitch /unrealalexander| Twitter @UnrealAlexander
            How to report a bug? | Installation & Setup issues?
            Call me to a thread by posting this: [MENTION]Alexander Paschall[/MENTION]



              #36
              Just saw the stream and learned a lot from you guys, and had a lot of fun too. It was fun to see you getting mad at each other =P. Sometimes things would go off-script, but I liked that: there were some useful unscripted tips, and I didn't mind that you took more time, as it offers insight into how the experts do things and keep everything neat. Thank you guys.
              0xAFBF
              https://github.com/0xafbf



                #37
                Originally posted by BoteRock View Post
                Just saw the stream and learned a lot from you guys, and had a lot of fun too. It was fun to see you getting mad at each other =P. Sometimes things would go off-script, but I liked that: there were some useful unscripted tips, and I didn't mind that you took more time, as it offers insight into how the experts do things and keep everything neat. Thank you guys.
                Thanks for watching! Mieszko and I have a long history of giving each other a hard time for laughs. We have a lot of fun! He's always super informative.


                  #38
                  Could I nudge in a question real quick? How can I change my AIPerception at runtime, say the line-of-sight radius or angle? Is it possible in Blueprints? If so, how?



                    #39
                    Thank you so much, this really helped me in my project. I'm fairly new to Unreal.

                    How can I specify different animations according to different usable objects?



                      #40
                      What a great tutorial! Where can we get more on the debug features? Also, I think there is a gap in the tutorials: I need basic information on how to save game data. Say, can I save and reload the blackboard data for each individual AI? Perhaps one AI character could pass knowledge on to the player; say the AI could hide or find things as the game goes along. So it would make sense to have an inventory-like structure for objects/places/containers, with information that can be "compiled" and exchanged over time.

                      But in general you would want to save the full range of game-state information: say, where should all actors spawn at reload, and how do we select and initialize values at load time? I think the Game Framework tutorial (https://www.youtube.com/watch?v=0LG4hiisflg) touches upon this, but shouldn't there be some basic/general UE4 design to handle such challenges?
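                      As noted in the livestream answers above, behaviour-tree state itself can't be restored, but a common workaround is to snapshot blackboard values into your SaveGame and let the tree re-evaluate from them on load. A minimal standalone sketch of the idea (plain C++ stand-ins, not UE4 types; FakeBlackboard and BBValue are illustrative names):

                      ```cpp
                      #include <map>
                      #include <string>
                      #include <variant>

                      // Stand-in for blackboard values: a few common key types in one variant.
                      using BBValue = std::variant<bool, float, std::string>;
                      using BlackboardSnapshot = std::map<std::string, BBValue>;

                      struct FakeBlackboard
                      {
                          std::map<std::string, BBValue> Keys;
                      };

                      // On save: copy every key the AI should remember into the snapshot.
                      BlackboardSnapshot SaveBlackboard(const FakeBlackboard& BB)
                      {
                          return BB.Keys;
                      }

                      // On load: push the snapshot back; the behaviour tree then re-evaluates
                      // from these values instead of resuming mid-node.
                      void RestoreBlackboard(FakeBlackboard& BB, const BlackboardSnapshot& Snap)
                      {
                          for (const auto& [Key, Value] : Snap)
                          {
                              BB.Keys[Key] = Value;
                          }
                      }
                      ```

                      In UE4 terms the snapshot would live in your USaveGame subclass and be written per AI; designing the tree so any node can be reached purely from blackboard values is what makes this workaround viable.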



                        #41
                        Originally posted by joeptruijen View Post
                        Thank you so much, this really helped me in my project. I'm fairly new to Unreal.

                        How can I specify different animations according to different usable objects?
                        Good question; you can do this in a number of ways. In general, the to-be-used object would need to hold some values that the player/AI can use. That is, all objects (actors) can hold a move-to location, and possibly a list of possible use-actions.

                        Say you have a door, a chair, and a bed: all will hold a move-to location where you can play the open-door/unlock, sit-down, or lie-down animations (and their reverses). When you press 'E', your AI/player will need to get some parameters and from those play the correct animation. (All actors hold a move-to location, but if it is zero you would not want or need to play any move-to animation.)

                        Now add to this that the lock/unlock action could have a target location, so when your AI reaches out to the actor its hand will actually hit a specific point in space. While the move-to location holds the natural operating position, the operate-target point will be the lock position (doors differ, and having this lets you operate a range of other targets with the same animation, given that you can make an operate-target animation, probably a blend-space animation that takes the operate-target point as a parameter).

                        From this you can derive a range of targeted animations that invite you to develop actor objects that can be "used" in different ways. Say you invent a new set of objects that can be "pushed"; for these you would play the push animation rather than the unlock animation, provided the AI/player actually implements that action/animation. (A clever AI would be able to play these use-actions randomly: your NPC AI could randomly sit, open doors, lie down, get up, etc., and find food and eat when hungry!)

                        When designing this you would want to plan a bit: what value can actor objects provide ("rest", "energy", "access")? From that you can enable a range of simple options that make your NPC look clever (say, can the NPC find a key to unlock the door?). However, if you go that way you would also want the player to actually notice the difference in the AI; often random or even pre-planned behavior will be enough. (If you can order the NPC to search for a key, rest when sleepy, or find food when hungry, that would be cool, right?)
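                        The "move-to location plus a list of use-actions" idea above can be sketched as a tiny data model (plain C++ with hypothetical names, not UE4 API; in the engine this would likely be an interface or component on the actor):

                        ```cpp
                        #include <map>
                        #include <string>

                        struct Vec3 { float X, Y, Z; };

                        // A usable object advertises where to stand and which animation each
                        // supported action should play (the "list of possible use-actions").
                        struct UsableObject
                        {
                            Vec3 MoveToLocation;
                            std::map<std::string, std::string> ActionToAnim; // action -> animation name
                        };

                        // The player/AI asks the object which animation fits the requested
                        // action; an empty string means the action is not supported.
                        std::string PickAnimation(const UsableObject& Obj, const std::string& Action)
                        {
                            auto It = Obj.ActionToAnim.find(Action);
                            return It != Obj.ActionToAnim.end() ? It->second : std::string();
                        }
                        ```

                        A new pushable object then just registers a "Push" entry instead of "Open"; the character code that walks to MoveToLocation and plays the looked-up animation never changes.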
