
Cover system design thoughts - feedback welcome

Hey all.

I'm almost at the point where I can dive into some more C++-oriented AI work for the summer. One of the bigger projects I've set myself is a cover generator, so I wanted to spitball some ideas for the cover system here with you lot and get some feedback if anyone cares to dive in.

So the basis for the cover implementation is essentially the stuff I'm going to steal from Mikko Mononen and Matthew Jack, from their time doing AI at Crytek (I think Matthew did a pass on it after Mikko, but I might be wrong there). Anyway, the basic functionality is pretty clearly documented on the pages Matthew wrote for CryEngine. The main thing is to have a very configurable cover selection system, pretty similar to EQS in many respects: it generates and scores points. My plan is to generate a bunch of cover locations with associated data (using user-configurable data attributes and a way to set up a cover-scoring class to plug into the system) and store them in some hierarchical data structure for quick lookup, then implement an EQS generator that allows a configurable subset of those cover points to be selected and scored for the final cover selection.
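To make the "generate points, score points" idea concrete, here's a minimal C++ sketch of what a generated cover point and a pluggable scoring class might look like. All the names here (FCoverPoint, ICoverScorer, and so on) are placeholders of my own invention, not engine types:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical sketch: one generated cover location plus whatever
// per-point data the generation pass attaches to it.
struct FCoverPoint
{
    float X = 0.f, Y = 0.f, Z = 0.f;    // world position on the navmesh edge
    float NormalX = 0.f, NormalY = 0.f; // direction the cover faces (into open space)
    uint32_t Flags = 0;                 // user-configurable attribute bits
};

// Pluggable scorer, analogous to an EQS test: given a candidate point,
// return a score (higher = better). Real versions would also take a
// query context (querier location, known threats, etc.).
class ICoverScorer
{
public:
    virtual ~ICoverScorer() = default;
    virtual float Score(const FCoverPoint& Point) const = 0;
};

// Trivial example scorer: prefer points whose flag bit 0 is set.
class FFlagScorer : public ICoverScorer
{
public:
    float Score(const FCoverPoint& Point) const override
    {
        return (Point.Flags & 1u) ? 1.f : 0.f;
    }
};

// Select the best point from a candidate set using any scorer.
inline const FCoverPoint* SelectBestCover(
    const std::vector<FCoverPoint>& Points, const ICoverScorer& Scorer)
{
    const FCoverPoint* Best = nullptr;
    float BestScore = -1.f;
    for (const FCoverPoint& P : Points)
    {
        const float S = Scorer.Score(P);
        if (S > BestScore) { BestScore = S; Best = &P; }
    }
    return Best;
}
```

The hierarchical storage and the EQS generator would sit on either side of this: the store feeds candidate subsets in, the scorer picks the winner.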

But here's where things get a bit tricky: I'm doing a Rainbow Six-style game and I want to be able to support various scales of conflict, for instance some tight CQB in a single building, or some longer-range hijinks over a larger terrain (hell, why not, if UE4 does it right?). So I'm thinking that from the outset I'll need this to support dynamic generation (i.e. as things dynamically load with the landscape). My initial plan is simply to find the entry points for navmesh streaming/generation and grab the navpoly data from those to feed into an edge list for cover generation (cover, by definition, only really exists where the navmesh has been carved by something).

Hooking the navmesh should hopefully give me the same sort of workflow, in that it will be mostly automated as the level is built, but can then be augmented with additional designer-placed hints. One use case for me is door breaching for a SWAT team, where the door initially blocks the navmesh, but once the door is opened the cover points that were generated would naturally change. Designer hints are needed for the door use case to make sure the selected cover understands the relationship with the door state.

I'd love to do some work on implementing Ben Sunshine-Hill's stuff from AiGameConf 2014 on using volume sweeps for things like mantling navigation generation (essentially a really nice way of choosing where AI can vault over obstacles etc. without collision). But I doubt that's going to get done in my summer break.

So my main issues design wise are:

What data do we need to store for a particular cover spot? Do we want that to be designer/programmer-configurable à la EQS? (I'm thinking so.)
What is the most rational data structure for rapid lookup of cover locations, considering that the navigable area at any time might well be streaming out or changing, and might have multiple overlaps?
Does this cover generation need to be a user-chosen action? I.e. is it fully automated, or do you have to refresh it manually if you're working on a level?
What kind of debug information is required for designers to know they can rely on the system when working on AI and level design?
How much designer input is needed overall? Can we live with some simple trigger-type hints, or must designers be able to manually reconfigure the automatically generated content?
What kind of programmer API is needed to provide the required information? I'm assuming a very EQS-style system (generator, filters, output buffers filled over multiple frames, notification-based, etc.); is there anything better?

Would love to hear your thoughts on this. I'll probably give it away once it's implemented, or submit it as a PR for inclusion in the normal build.

Sounds fun! I’m not clued up on a lot of this stuff, but here are my initial thoughts.

Am I right in thinking you’re proposing a dynamic generation phase, which continues as level geometry is streamed in and out (or perhaps even adjusted for dynamic objects?), updating the data structure as it goes, followed by a scoring/selection phase which is executed (possibly with different configuration) each time you want to query the system?

When you refer to navpoly data, do you mean the polys of the navmesh itself, or the polys of the collision shapes from which the navmesh gets generated? It sounds like the former, but in that case wouldn’t you be imposing a fundamental limitation on your system by coupling it so tightly to navigation? I’m assuming the primary measure of ‘cover’ is line-of-sight visibility, which will often not match up with navmesh boundaries.

So is the data stored with generated cover points used only as a means to an end - to eventually be used to determine the best cover points in the scoring phase? If so could the necessary data be determined automatically based on the requirements of any scoring phase configurations registered by the designer?

Yeah, for anything this in-depth an asynchronous querying system is a must, I'd say.

I think I’d better leave it there for now as there’s a good chance I’ve totally misunderstood some aspects of this!

Yeah, it would have to be dynamic, I think. Having anything precalculated pretty much flies in the face of the engine's direction of travel, and given the navmesh can be dynamic I think I'd prefer the same. Having said that, I'd probably cache the generated edges/points in the same manner as the navmesh is cached, so that you don't incur the (probably quite huge) cost of the cover evaluation phase too often at runtime.

By navpoly data I mean the navmesh triangles, yeah. Although that wouldn't fundamentally limit the system, as it could simply handle input from elsewhere too. But to make it reasonably performant it seems like a good idea to limit it to navmesh edges which aren't shared. So basically anything with an outline in the navmesh would be fed into the generator (as a list of vertex pairs initially, I guess). Then the generator would do the various checks to determine where along the navmesh edge you can place cover points (using some configurable check). I agree it would limit the generator to the navmesh initially, but honestly I think that's a fair tradeoff for performance. It might be useful to think of cases where you might not even use a navmesh, but you could handle those by passing in a list of vertex pairs from any other system.

The primary measure of cover is proximity to something that blocks visibility, which in most cases also blocks the navmesh, so the pairs should be at least a good first-pass selection for cover points. The reason I'm suggesting this is to keep the actual cost of cover generation within a reasonable budget, given how expensive raycasts/physics sweeps are.
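As a rough sketch of that first pass, you could walk each boundary edge at a fixed spacing and run a configurable check at each sample. The callback here is a stand-in for the real raycast/sweep, so the expensive physics work stays pluggable and budgetable; all names are hypothetical:

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <vector>

struct FVec2 { float X, Y; };

// Hypothetical sketch: sample a navmesh boundary edge (A -> B) every
// Spacing units and keep the samples that pass the configurable check.
inline std::vector<FVec2> SampleEdgeForCover(
    FVec2 A, FVec2 B, float Spacing,
    const std::function<bool(const FVec2&)>& IsValidCoverSpot)
{
    std::vector<FVec2> Out;
    const float DX = B.X - A.X, DY = B.Y - A.Y;
    const float Len = std::sqrt(DX * DX + DY * DY);
    if (Len <= 0.f || Spacing <= 0.f)
        return Out;

    const int Steps = static_cast<int>(Len / Spacing);
    for (int I = 0; I <= Steps; ++I)
    {
        const float T = (Steps == 0) ? 0.f : static_cast<float>(I) / Steps;
        const FVec2 P{A.X + DX * T, A.Y + DY * T};
        if (IsValidCoverSpot(P)) // stand-in for the occlusion raycast/sweep
            Out.push_back(P);
    }
    return Out;
}
```

Since the check is the only expensive part, the number of samples per edge is the main budget knob.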

Then the rest is just an async API, as you say, to make sure the query costs can be amortized over time.

That's the plan anyway. I'll do some quick tests to evaluate performance and see if navmesh edges are really the right way to select potential cover locations.

Yep, I realised I was thinking about it the wrong way with respect to visibility checks. I guess you're saying the first phase, using the nav polys, just generates potentially plausible cover points (by virtue of being close to walls/impassable objects). The second phase would be where you'd do visibility raycasts and the like, when you have more specific contextual information such as the locations of players and enemies.

So what kind of things were you thinking of with regard to associating data with generated cover points? Things like height of cover, directional cover (just one direction, 180 degrees, etc.), cover solidity (concrete wall/glass window)? Assuming the query system you provide for the final phase allows a lot of flexibility in the kinds of things that can be tested, presumably any data stored with the cover point is more of an optimisation than a requirement? By that I mean these values could potentially be computed from scratch in the evaluation phase, but if you store some context-agnostic data with the points and expose those predefined data items through the query system, you can avoid more costly calculations being run with each query.

Exactly. Potentially you could attach any arbitrary data you like to them, but the ones you mention are pretty standard. Ideally I'd store some bitflags for the general cases (i.e. crouching and standing, maybe 8 directions of each) as a means of speeding up tests. But I'd rather make that a flexible "user data" container of some kind and leave both the creation of that data and its evaluation to user-configurable classes. I'll make a quick UML diagram or some such once I've figured the whole thing out in my head.
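For the bitflag idea, something like this sketch could work: 8 direction bits for standing cover in the low byte, 8 for crouching cover in the next byte, leaving the rest of the word free for user-defined attributes. The exact layout is purely an assumption for illustration:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical flag layout: bits 0-7 = standing cover per 45-degree
// direction, bits 8-15 = crouching cover per direction.
enum : uint32_t
{
    StandShift  = 0,
    CrouchShift = 8,
};

// Mark a cover point as providing cover in direction Dir8 (0-7).
inline uint32_t SetCoverDir(uint32_t Flags, int Dir8, bool bCrouch)
{
    return Flags | (1u << ((bCrouch ? CrouchShift : StandShift) + Dir8));
}

// Cheap query-time test: a single shift and mask, no raycast needed.
inline bool HasCoverDir(uint32_t Flags, int Dir8, bool bCrouch)
{
    return (Flags >> ((bCrouch ? CrouchShift : StandShift) + Dir8)) & 1u;
}
```

The point of the compact encoding is that a query can reject most points with one bit test before anything expensive runs.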

So you'd generate points, then generate the cover data values for those points; then at query time you'd filter the set of points down to a small subset and choose the one you want using whatever scoring metric you prefer. That really is quite similar to EQS in functionality, except that the generation happens during navmesh rebuild. So it might be that a good chunk of the work can simply be thrown into the EQS system, with a custom EQS generator that reads from the cache of cover points and does its own scoring of them.
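That query-time flow (filter cheaply, then score only the survivors) might look roughly like this. Again, the names and signatures are hypothetical, mirroring the EQS filter/score split rather than reproducing its actual API:

```cpp
#include <cassert>
#include <cstdint>
#include <functional>
#include <vector>

// Minimal stand-in for a cached, pre-generated cover point.
struct FCachedCover { float X; float Y; uint32_t Flags; };

// Hypothetical sketch of a query: cheap filter first (flag/bit tests),
// then the expensive scoring metric only on the points that survive.
// Returns the index of the best point, or -1 if nothing passed the filter.
inline int QueryBestCover(
    const std::vector<FCachedCover>& Cache,
    const std::function<bool(const FCachedCover&)>& Filter,
    const std::function<float(const FCachedCover&)>& Score)
{
    int BestIdx = -1;
    float BestScore = 0.f;
    for (size_t I = 0; I < Cache.size(); ++I)
    {
        if (!Filter(Cache[I]))
            continue;                    // rejected without any costly work
        const float S = Score(Cache[I]); // e.g. raycasts against known threats
        if (BestIdx < 0 || S > BestScore)
        {
            BestIdx = static_cast<int>(I);
            BestScore = S;
        }
    }
    return BestIdx;
}
```

In an EQS-backed version, the filter and scorer would map onto EQS tests and the loop would be spread over multiple frames, but the shape of the pipeline is the same.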

The other major use case is the special case of stuff like doors, where the "cover" changes as you open the door. But even that case should work with the navmesh regeneration setup, I guess.

I’m super interested in seeing how this goes for you.
In regard to your questions, all below is just my opinion, of course :smiley:

I would imagine designers would want cover to be disabled at certain times, even when the navmesh and other tests are valid. AI will also need to know what sort of cover animation to play based on the cover it's using. I have all of this set up manually for our game currently.

To be consistent with the rest of the engine I’d probably go with an octree in some fashion.

A lot of the generation can probably be done offline/editor time. Only when some cover is disabled or near destructibles would I see it need to happen at runtime.

Visual debug rendering of the cover slots/walls/however you represent them. Ideally you can piggyback on the EQS testing pawn method: move a player or AI around a level and have the best cover highlighted at editor time.

For an automated system I would give the designers as little power as possible to keep things simple, easier to understand and keep them from breaking anything. So hints :smiley:

Devil is in the details

Yep, makes a lot more sense to me now.

This is interesting from my point of view. I'm working on a system myself (loosely, AI spatial awareness) which also has a great deal of overlap with EQS. It currently doesn't make use of EQS, though, as I started a while before I even saw Mieszko's introductory video. Since then I've wondered whether I should implement it through EQS. Clearly there are benefits to doing so; however, my system (and perhaps the one you're suggesting) is potentially so processor-intensive that my suspicion is it will need a dedicated solution to enable as much optimisation as possible. EQS is extremely generic and flexible, which is great, but I'm guessing (though I haven't studied the code) that this inevitably leads to some performance overhead.

Well anyway, I’ll be very interested in following your progress and decision making on this.