Hi
I’m trying to make the driving agents also avoid obstacles on the track, but I can’t get it to work. After reading all the posts about this topic, such as “Learning Agents 5.4 Question, adding obstacles to the track of the tutorial”, “What is the best way to add ray cast to observations? Learning Agents 5.4”, and “Ray cast Observation for obstacles detection and Rewards”, my implementation looks like this:
Interactor
-
Specify Agent Observation
-
Gather Agent Observation
I basically shoot 7 rays and build an array of observations, one entry per ray angle.
- Actions
I didn’t change the actions.
Trainer
- Gather Agent Reward
I still want the car to drive forward, so I left the following rewards from the tutorial in place:
And those are the rewards I added:
First I raycast 3 times:
Then I give a penalty if the actor gets too close to the hit positions of the obstacles:
If the car actually hits something (OnActorBeginOverlap), I also give a penalty:
I’ve experimented with different Distance Thresholds and Reward Scales, but in the end the cars just drive as fast as they can without accounting for any obstacles.
- Gather Agent Completion
If they drive out of the track:
If they hit something, or are about to hit something:
Then I combine those with an Or node.
This is the track, the white dots are the obstacles:
I let this run overnight, and the result was them just hitting every obstacle in their path, not even trying to avoid them. Any idea what I’m doing wrong? Any help is appreciated. Thank you!