AI Perception Sense

Greetings everyone. Can anyone explain how to work with AI Prediction and AI Touch in AI Perception using Blueprints, and whether it is possible without C++?
I am particularly interested in AI Touch. The documentation mentions that it can be used, for example, for contact between the player and the AI: in a stealth game, the player would sneak up, and the AI would run some logic when AI Touch is triggered. What forms the basis of such contact, collisions of meshes / capsules?
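For context, my understanding (please correct me if I am wrong) is that AISense_Touch does not listen to collisions by itself; a touch event has to be reported to the perception system manually, for example from an overlap callback. A rough C++ sketch of what I mean, assuming a recent engine version where the static `UAISense_Touch::ReportTouchEvent` helper exists (the character class and callback names here are mine, and the exact signature may differ between engine versions):

```cpp
#include "Perception/AISense_Touch.h"
#include "GameFramework/Character.h"

// Hypothetical character class: forward a capsule overlap to the AI
// perception system as a Touch stimulus, so AI Touch has something to sense.
void AMyStealthCharacter::OnCapsuleBeginOverlap(
    UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
    UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
    bool bFromSweep, const FHitResult& SweepResult)
{
    if (OtherActor && OtherActor != this)
    {
        // Assumed signature (check your engine version):
        // ReportTouchEvent(UWorld&, AActor& Instigator, AActor& Other, FVector Location)
        UAISense_Touch::ReportTouchEvent(
            *GetWorld(), *OtherActor, *this, GetActorLocation());
    }
}
```

If that is right, it would also explain why nothing happens with capsule overlaps alone.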
The AI debugger shows no results for this sense.

I would also like to find out: has team (affiliation) configuration been exposed to Blueprint, or can it still only be implemented in C++? I remember everyone complaining about having to enable all the checkmarks in the Sense settings, otherwise the AI would not detect anything and would not divide actors into teams.
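For reference, the only way I have seen affiliation-based detection configured is in C++, via the `DetectionByAffiliation` flags on a sense config together with `IGenericTeamAgentInterface`, roughly like this (the controller class name is mine; this is the part I am hoping can now be done in Blueprint):

```cpp
#include "AIController.h"
#include "Perception/AIPerceptionComponent.h"
#include "Perception/AISenseConfig_Sight.h"

// Hypothetical AI controller: configure Sight so only enemies are detected.
AMyAIController::AMyAIController()
{
    PerceptionComponent = CreateDefaultSubobject<UAIPerceptionComponent>(TEXT("Perception"));

    UAISenseConfig_Sight* SightConfig =
        CreateDefaultSubobject<UAISenseConfig_Sight>(TEXT("SightConfig"));
    SightConfig->SightRadius = 1500.f;
    SightConfig->LoseSightRadius = 2000.f;
    SightConfig->PeripheralVisionAngleDegrees = 70.f;

    // These are the "checkmarks" mentioned above: with all three false,
    // the sense detects nothing at all.
    SightConfig->DetectionByAffiliation.bDetectEnemies = true;
    SightConfig->DetectionByAffiliation.bDetectNeutrals = false;
    SightConfig->DetectionByAffiliation.bDetectFriendlies = false;

    PerceptionComponent->ConfigureSense(*SightConfig);
    PerceptionComponent->SetDominantSense(SightConfig->GetSenseImplementation());
}

// Team identity comes from IGenericTeamAgentInterface; AAIController
// already implements it, so overriding GetGenericTeamId is enough here.
FGenericTeamId AMyAIController::GetGenericTeamId() const
{
    return FGenericTeamId(1); // arbitrary team number for this example
}
```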

Thanks in advance for any answers. No matter how much I searched the Internet, all the information I found covers only Sight, Hearing, and occasionally Damage.

Sorry for my bad English.