In this particular dev log, I’m going to focus on the AI system I’ve prototyped. It’s not rocket science, but it definitely gets the job done. So far I’ve managed to have my AI agent move back and forth between a set of waypoints and follow the player when they come into view. I’ve broken down what I did below.

Basic Way-point System

The waypoint system I’ve coded uses Unity’s basic Nav Mesh setup. In the scene, I have 5 waypoints (the yellow spheres) scattered throughout the level that are assigned on start-up.

// get all gameobjects with the tag "Waypoint" and assign them to an array
m_col_goWaypoints = GameObject.FindGameObjectsWithTag("Waypoint");

Once assigned, the Nav Mesh Agent (the red cube) picks the first point in the waypoint array and moves to it. When it reaches the point, it waits for a short delay and then proceeds to the next point. I’ve used Vector3.Distance to determine the stopping distance between the waypoint and the AI game object.

// determine the distance between the target object and this object
Vector3.Distance(m_transTarget.transform.position, this.transform.position)

For a pretty basic setup this gets the job done rather nicely.
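Putting those pieces together, the patrol behaviour might look something like the following minimal sketch. Only FindGameObjectsWithTag and Vector3.Distance come from the snippets above; the field names, the stop distance, and the delay value are my own assumptions, not the post’s actual controller.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Minimal patrol sketch -- field names and values are assumptions
public class WaypointPatrol : MonoBehaviour
{
    private GameObject[] m_col_goWaypoints;   // filled from the "Waypoint" tag
    private NavMeshAgent m_navAgent;
    private int m_intCurrentWaypoint = 0;
    private float m_fltWaitTimer = 0f;

    public float m_fltStopDistance = 0.5f;    // how close counts as "arrived"
    public float m_fltWaitDelay = 2f;         // pause at each waypoint, in seconds

    void Start()
    {
        // get all gameobjects with the tag "Waypoint" and assign them to an array
        m_col_goWaypoints = GameObject.FindGameObjectsWithTag("Waypoint");
        m_navAgent = GetComponent<NavMeshAgent>();
        m_navAgent.SetDestination(m_col_goWaypoints[0].transform.position);
    }

    void Update()
    {
        Transform t_transTarget = m_col_goWaypoints[m_intCurrentWaypoint].transform;

        // determine the distance between the target waypoint and this object
        if (Vector3.Distance(t_transTarget.position, this.transform.position) <= m_fltStopDistance)
        {
            // arrived: wait out the delay, then head to the next waypoint
            m_fltWaitTimer += Time.deltaTime;
            if (m_fltWaitTimer >= m_fltWaitDelay)
            {
                m_fltWaitTimer = 0f;
                m_intCurrentWaypoint = (m_intCurrentWaypoint + 1) % m_col_goWaypoints.Length;
                m_navAgent.SetDestination(m_col_goWaypoints[m_intCurrentWaypoint].transform.position);
            }
        }
    }
}
```

The modulo on the waypoint index makes the agent loop back to the first point after the last one, which gives the back-and-forth patrol described above.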

Player Tracking

After I got my basic waypoint system working, the next thing I wanted to take care of was player tracking. I knew I wanted the player to be tracked at all times whenever they fall into the AI’s field of vision, so I had a couple of ideas on how to tackle the concept.

First I needed to determine what the AI’s field of vision was. To properly calculate the angles at which the player can be detected, I compute the direction vector from the AI object to the player and pass it to Vector3.Angle to get the angle between that direction and the AI’s forward vector.

// get the direction vector from the AI object to the player
// then get the angle between that direction and the AI's forward vector
Vector3 t_vecTargetDir = m_goPlayer.transform.position - this.transform.position;
float t_fltTargetAngle = Vector3.Angle(t_vecTargetDir, this.transform.forward);

Player Tracking Raycast

I knew that once I had that angle, it was only a matter of determining the angular range within which the player can be detected, which in this case was 45 degrees to either side of the AI’s forward direction.

// if the player is within the field of vision
// (Vector3.Angle returns an unsigned angle from 0 to 180,
// so only the upper bound actually matters here)
if (t_fltTargetAngle <= 45)
    // run the function here

You can see from the screenshot above that when the player stands anywhere within 45 degrees to either side of the AI object’s forward direction, they are detected. I represented this with a white debug line.
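That debug line can be drawn with Unity’s Debug.DrawLine; a one-line sketch, assuming the same member names as the snippets above:

```csharp
// draw a white line from the AI to the player while they are in view
// (visible in the Scene view, or in the Game view with Gizmos enabled)
Debug.DrawLine(this.transform.position, m_goPlayer.transform.position, Color.white);
```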

Player Tracking Out of Field of Vision

Once the player moves outside that angular range, the debug line disappears, informing us that the player is no longer being tracked.

Player Tracking Behind Objects

Once I got basic player tracking working, I wanted the player to be detected at all times when they are in the AI’s field of vision, except when they are hiding behind objects.

To accomplish this, I essentially kept a raycast going from the AI to the player at all times while they are in the field of vision. This let me determine when an object is standing between the player and the AI, because the raycast returns hit information about the obstacle it is currently hitting instead of the player.

Player Tracking Behind Objects

This also ensured that once the player comes back into sight, they are immediately detected and tracked again.
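A minimal sketch of that line-of-sight check might look like the following. Physics.Raycast and CompareTag are standard Unity APIs; the member names mirror the snippets above, but the helper itself and the "Player" tag are my own assumptions.

```csharp
// Line-of-sight sketch: the player counts as visible only if they are
// inside the 45-degree field of vision AND the first thing the ray hits
// is the player (not an obstacle). Member names are assumptions.
bool CanSeePlayer()
{
    Vector3 t_vecTargetDir = m_goPlayer.transform.position - this.transform.position;
    float t_fltTargetAngle = Vector3.Angle(t_vecTargetDir, this.transform.forward);

    // Vector3.Angle is unsigned (0-180), so one bound covers both sides
    if (t_fltTargetAngle <= 45)
    {
        RaycastHit t_hitInfo;
        if (Physics.Raycast(this.transform.position, t_vecTargetDir.normalized, out t_hitInfo))
        {
            // if the ray hit something else first, the player is hidden behind it
            return t_hitInfo.collider.gameObject.CompareTag("Player");
        }
    }
    return false;
}
```

Checking what the ray actually hit, rather than whether it hit anything at all, is what makes hiding behind objects work: an obstacle between the two simply becomes the first hit.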

This is definitely still far from perfect, and it’s also not all I have planned for the AI controller. I’ll have more posted next time as I get the AI concept more fleshed out.

Tony V. Le is an Experience Designer who specializes in game and web development. He is also the owner of tvledesign LLC and publishes online tutorials and blogs about his experiences as a freelancer.
