This isn’t really an either/or question. There are positives and negatives to each; that’s why they all exist. It’s like asking whether you should use a hammer or a screwdriver: it depends on what you’re trying to accomplish.
Feel free to correct me if I’m wrong, but this is how I understand the difference between Tick, Timeline and Timer:
Event Tick means your event is called every frame, so the time between ticks is not constant or consistent. If you’re getting 200 FPS, your event is called 200 times per second; if you’re getting 30 FPS, it’s only called 30 times in that second. Your tick rate will constantly fluctuate rather than give you a precise interval. Delta time is the time in seconds (usually a few milliseconds) between the last frame rendering and this one; scaling your per-tick work by it smooths out the end result so the fluctuation isn’t so chaotic.
To see what this value is, use the Get World Delta Seconds node, plug it into a Print String with a duration of 0, call that Print String from Event Tick, and then walk around your level. You’ll see small fluctuations in the World Delta value (0.011 to 0.012, for example).
Tick is generally used when you need information updated every frame regardless of the frame rate. It’s not a timer in and of itself, so make sure you’re not thinking of it as one.
A Timer means you’re aiming for consistent timing. If you set a timer to loop at 1-second intervals, your event will fire once per second regardless of your frame rate. A loop of 0.1 fires 10 times per second, and 0.01 fires 100 times per second. A timer can therefore be more expensive than Tick if you set its interval shorter than your average frame time. For example, if your FPS is 30 but your timer interval is 0.01, it will be much more “expensive” than Tick because it’s calling your code more frequently.
This means a timer is not necessarily less expensive than Tick, because a timer has overhead of its own. For example, it may cost more for the timer to ensure it fires consistently when your FPS is choppy: dips in frame rate force the timer to calculate adjustments so that it still fires at the interval you specified.
A Timeline, as pointed out earlier, consumes a certain amount of RAM (as I’m sure Timers do too), and it is essentially ticking for a specified duration. The difference is that a Timeline has easy-to-use controls to stop what it’s doing. If you set a Timeline to do something on a 1-second loop, I believe it’s more expensive than Event Tick because it’s driven by Tick itself; i.e., not only are you ticking, you’re also incurring the cost of the Timeline on top. However, like a Timer, a Timeline is consistent, so it can be useful when you want something to happen at a steady rate that isn’t tied to your current frame rate.
So the answer is: it depends on what you’re doing and what you’re aiming for. If you want something to fire every frame, use Event Tick. If you want something to fire at a consistent interval, use a Timer. If you want something that runs for a specific duration, has easy Stop, Start, and Reverse controls, and can drive values from curves, use a Timeline.