Well, you can't really expect things to be smooth if you're already exceeding 255 ms of ping. That's a quarter of a second! Well over ten times the delay humans can notice by eye.
You need to divide the ping by 2, since the server only sends the command one way (ping measures the send AND receive delay combined). This system WON'T be perfect: at the ping levels you're working with you should also expect a fluctuating/unstable connection, so depending on how the network behaves, the corrected time could still be off by a wide margin. Still better than nothing though.
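Here's a minimal C++ sketch of that correction, assuming you already measure the round trip yourself (RoundTripSeconds and EstimateServerTime are just illustrative names, not engine API):

```cpp
// Estimate what the server's timeline clock reads "right now", given a
// timestamp the server sent and our measured ping. One-way latency is
// approximated as half the round trip, which assumes the route is
// roughly symmetric -- on an unstable connection this is a rough guess.
float EstimateServerTime(float ServerTimestamp, float RoundTripSeconds)
{
    const float OneWayLatency = RoundTripSeconds * 0.5f;
    return ServerTimestamp + OneWayLatency;
}
```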
The prediction/interpolation happens outside the timeline. You want a lagging client's timeline to “catch up”, so you use an interpolation to accelerate the timeline play speed by a certain factor so that it converges on the server timeline. Making this happen over more frames makes it smoother, but there will STILL be a desync while it converges. Making it happen over fewer frames gives the most accurate representation (client side) of where the server timeline is, but leads to rubber banding. See the sketch below.
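Something like this is what I mean by accelerating the play speed (all names here are illustrative; CatchUpSeconds is the smoothness knob):

```cpp
#include <algorithm> // std::clamp

// Pick a play rate that closes the gap between the client timeline and
// the (estimated) server timeline over roughly CatchUpSeconds.
// Bigger CatchUpSeconds = smoother but desynced for longer;
// smaller = more accurate but rubber bands.
float ComputeCatchUpRate(float ClientTime, float EstimatedServerTime,
                         float CatchUpSeconds)
{
    const float Desync = EstimatedServerTime - ClientTime;
    // Speed up (or slow down) so the gap closes over CatchUpSeconds.
    const float Rate = 1.0f + Desync / CatchUpSeconds;
    // Clamp so a ping spike can't make the timeline jump or run backwards.
    return std::clamp(Rate, 0.5f, 2.0f);
}
```

Each tick you'd feed the result into the timeline's play rate (SetPlayRate on the timeline component, if you're doing this in C++), then drop back to 1.0 once the desync is negligible.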
Bonus:
You didn't implement a “time in seconds” float output in your timeline. When a new player joins, it should request the current timeline time from the server and start playing from there (hooks up to the “New Time” input node, if I'm not mistaken). And don't hook it up to Play from Start; just set the start time to 0 if you want to play from the start. Makes your code much more flexible. Rough C++ equivalent below.
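If you're in C++ rather than Blueprint, the same idea looks roughly like this (OnServerTimeReceived is a hypothetical RPC reply handler, MyTimeline an assumed UTimelineComponent* member; SetNewTime/Play should be the C++ counterparts of those Blueprint nodes, if I remember the API right):

```cpp
#include "Components/TimelineComponent.h"

// Hypothetical handler for the server's reply carrying its current
// timeline time. Instead of PlayFromStart, seek to the server's
// position and play from there, so late joiners start in sync.
void AMyActor::OnServerTimeReceived(float ServerTime, float RoundTripSeconds)
{
    // Compensate for the one-way trip the reply just took (ping / 2).
    const float CorrectedTime = ServerTime + RoundTripSeconds * 0.5f;
    MyTimeline->SetNewTime(CorrectedTime); // the "New Time" input
    MyTimeline->Play();                    // NOT PlayFromStart
}
```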