Is it remotely possible that I can move my character around a scene and generate a spline and look direction from the motion, which I could later tweak and attach a camera to for smoother rendering? Or should I find a better solution?
I think this is possible. What is the end goal so I better understand?
If this is PIE/Play in editor:
You could use Take Recorder to record the character; then you can access all of that data and build a spline or anything else you need from it.
If this is a cooked build:
One idea, and this is admittedly a bit hacky, but I think it would work: generate the spline from the character as it moves, creating a point either based on elapsed time (a new spline point every 0.5 seconds) or on distance traveled (if the character has moved 100 units since the last spline point, create another). Then, before ending the session, print the spline's points to the log.
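The distance-based sampling could be sketched roughly like this. This is plain C++ rather than Unreal code, and `PathRecorder`, `Vec3`, and the 100-unit threshold are my own illustrative names and numbers; in-engine you would call something like `Update` from Tick with the pawn's location and push the points into a `USplineComponent` instead of a plain vector.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Hypothetical stand-in for FVector.
struct Vec3 { float X, Y, Z; };

static float Dist(const Vec3& A, const Vec3& B) {
    const float dx = A.X - B.X, dy = A.Y - B.Y, dz = A.Z - B.Z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Records a new "spline point" whenever the character has moved at
// least SampleDistance units since the last recorded point.
class PathRecorder {
public:
    explicit PathRecorder(float InSampleDistance)
        : SampleDistance(InSampleDistance) {}

    // Call every tick with the character's current location.
    void Update(const Vec3& Location) {
        if (Points.empty() || Dist(Points.back(), Location) >= SampleDistance) {
            Points.push_back(Location);
        }
    }

    // Dump the points in a copy-paste friendly form before ending the session.
    void PrintToLog() const {
        for (const Vec3& P : Points)
            std::printf("%.1f,%.1f,%.1f\n", P.X, P.Y, P.Z);
    }

    const std::vector<Vec3>& GetPoints() const { return Points; }

private:
    float SampleDistance;
    std::vector<Vec3> Points;
};
```

The time-based variant is the same idea, just accumulating delta time instead of distance before adding a point.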
Then create a Blueprint that you can paste this information into, which parses it and generates a spline from the points once you have ended the gameplay session.
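The parsing side could look roughly like this, assuming the points were logged one per line as `X,Y,Z` (as in the sketch above; the format is my assumption, not a fixed Unreal convention). In the editor this logic could live in an Editor Utility Widget that takes the pasted text and adds each parsed point to a new spline actor.

```cpp
#include <cstdio>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical stand-in for FVector.
struct Vec3 { float X, Y, Z; };

// Parse pasted log text (one "X,Y,Z" point per line) back into points.
// Malformed lines are skipped rather than treated as errors.
std::vector<Vec3> ParsePoints(const std::string& PastedText) {
    std::vector<Vec3> Points;
    std::istringstream Stream(PastedText);
    std::string Line;
    while (std::getline(Stream, Line)) {
        Vec3 P{};
        if (std::sscanf(Line.c_str(), "%f,%f,%f", &P.X, &P.Y, &P.Z) == 3)
            Points.push_back(P);
    }
    return Points;
}
```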
The tricky part of the workflow you are describing is that you want to generate the spline at runtime and then keep it for use in the editor. This is just a quick, non-optimized approach off the top of my head, but I think it would work.
I was thinking of using it for a kind of freeform archviz rendering. I wanted freedom of movement around the scene, but without the choppiness of a mouse-controlled camera making it look like a game.