Hello everyone.
English is not my native language, so please excuse any spelling or grammar mistakes.
I am currently working on a top-down 2D game similar to RPG Maker games.
To achieve a similar graphics style, I use nearest filtering plus editor icon compression for the textures.
I am also using an orthographic camera for rendering.
Right now I am experiencing some strange image distortions while the camera is moving.
It looks like vertical/horizontal lines moving through the image.
I am pretty sure this is because I am using nearest filtering for the textures.
The texture resolution is 32x32 pixels, and the mesh planes used for the grid are 100x100 units in size.
After rendering, those planes are about 64x64 pixels on screen. I believe the distortions are caused
by rounding issues: sometimes the texture gets projected to 63x63 pixels instead of 64x64,
so one texel column ends up one pixel narrower (or wider) than the others.
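To check whether this explanation holds up, here is a small sketch (the function name and the floor-based nearest mapping are my assumptions, not engine code) that counts how many screen pixels each texel column covers when a 32-texel-wide texture is drawn at 64 vs. 63 pixels:

```python
def texel_columns(screen_px: int, tex_px: int) -> list[int]:
    """For each texel column, count how many screen pixels sample it
    under nearest filtering (floor mapping from pixel to texel)."""
    counts = [0] * tex_px
    for x in range(screen_px):
        texel = x * tex_px // screen_px  # nearest-neighbour lookup
        counts[texel] += 1
    return counts

print(texel_columns(64, 32))  # every texel covers exactly 2 pixels
print(texel_columns(63, 32))  # uneven: one texel column covers only 1 pixel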
I tried multiple values of "Ortho Width" on the orthographic camera to achieve a perfect, rounding-free projection,
but this seems to be impossible.
When I tried using a perspective camera, those artifacts were mostly gone.
I think this is because temporal AA is used with the perspective camera, while the orthographic camera falls back to FXAA.
Am I right with this assumption?
I tried making some screenshots and videos of the distortions,
but the screenshots are useless because the distortions are only visible during movement,
and most of the detail is lost to video compression. I could still upload the videos if that helps.
So what am I looking for:
Is there a way to use the same temporal AA filter with orthographic rendering as with perspective rendering?
This would not fix the actual problem, but it would hide the symptom.
Or is there something I can do to fix the rounding issues?
Like a mathematical way to calculate the ortho width from the screen resolution and texture size.
Or maybe I am doing something completely wrong?
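For what it's worth, here is the formula I would expect for a pixel-perfect ortho width, as a sketch under my assumptions (a 1920-pixel-wide viewport and a target of 2 screen pixels per texel are example values, and the camera position would also need to be snapped to the pixel grid):

```python
def ortho_width(viewport_px_w: int, plane_units: float,
                tex_px: int, pixels_per_texel: int) -> float:
    """World-space width the orthographic camera must show so that each
    texel covers exactly `pixels_per_texel` screen pixels (integer scale)."""
    units_per_texel = plane_units / tex_px            # e.g. 100 / 32 = 3.125
    units_per_pixel = units_per_texel / pixels_per_texel
    return viewport_px_w * units_per_pixel

# 1920 px viewport, 100-unit planes, 32x32 textures, 2 px per texel:
print(ortho_width(1920, 100, 32, 2))  # -> 3000.0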
Thank you for reading. I would appreciate any help or advice.