Is there a significant performance improvement when using fewer numbers?

I have not done any big projects yet, but I would guess there’d be a substantial performance gain on a large project from using fewer decimal places and using integers wherever floats are not really needed.

That’s a myth: ints and floats are about equal in speed, and the number of decimal places is completely irrelevant. It’s much more practical to think about whether floats or ints make more sense for the calculations you have to do than to worry about speed.
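
To make that concrete, here’s a minimal C++ sketch (my own illustration, not from any particular engine): the number of decimal places you type has no effect on how a float is stored, and a simple arithmetic loop runs at comparable speed for both types. Treat the timings as rough; they depend heavily on the compiler, optimization flags, and CPU.

```cpp
#include <chrono>
#include <cstdio>

int main() {
    // The decimal places are a source-code detail only: 0.5f and
    // 0.123456789f both occupy the same 32 bits. On typical platforms
    // a float and an int are the same size, too.
    std::printf("sizeof(float) = %zu, sizeof(int) = %zu\n",
                sizeof(float), sizeof(int));

    const int N = 100'000'000;

    // volatile keeps the optimizer from deleting the loops entirely;
    // it also makes this a rough sketch rather than a precise benchmark.
    volatile float f = 1.0f;
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < N; ++i) f = f * 0.999999f + 0.5f;
    auto t1 = std::chrono::steady_clock::now();

    volatile unsigned u = 1;
    auto t2 = std::chrono::steady_clock::now();
    for (int i = 0; i < N; ++i) u = u * 3u + 1u;
    auto t3 = std::chrono::steady_clock::now();

    using ms = std::chrono::duration<double, std::milli>;
    std::printf("float loop: %.1f ms\n", ms(t1 - t0).count());
    std::printf("int loop:   %.1f ms\n", ms(t3 - t2).count());
}
```

Compile with optimizations on (e.g. -O2); without them neither loop tells you anything meaningful.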

But all those unneeded numbers would have to start costing at some point, wouldn’t they?

As @Zeblote said, it’s better to worry about which makes more sense than to worry about speed. Odds are you’ll be limited far more by your algorithms or content than you ever will be by the raw speed of float/int calculations. And both can be SIMD’d, so you can take advantage of that if they ever do become a problem.
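
To illustrate the SIMD point, here’s a small sketch using x86 SSE2 intrinsics (it assumes an SSE2-capable x86 CPU and the standard <immintrin.h> header): one instruction adds four floats at once, and one instruction adds four ints at once, so neither type has a fundamental vectorization advantage.

```cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    // 16-byte alignment is required for the aligned load/store intrinsics.
    alignas(16) float fa[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    alignas(16) float fb[4] = {0.5f, 0.5f, 0.5f, 0.5f};
    alignas(16) float fr[4];

    alignas(16) int ia[4] = {1, 2, 3, 4};
    alignas(16) int ib[4] = {10, 10, 10, 10};
    alignas(16) int ir[4];

    // One instruction adds four floats...
    _mm_store_ps(fr, _mm_add_ps(_mm_load_ps(fa), _mm_load_ps(fb)));
    // ...and one instruction adds four 32-bit ints.
    _mm_store_si128((__m128i*)ir,
                    _mm_add_epi32(_mm_load_si128((const __m128i*)ia),
                                  _mm_load_si128((const __m128i*)ib)));

    std::printf("floats: %.1f %.1f %.1f %.1f\n", fr[0], fr[1], fr[2], fr[3]);
    std::printf("ints:   %d %d %d %d\n", ir[0], ir[1], ir[2], ir[3]);
}
```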

When dealing with shader code it starts to matter which types are used, since hundreds of millions of calculations are made every second. In regular gameplay programming, though, it rarely matters at all compared to the other factors mentioned above.
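
The back-of-the-envelope arithmetic shows why the scales are so different (the resolution and frame rate here are just example numbers):

```cpp
#include <cstdio>

int main() {
    // Example numbers only: a 1080p display at 60 fps, with the pixel
    // shader running once per pixel per frame.
    const long long width = 1920, height = 1080, fps = 60;
    const long long pixelShaderRuns = width * height * fps;
    std::printf("pixel shader invocations per second: %lld\n", pixelShaderRuns);
    // ~124 million invocations/sec, and each one typically performs many
    // arithmetic ops, so "hundreds of millions per second" is conservative.
    // Gameplay code running once per frame never gets near these counts.
}
```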