hamdingers 4 days ago
That's correct, and the most competitive multiplayer games tend to run fixed tick rates on the server, but higher FPS is still beneficial (again, only theoretically for all but the highest levels of competition) because your client-side inputs are sampled more frequently and your rendered frames are at most a couple of milliseconds old.
adastra22 4 days ago
I think you're missing the point. The game could be processing input and doing a state update at 1000Hz while still rendering at a mere 60fps. There doesn't have to be any correlation whatsoever between frame rate and input processing. Furthermore, this would actually have lower latency, because there's no pipeline of in-flight frame buffers between your input and what you see. Tying the input loop to the render loop is a totally arbitrary decision that the game industry is needlessly perpetuating.
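
For illustration, a rough sketch in C++ of what that decoupling could look like: a simulation thread sampling input and stepping the state at 1000 Hz, and a render thread drawing the latest published snapshot at ~60 fps. GameState, poll_input, and render here are placeholders, not any real engine's API:

    #include <atomic>
    #include <chrono>
    #include <thread>

    struct GameState { double x = 0.0; };         // placeholder simulation state

    std::atomic<GameState> g_state{GameState{}};  // latest published snapshot

    // Simulation thread: samples input and steps the state at 1000 Hz,
    // completely independent of the frame rate.
    void update_loop() {
        using clock = std::chrono::steady_clock;
        constexpr auto dt = std::chrono::milliseconds(1);
        auto next = clock::now();
        GameState s{};
        for (;;) {
            // poll_input(s);                     // hypothetical: sample input each 1 ms tick
            s.x += 0.001;                         // advance the simulation by dt
            g_state.store(s, std::memory_order_relaxed);
            next += dt;
            std::this_thread::sleep_until(next);
        }
    }

    // Render thread: ~60 fps, always drawing the freshest snapshot, so each
    // frame reflects state that is at most about a millisecond stale.
    void render_loop() {
        constexpr auto frame = std::chrono::microseconds(16667);
        for (;;) {
            GameState s = g_state.load(std::memory_order_relaxed);
            (void)s;                              // render(s); -- hypothetical draw call
            std::this_thread::sleep_for(frame);
        }
    }

    int main() {
        std::thread sim(update_loop);
        render_loop();                            // rendering runs on the main thread
    }

With this split, vsync or buffered presentation only delays what you see, not when your input gets sampled and applied.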