adastra22 4 days ago
I think you're missing the point. The game could be processing input and doing a state update at 1000 Hz while still rendering a mere 60 fps. There doesn't have to be any correlation whatsoever between frame rate and input processing. Furthermore, this would actually have lower latency, because there wouldn't be a pipeline of frame buffers being worked on. Tying the input loop to the render loop is a totally arbitrary decision that the game industry is needlessly perpetuating.
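For concreteness, here's a minimal sketch of that kind of decoupled loop (a standard fixed-timestep accumulator; every name below is an illustrative stub, not any engine's actual API). The simulation drains fixed 1 ms ticks from an accumulator regardless of how long each frame takes to present, so over one second it performs roughly 1000 state updates against ~60 rendered frames:

    // Sketch only: stubs stand in for real input polling, game state, and rendering.
    #include <chrono>
    #include <cstdio>
    #include <thread>

    using Clock = std::chrono::steady_clock;
    constexpr auto TICK = std::chrono::microseconds(1000);  // 1 ms = 1000 Hz updates

    static int sim_ticks = 0;
    static int frames = 0;

    void poll_input_and_step() { ++sim_ticks; }  // stand-in: read input, update state

    void render_frame() {                        // stand-in: draw + present (~16 ms)
        ++frames;
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    int main() {
        auto previous = Clock::now();
        auto accumulator = Clock::duration::zero();
        const auto end = previous + std::chrono::seconds(1);

        while (Clock::now() < end) {
            const auto now = Clock::now();
            accumulator += now - previous;
            previous = now;

            // Drain as many fixed 1 ms ticks as wall-clock time allows,
            // independent of how long the frame below takes to render.
            while (accumulator >= TICK) {
                poll_input_and_step();
                accumulator -= TICK;
            }

            render_frame();
        }
        std::printf("simulation ticks: %d, frames: %d\n", sim_ticks, frames);
    }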
hamdingers 4 days ago
No, I'm explaining how most games work in practice. You're right that a game could be made to work that way. I'm not aware of one; I don't have exhaustive knowledge, so it wouldn't surprise me if examples exist, but that wasn't the question.
| ||||||||||||||||||||||||||||||||
fizzynut 4 days ago
Doing that will increase input latency, not decrease it. There are many tick rates running at the same time in a game, but generally grabbing the latest input at the last possible moment before updating the camera position/rotation is the best way to reduce latency.

It doesn't matter if you're processing input at 1000 Hz if the rendered output is going to have 16 ms of latency embedded in it. If you can render the game in 1 ms, then the image generated has 1 ms of latency embedded in it.

In a magical ideal world where you know how long a frame will take to render, you could schedule it to execute at a specific time to minimise input latency, but that introduces a lot of other problems: the scheme is very vulnerable to frame-time jitter, and software scheduling is itself jittery.
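A rough sketch of that "last possible moment" idea (again with made-up stub names, assuming a single frame in flight and no queued frame buffers): however fast the gameplay tick runs, the camera is rebuilt from input sampled immediately before the draw, so the presented image embeds roughly the render/present time as latency and nothing more:

    // Sketch only: stubs stand in for real input sampling and rendering.
    #include <chrono>
    #include <cstdio>
    #include <thread>

    struct Input  { float yaw = 0.f, pitch = 0.f; };
    struct Camera { float yaw = 0.f, pitch = 0.f; };

    Input read_latest_input() { return Input{1.f, 2.f}; }  // freshest device state

    Camera build_camera(const Input& in) {                 // camera uses that fresh sample
        return Camera{in.yaw, in.pitch};
    }

    void draw_and_present(const Camera&) {                 // stand-in for the 1-16 ms render
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    int main() {
        for (int frame = 0; frame < 3; ++frame) {
            // Gameplay logic may have ticked any number of times before this point;
            // for perceived latency, what matters is that the camera is rebuilt from
            // input sampled immediately before the frame is drawn and presented.
            Camera cam = build_camera(read_latest_input());
            draw_and_present(cam);
            std::printf("frame %d presented\n", frame);
        }
    }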