graphical fidelity peaked in Red Dead Redemption 2, and yet system requirements and file sizes continue to increase. I was kinda shocked to see that the new Indiana Jones game looks about the same graphically as Far Cry 5, which came out in 2018, and yet it recommends an RTX 3080 Ti and a Ryzen 7 7700.
I feel like video game studios might be falling into the same trap the modern web did - there’s a “performance budget” to fill up (tested on extremely powerful machines, of course), and no pressure to optimize anything as long as you aren’t over budget.
so instead of asking “how fast should this reasonably run, given what it is?”, they ask “what hardware will the average gamer have, and what framerate will they expect?”
I’ve been thinking about this lately as I upgrade my computer. modern games really don’t look better than games from around 2018 onward, which my current machine runs just fine - and yet they run so much worse, to the point that I’m forced to buy new hardware.