Whenever a new video game comes out with impressive visuals, it ends up requiring beefier hardware to run. Many games on the AAA scale have development times of 6 or more years. How do games that have been in development for so long come out with up-to-date graphics? Do the devs just update the graphics as they go? Are the best-looking games of today actually just what we were capable of doing 6 years ago?
Comments
Game dev here. Yes, some teams do update engines as they go while others don’t (which is why some major studios release games that look a lot like the stuff they were releasing many years ago).
Also, dev machines are often much more powerful than what most customers have. What used to run only on the highest-end machines is now mass-market tech. So what's "cutting edge" for customers is not what's cutting edge for AAA devs.
For example, Guild Wars 1 is a great game that released 20 years ago. A few years back it got a massive graphical improvement: every area in the game looked much better. People weren't sure how, since only ~2 devs keep the game going. It turns out one dev realized the game had better graphics settings that only ran during "screenshot mode", because computers of 20 years ago couldn't handle running those settings all the time.
Today's mass-market computers are much more powerful, so he just enabled the better graphics that had existed all along. People joke about "why don't the devs just flip the 'make game better' switch?", but this time that's basically what happened.
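To make that concrete, here's a toy sketch (my own made-up Python, not the actual Guild Wars code) of how a quality tier can end up gated behind a mode like that:

```python
# Hypothetical sketch of a quality setting gated behind a special mode --
# not Guild Wars' actual code, just the general idea.

class GraphicsSettings:
    def __init__(self):
        self.screenshot_mode = False

    def render_quality(self) -> str:
        # The "better" settings existed all along, but were only
        # switched on for screenshots, where framerate doesn't matter.
        if self.screenshot_mode:
            return "ultra"   # full-res shadows, max draw distance, etc.
        return "standard"    # what players saw for ~20 years

settings = GraphicsSettings()
print(settings.render_quality())   # "standard"

# The "fix": hardware caught up, so run the ultra path all the time.
settings.screenshot_mode = True
print(settings.render_quality())   # "ultra"
```

The "upgrade" was basically just deleting the gate once hardware caught up.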
The graphics you see now took 6 years of development to achieve. Essentially, there's a lag between what's being worked on and what's released.
Multiple things.
First of all, final graphics come relatively late in the development of a game. If a game is in development for 6 years, it can use placeholder graphics for 5 of them.
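In code terms, the placeholder workflow boils down to something like this (a minimal made-up sketch; the paths and names are invented):

```python
# Minimal sketch of the placeholder-asset idea (names are made up).
# During most of development the final art doesn't exist yet, so the
# loader falls back to a gray-box stand-in with the right dimensions.
import os

PLACEHOLDER = "assets/placeholder_graybox.mesh"

def load_model(path: str) -> str:
    # Final art gets dropped into place late in production; until then,
    # gameplay, levels, and systems are all built against placeholders.
    if os.path.exists(path):
        return path
    return PLACEHOLDER

print(load_model("assets/hero_final.mesh"))  # placeholder for 5 of 6 years
```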
Then, most upgrades to graphics cards are incremental; there isn't some completely new technology coming out every year that requires you to think about or render graphics differently than before. Something that actually changes how graphics work would be ray tracing, but shifts like that don't happen all that often.
For just "simply" better graphics, you can already draw/create 16K assets today. You just can't use them in the game because customers' graphics cards aren't good enough. So it's a bit of a prediction game: you estimate how good graphics cards will be when the game releases and create assets to fit that.
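As a rough illustration of that "author high, ship lower" idea (this sketch assumes the Pillow imaging library; the file names and function are invented):

```python
# Sketch of authoring assets above target resolution and shipping them
# downscaled (assumes the Pillow library; file names are made up).
from PIL import Image

def bake_for_target(source_path: str, target_size: int) -> Image.Image:
    # Source art is authored huge (e.g. 16K); at ship time you resample
    # down to whatever the predicted mass-market GPU can handle.
    src = Image.open(source_path)
    return src.resize((target_size, target_size), Image.LANCZOS)

# e.g. a 16384x16384 source texture shipped as 4K for today's cards:
# bake_for_target("textures/rock_16k.png", 4096).save("textures/rock_4k.png")
```

The source art stays future-proof; only the export target changes as the hardware prediction firms up.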
Bigger developers have a direct line to NVIDIA and AMD and work with them directly, so they get drivers and new features much earlier than the general public and can already build for them even before release.
And yes, in the end, if a game takes much longer than expected, the graphics will be updated.