Here’s how game system requirements have evolved over 20 years, from 512MB to 16GB

PC game requirements often follow the evolution of console hardware capabilities, as developers release their games on multiple platforms.


Lately, with the advent of ready-to-go game engines such as Unity and Unreal Engine, gamers have started noticing that official system requirements no longer reflect a game’s graphics quality. At the same time, the cadence of new hardware releases – especially graphics cards – has slowed from yearly refreshes to two years or more between generations. Requirements keep climbing while rasterised graphics quality stagnates, or even regresses, demanding halo-class GPUs just to reach 60fps with all the eye candy enabled.

From early titles that ran on modest PCs to modern AAA games demanding multi-core CPUs and 8GB+ GPUs, not to mention copious amounts of SSD storage, let me take you down memory lane to see how these demands have evolved through the years, and whether they are warranted.

2006 to 2009

Higher-resolution textures, the first shader generations, and the beginning of multi-core.

Crysis.

Going into the mid-2000s, PC games targeted systems that would find it difficult to even run a browser nowadays. Titles such as The Elder Scrolls IV: Oblivion and Hitman: Blood Money demanded just a single-core Intel Pentium 4 CPU and 512MB or so of RAM. On the GPU side, many games were more than happy with anything that supported DirectX 9 (DX9), as long as it had about 128MB of VRAM. If you had a 512MB graphics card, you were golden. Nvidia’s GeForce 8800 GTX (2006) was a landmark GPU at the time, introducing unified shaders for the first time to PCs.

DirectX 9 opened the way for dynamic lighting and richer materials, which brought GPU specs to the forefront of game requirements. The most notable title of that era was undoubtedly Crysis, which pushed visual fidelity to new heights thanks to its detailed textures, shaders, and foliage. It was so demanding that contemporary PCs struggled to max out its graphics settings, spawning the enduring question: “Can it run Crysis?”

Minimum specs.

| Year | Game | Minimum CPU | Minimum GPU | RAM |
|---|---|---|---|---|
| 2006 | The Elder Scrolls IV: Oblivion | 2.0GHz Intel Pentium 4 | 128MB Direct3D-compatible GPU | 512MB |
| 2007 | Crysis | 2.8GHz Intel Pentium 4 | 256MB GeForce 6800 GT or Radeon 9800 Pro | 1GB |
| 2008 | Grand Theft Auto IV | 1.8GHz Intel Core 2 Duo | 256MB GeForce 7900 or Radeon X1900 | 1.5GB |
| 2009 | Batman: Arkham Asylum | 3.0GHz Intel Pentium 4 | 128MB GeForce 6600 or ATI Radeon X1300 | 1GB |

Nearing the end of the decade, developers started publishing specific minimum and recommended specs, naming exact CPU and GPU models instead of general guidance such as “a 128MB GPU” or “a 2GHz CPU”. But as long as you had mid-range hardware from the same year as the game, you could be pretty much certain of running it comfortably.

Hardware availability was also straightforward then: anyone – at least in the US or Europe – could buy a mid-range configuration at a moment’s notice and without extreme costs. Even the flagship GeForce 8800 GTX was available for $599 (roughly $900 in today’s money with inflation). Good luck getting an RTX 5090 for that amount!

Recommended specs.

| Year | Game | Recommended CPU | Recommended GPU | RAM |
|---|---|---|---|---|
| 2006 | The Elder Scrolls IV: Oblivion | 3.0GHz Intel Pentium 4 | 256MB GeForce 6800 | 1GB |
| 2007 | Crysis | 2.2GHz Intel Core 2 Duo | 512MB GeForce 7800 or Radeon X1800 | 2GB |
| 2008 | Grand Theft Auto IV | 2.2GHz Intel Core 2 Duo | 512MB GeForce 8600 or Radeon 3870 | 2GB |
| 2009 | Batman: Arkham Asylum | Intel Core 2 Duo E6600 | 512MB GeForce 9800 GTX or Radeon 3870 | 2GB |

2010 to 2014

Multi-threaded engines, richer physics, and DirectX 11.

Battlefield 3.

As DirectX 11 began to be implemented into game engines and multi-core CPUs became a commodity, developers started evolving their rendering pipelines to take advantage of the extra threads and API capabilities. Features such as tessellation – which was quite heavy at the time – offered greater geometric detail that made the game worlds feel more authentic and less flat. As a result, games started asking for quad-core CPUs, 4 to 8GB of RAM, and DX11 compatibility… something older GPUs lacked. I remember dragging my Intel Core 2 Quad Q6600 plus Radeon HD 3750 all the way to Battlefield 3 (2011), despite their age.

Minimum specs.

| Year | Game | Minimum CPU | Minimum GPU | RAM |
|---|---|---|---|---|
| 2010 | Metro 2033 | Core 2 Duo | 256MB GeForce 8800 | 1GB |
| 2011 | Battlefield 3 | 2.4GHz Core 2 Duo or 2.7GHz Athlon X2 | 512MB GeForce 8800 GT or Radeon 3870 | 2GB |
| 2012 | Borderlands 2 | 2.4GHz Dual-core | 256MB GeForce 8500 or ATI Radeon HD 2600 | 2GB |
| 2013 | BioShock Infinite | 2.4GHz Core 2 Duo or 2.7GHz Athlon X2 | 512MB GeForce 8800 GT or Radeon HD 3870 | 2GB |
| 2014 | Middle-earth: Shadow of Mordor | Core i5-750 or Phenom II X4 965 | 1GB GeForce GTX 460 or Radeon HD 5850 | 3GB |

Thankfully, many developers included DX10/10.1 rendering paths in their games, allowing users with older but still-capable systems to play them. To further accommodate lower-end hardware, developers also started providing easily applied presets such as Low or Medium, which let older dual-core systems run games without capping the graphics quality available to those with stronger rigs.
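Conceptually, a preset is just a label that expands into a bundle of individual engine settings. Here is a minimal sketch of that idea; the setting names and values are illustrative, not taken from any real engine.

```python
# Illustrative sketch: one preset label expands into many engine
# settings, so low-end users get a sane bundle in one click.
PRESETS = {
    "Low":    {"texture_quality": 0, "shadow_resolution": 512,  "tessellation": False},
    "Medium": {"texture_quality": 1, "shadow_resolution": 1024, "tessellation": False},
    "High":   {"texture_quality": 2, "shadow_resolution": 2048, "tessellation": True},
}

def apply_preset(name: str) -> dict:
    """Return a copy of the settings bundle a preset expands into."""
    return dict(PRESETS[name])

print(apply_preset("Low"))
```

The point of the indirection is that users never touch the individual knobs unless they want to, while enthusiasts can still override any single value after applying a preset.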

Overall, to attain a basic experience targeting between 30 and 60fps, you needed something like a 2GHz+ dual-core CPU, 2 to 4GB of RAM, and a mid-range GeForce GTX 460 or Radeon HD 5850 with 1GB of VRAM. Aside from some outliers, a 2.5GHz+ quad-core plus a GTX 570 were enough to max out the graphics in plenty of games.

Recommended specs.

| Year | Game | Recommended CPU | Recommended GPU | RAM |
|---|---|---|---|---|
| 2010 | Metro 2033 | Quad-core or 3GHz Dual-core | 512MB GeForce GTX 260 | 2GB |
| 2011 | Battlefield 3 | Quad-core | 1GB GeForce GTX 560 or Radeon HD 6950 | 4GB |
| 2012 | Borderlands 2 | 2.3GHz Quad-core | 512MB GeForce GTX 560 or Radeon HD 5850 | 2GB |
| 2013 | BioShock Infinite | Quad-core | 1GB GeForce GTX 560 or Radeon HD 6950 | 4GB |
| 2014 | Middle-earth: Shadow of Mordor | Core i7-3770 or FX-8350 | GeForce GTX 660 or Radeon HD 7950 | 8GB |

2015 to 2019

1080p becomes the baseline, open worlds and streaming expand memory needs.

The Witcher 3 Wild Hunt.

2015 saw the arrival of big open-world games such as The Witcher 3 and Fallout 4, which brought the visual target baseline to 1080p. Due to their larger maps and countless NPCs, not to mention improved graphics, these games began demanding faster CPU cores for AI and physics, and more GPU VRAM to store higher-definition textures. To avoid consuming too much VRAM and remain compatible with older mid-range hardware, developers implemented texture streaming, which continuously loaded and discarded assets as the player traversed the world.
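The core of texture streaming can be sketched very simply: keep assets near the player resident, evict everything else as the player moves. The sketch below is a hedged illustration of that load/discard loop; the `Asset` class, the radius value, and the 2D distance check are my own simplifications, not any engine’s API.

```python
# Illustrative distance-based streaming: assets within a radius of
# the player stay "loaded" (resident in VRAM), the rest are evicted.
import math

STREAM_RADIUS = 200.0  # world units within which textures stay loaded (assumed)

class Asset:
    def __init__(self, name, position):
        self.name = name
        self.position = position  # (x, y) world coordinates
        self.loaded = False

def update_streaming(player_pos, assets):
    """Load nearby assets and discard distant ones as the player moves."""
    for asset in assets:
        asset.loaded = math.dist(player_pos, asset.position) <= STREAM_RADIUS
    return [a.name for a in assets if a.loaded]

world = [Asset("rock_4k", (50, 0)), Asset("castle_4k", (500, 0))]
print(update_streaming((0, 0), world))    # ['rock_4k']
print(update_streaming((450, 0), world))  # ['castle_4k']
```

Real engines refine this with asynchronous loading, mipmap levels, and VRAM budgets, but the trade-off is the same: a smaller resident set fits old 2GB cards at the cost of occasional texture pop-in.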

Even so, games released closer to 2019 required 6 to 8GB of VRAM, as developers shifted attention to 1440p or even 4K gaming. Frame rates became a hotter topic as high-refresh displays grew widely available at reasonable prices, letting many gamers experience a smoothness previously unknown to them. Though 30fps remained the target for many console games, hardcore PC gamers wanted 60fps. As a result, mid-range GPUs were no longer enough, at least for those who wanted to maintain high graphics settings.

Minimum specs.

| Year | Game | Minimum CPU | Minimum GPU | RAM |
|---|---|---|---|---|
| 2015 | The Witcher 3: Wild Hunt | Core i5-2500K or Phenom II X4 940 | GeForce GTX 660 2GB or Radeon HD 7870 2GB | 6GB |
| 2016 | Battlefield 1 | Core i5-6600K or FX-6350 | GeForce GTX 660 2GB or Radeon HD 7850 2GB | 8GB |
| 2017 | Assassin’s Creed Origins | Core i5-2400S or FX-6350 | GeForce GTX 660 2GB or Radeon R9 270 2GB | 6GB |
| 2018 | Far Cry 5 | Core i5-2400 or FX-6300 | GeForce GTX 670 2GB or Radeon R9 270 2GB | 8GB |
| 2019 | Metro Exodus | Core i5-4440 | GeForce GTX 1050 2GB or Radeon HD 7870 2GB | 8GB |

With the launch of Red Dead Redemption 2, Assassin’s Creed Odyssey, and Metro Exodus, it quickly became clear that anything short of a flagship GPU didn’t have enough muscle, at least if you craved those AAA graphics. At the same time, hardware grew rapidly more expensive with each passing generation due to higher silicon manufacturing costs. For reference, the Nvidia GeForce GTX 980 Ti launched in 2015 at $650; three years later, the RTX 2080 Ti hit store shelves at $999. To be fair, the performance uplift was dramatic, but still, that was the price of a full mid-range PC. At least on the CPU side, things didn’t change much until 2018, as Intel was reluctant to move beyond quad-cores, and RAM was dirt cheap – not something we can say now.

Unsurprisingly, seeing the extra RAM and VRAM on hand, developers started packing larger texture sets, making 6 to 8GB graphics cards the norm. Many took this opportunity to target striking visual effects and realism and ended up creating outstanding worlds like Red Dead Redemption 2, which still look amazing even by today’s standards.

Recommended specs.

| Year | Game | Recommended CPU | Recommended GPU | RAM |
|---|---|---|---|---|
| 2015 | The Witcher 3: Wild Hunt | Core i7-3770 or FX-8350 | GeForce GTX 770 2GB or Radeon R9 290 4GB | 8GB |
| 2016 | Battlefield 1 | Core i7-4790 or FX-8350 | GeForce GTX 1060 3GB or Radeon RX 480 4GB | 16GB |
| 2017 | Assassin’s Creed Origins | Core i7-3770 or FX-8350 | GeForce GTX 760 4GB or Radeon R9 280X 3GB | 8GB |
| 2018 | Far Cry 5 | Core i7-3770 or FX-8350 | GeForce GTX 970 4GB or Radeon R9 290X 4GB | 8GB |
| 2019 | Metro Exodus | Core i7-4770K | GeForce RTX 2060 6GB or Radeon RX Vega 56 8GB | 8GB |

2020 to 2026

Ray tracing, AI upscaling, and mandatory SSD.

Cyberpunk 2077.

From 2020 onwards, the game industry split into two parallel trends, one focusing on good old rasterised rendering and another embracing the new and heavy real-time ray-traced rendering. Though the initial attempts at ray tracing showed little graphical improvement over well-designed screen-space and cube-map techniques, its qualities shine through today. Due to its revolutionary way of calculating lighting and reflections, ray tracing – and later path tracing – was and remains extremely demanding on GPUs. Furthermore, drawing a hard line in the sand, hardware-accelerated ray tracing is incompatible with graphics architectures older than Nvidia’s Turing and AMD’s RDNA 2.

At the same time, VR (virtual reality) games opened a whole new can of worms due to their wide field of view, binocular display (one frame per eye), low-latency requirements, and very high-resolution rendering. To reduce the need for super-powerful hardware, developers relied on clever techniques such as foveated rendering, which draws the parts of the image in the player’s peripheral vision at a lower resolution.
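The idea behind foveated rendering can be sketched as a per-tile resolution scale that falls off with distance from the gaze point. The thresholds and scale factors below are illustrative assumptions, not values from any headset SDK.

```python
# Hedged sketch of fixed foveated rendering: screen tiles farther
# from the gaze point are shaded at a coarser resolution scale.
import math

def shading_scale(tile_center, gaze, fovea_radius=0.15, mid_radius=0.35):
    """Return a resolution scale (1.0 = full) for a screen tile,
    given its normalised distance from the gaze point (0..1 screen space)."""
    d = math.dist(tile_center, gaze)
    if d <= fovea_radius:
        return 1.0   # full resolution where the eye is looking
    if d <= mid_radius:
        return 0.5   # half resolution in the mid-periphery
    return 0.25      # quarter resolution in the far periphery

print(shading_scale((0.5, 0.5), (0.5, 0.5)))  # 1.0  (dead centre)
print(shading_scale((0.9, 0.9), (0.5, 0.5)))  # 0.25 (far corner)
```

In fixed foveated rendering the gaze point is assumed to be the lens centre; eye-tracked headsets update it per frame, which is what makes the technique nearly invisible to the player.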

Minimum specs.

| Year | Game | Minimum CPU | Minimum GPU | RAM |
|---|---|---|---|---|
| 2020 | Cyberpunk 2077 | Core i7-6700 or Ryzen 5 1600 | GeForce GTX 1060 6GB or Radeon RX 580 8GB | 12GB |
| 2021 | Resident Evil Village | Core i5-7500 or Ryzen 3 1200 | GeForce GTX 1050 Ti 4GB or Radeon RX 560 4GB | 8GB |
| 2022 | Elden Ring | Core i5-8400 or Ryzen 3 3300X | GeForce GTX 1060 3GB or Radeon RX 580 4GB | 12GB |
| 2023 | Alan Wake 2 | Core i5-7600K or Ryzen 5 1600 | GeForce GTX 1070 8GB or Radeon RX 5600 XT 6GB | 16GB |
| 2024 | Suicide Squad: Kill the Justice League | Core i5-8400 or Ryzen 5 1600 | GeForce GTX 1070 8GB or Radeon RX Vega 56 8GB | 16GB |
| 2025 | Battlefield 6 | Core i5-8400 or Ryzen 5 2600 | GeForce RTX 2060 6GB or Radeon RX 5600 XT 6GB | 16GB |
| 2026 | Crimson Desert | Core i5-8500 or Ryzen 5 2600X | GeForce GTX 1060 6GB or Radeon RX 5500 XT | 16GB |

Be it VR, ray-traced, or rasterised, most AAA games began asking for at least 16GB of system RAM, with 24GB or more recommended for those running voice chat or content streaming apps in the background. Storage demands also saw a massive uptick as games routinely required 100GB+ of space, with examples like ARK: Survival Evolved gobbling up over 250GB. Titles such as Microsoft Flight Simulator can potentially occupy around 2TB through optional world data streaming and high-resolution textures and locations. At the same time, SSDs began to be specifically mentioned in system requirements, with some games adding dedicated options for slower HDDs to alleviate their weaknesses.

All of this meant that developers had to find a new way to present their games’ system requirements. Some chose to split the hardware list between ray-traced and rasterised rendering, while others hid the performance impact behind DLSS/FSR upscaling. The latter helped a lot, but it wasn’t a silver bullet. The best examples of such layouts remain Dying Light: The Beast and Star Wars Outlaws, which note every detail, from resolution and frame rate to the upscaling quality option and DLSS/FSR technology version.

Recommended specs.

| Year | Game | Recommended CPU | Recommended GPU | RAM |
|---|---|---|---|---|
| 2020 | Cyberpunk 2077 | Core i7-12700 or Ryzen 7 7800X3D | GeForce RTX 2060 Super 8GB or Radeon RX 5700 XT 8GB | 16GB |
| 2021 | Resident Evil Village | Core i7-8700 or Ryzen 5 3600 | GeForce GTX 1070 8GB or Radeon RX 5700 8GB | 16GB |
| 2022 | Elden Ring | Core i7-8700K or Ryzen 5 3600X | GeForce GTX 1070 8GB or Radeon RX Vega 56 8GB | 16GB |
| 2023 | Alan Wake 2 | Core i7-11700K or Ryzen 7 3700X | GeForce RTX 3060 8GB or Radeon RX 6600 XT 8GB | 16GB |
| 2024 | Suicide Squad: Kill the Justice League | Core i7-10700K or Ryzen 7 5800X3D | GeForce RTX 2080 8GB or Radeon RX 6800 XT 16GB | 16GB |
| 2025 | Battlefield 6 | Core i7-10700 or Ryzen 7 3700X | GeForce RTX 3060 Ti 8GB or Radeon RX 6700 XT 12GB | 16GB |
| 2026 | Crimson Desert | Core i5-11600K or Ryzen 5 5600 | GeForce RTX 2080 8GB or Radeon RX 6700 XT 12GB | 16GB |

System requirements don’t tell the whole story

Throughout all these periods, the same business and technical incentives have guided spec choices. In balancing marketing (flashy screenshots and trailers) against development constraints (the performance impact of those graphics), published requirements have often drifted noticeably from reality.

For example, Cyberpunk 2077 recommended a GeForce RTX 2060 Super 8GB, yet to experience its amazing path-traced graphics at an acceptable frame rate, you needed a GeForce RTX 4080, plus some help from DLSS upscaling. In fact, the game can be so demanding that AMD GPUs aren’t even an option with path tracing enabled. Other times, it’s simply a consequence of poor optimisation – looking at you, Starfield and ARK: Survival Evolved – where systems built to the official guidelines struggle to provide a stable experience.

Another discrepancy stems from how each developer defines minimum or recommended specs. For some, minimum means 30fps at low settings, while others add upscaling on top of that. The same goes for recommended specs: many argue it should mean 60fps or more at high settings, but that figure can easily be inflated by frame generation. This is why detailed system requirements that note frame rate and upscaling options are the best solution so far, and I urge all studios to adopt these more granular metrics.
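To make the argument concrete, a granular requirement entry could be modelled as a record that states every variable a spec depends on. The field names below are my own invention, not an official format used by any publisher.

```python
# Hypothetical model of a granular spec tier: each entry pins down
# resolution, target frame rate, preset, and upscaler, so "Minimum"
# is no longer ambiguous between 30fps native and 30fps upscaled.
from dataclasses import dataclass

@dataclass
class SpecTier:
    tier: str        # e.g. "Minimum", "Recommended"
    resolution: str  # e.g. "1080p", "1440p"
    target_fps: int
    preset: str      # e.g. "Low", "High"
    upscaler: str    # e.g. "None", "DLSS Quality", "FSR Balanced"

def describe(t: SpecTier) -> str:
    return (f"{t.tier}: {t.resolution} @ {t.target_fps}fps, "
            f"{t.preset} preset, upscaling: {t.upscaler}")

minimum = SpecTier("Minimum", "1080p", 30, "Low", "FSR Balanced")
print(describe(minimum))
# Minimum: 1080p @ 30fps, Low preset, upscaling: FSR Balanced
```

A table of such entries, one per tier and resolution, is essentially what the Dying Light: The Beast and Star Wars Outlaws requirement sheets already provide.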

At the end of the day, game publishers want the maximum number of users to buy and play their games, and for that, everything is allowed: imprecise specs, forced upscaling, and even the inclusion of frame generation in the fine print.

The future

With next-gen consoles rumoured to pack at least 8-core CPUs with 30GB of RAM and a brand-new ray-tracing-capable GPU, we can expect PC game specs to evolve from the current six-core CPUs plus 16GB of RAM and ~8GB VRAM to something equal or higher. New technology requirements, such as support for DirectStorage, neural rendering, and neural texture compression, may also become mandatory. 16GB of GPU VRAM will likely become standard, especially at 4K resolution, while more graphics horsepower will be needed to take advantage of upcoming 1,000Hz+ displays.

In short, expect the trajectory of system requirements to continue going up, probably at a faster rate than your wallet allows. The price of innovation, huh?

Fahd Temsamani
Senior Writer at Club386, his love for computers began with an IBM running MS-DOS, and he’s been pushing the limits of technology ever since. Known for his overclocking prowess, Fahd once unlocked an extra 1.1GHz from a humble Pentium E5300 - a feat that cemented his reputation as a master tinkerer. Fluent in English, Arabic, and French, his motto when building a new rig is ‘il ne faut rien laisser au hasard.’
