Lately, with the advent of ready-to-go game engines such as Unity and Unreal Engine, gamers have started noticing that official system requirements no longer reflect a game’s graphics quality. At the same time, the rate of release of new hardware – especially graphics cards – has slowed down from yearly refreshes to two years or more between each generation. Requirements continue to go up while rasterisation graphics quality stagnates, or even regresses, demanding halo-class GPUs just to be able to reach 60fps when enabling all the eye candy.
From early PC games that required simple PCs to modern AAA titles that demand multi-core CPUs with 8GB+ GPUs, not to mention copious amounts of SSD storage, let me take you down memory lane and see how these demands have evolved through the years, and if they are warranted.
2006 to 2009
Higher-resolution textures, the first shader generations, and the beginning of multi-core.

Going into the mid-2000s, PC games targeted systems that would find it difficult to even run a browser nowadays. Titles such as The Elder Scrolls IV: Oblivion and Hitman: Blood Money demanded just a single-core Intel Pentium 4 CPU and 512MB or so of RAM. On the GPU side, many games were more than happy with anything that supported DirectX 9 (DX9), as long as it had about 128MB of VRAM. If you had a 512MB graphics card, you were golden. Nvidia’s GeForce 8800 GTX (2006) was a landmark GPU at the time, introducing unified shaders for the first time to PCs.
DirectX 9 opened the way for dynamic lighting and richer materials, which brought GPU specs to the forefront of game requirements. The most notable title from that era was undoubtedly Crysis, which pushed visual fidelity to new heights thanks to its detailed textures, shaders, and foliage. It was so demanding that contemporary PCs struggled to max out its graphics settings, spawning the now-famous phrase “Can it run Crysis?”
Minimum specs.
| Year | Game | Minimum CPU | Minimum GPU | RAM |
| --- | --- | --- | --- | --- |
| 2006 | The Elder Scrolls IV: Oblivion | 2.0GHz Intel Pentium 4 | 128MB Direct3D-compatible GPU | 512MB |
| 2007 | Crysis | 2.8GHz Intel Pentium 4 | 256MB GeForce 6800 GT or Radeon 9800 Pro | 1GB |
| 2008 | Grand Theft Auto IV | 1.8GHz Intel Core 2 Duo | 256MB GeForce 7900 or Radeon X1900 | 1.5GB |
| 2009 | Batman: Arkham Asylum | 3.0GHz Intel Pentium 4 | 128MB GeForce 6600 or ATI Radeon X1300 | 1GB |
Nearing the end of the decade, developers started to publish specific minimum and recommended specs, naming exact CPU and GPU models instead of general guidance such as “a 128MB GPU” or “a 2GHz CPU”. But as long as you had mid-range hardware from the same year as the game, you could be fairly certain it would run comfortably.
Hardware availability was also straightforward then: anyone – at least in the US or Europe – could buy a mid-range configuration at a moment’s notice and without extreme costs. Even the flagship GeForce 8800 GTX was available for $599 (roughly $900 in today’s money with inflation). Good luck getting an RTX 5090 for that amount!
Recommended specs.
| Year | Game | Recommended CPU | Recommended GPU | RAM |
| --- | --- | --- | --- | --- |
| 2006 | The Elder Scrolls IV: Oblivion | 3.0GHz Intel Pentium 4 | 256MB GeForce 6800 | 1GB |
| 2007 | Crysis | 2.2GHz Intel Core 2 Duo | 512MB GeForce 7800 or Radeon X1800 | 2GB |
| 2008 | Grand Theft Auto IV | 2.2GHz Intel Core 2 Duo | 512MB GeForce 8600 or Radeon HD 3870 | 2GB |
| 2009 | Batman: Arkham Asylum | Intel Core 2 Duo E6600 | 512MB GeForce 9800 GTX or Radeon HD 3870 | 2GB |
2010 to 2014
Multi-threaded engines, richer physics, and DirectX 11.

As DirectX 11 began to be implemented into game engines and multi-core CPUs became a commodity, developers started evolving their rendering pipelines to take advantage of the extra threads and API capabilities. Features such as tessellation – which was quite heavy at the time – offered greater geometric detail that made the game worlds feel more authentic and less flat. As a result, games started asking for quad-core CPUs, 4 to 8GB of RAM, and DX11 compatibility… something older GPUs lacked. I remember dragging my Intel Core 2 Quad Q6600 plus Radeon HD 3750 all the way to Battlefield 3 (2011), despite their age.
Minimum specs.
| Year | Game | Minimum CPU | Minimum GPU | RAM |
| --- | --- | --- | --- | --- |
| 2010 | Metro 2033 | Core 2 Duo | 256MB GeForce 8800 | 1GB |
| 2011 | Battlefield 3 | 2.4GHz Core 2 Duo or 2.7GHz Athlon X2 | 512MB GeForce 8800 GT or Radeon HD 3870 | 2GB |
| 2012 | Borderlands 2 | 2.4GHz Dual-core | 256MB GeForce 8500 or ATI Radeon HD 2600 | 2GB |
| 2013 | BioShock Infinite | 2.4GHz Core 2 Duo or 2.7GHz Athlon X2 | 512MB GeForce 8800 GT or Radeon HD 3870 | 2GB |
| 2014 | Middle-earth: Shadow of Mordor | Core i5-750 or Phenom II X4 965 | 1GB GeForce GTX 460 or Radeon HD 5850 | 3GB |
Thankfully, many developers included DX10/10.1 rendering options in their games, which let users with older but still capable systems play them. To further accommodate lower-end hardware, developers also started providing easily applicable graphics presets such as Low or Medium, letting older dual-core systems run games while leaving the top-end settings to those who could afford the hardware.
Overall, to attain a basic experience targeting 30 to 60fps, you needed something like a 2GHz+ dual-core CPU, 2 to 4GB of RAM, and a mid-range GeForce GTX 460 or Radeon HD 5850 with 1GB of VRAM. Aside from some outliers, a 2.5GHz+ quad-core and a GTX 570 were enough to max out the graphics in plenty of games.
Recommended specs.
| Year | Game | Recommended CPU | Recommended GPU | RAM |
| --- | --- | --- | --- | --- |
| 2010 | Metro 2033 | Quad-core or 3GHz Dual-core | 512MB GeForce GTX 260 | 2GB |
| 2011 | Battlefield 3 | Quad‑core | 1GB GeForce GTX 560 or Radeon HD 6950 | 4GB |
| 2012 | Borderlands 2 | 2.3GHz Quad‑core | 512MB GeForce GTX 560 or Radeon HD 5850 | 2GB |
| 2013 | BioShock Infinite | Quad‑core | 1GB GeForce GTX 560 or Radeon HD 6950 | 4GB |
| 2014 | Middle-earth: Shadow of Mordor | Core i7-3770 or FX-8350 | GeForce GTX 660 or Radeon HD 7950 | 8GB |
2015 to 2019
1080p becomes the baseline, open worlds and streaming expand memory needs.

2015 saw the arrival of big open-world games such as The Witcher 3 and Fallout 4, which brought the visual target baseline to 1080p. Due to their larger maps and countless NPCs, not to mention improved graphics, these games began demanding faster CPU cores for the AI and physics, and more GPU VRAM to store higher-definition textures. To avoid consuming too much VRAM and remain compatible with older mid-range hardware, developers implemented texture streaming, which continuously loaded and discarded assets as the player traversed the world.
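The core idea behind texture streaming can be sketched in a few lines. This is a toy illustration with made-up tile names and distance thresholds, not any real engine’s streaming system: assets near the player stay resident at full resolution, distant ones drop to low-resolution mips or get evicted to cap VRAM use.

```python
# Illustrative distance-based texture streaming (hypothetical values):
# decide which resolution tier of each world tile to keep in VRAM.

from dataclasses import dataclass

@dataclass
class Tile:
    name: str
    distance: float  # distance from the player, in world units

def stream_textures(tiles, hi_res_radius=50.0, load_radius=150.0):
    """Return the resolution tier to keep resident for each tile."""
    budget = {}
    for tile in tiles:
        if tile.distance <= hi_res_radius:
            budget[tile.name] = "high"   # full-resolution mips resident
        elif tile.distance <= load_radius:
            budget[tile.name] = "low"    # only the small mip levels
        else:
            budget[tile.name] = "evict"  # free the VRAM entirely
    return budget

tiles = [Tile("town_square", 20.0), Tile("forest_edge", 90.0), Tile("far_ridge", 400.0)]
print(stream_textures(tiles))
# {'town_square': 'high', 'forest_edge': 'low', 'far_ridge': 'evict'}
```

Real engines run a loop like this every frame (with far more sophisticated priority heuristics), which is why fast storage later became part of the requirements story.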
Even so, modern games, especially nearing 2019, required 6 to 8GB of VRAM, as developers shifted attention to 1440p or even 4K gaming. Frame rates became a stronger topic as high refresh displays grew widely available at reasonable prices, letting many experience gameplay smoothness previously unknown. Though 30fps remained the target for many console games, hardcore PC gamers wanted 60fps. As a result, mid-range GPUs were no longer enough, at least for those who wanted to maintain high graphics settings.
Minimum specs.
| Year | Game | Minimum CPU | Minimum GPU | RAM |
| --- | --- | --- | --- | --- |
| 2015 | The Witcher 3: Wild Hunt | Core i5-2500K or Phenom II X4 940 | GeForce GTX 660 2GB or Radeon HD 7870 2GB | 6GB |
| 2016 | Battlefield 1 | Core i5 6600K or FX-6350 | GeForce GTX 660 2GB or Radeon HD 7850 2GB | 8GB |
| 2017 | Assassin’s Creed Origins | Core i5-2400S or FX-6350 | GeForce GTX 660 2GB or Radeon R9 270 2GB | 6GB |
| 2018 | Far Cry 5 | Core i5-2400 or FX-6300 | GeForce GTX 670 2GB or Radeon R9 270 2GB | 8GB |
| 2019 | Metro Exodus | Core i5-4440 | GeForce GTX 1050 2GB or Radeon HD 7870 2GB | 8GB |
With the launch of Red Dead Redemption 2, Assassin’s Creed Odyssey, and Metro Exodus, it quickly became clear that anything short of a flagship GPU didn’t have enough muscle, at least if you craved those AAA graphics. At the same time, hardware grew rapidly more expensive with each passing generation due to higher silicon manufacturing costs. For reference, the Nvidia GeForce GTX 980 Ti launched in 2015 at $650; three years later, the RTX 2080 Ti hit store shelves at $999. To be fair, the performance uplift was dramatic, but still, that was the price of a full mid-range PC. At least on the CPU side, things didn’t change much until 2018, as Intel was reluctant to move past quad-cores, and RAM was dirt cheap – which is not something we can say now.
Unsurprisingly, seeing the extra RAM and VRAM on hand, developers started packing larger texture sets, making 6 to 8GB graphics cards the norm. Many took this opportunity to target striking visual effects and realism and ended up creating outstanding worlds like Red Dead Redemption 2, which still look amazing even by today’s standards.
Recommended specs.
| Year | Game | Recommended CPU | Recommended GPU | RAM |
| --- | --- | --- | --- | --- |
| 2015 | The Witcher 3: Wild Hunt | Core i7-3770 or FX-8350 | GeForce GTX 770 2GB or Radeon R9 290 4GB | 8GB |
| 2016 | Battlefield 1 | Core i7-4790 or FX-8350 | GeForce GTX 1060 3GB or Radeon RX 480 4GB | 16GB |
| 2017 | Assassin’s Creed Origins | Core i7-3770 or FX-8350 | GeForce GTX 760 4GB or Radeon R9 280X 3GB | 8GB |
| 2018 | Far Cry 5 | Core i7-3770 or FX-8350 | GeForce GTX 970 4GB or Radeon R9 290X 4GB | 8GB |
| 2019 | Metro Exodus | Core i7-4770K | GeForce RTX 2060 6GB or Radeon RX Vega 56 8GB | 8GB |
2020 to 2026
Ray tracing, AI upscaling, and mandatory SSD.

From 2020 onwards, the game industry split into two parallel trends, one sticking with good old rasterised rendering and another embracing new, heavy real-time ray tracing. Though the initial attempts at ray tracing showed little graphical improvement over well-designed screen-space and cube-map techniques, its qualities shine through today. Due to its fundamentally different way of calculating lighting and reflections, ray tracing – and later path tracing – was and remains extremely demanding on GPUs. Furthermore, drawing a hard line in the sand, hardware-accelerated ray tracing is incompatible with graphics architectures older than Nvidia’s Turing and AMD’s RDNA 2.
At the same time, VR (Virtual Reality) games opened a whole can of worms due to their wide field of view, binocular display (one frame per eye), low-latency requirements, and very high rendering resolutions. To reduce the need for super-powerful hardware, developers relied on clever techniques such as foveated rendering, which renders the parts of the scene in the player’s peripheral vision at a lower resolution.
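A minimal sketch of the fixed-foveation idea, with assumed angles and falloff values rather than any real headset’s numbers, is a resolution multiplier that ramps down with the angular distance between a screen region and the gaze point:

```python
# Illustrative fixed foveated rendering (hypothetical parameters):
# regions near the gaze render at full resolution, the periphery at less.

def foveation_scale(angle_deg, fovea_deg=10.0, min_scale=0.25):
    """Resolution multiplier for a region `angle_deg` away from the gaze.

    Inside the foveal region the image renders at full resolution; beyond
    it the scale ramps down linearly to `min_scale` at 60 degrees.
    """
    if angle_deg <= fovea_deg:
        return 1.0
    t = min((angle_deg - fovea_deg) / (60.0 - fovea_deg), 1.0)
    return 1.0 - t * (1.0 - min_scale)

# Centre of gaze renders at 100%, the far periphery at only 25% –
# a large saving, since pixel cost scales with resolution.
for angle in (0, 10, 35, 60, 90):
    print(angle, foveation_scale(angle))
```

Because the eye resolves far less detail off-axis, this trade is nearly invisible to the player while cutting the pixel workload dramatically.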
Minimum specs.
| Year | Game | Minimum CPU | Minimum GPU | RAM |
| --- | --- | --- | --- | --- |
| 2020 | Cyberpunk 2077 | Core i7-6700 or Ryzen 5 1600 | GeForce GTX 1060 6GB or Radeon RX 580 8GB | 12GB |
| 2021 | Resident Evil Village | Core i5-7500 or Ryzen 3 1200 | GeForce GTX 1050 Ti 4GB or Radeon RX 560 4GB | 8GB |
| 2022 | Elden Ring | Core i5-8400 or Ryzen 3 3300X | GeForce GTX 1060 3GB or Radeon RX 580 4GB | 12GB |
| 2023 | Alan Wake 2 | Core i5-7600K or Ryzen 5 1600 | GeForce GTX 1070 8GB or Radeon RX 5600 XT 6GB | 16GB |
| 2024 | Suicide Squad: Kill the Justice League | Core i5-8400 or Ryzen 5 1600 | GeForce GTX 1070 8GB or Radeon RX Vega 56 8GB | 16GB |
| 2025 | Battlefield 6 | Core i5-8400 or Ryzen 5 2600 | GeForce RTX 2060 6GB or Radeon RX 5600 XT 6GB | 16GB |
| 2026 | Crimson Desert | Core i5-8500 or Ryzen 5 2600X | GeForce GTX 1060 6GB or Radeon RX 5500 XT | 16GB |
Be it VR, ray-traced, or rasterised, most AAA games began asking for at least 16GB of system RAM, with 24GB or more recommended for those running voice chat or content streaming apps in the background. Storage demand also saw a massive uptick as games routinely required 100GB+ of space, with examples like ARK: Survival Evolved gobbling up more than 250GB. Titles such as Microsoft Flight Simulator could potentially take up around 2TB through optional world-data streaming and high-resolution textures and locations. At the same time, SSDs started to be explicitly mentioned in system requirements, with some games adding dedicated options for slower HDDs to alleviate their weaknesses.
All of this meant that developers had to find a new way to display their games’ system requirements. Some chose to split the hardware list between ray-traced and rasterised rendering, while others hid the performance impact behind DLSS/FSR upscaling. The latter helped a lot, but it wasn’t a silver bullet. Some of the best examples of such system requirement layouts remain Dying Light: The Beast and Star Wars Outlaws, which note every detail, from resolution and frame rate to the upscaling quality option and DLSS/FSR technology version.
Recommended specs.
| Year | Game | Recommended CPU | Recommended GPU | RAM |
| --- | --- | --- | --- | --- |
| 2020 | Cyberpunk 2077 | Core i7-12700 or Ryzen 7 7800X3D | GeForce RTX 2060 Super 8GB or Radeon RX 5700 XT 8GB | 16GB |
| 2021 | Resident Evil Village | Core i7-8700 or Ryzen 5 3600 | GeForce GTX 1070 8GB or Radeon RX 5700 8GB | 16GB |
| 2022 | Elden Ring | Core i7-8700K or Ryzen 5 3600X | GeForce GTX 1070 8GB or Radeon RX Vega 56 8GB | 16GB |
| 2023 | Alan Wake 2 | Core i7-11700K or Ryzen 7 3700X | GeForce RTX 3060 8GB or Radeon RX 6600 XT 8GB | 16GB |
| 2024 | Suicide Squad: Kill the Justice League | Core i7-10700K or Ryzen 7 5800X3D | GeForce RTX 2080 8GB or Radeon RX 6800 XT 16GB | 16GB |
| 2025 | Battlefield 6 | Core i7-10700 or Ryzen 7 3700X | GeForce RTX 3060 Ti 8GB or Radeon RX 6700 XT 12GB | 16GB |
| 2026 | Crimson Desert | Core i5-11600K or Ryzen 5 5600 | GeForce RTX 2080 8GB or Radeon RX 6700 XT 12GB | 16GB |
System requirements don’t tell the whole story
Throughout all these periods, the same business and technical incentives have guided spec choices. In their attempt to balance marketing (flashy screenshots and trailers) against development constraints (the performance impact of graphics), publishers have let the published requirements drift noticeably from reality.
For example, Cyberpunk 2077 recommended a GeForce RTX 2060 Super 8GB, yet to experience its amazing path-traced graphics at an acceptable frame rate, you needed a GeForce RTX 4080, plus some help from DLSS upscaling. In fact, this game can be so demanding that AMD GPUs are not even an option with path tracing. Other times, it’s just a consequence of poor game optimisation – looking at you, Starfield and ARK: Survival Evolved – where systems built according to the official guidelines struggle to provide a stable experience.
Another discrepancy stems from how each developer defines minimum or recommended specs. For some, minimum specs mean 30fps at low settings, while others add upscaling on top of that. The same goes for recommended specs, where many argue they should deliver 60fps or more at high settings, but that figure can easily be inflated by frame generation. This is why detailed system requirements that note frame rate and upscaling options are the best solution so far. I urge all studios to use these more granular metrics when describing games.
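To make the argument concrete, a granular spec entry of the kind advocated above could be represented as structured data. The field names here are a hypothetical layout (the hardware values are taken from the Battlefield 6 rows in the tables above); no real storefront uses exactly this schema:

```python
# Hypothetical data shape for granular system requirements: each tier
# discloses resolution, frame-rate target, preset, and upscaling use.

requirement_tiers = [
    {
        "tier": "Minimum",
        "resolution": "1080p",
        "target_fps": 30,
        "preset": "Low",
        "upscaler": "None",              # native rendering, no DLSS/FSR
        "cpu": "Core i5-8400 / Ryzen 5 2600",
        "gpu": "RTX 2060 6GB / RX 5600 XT 6GB",
    },
    {
        "tier": "Recommended",
        "resolution": "1440p",
        "target_fps": 60,
        "preset": "High",
        "upscaler": "DLSS/FSR Quality",  # upscaling explicitly disclosed
        "cpu": "Core i7-10700 / Ryzen 7 3700X",
        "gpu": "RTX 3060 Ti 8GB / RX 6700 XT 12GB",
    },
]

def describe(tier):
    """Render one tier as a single human-readable line."""
    return (f"{tier['tier']}: {tier['resolution']} @ {tier['target_fps']}fps, "
            f"{tier['preset']} preset, upscaling: {tier['upscaler']}")

for tier in requirement_tiers:
    print(describe(tier))
```

A layout like this leaves no room for the ambiguity described above: the reader can see at a glance whether a quoted frame rate is native or upscaled.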
At the end of the day, game publishers want the maximum number of users to buy and play their games, and for that, everything is allowed: imprecise specs, forced upscaling, and even the inclusion of frame generation in the fine print.
The future
With next-gen consoles rumoured to pack at least 8-core CPUs with 30GB of RAM and a brand-new ray-tracing-capable GPU, we can expect PC game specs to evolve from the current six-core CPUs plus 16GB of RAM and ~8GB VRAM to something equal or higher. New technology requirements, such as support for DirectStorage, Neural Rendering, and Neural Texture Compression, may also become mandatory. 16GB of GPU VRAM will likely become standard, especially at 4K resolution, while more graphics horsepower will be needed to take advantage of upcoming 1,000Hz+ displays.
In short, expect the trajectory of system requirements to continue going up, probably at a faster rate than your wallet allows. The price of innovation, huh?

