Back in July 2025, a 32 GB kit of DDR5-6000 CL30 could be had for under $90 during sales. Today, that same kit runs $480 on Amazon. If you want RGB or a name-brand kit from Corsair or G.Skill, you are paying even more. The so-called RAMpocalypse, driven by AI data center demand hoovering up DRAM supply, has made memory one of the most painful line items in any PC build right now.
So here is the question that actually matters for gamers: if you settle for slower, cheaper DDR5-4800 instead of the recommended DDR5-6000, does your gaming experience take a real hit?
PC Gamer hardware writer Nick Evanson ran a structured test across seven games to find out, using a rig built around an AMD Ryzen 9 9900X and an Nvidia GeForce RTX 5090. The memory comparison was Lexar Thor OC DDR5-6000 CL32 versus the same kit running without EXPO enabled, which drops it to DDR5-4800 CL40. That slower setting reflects what the cheaper kits available right now actually offer.
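On paper, that downclock costs both bandwidth and latency. As a quick sanity check, here is a minimal sketch of the standard first-word latency conversion (CAS cycles multiplied by cycle time), using the two configurations from the test:

```python
# First-word latency in nanoseconds: CAS cycles * cycle time.
# DDR transfers twice per clock, so cycle time (ns) = 2000 / (MT/s).
def ddr_latency_ns(transfer_rate_mts: int, cas_latency: int) -> float:
    return cas_latency * 2000 / transfer_rate_mts

kits = {
    "DDR5-6000 CL32 (EXPO on) ": (6000, 32),
    "DDR5-4800 CL40 (EXPO off)": (4800, 40),
}
for name, (mts, cl) in kits.items():
    print(f"{name}: {ddr_latency_ns(mts, cl):.2f} ns")
# DDR5-6000 CL32 (EXPO on) : 10.67 ns
# DDR5-4800 CL40 (EXPO off): 16.67 ns
```

That works out to 20% less raw transfer rate and roughly 56% higher first-word latency, which helps explain why the gap in games is not uniform across workloads.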
The games that did not care at all
Three games were tested specifically because they were expected to show no meaningful difference: Cyberpunk 2077 at RT Ultra, Black Myth: Wukong at Cinematic, and Stalker 2 at Epic settings.
The results matched expectations almost perfectly. At 1080p, Cyberpunk 2077 actually posted fractionally higher average frames with the slower memory (114 fps vs 112 fps), which falls well within normal run-to-run variance. Black Myth: Wukong was essentially identical across all three resolutions. Stalker 2 showed a 3 fps gap in its 1080p average, but that is within the 5% margin of error that any gameplay-based test carries.
The reason is straightforward: these are GPU-heavy titles. With the RTX 5090 running at maximum settings, the graphics card is the bottleneck. The CPU and system RAM are not even close to being the limiting factor, so swapping memory speeds changes nothing measurable.
Note: any frame rate difference of 5% or less in gameplay-based testing should be treated as noise, not signal. Built-in benchmarks repeat the same code path on every run; real gameplay does not.
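To make that rule of thumb concrete, here is a minimal sketch of the check in Python; the first pair of numbers is the Cyberpunk 2077 result above, while the second pair is purely illustrative:

```python
# Classify a benchmark delta as signal or noise with a 5% cutoff,
# the margin of error assumed here for gameplay-based test runs.
def classify(baseline_fps: float, test_fps: float, threshold: float = 0.05) -> str:
    delta = (test_fps - baseline_fps) / baseline_fps
    return f"{delta:+.1%} -> {'signal' if abs(delta) > threshold else 'noise'}"

# Cyberpunk 2077, 1080p: DDR5-6000 averaged 112 fps, DDR5-4800 averaged 114.
print(classify(112, 114))  # +1.8% -> noise
# An illustrative larger gap for comparison (not a measured result):
print(classify(120, 100))  # -16.7% -> signal
```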
Where things got slightly interesting
Counter-Strike 2 and Microsoft Flight Simulator 2024 represent two very different ways a game can become sensitive to memory speed.
CS2, running at Very High settings, hit 392 average fps with DDR5-6000 at 1080p, compared to 363 fps with DDR5-4800. That is roughly a 7% gap. At 1440p and 4K, the difference shrinks to almost nothing because the GPU starts doing more of the work. The key point is that competitive players running 1080p at low settings to chase maximum frame rates will feel this more, not less, than these Very High numbers suggest: lower quality settings push more of the workload onto the CPU and RAM.
Microsoft Flight Simulator 2024 was messier. The game's performance is notoriously inconsistent even on top-end hardware, but there was at least some evidence of a speed difference across resolutions. MSFS 24 carries an enormous CPU workload simulating weather, air traffic, and terrain streaming simultaneously, so it makes sense that memory bandwidth would show up here.
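The bandwidth side of that argument is easy to put numbers on. A rough sketch, assuming a typical dual-channel setup (128 bits, or 8 bytes per channel per transfer); these are theoretical peaks, not sustained figures:

```python
# Theoretical peak bandwidth: MT/s * 8 bytes per transfer * channels.
def peak_bandwidth_gbs(transfer_rate_mts: int, channels: int = 2) -> float:
    return transfer_rate_mts * 8 * channels / 1000

print(f"DDR5-6000: {peak_bandwidth_gbs(6000):5.1f} GB/s")  # 96.0 GB/s
print(f"DDR5-4800: {peak_bandwidth_gbs(4800):5.1f} GB/s")  # 76.8 GB/s
```

The drop from 96 GB/s to 76.8 GB/s is the 20% of headroom that streaming-heavy workloads have to live without.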
The two titles where slow RAM actually showed up
Spider-Man Remastered and Hogwarts Legacy are both games that constantly stream data from system memory, and both showed a real performance gap between the two DRAM speeds.
With the RTX 5090, Spider-Man Remastered dropped from 117 to 106 average fps at 1080p with DDR5-4800. The 1% low figures also fell noticeably, from 72 to 66. Across 1440p and 4K, the game remained CPU-limited (the resolution change barely moved the frame rate), and the slower memory continued to produce lower 1% lows throughout.
Hogwarts Legacy was the most dramatic case. At 1080p Ultra with the RTX 5090, average fps dropped from 183 to 154 when switching to DDR5-4800. That is a 16% reduction. The 1% lows fell from 78 to 65.
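Since the 1% low figures are doing a lot of work in these two games, it is worth being precise about what they measure. Definitions vary between capture tools, but a common one derives the number from the slowest 1% of frame times; a minimal sketch with synthetic data:

```python
import random

# 1% low fps, computed here as the fps implied by the mean of the
# slowest 1% of frame times. Tools differ; this is one common definition.
def one_percent_low(frame_times_ms: list[float]) -> float:
    worst = sorted(frame_times_ms, reverse=True)
    slice_len = max(1, len(worst) // 100)
    avg_worst_ms = sum(worst[:slice_len]) / slice_len
    return 1000 / avg_worst_ms

# Synthetic trace: mostly ~6 ms frames with occasional ~14 ms hitches.
random.seed(0)
trace = [random.gauss(6.0, 0.5) for _ in range(5000)]
trace += [random.gauss(14.0, 1.0) for _ in range(50)]

print(f"average fps: {len(trace) * 1000 / sum(trace):.0f}")  # ~164
print(f"1% low fps:  {one_percent_low(trace):.0f}")          # ~71
```

The point of the metric: a trace can average well over 160 fps and still hitch visibly, and it is exactly this figure that slower memory dragged down in both games.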
To isolate the effect more cleanly, the test was repeated with an RTX 3060 Ti at 1080p across all quality presets.
At Low and Medium presets, where the GPU has spare capacity and the CPU and RAM become the bottleneck, the slower memory causes a real drop in minimum performance. At High and Ultra, the GPU takes over completely and the memory speed becomes irrelevant again.
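That crossover falls out of the usual bottleneck model: delivered frame rate is roughly the minimum of what the CPU-plus-memory side and the GPU side can each sustain. A toy sketch; every number in it is illustrative, not a measurement:

```python
# Toy bottleneck model: delivered fps ~= min(cpu-side fps, gpu-side fps).
# CPU-side throughput depends on memory speed; GPU-side on the preset.
# All numbers below are illustrative, not measured values.
cpu_fps = {"DDR5-6000": 160, "DDR5-4800": 135}
gpu_fps = {"Low": 240, "Medium": 190, "High": 120, "Ultra": 90}

for preset, g in gpu_fps.items():
    fast = min(cpu_fps["DDR5-6000"], g)
    slow = min(cpu_fps["DDR5-4800"], g)
    verdict = "RAM matters" if fast != slow else "GPU-bound"
    print(f"{preset:6s}: {fast:3d} vs {slow:3d} fps  ({verdict})")
```

Once the GPU-side number drops below what even the slow-memory configuration can feed it, the two setups converge, which is exactly the High and Ultra behavior described above.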
What this means for most PC gamers
For the vast majority of gaming scenarios, slow DDR5 is not going to ruin anything. If your GPU is working hard, which it will be at 1440p or 4K with modern settings, the memory speed simply does not matter enough to feel in practice.
The exceptions are real but specific. Competitive players grinding CS2 at 1080p low settings, or anyone playing open-world titles that stream large amounts of data like Spider-Man or Hogwarts Legacy, will see a tangible difference. In Hogwarts Legacy specifically, the 1% low performance dropped by up to 20% with DDR5-4800 under CPU-limited conditions.
Here's the thing: a 32 GB kit of DDR5-4800 from a lesser-known brand can still cost $370 right now. That is still a painful amount of money for memory that carries some performance compromises. Saving money during the current crisis is completely reasonable, but the savings are not as dramatic as they were even a year ago.
For a broader look at hardware buying decisions right now, browse more guides to stay on top of what is actually worth buying in this market. And if you want to see how current GPUs perform before committing to a build, the latest reviews cover the full range of current options.