AMD Ryzen 5 2400G Memory Speed Performance Analysis

Manufacturer: AMD

Memory Matters

Memory speed is not a factor that the average gamer thinks about when building a PC. For the most part, memory performance hasn't had much of an effect on modern processors running high-speed memory such as DDR3 and DDR4.

With last year's launch of AMD's Ryzen processors, a platform emerged that was more sensitive to memory speed. By running Ryzen processors with higher-frequency, lower-latency memory, users can see significant performance improvements, especially in 1080p gaming scenarios.

However, the Ryzen processors are not the only ones to exhibit this behavior.

Gaming on integrated GPUs is a perfect example of a memory-starved situation. Take for instance the new AMD Ryzen 5 2400G and its Vega-based GPU cores. On a full Vega 56 or 64 card, these Vega cores utilize blazingly fast HBM2 memory. However, due to constraints such as die space and cost, this processor does not integrate HBM.


Instead, the CPU portion and the graphics portion of the APU both depend on the same pool of DDR4 system memory. DDR4 is significantly slower than the memory traditionally found on graphics cards, such as GDDR5 or HBM. As a result, APU performance is usually memory limited to some extent.
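To put that gap in perspective, theoretical peak bandwidth is just transfer rate times bus width times channel count. A quick sketch of the arithmetic (the function name is ours; the Vega 56 figure quoted in the comment is the card's published HBM2 bandwidth, included only for comparison):

```python
# Peak DDR4 bandwidth: MT/s x 8 bytes per 64-bit channel x channel count.
def ddr4_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    """Theoretical peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

for speed in (2400, 2933, 3200):
    print(f"DDR4-{speed}, dual channel: {ddr4_bandwidth_gbs(speed):.1f} GB/s")
# DDR4-2400 -> 38.4 GB/s, DDR4-2933 -> 46.9 GB/s, DDR4-3200 -> 51.2 GB/s.
# For comparison, RX Vega 56's HBM2 delivers roughly 410 GB/s.
```

Even the fastest DDR4 configuration tested here offers roughly an eighth of the bandwidth a discrete Vega card's HBM2 provides, which is why every extra megatransfer matters to the integrated GPU.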

In the past, we've done memory speed testing with AMD's older APUs. With the launch of the new Ryzen and Vega-based R3 2200G and R5 2400G, we decided to take another look at this topic.

For our testing, we ran the Ryzen 5 2400G at three different memory speeds: 2400 MHz, 2933 MHz, and 3200 MHz. While the maximum supported JEDEC memory speed for the R5 2400G is 2933 MHz, the memory AMD provided for our processor review overclocks to 3200 MHz just fine.

Continue reading our look at memory speed scaling with the Ryzen 5 2400G!

Synthetic Benchmarks


Both of our synthetic graphics benchmarks show fairly linear scaling between memory frequency and graphics performance.

The increase from DDR4-2400 to DDR4-2933 produced 8% and 6% gains in 3DMark and Unigine Superposition, respectively. Moving to DDR4-3200 saw impressive scaling of 13% and 11% over the slower DDR4-2400.



Just as we saw in our synthetic benchmarks, real-world gaming examples show scaling of 10-15% moving from DDR4-2400 to DDR4-3200. DDR4-2933 provides a 5-7% performance boost compared to DDR4-2400.
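For readers reproducing these figures, the uplift percentages are computed against the slowest configuration. A minimal sketch of the arithmetic; the FPS values below are hypothetical placeholders, not our measured data:

```python
def pct_uplift(baseline_fps: float, faster_fps: float) -> float:
    """Percentage gain of faster_fps over baseline_fps."""
    return (faster_fps - baseline_fps) / baseline_fps * 100

# Hypothetical averages, for illustration only:
fps_2400, fps_2933, fps_3200 = 40.0, 42.4, 45.6
print(f"DDR4-2933 vs 2400: +{pct_uplift(fps_2400, fps_2933):.1f}%")  # +6.0%
print(f"DDR4-3200 vs 2400: +{pct_uplift(fps_2400, fps_3200):.1f}%")  # +14.0%
```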

Single Channel vs Dual Channel

Of course, when we are talking about memory performance, especially with APUs, running one DIMM in a single-channel configuration vs. two in a dual-channel configuration is a big topic.


While lower-end system builders can get away with the usually cheaper single-DIMM option on other platforms, that's simply not the case for gamers looking to really take advantage of the graphics on their R5 2400G.
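The penalty is easy to see on paper: a single DIMM populates only one of the platform's two 64-bit memory channels, halving theoretical peak bandwidth. A sketch using theoretical figures (real-world gaming losses are smaller than 2x, since workloads are not purely bandwidth-bound):

```python
# One DIMM = one 64-bit channel; two DIMMs (one per channel) = 128 bits total.
def peak_gbs(mt_per_s: int, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

single = peak_gbs(2933, channels=1)  # 23.5 GB/s
dual = peak_gbs(2933, channels=2)    # 46.9 GB/s
print(f"single: {single:.1f} GB/s, dual: {dual:.1f} GB/s")
```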


Historically, we've seen reasonably linear scaling of gaming frame rates with memory speed increases on AMD APUs, and that doesn't change with the new Ryzen and Vega-based parts. The increased baseline memory support of DDR4-2933 on the Raven Ridge platform lets all users reach higher memory clocks without having to overclock RAM or depend on motherboard compatibility with particular memory speeds.

We also tested allocating additional memory to the GPU within the BIOS. Going from the default allocation of 1GB to the maximum available 2GB saw no performance advantage; it's clear that the Raven Ridge platform can allocate additional system memory as needed while gaming. However, allocating more memory in the BIOS can be useful for games that check video memory, either when launching the executable or when enabling higher visual quality settings.

Currently, with highly inflated prices for all DDR4 memory, a 2x8GB kit of DDR4-2400 will run you about $160. For around $20 more, you can get a kit of the same capacity that supports DDR4-3200. For users looking to game on the integrated GPU of the Ryzen 3 2200G or Ryzen 5 2400G, opting for the faster memory seems like a no-brainer.

Review Terms and Disclosure
All Information as of the Date of Publication
How product was obtained: The product is on loan from AMD for the purpose of this review.
What happens to product after review: The product remains the property of AMD but is on extended loan for future testing and product comparisons.
Company involvement: AMD had no control over the content of the review and was not consulted prior to publication.
PC Perspective Compensation: Neither PC Perspective nor any of its staff were paid or compensated in any way by AMD for this review.
Advertising Disclosure: AMD has purchased advertising at PC Perspective during the past twelve months.
Affiliate links: This article contains affiliate links to online retailers. PC Perspective may receive compensation for purchases through those links.
Consulting Disclosure: AMD is a current client of Shrout Research for products or services related to this review. 


February 19, 2018 | 04:23 PM - Posted by djotter

Did the 2200G scale the same?

February 19, 2018 | 04:25 PM - Posted by Kristin Rich (not verified)

Shame the original review wasn't carried out with DDR4-3200, when there is only a $20 premium.

February 19, 2018 | 04:51 PM - Posted by willmore

Agreed. From both a performance and a value perspective, it makes sense.

February 19, 2018 | 04:57 PM - Posted by belloc

Yeah for only 20 bucks they should definitely update their reviews. I mean what is the price difference between the Intel and AM4 motherboard, I'm sure it's more than 20 bucks.

February 20, 2018 | 05:42 PM - Posted by VintageDude (not verified)

20 bucks more, No Way. The link brings you to memory that costs $240. That is crazy for this low end apu.

February 20, 2018 | 05:43 PM - Posted by VintageDude (not verified)

Sorry that's $255.00 dollars!

February 20, 2018 | 05:47 PM - Posted by VintageDude (not verified)

Sorry again, just saw it was sold prior for 169. geeze, I missed a great deal, I just ordered the 2200g apu.

February 19, 2018 | 04:49 PM - Posted by JP (not verified)

What's the hash rate of the GPU? gotta make that dolla.

February 20, 2018 | 03:33 AM - Posted by aparsh335i (not verified)

At launch reports are around 270mh combined CPU+GPU, which is pretty good because you're getting about half of what you get out of a Ryzen 1700 w/ OC for about half the price. I'm hoping that as time goes by there will be some updates between mining software and bios that will get this up toward the 350-400 area for 2400g and 300 area for 2200g. That would be nice :)

February 20, 2018 | 03:34 AM - Posted by aparsh335i (not verified)

Sorry, 270mh for 2400g and about 170mh for 2200g *

February 19, 2018 | 05:13 PM - Posted by pdjblum

seems just wrong and disingenuous to have used 2400 mem speeds in the initial review

seems as if you guys were trying to show the new apu's in the worst light possible

shame on ryan and this site

February 19, 2018 | 05:20 PM - Posted by Ryan Shrout

We were definitely not trying to show any product in any kind of specific light. I believed at the time, and still stand by, that testing everything at 2400 MHz was the right thing to do. We wanted to include memory scaling in the first story, just ran out of time. This follow-up was the best option for us.

Memory pricing changes, and we can't be expected to adjust for that on-the-fly 100% of the time.

I think our review and thoughts and opinions on the new Ryzen with Vega graphics were incredibly positive.

February 19, 2018 | 08:30 PM - Posted by Marees (not verified)

I think you should also have included ddr4-2133 ram, for completeness sake

February 19, 2018 | 09:02 PM - Posted by AdamR (not verified)

..and that is the reason you should have tested with faster RAM. Your justification was pricing of RAM but then you say that pricing of RAM changes all the time and you cant be expected to adjust. For just $20 more the justification for just purchasing an APU instead of a discrete 1030 could be made as well.

February 19, 2018 | 11:48 PM - Posted by Ryan Shrout

I don't disagree that memory speed changes performance, hence today's story.

However, I think we made the argument that the new APU made the use of a GT 1030 significantly less appealing as it is.

February 20, 2018 | 05:49 AM - Posted by Anonymous - ok anon is taken (not verified)

Problem is that when you show something in a bad light at first then that impression sticks. No amount of “but it’s so much faster if you do this” changes the initial impression you created.

Take the time to be complete for the initial review. Be complete. Be credible.

February 20, 2018 | 06:52 AM - Posted by odizzido2 (not verified)

It does for me. I also think that the original fury cards are pretty decent these days as they have aged very well. Way better than Nvidia's stuff.

I do however agree with you that many people will not include this ram speed test in their opinion of these APUs.

February 20, 2018 | 10:17 PM - Posted by Myopic

Gee dude, troll much? At no time in the many years of reading this site have Ryan and his crew done anything but credible work. If they make a mistake, they own it. If they run out of time, you might want to look at how much competition there is to write about the same topic.
You hammer on "it's only $20 difference" when that's not the point at all. It's about the time they have to test and get things written so we, the readers they serve (quite well imo), can get the info and make the decisions on whether to try out the new stuff.

February 20, 2018 | 08:29 AM - Posted by Anonymous2 (not verified)

In PCPer's defense, the majority of truly BUDGET builders and 100% of OEMs are not going to spend $20 extra on faster RAM. Most people won't even overclock. Using stock speeds and bottom-of-the-barrel RAM is going to be the most realistic scenario for the 2200G/2400G. It's an appropriate first impression.

That $20 can be saved for later and go toward a new/used dGPU later down the road. In the time between now and then, they have a perfectly decent 720p/1080p gaming rig or HTPC.

March 1, 2018 | 10:30 PM - Posted by Mike_B_E (not verified)

I don't think it's wrong or disingenuous. I think that some people will be buying according to their wallet first, which might be why they select a Ryzen 3/5 APU, rather than the traditional CPU + separate video card.

RAM prices, like video cards, are rising rather than falling.

February 19, 2018 | 06:08 PM - Posted by Anonymous-911 (not verified)

3200 Ram?

Whats the difference if you are playing in Low or Medium Graphics settings. At this point, it doesn't really matter if its 30 FPS or 60 FPS does it. You are basically just wasting time on your computer which is fine.

Probably dropping in a dedicated GPU and then running the platform memory tests is probably more important as you try to future proof your system.

Anyways, i'm happy with the work you guys did. It's important to see these kinds of tests.

February 19, 2018 | 06:10 PM - Posted by DontPanic! (not verified)

I like that you guys added a disclosure. It makes the review more tranparent. Keep up the good work!

February 19, 2018 | 06:59 PM - Posted by LetTheZenVegaAPUTweakingBegin (not verified)

Maybe next generation on the lower end APUs AMD could add some eDRAM on die for the GPU and enable the Vega HBCC to use that eDRAM as HBC in a similar manner to what the HBCC does with HBM2 as HBC that is used on the discrete Vega variants.

It also looks like the reworking of Vega's L2 cache with respect to the GPU's raster back end and other functional blocks has helped Vega perform much better than previous GCN-generation integrated graphics. But AMD needs to think about eDRAM on the lower-end APUs and HBM2 for some future high-end APUs so Vega's HBCC/HBC IP can benefit APUs also.

Is 12nm eventually going to be used for all of AMD's APUs like it will be for the Ryzen CPU-only refresh SKUs, or will AMD just wait for 7nm to update its line of Zen/Vega APUs?

AMD's APU product stack for at least the next several months is finally complete, as far as there being Zen/Vega mobile and desktop APU variants available from AMD. But the 2200G/2400G desktop variants are directly derived from the mobile variants, so maybe at 12nm AMD will have some sort of refresh APUs, or maybe just one new desktop APU variant with a bit larger Vega nCU count and some sort of extra GPU cache level for its integrated mobile Vega HBCC to use as HBC.

Now comes the process of getting all the firmware/driver and gaming software tweaked for Raven Ridge, and that includes testing any games that will be making use of Vega's FP16 (packed math) and explicit primitive shaders, and other new Vega IP. Vega's HBCC/HBC IP is on hold for APUs until AMD can get either eDRAM or HBM2 for Vega's HBCC to use as HBC.

Faster memory is always a good thing to have for mobile CPU/integrated graphics combo SKUs, and eDRAM can be of help also, until the HBM2 ecosystem and economy of scale take hold at some future time and HBM2 can be used on mobile and desktop Zen/Vega products as well.

February 19, 2018 | 09:42 PM - Posted by Rick0502 (not verified)

What is up with the “review terms and disclosure”? I can’t recall seeing that on any other review.

I do have to agree that testing with 2400mhz memory in the initial review was the right thing to do. You need to have a standard when testing CPUs (or any hardware). You can’t test cpu x with 2400 then test cpu y with 3200. It would have been nice to have this in the initial review, but I understand that things take time.

February 20, 2018 | 07:31 AM - Posted by DrGuns (not verified)

He got blasted in an investigative piece for failing to disclose potential conflicts of interest. While I think that piece might have gone a bit too far at points and Ryan being the kind of guy he is looked it over thought about it and made some changes based on what parts he thought were valid concerns. I don't personally think Ryan intentionally did anything wrong or unethical. I do appreciate the additional effort at disclosure.

February 20, 2018 | 10:57 AM - Posted by Jabbadap

Memory frequency should always be the max supported by the manufacturer; everything higher than that is considered overclocking. Which begs the question: what is it for these Ryzen APUs? AnandTech says it is 2933MHz, though I can't find confirmation for that on AMD's web site. And for the mobile Ryzens with the same chip we have a confirmed memory frequency of 2400MHz. The third number to note is 2667MHz for the iGPU-less Ryzens.

February 20, 2018 | 11:34 AM - Posted by Rick0502 (not verified)

I would disagree. You need to test all CPUs the same. That will help you compare cpu to cpu. How can you tell if x cpu is faster than y cpu if they aren’t compared accross a level playing field?

September 19, 2018 | 01:14 PM - Posted by Oreo (not verified)

Im in the same boat at the moment- I upgraded to a Ryzen 5 w/ Vega 11 gpx on board and using an MSI X470 Pro Carbon Gaming Motherboard- nowhere can I find the best answer as to how far I can go with DDR4 sticks.

February 20, 2018 | 02:12 AM - Posted by Marc (not verified)

Latencies and Command Rate for each frequency? Without it's a little useless, thanks.

February 20, 2018 | 03:37 AM - Posted by aparsh335i (not verified)

Agree. It clearly said at the very beginning of the article that latencies were now becoming more important for gaming, and then the latencies were completely left out of the list of equipment used for the tests.

February 20, 2018 | 03:21 AM - Posted by WhyMe (not verified)

Would have been nice to see, or hear, if faster RAM effected CPU performance now there's no inter CCX communication happening over IF.

February 20, 2018 | 06:55 AM - Posted by odizzido2 (not verified)

Pretty solid gains from the faster ram. Good information to have, thank you :)

February 20, 2018 | 08:09 AM - Posted by HigherStandards (not verified)

This clearly should've been in the original review, the 10-15% uplift could've made it look better vs the 1030 and showed us the best case scenario for this APU. This article is welcomed for the "record" but is very thin and doesn't give us the complete picture. Where is the analysis? no memory timings, no max memory speed achievable(stable/bench only, ddr4-3333 or higher perhaps?), no overclocking GPU? Also in general what about underclocking? Ryan I come here to be informed and dazzled with charts and data(like allen does with ssds) not this half-assed quickly done article with little to no analysis. Is it that y'all only save in-depth analysis to make AMD look bad a la the frame-pacing saga?

February 20, 2018 | 05:59 PM - Posted by CNote

Overclocking the gpu would have been nice since Ive seen some reviews go up past 1400mhz with ease.

Just tested my 2200G with the igpu overclocked to 1600mhz at 1.25v. Plus using 8gb lpx 3000.
Using the TW: Warhammer 2 benchmark with 1280x720p at low settings.
Stock settings: Avg 40.6
Oc'd to 1600mhz: Avg 52.7
Running on my freesync monitor it looked pretty smooth overclocked. Looked like a game from 2012 though.

Superposition scores with 720p test:
Stock: 5439
Oc'd: 6565-- 20.7% jump

February 27, 2018 | 04:43 AM - Posted by yottaXT (not verified)

PCPER's way when talking about AMD is to show their product in the weakest state possible at first, then correct the said information with the right data in a short extra piece later, just to look unbiased. As always said, "the first impression is very important".

February 20, 2018 | 12:45 PM - Posted by msroadkill612


The APUs seem fine w/ 8GB.

It seems it dynamically allocates ram well to the gpu (ref steve at hardware unboxed), so no need to reserve memory for it

3200 RAM isn't all that fancy in the Intel world - it's not top of the line by any means.

February 21, 2018 | 04:26 AM - Posted by bsbuster (not verified)

Don't worry, it's all part of the Intel PR/Shill plan. Which is to try and convince people that the *only* memory that works ok with these APUs is 16GB of low latency 3200 memory that costs well over $200.

However, the truth is that to achieve 90+% of their gaming performance, all you need is 8GB of CL15 2933 (2x4GB sticks) and a bit of iGPU overclocking.

That memory will only cost you ~$90-100. So you can understand why shills/Intel PR mouthpieces don't want people to realize this.

February 20, 2018 | 03:06 PM - Posted by elites2012

it had always mattered. how is this new news? it always made a difference in all of the AMD and intel chips.

February 21, 2018 | 12:43 PM - Posted by Rocky1234 (not verified)

Very nice follow-up to the first review & it does show that spending a little extra on higher speed memory actually nets you some gains if you are going to fully depend on the Vega 11 for your daily tasks and a bit of gaming.

February 22, 2018 | 06:00 AM - Posted by VintageDude (not verified)

In fairness, perhaps the original review of the 2400g APU should be amended to show the game scores using the 2933 memory. After all, that is what AMD recommends and sent with the chip.

March 9, 2018 | 12:30 PM - Posted by Robert Is Watching (not verified)

1) Does the iGPU have its own memory controller ?
2) What's the max frequency the memory controller of a Ryzen 5 2400G can manage ?
In other words above what frequency does it become throwing money in the bin when buying DDR4 for a 2400G ?
3) What make and model numbers of memory sticks (will) do well with the 2400G ?

March 28, 2018 | 04:14 PM - Posted by SR vs DR (not verified)

Would be interesting to know how much the difference is between Dual Rank 2666MHz and Single Rank 3000MHz, and if the Single Rank 3000MHz would be worth the purchase over the Dual Rank 2666MHz RAM performance wise.

December 16, 2018 | 11:30 AM - Posted by Carlos olivera (not verified)

I can only play 5 minutes THE DIVISION, before the game goes completely slow.

Ryzen 2400g
ddr4 16g 2933mhz corsair

I'm disappointed :'( I guess I'm going back to Intel processors
