
The AMD Ryzen 5 2400G and Ryzen 3 2200G Review: Return of the APU

Author: Ryan Shrout
Subject: Processors
Manufacturer: AMD

Radeon Vega Graphics - Discrete Level in a Processor

First up is the data that is probably most interesting and compelling for the new AMD processors: the integrated graphics performance of the 2400G and 2200G. With the Vega graphics system and upgraded GCN architecture, in addition to a higher stream processor count and higher GPU clocks than any previous APU, we have expectations of decent performance for 720p and 1080p gaming.

There are two angles to look at: how does the new Ryzen APU compare in graphical and gaming performance to the latest Intel processors with Intel HD Graphics? And, how does the Ryzen APU compare to the modest discrete graphics of something like the NVIDIA GeForce GT 1030 or Radeon RX 550?

*Update (2/19)* While this testing was performed at DDR4-2400 only due to time constraints for the initial review, we have now also taken a look at performance at memory speeds up to DDR4-3200 with the R5 2400G.


Compared to the HD 630 Graphics in a processor like the Core i5-8400, the graphics system in the new Ryzen APUs is worlds faster. In most of our tests, including Civ6, Ashes, and Wildlands, the AMD Vega-based integrated graphics is 2-3x faster than what the Intel HD Graphics 630 can provide. That's not 20-30% faster, mind you, but 200-300% of Intel's performance! AMD is clearly still the king of processor graphics!

Looking at how the Ryzen 5 2400G compares to the GT 1030 and RX 550, it's clear this is a much more interesting fight. The results show that AMD can offer discrete-level graphics performance on a desktop processor, landing only slightly behind the tested parts. That is a constantly moving target as the discrete market changes, but the Raven Ridge APU is an impressive piece of hardware.


February 12, 2018 | 09:15 AM - Posted by AnonymousScrotum (not verified)

"AMD is clearly still the kind of processor graphics!"

clearly the king perhaps? discrete level in a processor second to last paragraph.

February 12, 2018 | 09:21 AM - Posted by Ryan Shrout

Heh, thanks, fixed!

February 12, 2018 | 08:35 PM - Posted by SincereAnonymous (not verified)

shouldn't have fixed that! 'kind' is way funnier.

February 12, 2018 | 09:27 AM - Posted by Anonymous1337 (not verified)

As expected slower than GT 1030. AMDslow.

February 12, 2018 | 09:41 AM - Posted by AnonymousScrotum (not verified)

https://www.techpowerup.com/reviews/AMD/Ryzen_5_2400G_Vega_11/20.html

not so sure about that.

February 14, 2018 | 08:22 AM - Posted by Anonymous12345 (not verified)

great link. thanks

February 12, 2018 | 12:08 PM - Posted by collie

Did you expect a cpu to have an integrated gpu faster than a 1030? It's the most powerful integrated graphics ever, and it's ALMOST as good as having a bottom-of-the-barrel gpu. (a gpu that'll cost you over $100 at this time)

February 12, 2018 | 01:23 PM - Posted by yurall

it scales incredibly with memory speed. with 3200MHz expect about 30% more performance.

btw. I feel that a processor should always be tested with the on-the-box memory speed, which would have been 2933. that would also give a massive boost to performance (23% according to Hardware Unboxed) compared to 2400MHz.

little bit weird to just run it at 2400 because it's 20 bucks more expensive to get up to 2933...
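A quick back-of-the-envelope check on those numbers: dual-channel DDR4 peak bandwidth scales linearly with transfer rate, and an iGPU with no dedicated VRAM tends to ride that curve fairly closely. A minimal sketch (the linear-scaling assumption is a simplification, not a measured result):

```python
def ddr4_bandwidth_gbs(mt_per_s: int, channels: int = 2, bytes_per_xfer: int = 8) -> float:
    """Peak DDR4 bandwidth in GB/s: transfers/sec x 8 bytes per 64-bit transfer x channels."""
    return mt_per_s * 1e6 * bytes_per_xfer * channels / 1e9

base = ddr4_bandwidth_gbs(2400)
for speed in (2400, 2933, 3200):
    bw = ddr4_bandwidth_gbs(speed)
    print(f"DDR4-{speed}: {bw:.1f} GB/s ({bw / base - 1:+.0%} vs DDR4-2400)")
# DDR4-2400: 38.4 GB/s (+0%)
# DDR4-2933: 46.9 GB/s (+22%)  <- lines up with the ~23% figure above
# DDR4-3200: 51.2 GB/s (+33%)  <- lines up with the ~30% figure above
```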

February 14, 2018 | 12:32 AM - Posted by Anonymouse (not verified)

this is pcper. you can expect them to nerf AMD kit as much as they can

February 16, 2018 | 08:07 AM - Posted by Anonymussss (not verified)

Exactly!

February 21, 2018 | 06:14 AM - Posted by albert89 (not verified)

Correct.

February 12, 2018 | 06:01 PM - Posted by FasterThanIntelsIntegratedMaybeSlowerThanThe1030WhoKnows (not verified)

You are supposed to read more reviews across more different websites before coming to any conclusions. And be ready to provide links to more than a few reviews that support your statement. The GT 1030 also has 16 ROPs, a GPU clock of 1228 MHz, and a boost clock of 1468 MHz, while the Ryzen 5 2400G's GPU has a max boost of 1,250 MHz. And the GT 1030 has dedicated GDDR5 VRAM.

And the memory speed used for testing was 2400MHz(?) on the APU, so go and read all the reviews for the next few months and see how things settle. But hey, no one expects integrated graphics to totally defeat a discrete GPU with 2GB of dedicated VRAM in every benchmark.
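One concrete reason the GT 1030 tends to stay ahead is memory bandwidth: its GDDR5 is dedicated, while the APU's iGPU shares system DDR4 with the CPU cores. A minimal sketch of that gap, using the published GT 1030 GDDR5 specs (64-bit bus, 6 Gbps effective) and dual-channel DDR4-2400:

```python
# GT 1030 (GDDR5 variant): 64-bit bus at 6 Gbps effective per pin.
gt1030_gbs = 6e9 * (64 / 8) / 1e9   # 48.0 GB/s, all reserved for the GPU

# Ryzen 5 2400G at DDR4-2400: dual channel, 2 x 64-bit.
apu_gbs = 2400e6 * 8 * 2 / 1e9      # 38.4 GB/s, shared with the CPU cores

print(f"GT 1030 dedicated bandwidth: {gt1030_gbs:.1f} GB/s")
print(f"2400G shared bandwidth:      {apu_gbs:.1f} GB/s")
```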

What about Intel's integrated graphics up against the GT 1030, or even the Ryzen 5 2400G's Vega 11 (nCU) graphics?
The only bad thing the Ryzen 5 2400G has copied from Intel is that toothpaste TIM, but there will be deliddings to be done, and that German der8auer guy makes some nice delidding hardware.

February 21, 2018 | 06:13 AM - Posted by albert89 (not verified)

Beating the 1030 in a few games. Perhaps you got your unbiased info from PCPer's team of AMD hating reviewers. No surprises there.

February 12, 2018 | 09:30 AM - Posted by Hmmm (not verified)

Clarification.

I'm confused about your gaming test. In the "Discrete Gaming Tests" were all of the systems equipped with a GTX1080, or just the non-2200G & 2400G?

I'd hoped to see gaming tests of the built in Vega GPU in the new AMD chips, and I'm not sure if I'm seeing that.

Thanks!

February 12, 2018 | 09:32 AM - Posted by Orcblood (not verified)

You can find the gaming tests of the built in Vega GPU @ "Radeon Vega Graphics - Discrete Level in a Processor" on the menu.

February 12, 2018 | 10:13 AM - Posted by Hmmm (not verified)

Thanks, Orcblood. I don't know how I missed that.

February 12, 2018 | 09:40 AM - Posted by Orcblood (not verified)

Definitely a big improvement over Intel and last gen AMD APUs while keeping good power consumption and thermals, though I was expecting more performance with the integrated Vega graphics. Guess it makes sense with 11 CUs / 704 stream processors. I'm now more excited about Kaby Lake-G. Please do a comparison with the 2400G when that comes out! I think AMD should make a bigger one of these though for notebooks. Don't know why they wouldn't, especially now since APUs are an attractive alternative due to the low supply of discrete graphics when it comes to gaming.

Keep up the good work!

February 12, 2018 | 09:41 AM - Posted by Orcblood (not verified)

*notebooks or desktop that is.

February 12, 2018 | 10:08 AM - Posted by CyclesRenderingAcceleratedOnAMDsIntegratedGraphics (not verified)

If the APU included some eDRAM or even some HBM2 (1GB), the graphics performance would be worlds better. And Vega's HBCC/HBC IP cannot be utilized on these APU SKUs because there is no HBM2 to be used as High Bandwidth Cache (HBC).

Hopefully AMD can at some point release, on 7nm, some desktop "APUs" with some included eDRAM, or even HBM2 in small amounts, so the HBCC/HBM2-HBC IP can be enabled.

As it currently stands, only AMD discrete Vega GPUs can take advantage of Vega's HBCC/HBM2-HBC IP, and that includes the Intel SOC/discrete semi-custom Vega GPU die on those EMIB/MCM SKUs. Next up on AMD's release schedule is their Vega discrete mobile GPUs that will come with 4GB of HBM2. And I hope that reviewers will use the Vega GPUs with only 4GB of HBM2 as a chance to really stress test Vega's HBCC/HBM2-HBC IP, by running some game benchmarks with the games loaded with texture/mesh packs that make use of 8GB+ of VRAM, so Vega's HBCC/HBM2-HBC IP can be independently tested.

February 12, 2018 | 11:21 AM - Posted by Cooe (not verified)

Problem is that adding HBM2 would not only require an entirely new die layout/floorplan, with both DDR4 AND HBM2 memory controllers on die & wired to the iGPU, but it would require the addition of a large & expensive silicon interposer with enough room for Raven Ridge and a single stack of HBM2 (which would likely prevent it from fitting in the AM4 socket, thus needing a whole new socket too). And due to the way HBM2 works, you can't have a 1GB stack; it's technically impossible (HBM1 yes, HBM2 no). HBM comes as four-high die stacks from Samsung or SK Hynix, which, with currently available die densities, works out to either 4GB or 8GB. That's it, those are the only HBM2 quantities per stack available (hence 2-stack HBM2 parts like Vega 10 having 8/16GB, and current single-stack parts like Vega Mobile/KL-G having 4GB [though 8GB would fit]).

So to add HBM2 to Raven Ridge they'd have needed to totally re-do the die design with a new HBM2 controller, add an expensive interposer, make a new socket to fit said big interposer, and put a 4GB stack of also crazy expensive HBM2 on said interposer.... The interposer + 4GB HBM2 stack likely costs more for AMD to manufacture than the entire Raven Ridge die. Not saying it wouldn't be awesome, because it totally would, but with this kinda product / target market, adding HBM2 makes absolutely 0 freaking sense.
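The per-stack capacity arithmetic in the comment above is easy to sanity check. A minimal sketch, with stack height and per-die capacity as free parameters (exactly which die density/stack height combination yields the 8GB stacks is an assumption here, not something the comment confirms):

```python
# One HBM2 stack's capacity is simply dies-per-stack x GB-per-die.
def hbm2_stack_gb(dies_high: int, gb_per_die: int) -> int:
    return dies_high * gb_per_die

print(hbm2_stack_gb(4, 1))       # 4 GB: single-stack parts like Vega Mobile / KL-G
print(hbm2_stack_gb(8, 1))       # 8 GB: one possible 8 GB stack configuration
print(2 * hbm2_stack_gb(4, 1))   # 8 GB: two 4 GB stacks, as on Vega 10 8GB cards
```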

February 12, 2018 | 12:48 PM - Posted by AMDsOptionsAreVariedButAMDsRevenuesAreNowIncreasing (not verified)

Yes, the HBM2 would require an interposer, but the memory controller on an APU is combined, with the CPU and GPU both using the same memory controller. And HBM2 being limited to 4GB and larger is simply not true. Samsung and SK Hynix could produce other HBM2 configurations with 2GB or even 1GB if AMD wanted. There is also the possibility of HBM# variants that make use of 512-bit-wide interfaces for integrated GPUs that do not need all the effective bandwidth.

I'm not talking about the current Raven Ridge APUs, as AMD was rumored to be developing a professional workstation APU on an interposer. And that's an entirely different price markup than any consumer SKUs.

As far as fitting HBM2 on the AM4 socket, that's not a problem, as the JEDEC standard can have variants added at any time. Really, the JEDEC standard only deals with what is needed to support that 1024-bit interface (divided into 8 independent 128-bit channels) to a single stack of HBM2, and there are no hard and fast rules dictating the overall layout of the HBM2 die, other than the extra cost involved if the die shape were custom made longer and narrower to fit.

Samsung is sure looking into a low-cost variant of HBM2, with maybe only a 512-bit interface, for any discrete mobile, or maybe integrated, GPUs to make use of.

There is also actual 3D stacking of HBM2 over a portion of the processor's die, instead of interposer-based 2.5D stacking, so that's being looked into. Then there is Navi to consider, which will already be smaller GPU dies/chiplets attached to an interposer, with any CPU dies that could also be added as a Zen cores die in a completely modular method atop a silicon interposer. The interposer being made of silicon itself foretells the possibility of moving the Infinity Fabric in total onto an active interposer, and having any number of GPU/CPU-based dies/chiplets attached to the interposer and its active Infinity Fabric, with traces and circuits etched on the interposer leaving more room on the dies/chiplets for CPU or GPU cores and other IP.

This AMD/academic partner research paper on the design of an Exascale APU (1) is a good indicator of where AMD could take things going forward.

So this research continues, partially funded by US Exascale Initiative funding, for AMD and the other CPU/GPU makers also. It's a good read and a proper indicator as to how far AMD will be pushing technology in the future, and all that government-funded research makes its way down into the consumer markets relatively quickly.

"Abstract—The challenges to push computing to exaflop
levels are difficult given desired targets for memory capacity, memory bandwidth, power efficiency, reliability, and cost. This paper presents a vision for an architecture that can be used to construct exascale systems. We describe a conceptual Exascale Node Architecture (ENA), which is the computational building block for an exascale supercomputer. The ENA consists of an Exascale Heterogeneous Processor (EHP) coupled with an advanced memory system. The EHP provides a high-performance accelerated processing unit (CPU+GPU), in-package high-bandwidth 3D memory, and aggressive use of die-stacking and chiplet technologies to meet the requirements for exascale computing in a balanced manner. We present initial experimental analysis to demonstrate the promise of our approach, and we discuss remaining open research challenges for the community." (1)

(1)

"Design and Analysis of an APU for Exascale Computing
Thiruvengadam Vijayaraghavan†∗, Yasuko Eckert, Gabriel H. Loh, Michael J. Schulte, Mike Ignatowski,
Bradford M. Beckmann, William C. Brantley, Joseph L. Greathouse, Wei Huang, Arun Karunanithi, Onur Kayiran,
Mitesh Meswani, Indrani Paul, Matthew Poremba, Steven Raasch, Steven K. Reinhardt, Greg Sadowski, Vilas Sridharan
AMD Research, Advanced Micro Devices, Inc.
†Department of Electrical and Computer Engineering, University of Wisconsin-Madison"

http://www.computermachines.org/joe/publications/pdfs/hpca2017_exascale_...

February 12, 2018 | 09:56 AM - Posted by CyclesRenderingAcceleratedOnAMDsIntegratedGraphics (not verified)

"The AMD Ryzen Processor with Radeon Vega Graphics, which is it its official name, does exactly what we thought it would do for AMD. It brings the revitalization of the Zen architecture to a market and class of product that previously had been without it. While that sounds simple and straightforward, being able to address as much as 50% or more of the consumer and SMB desktop PC market that they could previously not, or did so only with inferior products, is a significant business milestone for the team."

Maybe not the SMB market, if AMD has plans for any Pro-branded desktop Raven Ridge "APU" variants. I'd expect that any of those Pro SKUs will have the remote management features that businesses (IT departments) may want, including that same 3-year Pro warranty and extended support and product availability (more years before the SKUs are no longer produced) sorts of guarantees.

With regards to Blender, why is it that the only Blender benchmarks used are ones that test the CPU cores, with little or no attention to testing Blender's Cycles (OpenCL accelerated on AMD's GPUs) and comparing that to the Nvidia CUDA-based Blender Cycles GPU rendering on Nvidia's GPUs?

I realise that the respective Blender CPU rendering workloads do stress a CPU's cores at 100%, for a great way to judge CPU performance, but Blender's Cycles rendering needs to be tested specifically accelerated on the GPU's cores, for much faster rendering times on most workloads. It's a fact that AMD's Cycles rendering on the GPU has been available for a few years now, whereas before, Cycles rendering on the GPU was only supported via the Cycles/CUDA code path. But reviewers are ignoring Blender's Cycles rendering accelerated on AMD GCN graphics (GCN generation 2 or later, integrated and discrete), and any published testing of Blender's GPU-accelerated rendering on GCN APUs is very hard to find.

There are, in addition to Blender, other 3D and 2D graphics packages that can make use of OpenCL acceleration on the GPU, and those aren't being tested either. Ditto for more real-world testing of any office applications that make use of OpenCL to accelerate light compute workloads on integrated graphics.
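For anyone who wants to run that comparison themselves, here is a minimal sketch of switching Cycles onto the GPU through Blender's Python API. It assumes the Blender 2.79-era API (the `user_preferences` path was renamed in later releases); `OPENCL` targets AMD GCN parts, while `CUDA` targets Nvidia:

```python
import bpy

# Point Cycles at the GPU instead of the CPU.
prefs = bpy.context.user_preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPENCL'   # use 'CUDA' on Nvidia hardware
prefs.get_devices()                    # refresh the detected device list
for dev in prefs.devices:
    dev.use = True                     # enable every detected compute device

bpy.context.scene.cycles.device = 'GPU'
bpy.ops.render.render(write_still=True)
```

Saved as enable_gpu.py, something like `blender -b scene.blend -P enable_gpu.py` would let you time the same .blend on CPU vs GPU.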

February 13, 2018 | 12:14 AM - Posted by msroadkill612

It's a shame Windows can't shunt some of the CPU work onto the often underutilised GPU.

February 12, 2018 | 10:14 AM - Posted by Cryptominor (not verified)

Yes, but what's it like for cryptocurrency mining?

February 12, 2018 | 10:23 AM - Posted by Anonymous25222 (not verified)

No one fucking cares keep yar filthy hands off ya cunt

February 12, 2018 | 10:29 AM - Posted by Cryptominor (not verified)

:^)

February 12, 2018 | 01:14 PM - Posted by DirectYouAngerAgainstYearlySmartPhoneUpdatersAlso (not verified)

Remember to also channel that anger at smartphones, namely at those that update their smartphones every year unnecessarily, as that's driving up the price of DRAM. So please channel that anger by keeping your smartphone for at least 3 years in order to reduce demand on a limited (artificially limited by the Big 3 DRAM makers/crooks also) DRAM supply.

Most gamers can live with not updating their smartphones on a yearly basis as a way of sticking it to the greedy DRAM makers, and maybe that will help with reducing any extra DRAM price pressures. Be sure to also let your elected officials know that the regulatory agencies need to be putting the DRAM makers' actions under a microscope, with the DRAM makers having been found guilty in the past of artificially limiting DRAM supplies and gouging the consumers and the OEMs on DRAM pricing.

Let's give the smartphone makers a little recession to contemplate via some collective gamer/PC builder anger productively directed at the DRAM makers' tactics for gouging on those PC/laptop DRAM prices. It's just too bad for the smartphone makers, but that's how things work out when those nefarious Big 3 DRAM makers are up to no good!

February 12, 2018 | 10:58 AM - Posted by Anonymously Anonymous (not verified)

But can it run Meltdown?

February 12, 2018 | 02:02 PM - Posted by MiniMicroFormFactorBareBonesMeatOnThemBonesSystems (not verified)

That's an exclusive Intel title, as part of their pawnedworks offerings!

February 21, 2018 | 06:17 AM - Posted by albert89 (not verified)

Methinks you're confusing AMD with Intel.
Did the PCPer team of reviewers feed you this fake news?

February 12, 2018 | 12:24 PM - Posted by bobbyw (not verified)

"Here's an interesting result - though the Ryzen 5 2400G falls behind the Core i5-8400 in the second pass of our X264 benchmark results, it is able to perform better than the 6-core part from Intel in the first pass results."

The x264 encoding graph shows the opposite (2400G slower 1st pass, faster 2nd pass)... Which is it?!!
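For context on what the two passes are: in a two-pass x264 encode, pass 1 is a faster analysis run that only writes a stats file, and pass 2 does the real rate-controlled encode using that file, so the two passes stress the CPU differently. A minimal sketch of the flow (file names and bitrate are placeholders, not PCPer's exact benchmark settings):

```python
import subprocess

def x264_pass(pass_num: int, output: str) -> None:
    """Run one pass of a two-pass, bitrate-targeted x264 encode."""
    subprocess.run(
        ["x264", "--pass", str(pass_num), "--bitrate", "4000",
         "--stats", "x264_2pass.log", "-o", output, "input.y4m"],
        check=True)

x264_pass(1, "/dev/null")    # analysis pass: fast, writes the stats file
x264_pass(2, "output.mkv")   # final pass: slower, uses the stats for rate control
```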

February 12, 2018 | 12:25 PM - Posted by bobbyw (not verified)

"Even that is a tough pile to swallow…"

That's why I choose to swallow pills instead!

February 12, 2018 | 12:26 PM - Posted by WhyMe (not verified)

So who's going to be the first brave soul to test out direct die cooling, seeing as these chips are using standard TIM. ;-)

February 12, 2018 | 12:42 PM - Posted by WhyMe (not verified)

I've been reading that other testers experienced a lot of issues in their testing (instability, crashes, etc, etc.), is that something PCPer also had problems with?

Obviously things like that are to be expected with a new product launch, and will probably be sorted out over time, but you'd hope AMD were a little more on the ball after having the shaky start they had with the Ryzen 1000 series launch.

February 12, 2018 | 12:56 PM - Posted by Oskars (not verified)

Try delidding, turns out it's not soldered to IHS, but uses a crummy thermal paste instead.

February 12, 2018 | 01:31 PM - Posted by Max Settings (not verified)

Awesome little chip for <$170. And it will run just fine with 8GB of cheaper DDR4-3000 (2x4GB) and some OC. Great job AMD.

February 12, 2018 | 01:59 PM - Posted by MiniMicroFormFactorBareBonesMeatOnThemBonesSystems (not verified)

What about that Zotac ZBOX MA551 SKU with the Ryzen 5 2400G or Ryzen 3 2200G options? Any ETA on that, and any other Mini/Micro form factor kits that make use of the Ryzen 5 2400G or Ryzen 3 2200G SKUs?

These SKUs look to be ready made for those Mini/Micro form factor bare-bones systems, with the OEMs able to get that tray pricing. But I'm waiting for some great Mini/Micro form factor review roundups once the market gets plenty of these systems that make use of these Raven Ridge desktop variants.

February 12, 2018 | 03:04 PM - Posted by djotter

I am going to grab a 2400G, I think, for a Plex server and client machine. An i3 8100 or i5 8400 would be more efficient for about the same performance, but the complete lack of cheap ITX motherboards forces me to AMD. Cheapest Z370 board available to me is US$220.

February 12, 2018 | 04:15 PM - Posted by Mr.Gold (not verified)

"Our classic CineBench results show a 10% deficit for the 2400G compared to the Core i5-8400, and 15% deficit for the 2200G compared to the 8100 in the single threaded results. In the multi-threaded testing, the Ryzen 3 part is nearly on par with the Core i3-8100 though the Ryzen 5 2400G is still 18% slower than the more powerful Core i5-8400."

Why do you guys keep doing this BS?!

Did you notice that the i5-8400 is 12% more costly and is locked?

Why not compare against the i3-8350K that's priced the same?

"It's slower.... it's slower..." I give up with you.

PCPer "unconscious" bias at its best.

February 12, 2018 | 11:50 PM - Posted by SincereAnonymous (not verified)

Because Corei5 and Ryzen5 have 5 in their names, and Corei3 and Ryzen3 have 3 in their names. Isn't it obvious?

February 13, 2018 | 02:17 AM - Posted by Ryan Shrout

Even AMD in its own documentation compares the 2400G to the Core i5-8400. 

February 16, 2018 | 08:15 AM - Posted by Anonymussss (not verified)

I bet it didn't say anything about pairing these with DDR4-2400 though...

#youknowthetech
#iexpectbetterofyou

February 13, 2018 | 10:36 AM - Posted by TheInternetHasAlwaysBeenThus (not verified)

You can say that about the entire online review industry, be you an AMD person or an Nvidia/Intel/other person. The online reporting industry is in the mindshare business, and the selling-of-mindshare-for-a-price business, and that's why you have to try and read as many online sources as you can, and watch some YouTube sources, as well as read the Reddit-like forums and blogs for folks doing their own testing and posting the results online (watch out for those astroturfers as always).

Maybe that AdoredTV person can be encouraged to do a review of each review website's benchmarking choices, to see if there are any indications of bias or lack of utilizing the scientific method. Gaming the benchmarks for spin is a common tactic of those in the mindshare business, and all the online review websites are in that mindshare business, or they get no review samples and other such perks.

Remember, the Internet was never created and fostered to disseminate knowledge and understanding; it was fostered because it was unregulated compared to over-the-air radio and TV broadcasting, which are more regulated for truth and accuracy. The Internet is not regulated as much, and where there is little regulation with respect to truth and accuracy there is going to be little in the way of truth and accuracy, so things on the Internet can mostly never be fully trusted.

You can never trust anyone in the mindshare business, as that's marketing at its heart, and the marketing and snake-oil-salesman relationship goes way back in those sorts of non-truths, lies of omission, and lack of objectivity. The Internet is where the scientific method, truth, and objectivity went to die for the most part, so you have to take that at face value and do your own due diligence.

February 12, 2018 | 05:00 PM - Posted by Anonymous-911 (not verified)

Looks like this is about the same as a PS4 - 5 years later. There is a reason the PS4 uses GDDR5.

February 12, 2018 | 09:10 PM - Posted by CNote

Grabbed the 2200G for that exact motherboard and some LPX 3000. Nice little box.

February 13, 2018 | 11:33 AM - Posted by Geoff Peterson (not verified)

Does anyone know how these new CPUs will handle work as an HTPC? I'm currently running a 4930k and 1080ti for 4K HDR video, and it's a bit overkill. I would love to build a power efficient HTPC for 4K HDR playback, but I'm not sure how well AMD hardware handles this.

February 20, 2018 | 11:12 AM - Posted by CNote

I got a 2200G to power my 4K TV and my 4K monitor. The monitor was doing 60Hz over DisplayPort. Pretty sure it was doing 60 on my TV but didn't even check; it would be over HDMI. Just going off specs it should rock that HDR just like an i3.

February 13, 2018 | 12:40 PM - Posted by ReallyAMDdoYouReallyWantToGoThere (not verified)

"AMD Ryzen Desktop Processors with Radeon Vega Graphics" were once known as Desktop "APUs"!

So do we call them ARDPWRVG (AMD Ryzen Desktop Processors with Radeon Vega Graphics) for desktop "APUs" and ARMPWVG (AMD Ryzen Mobile Processors with Radeon Vega Graphics) for any mobile "APUs"? And what happens with the Navi GPU generation when that arrives?

Maybe we can call them The Processors Formerly Known As Acceleated Processing Units (TPFKAAPU). Maybe System On a Chip with Graphics (SOCwG), or System On a Chip with Vega Graphics (SOCwVG). But I'm thinking that APU is best and so easy to say. You see, folks, why marketing is such an utter trash "profession", and now we have all that confusion foisted on us by the marketing monkeys. WTF AMD!

February 13, 2018 | 12:42 PM - Posted by ReallyAMDdoYouReallyWantToGoThere (not verified)

Edit: Acceleated
To: Accelerated

February 13, 2018 | 12:54 PM - Posted by Anonymously Anonymous (not verified)

How hard would it have been to include 2 extra data points on each graph for the APUs tested at 2400 and also at max RAM speed?

To not include it in the initial launch review is doing a disservice to anyone reading your review interested in the new APUs.

February 13, 2018 | 04:07 PM - Posted by djotter

Yes, please do this soon! I am buying a 2200G soon and I will be pairing it with a 3200MHz kit. Can you also please look at what impact the size of the frame buffer has, like J2C did?

February 18, 2018 | 04:15 AM - Posted by djotter

Never mind on the frame buffer. I was basing my request on a Jayztwocents video, but it turns out he was a bit more drama than data. Hardware Unboxed did a very comprehensive review of frame buffer impacts and found it negligible.

February 13, 2018 | 06:06 PM - Posted by Pink Gnome (not verified)

Good review! I know that traditionally you guys don't go through the full FCAT routine with your CPU reviews, but I would have loved to see that for the integrated graphics on this one. Minimum frame rate would be good too.

February 15, 2018 | 01:14 PM - Posted by Rocky1234 (not verified)

Great review, thanks.
On the part about the chosen memory speed: first off, most would at least try to get 2933MHz memory and then try to OC it to 3200MHz; using 2400MHz just seems like a lot of wasted performance potential. If the board & CPU/APU support it and the kit was provided at that speed, then please set the system up with those specs; do not gimp the system just because 10% of the buyers may opt to use slower memory to save a few bucks. Like you said, it is not AMD's fault the prices are out to lunch, so why gimp their hardware to make a point? By the way, all of Intel's Coffee Lakes, the non-K versions included, were tested on the highest-end chipset, the Z370, and with memory speeds at 3200MHz or above, only because Intel chose to hold back the lower-end chipsets, fully knowing the lower-spec CPUs would be tested on boards that allow for memory overclocking, which also gave the lower-spec CPUs a big performance boost.

When the lower-tier chipsets come out, all of the testing data on 95% of the sites just becomes useless, because these lower-tier chips will now be bought with the lower-end boards that do not allow overclocking of any kind. My point is, AMD gave/sent you a review kit that, yes, will allow the kit to perform at its best, but the hardware actually supports everything in the kit sent out for reviews. At least they were not trying to fake the numbers like Intel pretty much did with the non-K Coffee Lake CPUs, by allowing reviews to be done on hardware configurations that will not be put together as a kit once the non-OC chipsets are released for the non-OC CPUs. The point is: test the hardware sent out, and at least try to set the hardware up so it shows its full performance potential, not gimping it because a few may not buy the higher-spec memory for it.

February 15, 2018 | 01:58 PM - Posted by Rocky1234 (not verified)

Hello, second post for this.

"This clearly proves out AMD's case that changes to the Precision Boost 2 technology can help with gaming performance, but the IPC advantages Intel holds remain the difference."

It is not so much the IPC gain Intel has over AMD (which is only 5%-6%, by the way) but more so the actual clock speed advantage Intel has over AMD right now, oh, and in the i5's case, 2 extra cores over the AMD 4-core/8-thread setup as well. Those 2 extra cores more than make up for the 8 threads the 2400G has, because they are hardware cores, not logical cores. My best guess is, if you could take an AMD Ryzen APU and pit it up against an Intel part with the same core count, with the clock rates set up at the same speed, Intel would only be ahead by 5%-6% because of slightly better IPC on the Intel parts. But hey, it is good to see Intel's Coffee Lake 6th gen CPUs doing so well. Oh wait, that's 8th gen? Nope, wrong, 6th gen, because they are Skylake cores, just a few more added to the mix. Pretty sad, really, when my old Sandy Bridge i7 OC'ed to 5GHz is faster than a 6700K at stock, and gets near or passes a 7700K, which is 7th gen, and mine is second gen Core series. Pretty sad also that a second gen can clock as high or higher than 3rd, 4th, 5th, 6th, 7th gen class CPUs, and run cooler while doing it, even though it is on a lot bigger node process.
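The first-order model behind that argument is just single-thread performance ≈ IPC × clock. A toy sketch; the IPC ratio and boost clocks below are illustrative assumptions, not measurements:

```python
# Single-thread performance, first-order: perf ~ IPC x clock.
def perf(ipc: float, ghz: float) -> float:
    return ipc * ghz

ryzen_2400g  = perf(ipc=1.00, ghz=3.9)   # IPC normalized to Ryzen, 3.9GHz boost
core_i5_8400 = perf(ipc=1.06, ghz=4.0)   # assumed ~6% IPC edge, 4.0GHz boost

print(f"Estimated single-thread gap: {core_i5_8400 / ryzen_2400g - 1:.0%}")
# ~9%, in the same ballpark as the review's single-threaded CineBench delta
```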

I do have to admit the 6th/8th gen Coffee Lakes do seem to finally OC a bit better, with some of them reaching 5.3-5.4GHz with good cooling. My CPU only dreams of getting to 5.3GHz, let alone 5.4GHz, but I have had it at 5.2GHz for benching, and it will run 24/7 stable at 5.1GHz. So yep, sad that an old 2010, or is it 2011, CPU is able to still play with the big boys and show them that Granddad still has a few tricks up its sleeve.

February 17, 2018 | 05:06 AM - Posted by VintageDude (not verified)

Does anyone know if any of the major PC manufacturers are going to put these APUs in the computers they sell?

February 19, 2018 | 08:15 AM - Posted by VintageDude (not verified)

Hey, just learned AMD will loan out an old CPU to do a BIOS upgrade that is needed on a lot of newer low-cost motherboards.

February 19, 2018 | 02:08 PM - Posted by CNote

I was running CPU-Z or some benchmark, and on my 2200G with the larger Wraith cooler, the XFR jumped up to 4.125GHz. A 400MHz jump ain't bad and is up there with a 1300X.

February 19, 2018 | 09:03 PM - Posted by Tantor (not verified)

Very interesting article. The R5 2400G is not intended to compete with the i5-8400. The discrete tests should probably have included the R5 1600.

The moral is simple. If you need integrated graphics, then the R5 2400G is obviously far superior to Intel. If you need discrete graphics, then the R5 1600 is a great competitor to the i5-8400, at the SAME EXACT PRICE.

February 21, 2018 | 07:54 AM - Posted by Drbaltazar (not verified)

Hey PCPer, question regarding down/spect (Meltdown/Spectre) on Intel vs AMD (down/spect was chosen by Steve Gibson among the huge pile of names he received). Anyway, from what the web says, it looks like I got cheated! Every place online was saying Intel was better, when in fact, bang for the dollar, AMD was the king of the hill! My question: you guys used to do articles with links when you didn't do the testing directly. I know going back in the past is a no-no, but could you confirm the performance lost for a series, like Sandy Bridge, say the i5-2500K?
Just the performance lost is good enough; like, OK, the web says 30%, but since I know you test very differently, I suspect it must be different! Why do I ask? Tired of hearing excuses from Intel! People paid way more for Intel units, and the whole point was to get the best people could afford! Steve Gibson hinted that a lot of Intel processors might lose performance or security regardless of what Google says (microcode and all that); he was suggesting people wait till the dust settles. Till the dust settles, can you look into down/spect on Intel vs AMD and see the performance drop? Since I mainly talk about gaming, keeping it simple is good enough.

May 11, 2018 | 04:46 PM - Posted by Anonymousn00b (not verified)

Wow Ryan, you're skimping lately, just let Josh do AMD, you are not even showing its OC potential. What happened to u? ;)
