It's clear by now that AMD's latest CPU releases, the Ryzen 3 2200G and the Ryzen 5 2400G, are compelling products. We've already taken a look at them in our initial review and investigated how memory speed affects the graphics performance of the integrated GPU, but it seemed something was still missing.
Recently, it's become abundantly clear that GPUs excel at more than just graphics rendering. With the rise of cryptocurrency mining, OpenCL and CUDA performance matter more than ever.
Cryptocurrency mining certainly isn't the only application where a powerful GPU can boost system performance. We set out to see how much of an advantage the Radeon Vega 11 graphics in the Ryzen 5 2400G provide over the significantly less powerful UHD 630 graphics in the Intel Core i5-8400.
| Test System Setup | |
|---|---|
| CPU | AMD Ryzen 5 2400G<br>Intel Core i5-8400 |
| Motherboard | Gigabyte AB350N-Gaming WiFi<br>ASUS STRIX Z370-E Gaming |
| Memory | 2 x 8GB G.SKILL FlareX DDR4-3200 (all memory running at 3200 MHz) |
| Storage | Corsair Neutron XTi 480GB SSD |
| Graphics | AMD Radeon Vega 11 Graphics<br>Intel UHD 630 Graphics |
| Graphics Drivers | AMD 17.40.3701 |
| Power Supply | Corsair RM1000x |
| Operating System | Windows 10 Pro x64 RS3 |
Before we take a look at some real-world examples of where a powerful GPU can be utilized, let's compare the relative power of the Vega 11 graphics on the Ryzen 5 2400G to the UHD 630 graphics on the Intel Core i5-8400.
SiSoft Sandra is a suite of benchmarks covering a wide array of system hardware and functionality, including an extensive range of GPGPU tests, which we are looking at today.
Comparing the raw shader performance of the Ryzen 5 2400G and the Core i5-8400 provides a clear snapshot of what we're dealing with. In every precision category, the Vega 11 graphics in the AMD part are significantly more powerful than Intel's UHD 630 graphics. Overall, the AMD part delivers a 175% increase in aggregate shader performance over the Intel chip.
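That gap shouldn't be surprising given the theoretical peak throughput of each GPU. As a rough sketch (using publicly listed specs: 704 stream processors at a 1250 MHz boost for Vega 11, and 24 EUs with 8 FP32 lanes each at roughly 1050 MHz for the UHD 630 in the i5-8400; actual sustained clocks will vary), the peak FP32 numbers work out like this:

```python
# Rough theoretical FP32 peak throughput, assuming published shader counts
# and boost clocks; FMA counts as 2 FLOPS per lane per clock.
def peak_gflops(lanes, flops_per_lane_per_clock, clock_ghz):
    return lanes * flops_per_lane_per_clock * clock_ghz

vega11 = peak_gflops(704, 2, 1.25)   # 704 stream processors @ 1.25 GHz
uhd630 = peak_gflops(192, 2, 1.05)   # 24 EUs x 8 FP32 lanes @ ~1.05 GHz

print(f"Vega 11: {vega11:.0f} GFLOPS")   # ~1760 GFLOPS
print(f"UHD 630: {uhd630:.0f} GFLOPS")   # ~403 GFLOPS
print(f"Ratio: {vega11 / uhd630:.1f}x")  # ~4.4x on paper
```

On paper the gap is even wider than Sandra's aggregate result suggests, which is typical: synthetic aggregate scores blend precisions and workloads where memory bandwidth and drivers, not just raw ALU throughput, set the ceiling.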
Now that we've taken a look at the theoretical power of these GPUs, let's see how they perform in real-world applications.