AMD Ryzen with Vega: The Ryzen We’ve Come to Like Gets Even Better

AMD launched both Ryzen and Vega last year, and we’ve been waiting a long time for a marriage of the two to come to fruition. While XDA’s own TK was at the Consumer Electronics Show this year, we heard it would finally happen: A Ryzen CPU with integrated Radeon Vega graphics! AMD’s only releasing two versions right now, and it’s touting both as top-to-bottom desktop processors with the best integrated graphics out there. Those are bold claims, to be sure, and there was only one way for us to find out if they were true.

On paper, AMD’s new chips look very similar to the recent 1300X, which we’ve reviewed and compared to the 1500X. The notable difference is the inclusion of Vega graphics. The 2400G has higher base/turbo clocks of 3.6/3.9 GHz versus the 3.5/3.7 GHz of the 1500X, and both new chips have reduced cache compared to their GPU-less counterparts. (Both the 2400G and 2200G have 4MB of L3 cache, while the 1300X has 8MB and the 1500X has 16MB.) Interestingly, while technically part of the new 2000 series, these Raven Ridge systems-on-chip are based on first-generation Ryzen architecture and not the upcoming second-generation series.

Unboxing & Photos

As shown in the video above, AMD sent us a new mITX motherboard and some G.Skill RAM along with two CPU samples. The reason for the mITX motherboard is simple: People who want a small, portable PC that can handle the most popular competitive video games on the go now have a solution. At a time when discrete GPUs are extremely difficult to find (not to mention incredibly expensive), it’ll no doubt be attractive to those on a budget. I could actually see many people opting for the new chips as cheap steps into Ryzen, much as was the case with the 1300X — especially with the 2400G retailing for $169 USD.

After the review benchmarks were done, I decided to put the 2400G into the Acer rebuild — we’ll see how it performs in the long run. (Don’t fret – that mITX motherboard will be put to work here shortly, as well.)

Here are a few more photos from the unboxing for your viewing pleasure.

Installation & Test Setup

We used the same open-air configuration as the 1300X review, because it was ready to go and would keep other variables from affecting the results. Since we wanted to see how the chips’ integrated graphics performed, we didn’t add a discrete GPU. The system ran the latest Ubuntu 17.10 image during our tests, though the graphics were initially stuck on LLVMpipe software rendering. We attempted to install the same 17.50 amdgpu-pro package that we used for our upcoming review of the PowerColor Red Devil Vega 56, but were unable to compile the module. We also had no luck with kernel 4.13, where we had gotten it working before, so we went with 4.15 instead (specifically 4.15.1) and added the missing Raven Ridge firmware afterward. This got both the Vega graphics and HDMI audio working, though not consistently.
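For readers attempting the same setup, the firmware step looked roughly like the sketch below. Treat this as a hedged outline rather than a recipe: the linux-firmware repository URL and the raven_*.bin file names reflect the standard upstream locations as we understand them, and the paths assume a stock Ubuntu install.

```shell
# Sketch: pull the upstream linux-firmware tree, copy the Raven Ridge
# (raven_*.bin) amdgpu firmware into place, then rebuild the initramfs so
# the files are available at early boot.
git clone git://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git
sudo cp linux-firmware/amdgpu/raven_*.bin /lib/firmware/amdgpu/
sudo update-initramfs -u
```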

When the system booted up correctly, it worked just fine, but it occasionally did not: It locked up at a black screen or displayed heavily distorted graphics, usually on the left half of the screen. When this happened, we simply rebooted, sometimes adding the “nomodeset” kernel flag manually; after a reboot from nomodeset, the system typically behaved properly. We suspect there’s more stabilization work to be done after release, but those familiar with building their own kernels should have no problem fine-tuning things into a stable state.
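For anyone unfamiliar with nomodeset: It’s a kernel command-line parameter, not a setting inside the OS. You can add it once from the GRUB boot menu, or, as a persistent fallback, in GRUB’s defaults file. A minimal sketch, assuming a stock Ubuntu/GRUB setup:

```
# /etc/default/grub (excerpt). After editing, apply with: sudo update-grub
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
```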

Unfortunately, we weren’t able to get Radeon Open Compute up and running. As of publication time, I was unable to get ROCm to compile or get the necessary module working. Previous documentation suggests the chips fully support it, but they probably need additional updates before they’re ready to go. We’ll keep an eye out to see if new firmware drops tomorrow, and update our review if it does.

Platform Configuration

Software/Operating System

  • Ubuntu 17.10 with 4.15.1 kernel and amdgpu firmware manually added to initramfs
  • Phoronix Test Suite (current version, installed via apt)

Testing Methodology

Our last two sets of benchmarks were run at stock speeds, and to keep things consistent, we went with stock settings here, as well. The AORUS X370 Gaming 5 offers some basic GPU overclocking, but I was unable to find any manual memory-allocation settings for the GPU. We continue to use the same suite of tests as we’ve documented and shared before, but since we tested the graphics performance of the chips, we added two games and two benchmarks to it.

We ended up doing a full reinstall of Ubuntu for these benchmarks, so there’s a chance that the software packages may not have lined up exactly with the previous install. As requested, we’ve been adding the “monitor=all” flag for Phoronix Test Suite, but I’ve yet to see details coming back from the sensors.
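For reference, the sensor logging mentioned above is enabled through an environment variable rather than a normal command-line switch. Our invocations looked something like the line below (the test profile name is only an example, and this assumes phoronix-test-suite is already installed):

```shell
# MONITOR=all asks Phoronix Test Suite's monitoring module to log every
# available sensor (CPU temperature, frequency, etc.) alongside the run.
MONITOR=all phoronix-test-suite benchmark pts/compress-gzip
```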

Non-Build Benchmarks

Benchmark Notes: Phoronix Test Suite’s CPU suite offers a plethora of tests, and not all are included in this review. (The full results from the HEDT review are available on OpenBenchmarking.org.) Benchmarks for last week’s 1300X review can be found here, and the benchmarks for the 2200G and 2400G can be found here. They’re color-grouped by chip pairs: Ryzen/Threadripper, Intel 7700K/8700K/6950X. I’ve included the newer Intel high-end desktop CPUs in a separate color scheme that more closely follows XDA’s performance graph style.


FFTW

FFTW is a single-threaded benchmark of the Fast Fourier transform. The 2200G and 2400G keep up with the pack here, which isn’t unexpected; the 1300X and 1500X did, too.


GZip Compression

GZip is a common compression scheme, and it’s a great test of day-to-day performance. We didn’t note much of a difference between the 2200G/2400G and 1300X/1500X, though the test seems to confirm that SMT helped out a little. Cache might have also played a role in the 1500X’s case, helping it beat out the 2400G.
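If you want a ballpark feel for this kind of workload on your own machine, a quick single-core gzip timing is easy to improvise. This is only a rough stand-in, not the actual PTS test profile, and the file size and paths are arbitrary:

```shell
# Create a compressible ~8 MB sample file, then time gzip on it.
# gzip is single-threaded, so this exercises one core, much like the PTS test.
dd if=/dev/zero of=/tmp/sample.bin bs=1M count=8 2>/dev/null
time gzip -9 -k -f /tmp/sample.bin   # -k keeps the original file around
ls -l /tmp/sample.bin.gz
```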

SciMark 2 (Java) v1.3.0

The SciMark 2 benchmark utilizes Java for arithmetic operations and generates a score based on those results. The 2200G and 2400G came up slightly short of the 1300X and 1500X, probably as a result of the cache difference between the chips.

John The Ripper

On the cryptography front, John The Ripper showed results in line with last week’s 1300X and 1500X tests. The higher clock speeds on the 2400G seem to have helped, though.

C-Ray v1.1

The C-Ray results from the group are very close to last week’s benchmarks, with the 2400G coming out on top again likely due to the higher clock speeds. Not bad so far!

Benchmarks: Build Performance

Build Test: ImageMagick

This is one place I expected the 2400G to do better, but that wasn’t the case. That’s probably because of its L3 cache disadvantage compared to the 1500X. Still, it’s competitive.

Build Test: GCC

In our GCC test, we see results similar to last week’s. Compared to the 1700X’s time of 902.06 seconds, the 2400G takes approximately 50% longer, and the 2200G takes nearly double that time.

Build Test: LineageOS cm-14.1 Pixel XL

We stuck with the Pixel XL for our timed Android build benchmarks. As requested by our readers, we’ve displayed build times both with and without caches. The results of the no-cache test follow a predictable trend: Losing the cache hurts the 2400G more than it does the 2200G.

Graphics Testing

I ran benchmark tests in both Unigine Heaven and Valley, and the results were quite good (albeit not spectacular) for integrated GPUs. I also tested Tomb Raider at the Normal graphics preset to see how it would fare at 1080p. Despite the fact that it’s an older game, it’s a good indicator of high-end gaming performance on Linux.

Unigine Heaven Benchmarks (Extreme Preset)

Ryzen 2200G: Min 8 fps, Max 25.9 fps, Avg 12.9 fps
Ryzen 2400G: Min 5.7 fps, Max 29.5 fps, Avg 14.4 fps

Unigine Valley Benchmarks (Extreme HD Preset)

Ryzen 2200G: Min 5.5 fps, Max 16.5 fps, Avg 9.3 fps
Ryzen 2400G: Min 5.3 fps, Max 19.1 fps, Avg 10.6 fps

Tomb Raider (2013) – Normal Preset

Ryzen 2200G: Min 39.7 fps, Max 57.5 fps, Avg 48.1 fps
Ryzen 2400G: Min 46.1 fps, Max 67.6 fps, Avg 56.1 fps

While Rocket League doesn’t have a built-in benchmarking test, I gave it a go because it falls under the category of games that AMD’s targeting with these chips: Popular, casual titles like Overwatch, League of Legends, Hearthstone, and so on. Surprisingly, I didn’t have to switch to the lowest settings to achieve a very smooth 45-60 frames per second with both the 2200G and 2400G. It was certainly playable, albeit at a slightly lower frame rate than I was used to.

I’m not sure I’d be ready to call the new chips “console killers” just yet, but when it comes to integrated graphics, they deliver on AMD’s promise of beating out others in their price brackets. They can definitely play most games at 1080p, and that’s great news for LAN party folks considering small, GPU-free PC builds that don’t compromise on performance.


AMD’s new CPUs technically kick off the second-generation Ryzen desktop lineup, and the company’s come out of the gate swinging. I personally hope it’ll consider adding 6- and 8-core variants that pack a bigger punch, but I have to wonder if die space is an issue, especially after seeing how Intel has stuffed Vega graphics into its upcoming embedded chip lineup. As far as AMD’s processors go, though, the new chips do what AMD continues to do best: Offer attractive performance at a very affordable price point. It goes to show that Vega, even with a reduced core count, can still deliver plenty of graphics horsepower.

At $169, it’s hard to find anything to even compare the Ryzen 5 2400G against, and at $99, the 2200G has me floored. If you’re looking to upgrade a PC on a very tight budget, I’d even suggest skipping the 1300X/1500X right now and considering one of these two; they’re a great way to step into Ryzen without breaking the bank. Since a good chunk of PC games should be playable at 1080p, they’ll also hold their own against entry-level discrete GPUs, which should be music to the ears of any PC enthusiast who’s been burned by today’s inflated graphics card market.

I’m looking forward to seeing Ryzen Pro variants of both of these chips. At my previous job, I used to buy discrete GPUs for the sole purpose of driving multi-monitor workloads on the Intel Core i3-based machines we used. These chips are a perfect all-in-one solution to that kind of problem, and IT managers would do well to pay attention.

All in all, AMD continues its bang-for-your-buck trend with Ryzen. If the results I’ve seen from the chips are any indication, I have no doubt that the PC enthusiast market will reward it.

Editor’s Note: The Ryzen 3 2200G and Ryzen 5 2400G were provided to XDA by AMD for review purposes.

About author

Daniel Moran

Former PC Hardware Editor for XDA.