Subject: Processors | March 2, 2017 - 11:29 AM | Ryan Shrout
Tagged: amd, ryzen, gaming, 1080p
One of the most interesting, and concerning, results from today's launch of the AMD Ryzen processor is gaming performance. Many other reviewers have seen results similar to what I published in my article this morning: gaming at 1080p, even at "ultra" image quality settings, shows a performance deficit in many top games compared to Intel Kaby Lake and Broadwell-E processors.
I shared my testing results with AMD over a week ago, trying to get answers and hoping to find some instant fix (a BIOS setting, a bug in my firmware). As it turns out, that wasn't the case. To be clear, our testing was done on the ASUS Crosshair VI Hero motherboard with the 5704 BIOS, and any reports claiming that the deficits exist only on ASUS products are incorrect.
AMD responded to the issues late last night with the following statement from John Taylor, CVP of Marketing:
“As we presented at Ryzen Tech Day, we are supporting 300+ developer kits with game development studios to optimize current and future game releases for the all-new Ryzen CPU. We are on track for 1000+ developer systems in 2017. For example, Bethesda at GDC yesterday announced its strategic relationship with AMD to optimize for Ryzen CPUs, primarily through Vulkan low-level API optimizations, for a new generation of games, DLC and VR experiences.
Oxide Games also provided a public statement today on the significant performance uplift observed when optimizing for the 8-core, 16-thread Ryzen 7 CPU design – optimizations not yet reflected in Ashes of the Singularity benchmarking. Creative Assembly, developers of the Total War series, made a similar statement today related to upcoming Ryzen optimizations.
CPU benchmarking deficits to the competition in certain games at 1080p resolution can be attributed to the development and optimization of the game uniquely to Intel platforms – until now. Even without optimizations in place, Ryzen delivers high, smooth frame rates on all “CPU-bound” games, as well as overall smooth frame rates and great experiences in GPU-bound gaming and VR. With developers taking advantage of Ryzen architecture and the extra cores and threads, we expect benchmarks to only get better, and enable Ryzen excel at next generation gaming experiences as well.
Game performance will be optimized for Ryzen and continue to improve from at-launch frame rate scores.” John Taylor, AMD
The statement begins with Taylor reiterating AMD's momentum in supporting developers from both a GPU and a CPU technology angle. Getting hardware into the hands of programmers is the first and most important step toward finding and fixing any problem areas that Ryzen might have, so this is a great move to see taking place. Both Oxide Games and Creative Assembly, developers of Ashes of the Singularity and the Total War series respectively, have publicly stated their intent to demonstrate improved threading and performance on Ryzen platforms very soon.
Taylor then acknowledges the performance concerns at 1080p, attributing those deficits to years of optimization for Intel processors. It's difficult, if not impossible, to know for sure how much weight this argument carries, but it makes some logical sense. Intel CPUs have been the automatic, de facto standard for gaming PCs for many years, and any performance optimization and development work would have been done on those same Intel processors. So it seems plausible that simply seeding Ryzen hardware to developers, and having them look at performance as development goes forward, would improve AMD's situation.
For buyers gaming at 1080p today, the situation is likely to remain as we have presented it. Until games get patched, or new games are released by developers that have had hands-on time with Ryzen, performance is unlikely to change via some single setting or feature that AMD or its motherboard partners can enable.
The question I would love answered is why this is happening at all. What architectural difference between Core and Zen is contributing to this delta? Is it fundamental to the pipeline design, to the caching structure, or to how SMT is implemented? Does Windows 10's handling of kernel processes have something to do with it? There is a lot to figure out as testing moves forward.
If you want to see the statements from both Oxide and Creative Assembly, they are provided below.
“Oxide games is incredibly excited with what we are seeing from the Ryzen CPU. Using our Nitrous game engine, we are working to scale our existing and future game title performance to take full advantage of Ryzen and its 8-core, 16-thread architecture, and the results thus far are impressive. These optimizations are not yet available for Ryzen benchmarking. However, expect updates soon to enhance the performance of games like Ashes of the Singularity on Ryzen CPUs, as well as our future game releases.” - Brad Wardell, CEO Stardock and Oxide
"Creative Assembly is committed to reviewing and optimizing its games on the all-new Ryzen CPU. While current third-party testing doesn’t reflect this yet, our joint optimization program with AMD means that we are looking at options to deliver performance optimization updates in the future to provide better performance on Ryzen CPUs moving forward. " – Creative Assembly, Developers of the Multi-award Winning Total War Series
Subject: General Tech | March 1, 2017 - 03:46 PM | Jeremy Hellstrom
Tagged: Tegra X1, Nintendo Switch, Joy-Con, gaming
The Nintendo Switch has arrived for those who feel that mobile gaming is lacking in analog joysticks and buttons. The product sits in an interesting place: the 720p screen is nowhere near the resolution of modern phones, but those phones lack a dock that triggers an overclocked mode to send 1080p to a TV. Nintendo's programming teams also have far more resources than most mobile app developers, and they can incorporate tricks that a phone simply will not be able to replicate. Ars Technica took the Switch, its two Joy-Cons, and the limited number of launch games on a tour to see just how well Nintendo did with its new portable gaming system. There are improvements that could be made, but the Joy-Cons do sound more interesting than the Game Boy Advance.
"With the Switch, Nintendo seems to be betting that the continued drum beat of Moore's Law and miniaturization has made that dichotomy moot. The Switch is an attempt to drag the portable gaming market kicking and screaming to a point where it's literally indistinguishable from the experience you'd get playing on a 1080p HDTV."
Here is some more Tech News from around the web:
- XC’mon: Xenonauts 2 releasing free dev builds @ Rock, Paper, SHOTGUN
- Galactic Civilizations III v2.0 Review @ OCC
- Stellaris’ New Horizons mod is the best Star Trek game @ Rock, Paper, SHOTGUN
- Wot I Think – Torment: Tides of Numenera @ Rock, Paper, SHOTGUN
- ARMA Humble Bundle
- Total Warhammer DLC ended with Bretonnia as devs switch to sequel @ Rock, Paper, SHOTGUN
- Radeon vs. NVIDIA Performance For HITMAN On Linux With 17 GPUs @ Phoronix
Subject: Motherboards | February 26, 2017 - 01:29 AM | Tim Verry
Tagged: x370, sli, ryzen, PCI-E 3.0, gaming, crossfire, b350, amd
Computerbase.de recently published an update (translated) to an article outlining the differences between AMD’s AM4 motherboard chipsets. As it stands, the X370 and B350 chipsets are set to be the most popular chipsets for desktop PCs (with X300 catering to the small form factor crowd), especially among enthusiasts. Initially, one key differentiator between the two chipsets was X370's support for multi-GPU configurations. Now that motherboards have been revealed and are up for pre-order, it turns out that the multi-GPU lines have been blurred a bit: both B350 and X370 will support AMD’s CrossFire multi-GPU technology, while X370 alone will also support NVIDIA’s SLI technology.
The AM4 motherboards equipped with the B350 and X370 chipsets that feature two PCI-E x16 expansion slots will run each slot at x8 in a dual GPU setup. (In a single GPU setup, the top slot can run at full x16 speed.) Which is to say that the slots behave the same across both chipsets. Where the chipsets differ is in support for specific GPU technologies: NVIDIA’s SLI is locked to X370. TechPowerUp speculates that the decision to lock SLI to the top-end chipset is due, at least in part, to licensing costs. This is not a bad thing, as B350 was originally not going to support any dual x16 slot multi-GPU configurations at all; now motherboard manufacturers are being allowed to enable it by including a second slot, and AMD will reportedly permit CrossFire usage (which costs AMD nothing in licensing). Meanwhile the more expensive X370 chipset will support SLI for those serious gamers that demand it and can afford it. Had B350 boards supported SLI and carried the SLI branding, they likely would have been slightly more expensive than they are now. Of course, DirectX 12's multi-adapter will work on either chipset so long as the game supports it.
| | X370 | B350 | A320 | X300 / B300 / A300 | Ryzen CPU | Bristol Ridge APU |
| --- | --- | --- | --- | --- | --- | --- |
| PCI-E 3.0 lanes | 0 | 0 | 0 | 4 | 20 (18 w/ 2 SATA) | 10 |
| USB 3.1 Gen 2 | 2 | 2 | 1 | 1 | 0 | 0 |
| USB 3.1 Gen 1 | 6 | 2 | 2 | 2 | 4 | 4 |
| SATA 6 Gbps | 4 | 2 | 2 | 2 | 2 | 2 |
| Overclocking capable? | Yes | Yes | No | Yes (X300 only) | — | — |
Multi-GPU support is not the only differentiator though. Moving up from B350 to X370 gets you 6 USB 3.1 Gen 1 (USB 3.0) ports from the chipset versus 2 on B350/A320/X300, two more PCI-E 2.0 lanes (8 versus 6), and two more SATA ports (6 total usable; 4 versus 2 coming from the chipset).
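Since total platform I/O is the sum of what the Ryzen CPU provides and what the chipset adds, the table above reduces to a quick calculation. The sketch below uses the port counts from that table; treat it as a rough guide, since motherboard makers do not always expose every available port:

```python
# Rough platform I/O totals: CPU-provided ports plus chipset-provided ports,
# using the numbers from the chipset table above (boards may expose fewer).
ryzen_cpu = {"usb31_gen1": 4, "sata": 2}

chipsets = {
    "X370": {"usb31_gen1": 6, "sata": 4},
    "B350": {"usb31_gen1": 2, "sata": 2},
}

def platform_totals(chipset: str) -> dict:
    """Sum CPU and chipset port counts for a given AM4 chipset."""
    c = chipsets[chipset]
    return {k: ryzen_cpu[k] + c[k] for k in ryzen_cpu}

print(platform_totals("X370"))  # {'usb31_gen1': 10, 'sata': 6}
print(platform_totals("B350"))  # {'usb31_gen1': 6, 'sata': 4}
```

The X370 total of 6 usable SATA ports matches the "6 total usable" figure above.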
Note that X370, B350, and X300 all support CPU overclocking. Hopefully this helps when you are trying to decide which AM4 motherboard to pair with your Ryzen CPU once the independent benchmarks are out. In short, if you must have SLI you are stuck ponying up for X370, but if you plan to only ever run a single GPU, or tend to stick with AMD GPUs and CrossFire, B350 gets you most of the way to an X370 for a lot less money! You do not even have to give up any USB 3.1 Gen 2 ports, though you do limit your SATA drive options (it’s all about M.2 these days anyway, heh).
For those curious, looking around on Newegg I notice that most of the B350 motherboards have that second PCI-E 3.0 x16 slot and CrossFire support listed in their specifications, and they seem to average around $99. Meanwhile X370 starts at $140 and rockets up from there (up to $299!) depending on how much bling you are looking for.
Are you going for a motherboard with the B350 or X370 chipset? Will you be rocking multiple graphics cards?
- AMD Ryzen Pre-order Starts Today, Specs and Performance Revealed
- AMD Launching Ryzen 5 Six Core Processors Soon (Q2 2017)
- AMD Details AM4 Chipsets and Upcoming Motherboards
- AMD Officially Launches Bristol Ridge Processors And Zen-Ready AM4 Platform
- Biostar Launches X370GT7 Flagship Motherboard For Ryzen CPUs
- Gigabyte is Ryzen up to the challenge of their rivals
- Mid-Range Gigabyte Socket AM4 (B350 Chipset) Micro ATX Motherboard Pictured
- CES 2017: MSI Shows Off X370 XPower Gaming Titanium AM4 Motherboard
Subject: Processors | February 24, 2017 - 02:17 AM | Tim Verry
Tagged: Zen, six core, ryzen 5, ryzen, hexacore, gaming, amd
While AMD's Ryzen lineup and pricing have leaked out, only the top three Ryzen 7 processors are available for pre-order (with availability on March 2nd). Starting at $329 for the eight core, sixteen thread Ryzen 7 1700, these processors are aimed squarely at enthusiasts craving top-end performance. Enthusiasts looking for cheaper, better price/performance options for budget gaming and work machines will have to wait a bit for Ryzen 5 and Ryzen 3, which will reportedly launch in the second quarter and second half of 2017 respectively. Two six core Ryzen 5 processors will launch somewhere between April and June, with the Ryzen 3 quad cores (along with mobile and "Raven Ridge" APU parts) following in the summer to end-of-year timeframe, hopefully hitting the back-to-school and holiday shopping windows respectively.
Thanks to leaks, we know the two six core Ryzen 5 CPUs: the Ryzen 5 1600X at $259 and the Ryzen 5 1500 at $229. The Ryzen 5 1600X is a 95W TDP CPU with six cores and twelve threads running at 3.6 GHz base and 4.0 GHz boost with 16MB of L3 cache. AMD is pitting this chip against the Intel Core i5 7600K, a $240 quad core Kaby Lake part sans Hyper-Threading. Meanwhile, the Ryzen 5 1500 is a 65W processor clocked at 3.2 GHz base and 3.5 GHz boost, also with 16 MB of L3 cache.
Note that the Ryzen 5 1600X features AMD's XFR (extreme frequency) technology which the Ryzen 5 1500 lacks. Both processors are unlocked and can be overclocked, however.
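To put the matchup in perspective, a quick back-of-the-envelope calculation of cost per hardware thread, using the leaked prices and core/thread counts quoted above (not official figures, and threads are obviously not the whole performance story):

```python
# Cost per hardware thread, using the leaked pricing and specs quoted
# above. Threads are a crude proxy; single-threaded speed also matters.
parts = {
    "Ryzen 5 1600X": {"price": 259, "threads": 12},  # 6C/12T, leaked price
    "Core i5 7600K": {"price": 240, "threads": 4},   # 4C/4T, no Hyper-Threading
}

for name, p in parts.items():
    per_thread = p["price"] / p["threads"]
    print(f"{name}: ${per_thread:.2f} per thread")
# Ryzen 5 1600X: $21.58 per thread
# Core i5 7600K: $60.00 per thread
```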
Interestingly, Antony Leather over at Forbes managed to acquire some information on how AMD is making these six core parts. According to his source, AMD is disabling one core (and its accompanying L2 cache) from each four core Core Complex (CCX). Doing it this way (rather than taking two cores from one CCX) should keep things balanced. It also allows AMD to keep all of the processor's 16MB of L3 cache enabled, and each of the remaining three cores in each complex will be able to access the L3 cache as normal. Previous rumors had suggested that the CCXes were "indivisible" and six cores were not possible, but it appears that AMD is able to safely disable at least one core of a complex without compromising the whole thing. I doubt we will be seeing any odd-numbered core count CPUs from AMD though (like its old tri-core parts that could sometimes later be unlocked). I am glad that AMD was able to create six core parts while leaving the entire L3 cache intact.
What is still not clear is whether these six core Ryzen 5 parts are made by physically disabling the core in the complex, or whether the cores are simply disabled/locked out in microcode or the BIOS/UEFI. It would be awesome if, in the future, when yields reach the point where binning is more about product segmentation than actual defects, those six core processors could be unlocked!
The top end Ryzen 7 processors are looking to be great performers and a huge leap over Excavator, while at least competing with Intel's latest in multi-threaded performance. (I will wait for independent benchmarks for single threaded performance, where even AMD's own scores are close, though these benchmark runs look promising.) These parts are relatively expensive though, and the cheaper Ryzen 5 and Ryzen 3 (and Raven Ridge APUs) are where AMD will see the most potential sales thanks to a much bigger market. I am looking forward to seeing more information on the lower end chips and how they will stack up against Intel, which seems to be shifting into high gear with moves like enabling Hyper-Threading on lower end Kaby Lake Pentiums and possibly on new Core i5s (still merely a rumor). Intel certainly seems to be taking notice of Ryzen, and the reignited competition in the desktop processor space is very promising for consumers!
Are you holding out for a six core or quad core Ryzen CPU or are you considering a jump to the high-end Ryzen 7s?
Subject: General Tech | February 22, 2017 - 02:04 PM | Jeremy Hellstrom
Tagged: gaming, dawn of war III, wauughh
Dawn of War certainly changes from version to version. The first offered standard RTS fare: build bases and upgrade using resources collected on the map. The second was more squad based, with a hero leading meatshields into the fray. The third incarnation seems to build off the gameplay of the second, with at least some base and resource management making a comeback.
New to this installment are superunits, extremely large and destructive units which you gain access to as you take over portions of the map. Details are still a bit light, but the game engine certainly looks pretty. You can pop by Rock, Paper, SHOTGUN to find a few of the older teaser trailers.
"This cinematic-o-gameclip video introduces the broad story in Relic’s RTS and yes, it does basically boil down to finding a pointy stick. But what better item to fight over? If you can win a fight without a pointy stick, just imagine how powerful you’ll be once you get one!"
Here is some more Tech News from around the web:
- Humble Civ Bundle
- Half-Life 2: Episode 3 creator reflects on reception, one year after release @ Rock, Paper, SHOTGUN
- Horizon Zero Dawn is the best robot-safari adventure game ever made @ Ars Technica
- Total Warhammer shows off free Bretonnia faction DLC @ Rock, Paper, SHOTGUN
- Valve's Gabe Newell Says Only 30 SteamVR Apps Have Made $250,000+ @ Slashdot
- Let’s chatter over… Prey @ Rock, Paper, SHOTGUN
- Sniper Elite 4: Performance Analysis @ techPowerUp
- Pew pew! Mass Effect Andromeda trailer details combat @ Rock, Paper, SHOTGUN
Is Mechanical Mandatory?
The Logitech G213 Prodigy gaming keyboard offers the company's unique Mech-Dome keys and customizable RGB lighting effects, and it faces stiff competition in a market overflowing with gaming keyboards for every budget (including mechanical options). But it really comes down to performance, feel, and usability, and I was interested in giving these new Mech-Dome keys a try.
“The G213 Prodigy gaming keyboard features Logitech Mech-Dome keys that are specially tuned to deliver a superior tactile response and performance profile similar to a mechanical keyboard. Mech-Dome keys are full height, deliver a full 4mm travel distance, 50g actuation force, and a quiet sound operation.
The G213 Prodigy gaming keyboard was designed for gaming, featuring ultra-quick, responsive feedback that is up to 4x faster than the 8ms report rate of standard keyboards and an anti-ghosting matrix that keeps you in control when you press multiple gaming keys simultaneously.”
I will say that at $69.99 the G213 occupies a somewhat odd space in the current gaming keyboard market. It would be well positioned in a retail setting; at a local Best Buy it would be a compelling option against a $100+ mechanical keyboard. But savvy internet shoppers will see the growing number of sub-$70 mechanical keyboards available and might question the need for a ‘quasi-mechanical’ option like this. I don’t review products from a marketing perspective, however, and I simply set out to determine whether the G213 is a well-executed product on the hardware front.
Subject: General Tech | February 15, 2017 - 02:33 PM | Jeremy Hellstrom
Tagged: gaming, fallout 4
[H]ard|OCP took a look at the effect Fallout 4's gigantic high resolution texture pack has on system performance in their latest article. For those who want the answer immediately, the largest amount of VRAM they saw utilized was a hair over 5GB, in most cases more than double what the default textures use. This certainly suggests that those with 4GB cards should reconsider installing the texture pack, and that a 6GB card shouldn't see performance impacts. As for the performance deltas, well, we can't spoil their entire review!
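If you want to sanity-check your own card's headroom before grabbing the texture pack, NVIDIA owners can poll `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader` and compare usage against capacity. The helper below just parses that CSV output; consider it a sketch (the query flags are standard nvidia-smi options, but the 90% warning threshold is an arbitrary choice of ours):

```python
# Sketch: parse one line of nvidia-smi CSV output like "5123 MiB, 6144 MiB"
# and flag cards whose VRAM usage is getting close to capacity.
def parse_mib(field: str) -> int:
    """Turn a field like '5123 MiB' into an integer MiB count."""
    return int(field.strip().split()[0])

def vram_headroom(csv_line: str, warn_ratio: float = 0.9):
    used_s, total_s = csv_line.split(",")
    used, total = parse_mib(used_s), parse_mib(total_s)
    return used, total, (used / total) >= warn_ratio

# Example using the ~5GB peak [H]ard|OCP measured, on a 6GB card:
print(vram_headroom("5123 MiB, 6144 MiB"))  # (5123, 6144, False)
```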
"Bethesda has released its official High Resolution Texture Pack DLC for Fallout 4. We will look at performance impact, VRAM capacity usage levels, and compare image quality to see if this High Resolution Texture Pack might be worthy of your bandwidth in actually improving the gameplay experience."
Here is some more Tech News from around the web:
- The Humble Freedom Bundle
- Giveaway: 280 games from GOG’s Valentine’s sale @ Rock, Paper, SHOTGUN
- Drokk! Rebellion has opened up its 2000AD character canon for games developers @ The Inquirer
- Best total conversion mods @ Rock, Paper, SHOTGUN
- 14-Way NVIDIA GPU Comparison With Civilization VI On Linux @ Phoronix
- White Wolf and the World of Darkness revival: “Asking ‘when will you rage?’ has never been more relevant” @ Rock, Paper, SHOTGUN
- Prey: hands-on with Arkane’s shapeshifting shooter @ Rock, Paper, SHOTGUN
Subject: General Tech | February 8, 2017 - 02:24 PM | Jeremy Hellstrom
Tagged: gaming, bards tale, inxile
inXile has been very busy recently, doing a stellar job of resurrecting Wasteland as a new, modern RPG which will soon see its third incarnation released. The long anticipated Torment: Tides of Numenera arrives at the end of this month; the beta has been a tantalizing taste, as was the YouTube 'choose your own adventure' teaser. There is another project they have been working on: bringing the old Bard's Tale games into the modern era. A trailer showing in-game footage, including combat, has just been released, which you can see over at Rock, Paper, SHOTGUN. It certainly doesn't look like the Bard's Tale of old!
"On the game HUD, you can see your party occupying 2 rows of 4 spaces each. Enemies will line up on the opposite grid with the same number of slots. The exact positioning of enemies, as well as your own party, will determine which attacks can land, and which will swing wild past their mark."
Here is some more Tech News from around the web:
- Necromunda has bullet casings the size of tower blocks @ Rock, Paper, SHOTGUN
- Starship Commander VR game is purely voice controlled @ HEXUS
- Skyblivion looks even more like Oblivion rebuilt in Skyrim @ Rock, Paper, SHOTGUN
- The Radeon FreeSync 2 HDR Gaming Tech Report @ TechARP
- The Surge brings limb-theft to the Dark Souls party @ Rock, Paper, SHOTGUN
- Stellaris: Utopia expansion to blast off with Banks update @ Rock, Paper, SHOTGUN
Subject: General Tech | February 7, 2017 - 03:01 PM | Jeremy Hellstrom
Tagged: gaming, Star Wars, humble bundle
Do you like Star Wars games, PCPer, and Unicef? If so, there is a Humble Bundle perfect for you, running for the next two weeks. Depending on how much you pay, you can get up to 15 games and an X-Wing versus TIE Fighter t-shirt, with a percentage of your purchase helping us continue to provide the content you love. There is some overlap with previous bundles you may have picked up, but for those of you missing KOTOR 1 or 2, The Force Unleashed 1 or 2, Shadows of the Empire, or even the second Star Wars Battlefront game, it is well worth the cost.
How can you resist that t-shirt?
Subject: General Tech | February 4, 2017 - 02:25 AM | Tim Verry
Tagged: usb 3.0, sony, ps4 pro, ps4, gaming, console
Sony is taking the wraps off its latest firmware with the release of the version 4.50 “Sasuke” beta firmware for the PS4. With the new firmware, Sony is rolling out a number of UI/UX improvements, and users will finally be able to use external storage with the game console. On the PS4 Pro front, Sony will be adding a “boost mode” in a future update (it may not be ready in time for the production 4.50 release) that lets legacy games access the additional GPU horsepower of the Pro version of the console to smooth out frame rates without needing special patches from the game developers.
The new firmware adds support for USB 3.0 hard drives (or SSDs) up to 8TB. Users will be able to use the external storage to store games, downloaded applications, screenshots, and videos and have it all show up on the main system menu along with the local storage. Users will not need to shuffle game data back and forth in order to play their games either. Note that currently, the actual save game data is still stored locally even if the game itself is stored on the external hard drive. Fans of the PlayStation VR (PS VR) also get an update with firmware 4.50 in the form of support for watching 3D Blu-rays. Beyond those big feature updates, Sony is also changing up the interface slightly. The Quick Menu now takes up less screen space and will allow gamers to create and join parties right from there rather than going to a separate app. In the notification area, Sony has condensed all the various notification types into a single unified list. Further, users will be able to set in-game screenshots as the home screen wallpaper.
Perhaps most interesting is the planned “boost mode” for the PS4 Pro, which is currently in beta. Gamers are reporting that titles such as The Evil Within and Just Cause 3 see significantly smoother frame rates with noticeably reduced stuttering. Reportedly, the boost mode will work with most PS4 games that were programmed with unlocked frame rates, though the exact benefits will vary. Games that have a hard cap on the frame rate will still need specific patches from the game developers to see any improvements. Ars Technica speculates that the “boost mode” is simply Sony removing the blocks it put in place to force compatibility with older games that were developed with the base PS4 in mind. When boost mode is off, part of the PS4 Pro GPU is turned off such that it functions exactly like the PS4’s GPU; activating boost mode removes the blocks and allows the full GPU (with its 36 CUs) to process the game data as best it can. Getting things like native higher resolutions or more detailed textures will still require patches, of course.
If you have a PS4 or PS4 Pro, keep an eye on the beta Sasuke 4.50 firmware.
- Console Gaming on the PC: PS4 Remote Play vs. Xbox One Streaming
- PS4 Remote Play Now Available On PCs and Macs With 3.50 Firmware Update