Introduction and Technical Specifications
Courtesy of GIGABYTE
The GIGABYTE AX370-Gaming 5 board features a matte black PCB with a white armor overlay protecting the rear panel and audio components. In line with their AORUS Intel boards, GIGABYTE spread RGB LEDs throughout the board's surface, configurable via the UEFI or the Windows app. The board supports the AMD Ryzen processor line and dual-channel DDR4 memory via the AMD X370 chipset. The AX370-Gaming 5 motherboard can be found at most retailers with an MSRP of $194.99.
The following features have been integrated into the board: four SATA III 6Gbps ports; two SATA-Express ports; an M.2 PCIe Gen3 x4 32Gbps capable port; a U.2 PCIe Gen3 x4 32Gbps capable port; dual RJ-45 ports featuring an Intel I211-AT Gigabit NIC and a Rivet Networks Killer E2500 NIC; three PCI-Express x16 slots; three PCI-Express x1 slots; dual Realtek audio CODECs; an integrated HDMI video port; and USB 2.0, 3.0, and 3.1 Type-A and Type-C port support.
To power the board, GIGABYTE integrated a 10-phase (6+4) digital power delivery system into the AX370-Gaming 5's design. The digital power system was built around IR digital power controllers and PowIRstage ICs, Server Level Chokes, and Durable Black capacitors - the same components used to great effect on their AORUS Intel boards.
GIGABYTE integrated a variety of fan headers and temperature sensors into the board, with temperature sensors in the CPU socket, VRMs, and chipset. Additionally, there are monitored fan headers spread throughout the board's surface, all supporting high-current devices (fans or water pumps) rated for up to 24W (2A at 12V).
A New Frontier
Console game performance has always been an area of interest for us here at PC Perspective, but it has mostly been out of our reach to evaluate with any kind of scientific rigor. Our Frame Rating methodology for PC-based game analysis relies on an overlay application running during screen capture, which is later analyzed by a series of scripts. Obviously, we cannot take this approach with consoles, as we cannot install our own code on them to run that overlay.
A few other publications such as Eurogamer with their Digital Foundry subsite have done fantastic work developing their internal toolsets for evaluating console games, but this type of technology has mostly remained out of reach of the everyman.
Recently, we came across an open source project which aims to address this. trdrop is open source software built upon OpenCV, a stalwart library in the world of computer vision. Using OpenCV, trdrop can analyze the frames of ordinary gameplay footage (without an overlay), detecting differences between consecutive frames and looking for dropped frames and tears to compute a real-time frame rate.
This means that trdrop can analyze gameplay footage from any source - console, PC, or anything in between - from which you can get a direct video capture feed. Now that capture cards capable of 1080p60, and even 4K60, are coming down in price, software like this allows more gamers to peek at the performance of their games, which we think is always a good thing.
It's worth noting that trdrop is still listed as "alpha" software on its GitHub repo, but we have found it to be very stable and flexible in its current iteration.
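To illustrate the approach, here is a minimal sketch of duplicate-frame detection - our own simplified illustration, not trdrop's actual code. OpenCV's `absdiff` performs the same per-pixel comparison; plain NumPy is used here to keep the example self-contained:

```python
import numpy as np

def effective_fps(frames, capture_fps=60, threshold=1.0):
    """Estimate the effective frame rate from a captured frame sequence.

    A frame counts as 'new' if it differs from the previous frame by more
    than `threshold` mean absolute pixel difference; otherwise the capture
    recorded a duplicate (the game repeated a frame).
    """
    new_frames = 1  # the first frame always counts
    for prev, cur in zip(frames, frames[1:]):
        diff = np.abs(cur.astype(np.int16) - prev.astype(np.int16)).mean()
        if diff > threshold:
            new_frames += 1
    duration_s = len(frames) / capture_fps
    return new_frames / duration_s

# Synthetic 60fps capture of a game rendering at 30fps: the capture
# card records every rendered frame twice.
rng = np.random.default_rng(0)
unique = [rng.integers(0, 255, (8, 8), dtype=np.uint8) for _ in range(30)]
captured = [f for f in unique for _ in range(2)]  # 60 captured frames
print(effective_fps(captured))  # → 30.0
```

Real footage adds complications this sketch ignores - compression noise (handled by the threshold) and partial frames from tearing - but the core idea is exactly this frame-to-frame comparison.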
|Xbox One S||Xbox One X||PS4||PS4 Pro|
|GPU CUs||12x GCN||40x GCN||18x GCN||36x GCN|
|Peak Compute||1.4 TF||6.0 TF||1.84 TF||4.2 TF|
|Memory||8 GB DDR3||12 GB GDDR5||8 GB GDDR5||8 GB GDDR5|
Now that the Xbox One X is out, we figured it would be a good time to take a look at the current generation of consoles and their performance in a few games as a way to get our feet wet with this new software and method. We are only testing 1080p here, but we now have our hands on a 4K HDMI capture card capable of 60Hz for some future testing! (More on that soon.)
Keeping a Low Profile
Havit is a Chinese company with a unique product for the enthusiast PC segment: the thinnest mechanical keyboard on the market at 22.5 mm. Their slim HV-KB395L keyboard offers real mechanical switching via Kailh low-profile blue switches, and full RGB lighting is thrown in for good measure. For a keyboard that retails for $79.99 this is certainly an interesting mix, but how in the world does low-profile mechanical feel? I will attempt to translate that experience into words (by… typing words).
- 104-key Mechanical Keyboard
- Customizable RGB backlighting
- Kailh PG1350 Low Profile Blue Switch
- 3mm of total travel, 45g of operating force
- N-Key Rollover
- Detachable USB Cable
- Weight: 0.57 kg
- Dimensions: 43.6 x 12.6 x 2.25 cm
First impressions of the keyboard are great, with nice packaging that cradles the keyboard in a carton inside the box. The keyboard itself feels quite premium, with a top panel that is actually metal - unusual for this price-point.
Introduction and Case Exterior
The In Win 301 is a mini tower case with a tempered glass side panel that sells for less than $70. How good is it? Dollar for dollar it could be the best affordable case on the market right now. That's a pretty bold statement, and you'll just have to read the whole review to see if I'm right.
In Win is one of the most unique enclosure makers in the industry, with designs running from elegant simplicity to some of the most elaborate and expensive cases we’ve ever seen. Though well-known for the striking tou 2.0 and the show-stopping (and motorized) H-Frame, in recent years In Win has expanded its offering in the affordable enclosure space, and there is no better example of this than the case we have for you today.
The 301, smaller sibling to the 303, is beautiful in its simplicity, thoughtfully designed for ease of use (as we will see here), and very affordable - even with its tempered-glass side panel, a signature of In Win enclosures. Sound too good to be true? It is limited to micro-ATX and mini-ITX motherboards, but if you’re looking for an option for a small form-factor build with room for full-sized components, this might just end up on your short list. Let’s take a close look at this stylish mini-tower case!
Is this the new budget champion?
True to their name, Corsair's new HS50 STEREO gaming headsets offer traditional 2-channel sound from a similarly traditional headphone design. They are certainly ready for gaming, with a detachable microphone and universal compatibility with both PCs and consoles, and budget-friendly with an MSRP of only $49.99. How do they stack up? Let's find out!
- Driver: 50mm Neodymium
- Frequency Response: 20Hz – 20kHz
- Impedance: 32 Ohms @ 1kHz
- Sensitivity: 111 dB (± 3 dB)
- Mic Type: Unidirectional noise-cancelling
- Mic Impedance: 2.0k Ohms
- Mic Frequency Response: 100Hz – 10kHz
- Mic Sensitivity: -40 dB (± 3 dB)
- Dimensions (LxWxH): 160 x 100 x 205 mm
- Weight: 319g
- Warranty: 2 years
- Available Colors: Carbon, Green, Blue
- Corsair HS50 STEREO Gaming Headset: $49.99, Amazon.com
Nothing about these says “budget” when you look at the packaging and first unbox them, and they have the substantial feel of a pair of premium headphones - not at all like an inexpensive gaming headset.
Ultimate Cord Cutting Guide - Part 2: Installation & Configuration
We're back with Part 2 of our cord cutting series, documenting our experience with dumping traditional cable and satellite providers in exchange for cheaper and more flexible online and over-the-air content. In Part 1 we looked at the devices that could serve as our cord-cutting hub, the types of subscription content that would be available, and the options for free OTA and online media.
In the end, we selected the NVIDIA SHIELD as our central media device due to its power, capabilities, and flexibility. Now in Part 2 we'll walk through setting up the SHIELD, adding our channels and services, configuring Plex, and more!
Introduction and Features
It has been several years since we looked at a true SFX form factor power supply, but today we are going to take a detailed look at one of SilverStone’s new SFX units, the SX650-G. As one of the original manufacturers of SFX power supplies, SilverStone Technology Co. is meeting demand with new products, continuing to expand its offering with two new SFX units, the SX500-G and SX650-G.
(SX = SFX Form Factor, 650 = 650W, and G = 80 Plus Gold certified)
SilverStone has a long-standing reputation for providing a full line of high quality enclosures, power supplies, cooling components, and accessories for PC enthusiasts. With a continued focus on smaller physical size and support for small form-factor enthusiasts, SilverStone added the new SX-G power supplies to their SFX form factor series. There are now ten power supplies in SilverStone’s SFX Series, ranging in output capacity from 300W to 800W. Unlike the larger SX-L units which are 30mm (1.2”) longer than a standard SFX chassis, the SX650-G retains the original SFX dimensions.
(Courtesy of SilverStone)
The new SX650-G power supply features high efficiency (80 Plus Gold certified) and comes with all modular flat ribbon-style cables.
SilverStone SX650-G PSU Key Features:
• Small Form Factor (SFX) design
• 650W continuous power output
• Fluid Dynamic Bearing (FDB) fan for quiet, reliable operation
• 80 Plus Gold certified for high efficiency
• Powerful single +12V rail with 54.2A capacity
• All-modular, flat ribbon-style cables
• High quality construction with all Japanese capacitors
• Strict ±3% voltage regulation and low AC ripple and noise
• Support for high-end GPUs with four PCI-E 8/6-pin connectors
• Safety Protections: OCP, OVP, UVP, SCP, OTP, and OPP
Despite their large global presence in smartphones, Huawei isn't a brand widely known to US consumers. While this has improved year by year with the introduction of unlocked phones from their Mate brand, I don't think most Americans realize how big a consumer electronics company Huawei is.
One of the more recent categories that Huawei has entered is the Windows notebook and tablet market. Starting with the announcement of the original MateBook at Mobile World Congress in 2016 (see our subsequent review here), the MateBook line was expanded this year to include two traditional notebook form factors—the thin-and-light MateBook X, and the more mainstream MateBook D.
With the introduction of these new products, the 2-in-1 tablet formerly known as just the MateBook has been slightly revised and renamed to the MateBook E, the product that we are looking at today.
|Huawei MateBook E (configuration as reviewed)|
|Processor||Intel Core m3-7Y30|
|Graphics||Intel HD Graphics 615|
|Screen||12-in 2160x1440 IPS|
|Storage||128GB SanDisk SATA SSD|
|Wireless||Intel 8275 802.11ac + BT 4.2 (Dual Band, 2x2)|
|Connections||1 x USB 3.1 Gen 1 Type-C, audio combo jack|
|Dimensions||278.8mm x 194.1mm x 6.8mm (10.98" x 7.64" x 0.27")|
|Weight||2.43 lb (1100 g)|
|OS||Windows 10 Home|
|Price||$699 - Amazon.com|
A quiet facade
Iceberg Interactive, a publisher you may know from games like Killing Floor or the StarDrive series, has released a new strategy game called Oriental Empires, and happened to send me a copy to try out.
On initial inspection it resembles recent Civilization games, but with a more focused design: you take on a tribe in ancient China and attempt to become Emperor, or at least make your neighbours sorry that they ever met you. Many of the tribes are locked until you have been through 120 turns of the Grand Campaign; not a bad thing, as that first game is your tutorial. Apart from an advisor popping up during turns or events, the game does not hold your hand, instead letting you figure things out on your own.
That minimalist ideal runs throughout the entire game, which offers one of the cleanest interfaces I've seen. All of the information you need to maintain and grow your empire is contained in a tiny percentage of the screen or in a handful of in-game menus. This works well, as the terrain and look of the campaign map is quite striking and varies noticeably with the season.
Spring features cherry blossom trees as well as the occasional flooding.
Summer is a busy season for your workers and perhaps your armies.
Fall colours indicate the coming of winter and snow.
Winter also shrouds the peaks in fog. The atmosphere thus created is quite relaxing - somewhat at odds with many 4X games, and perhaps the most interesting thing about this title.
In these screenshots you can see the entire GUI that gives you the information you need to play. The upper right shows your turn and income, and occasionally a helpful advisor offering suggestions. Below that you will find a banner that toggles between three lists: the first shows your cities with their current build queues and population information, the second lists your armies' compositions and whether they currently have orders, while the last displays any events which affect your burgeoning empire. The bottom shows your leader and his authority, which, among other things, indicates the number of cities you can support without expecting quickly increasing unrest.
The right hand side lets you bring up the only other five menus you use in this game. From top to bottom they offer diplomacy, technology, the Imperial edicts you can apply (or have applied) to your empire, player statistics to let you know how you are faring, and finally detailed statistics of your empire and those competing tribes you have met.
A Trio of Air Coolers
Scythe is a major player in the air cooling space, with a dizzying array of coolers from the Japanese company for virtually any application. In addition to some of the most compact coolers in the business, Scythe also offers some of the highest-performing - and quietest - tower coolers available. Two of the largest coolers in the lineup are the new Mugen 5 Rev. B and the Grand Kama Cross 3 - the latter being one of their most outlandish designs.
Rounding out this review we also have a compact tower option from Scythe in the Byakko, which is a 130 mm tall cooler that can fit in a greater variety of enclosures than the Mugen 5 or Grand Kama Cross due to its lower profile. So how did each perform on the cooler test bench? We put these Scythe coolers against the Intel Core i7-7700K to see how potent their cooling abilities are when facing a CPU that gets quite toasty under load. Read on to see how this trio responded to the challenge!
YouTube TV for NVIDIA SHIELD
When YouTube TV first launched earlier this year, it had one huge factor in its favor compared to competing subscription streaming services: local channels. The service wasn't available everywhere, but in the markets where it was available, users were able to receive all of their major local networks. This factor, combined with its relatively low subscription price of $35 per month, immediately made YouTube TV one of the best streaming options, but it also had a downside: device support.
At launch YouTube TV was only available via the Chrome browser, iOS and Android, and newer Chromecast devices. There were no native apps for popular media devices like the Roku, Amazon Fire TV, or Apple TV. But perhaps the most surprising omission was support for Android TV via devices like the NVIDIA SHIELD. Most of the PC Perspective staff personally use the SHIELD due to its raw power and capabilities, and the lack of YouTube TV support on Google's own media platform was disappointing.
Thankfully, Google recently addressed this omission and has finally brought a native YouTube TV app to the SHIELD with the SHIELD TV 6.1 Update.
Introduction and Specifications
Back in April, we finally got our mitts on some actual 3D XPoint to test, but there was a catch: we had to do so remotely. The initial round of XPoint testing was done (by all review sites) on a set of machines located on the Intel campus. Intel had its reasons for this unorthodox review method, but we were satisfied that everything was done above board. Intel even went as far as walking me over to the very server that we would be remoting into for testing. Despite this, there were still a few skeptics out there, and today we can put all of that to bed.
This is a 750GB Intel Optane SSD DC P4800X - in the flesh and this time on *our* turf. I'll be putting it through the same initial round of tests we conducted remotely back in April. I intend to follow up at a later date with additional testing depth, as well as evaluating kernel response times across Windows and Linux (IRQ, Polling, Hybrid Polling, etc), but for now, we're here to confirm the results on our own testbed as well as evaluate if the higher capacity point takes any sort of hit to performance. We may actually see a performance increase in some areas as Intel has had several months to further tune the P4800X.
This video is for the earlier 375GB model launch, but all points apply here
(except that the 900P has now already launched)
The baseline specs remain the same as they were back in April, with a few notable exceptions:
The endurance figure for the 375GB capacity has nearly doubled to 20.5 PBW (PetaBytes Written), with the 750GB capacity logically following suit at 41 PBW. These figures are based on a 30 DWPD (Drive Writes Per Day) rating spanned across a 5-year period. The original product brief is located here, but do note that it may be out of date.
We now have official sequential throughput ratings: 2.0 GB/s writes and 2.4 GB/s reads.
We also have been provided detailed QoS figures and those will be noted as we cover the results throughout the review.
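As a sanity check, those endurance figures fall straight out of the DWPD rating - capacity, times drive writes per day, times the number of days in the warranty period:

```python
def petabytes_written(capacity_gb, dwpd, years):
    """Total rated writes: capacity x drive-writes-per-day x days, in PB."""
    return capacity_gb * dwpd * 365 * years / 1_000_000  # GB -> PB

# 30 DWPD over 5 years, per the updated spec:
print(round(petabytes_written(375, 30, 5), 1))  # → 20.5 PBW for the 375GB model
print(round(petabytes_written(750, 30, 5), 1))  # → 41.1 PBW for the 750GB model
```

Both results line up with Intel's stated 20.5 and 41 PBW figures (the latter rounded down in the spec sheet).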
The Expected Unexpected
Last night we received word that Raja Koduri had resigned from AMD, during a sabbatical taken after the launch of Vega. The initial statement was that Raja would return to his position at AMD in the December/January timeframe. During this time there was some doubt as to whether Raja would in fact come back to AMD, as “sabbaticals” in the tech world often lead the individual to take stock of their situation and move on to what they consider greener pastures.
Raja has dropped by the PCPer offices in the past.
Initially it was thought that Raja would take the time off and then eventually jump to another company to tackle the issues there. This behavior is quite common in Silicon Valley, and Raja is no stranger to it. Raja cut his teeth on 3D graphics at S3, but in 2001 he moved to ATI. While there he worked on a variety of programs, including the original Radeon, the industry-changing Radeon 9700 series, and finally the strong HD 4000 series of parts. During this time ATI was acquired by AMD, and he became one of the top graphics gurus at that company. In 2009 he left AMD and moved on to Apple, where he was Director of Graphics Architecture, though little is known about what he actually did there. During that time Apple utilized AMD GPUs and licensed Imagination Technologies graphics technology. Apple could have been working on developing its own architecture at this point, which has recently shown up in the latest iPhone products.
In 2013 Raja rejoined AMD as corporate VP of Visual Computing, and in 2015 he was promoted to lead the Radeon Technologies Group after Lisa Su became CEO of the company. While there, Raja worked to get AMD back on an even footing under fairly strained conditions. AMD had not had the greatest of years and had seen its primary moneymakers start taking on water. AMD had competitive graphics for the most part, and the Radeon technology integrated into AMD’s APUs truly was class leading. On the discrete side, AMD compared favorably with NVIDIA with the HD 7000 and later R9 200 series of cards, but after NVIDIA released its Maxwell-based chips, AMD had a hard time keeping up. The general consensus is that RTG saw its headcount decreased by the company-wide cuts, along with a decrease in R&D funds.
Here comes a new challenger
The release of the GeForce GTX 1070 Ti has been an odd adventure. Launched into a narrow window in the product stack between the GTX 1070 and the GTX 1080, the GTX 1070 Ti is a result of competition from the AMD RX Vega product line. Sure, NVIDIA might have specced out and prepared an in-between product for some time, but it was the release of competitive high-end graphics cards from AMD (for the first time in what seems like forever) that pushed NVIDIA to launch what you see before us today.
With MSRPs of $399 and $499 for the GTX 1070 and GTX 1080 respectively, a new product that fits between them performance-wise has very little room to stretch its legs. Because of that, there are some interesting peculiarities involved with the release cycle surrounding overclocks, partner cards, and more.
But before we get into that concoction, let’s first look at the specifications of this new GPU option from NVIDIA as well as the reference Founders Edition and EVGA SC Black Edition cards that made it to our offices!
GeForce GTX 1070 Ti Specifications
We start with our classic table of details.
|RX Vega 64 Liquid||RX Vega 64 Air||RX Vega 56||Vega Frontier Edition||GTX 1080 Ti||GTX 1080||GTX 1070 Ti||GTX 1070|
|Base Clock||1406 MHz||1247 MHz||1156 MHz||1382 MHz||1480 MHz||1607 MHz||1607 MHz||1506 MHz|
|Boost Clock||1677 MHz||1546 MHz||1471 MHz||1600 MHz||1582 MHz||1733 MHz||1683 MHz||1683 MHz|
|Memory Clock||1890 MHz||1890 MHz||1600 MHz||1890 MHz||11000 MHz||10000 MHz||8000 MHz||8000 MHz|
|Memory Interface||2048-bit HBM2||2048-bit HBM2||2048-bit HBM2||2048-bit HBM2||352-bit G5X||256-bit G5X||256-bit||256-bit|
|Memory Bandwidth||484 GB/s||484 GB/s||410 GB/s||484 GB/s||484 GB/s||320 GB/s||256 GB/s||256 GB/s|
|TDP||345 watts||295 watts||210 watts||300 watts||250 watts||180 watts||180 watts||150 watts|
|Peak Compute||13.7 TFLOPS||12.6 TFLOPS||10.5 TFLOPS||13.1 TFLOPS||11.3 TFLOPS||8.2 TFLOPS||7.8 TFLOPS||5.7 TFLOPS|
If you have followed the leaks and stories over the last month or so, the information here isn’t going to be a surprise. The CUDA core count of the GTX 1070 Ti is 2432, only one SM unit less than the GTX 1080. Base and boost clock speeds are the same as the GTX 1080. The memory system includes 8GB of GDDR5 running at 8 GHz, matching the performance of the GTX 1070 in this case. The TDP gets a bump up to 180 watts, in line with the GTX 1080 and slightly higher than the GTX 1070.
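For reference, the peak compute figures in the table fall out of a simple formula: shader count, times 2 FLOPs per clock (one fused multiply-add), times the clock speed. Using the base clocks listed above:

```python
def peak_tflops(shaders, clock_mhz):
    """FP32 peak throughput: each shader retires one FMA (2 FLOPs) per clock."""
    return shaders * 2 * clock_mhz / 1_000_000  # MHz * shaders -> TFLOPS

# GTX 1070 Ti: 2432 CUDA cores at a 1607 MHz base clock
print(round(peak_tflops(2432, 1607), 1))  # → 7.8, matching the table

# GTX 1080: 2560 CUDA cores at the same 1607 MHz base clock
print(round(peak_tflops(2560, 1607), 1))  # → 8.2, matching the table
```

Note that vendors are not consistent about whether base or boost clocks feed this formula; the figures in our table for these two cards line up with the base clock.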
Overview and CPU Performance
When Intel announced their quad-core mobile 8th Generation Core processors in August, I was immediately interested. As a user who gravitates towards "Ultrabook" form-factor notebooks, it seemed like a no-brainer—gaining two additional CPU cores with no power draw increase.
However, the hardware reviewer in me was skeptical. Could this "Kaby Lake Refresh" CPU provide the headroom to fit two more physical cores on a die while maintaining the same 15W TDP? Would this mean that the processor fans would have to run out of control? What about battery life?
Now that we have our hands on our first two notebooks featuring the Core i7-8550U, it's time to take a more in-depth look at Intel's first mobile offerings in the 8th Generation Core family.
Providers and Devices
"Cutting the Cord," the process of ditching traditional cable and satellite content providers for cheaper online-based services, is nothing new. For years, consumers have cancelled their cable subscriptions (or declined to even subscribe in the first place), opting instead to get their entertainment from companies like Netflix, Hulu, and YouTube.
But the recent introduction of online streaming TV services like Sling TV, new technologies like HDR, and the slow online adoption of live local channels has made the idea of cord cutting more complicated. While cord cutters who are happy with just Netflix and YouTube need not worry, what are the solutions for those who don't like the idea of high cost cable subscriptions but also want to preserve access to things like local channels and the latest 4K HDR content?
This article is the first in a three-part series that will look at this "high-end" cord cutting scenario. We'll be taking a look at the options for online streaming TV, access to local "OTA" (over the air) channels, and the devices that can handle it all, including DVR support, 4K output, and HDR compliance.
There are two approaches that you can take when considering the cord cutting process. The first is to focus on capabilities: Do you want 4K? HDR? Lossless surround sound audio? Voice search? Gaming?
The second approach is to focus on content: Do you want live TV or à la carte downloads? Can you live without ESPN, or must it and your other favorite networks still be available? Are you heavily invested in iTunes content? Perhaps most importantly for those concerned with the "Spousal Acceptance Factor" (SAF), do you want the majority of your content contained in a single app, preventing you and your family members from having to jump between apps or devices to find what you want?
While most people on the cord cutting path will consider both approaches to a certain degree, it's easier to focus on the one that's most important to you, as that will make other choices involving devices and content easier. Of course, there are those of us out there that are open to purchasing and using multiple devices and content sources at once, giving us everything at the expense of increased complexity. But most cord cutters, especially those with families, will want to pursue a setup based around a single device that accommodates most, if not all, of their needs. And that's exactly what we set out to find.
Introduction, Specifications and Packaging
It’s been two long years since we first heard about 3D XPoint Technology. Intel and Micron serenaded us with tales of ultra-low latency and very high endurance, but when would we have this new media in our hot little hands? We got a taste of things with Optane Memory (caching) back in April, and later that same month we got a much bigger, albeit remotely-tested taste in the form of the P4800X. Since April all was quiet, with all of us storage freaks waiting for a consumer version of Optane with enough capacity to act as a system drive. Sure, we’ve played around with Optane Memory parts in various forms of RAID, but as we found in our testing, Optane’s strongest benefits are the very performance traits that do not effectively scale with additional drives added to an array. The preferred route is to just get a larger single SSD with more 3D XPoint memory installed on it, and we have that very thing today (and in two separate capacities)!
You might have seen various rumors centered around the 900P lately. The first is that the 900P was supposedly going to support PCIe 4.0. This is not true, and after digging back a bit it appears a foreign vendor confused PCIe x4 (four lanes) with the recently drafted PCIe 4.0 specification. Another set of rumors centered around pre-order listings and potential pricing for the 280GB and 480GB variants of the 900P. We are happy to report that those prices (at the time of this writing) are well above Intel’s stated MSRPs for these new models. I’ll even go as far as to say that the 480GB model can be had for less than what the 280GB model is currently listed for! More on that later in the review.
Performance specs are one place where the rumors were all true, but since all the folks had to go on was a leaked Intel press deck slide listing figures identical to the P4800X, we’re not really surprised here.
Lots of technical stuff above, but the high points are <10us typical latency (‘regular’ SSDs run between 60-100us), 2.5/2.0 GB/s sequential reads/writes, and 550k/500k random read/write performance. Yes I know, don’t tell me, you’ve seen higher sequentials on smaller form factor devices. I agree, and we’ve even seen higher maximum performance from unreleased 3D XPoint-equipped parts from Micron, but Intel has done what they needed to do in order to make this a viable shipping retail product, which likely means sacrificing the ‘megapixel race’ figures in favor of offering the lowest possible latencies and best possible endurance at this price point.
Packaging is among the nicest we’ve seen from an Intel SSD. It actually reminds me of how the Fusion-io ioDrives used to come.
Also included with the 900P is a Star Citizen ship. The Sabre Raven has been a topic of gossip and speculation for months now, and it appears to be a pretty sweet looking fighter. For those unaware, Star Citizen is a space-based MMO, and with a ‘ship purchase’ also comes a license to play the game. The Sabre Raven counts as such a purchase and apparently comes with lifetime insurance, meaning it will always be tied to your account in case it gets blown up doing data runs. Long story short, you get the game for free with the purchase of a 900P.
A potential game changer?
I thought we were going to be able to make it through the rest of 2017 without seeing AMD launch another family of products. But I was wrong. And that’s a good thing. Today AMD is launching the not-so-cleverly-named Ryzen Processor with Radeon Vega Graphics product line that will bring the new Zen processor architecture and Vega graphics architecture onto a single die for the ultrathin mobile notebook platforms. This is no minor move for them – just as we discussed with the AMD EPYC processor launch, this is a segment that has been utterly dominated by Intel. After all, Intel created the term Ultrabook to target these designs, and though that brand is gone, the thin and light mindset continues to this day.
The claims AMD makes about its Ryzen mobile APU (combination CPU+GPU accelerated processing unit, to use an older AMD term) are not made lightly. Right up front in our discussion I was told this is going to be the “world’s fastest for ultrathin” machines. Considering that AMD had previously been unable to even enter those markets, due to both technological and business roadblocks, AMD is taking a risk by painting this launch in such a light. Thanks to its ability to combine CPU and GPU technology on a single die, though, AMD has flexibility today that it simply did not have previously.
From the day AMD first announced the acquisition of ATI, the company has touted the long-term benefits of owning both a high-performance processor division and a graphics division. By combining the architectures on a single die, the result could become greater than the sum of its parts, leveraging new software directions and the oft-discussed HSA (Heterogeneous System Architecture) that AMD helped lay the foundation for. Though the first rounds of APUs achieved modest sales, the truth was that AMD’s advantage over Intel on the graphics front was often overshadowed by the performance and power efficiency advantages Intel held on the CPU front.
But with the introduction of the first products based on Zen earlier this year, AMD has finally made good on the promises of catching up to Intel in many of the areas where it matters the most. The new from-the-ground-up design resulted in greater than 50% IPC gains, improved area efficiency compared to Intel’s latest Kaby Lake core design, and enormous gains in power efficiency compared to the previous CPU designs. When looking at the new Ryzen-based APU products with Vega built-in, AMD claims that they tower over the 7th generation APUs with up to 200% more CPU performance, 128% more GPU performance, and 58% lower power consumption. Again, these are bold claims, but it gives AMD confidence that it can now target premium designs and form factors with a solution that will meet consumer demands.
AMD is hoping that the release of the Ryzen 7 2700U and Ryzen 5 2500U can finally help turn the tides in the ultrathin notebook market.
| | Core i7-8650U | Core i7-8550U | Core i5-8350U | Core i5-8250U | Ryzen 7 2700U | Ryzen 5 2500U |
|---|---|---|---|---|---|---|
| Architecture | Kaby Lake Refresh | Kaby Lake Refresh | Kaby Lake Refresh | Kaby Lake Refresh | Zen + Vega | Zen + Vega |
| Base Clock | 1.9 GHz | 1.8 GHz | 1.7 GHz | 1.6 GHz | 2.2 GHz | 2.0 GHz |
| Max Turbo Clock | 4.2 GHz | 4.0 GHz | 3.8 GHz | 3.6 GHz | 3.8 GHz | 3.6 GHz |
| System Bus | DMI3 - 8.0 GT/s | DMI3 - 8.0 GT/s | DMI2 - 6.4 GT/s | DMI2 - 5.0 GT/s | N/A | N/A |
| Graphics | UHD Graphics 620 | UHD Graphics 620 | UHD Graphics 620 | UHD Graphics 620 | Vega (10 CUs) | Vega (8 CUs) |
| Max Graphics Clock | 1.15 GHz | 1.15 GHz | 1.1 GHz | 1.1 GHz | 1.3 GHz | 1.1 GHz |
The Ryzen 7 2700U runs 200 MHz higher than the Ryzen 5 2500U on both the base and boost CPU clocks, and 200 MHz higher on the peak GPU clock. Though both parts have 4 cores and 8 threads, the GPU on the 2700U carries two additional CUs (compute units).
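For a sense of what those extra CUs and clocks mean on paper, the peak FP32 throughput of each IGP can be estimated with the usual GCN arithmetic (64 stream processors per CU, 2 FLOPs per FMA per cycle). This is a back-of-the-envelope sketch derived from the table above, not an official AMD spec sheet:

```python
def peak_gflops(compute_units: int, clock_ghz: float) -> float:
    """Peak FP32 GFLOPS = CUs * 64 shaders * 2 FLOPs/cycle * clock (GHz).

    Assumes the standard GCN/Vega layout of 64 stream processors per CU.
    """
    return compute_units * 64 * 2 * clock_ghz

ryzen7_2700u = peak_gflops(10, 1.3)  # ~1664 GFLOPS
ryzen5_2500u = peak_gflops(8, 1.1)   # ~1126 GFLOPS
print(f"2700U peak-throughput advantage: {ryzen7_2700u / ryzen5_2500u:.2f}x")
```

By this rough math the 2700U's IGP carries close to a 1.5x peak-throughput edge, though real-world gaming gains will be smaller.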
Forza Motorsport 7 Performance
The first full Forza Motorsport title available for the PC, Forza Motorsport 7 on Windows 10 launched simultaneously with the Xbox version earlier this month. With native 4K assets, HDR support, and new visual features like fully dynamic weather, this title is an excellent showcase of what modern PC hardware can do.
Now that both AMD and NVIDIA have released drivers optimized for Forza 7, we've taken the opportunity to measure performance across an array of different GPUs. After the significant performance mishaps of last year's Forza Horizon 3 at its PC launch, we are eager to see whether Forza Motorsport 7 brings the much-needed improvements.
For this testing, we used our standard GPU testbed, including an 8-core Haswell-E processor and plenty of memory and storage.
| PC Perspective GPU Testbed | |
|---|---|
| Processor | Intel Core i7-5960X Haswell-E |
| Motherboard | ASUS Rampage V Extreme X99 |
| Memory | G.Skill Ripjaws 16GB DDR4-3200 |
| Storage | OCZ Agility 4 256GB (OS), Adata SP610 500GB (games) |
| Power Supply | Corsair AX1500i 1500 watt |
| OS | Windows 10 x64 |
| Drivers | AMD: 17.10.1 (Beta) |
As with a lot of modern console-first titles, Forza 7 defaults to "Dynamic" image quality settings. This means the game engine tries to find the best image settings for your hardware automatically, then adjusts them on the fly so that you hit a target frame rate (adjustable between 30 and 60 FPS) no matter what is happening in the scene being rendered.
While this is a good strategy for consoles, and even for casual PC gamers, it poses a problem when trying to measure equivalent performance across GPUs. Luckily, Turn 10 Studios, the developers of Forza Motorsport 7, still let you disable the dynamic control and configure the image quality settings as you desire.
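To make the idea concrete, here is a minimal, purely illustrative sketch of how such a dynamic-quality loop can work (our own toy example, not Turn 10's actual implementation): measure the last frame time, nudge a render-scale factor down when the target is missed, and nudge it back up when there is headroom.

```python
TARGET_FPS = 60                        # Forza 7 lets you pick between 30 and 60
TARGET_FRAME_MS = 1000.0 / TARGET_FPS  # 16.67 ms budget at 60 FPS

def adjust_scale(scale: float, last_frame_ms: float,
                 step: float = 0.05, lo: float = 0.5, hi: float = 1.0) -> float:
    """Lower the render scale when frames run long, raise it with headroom.

    All step sizes and clamp limits here are invented for illustration.
    """
    if last_frame_ms > TARGET_FRAME_MS:          # missed the budget: cut quality
        scale -= step
    elif last_frame_ms < TARGET_FRAME_MS * 0.9:  # comfortable headroom: restore it
        scale += step
    return min(hi, max(lo, scale))

# e.g. a heavy weather scene pushing frame times past budget at a 60 FPS target:
scale = 1.0
for frame_ms in [22.0, 21.0, 19.5, 17.0, 15.5]:
    scale = adjust_scale(scale, frame_ms)
```

A real engine would use smoothed frame-time averages and scale many settings, not just resolution, but the feedback principle is the same.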
One quirk, however, is that in order for V-Sync to be disabled, the rendering resolution within the game must match the native resolution of your monitor. This means that if you want to run 2560x1440 on your 4K monitor, you must first set the desktop resolution in Windows to 2560x1440 in order to run the game with V-Sync off.
We did our testing with an array of three different resolutions (1080p, 1440p, and 4K) at maximum image quality settings. We tested both AMD and NVIDIA graphics cards in similar price and performance segments. The built-in benchmark mode for this game was used, which does feature some variance due to dynamic weather patterns. However, our testing within the full game matched the results of the benchmark mode closely, so we used it for our final results.
Right off the bat, I have been impressed at how well optimized Forza Motorsport 7 seems to be on the PC. Compared to the unoptimized disaster that was Forza Horizon 3 when it launched on PC last year, it's clear that Turn 10 Studios and Microsoft have come a long way.
Even gamers looking to play on a 4K display at 60Hz can seemingly get away with cheaper, more mainstream GPUs such as the RX 580 or the GTX 1060, with acceptable performance in most scenarios.
Gamers on high-refresh-rate displays don't have the same luxury. If you want to play at a resolution such as 2560x1440 at a full 144Hz, neither the RX Vega 64 nor the GTX 1080 will manage it at maximum image quality settings, although both GPUs are close enough that turning down a few settings should get you to your display's full refresh rate.
For some reason, the RX Vega cards didn't seem to show any scaling in performance when moving from 2560x1440 to 1920x1080, unlike the Polaris-based RX 580 and the NVIDIA options. We aren't quite sure of the cause of this and have reached out to AMD for clarification.
As far as frame times are concerned, we also gathered some data with our Frame Rating capture analysis system.
Taking a look at the first chart, we can see that while the GTX 1080's frame times are extremely consistent, the RX Vega 64 shows some additional variance.
However, the frame time variance chart shows that over 95% of the RX Vega 64's frame times come in at under 2ms of variance, which will still provide a smooth gameplay experience in most scenarios. This matches our experience while playing on both AMD and NVIDIA hardware, where we saw no major issues with gameplay smoothness.
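For readers curious how a stat like that is computed, the sketch below shows the basic arithmetic behind a frame-time-variance percentage: diff successive frame times from a capture and count the share of deltas that stay under the 2ms threshold. The sample numbers here are invented for illustration, not taken from our capture data.

```python
def variance_under(frame_times_ms, threshold_ms=2.0):
    """Fraction of frame-to-frame time deltas below the given threshold."""
    # Absolute difference between each frame time and the one before it.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(d < threshold_ms for d in deltas) / len(deltas)

# Mostly steady ~16.7 ms frames with a couple of spikes (illustrative only):
sample = [16.7, 16.9, 18.2, 16.5, 16.8, 21.0, 16.6, 16.7]
print(f"{variance_under(sample):.0%} of deltas under 2 ms")
```

A result above 95%, as we measured for the RX Vega 64, indicates that nearly every frame lands close in time to its neighbor, which is what perceived smoothness depends on.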
Forza Motorsport 7 seems to be a great addition to the PC gaming world (if you don't mind using the Microsoft Store exclusively) and will run great on a wide array of hardware. Whether you have an NVIDIA or AMD GPU, you should be able to enjoy this fantastic racing simulator.