NVIDIA Adds Metal Gear Solid V: The Phantom Pain Bundle to GeForce Cards

Subject: Graphics Cards | July 23, 2015 - 10:52 AM |
Tagged: nvidia, geforce, gtx, bundle, metal gear solid, phantom pain

NVIDIA continues with its pattern of flagship game bundles with today's announcement. Starting today, GeForce GTX 980 Ti, 980, 970 and 960 GPUs from select retailers will include a copy of Metal Gear Solid V: The Phantom Pain, due out September 15th. (Bundle is live on Amazon.com.) Also, notebooks that use the GTX 980M or 970M GPU qualify.


From NVIDIA's marketing on the bundle:

Only GeForce GTX gives you the power and performance to game like the Big Boss. Experience the METAL GEAR SOLID V: THE PHANTOM PAIN with incredible visuals, uncompromised gameplay, and advanced technologies. NVIDIA G-SYNC™ delivers smooth and stutter-free gaming, GeForce Experience™ provides optimal playable settings, and NVIDIA GameStream™ technology streams your game to any NVIDIA SHIELD™ device.

It appears that Amazon.com already has its landing page up and ready for the MGS V bundle program, so if you are hunting for a new graphics card, stop there and see what they have in your price range.

Let's hope that this game release goes a bit more smoothly than Batman: Arkham Knight...

Source: NVIDIA

TSMC Plans 10nm, 7nm, and "Very Steep" Ramping of 16nm.

Subject: Graphics Cards, Processors, Mobile | July 19, 2015 - 06:59 AM |
Tagged: Zen, TSMC, Skylake, pascal, nvidia, Intel, Cannonlake, amd, 7nm, 16nm, 10nm

Getting smaller features allows a chip designer to create products that are faster, cheaper, and consume less power. Years ago, most chip designers had their own production facilities, but that is getting rare. IBM has just finished selling its manufacturing off to GlobalFoundries, which was spun out of AMD when it divested from fabrication in 2009. Texas Instruments, on the other hand, decided that it would continue manufacturing but get out of the chip design business. Intel and Samsung are arguably the last two players with a strong commitment to both sides of the “let's make a chip” coin.


So where do these chip designers go? TSMC is the name that comes up most. Any given discrete GPU from the last several years has probably been produced there, along with several CPUs and SoCs from a variety of fabless semiconductor companies.

Several years ago, when the GeForce 600-series launched, TSMC's 28nm line suffered shortages that left GPUs out of stock for quite some time. Since then, 28nm has been the stable workhorse for countless high-performance products. Recent chips have been physically huge, thanks to the maturity of the process and the lower defect rates that come with it. The designers are anxious to get on smaller processes, though.

In a conference call at 2 AM (EDT) on Thursday, which is 2 PM in Taiwan, Mark Liu of TSMC announced that “the ramping of our 16 nanometer will be very steep, even steeper than our 20nm”. By that, they mean this year. Hopefully this translates to production capacity that GPUs and CPUs can use early, as AMD needs it to launch its Zen CPU architecture as early in 2016 as possible. Graphics cards have also been stuck on 28nm for over three years. It's time.

Also interesting is how TSMC believes that they can hit 10nm by the end of 2016. If so, this might put them ahead of Intel. That said, Intel was also confident that they could reach 10nm by the end of 2016, right until they announced Kaby Lake a few days ago. We will need to see if it pans out. If it does, competitors could actually beat Intel to the market at that feature size -- although that could end up being mobile SoCs and other integrated circuits that are uninteresting for the PC market.

Following the announcement from IBM Research, 7nm was also mentioned in TSMC's call. Apparently they expect to start qualifying it in Q1 2017. That does not provide an estimate for production but, if their 10nm schedule is both accurate and representative of 7nm, that would put production somewhere in 2018. Note that I just speculated on an if of an if of a speculation, so take that with a mine of salt. There is probably a very good reason that this date wasn't mentioned in the call.

Back to the 16nm discussion, what are you hoping for most? New GPUs from NVIDIA, new GPUs from AMD, a new generation of mobile SoCs, or the launch of AMD's new CPU architecture? This should make for a highly entertaining comments section on a Sunday morning, don't you agree?

Manufacturer: Various

SLI and CrossFire

Last week I sat down with a set of three AMD Radeon R9 Fury X cards, our sampled review card as well as two retail cards purchased from Newegg, to see how the reports of pump whine noise from the cards were shaping up. I'm not going to dive into that debate again here, as I think we have covered it pretty well in that story as well as on our various podcasts, but rest assured we are continuing to look at revisions of the Fury X to see if AMD and Cooler Master were actually able to fix the issue.


What we have to cover today is something very different, and likely much more interesting for a wider range of users. When you have three AMD Fury X cards in your hands, you of course have to do some multi-GPU testing with them. With our set I was able to run both 2-Way and 3-Way CrossFire with the new AMD flagship card and compare them directly to the comparable NVIDIA offering, the GeForce GTX 980 Ti.

There isn't much else I need to do to build up this story, is there? If you are curious how well the new AMD Fury X scales in CrossFire with two and even three GPUs, this is where you'll find your answers.

Continue reading our results testing the AMD Fury X and GeForce GTX 980 Ti in 3-Way GPU configurations!!

Introduction and Technical Specifications


In our previous article here, we demonstrated how to mod the EVGA GTX 970 SC ACX 2.0 video card for higher performance and significantly lower running temperatures. Now we take two of these custom-modded EVGA GTX 970 cards to see how well they perform in an SLI configuration. ASUS was kind enough to supply us with one of its newly introduced ROG Enthusiast SLI Bridges for our experiments.

ASUS ROG Enthusiast SLI Bridge


Courtesy of ASUS


Courtesy of ASUS

For the purposes of running the two EVGA GTX 970 SC ACX 2.0 video cards in SLI, we chose the 3-way variant of ASUS' ROG Enthusiast SLI Bridge so that we could run the tests with full x16 bandwidth across both cards (with the cards in PCIe 3.0 x16 slots 1 and 3 in our test board). This customized SLI adapter features an illuminated red ROG logo embedded in its brushed aluminum upper surface. The adapter supports 2-way and 3-way SLI in a variety of board configurations.


Courtesy of ASUS

ASUS offers its ROG Enthusiast SLI Bridge in three sizes covering 2-way, 3-way, and 4-way SLI configurations. All bridges feature the brushed-aluminum top cap with an embedded glowing ROG logo.



Courtesy of ASUS

The smallest bridge supports 2-way SLI configurations with either a two or three slot separation. The middle-sized bridge supports up to a 3-way SLI configuration with a two slot separation required between each card. The largest bridge supports up to a 4-way SLI configuration, also requiring a two slot separation between each card used.

Technical Specifications (taken from the ASUS website)

Dimensions (L x W x H):
  2-WAY: 97 x 43 x 21 mm
  3-WAY: 108 x 53 x 21 mm
  4-WAY: 140 x 53 x 21 mm
Weight:
  2-WAY: 70 g
  3-WAY: 91 g
  4-WAY: 123 g
Compatible GPU set-ups:
  2-WAY: 2-WAY-S & 2-WAY-M
  3-WAY: 2-WAY-L & 3-WAY
  4-WAY: 4-WAY
Contents:
  2-WAY: 1 x optional power cable & 2 PCBs included for varying configurations
  3-WAY: 1 x optional power cable
  4-WAY: 1 x optional power cable

Continue reading our story!!

Subject: Mobile
Manufacturer: Various

Business Model Based on Partnerships

Alexandru Voica works for Imagination Technologies. His background includes research in computer graphics at the School of Advanced Studies Sant'Anna in Pisa and a brief stint as a CPU engineer, working on several high-profile 32-bit processors used in many mobile and embedded devices today. You can follow Alex on Twitter @alexvoica.

Some months ago my colleague Rys Sommefeldt wrote an article offering his (deeply) technical perspective on how a chip gets made, from R&D to manufacturing. While his bildungsroman covers a lot of the engineering details behind silicon production, it is light on the business side of things; and that is a good thing, because it gives me an opportunity to steal some of his spotlight!

This article will give you a breakdown of the IP licensing model, describing the major players and the relationships between them. It is not designed to be a complete guide by any means and some parts might already sound familiar, but I hope it is a comprehensive overview that can be used by anyone who is new to product manufacturing in general.

The diagram below offers an analysis of the main categories of companies involved in the semiconductor food chain. Although I’m going to attempt to paint a broad picture, I will mainly offer examples based on the ecosystem formed around Imagination (since that is what I know best).


A simplified view of the manufacturing chain

Let’s work our way from left to right.

IP vendors

Traditionally, these are the companies that design and sell silicon IP. ARM and Imagination Technologies are perhaps the most renowned for their sub-brands: Cortex CPU + Mali GPU and MIPS CPU + PowerVR GPU, respectively.

Given the rapid evolution of the semiconductor market, such companies continue to evolve their business models beyond point solutions to become one-stop shops offering a wide variety of IP cores and platforms, comprising CPUs, graphics, video, connectivity, cloud software and more.

Continue reading The IP licensing business model. A love story. on PC Perspective!!

Podcast #355 - AMD R9 Fury X, Sapphire Nitro R9 390, Batman: Arkham Knight and more!

Subject: General Tech | June 25, 2015 - 03:08 PM |
Tagged: podcast, video, amd, fury x, Fury, Fiji, nvidia, gtx 980ti, maxwell, gm200, batman, arkham knight, gameworks, r9 390, sapphire, nitro, Intel, Braswell, Cherry Trail, Lenovo, thinkcentre

PC Perspective Podcast #355 - 06/25/2015

Join us this week as we discuss the AMD R9 Fury X, Sapphire Nitro R9 390, Batman: Arkham Knight and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Sebastian Peak, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Warner Bros. Suspends Arkham Knight PC Sales

Subject: General Tech, Graphics Cards | June 24, 2015 - 10:10 PM |
Tagged: batman, wb games, consolitis, gameworks, pc gaming, nvidia, amd

Over the last few days, the PC version of Batman: Arkham Knight has been receiving a lot of flak. Sites like PC Gamer were unable to review the game because they allege that Warner Brothers would not provide pre-release copies to journalists except for the PS4 version. This is often met with cynicism that can be akin to throwing darts in an unlit room with the assumption that a dartboard is in there somewhere. Other times, it is validated.


Whether or not the lack of PC review copies was related, the consensus is that Arkham Knight is a broken game. After posting a troubleshooting guide on the forums to help users choose the appropriate settings, WB Games has pulled the plug and suspended the game's sales on Steam until the issues are patched.

TotalBiscuit weighs in on the issues with his latest "Port Report".

No one seems to be talking about what the issue actually is. Fortunately or unfortunately, I don't have the game myself, so I cannot look and speculate based on debug information (which they probably disabled in the released game anyway). I could wildly speculate about DX11 limits from the number of elements on screen, but that is not based on any actual numbers. They could be really good at instancing and other tricks that keep the chunks of work being sent to the GPU as large as possible. I don't know. Whatever the issue is, it sounds pretty bad.
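To illustrate the draw-call overhead I am speculating about, here is a minimal, purely hypothetical C++ sketch. The Renderer class and its draw()/draw_instanced() methods are stand-ins I made up, not any real Direct3D or OpenGL API; the only point is to show why batching many identical objects into one instanced submission keeps per-draw CPU cost from piling up in a dense scene.

#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

struct Object { std::uint32_t mesh_id; float x, y, z; };

// Hypothetical renderer that just counts how many submissions reach the "driver".
class Renderer {
public:
    void draw(const Object&) { ++submissions_; ++instances_; }          // one call per object
    void draw_instanced(std::uint32_t /*mesh_id*/, std::size_t count) { // one call for all copies
        ++submissions_;
        instances_ += count;
    }
    std::size_t submissions() const { return submissions_; }
    std::size_t instances() const { return instances_; }
private:
    std::size_t submissions_ = 0;
    std::size_t instances_ = 0;
};

int main() {
    // 10,000 identical props, the kind of scene element that piles up in an open world.
    std::vector<Object> props(10000, Object{42, 0.f, 0.f, 0.f});

    Renderer naive;
    for (const auto& p : props) naive.draw(p);   // 10,000 separate submissions

    Renderer batched;
    batched.draw_instanced(42, props.size());    // a single instanced submission

    std::cout << "naive submissions:   " << naive.submissions() << "\n"
              << "batched submissions: " << batched.submissions() << "\n"
              << "instances drawn:     " << batched.instances() << "\n";
}

Whether Arkham Knight's problems have anything to do with submission overhead is, again, pure guesswork on my part.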

Source: WB Games

Podcast #354 - AMD R9 Fury X, R9 Nano, ASUS Zenfone2 and much more!

Subject: General Tech | June 18, 2015 - 02:03 PM |
Tagged: podcast, video, amd, radeon, R9, fury x, Fury, Fiji, fiji xt, r9 nano, fiji x2, project quantum, asus, zenfone 3, g751j, gameworks, nvidia, metal gear solid

PC Perspective Podcast #354 - 06/18/2015

Join us this week as we discuss the AMD R9 Fury X, R9 Nano, ASUS Zenfone2 and much more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

NVIDIA GameWorks Enhancements Coming in Upcoming Metal Gear Solid Game

Subject: Graphics Cards | June 16, 2015 - 12:53 AM |
Tagged: the phantom pain, nvidia, metal gear solid, graphics, gpus, geforce, gameworks

A blog post on NVIDIA's site indicates that Konami's upcoming game Metal Gear Solid 5: The Phantom Pain will make use of NVIDIA technologies, a move that will undoubtedly rankle AMD graphics users who can't always see the full benefit of GameWorks enhancements.


"The world of Metal Gear Solid V: The Phantom Pain is going to be 200 times larger than the one explored in Ground Zeroes. Because so much of this game’s action depends on stealth, graphics are a key part of the gameplay. Shadows, light, and terrain have to be rendered perfectly. That’s a huge challenge in a game where the hero is free to find his own way from one point to another.  Our engineers are signed up to work closely with Konami to get the graphics just right and to add special effects."

Now technically this quote doesn't confirm the use of any proprietary NVIDIA technology, though it sounds like that's exactly what will be taking place. In the wake of the Witcher 3 HairWorks controversy any such enhancements will certainly be looked upon with interest (especially as the next piece of big industry news will undoubtedly be coming with AMD's announcement later today at E3).

It's hard to argue with better graphical quality in high-profile games such as the latest Metal Gear Solid installment, but there is certainly something to be said for adherence to open standards to ensure a more unified experience across GPUs. The debate over inclusion through adherence to standards vs. proprietary solutions has already been very heated around the FreeSync/G-Sync monitor refresh technologies, and GameWorks is a set of tools that serves to further divide gamers, even as it provides an enhanced experience on GeForce GPUs.


Such advantages will likely matter less once DirectX 12 mitigates some of the differences with efficiency gains in the vein of AMD's Mantle API, and if AMD's rumored Fiji cards offer superior performance and arrive priced competitively, they will matter even less. For now, even though details are nonexistent, expect an NVIDIA GeForce GPU to have the advantage in at least some graphical aspects of the latest Metal Gear title when it arrives on PC.

Source: NVIDIA

Introduction and Technical Specifications


The measure of a true modder is not how powerful they can make their system by throwing money at it, but how well they can innovate to make their components run better with what they have on hand. Some make artistic statements with truly awe-inspiring cases, while others take the Dremel and clamps to their beloved video cards in an attempt to eke out that last bit of performance. This article serves the latter of the two. Don't get me wrong, the card will look nice once we're done with it, but the point here is to re-use components on hand where possible to minimize the cost while maximizing the performance (and acoustic) benefits.

EVGA GTX 970 SC Graphics Card


Courtesy of EVGA

We started with an EVGA GTX 970 SC card with 4GB of RAM, bundled with the new revision of EVGA's ACX cooler, ACX 2.0. This card is well built and carries a slight factory overclock out of the box. The ACX 2.0 cooler is a redesign of the original ACX cooler included with the card, offering better cooling potential; its fans do not spin up for active cooling until the GPU block temperature exceeds 60C.
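As a minimal sketch of how that kind of semi-passive fan policy behaves (this is not EVGA's actual firmware logic; the fan_percent name and the 85C full-speed point are assumptions chosen purely for illustration):

#include <algorithm>
#include <iostream>

// Hypothetical threshold-based fan curve: fans stay off below 60C, then
// ramp linearly to 100% duty at an assumed 85C full-speed point.
int fan_percent(double gpu_temp_c) {
    constexpr double kSpinUp = 60.0;   // fans off below this block temperature
    constexpr double kMaxOut = 85.0;   // assumed temperature for 100% duty
    if (gpu_temp_c < kSpinUp) return 0;
    const double t = (gpu_temp_c - kSpinUp) / (kMaxOut - kSpinUp);
    return static_cast<int>(std::clamp(t, 0.0, 1.0) * 100.0);
}

int main() {
    for (double temp : {40.0, 59.0, 61.0, 70.0, 90.0})
        std::cout << temp << " C -> " << fan_percent(temp) << "% fan\n";
}

The practical upshot is that the card runs silently at idle and under light loads, with the fans only contributing noise once the block warms past the spin-up threshold.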


Courtesy of EVGA

WATERCOOL HeatKiller GPU-X3 Core GPU Waterblock


Courtesy of WATERCOOL

For water cooling the EVGA GTX 970 SC GPU, we decided to use the WATERCOOL HeatKiller GPU-X3 Core water block. This block features a POM-based body with a copper core for superior heat transfer from the GPU to the liquid medium. The HeatKiller GPU-X3 Core is a GPU-only cooler, meaning that the memory and integrated VRM circuitry will not be actively cooled by the block. The decision to use a GPU-only block rather than a full-cover block came down to two factors: availability and cost. I had a few of these on hand, making it an easy decision cost-wise.

Continue reading our article on Modding the EVGA GTX 970 SC Graphics Card!