Pushing the GTX 980 to its limits: the ASUS ROG Poseidon

Subject: Graphics Cards | March 4, 2015 - 03:02 PM |
Tagged: asus, ROG Poseidon GTX 980, GTX 980, factory overclocked

On the box the ASUS ROG Poseidon GTX 980 Platinum states a base clock of 1178MHz and a boost clock of 1279MHz, but in testing [H]ard|OCP saw the card sitting at 1328MHz in game on air cooling alone.  They then hooked up a Koolance Exos-2 V2 water cooler and pushed the card to 1580MHz in game, though the memory would only climb 1.1GHz higher, to 8.1GHz effective.  As you would expect, that had a noticeable impact on performance.  At $640 it might not compete with the just-announced Titan X, but it is also far less expensive, though still $200 more than the Sapphire Vapor-X 290X it was tested against and $90 more than the 8GB version of that card.  If you have the budget, this GTX 980 is the fastest single-GPU card on the planet right now.
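For context on what that memory overclock is worth in bandwidth terms, here is a quick back-of-the-envelope sketch (assuming the GTX 980's stock 256-bit bus and 7GHz effective memory speed; the percentage is just arithmetic, not a measured result):

```python
# Rough GDDR5 bandwidth math for a 256-bit GTX 980 (illustrative only).
def gddr5_bandwidth_gbs(bus_width_bits: int, effective_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width in bytes * effective data rate."""
    return (bus_width_bits / 8) * effective_rate_gbps

stock = gddr5_bandwidth_gbs(256, 7.0)        # ~224 GB/s at the reference 7GHz effective
overclocked = gddr5_bandwidth_gbs(256, 8.1)  # ~259 GB/s at [H]ard|OCP's 8.1GHz result

print(f"{stock:.0f} GB/s stock vs {overclocked:.0f} GB/s overclocked "
      f"({(overclocked / stock - 1) * 100:.0f}% more)")
```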


"The highest overclocked GeForce GTX 980 based video card just landed. If the ASUS ROG Poseidon GTX 980 Platinum video card with a hybrid air and liquid cooling system doesn't impress you, we are not sure what will when it comes to GPU. We push the Poseidon clocks and pit it against the AMD Radeon R9 290X for an ultimate showdown."

Source: [H]ard|OCP

GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at the Epic Games keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce TITAN X, based on the Maxwell GM200 GPU.

titanx3.jpg

JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.

titanx2.jpg

Any guesses on performance or price?

titanx1.jpg

titanx4.jpg

Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the card will only require one 6-pin and one 8-pin power connection, indicating that NVIDIA is still pushing power efficiency with GM200.

titanx5.jpg

Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridges to support it.

titanx6.jpg

GDC 15: Native versions of Doom 3, Crysis 3 running on Android, Tegra X1

Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM |
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3

NVIDIA just showed the new SHIELD, powered by Tegra X1, running native Android versions of both Doom 3 and Crysis 3! The games were running at impressive quality and performance levels.

I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games by these videos; they were recorded with a Panasonic GH2 pointed at a 4K TV in a dimly lit room.

Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!

While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the legwork to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek itself; hopefully the finished port looks as gorgeous as this first look.

GDC 15: Khronos Acknowledges Mantle's Start of Vulkan

Subject: General Tech, Graphics Cards, Shows and Expos | March 3, 2015 - 03:37 PM |
Tagged: vulkan, Mantle, Khronos, glnext, gdc 15, GDC, amd

khronos-group-logo.png

Neil Trevett, the current president of Khronos Group and a vice president at NVIDIA, made an on-the-record statement to acknowledge the start of the Vulkan API. The quote came to me via Ryan, but I think it is a copy-paste of an email, so it should be verbatim.

Many companies have made great contributions to Vulkan, including AMD who contributed Mantle. Being able to start with the Mantle design definitely helped us get rolling quickly – but there has been a lot of design iteration, not the least making sure that Vulkan can run across many different GPU architectures. Vulkan is definitely a working group design now.

So in short, the Vulkan API was definitely started with Mantle and grew from there as more stakeholders added their opinion. Vulkan is obviously different than Mantle in significant ways now, such as its use of SPIR-V for its shading language (rather than HLSL). To see a bit more information, check out our article on the announcement.

Update: AMD has independently released a statement related to Mantle's role in Vulkan.

EVGA and Inno3D Announce the First 4GB NVIDIA GeForce GTX 960 Cards

Subject: Graphics Cards | March 3, 2015 - 02:44 PM |
Tagged: video cards, nvidia, gtx 960, geforce, 4GB

They said it couldn't be done, but where there are higher density chips there's always a way. Today EVGA and Inno3D have both announced new versions of GTX 960 graphics cards with 4GB of GDDR5 memory, placing the cards in a more favorable mid-range position depending on the launch pricing.

960_evga.PNG

EVGA's new 4GB NVIDIA GTX 960 SuperSC

Along with the expanded memory capacity EVGA's card features their ACX 2.0+ cooler, which promises low noise and better cooling. The SuperSC is joined by a standard ACX and the higher-clocked FTW variant, which pushes Base/Boost clocks to 1304/1367MHz out of the box.

960_evga_2.PNG

Inno3D's press release provides fewer details, and the company appears to be launching a single new model featuring 4GB of memory which looks like a variant of their existing GTX 960 OC card.

inno3d_960.jpg

The existing Inno3D GTX 960 OC card

The current 2GB version of the GTX 960 can be found starting at $199, so expect these expanded versions to carry a price bump. The GTX 960, with only 1024 CUDA cores (half the count of a GTX 980) and a 128-bit memory interface, has nonetheless been a very good performer, putting up much better numbers than the GTX 760 it replaces and staying very competitive with AMD's R9 280/285. (It's a great overclocker, too.) The AMD/NVIDIA debate rages on, and NVIDIA's partners adding 4GB offerings to the mix will certainly add to the conversation, particularly as a 4GB version of the GTX 960 was originally said to be unlikely.
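To put "half the count of a GTX 980" in perspective, here is a quick paper-spec sketch (reference boost clocks assumed; retail cards clock higher):

```python
# Paper-spec comparison of the GTX 960 against the GTX 980 (reference clocks assumed).
cards = {
    # name: (CUDA cores, boost clock in GHz, bus width in bits, memory rate in Gbps)
    "GTX 980": (2048, 1.216, 256, 7.0),
    "GTX 960": (1024, 1.178, 128, 7.0),
}

for name, (cores, clock_ghz, bus_bits, mem_gbps) in cards.items():
    tflops = cores * 2 * clock_ghz / 1000     # 2 FLOPs per core per clock (FMA)
    bandwidth = bus_bits / 8 * mem_gbps       # GB/s
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32, {bandwidth:.0f} GB/s")
```

On paper the GTX 960 is almost exactly half a GTX 980 in both compute and memory bandwidth, which makes its showing against the R9 280/285 all the more notable.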

Source: EVGA

ARM and Geomerics Show Enlighten 3 Lighting, Integrate with Unity 5

Subject: Graphics Cards, Mobile | March 3, 2015 - 12:00 PM |
Tagged: Unity, lighting, global illumination, geomerics, GDC, arm

Back in 2013 ARM picked up a company called Geomerics, responsible for one of the industry’s most advanced dynamic lighting engines, used in games ranging from mobile to console to PC. Called Enlighten, it is the lighting engine in many major games across a variety of markets: Battlefield 3 uses it, Need for Speed: The Run does as well, and The Bureau: XCOM Declassified and Quantum Conundrum are another pair of major games that depend on Geomerics technology.

geo-3.jpg

Great, but what does that have to do with ARM, and why would the company be interested in investing in software that works with such a wide array of markets, most of which are not dominated by ARM processors? There are two answers, the first of which is directional: ARM is using the minds and creative talent behind Geomerics to help point the Cortex and Mali teams in the correct direction for CPU and GPU architecture development. By designing hardware to better address the advanced software and lighting systems Geomerics builds, Cortex and Mali will have some semblance of an advantage in specific gaming titles as well as a potential “general purpose” advantage. NVIDIA employs hundreds of gaming and software developers for this exact reason: what better way to make sure you are always at the forefront of the gaming ecosystem than getting high-level gaming programmers to point you to that edge? Qualcomm also recently (back in 2012) started employing game and engine developers in-house with the same goals.

ARM also believes it will be beneficial to bring publishers, developers and middleware partners to the ARM ecosystem through deployment of the Enlighten engine. It would be feasible to think console vendors like Microsoft and Sony would be more willing to integrate ARM SoCs (rather than the x86 used in the PS4 and Xbox One) when shown the technical capabilities brought forward by technologies like Geomerics Enlighten.

geomerics-1.jpg

It’s best to think of the Geomerics acquisition as a kind of insurance program for ARM, making sure both its hardware and software roadmaps are in line with industry goals and directives.

At GDC 2015 Geomerics is announcing the release of the Enlighten 3 engine, a new version that brings cinematic-quality real-time global illumination to market. Some of the biggest new features include additional accuracy on indirect lighting, color separated directional output (enables individual RGB calculations), better light map baking for higher quality output, and richer material properties to support transparency and occlusion.

All of this technology will be showcased in a new Subway demo that includes real-time global illumination simulation, dynamic transparency and destructible environments.

Geomerics Enlighten 3 Subway Demo

Enlighten 3 will also ship with Forge, a new lighting editor and pipeline tool for content creators looking to streamline the building process. Forge will allow imports from Autodesk 3ds Max and Maya, making interoperability easier. Forge uses a technology called YEBIS 3 to show estimated final quality without the time-consuming final-build processing.

geo-1.jpg

Finally, maybe the biggest news for ARM and Geomerics is that the Unity 5 game engine will be using Enlighten as its default lighting engine, giving ARM/Mali a potential advantage for gaming experiences in the near term. Of course Enlighten is available as an option for Unreal Engine 3 and 4 for developers using that engine in mobile, console and desktop projects as well as in an SDK form for custom integrations.

GDC 15: AMD Mantle Might Be Dead as We Know It: No Public SDK Planned

Subject: Graphics Cards | March 2, 2015 - 02:31 PM |
Tagged: sdk, Mantle, dx12, API, amd

The Game Developers Conference in San Francisco starts today and you can expect to see more information about DirectX 12 than you could ever possibly want, so be prepared. But what about the original low-level API, AMD's Mantle? Announced alongside the Radeon R9 290X/290, utilized in Battlefield 4 and Thief, and integrated into Crytek's engine (as announced last year), Mantle was truly the instigator that pushed Microsoft into moving DX12's development along at a faster pace.

Since DX12's announcement, AMD has claimed that Mantle would live on, bringing performance advantages to AMD GPUs and would act as the sounding board for new API features for AMD and game development partners. And, as was always trumpeted since the very beginning of Mantle, it would become an open API, available for all once it outgrew the beta phase that it (still) resides in.

mantle1.jpg

Something might have changed there.

A post over on the AMD Gaming blog from Robert Hallock has some news about Mantle to share as GDC begins. First, the good news:

AMD is a company that fundamentally believes in technologies unfettered by restrictive contracts, licensing fees, vendor lock-ins or other arbitrary hurdles to solving the big challenges in graphics and computing. Mantle was destined to follow suit, and it does so today as we proudly announce that the 450-page programming guide and API reference for Mantle will be available this month (March, 2015) at www.amd.com/mantle.
 
This documentation will provide developers with a detailed look at the capabilities we’ve implemented and the design decisions we made, and we hope it will stimulate more discussion that leads to even better graphics API standards in the months and years ahead.

That's great! We will finally be able to read about the API and how it functions, getting access to the detailed information we have wanted from the beginning. But then there is this portion:

AMD’s game development partners have similarly started to shift their focus, so it follows that 2015 will be a transitional year for Mantle. Our loyal customers are naturally curious what this transition might entail, and we wanted to share some thoughts with you on where we will be taking Mantle next:

AMD will continue to support our trusted partners that have committed to Mantle in future projects, like Battlefield™ Hardline, with all the resources at our disposal.

  1. Mantle’s definition of “open” must widen. It already has, in fact. This vital effort has replaced our intention to release a public Mantle SDK, and you will learn the facts on Thursday, March 5 at GDC 2015.
     
  2. Mantle must take on new capabilities and evolve beyond mastery of the draw call. It will continue to serve AMD as a graphics innovation platform available to select partners with custom needs.
     
  3. The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle "1.0" functionality, we suggest that you focus your attention on DirectX® 12 or GLnext.

Essentially, AMD's Mantle API in its "1.0" form is at the end of its life: it will only be supported for current partners, and the public SDK will never be posted. Honestly, at this point, this isn't so much a letdown as it is a necessity. DX12 and GLnext have already superseded Mantle in terms of market share and mind share with developers, and any more work AMD put into getting devs on board with Mantle would be wasted effort.

mantle-2.jpg

Battlefield 4 is likely to be the only major title to use AMD Mantle

AMD claims to have future plans for Mantle, though it will continue to be available only to select partners with "custom needs." I would imagine this expands beyond the world of PC games; game consoles could also be a target, since their developers are only concerned with AMD GPU hardware.

So - from our perspective, Mantle as we know it is pretty much gone. It served its purpose, making NVIDIA and Microsoft pay attention to the CPU bottlenecks in DX11, but it appears the dream was a bit bigger than the product could become. AMD shouldn't be chastised for this shift, nor for the lofty goals that we kind-of-always knew were too steep a hill to climb. Just revel in the news that pours from GDC this week about DX12.

Source: AMD

So Long Adware, and Thanks for All the Fish!

Subject: Graphics Cards | March 1, 2015 - 07:30 AM |
Tagged: superfish, Lenovo, bloatware, adware

Obviously, this does not erase the controversy that Lenovo got itself into, but it is certainly the correct response (if the company acts as it implies). Adware and bloatware are common finds on consumer PCs, making even the slowest of devices more sluggish as demos and sometimes straight-up advertisements claim their share of your resources. That does not even begin to touch on the security issues that some of these hitchhikers drag in. Again, I refer you to the aforementioned controversy.

lenovo-do.png

In response, albeit a delayed one, Lenovo has announced that, by the launch of Windows 10, they will only pre-install the OS and “related software”. Lenovo classifies this related software as drivers, security software, Lenovo applications, and applications for “unique hardware” (ex: software for an embedded 3D camera).

It looks to be a great step, but I need to call out “security software”. Windows 10 should ship with Microsoft's security applications in many regions, which raises the question of why a laptop provider would include an alternative. If the problem is that people expect McAfee or Symantec, then advertise the pre-loaded Microsoft anti-malware and keep it clean. Otherwise, it feels like keeping a single finger in the adware take-a-penny dish.

At least it is not as bad as trying to install McAfee every time you update Flash Player. I consider Adobe's tactic the greater of two evils on that one. I mean, unless Adobe just thinks that Flash Player is so insecure that you would be crazy to install it without a metaphorical guard watching over your shoulder.

And then of course we reach the divide between “saying” and “doing”. We will need to see Lenovo's actual Windows 10 devices to find out if the company kept its word, and followed the implications to a tee.

Source: Lenovo

Imagination Launches PowerVR GT7900, "Super-GPU" Targeting Consoles

Subject: Graphics Cards, Mobile | February 26, 2015 - 02:15 PM |
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900

As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off of a new graphics product it is calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU in the Series7XT family and targets a growing category called "affordable game consoles." Think Android-powered set-top devices like the Ouya or Amazon's Fire TV.

gt7900-1.png

PowerVR breaks up its GPU designs into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how you can compare its designs to other GPUs on the market. Imagination claims that the GT7900 will offer "PC-class gaming experiences", though that is about as ambiguous as the idea of the workload of a "console-level game." But with rated peak performance hitting over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half-precision), this GPU does have significant theoretical capability.

                  PowerVR GT7900              Tegra X1
  Vendor          Imagination Technologies    NVIDIA
  FP32 ALUs       512                         256
  FP32 GFLOPS     800                         512
  FP16 GFLOPS     1600                        1024
  GPU Clock       800 MHz                     1000 MHz
  Process Tech    16nm FinFET+                20nm TSMC
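The peak figures in that table fall straight out of the ALU count and clock, assuming one fused multiply-add (two FLOPs) per ALU per clock and FP16 running at twice the FP32 rate (the table rounds the GT7900's ~819 GFLOPS down to 800):

```python
# Reconstructing the peak-throughput rows from ALU count and clock speed.
def peak_gflops_fp32(alus: int, clock_mhz: int) -> float:
    """Peak FP32 = ALUs * 2 FLOPs (one FMA) per clock * clock speed."""
    return alus * 2 * clock_mhz / 1000

for name, alus, clock_mhz in [("PowerVR GT7900", 512, 800), ("Tegra X1", 256, 1000)]:
    fp32 = peak_gflops_fp32(alus, clock_mhz)
    print(f"{name}: ~{fp32:.0f} GFLOPS FP32, ~{fp32 * 2:.0f} GFLOPS FP16")
```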

Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."

gt7900-2.png

The FP16 performance number listed above is useful as an extreme power-savings option, since half-precision compute operates in a much more efficient manner. A fair concern is how many applications, GPGPU or gaming, actually utilize the FP16 data type, but having support for it in the GT7900 allows developers to target it.

Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation and ASTC LDR and HDR texture compression standards. The GPU also can run in a multi-domain virtualization mode that would allow multiple operating systems to run in parallel on a single platform.

gt7900-3.png

Imagination believes that this generation of PowerVR will "usher a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.

I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I should have some hands-on experiences to report back with soon. I am continually curious about the market for these types of high-end "mobile" GPUs given the limited audience that Android consoles currently address. Imagination does claim that the GT7900 beats products with performance levels as high as the GeForce GT 730M discrete GPU - no small feat.

NVIDIA Faces Class Action Lawsuit for the GeForce GTX 970

Subject: Graphics Cards | February 23, 2015 - 04:12 PM |
Tagged: nvidia, geforce, GTX 970

So apparently NVIDIA and a single AIB partner, Gigabyte, are facing a class action lawsuit because of the GeForce GTX 970 4GB controversy. I am not sure why they singled out Gigabyte, but I guess that is the way things go in the legal world. Unlucky for them, and seemingly lucky for the rest.

nvidia-970-architecture.jpg

For those who are unaware, the controversy is based on NVIDIA claiming that the GeForce GTX 970 has 4GB of RAM, 64 ROPs, and 2048 KB of L2 cache. In actuality, it has 56 ROPs and 1792KB of L2 cache. The main talking point is that the RAM is segmented into two partitions, one that is 3.5GB and another that is 0.5GB. All 4GB are present on the card, though, and accessible (unlike the disabled L2 cache and ROPs). Then again, I cannot find anything in that class action lawsuit's exhibits that claims an incorrect number of ROPs or amount of L2 cache.
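A minimal sketch of where those numbers come from, based on NVIDIA's after-the-fact description of the GM204 back end (eight 32-bit memory channels, each normally paired with 256KB of L2 and 8 ROPs, with one L2/ROP block disabled on the GTX 970):

```python
# GM204 back end: eight 32-bit memory channels, each paired with 256KB L2 and 8 ROPs.
# On the GTX 970 one L2/ROP block is disabled, but all eight memory channels remain.
channels = 8
l2_per_block_kb = 256
rops_per_block = 8
disabled_blocks = 1

spec_l2_kb = channels * l2_per_block_kb                        # 2048 KB (the advertised figure)
spec_rops = channels * rops_per_block                          # 64 ROPs
actual_l2_kb = (channels - disabled_blocks) * l2_per_block_kb  # 1792 KB
actual_rops = (channels - disabled_blocks) * rops_per_block    # 56 ROPs

# The orphaned channel's 0.5GB is still reachable, but only through a neighbouring
# L2 slice, which is why the 4GB is split into 3.5GB and 0.5GB pools.
memory_pools_gb = (3.5, 0.5)
print(spec_l2_kb, spec_rops, "->", actual_l2_kb, actual_rops, sum(memory_pools_gb))
```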

Again, the benchmarks that you saw when the GeForce GTX 970 launched are still valid. Since the issue came up, Ryan has also tried various configurations of games in single- and multi-GPU systems to find conditions that would make the issue appear.

Source: Court Filing

Windows Update Installs GeForce 349.65 with WDDM 2.0

Subject: General Tech, Graphics Cards | February 21, 2015 - 04:23 PM |
Tagged: wddm 2.0, nvidia, geforce 349.65, geforce, dx12

Update 2: Outside sources have confirmed to PC Perspective that this driver contains DirectX 12 as well as WDDM 2.0. They also claim that Intel and AMD have DirectX 12 drivers available through Windows Update as well. After enabling iGPU graphics on my i7-4790K, the Intel HD 4600 received a driver update, which also reports as WDDM 2.0 in DXDIAG. I do not have a compatible AMD GPU to test against (just a couple of old Windows 7 laptops) but the source is probably right and some AMD GPUs will be updated to DX12 too.

So it turns out that if your motherboard dies during a Windows Update reboot, then you are going to be spending several hours reinstalling software and patches, but that is not important. What is interesting is the installed version number for NVIDIA's GeForce Drivers when Windows Update was finished with its patching: 349.65. These are not available on NVIDIA's website, and the Driver Model reports WDDM 2.0.

nvidia-34965-driver.png

It looks like Microsoft pushed out NVIDIA's DirectX 12 drivers through Windows Update. Update 1 Pt. 1: The "Runtime" reporting 11.0 is confusing, though; perhaps this is just DX11 with WDDM 2.0?

nvidia-34965-dxdiag.png
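If you want to check what your own system reports, one way (a quick sketch; the dxdiag /t export flag is standard, but the report's exact formatting and encoding can vary between Windows versions) is to dump a DxDiag report to a text file and look for the per-adapter "Driver Model" lines:

```python
# Windows-only sketch: export a DxDiag report and print any "Driver Model" lines
# (e.g. "Driver Model: WDDM 2.0"). dxdiag /t can take several seconds to finish.
import os
import subprocess
import tempfile

report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
subprocess.run(["dxdiag", "/t", report], check=True)

with open(report, encoding="utf-8", errors="ignore") as f:
    for line in f:
        if "Driver Model" in line:
            print(line.strip())
```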

I am hearing online that these drivers support the GeForce 600 series and later GPUs, and that there are later, non-public drivers available (such as 349.72 whose release notes were leaked online). NVIDIA has already announced that DirectX 12 will be supported on GeForce 400-series and later graphics cards, so Fermi drivers will be coming at some point. For now, it's apparently Kepler-and-later, though.

So with OS support and, now, released graphics drivers, all that we are waiting on is software and an SDK (plus any NDAs that may still be in effect). With Game Developers Conference (GDC 2015) coming up in a little over a week, I expect that we will get each of these very soon.

Update 1 Pt. 2: I should note that the release notes for 349.72 specifically mention DirectX 12. As mentioned above, it is possible that 349.65 contains just WDDM 2.0 and not DX12, but it contains at least WDDM 2.0.

GPU Market sees 20-point swing in 2014: NVIDIA gains, AMD falls

Subject: Graphics Cards | February 21, 2015 - 12:18 PM |
Tagged: radeon, nvidia, marketshare, market share, geforce, amd

One of the perennial firms that measures GPU market share, Jon Peddie Research, has come out with a report on Q4 of 2014 this weekend and the results are eye opening. According to the data, NVIDIA and AMD each took dramatic swings from Q4 of 2013 to Q4 of 2014.

            Q4 2014    Q3 2014    Q4 2013    Year-to-year Change
  AMD       24.0%      28.4%      35.0%      -11.0%
  Matrox    0.00%      0.10%      0.10%      -0.1%
  NVIDIA    76.0%      71.5%      64.9%      +11.1%
  S3        0.00%      0.00%      0.00%      +0.0%

Data source: Jon Peddie Research

Here is the JPR commentary to start us out:

JPR's AIB Report tracks computer add-in graphics boards, which carry discrete graphics chips. AIBs used in desktop PCs, workstations, servers, and other devices such as scientific instruments. They are sold directly to customers as aftermarket products, or are factory installed. In all cases, AIBs represent the higher end of the graphics industry using discrete chips and private high-speed memory, as compared to the integrated GPUs in CPUs that share slower system memory.

The news was encouraging and seasonally understandable, quarter-to-quarter, the market decreased -0.68% (compared to the desktop PC market, which decreased 3.53%).

On a year-to-year basis, we found that total AIB shipments during the quarter fell -17.52% , which is more than desktop PCs, which fell -0.72%.

However, in spite of the overall decline, somewhat due to tablets and embedded graphics, the PC gaming momentum continues to build and is the bright spot in the AIB market.

icon.jpg

NVIDIA's Maxwell GPU

The overall PC desktop market increased quarter-to-quarter, including double-attach (the adding of a second or third AIB to a system with integrated processor graphics) and, to a lesser extent, dual AIBs in performance desktop machines using either AMD's Crossfire or Nvidia's SLI technology.

The attach rate of AIBs to desktop PCs has declined from a high of 63% in Q1 2008 to 36% this quarter.

The year-to-year change that JPR is reporting is substantial, showing a swing of more than 20 points in market share in favor of NVIDIA over AMD. According to this data, AMD's market share has dropped from 35% at the end of 2013 to just 24% at the end of 2014. Meanwhile, NVIDIA continues to truck forward, going from 64.9% at the end of 2013 to 76% at the end of 2014.
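The 20-plus-point figure is simply the two year-over-year moves combined; put another way, the gap between the two vendors widened from roughly 30 points to 52. A quick check against the table above:

```python
# Reproducing the "20+ point" swing from JPR's Q4 2013 and Q4 2014 shares.
amd_2013, amd_2014 = 35.0, 24.0
nvidia_2013, nvidia_2014 = 64.9, 76.0

gap_2013 = nvidia_2013 - amd_2013   # 29.9 points
gap_2014 = nvidia_2014 - amd_2014   # 52.0 points

print(f"AMD {amd_2014 - amd_2013:+.1f}, NVIDIA {nvidia_2014 - nvidia_2013:+.1f}, "
      f"gap widened by {gap_2014 - gap_2013:.1f} points")
```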

08.jpg

The Radeon R9 285 release didn't have the impact AMD had hoped

Clearly the release of NVIDIA's Maxwell GPUs (the GeForce GTX 750 Ti, GTX 970 and GTX 980) has impacted the market even more than we initially expected. In recent weeks the GTX 970 has been getting a lot of negative press over the memory issue, and I will be curious to see what effect this has on sales in the near future. But the 12-month swing that you see in the table above is the likely cause of the sudden departures of John Byrne, Collette LaForce and Raj Naik.

AMD has good products, even better pricing and a team of PR and marketing folks that are talented and aggressive. So how can the company recover from this? Products, people; new products. Will the rumors circling around the Radeon R9 390X develop into such a product?

Hopefully 2015 will provide it.

EVGA would like to give you a GTX 960 SSC and a Z97 FTW motherboard

Subject: Editorial, General Tech, Graphics Cards, Motherboards | February 20, 2015 - 05:10 PM |
Tagged: z97, gtx 960, giveaway, evga, contest

I know, the nerve of some people. Jacob from EVGA emails me this week, complaining about how he has this graphics card and motherboard just sitting in his cubicle taking up space and "why won't I just give it away already!?"

Fine. I'll do it. For science.

So let's make this simple shall we? EVGA wants to get rid of some kick-ass gaming hardware and you want to win it. Why muddle up a good thing?

The Prizes

  • EVGA GeForce GTX 960 SSC
     
    • The EVGA GeForce GTX 960 delivers incredible performance, power efficiency, and gaming technologies that only NVIDIA Maxwell technology can offer. This is the perfect upgrade, offering 60% faster performance and twice the power efficiency of previous-generation cards*. Plus, it features VXGI for realistic lighting, support for smooth, tear-free NVIDIA G-SYNC technology, and Dynamic Super Resolution for 4K-quality gaming on 1080P displays.
       
    • The new EVGA ACX 2.0+ cooler brings new features to the award winning EVGA ACX 2.0 cooling technology. A Memory MOSFET Cooling Plate (MMCP) reduces MOSFET temperatures up to 11°C, and optimized Straight Heat Pipes (SHP) reduce GPU temperature by an additional 5°C. ACX 2.0+ coolers also feature optimized Swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU.

evgacontest1.jpg

  • EVGA Z97 FTW Motherboard
     
    • Welcome to a new class of high performance motherboards with the EVGA Z97 lineup. These platforms offer a return to greatness with a new GUI BIOS interface, reimagined power VRM that focuses on efficiency, and are loaded with features such as Intel® Gigabit LAN, Native SATA 6G/USB 3.0 and more.
       
    • Engineered for the performance users with excellent overclocking features. Includes a GUI BIOS that is focused on functionality, new software interface for overclocking in the O.S., high quality components, M.2 storage option and more.

evgacontest2.jpg

The Process (aka how do you win?)

So even though I'm doing all the work getting this hardware out of Jacob's busy hands and to our readers...you do have to do a couple of things to win the hardware as well. 

  1. Fill out the questionnaire below.
     
  2. Enter the "secret phrase" from tonight's 337th episode of the PC Perspective Podcast. We'll be live streaming at 10pm ET / 7pm PT or you can wait for the downloadable version at http://www.pcper.com/podcast or the video version on our PC Perspective YouTube channel

The contest will run for one week so you will have more than enough time to listen to or watch the podcast and get the super-secret answer. We'll ship to anywhere in the world and one person will win both fantastic prizes! Once the contest closes (Wednesday, February 25th at 12pm ET) we'll randomly draw a winner from the form below that got the correct answer!

A HUGE thanks goes to our friends at EVGA for supplying the hardware for our giveaway. Good luck!

Source: EVGA

You have to pay to play, Gigabyte's overclockable GTX 980 G1 GAMING

Subject: Graphics Cards | February 20, 2015 - 02:08 PM |
Tagged: gigabyte, nvidia, GTX 980 G1 GAMING, windforce, maxwell, factory overclocked

If you want the full complement of Maxwell Streaming Multiprocessors and ROP units, as well as that last 500MB of RAM running at full speed, then you will need to pay for a GTX 980.  One choice is Gigabyte's GTX 980 G1 Gaming, which will cost you $580, around $240 more than a GTX 970, but the premium can be worth it if you need the power.  [H]ard|OCP took the already overclocked card from a boost GPU frequency of 1329MHz and memory at 7GHz all the way to a boost of 1513MHz with the memory topping out at 8.11GHz.  That overclock had a noticeable effect on performance and helped the card garner an Editor's Choice award.  See it in action here.


"Today we have the GIGABYTE GTX 980 G1 GAMING, which features the WINDFORCE 600W cooling system and a high factory overclock. We will make comparisons to the competition, find out how fast it is compared to a GTX 980, and you won't believe the overclock we achieved! We will make both performance and price comparisons."


Source: [H]ard|OCP

NVIDIA Recants: Overclocking Returning to Mobile GPUs

Subject: Graphics Cards, Mobile | February 19, 2015 - 03:58 PM |
Tagged: nvidia, notebooks, mobile, gpu

After a week or so of debate circling NVIDIA's decision to disable overclocking on mobile GPUs, we have word that the company has reconsidered and will be re-enabling the feature in next month's driver release:

As you know, we are constantly tuning and optimizing the performance of your GeForce PC.

We obsess over every possible optimization so that you can enjoy a perfectly stable machine that balances game, thermal, power, and acoustic performance.

Still, many of you enjoy pushing the system even further with overclocking.

Our recent driver update disabled overclocking on some GTX notebooks. We heard from many of you that you would like this feature enabled again. So, we will again be enabling overclocking in our upcoming driver release next month for those affected notebooks. 

If you are eager to regain this capability right away, you can also revert back to 344.75.

Now, I don't want to brag here, but we did just rail on NVIDIA for this decision on last night's podcast...and then the reversal was posted on NVIDIA's forums just four hours ago... I'm not saying, but I'm just saying!

nvidia-logo.jpg

All kidding aside, this is great news! And NVIDIA desperately needs to be paying attention to what consumers are asking for in order to make up for some poor decisions made in the last several months. Now (or at least soon), you will be able to return to your mobile GPU overclocking!

Amazon, Newegg and others offering partial refunds on GTX 970 purchases

Subject: Graphics Cards | February 19, 2015 - 01:51 PM |
Tagged: nvidia, memory issue, GTX 970, geforce

It looks like some online retailers are offering either partial refunds or full refunds (with return) to users who complain about the implications of the GeForce GTX 970 memory issue. On January 25th NVIDIA came clean about the true memory architecture of the GTX 970, which included changes to specifications around L2 cache and ROP count in addition to the division of the 4GB of memory into two distinct pools. Some users have complained about performance issues in heavily memory-dependent gaming scenarios, even though my own testing has been less than conclusive.

GM204_arch.jpg

Initially there was a demand for a recall or some kind of compensation from NVIDIA regarding the issue but that has fallen flat. What appears to be working (for some people) is going to the retailer directly. A thread on reddit.com's Hardware sub-reddit shows quite a few users were able to convince Amazon to issue 20-30% price refunds for their trouble. Even Newegg has been spotted offering either a full refund with return after the typical return window or a 20% credit through gift cards. 

gtx970refund.jpg

To be fair, some users are seeing their requests denied:

"After going back and forth for the past hour I manage to escalated my partial refund to a supervisor who promptly declined it."

"Are you serious? NewEgg told me to go complain to the vendor."

So, while we can debate the necessity or validity of these types of full or partial refunds on moral grounds, the truth is that this is happening. I'm very curious to hear what NVIDIA's internal thinking is on this matter and whether it will impact relationships between NVIDIA, its add-in card partners and the online retailers themselves. Who ends up paying the final costs is still up in the air, I would bet.

Our discussion on the original GTX 970 Issue - Subscribe to our YouTube Channel for more!

What do you think? Is this just some buyers taking advantage of the situation for their own gain? Warranted requests from gamers that were taken advantage of? Leave me your thoughts in the comments!


Source: Reddit.com

ASUS Announces GTX 960 Mini - A New Small Form-Factor Graphics Card

Subject: Graphics Cards | February 16, 2015 - 11:04 AM |
Tagged: SFF, nvidia, mini-ITX GPU, mini-itx, gtx 960, graphics, gpu, geforce, asus

ASUS returns to the mini-ITX friendly form-factor with the GTX 960 Mini (officially named GTX960-MOC-2GD5 for maximum convenience), their newest NVIDIA GeForce GTX 960 graphics card.


Other than the smaller size to allow compatibility with a wider array of small enclosures, the GTX 960 Mini also features an overclocked core and promises "20% cooler and vastly quieter" performance from its custom heatsink and CoolTech fan. Here's a quick rundown of key specs:

  • 1190 MHz Base Clock / 1253 MHz Boost Clock
  • 1024 CUDA cores
  • 2GB 128-bit GDDR5 @ 7010 MHz
  • 3x DisplayPort, 1x HDMI 2.0, 1x DVI output


No word on the pricing or availability of the card just yet. The other mini-ITX version of the GTX 960 on the market, from Gigabyte, has been selling for $199.99, so expect this one to run somewhere between $200 and $220 at launch.


ASUS has reused this image from the GTX 970 Mini launch, and so have I

The product page is up on the ASUS website so availability seems imminent.

Source: ASUS

Ubisoft Discusses Assassin's Creed: Unity with Investors

Subject: General Tech, Graphics Cards | February 15, 2015 - 07:30 AM |
Tagged: ubisoft, DirectX 12, directx 11, assassins creed, assassin's creed, assasins creed unity

During a conference call with investors, analysts, and press, Yves Guillemot, CEO of Ubisoft, highlighted the issues with Assassin's Creed: Unity with an emphasis on the positive outcomes going forward. Their quarter itself was good, beating expectations and allowing them to raise full-year projections. As expected, they announced that a new Assassin's Creed game would be released at the end of the year based on the technology they created for Unity, with “lessons learned”.

ubisoft-assassins-creed-unity-scene.jpg

Before optimization, every material on every object is at least one draw call.

Of course, there are many ways to optimize... but that effort works against future titles.

After their speech, the question period revisited the topic of Assassin's Creed: Unity and how it affected current sales, how it would affect the franchise going forward, and how should they respond to that foresight (Audio Recording - The question starts at 25:20). Yves responded that they redid “100% of the engine”, which was a tremendous undertaking. “When you do that, it's painful for all the group, and everything has to be recalibrated.” He continues: “[...] but the engine has been created, and it is going to help that brand to shine in the future. It's steps that we need to take regularly so that we can constantly innovated. Those steps are sometimes painful, but they allow us to improve the overall quality of the brand, so we think this will help the brand in the long term.”

This makes a lot of sense to me. When the issues first arose, it was speculated that the engine was pushing way too many draw calls, especially for DirectX 11 PCs. At the time, I figured that Ubisoft chose Assassin's Creed: Unity to be the first title to use their new development pipeline, focused on many simple assets rather than batching things together to minimize host-to-GPU and GPU-to-host interactions. Tens of thousands of individual tasks being sent to the GPU will choke a PC, and getting it to run at all on DirectX 11 might have diverted resources from, or even caused, many of the glitches. Currently, a few thousand is ideal although “amazing developers” can raise the ceiling to about ten thousand.
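To make the draw call point concrete, here is a toy sketch (not Ubisoft's actual pipeline, just an illustration of the trade-off their engine team faced): issuing one draw per material per object versus batching objects that share a material into a single draw.

```python
# Toy illustration of draw-call batching: naive per-object/per-material draws
# versus grouping objects that share a material into one draw each.
from collections import defaultdict

def naive_draw_calls(objects):
    """One draw per (object, material) pair - the 'tens of thousands' case."""
    return sum(len(materials) for _, materials in objects)

def batched_draw_calls(objects):
    """Group by material and draw each group once (e.g. via instancing or merging)."""
    groups = defaultdict(list)
    for name, materials in objects:
        for mat in materials:
            groups[mat].append(name)
    return len(groups)

# 10,000 props, each with 3 materials, drawn from a palette of 200 materials.
scene = [(f"prop_{i}", [f"mat_{(i + k) % 200}" for k in range(3)]) for i in range(10_000)]
print(naive_draw_calls(scene), "->", batched_draw_calls(scene))   # 30000 -> 200
```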

This also means that I expect the next Assassin's Creed title to support DirectX 12, possibly even in the graphics API's launch window. If I am correct, Ubisoft has been preparing for it for a long time. Of course, it is possible that I am simply wrong, but it would align with Microsoft's Holiday 2015 expectation for the first, big-budget titles to use the new interface and it would be silly to have done their big overhaul without planning on switching to DX12 ASAP.

Then there is the last concern: If I am correct, what should Ubisoft have done? Is it right for them to charge full price for a title that they know will have necessary birth pains? Do they delay it and risk (or even accept) that it will be non-profitable, and upset fans that way? There does not seem to be a clear answer, with all outcomes being some flavor of damage control.

Source: GamaSutra

Maxwell keeps on overclocking

Subject: Graphics Cards | February 12, 2015 - 01:41 PM |
Tagged: overclocking, nvidia, msi, gtx 960, GM206, maxwell

While Ryan was slaving over a baker's dozen of NVIDIA's GTX 960s, [H]ard|OCP focused on overclocking the MSI GeForce GTX 960 GAMING 2G that they recently reviewed.  Out of the box this GPU will hit 1366MHz in game, with the memory frequency unchanged at 7GHz effective.  As users have discovered, overclocking cards that automatically downclock the GPU once a certain TDP threshold is reached is a little more tricky, as simply feeding the card more voltage can raise power draw and temperature enough that you end up with a lower sustained frequency than before you overvolted.  After quite a bit of experimentation, [H] managed to boost the memory to a full 8GHz while the in-game GPU clock hit 1557MHz, which is at the higher end of what Ryan saw.  The trick was to increase the Power Limit and turn the clock speed up but leave the voltage alone.
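The reasoning behind that trick can be sketched with a toy model (purely illustrative, not how GPU Boost is actually implemented): dynamic power scales roughly with frequency times voltage squared, so extra voltage burns through a fixed power budget at a lower clock.

```python
# Toy model of a power-limited boost clock, assuming dynamic power ~ k * f * V^2.
# k is an arbitrary illustrative constant, not a measured value for any real card.
def sustained_clock_mhz(requested_mhz: float, voltage: float,
                        power_limit_w: float, k: float = 0.07) -> float:
    """Return the clock (MHz) the card can actually hold under its power limit."""
    clock_at_limit = power_limit_w / (k * voltage ** 2)
    return min(requested_mhz, clock_at_limit)

# Same requested clock and power limit; only the voltage differs.
print(sustained_clock_mhz(1550, 1.20, 160))   # ~1550 MHz: the request fits in the budget
print(sustained_clock_mhz(1550, 1.30, 160))   # ~1352 MHz: the extra voltage eats the budget
```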


"We push the new MSI GeForce GTX 960 GAMING video card to its limits of performance by overclocking to its limits. This NVIDIA GeForce GTX 960 GPU based video card has a lot of potential for hardware enthusiasts and gamers wanting more performance. We compare it with other overclocked cards to see if the GTX 960 can keep up."


Source: [H]ard|OCP

ASUS Launches GTX 750 Ti Strix OC Edition With Twice the Memory

Subject: Graphics Cards | February 11, 2015 - 11:55 PM |
Tagged: strix, maxwell, gtx 750ti, gtx 750 ti, gm107, factory overclocked, DirectCU II

ASUS is launching a new version of its factory overclocked GTX 750 Ti STRIX with double the memory of the existing STRIX-GTX750TI-OC-2GD5. The new card will feature 4GB of GDDR5, but is otherwise identical.

The new graphics card pairs the NVIDIA GM107 GPU and 4GB of memory with ASUS’ dual fan "0dB" DirectCU II cooler. The card can output video over DVI, HDMI, and DisplayPort.

Thanks to the aftermarket cooler, ASUS has factory overclocked the GTX 750 Ti GPU (640 CUDA cores) to a respectable 1124 MHz base and 1202 MHz GPU Boost clockspeeds. (For reference, stock clockspeeds are 1020 MHz base and 1085 MHz boost.) However, while the amount of memory has doubled, the memory speed remains the same at a stock 5.4 Gbps (effective).

Asus GTX 750Ti Strix 4GB Factory Overclocked Graphics Card.jpg

ASUS has not announced pricing or availability for the new card, but expect it to arrive soon at a slight premium (~$15) over the $160 2GB STRIX 750Ti.

The additional memory (and its usefulness versus the price premium) is a bit of a head-scratcher considering this is a budget card aimed at delivering decent 1080p gaming. The extra memory may help in cranking up the game graphics settings just a bit more. In the end, the extra memory is nice to have, but if you find a good deal on a 2GB card today, don’t get too caught up waiting for a 4GB model.

Source: TechPowerUp