DirectX 12 Shipping with Windows 10

Subject: Graphics Cards | October 3, 2014 - 03:18 AM |
Tagged: microsoft, DirectX, DirectX 12, windows 10, threshold, windows

A Microsoft blog post confirms: "The final version of Windows 10 will ship with DirectX 12". To me, this seems like a fairly obvious statement. The loose dates provided for both the OS and the availability of retail games suggest that the two would be launching at roughly the same time. The post also claims that DirectX 12 "Early Access" members will be able to develop with the Windows 10 Technical Preview. Alongside Unreal Engine 4 (for Epic Games subscribers), Intel will also provide source access to its Asteroids demo, shown at SIGGRAPH 2014, to all accepted early access developers.

windows-directx12-landscapes.jpg

Our readers might find this news slightly disappointing, as it could be interpreted to mean that DirectX 12 will not be coming to Windows 7 (or even 8.x). While the outlook is not as hopeful as before, Microsoft has never, at any point, explicitly said that it will not come to older operating systems. It still might.

Source: Microsoft

The MSI GeForce GTX 970 GAMING 4G is sitting right in the sweet spot

Subject: Graphics Cards | October 2, 2014 - 04:01 PM |
Tagged: msi, GTX 970 GAMING 4G, factory overclocked

It is sadly out of stock on both Newegg and Amazon right now, but MSI's $350 GTX 970 GAMING 4G is an incredible buy and worth waiting for.  The factory overclock already set up on this card is quite nice, a core clock rated at 1140MHz base / 1279MHz boost, which [H]ard|OCP actually observed hitting as high as 1366MHz, until they overclocked it and reached 1542MHz before the 110% GPU power limit ended their fun.  It would seem that the card is capable of more, if only you were not prevented from feeding it more than that extra 10%.  The card was already beating the 780 Ti and R9 290 before the overclock, but you should read the full review to see what happened once they tested it at full speed.
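For context, here is a quick sketch of the arithmetic behind those clock figures. The helper function is purely illustrative; the MHz values are the ones quoted from the review:

```python
def oc_gain_pct(new_mhz: float, old_mhz: float) -> float:
    """Percent clock increase going from old_mhz to new_mhz."""
    return (new_mhz - old_mhz) / old_mhz * 100.0

# Review figures: 1140 MHz base, 1279 MHz rated boost,
# 1366 MHz observed at stock, 1542 MHz after manual overclocking.
print(round(oc_gain_pct(1366, 1279), 1))  # observed boost over rated: 6.8%
print(round(oc_gain_pct(1542, 1140), 1))  # manual OC over base clock: 35.3%
```

That 35% swing over the base clock is why the 110% power limit feels so restrictive here.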

1411976595nitFZ11Eg1_1_9_l.jpg

"The MSI GeForce GTX 970 GAMING 4G video card is making the GeForce GTX 780 and AMD Radeon R9 290 obsolete. This $349 video card puts up a fight and punches in a win at this price. The overclock alone is somewhat staggering. If you are about to spend money on a GPU, don't miss this one."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP
Author:
Manufacturer: EVGA

Installation and Overview

While once a very popular way to cool your PC, the art of custom water loops tapered off in the early 2000s as the benefits of better cooling, and overclocking in general, met with diminishing returns. In its place grew a host of companies offering closed loop systems: individually sealed coolers for processors, and even graphics cards, that offered some of the benefits of traditional water cooling (noise, performance) without the hassle of setting up a water cooling configuration manually.

A bit of a resurgence has occurred in the last year or two, though, where the art and styling provided by custom water loop cooling is starting to reassert itself in the PC enthusiast mindset. Some companies never left (EVGA being one of them), but it appears that many users are returning to it. Consider me part of that crowd.

IMG_9943.jpg

During a live stream we held with EVGA's Jacob Freeman, the very first prototype of the EVGA Hydro Copper was shown and discussed. Luckily for us, I was able to convince Jacob to leave the water block with me for a few days to do some of our testing and see just how much capability we could pull out of the GM204 GPU on a GeForce GTX 980.

Our performance preview today will look at the water block itself, installation, performance and temperature control. Keep in mind that this is a very early prototype, the first one to make its way to US shores. There will definitely be some changes and updates (in both the hardware and the software support for overclocking) before final release in mid to late October. Should you consider this ~$150 Hydro Copper water block for your GTX 980?

Continue reading our preview of the EVGA GTX 980 Hydro Copper Water Block!!

Broadwell-U (BGA) Lineup Leaked

Subject: Graphics Cards, Processors | September 30, 2014 - 03:33 AM |
Tagged: iris, Intel, core m, broadwell-y, broadwell-u, Broadwell

Intel's upcoming 14nm product line, Broadwell, is expected to have six categories of increasing performance. Broadwell-Y, later branded Core M, is part of the soldered BGA family at expected TDPs of 3.5 to 4.5W. Above this is Broadwell-U, which are also BGA packages, and thus require soldering by the system builder. VR-Zone China has a list of seemingly every 15W SKU in that category. 28W TDP "U" products are expected to be available in the following quarter, but are not listed.

intel-broadwell-u_1.png

Image Credit: VR-Zone

As for those 15W parts, there are seventeen (17!) of them, ranging from Celeron to Core i7. While each product is dual-core, the Core i3 models and up have Hyper-Threading, increasing parallelism to four simultaneous tasks. In terms of cache, Celerons and Pentiums will have 2MB, Core i7s will have 4MB, and everything in between will have 3MB. Otherwise, the products vary in the clock frequencies they were binned at and the integrated graphics that they contain.

intel-broadwell-u_2.png

Image Credit: VR-Zone

These iGPUs range from "Intel HD Graphics" on the Celerons and Pentiums, to "Intel Iris Graphics 6100" on one Core i7, two Core i5s, and one Core i3. The rest pretty much alternate between Intel HD Graphics 5500 and Intel HD Graphics 6000. The maximum frequency of any given iGPU can vary within the same product name, but only by about 100 MHz at most. The exact spread is below.

  • Intel HD Graphics: 300 MHz base clock, 800 MHz at load.
  • Intel HD Graphics 5500: 300 MHz base clock, 850-950 MHz at load (depending on SKU).
  • Intel HD Graphics 6000: 300 MHz base clock, 1000 MHz at load.
  • Intel Iris Graphics 6100: 300 MHz base clock, 1000-1100 MHz at load (depending on SKU).

Unfortunately, without the number of shader units to go along with the core clock, we cannot derive a FLOP value yet. This is a very important metric for increasing resolution and shader complexity, and it would provide a relatively fair way to compare the new parts against previous offerings at higher resolutions and quality settings, especially in DirectX 12 I would assume.
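Once those shader counts do surface, the missing FLOP value falls out of a simple formula: shader units × 2 FLOPs per clock (one fused multiply-add) × clock speed. A minimal sketch follows; the 192-ALU configuration in the example is hypothetical, since the unit counts are exactly what we don't know yet for these Broadwell parts:

```python
def peak_gflops(shader_units: int, clock_mhz: float,
                flops_per_unit_per_clock: int = 2) -> float:
    """Peak single-precision GFLOPS: units x FLOPs-per-clock (FMA = 2) x clock."""
    return shader_units * flops_per_unit_per_clock * clock_mhz / 1000.0

# Hypothetical example: 192 shader ALUs running at the 1100 MHz load clock
# listed for the top Iris 6100 SKUs.
print(peak_gflops(192, 1100))  # 192 * 2 * 1.1 GHz = 422.4 GFLOPS
```

The same function would let you line these parts up against Haswell's iGPUs, once the real unit counts leak.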

intel-broadwell-iris-graphics-6100.png

Image Credit: VR-Zone

Probably the most interesting part to me is that "Intel HD Graphics" without a number meant GT1 with Haswell. Starting with Broadwell, it has apparently been upgraded to GT2. As we can see from even the 4.5W Core M processors, Intel is taking graphics seriously. It is unclear whether their intention is to respect gaming's influence on device purchases, or if they believe that generalized GPU compute will be "a thing" very soon.

Source: VR-Zone

AMD Catalyst 14.9 for Windows

Subject: Graphics Cards | September 29, 2014 - 05:33 PM |
Tagged: whql, radeon, Catalyst 14.9, amd

AMD.jpg

The full release notes are available here or take a look at the highlights below.

The latest version of the AMD Catalyst Software Suite, AMD Catalyst 14.9, is designed to support the following Microsoft Windows platforms:

Highlights of AMD Catalyst 14.9 Windows Driver

  • Support for the AMD Radeon R9 285
  • Performance improvements (comparing AMD Catalyst 14.9 vs. AMD Catalyst 14.4)
    • 3DMark Sky Diver improvements
      • AMD A4 6300 – improves up to 4%
      • Enables AMD Dual Graphics / AMD CrossFire support
    • 3DMark Fire Strike
      • AMD Radeon R9 290 Series - improves up to 5% in Performance Preset
    • 3DMark11
      • AMD Radeon R9 290 Series / R9 270 Series - improves up to 4% in Entry and Performance Preset
    • BioShock Infinite
      • AMD Radeon R9 290 Series – 1920x1080 - improves up to 5%
    • Company of Heroes 2
      • AMD Radeon R9 290 Series - improves up to 8%
    • Crysis 3
      • AMD Radeon R9 290 Series / R9 270 Series – improves up to 10%
    • Grid Auto Sport
      • AMD CrossFire profile
    • Murdered: Soul Suspect
      • AMD Radeon R9 290X (2560x1440, 4x MSAA, 16x AF) – improves up to 50%
      • AMD Radeon R9 290 Series / R9 270 Series – improves up to 6%
      • CrossFire configurations improve scaling up to 75%
    • Plants vs. Zombies (Direct3D performance improvements)
      • AMD Radeon R9 290X - 1920x1080 Ultra – improves up to 11%
      • AMD Radeon R9 290X - 2560x1600 Ultra – improves up to 15%
      • AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
    • Batman Arkham Origins:
      • AMD Radeon R9 290X (4x MSAA) – improves up to 20%
      • CrossFire configurations see up to a 70% gain in scaling
    • Wildstar
      • PowerXpress profile: performance improvements to improve application smoothness
      • Performance improves up to 30% on the AMD Radeon R9 and R7 Series of products for both single GPU and Multi-GPU configurations
    • Tomb Raider
      • AMD Radeon R9 290 Series – improves up to 5%
    • Watch Dogs
      • AMD Radeon R9 290 Series / R9 270 Series – improves up to 9%
      • AMD CrossFire – Frame pacing improvement
      • Improved CrossFire performance – up to 20%
    • Assassin's Creed IV
      • Improves CrossFire scaling (3840x2160 High Settings) up to 93% (CrossFire scaling improvement of 25% compared to AMD Catalyst 14.4)
    • Lichdom
      • Improves performance for single GPU and Multi-GPU configurations
    • StarCraft II
      • AMD Radeon R9 290X (2560x1440, AA, 16x AF) – improves up to 20%

AMD Eyefinity enhancements

  • Mixed Resolution Support
    • A new architecture providing brand new capabilities
    • Display groups can be created with monitors of different resolution (including difference sizes and shapes)
    • Users have a choice of how surface is created over the display group
      • Fill – legacy mode, best for identical monitors
      • Fit – create the Eyefinity surface using best available rectangular area with attached displays
      • Expand – create a virtual Eyefinity surface using desktops as viewports onto the surface
    • Eyefinity Display Alignment
      • Enables control over alignment between adjacent monitors
      • One-Click Setup: the driver detects the layout of the extended desktop
      • Can create Eyefinity display group using this layout in one click!
      • New user controls for video color and display settings
      • Greater control over Video Color Management:
        • Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
        • Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
        • Allows users to select different color depths per resolution and display

AMD Mantle enhancements

  • Mantle now supports AMD Mobile products with Enduro technology
    • Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) – 21% gain
    • Thief: AMD Radeon HD 8970M (1920x1080; high settings) – 14% gain
    • Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) – 274% gain
  • Enables support for Multi-GPU configurations with Thief (requires the latest Thief update)
  • AMD AM1 JPEG decoding acceleration
    • JPEG decoding acceleration was first enabled on the A10 APU Series in AMD Catalyst 14.1 beta, and has now been extended to the AMD AM1 Platform
    • Provides fast, power-efficient JPEG decompression

Resolved Issues

  • 60Hz SST flickering has been identified as an issue with non-standard display timings exhibited by the AOC U2868PQU panel on certain AMD Radeon graphics cards. A software workaround has been implemented in the AMD Catalyst 14.9 driver to resolve the display timing issues with this display
  • Users seeing flickering issues in 60Hz SST mode are further encouraged to obtain newer display firmware from their monitor vendor that will resolve flickering at its origin.
  • Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection.
  • 4K panel flickering issues found on the AMD Radeon R9 290 Series and AMD Radeon HD 7800 Series
  • Screen tearing observed on AMD CrossFire systems with Eyefinity portrait display configurations
  • Instability issues for Grid Autosport when running in 2x1 or 1x2 Eyefinity configurations
  • Geometry corruption in State of Decay

Source: AMD

Apple A8 Die Shot Released (and Debated)

Subject: Graphics Cards, Processors, Mobile | September 29, 2014 - 01:53 AM |
Tagged: apple, a8, a7, Imagination Technologies, PowerVR

First, Chipworks released a die shot of the new Apple A8 SoC (stored at archive.org). It is based on TSMC's 20nm fabrication process, whose entire capacity Apple allegedly bought. From there, a bit of a debate arose regarding what each group of transistors represents. All sources agree that it is based around a dual-core CPU, but the GPU is a bit polarizing.

apple-a8-dieshot-chipworks.png

Image Credit: Chipworks via Ars Technica

Most sources, including Chipworks, Ars Technica, AnandTech, and so forth, believe that it is a quad-core graphics processor from Imagination Technologies. Specifically, they expect that it is the GX6450 from the PowerVR Series 6XT. This is a narrow upgrade over the G6430 found in the Apple A7 processor, which is in line with the initial benchmarks that we saw (and not in line with the 50% GPU performance increase that Apple claims). For programmability, the GX6450 is equivalent to a DirectX 10-level feature set, unless it was extended by Apple, which I doubt.

apple-a8-dieshot-dailytech.png

Image Source: DailyTech

DailyTech has their own theory, suggesting that it is a GX6650 that is horizontally-aligned. From my observation, their "Cluster 2" and "Cluster 5" do not look identical at all to the other four, so I doubt their claims. I expect that they heard Apple's 50% claims, expected six GPU cores as the rumors originally indicated, and saw cores that were not there.

Which brings us back to the question of, "So what is the 50% increase in performance that Apple claims?" Unless they had a significant increase in clock rate, I still wonder if Apple is claiming that their increase in graphics performance will come from the Metal API even though it is not exclusive to new hardware.

But from everything we saw so far, it is just a handful of percent better.

EVGA Live Stream! Learn about Maxwell, X99 and Win Prizes!!

Subject: General Tech, Graphics Cards, Motherboards, Cases and Cooling | September 28, 2014 - 12:25 AM |
Tagged: X99, video, maxwell, live, GTX 980, GTX 970, evga

UPDATE: If you missed the live stream with myself and Jacob, you can catch the entire event in the video below. You won't want to miss out on seeing the first ever GTX 980 water block as well as announcements on new Torq mice!

EVGA has been a busy company recently. It has continued to innovate with new coolers for the recent GTX 980 and GTX 970 card releases, newer power supplies offering unique features along with improved quality and power output, and a new line of X99 chipset motherboards including a Micro ATX variant; hey, the company even released a line of high-performance mice this year! PC Perspective has covered basically all of these releases (and will continue to do so with pending GPU and motherboard reviews), but there is a lot that needs explaining.

To help out, an industry and community favorite will be stopping by the PC Perspective offices from EVGA: Jacob Freeman. You might know him as @EVGA_JacobF on Twitter, or have seen him on countless forums, but he will be making an in-person appearance on Friday, September 26th on PC Perspective Live! We plan on discussing the brand new ACX 2.0 cooler on the Maxwell GPUs released last week, going over some of the highlights of the new X99 motherboards, and even touching on power supplies and the Torq mice line as well.

pcperlive.png

EVGA GTX 980/970, X99, PSU and Torq Live Stream featuring Jacob Freeman

3pm ET / 12pm PT - September 26th

PC Perspective Live! Page

EVGA has been a supporter of PC Perspective for a long time and we asked them to give back to our community during this live stream - and they have stepped up! Look at this prize list:

How can you participate and win these awesome pieces of hardware? Just be here at 3pm ET / 12pm PT on http://www.pcper.com/live and we'll be announcing winners as we go for those that tune in. It really couldn't be simpler!

x99class.jpg

If you have questions you want to ask Jacob about EVGA, or any of its line of products, please leave them in the comments section below and we'll start compiling a list to address on the live stream Friday. Who knows, we may even save some prizes for some of our favorite questions!

To make sure you don't miss our live stream events, be sure you sign up for our spam-free PC Perspective Live! Mailing List. We email that group a couple hours before each event gets started.

NVIDIA Confirms It Has No Plans to Support Adaptive Sync

Subject: Graphics Cards | September 27, 2014 - 07:24 PM |
Tagged: nvidia, maxwell, gsync, g-sync, freesync, adaptive sync

During an interview that we streamed live with NVIDIA's Tom Petersen this past Thursday, it was confirmed that NVIDIA is not currently working on, or has any current plans to, add support for the VESA-based and AMD-pushed Adaptive Sync portion of the DisplayPort 1.2a specification. To quote directly:

"There is no truth [to that rumor of NVIDIA Adaptive Sync support] and we have made no official comments about Adaptive Sync. One thing I can say is that NVIDIA as a company is 100% dedicated to G-Sync. We are going to continue to invest in G-Sync and it is a way we can make the gaming experience better. We have no need for Adaptive Sync. We have no intention of [implementing it]."

Discussion of G-Sync begins at 1:27:14 in our interview.

To be clear, the Adaptive Sync portion of DP 1.2a and 1.3+ is an optional part of the VESA spec, not required for future graphics processors or even future display scaler chips. That means that upcoming graphics cards from NVIDIA could still be DisplayPort 1.3 compliant without implementing support for the Adaptive Sync feature. Based on the comments above, I fully expect that to be the case.

IMG_9328.JPG

The ASUS ROG Swift PG278Q G-Sync monitor

With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users of Radeon cards. (Where Intel falls in this is up in the air.) Clearly that isn't ideal for a completely open ecosystem, but NVIDIA has made the point, over and over, that what it has developed with G-Sync is difficult, and not at all as simple as could be solved with the blunt instrument that Adaptive Sync is. NVIDIA has a history of producing technologies and then keeping them in-house, focusing development specifically on GeForce owners and fans. The dream of having a VRR monitor that will run on both vendors' GPUs appears to be dead.

When asked about the possibility of seeing future monitors that can support both NVIDIA G-Sync technology as well as Adaptive Sync technology, Petersen stated that while not impossible, he "would not expect to see such a device."

The future of G-Sync is still in development. Petersen stated:

"Don't think that we're done. G-Sync is not done. Think of G-Sync as the start of NVIDIA solving the problems for gamers that are related to displays... G-Sync is our first technology that makes games look better on displays. But you can start looking at displays and make a lot of things better."

tearing5.jpg

Diagram showing how G-Sync affects monitor timings

So now we await the first round of prototype FreeSync / Adaptive Sync monitors to hit our labs. AMD has put a lot of pressure on itself for this release by claiming, numerous times, that FreeSync will be just as good an experience as G-Sync, and I am eager to see if it can meet that goal. Despite any ill feelings some users might have about NVIDIA and some of its policies, the company typically does a good job of maintaining a high-quality user experience with these custom technologies. AMD will have to prove that what it has developed is on the same level. We should know more before we get too much further into fall.

You can check out our stories and reviews covering G-Sync here:

The Evil Within System Requirements

Subject: General Tech, Graphics Cards | September 27, 2014 - 02:59 AM |
Tagged: rage, pc gaming, consolitis

Shinji Mikami has been developing a survival horror game, which makes sense given a good portion of his portfolio. He created Resident Evil and much of the following franchise. The Evil Within is about to release, having recently gone gold. At around this time, publishers begin to release system requirements and Bethesda does not disappoint in that regard.

bethesda-evilwithin.jpg

Are the requirements... RAGE-inducing?

A case could be made for disappointing requirements, themselves, though.

Basically, Bethesda did not release minimum requirements. Instead, they said "This is what we recommend. It will run on less. Hope it does!" This would not be so problematic if one of their requirements wasn't a "GeForce GTX 670 with 4GBs of VRAM".

They also recommend a quad-core Core i7, 4GB of system memory, 50GB of hard drive space, and a 64-bit OS (Windows 7 or Windows 8.x).

Before I go on, I would like to mention that The Evil Within is built on the RAGE engine. Our site has dealt extensively with that technology when it first came out in 2011. While I did not have many showstopping performance problems with that game, personally, it did have a history with texture streaming. Keep that in mind as you continue to read.

A typical GTX 670 does not even have 4GB of VRAM. In fact, even the GTX 780 Ti does not have 4GB of VRAM. Thankfully, both of the newly released Maxwell GPUs, the GTX 970 and the GTX 980, have 4GB. Basically, Bethesda is saying, "I really hope you bought the custom model from your AIB vendor". They literally say:

Note: We do not have a list of minimum requirements for the game. If you’re trying to play with a rig with settings below these requirements (you should plan to have 4 GBs of VRAM regardless), we cannot guarantee optimal performance.

Each time I read "you should plan to have 4 GBs of VRAM regardless", it becomes more difficult for me to form an opinion about it. That is a lot of memory. Personally, I would wait for reviews and benchmarks, specifically for the PC, before purchasing the title. These recommended settings could be fairly loose, to suit the vision of the game developers, or the game could be a revival of RAGE, this time without the engine's original architect on staff.

The Evil Within launches on October 14th.

Source: Bethesda

New GPUs also mean lower prices for the previous generation

Subject: Graphics Cards | September 26, 2014 - 02:22 PM |
Tagged: asus, ROG, gtx 780 ti, MATRIX Platinum, DirectCU II

With the release of the new Maxwell cards comes an opportunity for those on a smaller budget to still get a decent upgrade for their systems.  Early adopters will often sell their previous GPUs once they've upgraded, allowing you to get a better card than your budget would usually allow, though with the risk of ending up with a bum card.  The ASUS ROG GTX 780 Ti MATRIX Platinum is a good example, with a DirectCU II air cooler for general usage, while the LN2 switch allows more extreme cooling methods for those looking for something a little more impressive.  The factory overclock is not bad at 1006/1072MHz core and 7GHz effective memory, but the overclock [H]ard|OCP managed, 1155/1220MHz and 7.05GHz, pushes the performance above that of the R9 290X of the same family.  If you can find this card used at a decent price, it could give you more of an upgrade than you thought you could afford.

141133148469XXJKl2RB_1_1.jpg

"In today's evaluation we are breaking down the ASUS ROG GTX 780 Ti MATRIX Platinum video card. We put this head-to-head with the ASUS ROG R9 290X MATRIX Platinum. Which provides a better gaming experience, best overclocking performance, and power and temperature? Which one provides the best value? "

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP