More NVIDIA GeForce GTX 980 Leaked Details

Subject: General Tech, Graphics Cards | September 15, 2014 - 08:02 PM |
Tagged: nvidia, geforce, GTX 980

Details and photographs of the GeForce GTX 980 are leaking on various forums and websites. Based on the Maxwell architecture, it is supposed to be faster and more efficient than Kepler while being manufactured on the same 28nm fab process. While we were uncertain before, it now looks like GTX 980 will be its formal name, as seen in the leaked photographs below.

NVIDIA-GeForce-GTX-980-Front-Picture.jpg

Image Credit: Videocardz

As expected, the cooler is a continuation of NVIDIA's reference design, as seen on recent high-end graphics cards (such as the GeForce Titan). The interesting part is that it is rated for about 250W, whereas Maxwell is rumored to draw 180W. The reference card has two six-pin PCIe power connectors, and I am curious to see whether the excess cooling capacity will lead to interesting overclocks. That is before even mentioning what the AIB partners can do.

NVIDIA-GeForce-GTX-980-Back-Picture.jpg

Image Credit: Videocardz

Beyond a cooler that is over-engineered for Maxwell's rumored TDP, the card also includes a back plate.

nvidia-gtx-980-display-connectors.jpg

Image Credit: ChipHell via Videocardz

Its display connectors have been hotly anticipated. As you can see above, the GTX 980 has five outputs: three DisplayPort, one HDMI, and one DVI. Which version of HDMI? Which version of DisplayPort? No clue at the moment. There has been some speculation regarding HDMI 2.0, and the DisplayPort 1.3 standard was just released to the public today, but I would be pleasantly surprised if even one of these made it in.

Check out Videocardz for a little bit more.

Source: Videocardz

AMD R9 390X Is Rumored to Be Liquid Cooled

Subject: General Tech, Graphics Cards, Cases and Cooling | September 15, 2014 - 05:50 PM |
Tagged: amd, R9, r9 390x, liquid cooler, liquid cooling, liquid cooling system, asetek

Less than a year after the launch of AMD's R9 290X, we are beginning to hear rumors of a follow-up. What is being called the R9 390X (if it ends up named anything else, then that was a very short-lived branding scheme) might be liquid cooled. This would be the first single-processor reference graphics card to ship with an integrated water cooler. That said, the public evidence is not as firm as I would normally like.

amd-r9-390x-leak.jpg

Image Credit: Baidu Forums

According to Tom's Hardware, Asetek is working on a liquid-cooled design for "an undisclosed OEM". The product is expected to ship during the first half of 2015 and the press release claims that it will "continue Asetek's success in the growing liquid cooling market". Technically, this could be a collaboration with an AIB partner, not necessarily a GPU developer. That said, the leaked photograph looks like a reference card.

We don't really know anything more than this. I would expect that it will be a refresh based on Hawaii, but that is pure speculation. I have no evidence to support that.

Personally, I would hope that a standalone air-cooled model will also be available. While I have no experience with liquid cooling, it seems like a bit of an extra burden that not every purchaser of a top-of-the-line, single-GPU add-in board would want to bear; specifically, placing the radiator, if their case even supports one. That said, having a high-performing reference card will probably make the initial benchmarks look extra impressive, which could be a win in itself.

NCIX and LinusTech Do Four Single GPUs... Twice

Subject: General Tech, Graphics Cards | September 11, 2014 - 07:46 PM |
Tagged: quad sli, quad crossfire, nvidia, amd

Psst. AMD fans. Don't tell "Team Green", but Linus decided to take four R9 290X graphics cards and configure them in Quad CrossFire formation. They did not seem to have too much difficulty setting it up, although they did have trouble with throttling and with setting up CrossFire profiles. When they were finally able to test it, they got a 3DMark Fire Strike Extreme score of 14,979.

Psst. NVIDIA fans. Don't tell "Team Red", but Linus also decided to take four GeForce Titan Black graphics cards and configure them in Quad SLI formation. He had a bit of a difficult time setting up the machine at first, requiring a reshuffle of the cards (how would reordering identical cards across PCIe slots change anything?) and enduring a few driver crashes, but it worked. Eventually, they got a 3DMark Fire Strike Extreme score of around 13,300 (give or take a couple hundred).

NVIDIA GM204 info is leaking

Subject: General Tech, Graphics Cards | September 8, 2014 - 05:03 PM |
Tagged: leak, nvidia, GM204, GTX 980, GTX 980M, GTX 970, GTX 970M

Please keep in mind that this information has been assembled from research done by WCCF Tech and Videocardz on 3DMark entries for unreleased GPUs; we won't get the official numbers until the middle of this month. That said, rumours and guesswork about new hardware are a favourite pastime of our readers, so here is the information we've seen so far about the upcoming GM204 chip from NVIDIA. On the desktop side are the GeForce GTX 980 and GeForce GTX 970, which should both have 4GB of GDDR5 on a 256-bit bus, with GPU clock speeds ranging from 1127 to 1190 MHz. The performance shown on 3DMark has the GTX 980 beating the 780 Ti and R9 290X, and the GTX 970 performing similarly to the plain GTX 780 while falling behind the 290X. SLI scaling looks rather attractive, with a pair of GTX 980s coming within a hair of the performance of the R9 295X2.

NVIDIA-GeForce-GTX-980-and-GeForce-GTX-970-3DMark-Firestrike-Performance-635x599.png

On the mobile side, things look bleak for AMD: the GTX 980M and GTX 970M surpass the current GTX 880M, which in turn benchmarks far better than AMD's M290X chip. Again, the scaling in SLI systems should be impressive, assuming the leaks (which you can see in depth here) are accurate. It won't be too much longer before we know one way or the other, so you might want to keep your finger off the Buy button for a short while.

Source: WCCF Tech

Intel Graphics Drivers Claim Significant Improvements

Subject: General Tech, Graphics Cards, Processors | September 6, 2014 - 05:25 PM |
Tagged: iris pro, iris, intel hd graphics, Intel

I was originally intending to test this with benchmarks but, after a little while, I realized that Ivy Bridge was not supported; this graphics driver starts and ends with Haswell. While I cannot verify their claims, Intel advertises up to 30% more performance in some OpenCL tasks and a 10% increase in games like Batman: Arkham City and Sleeping Dogs. They even claim double the performance in League of Legends at 1366x768.

inteltf2.jpg

Intel is giving gamers a "free lunch".

The driver also tunes Conservative Morphological Anti-Aliasing (CMAA). They claim it looks better than MLAA and FXAA, "without performance impact" (their whitepaper from March showed a ~1-to-1.5 millisecond cost on Intel HD 5000). Intel recommends disabling it after exiting games to prevent it from blurring other applications, and they automatically disable it in Windows, Internet Explorer, Chrome, Firefox, and Windows 8.1 Photo.

Adaptive Rendering Control was also added in this driver. This limits redrawing identical frames by comparing the ones it does draw with previously drawn ones, and adjusts the frame rate accordingly. This is most useful for games like Angry Birds, Minesweeper, and Bejeweled LIVE. It is disabled when not on battery power, or when the driver is set to "Maximum Performance".
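As a rough illustration of the idea (a sketch only; Intel's actual mechanism lives in the driver and display engine, and the frame size here is hypothetical), skipping identical frames can be as simple as comparing each new frame against the last one presented:

```c
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical framebuffer: 1366x768 at 32 bits per pixel. */
#define FRAME_BYTES (1366 * 768 * 4)

static uint8_t previous_frame[FRAME_BYTES];

/* Present a frame only if it differs from the last one shown.
   Returns true if the frame was actually handed to the display. */
bool present_if_changed(const uint8_t *frame)
{
    if (memcmp(frame, previous_frame, FRAME_BYTES) == 0)
        return false;                 /* identical frame: skip the redraw */

    memcpy(previous_frame, frame, FRAME_BYTES);
    /* ...submit the frame to the display engine here... */
    return true;
}
```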

The Intel Iris and HD Graphics driver is available from Intel, for both 32-bit and 64-bit Windows 7, 8, and 8.1, on many Haswell-based processors.

Source: Intel

AMD's Dropping the R9 295X2 Price to $999 USD

Subject: General Tech, Graphics Cards | September 3, 2014 - 09:28 PM |
Tagged: amd, R9, r9 295x2, price cut

amd-r9-295x2-gigabyte.jpg

While not fully in effect yet, AMD is cutting $500 off of the R9 295X2's price tag, bringing it to $999 USD. Currently, there are two models available on Newegg USA at the reduced price, and one at Amazon for $1200. We expect to see other SKUs reduced soon as well. This puts the water-cooled R9 295X2 just below the cost of two air-cooled R9 290X graphics cards.

If you were interested in this card, now might be the time (if one of the reduced units is available).

Source: Newegg

Matrox Creates Professional Graphics Cards with AMD GPUs

Subject: General Tech, Graphics Cards | September 3, 2014 - 06:15 PM |
Tagged: Matrox, firepro, cape verde xt gl, cape verde xt, cape verde, amd

Matrox, along with S3, develops GPU ASICs for desktop add-in boards, alongside AMD and NVIDIA. Last year, they sold fewer than 7,000 units in a quarter, according to my math (a market share that rounds to 0.0% implies less than 0.05% of the total market, which worked out to roughly 7,000 units that quarter). Today, Matrox Graphics Inc. announced that they will use an AMD GPU in their upcoming product line.

Matrox-AMD-Logos-Image.jpg

While they do not mention a specific processor, they note that "the selected AMD GPU" will be manufactured on a 28nm process with 1.5 billion transistors. It will support DirectX 11.2, OpenGL 4.4, and OpenCL 1.2, and it will have a 128-bit memory bus.

Basically, it kind of has to be Cape Verde XT (or XT GL), unless it is a new, unannounced GPU.

If it is Cape Verde XT, it would have about 1.0 to 1.2 TFLOPS of single-precision performance, depending on the chosen clock rate; whatever clock is chosen, the chip contains 640 shader processors. It was first released in February 2012 with the Radeon HD 7770 GHz Edition. Again, this assumes that AMD will not release a GPU refresh for that category.
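As a sanity check on that range (a back-of-the-envelope sketch; the two clock rates are assumptions chosen to bracket the figure above), single-precision throughput for a GPU like this is two FLOPs, one fused multiply-add, per shader per clock:

```c
#include <stdio.h>

int main(void)
{
    const int shaders = 640;                    /* Cape Verde XT stream processors */
    const double clocks_ghz[] = { 0.80, 0.94 }; /* assumed clock range, in GHz */

    for (int i = 0; i < 2; i++) {
        /* 2 FLOPs (one fused multiply-add) per shader per clock. */
        double tflops = 2.0 * shaders * clocks_ghz[i] / 1000.0;
        printf("%.2f GHz -> %.2f TFLOPS\n", clocks_ghz[i], tflops);
    }
    return 0;
}
```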

Matrox will provide their PowerDesk software to configure multiple monitors, working alongside AMD's professional graphics drivers. It is sad to see a GPU ASIC manufacturer throw in the towel, at least temporarily, but hopefully they can use AMD's technology to remain in the business with competitive products. Who knows: maybe they will make a return when future graphics APIs reduce the burden of driver and product development?

Source: Matrox

PNY's Customized OC series, decent value for great performance

Subject: Graphics Cards | August 25, 2014 - 03:57 PM |
Tagged: pny, gtx 780, gtx 780 ti, Customized OC, factory overclocked

PNY is not as heavily marketed as some GPU resellers in North America, but that doesn't mean they are not hard at work designing custom cards. Hardware Canucks recently tried out the Customized OC GTX 780 and 780 Ti, testing the factory overclocks as well as pushing the cards to their limits with manual overclocking. Using EVGA's Precision overclocking tools, they pushed the GTX 780 to 1120MHz core and 6684MHz RAM, and the Ti to an impressive 1162MHz core and 7800MHz RAM. Read on to see how effective the custom cooler proved to be, as it is also a major part of the Customized series.

PNY-CUSTOM-1.jpg

"PNY's latest Customized series will be rolling through their GTX 780 and GTX 780 Ti lineups, bringing high end cooling and increased performance."


AMD Announces Radeon R9 285X and R9 285 Graphics Cards

Subject: Graphics Cards | August 23, 2014 - 10:46 AM |
Tagged: radeon, r9 285, R9, amd, 285

Today, during AMD's live stream event celebrating 30 years of graphics and gaming, the company spent a bit of time announcing and teasing new graphics cards: the Radeon R9 285X and R9 285. The specifications haven't been confirmed, but most believe the chip, likely based on the Tonga GPU die, will feature 2048 stream processors, 128 texture units, 32 ROPs, and a 256-bit memory bus.

r9285.jpg

In a move to help raise money for the Child's Play charity, AMD currently has an AMD Radeon R9 285 listed on eBay. The listing shows an ASUS-built, Strix-style cooled retail card, with 2GB of memory being the only specification visible on the box.

r92852.jpg

More than likely, the R9 285X and R9 285 will replace the R9 280X and R9 280, and we should see them shipping and available in very early September.

UPDATE: AMD showed specifications of the Radeon R9 285 during the live stream.

r92853.jpg

For those of you with eyes as bad as mine, here are the finer points:

  • 1,792 Stream Processors
  • 918 MHz GPU Clock
  • 3.29 TFLOPS peak performance
  • 112 Texture units
  • 32 ROPs
  • 2GB GDDR5
  • 256-bit memory bus
  • 5.5 GHz memory clock
  • 2x 6-pin power connectors
  • 190 watt TDP
  • $249 MSRP
  • Release date: September 2nd

These Tonga GPU specifications are VERY similar to those of the R9 280: 1,792 stream processors, 112 texture units, etc. However, the R9 280 had a wider memory bus (384-bit), albeit one running at a 500 MHz lower effective frequency. Core clock speeds on Tonga look slightly lower as well. Maybe most interesting is the frame buffer size drop from 3GB to 2GB.
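To put numbers on that trade-off (a quick sketch; the R9 280's 5.0 GHz effective memory clock is inferred from the 500 MHz gap mentioned above), peak memory bandwidth is just the bus width in bytes times the effective transfer rate:

```c
#include <stdio.h>

/* Peak bandwidth (GB/s) = bus width in bytes * effective rate in GT/s. */
static double bandwidth_gbps(int bus_bits, double effective_gtps)
{
    return (bus_bits / 8.0) * effective_gtps;
}

int main(void)
{
    printf("R9 280: %.0f GB/s\n", bandwidth_gbps(384, 5.0)); /* 240 GB/s */
    printf("R9 285: %.0f GB/s\n", bandwidth_gbps(256, 5.5)); /* 176 GB/s */
    return 0;
}
```

So, on paper, the R9 285 gives up roughly a quarter of the R9 280's peak memory bandwidth.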

That's all we have for now, but I expect we'll have our samples in very soon and expect a full review shortly!

UPDATE 2: Apparently AMD hasn't said anything about the Radeon R9 285X, so for the time being, that still falls under the "rumor" category. I'm sure we'll know more soon though.

Source: AMD

PCPer Live! Recap - NVIDIA G-Sync Surround Demo and Q&A

Subject: Graphics Cards, Displays | August 22, 2014 - 08:05 PM |
Tagged: video, gsync, g-sync, tom petersen, nvidia, geforce

Earlier today we had NVIDIA's Tom Petersen in studio to discuss the retail availability of G-Sync monitors, as well as to go hands-on with a set of three ASUS ROG Swift PG278Q monitors running in G-Sync Surround! It was a truly impressive sight, and if you missed any of it, you can catch the entire replay right here.

Even if seeing the ASUS PG278Q monitor again doesn't interest you (we have our full review of the monitor right here), you won't want to miss the very detailed Q&A, which answers quite a few reader questions about the technology. Covered items include:

  • Potential added latency of G-Sync
  • Future needs for multiple DP connections on GeForce GPUs
  • Upcoming 4K and 1080p G-Sync panels
  • Can G-Sync Surround work through an MST Hub?
  • What happens to G-Sync when the frame rate exceeds the panel refresh rate? Or drops below minimum refresh rate?
  • What does that memory on the G-Sync module actually do??
  • A demo of the new NVIDIA SHIELD Tablet capabilities
  • A whole lot more!

Another big thank you to NVIDIA and Tom Petersen for stopping out our way and for spending the time to discuss these topics with our readers. Stay tuned here at PC Perspective as we will have more thoughts and reactions to G-Sync Surround very soon!!

NVIDIA Live Stream: We Want Your Questions!

Subject: Graphics Cards, Displays, Mobile | August 21, 2014 - 05:23 PM |
Tagged: nvidia, video, live, shield, shield tablet, g-sync, gsync, tom petersen

Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates, and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.

On hand we'll be doing demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here) and also show off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them can be like.

pcperlive.png

NVIDIA Live Stream with Tom Petersen

9am PT / 12pm ET - August 22nd

PC Perspective Live! Page

The topic list is going to include (but is not limited to):

  • ASUS PG278Q G-Sync monitor
  • G-Sync availability and pricing
  • G-Sync Surround setup, use and requirements
  • Technical issues surrounding G-Sync: latency, buffers, etc.
  • Comparisons of G-Sync to Adaptive Sync
  • SHIELD Tablet game play
  • Altoids?

gsyncsurround.jpg

But we want your questions! Do you have burning issues that you think need to be addressed by Tom and the NVIDIA team about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs, and more? Nothing is off limits here, though obviously Tom may be cagey on future announcements. Please use the comments section on this news post below (registration not required) to ask your questions, and we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to give away for live viewers as well...

See you tomorrow!!

An odd Q2 for tablets and PCs

Subject: General Tech, Graphics Cards | August 19, 2014 - 12:30 PM |
Tagged: jon peddie, gpu market share, q2 2014

Jon Peddie Research's latest Market Watch adds even more ironic humour to the media's continuing proclamations of the PC industry's impending doom. This quarter saw tablet sales decline while overall PC sales were up, and that was without any major releases to drive purchasers to adopt new technology. While JPR does touch on the overall industry, this report focuses on the sale of GPUs and APUs, and it happens to contain some great news for AMD: they saw their share of the market increase by 11% over last quarter, which amounts to just over a percentage point of the entire market. Intel saw a small rise in share, and it still holds the majority of the market, as PCs with no discrete GPU are more likely to contain Intel's chips than AMD's. That leaves NVIDIA, who are still banking solely on discrete GPUs; they saw more than an 8% decline from last quarter, a drop of almost two percentage points of the total market. Check out the other graphs in JPR's overview right here.

unnamed.jpg

"The big drop in graphics shipments in Q1 has been partially offset by a small rise this quarter. Shipments were up 3.2% quarter-to-quarter, and down 4.5% compared to the same quarter last year."


Khronos Announces "Next" OpenGL & Releases OpenGL 4.5

Subject: General Tech, Graphics Cards, Shows and Expos | August 15, 2014 - 08:33 PM |
Tagged: siggraph 2014, Siggraph, OpenGL Next, opengl 4.5, opengl, nvidia, Mantle, Khronos, Intel, DirectX 12, amd

Let's be clear: there are two stories here. The first is the release of OpenGL 4.5, and the second is the announcement of the "Next Generation OpenGL Initiative". They both occur in the same press release, but they are two different announcements.

OpenGL 4.5 Released

OpenGL 4.5 expands the core specification with a few extensions. Compatible hardware, with OpenGL 4.5 drivers, is guaranteed to support these. This includes features like ARB_direct_state_access, which allows modifying objects without first binding them to the context, and support for OpenGL ES 3.1 features that are traditionally missing from OpenGL 4, which allows easier porting of OpenGL ES 3.1 applications to OpenGL.
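To illustrate what that change looks like in code (a minimal sketch; it assumes an OpenGL 4.5 context and a loader that has already resolved these entry points), here is the same buffer upload written in the old bind-to-edit style and in the direct state access style:

```c
#include <GL/glcorearb.h>  /* core profile types and enums */

/* Pre-4.5 pattern: editing a buffer means binding it first, which
   disturbs the GL_ARRAY_BUFFER binding as a side effect. */
GLuint upload_bound(const void *data, GLsizeiptr size)
{
    GLuint buf;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);
    glBufferData(GL_ARRAY_BUFFER, size, data, GL_STATIC_DRAW);
    return buf;
}

/* OpenGL 4.5 direct state access: the object is named explicitly,
   and no binding point is touched. */
GLuint upload_dsa(const void *data, GLsizeiptr size)
{
    GLuint buf;
    glCreateBuffers(1, &buf);
    glNamedBufferData(buf, size, data, GL_STATIC_DRAW);
    return buf;
}
```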

opengl_logo.jpg

It also adds a few new optional extensions:

ARB_pipeline_statistics_query lets a developer ask the GPU what it has been doing. This could be useful for "profiling" an application (list completed work to identify optimization points).

ARB_sparse_buffer allows developers to perform calculations on pieces of generic buffers without loading them entirely into memory. This is similar to ARB_sparse_texture... except that those are textures, and these are buffers. Buffers are useful for things like vertex data (and so forth).

ARB_transform_feedback_overflow_query is apparently designed to let developers check whether a transform feedback buffer has overflowed and decide whether or not to draw objects accordingly. I might be wrong, but it seems like this would be useful for deciding whether or not to draw objects generated by geometry shaders.

KHR_blend_equation_advanced allows new blending equations between objects. If you use Photoshop, these would be "multiply", "screen", "darken", "lighten", "difference", and so forth. On NVIDIA's side, this will be directly supported on Maxwell and Tegra K1 (and later). Fermi and Kepler will support the functionality, but the driver will perform the calculations with shaders. AMD has yet to comment, as far as I can tell.
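For reference, the per-channel math behind two of those modes is simple (a simplified sketch of the opaque case; the extension itself defines its equations over premultiplied colors, with proper alpha handling):

```c
/* Colors normalized to the 0..1 range; src is the incoming fragment,
   dst is what is already in the framebuffer. */
float blend_multiply(float src, float dst) { return src * dst; }
float blend_screen(float src, float dst)   { return src + dst - src * dst; }
```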

nvidia-opengl-debugger.jpg

Image from NVIDIA GTC Presentation

If you are a developer, NVIDIA has released 340.65 (340.23.01 for Linux) beta drivers with OpenGL 4.5 support. If you are not looking to create OpenGL 4.5 applications, do not get this driver; you really should not have any use for it at all.

Next Generation OpenGL Initiative Announced

The Khronos Group has also announced "a call for participation" to outline a new specification for graphics and compute. They want it to give developers explicit control over CPU and GPU tasks, be multithreaded, have minimal overhead, have a common shader language, and undergo "rigorous conformance testing". This sounds a lot like the design goals of Mantle (and what we know of DirectX 12).

amd-mantle-queues.jpg

And really, from what I hear and understand, that is what OpenGL needs at this point. Graphics cards look nothing like they did a decade ago (let alone two decades ago). They each have very similar interfaces and data structures, even if their fundamental architectures vary greatly. If we can draw a line in the sand, legacy APIs can be supported but not heavily optimized for by the drivers. After a short time, available performance for legacy applications would be so high that it wouldn't matter, as long as they continue to run.

Add to that, next-generation drivers should be significantly easier to develop, considering the reduced error checking (and other responsibilities). As I said in Intel's DirectX 12 story, it is still unclear whether this will bring enough of a performance increase to make most optimizations, such as those which increase workload or developer effort in exchange for queuing fewer GPU commands, unnecessary. We will need to wait for game developers to use it for a bit before we know.

AMD Catalyst 14.7 Release Candidate 3

Subject: Graphics Cards | August 14, 2014 - 07:20 PM |
Tagged: catalyst 14.7 RC3, beta, amd

A new Catalyst Release Candidate has arrived, and as with the previous driver, it no longer supports Windows 8.0 or its WDDM 1.2 driver model, so upgrade to Windows 7 or Windows 8.1 before installing, please. AMD will eventually release a driver that supports WDDM 1.1 under Windows 8.0 for those who do not upgrade.

AMD-Catalyst-12-11-Beta-11-7900-Modded-Driver-Crafted-for-Performance.jpg

Feature Highlights of the AMD Catalyst 14.7 RC3 Driver for Windows

  • Includes all improvements found in the AMD Catalyst 14.7 RC driver

  • Display interface enhancements to improve 4K monitor performance and reduce flickering.
  • Improvements apply to the following products:
    • AMD Radeon R9 290 Series
    • AMD Radeon R9 270 Series
    • AMD Radeon HD 7800 Series
  • Even with these improvements, cable quality and other system variables can affect 4K performance. AMD recommends using DisplayPort 1.2 HBR2 certified cables with a length of 2m (~6 ft) or less when driving 4K monitors.
  • Wildstar: AMD CrossFire profile support
  • Lichdom: Single GPU and Multi-GPU performance enhancements
  • Watch Dogs: Smoother gameplay on single GPU and Multi-GPU configurations

Feature Highlights of the AMD Catalyst 14.7 RC Driver for Windows

  • Includes all improvements found in the AMD Catalyst 14.6 RC driver
    • AMD CrossFire and AMD Radeon Dual Graphics profile update for Plants vs. Zombies
    • Assassin's Creed IV - improved CrossFire scaling (3840x2160, High settings) up to 93%
    • Collaboration with AOC has identified non-standard display timings as the root cause of 60Hz SST flickering exhibited by the AOC U2868PQU panel on certain AMD Radeon graphics cards.
    • A software workaround has been implemented in the AMD Catalyst 14.7 RC driver to resolve the display timing issues with this display. Users are further encouraged to obtain newer display firmware from AOC that will resolve flickering at its origin.
    • Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection.

Feature Highlights of the AMD Catalyst 14.6 RC Driver for Windows

  • Plants vs. Zombies (Direct3D performance improvements):
    • AMD Radeon R9 290X - 1920x1080 Ultra – improves up to 11%
    • AMD Radeon R9 290X - 2560x1600 Ultra – improves up to 15%
    • AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
  • 3DMark Sky Diver improvements:
    • AMD A4-6300 – improves up to 4%
    • Enables AMD Dual Graphics/AMD CrossFire support
  • Grid Auto Sport: AMD CrossFire profile
  • Wildstar: Power Xpress profile
    • Performance improvements to improve smoothness of application
    • Performance improves up to 24% at 2560x1600 on the AMD Radeon R9 and R7 Series of products for both single GPU and multi-GPU configurations.
  • Watch Dogs: AMD CrossFire – Frame pacing improvements
  • Battlefield Hardline Beta: AMD CrossFire profile

Known Issues

  • Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
  • Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
  • AMD CrossFire configurations with AMD Eyefinity enabled will see instability with Battlefield 4 or Thief when running Mantle
  • Catalyst Install Manager text is covered by Express/Custom radio button text
  • Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder

Source: AMD

Intel and Microsoft Show DirectX 12 Demo and Benchmark

Subject: General Tech, Graphics Cards, Processors, Mobile, Shows and Expos | August 13, 2014 - 09:55 PM |
Tagged: siggraph 2014, Siggraph, microsoft, Intel, DirectX 12, directx 11, DirectX

Along with GDC Europe and Gamescom, Siggraph 2014 is going on in Vancouver, BC. There, Intel had a DirectX 12 demo at their booth. The scene, containing 50,000 asteroids, each in its own draw call, was developed with both Direct3D 11 and Direct3D 12 code paths, and the two could apparently be switched while the demo was running. Intel claims to have measured both power and frame rate.

intel-dx12-LockedFPS.png

Variable power to hit a desired frame rate, DX11 and DX12.

The test system is a Surface Pro 3 with an Intel HD 4400 GPU. Doing a bit of digging, this would make it the Core i5-based Surface Pro 3. Removing another shovel-load of mystery, that would be the Intel Core i5-4300U: two cores, four threads, a 1.9 GHz base clock, up to a 2.9 GHz turbo clock, 3MB of cache, and (of course) the Haswell architecture.

While not top-of-the-line, it is also not bottom-of-the-barrel. It is a respectable CPU.

Intel's demo on this processor shows a significant power reduction in the CPU, and even a slight decrease in GPU power, at the same target frame rate. When power is not throttled, the demo goes from 19 FPS all the way up to a playable 33 FPS.

Intel will discuss more during a video interview, tomorrow (Thursday) at 5pm EDT.

intel-dx12-unlockedFPS-1.jpg

Maximum power in DirectX 11 mode.

For my contribution to the story, I would like to address the first comment on the MSDN article. It claims that this is just an "ideal scenario" of a scene that is bottlenecked by draw calls. The thing is: that is the point. Sure, a game developer could optimize the scene to (maybe) instance objects together, and so forth, but that is unnecessary work. Why should programmers, or worse, artists, need to spend so much of their time developing art so that it can be batched into fewer, bigger commands? Would it not be much easier, and all-around better, if the content could be developed as it most naturally comes together?

That, of course, depends on how much performance improvement we will see from DirectX 12 compared to theoretical maximum efficiency. If pushing two workloads through a DX12 GPU takes about the same time as pushing one double-sized workload, then it allows developers to, literally, perform whatever solution is most direct.

intel-dx12-unlockedFPS-2.jpg

Maximum power when switching to DirectX 12 mode.

If, on the other hand, pushing two workloads is 1000x slower than pushing a single, double-sized one, but DirectX 11 was 10,000x slower, then it could be less relevant, because developers will still need to do their tricks in those situations. The closer the two get, the fewer occasions where strict optimization is necessary.

If there are any DirectX 11 game developers, artists, and producers out there, we would like to hear from you. How much would a (let's say) 90% reduction in draw call latency (which is around what Mantle claims) give you, in terms of fewer required optimizations? Can you afford to solve problems "the naive way" now? Some of the time? Most of the time? Would it still be worth it to do things like object instancing and fewer, larger materials and shaders? How often?
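To make that trade-off concrete, here is a rough sketch of both approaches, written in OpenGL terms for familiarity (the demo itself is Direct3D; the Asteroid type and set_per_object_constants() helper are hypothetical):

```c
/* Naive path: one draw call per object, 50,000 submissions per frame,
   each paying the full validation and submission overhead. */
void draw_naive(const Asteroid *asteroids, int count, GLsizei index_count)
{
    for (int i = 0; i < count; i++) {
        set_per_object_constants(&asteroids[i]);  /* transform, material, etc. */
        glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, 0);
    }
}

/* Batched path: one instanced call draws every copy, with the vertex
   shader fetching per-object data from a buffer indexed by gl_InstanceID. */
void draw_instanced(int count, GLsizei index_count)
{
    glDrawElementsInstanced(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, 0, count);
}
```

The second path is exactly the kind of restructuring that low-overhead APIs are supposed to make optional rather than mandatory.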

To boldly go where no 290X has gone before?

Subject: Graphics Cards | August 13, 2014 - 06:11 PM |
Tagged: factory overclocked, sapphire, R9 290X, Vapor-X R9 290X TRI-X OC

As far as factory overclocks go, the 1080MHz core and 5.64GHz RAM on the new Sapphire Vapor-X 290X are impressive, taking the prize for the highest factory overclock [H]ard|OCP has seen on this card yet. That didn't stop them from pushing it to 1180MHz and 5.9GHz after a little work, which is even more impressive. At both the factory and manual overclocks, the card handily beat the reference model, and the manually overclocked benchmarks could meet or beat the overclocked MSI GTX 780 Ti GAMING 3G OC card. Speed is not its only good feature: Intelligent Fan Control keeps two of the three fans from spinning when the GPU is under 60C, which vastly reduces the noise produced by this card. It is currently selling for $646, lower than the $710 that the GeForce currently commands.

1406869221rJVdvhdB2o_1_6_l.jpg

"We take a look at the SAPPHIRE Vapor-X R9 290X TRI-X OC video card which has the highest factory overclock we've ever encountered on any AMD R9 290X video card. This video card is feature rich and very fast. We'll overclock it to the highest GPU clocks we've seen yet on R9 290X and compare it to the competition."


Source: [H]ard|OCP

Time to update your Gallium3D

Subject: General Tech, Graphics Cards | August 6, 2014 - 01:34 PM |
Tagged: radeon, Gallium3D, catalyst 14.6 Beta, linux, ubuntu 14.04

The new Gallium3D code is up against the closed-source Catalyst 14.6 Beta, running under Ubuntu 14.04 on both the 3.14 and 3.16 Linux kernels, giving Phoronix quite a bit of testing to do. They have numerous cards in their test, ranging from an HD 6770 to an R9 290, though unfortunately there are no Gallium3D results for the R9 290, as it will not function until the release of the Linux 3.17 kernel. Overall, the Catalyst 14.6 Beta still remains the best performer, but the open source alternative is quickly closing the gap.

image.php_.jpg

"After last week running new Nouveau vs. NVIDIA proprietary Linux graphics benchmarks, here's the results when putting AMD's hardware on the test bench and running both their latest open and closed-source drivers. Up today are the results of using the latest Radeon Gallium3D graphics code and Linux kernel against the latest beta of the binary-only Catalyst driver."


Source: Phoronix

Rumor: NVIDIA GeForce GTX 880 Is Actually Coming in September?

Subject: General Tech, Graphics Cards | August 3, 2014 - 04:59 PM |
Tagged: nvidia, maxwell, gtx 880

Just recently, we posted a story claiming that NVIDIA was preparing to launch high-end Maxwell in the October/November time frame. Apparently, that was generous. The graphics company is now said to be announcing its GeForce GTX 880 in mid-September, with availability coming later in the month. It is expected to be based on the GM204 chip (which previous rumors claim is 28nm).

nvidia-geforce.png

It is expected that the GeForce GTX 880 will be available with 4GB of video memory, with an 8GB version possible at some point. As someone who runs multiple (five) monitors, I can tell you that 2GB is not enough for my use case. Windows 7 says the same; it kicks me out of applications to tell me that it does not have enough video memory. That alone would be enough reason for me to get more GPU memory.

We still do not know how many CUDA cores will be present in the GM204 chip, or whether the GeForce GTX 880 will have all of them enabled (though I would be surprised if it didn't). Without any way to derive its theoretical performance, we cannot compare it against the GTX 780 or 780 Ti. It could be significantly faster, it could be marginally faster, or it could be somewhere in between.

But we will probably find out within two months.

Source: Videocardz

AMD Releases FreeSync Information as a FAQ

Subject: General Tech, Graphics Cards, Displays | July 29, 2014 - 09:02 PM |
Tagged: vesa, nvidia, g-sync, freesync, DisplayPort, amd

Dynamic refresh rates have two main purposes: saving power by only forcing the monitor to refresh when a new frame is available, and increasing animation smoothness by synchronizing to draw rates (rather than "catching the next bus" at 16.67 ms intervals on a 60 Hz monitor). Mobile devices prefer the former, while PC gamers are interested in the latter.
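To put numbers on the bus metaphor (a quick sketch with a hypothetical frame time):

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double refresh_ms = 1000.0 / 60.0; /* 60 Hz: a bus every 16.67 ms */
    const double frame_done = 20.0;          /* hypothetical: frame ready at 20 ms */

    /* Fixed refresh: the frame waits for the next refresh boundary. */
    double fixed = ceil(frame_done / refresh_ms) * refresh_ms;

    printf("fixed 60 Hz displays it at %.1f ms; dynamic refresh at %.1f ms\n",
           fixed, frame_done); /* 33.3 ms vs. 20.0 ms */
    return 0;
}
```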


NVIDIA was first to make this public with G-Sync. AMD responded with FreeSync, starting with a proposal that was later ratified by VESA as DisplayPort Adaptive-Sync. AMD, then, took up "Project FreeSync" as an AMD "hardware/software solution" to make use of DisplayPort Adaptive-Sync in a way that benefits PC gamers.

Today's news is that AMD has just released an FAQ which explains the standard much more thoroughly than they have in the past. For instance, it clarifies the distinction between DisplayPort Adaptive-Sync and Project FreeSync. Prior to the FAQ, I thought that FreeSync simply became DisplayPort Adaptive-Sync, and that was that. Now, it sounds a bit more proprietary, just built upon an open VESA standard.

If interested, check out the FAQ at AMD's website.

Source: AMD

NVIDIA 340.52 Drivers Are Now Available

Subject: General Tech, Graphics Cards | July 29, 2014 - 08:27 PM |
Tagged: nvidia, geforce, graphics drivers, shield tablet, shield

Alongside the NVIDIA SHIELD Tablet launch, the company has released its GeForce 340.52 drivers. This version allows compatible devices to use GameStream, and it is also optimized for Metro: Redux and Final Fantasy XIV (China).

nvidia-geforce.png

The driver supports GeForce 8-series graphics cards and later. As a reminder, for GPUs that are not based on the Fermi architecture (or later), 340.xx will be your last driver version. NVIDIA does intend to provide extended support for 340.xx (and earlier) drivers until April 1st, 2016, but when Fermi, Kepler, and Maxwell move on to 343.xx, Tesla and earlier will not. That said, most of the content of this driver is aimed at Kepler and later; either way, the driver itself is available for those pre-Fermi cards.

I should also mention that a user on Anandtech's forums noted the removal of Miracast from NVIDIA's documentation. NVIDIA has yet to comment, although it is still very short notice at this point.

Source: NVIDIA