Nvidia GeForce 347.09 beta drivers have arrived

Subject: Graphics Cards | December 17, 2014 - 09:19 PM |
Tagged: geforce, nvidia, 347.09 beta

The 347.09 beta driver is out, bringing performance improvements for Elite: Dangerous and Metal Gear Solid V: Ground Zeroes.  If you use GeForce Experience the drivers will install automatically; otherwise head to the driver page to install them manually.  Project CARS should also benefit from this beta, and 3D support is enabled for Alien: Isolation, Elite: Dangerous, Escape Dead Island, Far Cry 4 and Middle-earth: Shadow of Mordor.  NVIDIA's new incremental update programme, called GeForce Game Ready, means more frequent driver releases with fewer changes per release than we have become accustomed to, but those changes directly benefit anyone playing the games each driver is designed to improve.

14-NV-GTX_980_970-668x258-GF-Header-PDP-3A.jpg

As with the previous WHQL driver, GTX 980M SLI and GTX 970M SLI on notebooks do not function, so if you do plan on updating your gaming laptop you should disable SLI before installing the new driver.  You can catch up on all the changes in this PDF.

Source: NVIDIA

AMD Omega is no longer in Alpha

Subject: Graphics Cards | December 9, 2014 - 03:08 PM |
Tagged: amd, catalyst, driver, omega

With AMD's new leader and restructuring comes a new type of driver update.  The Omega driver is intended to deliver a large batch of new features as well as performance updates once a year.  It does not replace the current cycle of Beta and WHQL releases; the next regular driver update will incorporate all of the changes from the Omega driver plus whatever bug fixes or updates that release is built to address.

Many sites, including The Tech Report, have had at least a little time with the new driver and have not seen much in the way of installation issues; unfortunately, they also have not seen much in the way of performance improvements on systems without an AMD APU.  As testing continues and more reviews arrive we may see gains on lower end systems, but for now higher end machines show little to no improvement in raw FPS.  Keep your eyes peeled for an update once we have had time to test the driver's effect on frame pacing, which is far more important than a simple FPS bump.

The main reason to be excited about this release is the long list of new features, starting with a DSR-like feature called Virtual Super Resolution, which lets you render at a resolution beyond your monitor's native one; for now, 4K super resolution is limited to the R9 285, the only AMD Tonga card on the market at the moment.  Along with the Omega driver comes news about FreeSync displays, another feature enabled in the new driver: the first monitors are due in January or February, with a 4K model arriving in March.

Check out the link to The Tech Report below to read the full list of new features that this driver brings, and don't forget to click through to Ryan's article as well.

freesync-slide.jpg

"AMD has introduced what may be its biggest graphics driver release ever, with more than 20 new features, 400 bug fixes, and some miscellaneous performance improvements."

Awake Yet? Good! Optimizing Inverse Trig for AMD GPUs.

Subject: General Tech, Graphics Cards | December 2, 2014 - 03:11 AM |
Tagged: amd, GCN, dice, frostbite

Inverse trigonometric functions are difficult to compute. Their use is often avoided like the plague. If, however, the value is absolutely necessary, it will probably be solved by approximations or, if possible, replacing them with easier functions by clever use of trig identities.
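
To make the "approximations" idea concrete, here is a minimal C sketch of one common polynomial fit for arctangent on [-1, 1] (a generic textbook-style approximation, not the Frostbite implementation Lagarde walks through):

    #include <math.h>
    #include <stdio.h>

    /* Cheap arctangent approximation, valid for x in [-1, 1] and accurate to a
       few thousandths of a radian.  Outside that range the identity
       atan(x) = pi/2 - atan(1/x) folds the input back into the valid interval. */
    static float atan_approx(float x)
    {
        const float pi_4 = 0.78539816f;
        return pi_4 * x - x * (fabsf(x) - 1.0f) * (0.2447f + 0.0663f * fabsf(x));
    }

    int main(void)
    {
        for (float x = -1.0f; x <= 1.0f; x += 0.25f)
            printf("x = % .2f   approx = % .5f   libm = % .5f\n",
                   x, atan_approx(x), atanf(x));
        return 0;
    }

The appeal on a GPU is that the polynomial costs a handful of multiply-adds, while a true arctangent maps to no single hardware instruction.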

arctrig-examples.png

If you want to see how the experts approach this problem, then Sébastien Lagarde, a senior developer of the Frostbite engine at DICE, goes into detail with a blog post. By detail, I mean you will see some GPU assembly being stepped through by the end of it. What makes this particularly interesting is the diagrams at the end, showing what each method outputs as represented by the shading of a sphere.

If you are feeling brave, take a look.

The MSI GTX 980 GAMING 4G and its fancy new fan

Subject: Graphics Cards | December 1, 2014 - 02:52 PM |
Tagged: msi, nvidia, GTX 980, GAMING 4G, factory overclocked, Twin Frozr V

MSI has updated their Twin Frozr V cooler with Torx fans, which move a lot of air very quietly, and 'S' shaped heatpipes which bear the name SuperSU.  Connectivity is provided by dual-link DVI-I, HDMI and three DisplayPort outputs, which ought to provide enough flexibility for anyone.  The card is clocked at 1216 - 1331MHz out of the box with GDDR5 running at an effective 7GHz, which [H]ard|OCP managed to increase to 1406 - 1533MHz and 7.16GHz on the memory.  That is rather impressive for a Maxwell chip bound by NVIDIA's power limits, and it shows just how much you can squeeze out of the new chip without upping the amount of juice you are providing it.  The overclocked card upped the full system wattage to 378W, much lower than the R9 290X they tested against, and the GPU temperature went as high as 70C when pushed to the limit, again lower than the competition; NVIDIA's selling price, however, is certainly higher than AMD's.  Check out their full review here.

1417398114ROQehtibgG_1_1.jpg

"The MSI GTX 980 GAMING 4G video card has a factory overclock and the new Twin Frozr V cooling system. We'll push it to its highest custom overclock and pit it against the ASUS ROG R9 290X MATRIX Platinum overclocker, and determine the gaming bang for your buck. May the best card win."

Source: [H]ard|OCP

ASUS Announces GeForce GTX 970 DirectCU Mini: More Mini-ITX Gaming Goodness

Subject: Graphics Cards | November 29, 2014 - 09:57 AM |
Tagged: pcie, PCI Express, nvidia, mini-itx, GTX 970, graphics card, geforce, directcu mini, DirectCU, asus

ASUS has announced a tiny new addition to their GTX 970 family, and it will be their most powerful mini-ITX friendly card yet with a full GeForce GTX 970 GPU.

970_1.png

Image credit: ASUS

The ASUS 970 DirectCU Mini card will feature a modest factory overclock on the GTX 970 core running at 1088 MHz (stock 1050 MHz) with a 1228 MHz Boost Clock (stock 1178 MHz). Memory is not overclocked and remains at the stock 7 GHz speed.

970_2.png

The GTX 970 DirectCU Mini features a full backplate. Image credit: ASUS

The ASUS GTX 970 DirectCU Mini uses a single 8-pin PCIe power connector in place of the standard dual 6-pin configuration, which shouldn’t be a problem considering the 150W spec of the larger connector (and 145W NVIDIA spec of the 970).
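
To put rough numbers on that (these are the PCI Express specification ceilings, not figures ASUS has published for this particular card):

    slot (75W) + one 8-pin (150W)     = 225W available
    slot (75W) + two 6-pin (2 x 75W)  = 225W available

Either way the power budget sits comfortably above the GTX 970's 145W rating.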

970_3.png

Part of this complete mITX gaming breakfast. Image credit: ASUS

The tiny card offers a full array of display outputs including a pair of dual-link DVI connectors, HDMI 2.0, and DisplayPort 1.2. No word yet on pricing or availability, but the product page is up on the ASUS site.

GTX 970 in SLI, $700 of graphics power

Subject: Graphics Cards | November 20, 2014 - 07:08 PM |
Tagged: sli, nvidia, GTX 970

The contestants are lined up on [H]ard|OCP's test bench: at around $700 you have a pair of GTX 970s, in the same weight class are a pair of R9 290X cards, weighing in at just under $550 are two R9 290s, and rounding out the competition are a pair of GTX 780s, which punch in somewhere between $800 and $1000 depending on when you look.  The cards are tested for their ability to perform on a 4K stage as well as in the larger 5760x1200 multi-monitor event.  After a long and gruelling battle, the work the 290X put into trimming itself down into a lower weight class proved well worth the effort, as the pair managed to show up the 970s in every performance category, though certainly not in power efficiency.  Any of these pairings will be powerful, but none can match a pair of GTX 980s, which sit in a price class all by themselves.

14164141670QcFceujXf_1_1.gif

"We take 2-Way NVIDIA GeForce GTX 970 SLI for a spin and compare it to R9 290X CF, R9 290 CF, GTX 780 SLI at 4K resolution as well as NV Surround on a triple-display setup. If you want to see how all these video cards compare in these different display configurations we've got just the thing. Find out what $700 SLI gets you."

Source: [H]ard|OCP

Gigabyte Wants All Your Money for a 3-Way SLI Watercooled GTX 980 Setup

Subject: Graphics Cards | November 14, 2014 - 11:46 AM |
Tagged: sli, nvidia, N980X3WA-4GD, maxwell, GTX 980, gigabyte, geforce, 3-way

Earlier this week, a new product showed up on Gigabyte's website that has garnered quite a bit of attention. The GA-N980X3WA-4GD WaterForce Tri-SLI is a 3-Way SLI system with integrated water cooling powered by a set of three GeForce GTX 980 GPUs.

waterforce1.jpg

That. Looks. Amazing.

What you are looking at is a 3-Way closed loop water cooling system with an external enclosure that holds the radiators while providing a display full of information including temperatures, fan speeds and more. Specifications on the Gigabyte site are limited for now, but we can infer a lot from them:

  • WATERFORCE: 3-Way SLI Water Cooling System
  • Real-Time Display and Control
  • Flex Display Technology
  • Powered by NVIDIA GeForce GTX 980 GPU
  • Integrated with 4GB GDDR5 memory, 256-bit memory interface (single card)
  • Features Dual-link DVI-I / DVI-D / HDMI / DisplayPort x3 (single card)
  • BASE: 1228 MHz / BOOST: 1329 MHz
  • System power supply requirement: 1200W (with six 8-pin external power connectors)

waterforce2.jpg

The GPUs on each card are your standard GeForce GTX 980 with 4GB of memory (we reviewed it here) though they are running at overclocked base and boost clock speeds, as you would hope with all that water cooling power behind it. You will need a 1200+ watt power supply for this setup, which makes sense considering the GPU horsepower you'll have access to.

Another interesting feature Gigabyte is listing is called GPU Gauntlet Sorting.

With GPU Gauntlet™ Sorting, the Gigabyte SOC graphics card guarantees the higher overclocking capability in terms of excellent power switching.

Essentially, Gigabyte is going to make sure that the GPUs on the WaterForce Tri-SLI are the best they can get their hands on, with the best chance for overclocking higher than stock.

waterforce3.jpg

Setup looks interesting - the radiators and fans will be in the external enclosure with tubing passing into the system through a 5.25-in bay. It will need to have quick connect/disconnect points at either the GPU or radiator to make that installation method possible.

waterforce4.jpg

Pricing and availability are still unknown, but don't expect to get it cheap. With the GTX 980 still selling for at least $550, you should expect something in the $2000 range or above with all the custom hardware and fittings involved.

Can I get two please?

Source: Gigabyte

NVIDIA GeForce GTX 960 Specifications Potentially Leaked

Subject: Graphics Cards | November 13, 2014 - 12:46 PM |
Tagged: nvidia, geforce, gtx 960, maxwell

It is possible that a shipping invoice fragment was leaked for the NVIDIA GeForce GTX 960. Of course, an image of text on a plain, white background is one of the easiest things to fake and/or manipulate, so take it with a grain of salt.

nvidia-gtx-960-shipping.jpg

The GTX 960 is said to have 4GB of RAM on the same 256-bit bus as the GTX 970 and GTX 980. Its video outputs are listed as two DVI, one HDMI, and one DisplayPort, making it useful for just one G-Sync monitor per card. If I'm reading it correctly, it also seems to have a 993 MHz base clock (boost clock unlisted) and an effective 6008 MHz (1500 MHz actual) RAM clock. This is slightly below the 7 GHz (1750 MHz actual) of the GTX 970 and GTX 980 parts, but it should also be significantly cheaper.
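
For anyone wondering how the "actual" and "effective" numbers relate (this is general GDDR5 behaviour, not anything specific to this invoice): GDDR5 moves data four times per command clock, so

    ~1502 MHz actual x 4 ≈ 6008 MHz effective (the rumored GTX 960)
     1750 MHz actual x 4 = 7000 MHz effective (GTX 970 / GTX 980)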

The GeForce GTX 960 is expected to retail in the low-$200 price point... some day.

Source: Reader Tip

Ubisoft Responds to Low Frame Rates in Assassin's Creed Unity

Subject: Graphics Cards | November 12, 2014 - 09:03 PM |
Tagged: Unity, ubisoft, assassin's creed

Over the last couple of days there have been a lot of discussions about the performance of the new Assassin's Creed Unity from Ubisoft on current generation PC hardware. Some readers have expressed annoyance that the game is running poorly, at lower than expected frame rates, across a wide range of image quality settings. Though I haven't published my results yet (we are working on a story comparing NVIDIA and AMD GPUs in Unity), the truth is that this is occurring on GPUs from both vendors.

For example, using a Core i7-3960X and a single GeForce GTX 980 4GB reference card, I see anywhere from 37 FPS to 48 FPS while navigating the crowded city of Paris at 1920x1080 and on the Ultra High preset. Using the Low preset, that frame rate increases to 65-85 FPS or so.

unity3.jpg

Clearly, those are lower frame rates at 1920x1080 than you'll find in basically any other PC game on the market. The accusation from some in the community is that Ubisoft is either doing this on purpose or doing it out of neglect, shipping inefficient code. I put some questions to the development team at Ubisoft and, though I only had a short time with them, the answers tell their side of the story.

Ryan Shrout: What in the Unity game engine is putting the most demand on the GPU and its compute resources? Are there specific effects or were there specific design goals for the artists that require as much GPU horsepower as the game does today with high image quality settings?

Ubisoft: Assassin’s Creed Unity is one of the most detailed games on the market and [contains] a giant, open world city built to the scale that we’ve recreated. Paris requires significant details. Some points to note about Paris in Assassin’s Creed Unity:

  • There are tens of thousands of objects visible on-screen, casting and receiving shadows.
  • Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
  • The entire game world has global illumination and local reflections.
  • There is realistic, high-dynamic range lighting.
  • We temporally stabilized anti-aliasing.

RS: Was there any debate internally about downscaling on effects/image quality to allow for lower end system requirements?

Ubisoft: We talked about this a lot, but our position always came back to us ensuring that Assassin’s Creed Unity is a next-gen only game with breakthrough graphics. With this vision, we did not degrade the visual quality of the game. On PC, we have several option for low-scaling, like disabling AA, decreasing resolution, and we have low option for Texture Quality, Environment Quality and Shadows.

RS: Were you looking forward or planning for future GPUs (or multi-GPU) that will run the game at peak IQ settings at higher frame rates than we have today?

Ubisoft: We targeted existing PC hardware.

RS: Do you envision updates to the game or to future GPU drivers that would noticeably improve performance on current generations of hardware?

Ubisoft: The development team is continuing to work on optimization post-launch through software updates. You’ll hear more details shortly.

Some of the features listed by the developer in the first answer - global illumination methods, high triangle counts, HDR lighting - can be pretty taxing on GPU hardware. I know there are people out there pointing out games that have similar feature sets and that run at higher frame rates, but the truth is that no two game engines are truly equal. If you have seen Assassin's Creed Unity in action you'll be able to tell immediately the game is beautiful, stunningly so. Is it worth that level of detail for the performance levels achieved from current high-end hardware? Clearly that's the debate.

unity2.jpg

When I asked if Ubisoft had considered scaling back the game to improve performance, they clearly decided against it. The developer had a vision for the look and style of the game and they were dedicated to it; maybe to a fault from some gamers' viewpoint.

Also worth noting is that Ubisoft is continuing to work on optimization post-release; how much of an improvement we'll actually see from game patches or driver updates remains to be seen as we move forward. Some developers have a habit of releasing a game and simply abandoning it as it shipped - hopefully we will see more dedication from the Unity team.

So, if the game runs at low frame rates on modern hardware...what is the complaint exactly? I do believe that Ubisoft would have benefited from better performance at the lower image quality settings. Swap the settings for yourself in game and you'll see that the quality difference between Low and Ultra High is noticeable, but not dramatically so. Again, this likely harkens back to Ubisoft's desire to maintain an artistic vision.

Remember that when Crysis 3 launched early last year, running at 1920x1200 at 50 FPS required a GTX 680, the top GPU at the time; and that was at the High settings. The Very High preset only hit 37 FPS on the same card.

PC gamers seem to be creating a double standard. On one hand, none of us want PC ports or games that are developed with consoles in mind that don't take advantage of the power of the PC platform. Games in the Call of Duty series are immensely popular but, until the release of Advanced Warfare, would routinely run at 150-200 FPS at 1080p on a modern PC. Crysis 3 and Assassin's Creed Unity are the opposite of that - games that really tax current CPU and GPU hardware, paving a way forward for future GPUs to be developed and NEEDED.

If you're NVIDIA or AMD, you should applaud this kind of work. Now I am more interested than ever in a GTX 980 Ti, or a R9 390X, to see what Unity will play like, or what Far Cry 4 will run at, or if Dragon Age Inquisition looks even better.

Of course, if we can get more performance from a better optimized or tweaked game, we want that too. Developers need to be able to cater to as wide a PC gaming audience as possible, but sometimes creating a game that can scale between running on a GTX 650 Ti and a GTX 980 is a huge pain. And with limited time frames and budgets, don't we want at least some developers to focus on visual quality rather than "dumbing down" the product?

Let me know what you all think - I know this is a hot-button issue!

UPDATE: Many readers in the comments are bringing up the bugs and artifacts within Unity, pointing to YouTube videos and whatnot. Those are totally valid complaints about the game, but don't necessarily reflect on the game's performance - which is what we were trying to target with this story. Having crashes and bugs in the game is disappointing, but again, Ubisoft and Assassin's Creed Unity aren't alone here. Have you seen the bugs in Skyrim or Tomb Raider? Hopefully Ubisoft will be more aggressive in addressing them in the near future. 

UPDATE 2: I also wanted to comment that even though I seem to be defending Ubisoft around the performance of Unity, my direct feedback to them was that they should enable modes in the game that allow it to play at higher frame rates and even lower image quality settings, even if they were unable to find ways to "optimize" the game's efficiency. So far the developer seems aware of all the complaints around performance, bugs, physics, etc. and is going to try to address them.

UPDATE 3: In the last day or so, a couple of other media outlets have posted anonymous information that indicates that the draw call count for Assassin's Creed Unity is at fault for the poor performance of the game on PCs. According to this "anonymous" source, while the consoles have low-level API access to hardware that lets them accept and process several times the draw calls, DirectX 11 can only handle "7,000 - 10,000 peak draw calls." Unity apparently is "pushing in excess of 50,000 draw calls per frame" and thus is putting more pressure on the PC than it can handle, even with high end CPU and GPU hardware. The fact that these comments are "anonymous" is pretty frustrating as it means that even if they are accurate, they can't be taken as the truth without confirmation from Ubisoft. If this turns out to be true, then it would be a confirmation that Ubisoft didn't take the time to implement a DX11 port correctly. If it's not true, or only partially to blame, we are left with more meaningless finger-pointing.
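
For readers wondering why the draw call count matters at all: each DirectX 11 draw call carries a fixed amount of CPU-side driver overhead, so engines typically sort objects by shared state and batch them into as few calls as possible. The C sketch below is a deliberately simplified, API-free illustration of that idea (hypothetical types and names, not Ubisoft's code):

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical render item: the material/mesh pair decides whether it can
       share a draw call with its neighbours. */
    typedef struct { int material_id; int mesh_id; } RenderItem;

    static int cmp_items(const void *a, const void *b)
    {
        const RenderItem *x = a, *y = b;
        if (x->material_id != y->material_id) return x->material_id - y->material_id;
        return x->mesh_id - y->mesh_id;
    }

    /* Sort by state, then issue one (simulated) draw call per run of identical
       state instead of one call per object. */
    static int submit_batched(RenderItem *items, int count)
    {
        int draw_calls = 0;
        qsort(items, count, sizeof(*items), cmp_items);
        for (int i = 0; i < count; ) {
            int j = i;
            while (j < count && cmp_items(&items[i], &items[j]) == 0) j++;
            printf("draw call: material %d, mesh %d, %d instances\n",
                   items[i].material_id, items[i].mesh_id, j - i);
            draw_calls++;
            i = j;
        }
        return draw_calls;
    }

    int main(void)
    {
        RenderItem scene[] = { {2,1}, {1,3}, {1,3}, {2,1}, {1,3}, {3,7} };
        int n = (int)(sizeof scene / sizeof scene[0]);
        printf("%d objects -> %d draw calls\n", n, submit_batched(scene, n));
        return 0;
    }

Instancing and lower-overhead APIs (Mantle, and later DirectX 12) attack the same problem from the other direction by making each call cheaper.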

PCPer Live! Assassin's Creed Unity Game Stream Powered by NVIDIA!

Subject: General Tech, Graphics Cards | November 10, 2014 - 10:07 PM |
Tagged: video, Unity, pcper, nvidia, live, GTX 980, geforce, game stream, assassins creed

UPDATE: If you missed the live stream event: good news! We have it archived on YouTube now and embedded below for your viewing pleasure!

Assassin's Creed Unity is shaping up to be one of the defining games of the holiday season, with visuals and game play additions that are incredible to see in person. Scott already wrote up a post that details some of the new technologies found in the game along with a video of the impressive detail the engine provides. Check it out!

To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multi-player gaming fun as well as some prizes to give away to the community. I will be joined by some new NVIDIA faces to take on the campaign cooperatively while stopping a couple of times to give away some hardware.

livelogo-unity.jpg

Assassin's Creed Unity Game Stream Powered by NVIDIA

5pm PT / 8pm ET - November 11th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

Another awesome prize haul!! How do you win? It's really simple: just tune in and watch the Assassin's Creed Unity Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!

So stop by Tuesday night for some fun, some gaming and the chance to win some goods!

unity1.jpg

unity2.jpg

unity3.jpg

Meet the new Maxwell STRIX

Subject: Graphics Cards | November 10, 2014 - 03:45 PM |
Tagged: asus, strix, GTX 970 STRIX DirectCU II OC, GTX 970, nvidia, maxwell

When ASUS originally kicked off their STRIX line the cards gained popularity not only for the decent overclock and efficient custom cooler but also because there was only a small price premium over the base model.  At $400 on Amazon this card is priced in line with other overclocked models, while some base models can be found for up to $50 less.  [H]ard|OCP investigated this card to see what benefits you could expect from it, comparing it to the R9 290 and 290X.  Out of the box the card runs at a core of 1253 - 1266MHz and memory at 7GHz; with a bit of overvolting they saw a stable core of 1473 - 1492MHz and memory at 7.832GHz.

With the price of the 290X dipping as low as $330 it makes for an interesting choice for GPU shoppers.  The NVIDIA card is far more power efficient and its fans operate at 0dB until the GPU hits 65C, a temperature [H] did not reach until running at full load for some time; even then the highest their manually overclocked card hit was 70C.  On the other hand the AMD card costs $70 less and offers very similar performance.  It is always nice to see competition in the market.

1414970377FWgAjDtieB_1_9_l.jpg

"Today we examine ASUS' take on the GeForce GTX 970 video card. We have the ASUS GTX 970 STRIX DirectCU II OC video card today, and will break down its next-gen performance against an AMD Radeon R9 290 and R9 290X. This video card features 0dB fans, and many factors that improve its chance of extreme overclocking."

Source: [H]ard|OCP

AMD Gaming Evolved Goes Beyond Earth with Hawaii

Subject: General Tech, Graphics Cards | November 6, 2014 - 12:00 AM |
Tagged: radeon, r9 295x2, R9 290X, r9 290, R9, hawaii, civilization, beyond earth, amd

Why settle for space, when you can go Beyond Earth too (but only if you go to Hawaii)!

firaxis-civ-beyond-earth.jpg

The Never Settle promotion launched itself into space a couple of months ago, but AMD isn't settling for that. If you purchase a Hawaii-based graphics card (R9 290, R9 290X, or R9 295X2) then you will get a free copy of Civilization: Beyond Earth on top of the choice of three games (or game packs) from the Never Settle Space Gold Reward tier. Beyond Earth makes a lot of sense of course, because it is a new game that is also one of the most comprehensive implementations of Mantle yet.

AMD_GPU_web_iframe_Gold.png

To be eligible, the purchase needs to be made on or after November 6th (which is today). Check that what you're buying is a "qualifying purchase" from a "participating retailer", because that is a lot of value to miss in a moment of carelessness.

AMD has not specified an end date for this promotion.

Source: AMD

8 GB Variants of the R9 290X Coming This Month

Subject: General Tech, Graphics Cards | November 5, 2014 - 12:56 PM |
Tagged: radeon, R9 290X, R9, amd, 8gb

With the current range of AMD’s R9 290X cards sitting at 4 GB of memory, listings for an 8 GB version have appeared at an online retailer. As far back as March, Sapphire was rumored to be building an 8 GB variety. Those rumours were supposedly quashed last month by AMD and Sapphire. However, AMD has since confirmed the existence of the new additions to the series. Pre-orders have appeared online and are said to be shipping out this month.

amd-r9-290x-8gb-GX-353-SP_88860_600.jpg

Image Credit: Overclockers UK

With 8 GB of GDDR5 memory and price tags between $480 and $520, these new additions, expectedly, do not come cheap. Compared to the 4 GB versions of the R9 290X line, which run about $160 less according to the online retailer, is it worth upgrading at this stage? For the people using a single 1080p monitor, the answer is likely no. For those with multi-screen setups, or those with deep enough pockets to own a 4K display, however, the benefits may begin to justify the premium. At 4K though, just a single 8 GB R9 290X may not provide the best experience; a Crossfire setup would benefit more from the 8 GB bump, being less reliant on GPU speed.

AMD’s 8 GB R9 290Xs are currently available for preorder: a reference version for £299.99 + VAT (~$480) and a Vapor-X version for £324.99 + VAT (~$520). They are slated to ship later this month.

What, me jealous? Four weeks with SLI'd GTX 980s

Subject: Graphics Cards | October 31, 2014 - 03:45 PM |
Tagged: sli, nvidia, GTX 980

Just in case you need a reason to be insanely jealous of someone, [H]ard|OCP has just published an article covering what it is like to live with two GTX 980s in SLI.  The cards are driving three Dell U2410 24" 1920x1200 displays for a relatively odd resolution of 3600x1920, but apart from an issue with the GeForce Experience software suite the cards have no trouble driving all three monitors.  In their testing of Borderlands games they definitely noticed when PhysX was turned on, though like others [H] wishes that PhysX would abandon its proprietary roots.  When compared to the Radeon R9 290X CrossFire system the performance is very similar, but when you look at the heat, power and noise produced, the 980s are the clear winner.  Keep in mind a good 290X is just over $300 while the least expensive GTX 980 will run you over $550.

1414677298HAmmSaoZGr_1_1.jpg

"What do you get when you take two NVIDIA GeForce GTX 980 video cards, configure those for SLI, and set those at your feet for four weeks? We give our thoughts and opinions about actually using these GPUs in our own system for four weeks with focus on performance, sound profile, and heat generated by these cards."

Source: [H]ard|OCP

Assassin's Creed Unity Has NVIDIA-exclusive Effects via GameWorks

Subject: General Tech, Graphics Cards | October 29, 2014 - 06:12 PM |
Tagged: ubisoft, assassin's creed

Ubisoft has integrated GameWorks into Assassin's Creed Unity, or at least parts of it. The main feature to be included is NVIDIA's Horizon Based Ambient Occlusion Plus (HBAO+), which is their implementation of Ambient Occlusion. This effect darkens areas that would otherwise be incorrectly lit with our current limitations of Global Illumination. Basically, it analyzes the scene's geometry to subtract some of the influence of "ambient light" in places where it is an unrealistic approximation (particularly in small crevices). This is especially useful for overcast scenes, where direct sunlight does not overwhelm the contribution of scatters and bounces.
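
As a very loose illustration of the general screen-space idea behind effects like this (a toy sketch, not NVIDIA's actual HBAO+ algorithm, which works on horizon angles and is far more sophisticated), an occlusion factor can be estimated per pixel by checking how many nearby depth samples sit in front of the point being shaded:

    #include <stdio.h>

    /* Toy screen-space occlusion estimate for one pixel: count how many nearby
       depth samples are sufficiently closer to the camera than the centre
       sample, i.e. how "hemmed in" the point is.  The resulting factor would be
       used to darken the ambient lighting term. */
    static float occlusion_factor(float centre_depth, const float *neighbour_depths,
                                  int n, float bias)
    {
        int occluded = 0;
        for (int i = 0; i < n; i++)
            if (neighbour_depths[i] < centre_depth - bias)
                occluded++;
        return (float)occluded / (float)n;   /* 0 = fully open, 1 = fully blocked */
    }

    int main(void)
    {
        /* Bottom of a crevice: most neighbours are closer to the camera. */
        float crevice[]   = { 0.40f, 0.42f, 0.41f, 0.45f, 0.43f, 0.44f, 0.40f, 0.46f };
        /* Open, flat wall: neighbours sit at roughly the same depth. */
        float open_wall[] = { 0.50f, 0.50f, 0.51f, 0.49f, 0.50f, 0.50f, 0.51f, 0.50f };

        printf("crevice AO   = %.2f\n", occlusion_factor(0.50f, crevice, 8, 0.01f));
        printf("open wall AO = %.2f\n", occlusion_factor(0.50f, open_wall, 8, 0.01f));
        return 0;
    }

The point at the bottom of the crevice ends up heavily occluded and gets darkened; the point on the open wall does not.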

The other features to be included are Temporal Anti-aliasing (TXAA), Percentage-Closer Soft Shadows (PCSS), and GeometryWorks Advanced Tessellation. TXAA and PCSS were both included in Assassin's Creed IV: Black Flag, alongside the previously mentioned HBAO+, so it makes sense that Ubisoft continues to use what worked for them. GeometryWorks is a different story. NVIDIA seems to claim that it is like DirectX 11 tessellation, but is better suited for use alongside HBAO+ and PCSS.

unity2.jpg

Assassin's Creed Unity will be available on November 11th.

Source: NVIDIA

GeForce GTX 970 Coil Whine Concerns

Subject: Graphics Cards | October 28, 2014 - 12:09 PM |
Tagged: maxwell, GTX 970, geforce, coil whine

Coil whine is the undesirable effect of electrical components creating audible noise when operating. Let's look to our friends at Wikipedia for a concise and accurate description of the phenomenon:

Coil noise is, as its name suggests, caused by electromagnetic coils. These coils, which may act as inductors or transformers, have a certain resonant frequency when coupled with the rest of the electric circuit, as well as a resonance at which it will tend to physically vibrate.

As the wire that makes up the coil passes a variable current, a small amount of electrical oscillation occurs, creating a small magnetic field. Normally this magnetic field simply works to establish the inductance of the coil. However, this magnetic field can also cause the coil itself to physically vibrate. As the coil vibrates physically, it moves through a variable magnetic field, and feeds its resonance back into the system. This can produce signal interference in the circuit and an audible hum as the coil vibrates.

Coil noise can happen, for example, when the coil is poorly secured to the circuit board, is poorly damped, or if the resonant frequency of the coil is close to the resonant frequency of the electric circuit. The effect becomes more pronounced as the signal passing through the coil increases in strength, and as it nears the resonant frequency of the coil, or as it nears the resonant frequency of the circuit. Coil noise is also noticed most often when it is in the humanly audible frequency.

Coil noise is also affected by the irregularities of the magnetic material within the coil. The flux density of the inductor is affected by these irregularities, causing small currents in the coil, contaminating the original signal. This particular subset of noise is sometimes referred to as magnetic fluctuation noise or the Barkhausen effect. Coil noise can also occur in conjunction with the noise produced by magnetostriction.
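
The quoted passage keeps coming back to resonance; the textbook relation for an LC pair is f = 1 / (2π√(LC)), and a quick sanity check in C with made-up but plausible values shows how easily that lands in the audible band:

    #include <math.h>
    #include <stdio.h>

    /* Resonant frequency of an LC pair: f = 1 / (2 * pi * sqrt(L * C)).
       The component values below are purely illustrative, not measurements
       taken from any GTX 970. */
    static double lc_resonance_hz(double inductance_h, double capacitance_f)
    {
        const double pi = 3.14159265358979;
        return 1.0 / (2.0 * pi * sqrt(inductance_h * capacitance_f));
    }

    int main(void)
    {
        double L = 0.5e-6;   /* 0.5 uH, a plausible VRM output inductor  */
        double C = 2000e-6;  /* 2000 uF of bulk output capacitance       */
        printf("f = %.0f Hz\n", lc_resonance_hz(L, C));   /* roughly 5 kHz */
        return 0;
    }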

Gamers that frequently upgrade their graphics cards may have been witness to this problem with a particular install, or you might have been one of the lucky ones to never deal with the issue. If your computer sits under your desk, in a loud room or you only game with headphones, it's also possible that you just never noticed.

inductor.jpg

Possibly offending inductors?

The reason this comes up today is that reports are surfacing of GeForce GTX 970 cards from various graphics card vendors exhibiting excessive coil whine or coil noise. These reports are coming in from multiple forum threads around the internet, a collection of YouTube videos of users attempting to capture the issue and even official statements from some of NVIDIA's partners. Now, just because the internet is talking about it doesn't necessarily mean it's a "big deal" relative to the number of products being sold. However, after several Twitter comments and emails requesting we look into the issue, I thought it was pertinent to start asking questions.

As far as I can tell today, GTX 970 cards from multiple vendors including EVGA, MSI and Gigabyte all have users reporting issues and claims of excessive coil noise. For my part here, I have two EVGA GTX 970 cards and an MSI GTX 970, none of which are producing sound at what I would call "excessive" levels. Everyone's opinion of excessive noise is going to vary, but as someone who sits next to a desk-high test bed and hears hundreds of cards a year, I am confident I have a good idea of what to listen for.

We are still gathering data on this potential issue, but a few of the companies mentioned above have issued official or semi-official statements on the problem.

From MSI:  

The coil whine issue is not specific to 900 series, but can happen with any high end GPU and that MSI is looking in to ways to minimize the issue. If you still have concern regarding this issue, then please contact our RMA department.

From EVGA:

We have been watching the early feedback on GTX 970 and inductor noise very closely, and have actively taken steps to improve this. We urge anyone who has this type of concern to contact our support so we can address it directly.

From NVIDIA: 

We’re aware of a small percentage of users reporting excessive “coil whine” noises and are actively looking into the issue.

We are waiting for feedback from other partners to see how they plan to respond.

Since all of the GTX 970 cards currently shipping are non-reference, custom built PCB designs, NVIDIA's input on the problem is mostly one of recommendations. NVIDIA knows that it is their name and brand being associated with any noisy GeForce cards, so I would expect a lot of discussions and calls behind closed doors to make sure partners are addressing user concerns.

IMG_9794.JPG

Interestingly, the GeForce GTX 970 was the one card of this Maxwell release where all of NVIDIA's partners chose to go the route of custom designs rather than adopting the NVIDIA reference design. On the GTX 980, however, you'll find a mix of both and I would wager that NVIDIA's reference boards do not exhibit any above average noise levels from coils. (I have actually tested four reference GTX 980s without coil whine coming into play.) Sometimes offering all of these companies the option to be creative and to differentiate can back-fire if the utmost care isn't taken in component selection.

Ironically the fix is simple: a little glue on those vibrating inductor coils and the problem goes away. But most of the components are sealed, making the simple fix a non-starter for the end user (and I wouldn't recommend doing it anyway). It does point to a lack of leadership from board manufacturers willing to skimp on components in such a way as to make this a big enough issue that I am sitting here writing about it today.

As an aside, if you hear coil whine when running a game at 500-5000 FPS, I don't think that counts as being a major problem for your gaming. I have seen a video or two running a DX9 render test at over 4500 FPS - pretty much any card built today will make noises you don't expect when hitting that kind of performance level.

As for my non-official discussions on the topic with various parties, everyone continues to reiterate that the problem is not as widespread as some of the forum threads would have you believe. It's definitely higher than normal, and the public acknowledgements from EVGA and MSI basically confirm that, but one person told me the complaint and RMA levels are where they were expected to be considering the "massively fast sell out rates" the GTX 970 is experiencing.

Of course, AMD isn't immune to coil whine issues either. If you remember back to the initial launch of the Radeon R9 290X and R9 290, we had similar coil whine issues and experienced those first hand on reference card designs. (You can see a video I recorded of an XFX unit back in November of 2013 here.) You can still find threads on popular forums from that time period discussing the issue and YouTube never seems to forget anything, so there's that. Of course, the fact that previous card launches might have seen issues along the same line doesn't forgive the issue in current or later card releases, but it does put things into context.

So, let's get some user feedback; I want to hear from GTX 970 owners about their experiences to help guide our direction of research going forward.

Click here to take our short poll for GTX 970 owners!

Source: Various

Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM |
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd

A couple of weeks back a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and the Playstation 4 variants of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?

For those of us that focus more on the world of PC gaming, however, an email sent the following week to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, besides addressing other issues such as the value of pixel counts and the stunning visuals of the game, the developer asserted that we may have already hit the peak of the graphical compute capability of these two new gaming consoles. Here is a portion of the information:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts..

unity1.jpg

So, if we take this anonymous developer's information as true, and this whole story is based on that assumption, then we have learned some interesting things.

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
     
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
     
  3. The Ubisoft team see Unity as being "crazily optimized" for the architecture and consoles even as we just now approach the 1 year anniversary of their release.
     
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game's AI and other systems are limited to the 50% of CPU performance that remains.

It would appear that just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the Playstation 4 and Xbox One undershoots the needs of game developers to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have reached performance limits, that's a bad sign for game developers that really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom built cores or using a Cell architecture - we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that we have seen more advanced development teams hit peak performance.

unity2.jpg

If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team are completely off their rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

                 PlayStation 4          Xbox One
  Processor      8-core Jaguar APU      8-core Jaguar APU
  Motherboard    Custom                 Custom
  Memory         8GB GDDR5              8GB DDR3
  Graphics Card  1152 Stream Unit APU   768 Stream Unit APU
  Peak Compute   1,840 GFLOPS           1,310 GFLOPS
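
For reference, those peak compute figures are simple arithmetic: shader count x 2 floating point operations per clock (one fused multiply-add) x GPU clock. Using the commonly cited 800 MHz (PS4) and 853 MHz (Xbox One) GPU clocks:

    PS4:      1152 x 2 x 0.800 GHz ≈ 1,843 GFLOPS
    Xbox One:  768 x 2 x 0.853 GHz ≈ 1,310 GFLOPS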

The custom built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high performance parts: single-threaded performance of Jaguar is less than the Intel Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super low cost tablets today and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4 and the Radeon R7 250X is faster than what resides in the Xbox One.

xboxonegpu.jpg

If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.

Even if this developer quote is 100% correct that doesn't mean that the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on performance efficiency of current generation hardware, will be coming to the Xbox One and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next that is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.

unity3.jpg

But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and Playstation 4 share the same architecture as the PC now.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case - regardless of which vendor's hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher performance hardware, selling the consoles at a loss out of the gate and preparing each platform for the next 7-10 years properly. And again, the console manufacturers could have done that with higher end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.

AMD Catalyst 14.9.2 Beta for Civilization: Beyond Earth

Subject: Graphics Cards | October 26, 2014 - 02:44 AM |
Tagged: amd, driver, catalyst

Ryan has been playing many games lately as part of a comparison between the latest GPUs from AMD and NVIDIA. Civilization: Beyond Earth is not the most demanding game in existence on GPUs, but it is not trivial either, and while it may not be the most complex title from a video card's perspective, it is a contender for the most demanding game on your main processor (CPU). It also has some of the most thought-out Mantle support of any title using the API, once you install the AMD Catalyst 14.9.2 Beta driver.

firaxis-civilization-beyond-earth.jpg

And now you can!

The Catalyst 14.9.2 Beta drivers support just about anything using the GCN architecture, from APUs (starting with Kaveri) to discrete GPUs (starting with the HD 7000 and HD 7000M series). Beyond enabling Mantle support in Civilization, it also fixes some issues with Metro, Shadow of Mordor, Total War: Rome 2, Watch_Dogs, and other games.

Also, both AMD and Firaxis are aware of a bug in Civilization: Beyond Earth where the mouse cursor does not click exactly where it is supposed to, if the user enables font scaling in Windows. They are working on it, but suggest setting it to the default (100%) if users experience this issue. This could be problematic for customers with high-DPI screens, but could keep you playing until an official patch is released.

You can get 14.9.2 Beta for Windows 7 and Windows 8.1 at AMD's website.

Source: AMD

AMD Radeon R9 290X Now Selling at $299

Subject: Graphics Cards | October 24, 2014 - 03:44 PM |
Tagged: radeon, R9 290X, leaderboard, hwlb, hawaii, amd, 290x

When NVIDIA launched the GTX 980 and GTX 970 last month, it shocked the discrete graphics world. The GTX 970 in particular was an amazing performer and undercut the price of the Radeon R9 290 at the time. That is something that NVIDIA rarely does and we were excited to see some competition in the market.

AMD responded with some price cuts on both the R9 290X and the R9 290 shortly thereafter (though they refuse to call them that) and it seems that AMD and its partners are at it again.

r9290x1.jpg

Looking on Amazon.com today we found several R9 290X and R9 290 cards at extremely low prices. For example:

The R9 290X's primary competition in terms of raw performance is the GeForce GTX 980, currently selling for $549 and up, if you can find one in stock. That means NVIDIA has a $250 hill to climb when going against the lowest priced R9 290X.

r92901.jpg

The R9 290 looks interesting as well:

Several other R9 290 cards are selling for upwards of $300-320, which makes them bone-headed buys if you can get an R9 290X for the same or lower price. Consider also that the GeForce GTX 970 is selling for at least $329 today (if you can find it) and you can see why consumers are paying close attention.

Will NVIDIA make any adjustments of its own? It's hard to say right now since stock of both the GTX 980 and GTX 970 is so hard to come by, and it's hard to imagine NVIDIA lowering prices as long as parts continue to sell out. NVIDIA believes that its branding and technologies like G-Sync make GeForce cards more valuable, and until they begin to see a shift in the market, I imagine they will stay the course.

For those of you that utilize our Hardware Leaderboard, you'll find that Jeremy has taken these prices into account and updated a couple of the system build configurations.

Source: Amazon.com

Who rules the ~$250 market? XFX R9 285 Black Edition versus the GTX 760

Subject: Graphics Cards | October 23, 2014 - 04:06 PM |
Tagged: xfx, R9 285 Black Edition, factory overclocked, amd

Currently sitting at $260, the XFX R9 285 Black Edition is a little less expensive than the ASUS ROG STRIKER GTX 760 and significantly more expensive than the ASUS GTX760 DirectCU2 card.  Those prices led [H]ard|OCP to set up a showdown to see which card provides the best bang for the buck, especially once they overclocked the AMD card to an 1125MHz core and 6GHz RAM.  In the end it was a very close race: the performance crown did go to the R9 285 BE, but that performance comes at a premium, as you can get something nearly as fast for $50 less.  Of course both the XFX card and the STRIKER sell at a premium compared to cards with fewer features and stock clocks; you should expect the lower priced R9 285s to be closer in performance to the DirectCU2 card.

1413885880S78ZQ7Hqqp_1_13_l.jpg

"Today we are reviewing the new XFX Radeon R9 285 Black Edition video card. We will compare it to a pair of GeForce GTX 760 based GPUs to determine the best at the sub-$250 price point. XFX states that it is faster than the GTX 760, but that is based on a single synthetic benchmark, let's see how it holds up in real world gaming."

Source: [H]ard|OCP