Report: NVIDIA Maxwell GM206 Pictured - Leak Claims GTX 960 Core

Subject: Graphics Cards | January 6, 2015 - 09:44 AM |
Tagged: rumor, nvidia, leak, gtx 960, GM206, geforce

VideoCardz.com is reporting that the upcoming GTX 960 will use the GM206 core, and they claim to have a photo of the unreleased chip.

NVIDIA-Maxwell-GM206-300-GPU.png

Why are reported leaks always slightly out of focus? (Credit: VideoCardz.com)

The chip pictured appears to be a GM206-300, which the site claims will be the exact variant used in the GTX 960 when it is released. The post speculates that, based on the die size, we can expect between 8 and 10 SMMs, or 1024 to 1280 CUDA cores. They further claim that the GTX 960 will have a 128-bit memory bus and that reference cards will have a 2GB frame buffer (though naturally we can expect models with 4GB of memory after launch).
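A quick way to sanity-check that range: Maxwell packs 128 CUDA cores into each SMM, so the core count follows directly from the SMM count.

```python
# Maxwell (GM2xx) pairs each SMM with 128 CUDA cores, so the
# speculated SMM range maps directly onto a core-count range.
CORES_PER_SMM = 128

def cuda_cores(smm_count):
    """Total CUDA cores for a given number of Maxwell SMMs."""
    return smm_count * CORES_PER_SMM

print([cuda_cores(n) for n in (8, 9, 10)])  # [1024, 1152, 1280]
```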

ASUS-GTX960-DC2OC-2GD5-PRO-2GB.png

(Credit: VideoCardz.com)

The post goes on to show what appears to be a search result for an ASUS GTX 960 on their site, but if the listing existed it has since been taken down. More than likely a GTX 960 is in fact close at hand, and the reported specs (and now multiple claimed listings for the card) are not hard to fathom.

We will keep you updated on this alleged new GPU if more details emerge.

CES 2015: Gigabyte GTX 980 WaterForce 3-Way SLI Monster Spotted

Subject: Graphics Cards, Shows and Expos | January 5, 2015 - 07:15 PM |
Tagged: waterforce, GTX 980, gigabyte, ces 2015, CES, 3-way sli

Back in November Gigabyte asked for all your money in exchange for a set of three GeForce GTX 980 cards, each cooled by its own self-contained water loop. After finally seeing the GTX 980 WaterForce in person, I can tell you that it's big, it's expensive and it's damned impressive looking.

wf-2.jpg

With a price tag of $2999 there is a significant markup over buying a set of three retail GTX 980 cards, but this design is unique. Each GPU is individually cooled via a 120mm radiator and fan mounted inside a chassis that rests on top of your PC case. On the front you'll find temperature, fan speed and pump speed indicators, along with some knobs and buttons to adjust settings and targets.

wf-3.jpg

Oh, and it ships inside of a suitcase that you can reuse for later travel. Ha! Think we can convince Gigabyte to send us one for testing?

wf-1.jpg

Coverage of CES 2015 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

ASUS updates their popular series with the GTX 980 STRIX DC II OC

Subject: Graphics Cards | January 5, 2015 - 05:31 PM |
Tagged: GTX 980 STRIX DirectCU II OC, strix, asus, nvidia, factory overclocked

ASUS' popular STRIX line was recently updated to include NVIDIA's top card, and now [H]ard|OCP has had a chance to benchmark this GTX 980 with its custom quiet cooling. The DirectCU II cooling system can operate at 0dB under all but the heaviest of loads, and the 10-phase power design means you can go beyond the small factory overclock that the card arrives with. [H]ard|OCP took the card from a Boost Clock of 1279MHz to 1500MHz and the RAM from 7GHz to 7.9GHz, with noticeable performance improvements; that headroom is part of why the card received a Gold Award. If the ~$130 price difference between this card and the R9 290X does not bother you, it is a great choice for a new GPU.

14192008366QHJaRnDQg_1_1.jpg

"Today we delve into the ASUS GTX 980 STRIX DC II OC, which features custom cooling, 0dB fans and high overclocking potential. We'll experiment with this Maxwell GPU by overclocking it to the extreme. It will perform head to head against the ASUS ROG R9 290X MATRIX-P in today's most demanding games, including Far Cry 4."


Source: [H]ard|OCP

CES 2015: MSI Announces GTX 970 Gaming 100ME - 100 millionth GeForce GPU

Subject: Graphics Cards, Shows and Expos | January 4, 2015 - 03:56 PM |
Tagged: msi, GTX 970, gaming, ces 2015, CES, 100me

To celebrate the shipment of 100 million GeForce GPUs, MSI is launching a new revision of the GeForce GTX 970: the Gaming 100ME (millionth edition). The cooler is identical to the one used on the GTX 970 Gaming 4G but replaces the red color scheme of the MSI Gaming brand with a green very close to NVIDIA's.

100me-1.jpg

This will also ship with a "special gift" and will be a limited edition, much like the Golden Edition GTX 970 from late last year.

100me-2.jpg

MSI had some other minor updates to its GPU line including the GTX 970 4GD5T OC with a cool looking black and white color scheme and an 8GB version of the Radeon R9 290X.

Coverage of CES 2015 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

GPU Rumors: AMD Plans 20nm but NVIDIA Waits for 16nm

Subject: General Tech, Graphics Cards | December 28, 2014 - 09:47 PM |
Tagged: radeon, nvidia, gtx, geforce, amd

According to an anonymous source of WCCFTech, AMD is preparing a 20nm-based graphics architecture that is expected to launch in April or May. Originally, the source predicted that these graphics devices, which they call the R9 300 series, would be available in February or March. The reason for this "delay" is reportedly massive demand for 20nm production capacity.

nvidia-gtx-vs-amd-gaming-evolved.jpg

The source also claims that NVIDIA will skip 20nm entirely and instead opt for 16nm when it becomes available (said to be mid or late 2016). The expectation is that NVIDIA will answer AMD's new graphics devices with a higher-end Maxwell device still built at 28nm. Earlier rumors, based on a leaked SiSoftware entry, claim 3072 CUDA cores clocked between 1.1 GHz and 1.39 GHz. If true, this would give it between 6.75 and 8.54 TeraFLOPs of performance, the upper end of which is right around the advertised performance of a GeForce Titan Z (but in a single compute device that does not require the distribution of work that SLI was created to automate).
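Those TeraFLOP numbers fall out of the standard FP32 formula: cores, times 2 operations per clock (a fused multiply-add), times clock speed. A quick check of the rumored figures:

```python
# Peak FP32 throughput = CUDA cores x 2 ops/clock (FMA) x clock (GHz),
# which yields GFLOPs; divide by 1000 for TFLOPs.
def tflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz / 1000.0

low = tflops(3072, 1.1)    # ~6.76 TFLOPs at the low rumored clock
high = tflops(3072, 1.39)  # ~8.54 TFLOPs at the high rumored clock
print(round(low, 2), round(high, 2))
```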

Will this strategy work in NVIDIA's favor? I don't know. 28nm is a fairly stable process at this point, which will probably allow them to build chips that are bigger and more aggressively clocked. On the other hand, they pretty much need to rely on bigger, more aggressively clocked chips to stay competitive with AMD's more modern design. Previous rumors also hint that AMD is looking at water cooling for their reference card, which might place yet another handicap against NVIDIA, although cooling is not an area that NVIDIA struggles in.

Source: WCCFTech

Nvidia GeForce 347.09 beta drivers have arrived

Subject: Graphics Cards | December 17, 2014 - 09:19 PM |
Tagged: geforce, nvidia, 347.09 beta

The 347.09 beta driver is out, bringing performance improvements in Elite: Dangerous and Metal Gear Solid V: Ground Zeroes.  If you use GeForce Experience the drivers will install automatically; otherwise head to the driver page to install them manually.  Project CARS should also benefit from this new beta, and you will be able to enable 3D on Alien: Isolation, Elite: Dangerous, Escape Dead Island, Far Cry 4 and Middle-Earth - Shadow of Mordor.  NVIDIA's new incremental updates, called GeForce Game Ready, will mean more frequent driver releases with fewer changes than we have become accustomed to, but they do benefit those playing the games they are designed to improve.

14-NV-GTX_980_970-668x258-GF-Header-PDP-3A.jpg

As with the previous WHQL driver, GTX 980M SLI and GTX 970M SLI on notebooks do not function, so if you plan on updating your gaming laptop you should disable SLI before installing.  You can catch up on all the changes in this PDF.

Source: NVIDIA

AMD Omega is no longer in Alpha

Subject: Graphics Cards | December 9, 2014 - 03:08 PM |
Tagged: amd, catalyst, driver, omega

With AMD's new leader and restructuring comes a new type of driver update.  The Omega driver is intended to deliver a large batch of new features as well as performance updates once a year.  It does not replace the current cycle of Beta and WHQL driver updates; the next regular driver update will incorporate all of the changes from the Omega driver plus whatever new bug fixes or updates that release is meant to address.

Many sites, including The Tech Report, have had at least a small amount of time to test the new driver and have not seen much in the way of installation issues, or, unfortunately, of performance improvements on systems not using an AMD APU.  As more time for testing elapses and more reviews come out we may see improvements on low-end systems, but for now higher-end machines show little to no improvement in raw FPS.  Keep your eyes peeled for an update once we have had time to test the driver's effect on frame pacing, which is far more important than just increasing your FPS.

The main reason to be excited about this release is the long list of new features, starting with a DSR-like feature called Virtual Super Resolution, which lets you render at a higher resolution than your monitor natively supports; for now 4K super resolution is limited to the R9 285, as it is the only AMD Tonga card on the market at the moment.  Along with the release of the Omega driver comes news about FreeSync displays, another feature enabled in the new driver, and their availability: we have a release window of January or February, with a 4K model arriving in March.

Check out the link to The Tech Report below to read the full list of new features this driver brings, and don't forget to click through to Ryan's article as well.

freesync-slide.jpg

"AMD has introduced what may be its biggest graphics driver release ever, with more than 20 new features, 400 bug fixes, and some miscellaneous performance improvements."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Awake Yet? Good! Optimizing Inverse Trig for AMD GPUs.

Subject: General Tech, Graphics Cards | December 2, 2014 - 03:11 AM |
Tagged: amd, GCN, dice, frostbite

Inverse trigonometric functions are expensive to compute, and their use is often avoided like the plague. If a value is absolutely necessary, it will probably be obtained through approximations or, where possible, by replacing the inverse functions with cheaper ones through clever use of trig identities.
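For a flavor of the kind of trick involved (this is a generic, well-known polynomial fit, not the specific method from the DICE post): on [-1, 1], atan can be replaced with a cheap polynomial, and a trig identity extends it to the full range.

```python
import math

# Well-known polynomial fit for atan on [-1, 1]; max error is roughly
# 0.0015 rad. This is a generic example, NOT Lagarde's exact method.
def fast_atan(x):
    return (math.pi / 4) * x + x * (1 - abs(x)) * (0.2447 + 0.0663 * abs(x))

# The identity atan(x) = +/-pi/2 - atan(1/x) extends it beyond [-1, 1].
def atan_any(x):
    if x > 1.0:
        return math.pi / 2 - fast_atan(1.0 / x)
    if x < -1.0:
        return -math.pi / 2 - fast_atan(1.0 / x)
    return fast_atan(x)
```

On a GPU the win comes from trading a transcendental-function evaluation for a few multiply-adds; the acceptable error budget decides how many polynomial terms you keep.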

arctrig-examples.png

If you want to see how the experts approach this problem, Sébastien Lagarde, a senior developer of the Frostbite engine at DICE, goes into detail in a blog post. By detail, I mean you will see some GPU assembly being stepped through by the end of it. What makes it particularly interesting are the diagrams at the end, showing what each method outputs, represented by the shading of a sphere.

If you are feeling brave, take a look.

The MSI GTX 980 GAMING 4G and its fancy new fan

Subject: Graphics Cards | December 1, 2014 - 02:52 PM |
Tagged: msi, nvidia, GTX 980, GAMING 4G, factory overclocked, Twin Frozr V

MSI has updated their Twin Frozr V cooler with Torx fans, which move a lot of air very quietly, and 'S'-shaped heatpipes that bear the name SuperSU.  Connectivity is provided by dual-link DVI-I, HDMI and three DisplayPort outputs, which ought to provide enough flexibility for anyone.  Out of the box the card is clocked at 1216 - 1331MHz with GDDR5 running at 7GHz effective, which [H]ard|OCP managed to increase to 1406 - 1533MHz and 7.16GHz on the memory. That is rather impressive for a Maxwell chip under NVIDIA's power limits, and it shows just how much you can squeeze out of the new chip without upping the amount of juice you are providing it.  The overclocked card brought full-system power to 378W, much lower than the R9 290 they tested against, and the GPU temperature went as high as 70C when pushed to the limit, again lower than the 290; NVIDIA's selling price, however, is certainly higher than AMD's.  Check out their full review here.

1417398114ROQehtibgG_1_1.jpg

"The MSI GTX 980 GAMING 4G video card has a factory overclock and the new Twin Frozr V cooling system. We'll push it to its highest custom overclock and pit it against the ASUS ROG R9 290X MATRIX Platinum overclocker, and determine the gaming bang for your buck. May the best card win."


Source: [H]ard|OCP

ASUS Announces GeForce GTX 970 DirectCU Mini: More Mini-ITX Gaming Goodness

Subject: Graphics Cards | November 29, 2014 - 09:57 AM |
Tagged: pcie, PCI Express, nvidia, mini-itx, GTX 970, graphics card, geforce, directcu mini, DirectCU, asus

ASUS has announced a tiny new addition to their GTX 970 family, and it will be their most powerful mini-ITX friendly card yet with a full GeForce GTX 970 GPU.

970_1.png

Image credit: ASUS

The ASUS 970 DirectCU Mini card will feature a modest factory overclock on the GTX 970 core running at 1088 MHz (stock 1050 MHz) with a 1228 MHz Boost Clock (stock 1178 MHz). Memory is not overclocked and remains at the stock 7 GHz speed.

970_2.png

The GTX 970 DirectCU Mini features a full backplate. Image credit: ASUS

The ASUS GTX 970 DirectCU Mini uses a single 8-pin PCIe power connector in place of the standard dual 6-pin configuration, which shouldn’t be a problem considering the 150W spec of the larger connector (and 145W NVIDIA spec of the 970).
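The power math behind that claim, using the standard PCIe budgets (75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# Standard PCIe power budgets (per spec): the x16 slot itself, a 6-pin
# auxiliary connector, and an 8-pin auxiliary connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

single_8pin = SLOT_W + EIGHT_PIN_W   # the ASUS Mini's layout
dual_6pin = SLOT_W + 2 * SIX_PIN_W   # the standard layout it replaces
GTX_970_TDP = 145

print(single_8pin, dual_6pin)        # identical 225W budgets
print(single_8pin >= GTX_970_TDP)    # headroom to spare
```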

970_3.png

Part of this complete mITX gaming breakfast. Image credit: ASUS

The tiny card offers a full array of display outputs including a pair of dual-link DVI connectors, HDMI 2.0, and DisplayPort 1.2. No word yet on pricing or availability, but the product page is up on the ASUS site.

GTX 970 in SLI, $700 of graphics power

Subject: Graphics Cards | November 20, 2014 - 07:08 PM |
Tagged: sli, NVIDA, GTX 970

The contestants are lined up on [H]ard|OCP's test bench: at around $700 you have a pair of GTX 970s, with a pair of R9 290X cards in the same weight class; next, weighing in at just under $550, are two R9 290s; and rounding out the competition are a pair of GTX 780s, which punch in somewhere between $800 and $1000 depending on when you look.  The cards are tested for their ability to perform on a 4K stage as well as in the larger 5760x1200 multi-monitor event.  After a long and gruelling battle, the extra work the 290X put into trimming itself down to a lower weight class proved well worth the effort, as the pair managed to show up the 970s in every performance category, although certainly not in power efficiency.  Any of these pairings will be powerful, but none can match a pair of GTX 980s, which are also in a price class all by themselves.

14164141670QcFceujXf_1_1.gif

"We take 2-Way NVIDIA GeForce GTX 970 SLI for a spin and compare it to R9 290X CF, R9 290 CF, GTX 780 SLI at 4K resolution as well as NV Surround on a triple-display setup. If you want to see how all these video cards compare in these different display configurations we've got just the thing. Find out what $700 SLI gets you."


Source: [H]ard|OCP

Gigabyte Wants All Your Money for a 3-Way SLI Watercooled GTX 980 Setup

Subject: Graphics Cards | November 14, 2014 - 11:46 AM |
Tagged: sli, nvidia, N980X3WA-4GD, maxwell, GTX 980, gigabyte, geforce, 3-way

Earlier this week, a new product showed up on Gigabyte's website that has garnered quite a bit of attention. The GA-N980X3WA-4GD WaterForce Tri-SLI is a 3-Way SLI system with integrated water cooling powered by a set of three GeForce GTX 980 GPUs.

waterforce1.jpg

That. Looks. Amazing.

What you are looking at is a 3-Way closed loop water cooling system with an external enclosure to hold the radiators while providing a display full of information including temperatures, fan speeds and more. Specifications on the Gigabyte site are limited for now, but we can infer a lot from them:

  • WATERFORCE: 3-Way SLI Water Cooling System
  • Real-Time Display and Control
  • Flex Display Technology
  • Powered by NVIDIA GeForce GTX 980 GPU
  • Integrated with 4GB GDDR5 memory, 256-bit memory interface (single card)
  • Features dual-link DVI-I / DVI-D / HDMI / DisplayPort ×3 (single card)
  • BASE: 1228 MHz / BOOST: 1329 MHz
  • System power supply requirement: 1200W (with six 8-pin external power connectors)

waterforce2.jpg

The GPUs on each card are your standard GeForce GTX 980 with 4GB of memory (we reviewed it here) though they are running at overclocked base and boost clock speeds, as you would hope with all that water cooling power behind it. You will need a 1200+ watt power supply for this setup, which makes sense considering the GPU horsepower you'll have access to.

Another interesting feature Gigabyte is listing is called GPU Gauntlet Sorting.

With GPU Gauntlet™ Sorting, the Gigabyte SOC graphics card guarantees the higher overclocking capability in terms of excellent power switching.

Essentially, Gigabyte is going to make sure that the GPUs on the WaterForce Tri-SLI are the best they can get their hands on, with the best chance for overclocking higher than stock.

waterforce3.jpg

Setup looks interesting - the radiators and fans will be in the external enclosure with tubing passing into the system through a 5.25-in bay. It will need to have quick connect/disconnect points at either the GPU or radiator to make that installation method possible.

waterforce4.jpg

Pricing and availability are still unknown, but don't expect to get it cheap. With the GTX 980 still selling for at least $550, you should expect something in the $2000 range or above with all the custom hardware and fittings involved.

Can I get two please?

Source: Gigabyte

NVIDIA GeForce GTX 960 Specifications Potentially Leaked

Subject: Graphics Cards | November 13, 2014 - 12:46 PM |
Tagged: nvidia, geforce, gtx 960, maxwell

It is possible that a fragment of a shipping invoice for the NVIDIA GeForce GTX 960 has leaked. Of course, an image of text on a plain white background is one of the easiest things to fake and/or manipulate, so take it with a grain of salt.

nvidia-gtx-960-shipping.jpg

The GTX 960 is said to have 4GB of RAM on a 256-bit bus. Its video outputs are listed as two DVI, one HDMI, and one DisplayPort, making this graphics card useful for just one G-Sync monitor per card. If I'm reading it correctly, it also seems to have a 993 MHz base clock (boost clock unlisted) and an effective 6008 MHz (1500 MHz actual) RAM clock. This is slightly below the 7 GHz (1750 MHz actual) of the GTX 970 and GTX 980 parts, but it should also be significantly cheaper.
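If the invoice numbers are genuine, peak memory bandwidth follows directly from the bus width and effective clock (GDDR5's "effective" rate is four times the actual command clock):

```python
# Peak bandwidth (GB/s) = (bus width in bytes) x (effective MT/s) / 1000.
def bandwidth_gbs(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000.0

print(round(bandwidth_gbs(256, 6008), 1))  # rumored GTX 960 figures
print(round(bandwidth_gbs(256, 7000), 1))  # GTX 970/980-class memory
```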

The GeForce GTX 960 is expected to retail in the low-$200 price point... some day.

Source: Reader Tip

Ubisoft Responds to Low Frame Rates in Assassin's Creed Unity

Subject: Graphics Cards | November 12, 2014 - 09:03 PM |
Tagged: Unity, ubisoft, assassin's creed

Over the last couple of days there have been a lot of discussions about the performance of the new Assassin's Creed Unity from Ubisoft on current generation PC hardware. Some readers have expressed annoyance that the game is running poorly, at lower than expected frame rates, at a wide range of image quality settings. Though I haven't published my results yet, we are working on a story comparing NVIDIA and AMD GPUs in Unity, but the truth is that this is occurring on GPUs from both sides.

For example, using a Core i7-3960X and a single GeForce GTX 980 4GB reference card, I see anywhere from 37 FPS to 48 FPS while navigating the crowded city of Paris at 1920x1080 and on the Ultra High preset. Using the Low preset, that frame rate increases to 65-85 FPS or so.

unity3.jpg

Clearly, those are lower frame rates at 1920x1080 than you'll find in basically any other PC game on the market. The accusation from some in the community is that Ubisoft is either doing this on purpose or out of neglect, shipping inefficient code. I put some questions to the development team at Ubisoft, and though I only had a short time with them, the answers tell their side of the story.

Ryan Shrout: What in the Unity game engine is putting the most demand on the GPU and its compute resources? Are there specific effects or were there specific design goals for the artists that require as much GPU horsepower as the game does today with high image quality settings?

Ubisoft: Assassin’s Creed Unity is one of the most detailed games on the market and [contains] a giant, open world city built to the scale that we’ve recreated. Paris requires significant details. Some points to note about Paris in Assassin’s Creed Unity:

  • Tens of thousands of objects are visible on-screen, casting and receiving shadows.
  • Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
  • The entire game world has global illumination and local reflections.
  • There is realistic, high-dynamic range lighting.
  • We temporally stabilized anti-aliasing.

RS: Was there any debate internally about downscaling on effects/image quality to allow for lower end system requirements?

Ubisoft: We talked about this a lot, but our position always came back to us ensuring that Assassin’s Creed Unity is a next-gen only game with breakthrough graphics. With this vision, we did not degrade the visual quality of the game. On PC, we have several option for low-scaling, like disabling AA, decreasing resolution, and we have low option for Texture Quality, Environment Quality and Shadows.

RS: Were you looking forward or planning for future GPUs (or multi-GPU) that will run the game at peak IQ settings at higher frame rates than we have today?

Ubisoft: We targeted existing PC hardware.

RS: Do you envision updates to the game or to future GPU drivers that would noticeably improve performance on current generations of hardware?

Ubisoft: The development team is continuing to work on optimization post-launch through software updates. You’ll hear more details shortly.

Some of the features listed by the developer in the first answer - global illumination methods, high triangle counts, HDR lighting - can be pretty taxing on GPU hardware. I know there are people out there pointing out games that have similar feature sets and that run at higher frame rates, but the truth is that no two game engines are truly equal. If you have seen Assassin's Creed Unity in action you'll be able to tell immediately the game is beautiful, stunningly so. Is it worth that level of detail for the performance levels achieved from current high-end hardware? Clearly that's the debate.

unity2.jpg

When I asked if Ubisoft had considered scaling back the game to improve performance, they clearly decided against it. The developer had a vision for the look and style of the game and they were dedicated to it; maybe to a fault from some gamers' viewpoint.

Also worth noting is that Ubisoft is continuing to work on optimization post-release; how much of an increase we'll actually see from game patches or driver updates remains to be seen as we move forward. Some developers have a habit of releasing a game and simply abandoning it as it shipped; hopefully we will see more dedication from the Unity team.

So, if the game runs at low frame rates on modern hardware... what is the complaint exactly? I do believe that Ubisoft would have benefited from better performance at lower image quality settings. If you swap the settings for yourself in game, the quality difference between Low and Ultra High is noticeable, but not dramatically so. Again, this likely harkens back to Ubisoft's desire to maintain an artistic vision.

Remember that when Crysis 3 launched early last year, running at 1920x1200 at 50 FPS required a GTX 680, the top GPU at the time; and that was at the High settings. The Very High preset only hit 37 FPS on the same card.

PC gamers seem to be creating a double standard. On one hand, none of us want PC ports or games developed with consoles in mind that don't take advantage of the power of the PC platform. Games in the Call of Duty series are immensely popular but, until the release of Advanced Warfare, would routinely run at 150-200 FPS at 1080p on a modern PC. Crysis 3 and Assassin's Creed Unity are the opposite of that: games that really tax current CPU and GPU hardware, paving a way forward for future GPUs to be developed and NEEDED.

If you're NVIDIA or AMD, you should applaud this kind of work. Now I am more interested than ever in a GTX 980 Ti, or an R9 390X, to see what Unity will play like, what Far Cry 4 will run at, or whether Dragon Age Inquisition looks even better.

Of course, if we can get more performance from a better optimized or tweaked game, we want that too. Developers need to be able to cater to as wide a PC gaming audience as possible, but sometimes creating a game that can scale between running on a GTX 650 Ti and a GTX 980 is a huge pain. And with limited time frames and budgets, don't we want at least some developers to focus on visual quality rather than "dumbing down" the product?

Let me know what you all think - I know this is a hot-button issue!

UPDATE: Many readers in the comments are bringing up the bugs and artifacts within Unity, pointing to YouTube videos and whatnot. Those are totally valid complaints about the game, but don't necessarily reflect on the game's performance - which is what we were trying to target with this story. Having crashes and bugs in the game is disappointing, but again, Ubisoft and Assassin's Creed Unity aren't alone here. Have you seen the bugs in Skyrim or Tomb Raider? Hopefully Ubisoft will be more aggressive in addressing them in the near future. 

UPDATE 2: I also wanted to comment that even though I seem to be defending Ubisoft around the performance of Unity, my direct feedback to them was that they should enable modes in the game that allow it to play at higher frame rates and even lower image quality settings, even if they were unable to find ways to "optimize" the game's efficiency. So far the developer seems aware of all the complaints around performance, bugs, physics, etc. and is going to try to address them.

UPDATE 3: In the last day or so, a couple of other media outlets have posted anonymous information indicating that Assassin's Creed Unity's draw call count is at fault for the game's poor performance on PCs. According to this "anonymous" source, while the consoles have low-level API access that lets them accept and process several times the draw calls, DirectX 11 can only handle "7,000 - 10,000 peak draw calls." Unity apparently is "pushing in excess of 50,000 draw calls per frame" and thus is putting more pressure on the PC than it can handle, even with high-end CPU and GPU hardware. The fact that these comments are "anonymous" is pretty frustrating, as it means that even if they are accurate, they can't be taken as the truth without confirmation from Ubisoft. If this turns out to be true, it would be a confirmation that Ubisoft didn't take the time to implement a DX11 port correctly. If it's not true, or only partially to blame, we are left with more meaningless finger-pointing.
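To illustrate why a draw call count like that could matter, here is a back-of-envelope model of CPU-side submission cost. The 2-microsecond per-call figure is purely an illustrative assumption, not a measured DX11 number:

```python
# If each draw call costs ~2 us of CPU time to submit (an ASSUMED,
# illustrative figure), submission alone caps the achievable frame rate.
def submission_fps_cap(draw_calls, per_call_us=2.0):
    frame_ms = draw_calls * per_call_us / 1000.0
    return 1000.0 / frame_ms

print(round(submission_fps_cap(10_000)))  # ceiling at 10k calls/frame
print(round(submission_fps_cap(50_000)))  # ceiling at 50k calls/frame
```

This is exactly the lever low-overhead APIs (and console-style low-level access) pull: they shrink the per-call constant rather than the call count.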

PCPer Live! Assassin's Creed Unity Game Stream Powered by NVIDIA!

Subject: General Tech, Graphics Cards | November 10, 2014 - 10:07 PM |
Tagged: video, Unity, pcper, nvidia, live, GTX 980, geforce, game stream, assassins creed

UPDATE: If you missed the live stream event: good news! We have it archived on YouTube now and embedded below for your viewing pleasure!

Assassin's Creed Unity is shaping up to be one of the defining games of the holiday season, with visuals and gameplay additions that are incredible to see in person. Scott already wrote up a post that details some of the new technologies found in the game, along with a video of the impressive detail the engine provides. Check it out!

To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multiplayer gaming fun as well as some prizes to give away to the community. I will be joined by some new NVIDIA faces to take on the campaign in a cooperative style while taking a couple of stops to give away some hardware.

livelogo-unity.jpg

Assassin's Creed Unity Game Stream Powered by NVIDIA

5pm PT / 8pm ET - November 11th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

Another awesome prize haul!! How do you win? It's really simple: just tune in and watch the Assassin's Creed Unity Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!

So stop by Tuesday night for some fun, some gaming and the chance to win some goods!

unity1.jpg

unity2.jpg

unity3.jpg

Meet the new Maxwell STRIX

Subject: Graphics Cards | November 10, 2014 - 03:45 PM |
Tagged: asus, strix, GTX 970 STRIX DirectCU II OC, GTX 970, nvidia, maxwell

When ASUS originally kicked off their STRIX line, it gained popularity not only for the decent overclock and efficient custom cooler but also because there was only a small price premium over the base model.  At a price of $400 on Amazon, the card is in line with other overclocked models, though some base models can be up to $50 less.  [H]ard|OCP investigated this card to see what benefits you could expect from it in this review, comparing it to the R9 290 and 290X.  Out of the box the card runs at a core clock of 1253 - 1266MHz and memory at 7GHz; with a bit of overvolting they saw a stable core of 1473 - 1492MHz and memory at 7.832GHz.

With the price of the 290X dipping as low as $330, it makes for an interesting choice for GPU shoppers.  The NVIDIA card is far more power efficient, and its fans operate at 0dB until the GPU hits 65C, a point [H] did not reach until after running at full load for some time; even then, the highest their manually overclocked card hit was 70C.  On the other hand, the AMD card costs $70 less and offers very similar performance.  It is always nice to see competition in the market.

1414970377FWgAjDtieB_1_9_l.jpg

"Today we examine ASUS' take on the GeForce GTX 970 video card. We have the ASUS GTX 970 STRIX DirectCU II OC video card today, and will break down its next-gen performance against an AMD Radeon R9 290 and R9 290X. This video card features 0dB fans, and many factors that improve its chance of extreme overclocking."


Source: [H]ard|OCP

AMD Gaming Evolved Goes Beyond Earth with Hawaii

Subject: General Tech, Graphics Cards | November 6, 2014 - 12:00 AM |
Tagged: radeon, r9 295x2, R9 290X, r9 290, R9, hawaii, civilization, beyond earth, amd

Why settle for space, when you can go Beyond Earth too (but only if you go to Hawaii)!

firaxis-civ-beyond-earth.jpg

The Never Settle promotion launched itself into space a couple of months ago, but AMD isn't settling for that. If you purchase a Hawaii-based graphics card (R9 290, R9 290X, or R9 295X2), you will get a free copy of Civilization: Beyond Earth on top of your choice of three games (or game packs) from the Never Settle Space Gold Reward tier. Beyond Earth makes a lot of sense, of course, because it is a new game that is also one of the most comprehensive implementations of Mantle yet.

AMD_GPU_web_iframe_Gold.png

To be eligible, the purchase needs to be made on or after November 6th (which is today). Check that what you're buying is a "qualifying purchase" from "participating retailers", because that is a lot of value to miss in a moment of carelessness.

AMD has not specified an end date for this promotion.

Source: AMD

8 GB Variants of the R9 290X Coming This Month

Subject: General Tech, Graphics Cards | November 5, 2014 - 12:56 PM |
Tagged: radeon, R9 290X, R9, amd, 8gb

With AMD's current range of R9 290X cards sitting at 4 GB of memory, listings for an 8 GB version have appeared at an online retailer. As far back as March, Sapphire was rumored to be building an 8 GB variant. Those rumors were supposedly quashed last month by AMD and Sapphire. However, AMD has since confirmed the existence of the new additions to the series. Pre-orders have appeared online and are said to be shipping out this month.

amd-r9-290x-8gb-GX-353-SP_88860_600.jpg

Image Credit: Overclockers UK

With 8 GB of GDDR5 memory and price tags between $480 and $520, these new additions, as expected, do not come cheap. With the 4 GB versions of the R9 290X running about $160 less at the same retailer, is the upgrade worth it at this stage? For people using a single 1080p monitor, the answer is likely no. For those with multi-screen setups, or with deep enough pockets to own a 4K display, however, the benefits may begin to justify the premium. Even at 4K, though, a single 8 GB R9 290X may not provide the best experience; a CrossFire setup, being less reliant on the speed of any one GPU, would benefit more from the 8 GB bump.

AMD’s 8 GB R9 290X cards are currently available for pre-order: a reference version at £299.99 + VAT (~$480) and a Vapor-X version at £324.99 + VAT (~$520). They are slated to ship later this month.

What, me jealous? Four weeks with SLI'd GTX 980s

Subject: Graphics Cards | October 31, 2014 - 03:45 PM |
Tagged: sli, nvidia, GTX 980

Just in case you need a reason to be insanely jealous of someone, [H]ard|OCP has just published an article covering what it is like to live with two GTX 980s in SLI.  The cards drive three Dell U2410 24" 1920x1200 displays at a relatively odd combined resolution of 3600x1920, and apart from an issue with the GeForce Experience software suite the cards have no trouble driving all three monitors.  In their testing of the Borderlands games, [H] definitely noticed when PhysX was turned on, though like others they wish PhysX would abandon its proprietary roots.  Compared to a Radeon R9 290X CrossFire system, performance is very similar, but in heat, power, and noise the 980s are the clear winner.  Keep in mind that a good 290X is just over $300, while the least expensive GTX 980 will run you over $550.

1414677298HAmmSaoZGr_1_1.jpg

"What do you get when you take two NVIDIA GeForce GTX 980 video cards, configure those for SLI, and set those at your feet for four weeks? We give our thoughts and opinions about actually using these GPUs in our own system for four weeks with focus on performance, sound profile, and heat generated by these cards."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Assassin's Creed Unity Has NVIDIA-exclusive Effects via GameWorks

Subject: General Tech, Graphics Cards | October 29, 2014 - 06:12 PM |
Tagged: ubisoft, assassin's creed

Ubisoft has integrated GameWorks into Assassin's Creed Unity, or at least parts of it. The main feature to be included is NVIDIA's Horizon Based Ambient Occlusion Plus (HBAO+), which is their implementation of ambient occlusion. This effect darkens areas that would otherwise be incorrectly lit given the current limitations of global illumination. Basically, it analyzes the scene's geometry to subtract some of the influence of "ambient light" in places where that approximation is unrealistic (particularly in small crevices). This is especially useful for overcast scenes, where direct sunlight does not overwhelm the contribution of scatters and bounces.
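To make the idea concrete: ambient occlusion boils down to scaling the ambient lighting term at each surface point by how "open" the space above that point is. The toy sketch below illustrates only that general principle with made-up numbers; it is not NVIDIA's HBAO+ algorithm, and the function names and sample counts are purely illustrative:

```python
# Toy ambient-occlusion sketch (illustrative only -- NOT HBAO+).
# The ambient lighting term is scaled by the fraction of sample
# directions above a surface point that are NOT blocked by geometry.

def ambient_occlusion(occluded_samples, total_samples):
    """Return an AO factor in [0, 1]: 1 = fully open, 0 = fully occluded."""
    return 1.0 - occluded_samples / total_samples

def shade_ambient(albedo, ambient_intensity, ao):
    """Ambient contribution for one surface point, darkened by AO."""
    return albedo * ambient_intensity * ao

# An open floor: few sample directions blocked -> nearly full ambient light.
open_point = shade_ambient(0.8, 0.5, ambient_occlusion(2, 16))

# A small crevice: most sample directions blocked -> heavily darkened.
crevice_point = shade_ambient(0.8, 0.5, ambient_occlusion(14, 16))

print(open_point, crevice_point)  # the crevice point is much darker
```

HBAO+ does something far more sophisticated (estimating occlusion from the depth buffer in screen space), but the output it feeds into shading is this same kind of darkening factor.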

The other features to be included are Temporal Anti-aliasing (TXAA), Percentage-Closer Soft Shadows (PCSS), and GeometryWorks Advanced Tessellation. TXAA and PCSS were both included in Assassin's Creed IV: Black Flag, alongside the previously mentioned HBAO+, so it makes sense that Ubisoft continues to use what worked for them. GeometryWorks is a different story. NVIDIA seems to claim that it is like DirectX 11 tessellation, but better suited for use alongside HBAO+ and PCSS.

unity2.jpg

Assassin's Creed Unity will be available on November 11th.

Source: NVIDIA