PCPer Live! GeForce GTX TITAN X Live Stream and Giveaway!

Subject: Graphics Cards | March 16, 2015 - 07:13 PM |
Tagged: video, tom petersen, titan x, nvidia, maxwell, live, gtx titan x, gtx, gm200, geforce

UPDATE 2: If you missed the live stream, we now have the replay available below!

UPDATE: The winner has been announced: congrats to Ethan M. for being selected as the random winner of the GeForce GTX TITAN X graphics card!!

Get yourself ready, it’s time for another GeForce GTX live stream hosted by PC Perspective’s Ryan Shrout! This time the focus is NVIDIA's brand-new GeForce GTX TITAN X graphics card, first teased a couple of weeks back at GDC. NVIDIA's Tom Petersen will be joining us live from the GPU Technology Conference show floor to discuss the GM200 GPU and its performance, and to show off some demos of the hardware in action.

GeForce_GTX_TITANX_3Qtr.jpg

And what's a live stream without a prize? One lucky live viewer will win a GeForce GTX TITAN X 12GB graphics card of their very own! That's right - all you have to do is tune in for the live stream tomorrow afternoon and you could win a Titan X!!

pcperlive.png

NVIDIA GeForce GTX TITAN X Live Stream and Giveaway

1pm PT / 4pm ET - March 17th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, March 17th at 1pm PT / 4pm ET at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prize you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

Tom has a history of being both informative and entertaining, and these live streaming events are always full of fun and technical information that you can get nowhere else.

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper, and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 1pm PT / 4pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

NVIDIA Releases GeForce GTX 960M and GTX 950M Mobile Graphics

Subject: Graphics Cards | March 12, 2015 - 11:13 PM |
Tagged: nvidia, maxwell, GTX 960M, GTX 950M, gtx 860m, gtx 850m, gm107, geforce

NVIDIA has announced new GPUs to round out their 900-series mobile lineup, and the new GTX 960M and GTX 950M are based on the same GM107 core as the previous 860M/850M parts.

geforce-gtx-960m-3qtr.png

Both GPUs feature 640 CUDA Cores and are separated by base clock speed, with the GTX 960M operating at 1096 MHz and the GTX 950M at 914 MHz. Both have unlisted maximum Boost frequencies that will likely vary based on thermal constraints. The memory interface is the other differentiator between the GPUs: the GTX 960M sports dedicated GDDR5 memory, while the GTX 950M can be implemented with either DDR3 or GDDR5. Both the GTX 960M and 950M use the same 128-bit memory interface and support up to 4GB of memory.
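The DDR3-versus-GDDR5 choice matters more than the spec sheet suggests, because peak memory bandwidth scales with both bus width and effective data rate. A rough sketch, where the per-pin rates are illustrative assumptions rather than NVIDIA-published figures for these parts:

```python
# Peak bandwidth (GB/s) = bus width in bytes * effective data rate (Gbps/pin).
# The effective rates below are illustrative assumptions, not official specs.

def peak_bandwidth(bus_width_bits, effective_gbps):
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return (bus_width_bits / 8) * effective_gbps

gddr5 = peak_bandwidth(128, 5.0)  # hypothetical GDDR5 configuration
ddr3 = peak_bandwidth(128, 2.0)   # hypothetical DDR3 configuration
print(f"GDDR5: {gddr5:.0f} GB/s vs DDR3: {ddr3:.0f} GB/s")
```

On the same 128-bit bus, a DDR3 implementation of the GTX 950M would deliver well under half the bandwidth of a GDDR5 one, which is worth checking before buying a notebook.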

As reported by multiple sources, the core powering the 960M/950M is the GM107 Maxwell GPU, which means we are essentially talking about rebadged 860M/850M products, though the unlisted Boost frequencies could potentially be higher on these parts thanks to improved silicon on a mature 28nm process. In contrast, the previously announced GTX 965M is based on a cut-down Maxwell GM204 GPU, with its 1024 CUDA Cores representing half of the GPU core introduced with the GTX 980.

New notebooks featuring the GTX 960M have already been announced by NVIDIA's partners, so we will soon see if there is any performance improvement to these refreshed GM107 parts.

Source: NVIDIA

GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at Epic Games' keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce TITAN X, based on the Maxwell GM200 GPU.

titanx3.jpg

JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.

titanx2.jpg

Any guesses on performance or price?

titanx1.jpg

titanx4.jpg

Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the GPU will only require 6-pin and 8-pin power connectors, indicating that NVIDIA is still pushing power efficiency with GM200.

titanx5.jpg

Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at very least has the SLI bridges to support it.

titanx6.jpg

Author:
Manufacturer: Asus

Quiet, Efficient Gaming

The last few weeks have been dominated by talk about the memory controller of the Maxwell-based GTX 970.  There are some very strong opinions about that particular issue, and certainly NVIDIA was remiss in informing consumers about how that product handles its memory.  While that debate rages, we have somewhat lost track of other products in the Maxwell range.  The GTX 960 was released during this particular firestorm and, while it shares the outstanding power/performance qualities of the Maxwell architecture, it is considered a little overpriced compared to other cards in its class.

It is easy to forget that the original Maxwell-based product to hit shelves was the GTX 750 series of cards, released a year ago to some very interesting reviews.  The board was one of the first mainstream cards in recent memory to draw under 75 watts while still playing games at good quality settings at 1080p.  Ryan covered this very well, and it turned out to be a perfect gaming card for many pre-built systems that do not have extra power connectors (or a power supply that can support 125+ watt graphics cards).  These are relatively inexpensive cards and very easy to install, producing a big jump in performance compared to the integrated graphics of modern CPUs and APUs.

strix_01.jpg

The GTX 750 and GTX 750 Ti have proven to be popular cards due to their overall price, performance, and extremely low power consumption.  They also tend to produce a relatively low amount of heat, due to solid cooling combined with that low power consumption.  The Maxwell architecture has also introduced some new features, but the major changes are to the overall design of the architecture as compared to Kepler.  Instead of 192 cores per SMX, there are now 128 cores per SMM.  NVIDIA has done a lot of work to improve performance per core as well as to lower power in a fairly dramatic way.  An interesting side effect is that the CPU hit with Maxwell is a couple of percentage points higher than with Kepler.  NVIDIA does lean a bit more on the CPU to improve overall GPU performance, but most of this hit is covered up by some really good realtime compiler work in the driver.

Asus has taken the GTX 750 Ti and applied their STRIX design and branding to it.  While there are certainly faster GPUs on the market, there are none that exhibit the power characteristics of the GTX 750 Ti.  The combination of this GPU and the STRIX design should result in an extremely efficient, cool, and silent card.

Click to read the rest of the review of the Asus STRIX GTX 750 Ti!

You have to pay to play, Gigabyte's overclockable GTX 980 G1 GAMING

Subject: Graphics Cards | February 20, 2015 - 02:08 PM |
Tagged: gigabyte, nvidia, GTX 980 G1 GAMING, windforce, maxwell, factory overclocked

If you want the full complement of Maxwell Streaming Multiprocessors and ROP units, as well as that last 500MB of RAM running at full speed, then you will need to pay for a GTX 980.  One choice is Gigabyte's GTX 980 G1 GAMING, which will cost you $580, around $240 more than a GTX 970, but the premium can be worth it if you need the power.  [H]ard|OCP took the already overclocked card from a boost GPU frequency of 1329MHz and RAM at 7GHz all the way to a boost of 1513MHz with RAM topping out at 8.11GHz.  That overclock had a noticeable effect on performance and helped the card garner an Editor's Choice award.  See it in action here.

1424030496NnqLS3kD4X_1_1.jpg

"Today we have the GIGABYTE GTX 980 G1 GAMING, which features the WINDFORCE 600W cooling system and a high factory overclock. We will make comparisons to the competition, find out how fast it is compared to a GTX 980, and you won't believe the overclock we achieved! We will make both performance and price comparisons."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Maxwell keeps on overclocking

Subject: Graphics Cards | February 12, 2015 - 01:41 PM |
Tagged: overclocking, nvidia, msi, gtx 960, GM206, maxwell

While Ryan was slaving over a baker's dozen of NVIDIA's GTX 960s, [H]ard|OCP focused on overclocking the MSI GeForce GTX 960 GAMING 2G that they recently reviewed.  Out of the box this GPU will hit 1366MHz in game, with memory frequency unchanged at an effective 7GHz.  As users have discovered, overclocking cards with thermal protection that automatically downclocks the GPU at a certain TDP threshold is a little more tricky: simply upping the voltage supplied to the card can raise power draw enough that you end up with a lower frequency than before you overvolted.  After quite a bit of experimentation, [H] managed to boost the memory to a full 8GHz while the in-game GPU clock hit 1557MHz, which is at the higher end of what Ryan saw.  The trick was to increase the Power Limit and turn the clock speed up but leave the voltage alone.
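A toy model makes the trade-off concrete. This is not NVIDIA's actual Boost algorithm, and every constant here is invented for illustration; it only encodes the rule of thumb that power draw scales roughly with frequency and with voltage squared, while the power limit caps whatever clock can be sustained:

```python
# Toy model of power-limited boost: power ~ frequency * voltage^2.
# All constants are invented for illustration; this is not NVIDIA's algorithm.

def sustained_clock(target_mhz, voltage, power_limit_w,
                    base_mhz=1178, base_voltage=1.0, base_power_w=120):
    """Step the clock down in ~13 MHz boost bins until estimated power fits."""
    clock = target_mhz
    while clock > 0:
        est_power = base_power_w * (clock / base_mhz) * (voltage / base_voltage) ** 2
        if est_power <= power_limit_w:
            return clock
        clock -= 13
    return 0

# Raised power limit at stock voltage: the card holds most of the overclock.
print(sustained_clock(1557, voltage=1.00, power_limit_w=156))
# Same power limit but overvolted: the extra draw costs ~140 MHz.
print(sustained_clock(1557, voltage=1.05, power_limit_w=156))
```

In this sketch the overvolted run throttles to a lower sustained clock than the stock-voltage run under the same limit, which mirrors [H]'s finding that raising the Power Limit while leaving voltage alone was the winning recipe.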

1421920414x2ymoRS6JM_2_4_l.jpg

"We push the new MSI GeForce GTX 960 GAMING video card to its limits of performance by overclocking to its limits. This NVIDIA GeForce GTX 960 GPU based video card has a lot of potential for hardware enthusiasts and gamers wanting more performance. We compare it with other overclocked cards to see if the GTX 960 can keep up."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

ASUS Launches GTX 750 Ti Strix OC Edition With Twice the Memory

Subject: Graphics Cards | February 11, 2015 - 11:55 PM |
Tagged: strix, maxwell, gtx 750ti, gtx 750 ti, gm107, factory overclocked, DirectCU II

ASUS is launching a new version of its factory overclocked GTX 750 Ti STRIX with double the memory of the existing STRIX-GTX750TI-OC-2GD5. The new card will feature 4GB of GDDR5, but is otherwise identical.

The new graphics card pairs the NVIDIA GM107 GPU and 4GB of memory with ASUS’ dual fan "0dB" DirectCU II cooler. The card can output video over DVI, HDMI, and DisplayPort.

Thanks to the aftermarket cooler, ASUS has factory overclocked the GTX 750 Ti GPU (640 CUDA cores) to a respectable 1124 MHz base and 1202 MHz GPU Boost clockspeeds. (For reference, stock clockspeeds are 1020 MHz base and 1085 MHz boost.) However, while the amount of memory has doubled, the memory clockspeed remains at a stock 5.4 Gbps (effective).

Asus GTX 750Ti Strix 4GB Factory Overclocked Graphics Card.jpg

ASUS has not announced pricing or availability for the new card, but expect it to come soon at a slight premium (~$15) over the $160 2GB STRIX 750Ti.

The additional memory (and its usefulness versus the price premium) is a bit of a head-scratcher considering this is a budget card aimed at delivering decent 1080p gaming. The extra memory may help in cranking up game graphics settings just a bit more. In the end, the extra memory is nice to have, but if you find a good deal on a 2GB card today, don’t get too caught up on waiting for a 4GB model.

Source: TechPowerUp
Author:
Manufacturer: NVIDIA

A baker's dozen of GTX 960

Back on the launch day of the GeForce GTX 960, we hosted NVIDIA's Tom Petersen for a live stream. During the event, NVIDIA and its partners provided ten GTX 960 cards for our live viewers to win, which we handed out over about an hour and a half. An interesting idea was proposed during the event: what would happen if we tried to overclock all of the product NVIDIA had brought along to see what the distribution of results looked like? After notifying all the winners of their prizes and asking each for permission, we started the arduous process of testing and overclocking a total of 13 different GTX 960 cards (the 10 prizes plus our 3 retail units already in the office).

Hopefully we can provide a solid base of knowledge for GTX 960 buyers that we don't normally have the opportunity to offer: what range of overclocks can you expect, and what is the average or median result? I think you will find the data interesting.

The 13 Contenders

Our collection of thirteen GTX 960 cards includes a handful each from ASUS, EVGA, and MSI. The ASUS cards are all STRIX models, the EVGA cards are of the SSC variety, and the MSI cards include a single Gaming model and three 100MEs. (The only difference between the Gaming and 100ME MSI cards is the color of the cooler.)

cards2.jpg

Jenga!

To be fair to the prize winners, I actually assigned each of them a specific graphics card before opening them up and testing them. I didn't want to be accused of favoritism by giving the best overclockers to the best readers!

Continue reading our overclocking testing of 13 GeForce GTX 960 cards!!

The GTX 960 at 2560x1440

Subject: Graphics Cards | January 30, 2015 - 03:42 PM |
Tagged: nvidia, msi gaming 2g, msi, maxwell, gtx 960

Sitting right now at $220 ($210 after MIR), the MSI GTX 960 GAMING 2G is more or less the same price as a standard R9 280 or a discounted 280X/290.  We have seen that its price-to-performance is quite competitive at 1080p, but this year the push seems to be for performance at higher resolutions.  [H]ard|OCP explored the performance of this card at 2560x1440 against the GTX 770 and 760 as well as the R9 285 and 280X, with some interesting results.  In some cases the 280X came out on top, while in others the 960 won, especially with GameWorks features like God Rays enabled.  Read through this carefully, but from their results the deciding factor between these cards could be price, as they are very close in performance.

14224342445gkR2VhJE3_1_1.jpg

"We take the new NVIDIA GeForce GTX 960 GPU based MSI GeForce GTX 960 GAMING 2G and push it to its limits at 1440p. We also include a GeForce GTX 770 and AMD Radeon R9 280X into this evaluation for some unexpected results. How does the GeForce GTX 960 really compare to a wide range of cards?"

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP
Author:
Manufacturer: NVIDIA

A Summary Thus Far

UPDATE 2/2/15: We have another story up that compares the GTX 980 and GTX 970 in SLI as well.

It has certainly been an interesting week for NVIDIA. It started with the release of the new GeForce GTX 960, a $199 graphics card that brought the latest iteration of the Maxwell architecture to a lower price point, competing with the Radeon R9 280 and R9 285 products. But then the proverbial stuff hit the fan with a memory issue on the GeForce GTX 970, the best-selling graphics card of the second half of 2014. NVIDIA responded to the online community on Saturday morning, but that was quickly followed up with a more detailed exposé on the GTX 970 memory hierarchy, which included a couple of important revisions to the specifications of the GTX 970 as well.

At the heart of all this technical debate is a performance question: does the GTX 970 suffer from lower performance because of the 3.5GB/0.5GB memory partitioning configuration? Many forum members and PC enthusiasts have been debating this for weeks, with many coming away with an emphatic yes.

GM204_arch.jpg

The newly discovered memory system of the GeForce GTX 970

Yesterday I spent the majority of my day trying to figure out a way to validate or invalidate these performance claims. As it turns out, finding specific game scenarios that consistently hit targeted memory usage levels isn't as easy as it might sound; simple things like the application start-up order and the order in which settings are changed can vary memory usage as well. Using Battlefield 4 and Call of Duty: Advanced Warfare, though, I think I have presented a couple of examples that demonstrate the issue at hand.

Performance testing is a complicated story. Lots of users have attempted to measure performance on their own setup, looking for combinations of game settings that sit below the 3.5GB threshold and those that cross above it, into the slower 500MB portion. The issue for many of these tests is that they lack access to both a GTX 970 and a GTX 980 to really compare performance degradation between cards. That's the real comparison to make - the GTX 980 does not separate its 4GB into different memory pools. If it has performance drops in the same way as the GTX 970 then we can wager the memory architecture of the GTX 970 is not to blame. If the two cards perform differently enough, beyond the expected performance delta between two cards running at different clock speeds and with different CUDA core counts, then we have to question the decisions that NVIDIA made.
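A back-of-the-envelope model helps frame what is at stake. When a working set spills past the fast 3.5GB segment, average throughput is the harmonic mean of the two pools' bandwidths, weighted by how much data lives in each. The per-segment figures below follow the widely reported 224-bit and 32-bit splits at 7 Gbps, but treat this as a sketch, not a measurement:

```python
# Sketch: effective bandwidth when a working set straddles the GTX 970's
# two memory segments. Per-segment figures follow widely reported numbers
# (224-bit and 32-bit paths at 7 Gbps); illustration, not measurement.

FAST_GBPS = 196.0  # 3.5GB segment
SLOW_GBPS = 28.0   # 0.5GB segment

def effective_bandwidth(working_set_gb):
    """Average GB/s when working_set_gb is streamed once across both segments."""
    fast = min(working_set_gb, 3.5)
    slow = max(working_set_gb - 3.5, 0.0)
    seconds = fast / FAST_GBPS + slow / SLOW_GBPS
    return working_set_gb / seconds

print(effective_bandwidth(3.0))  # working set entirely in the fast segment
print(effective_bandwidth(4.0))  # working set spills into the slow segment
```

In this model a 4GB working set averages barely more than half the fast segment's bandwidth, which is exactly why the GTX 980 is the right control: its uniform 4GB pool should show no such cliff.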

IMG_9792.JPG

There has also been concern over the frame rate consistency of the GTX 970. Our readers are already aware of how deceptive an average frame rate alone can be, and why looking at frame times and frame time consistency is so much more important to guaranteeing a good user experience. Our Frame Rating method of GPU testing has been in place since early 2013 and it tests exactly that - looking for consistent frame times that result in a smooth animation and improved gaming experience.
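To see why frame times matter more than averages, consider two hypothetical runs with identical average frame rates; only the frame-time distribution reveals the stutter. The numbers below are made up for illustration:

```python
# Two made-up runs with the same average FPS but very different smoothness.
smooth = [16.7] * 60                  # ms per frame: steady ~60 FPS
stutter = [12.0] * 54 + [59.0] * 6    # same total time, periodic spikes

def avg_fps(frame_times_ms):
    """Average FPS over the run: frame count divided by total seconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """Longest single frame time in milliseconds."""
    return max(frame_times_ms)

print(f"avg FPS: {avg_fps(smooth):.1f} vs {avg_fps(stutter):.1f}")
print(f"worst frame: {worst_frame(smooth)} ms vs {worst_frame(stutter)} ms")
```

Both runs report roughly the same average FPS, but the second spends a tenth of its frames near 60 ms, which the player feels as stutter; that gap is exactly what frame-time analysis exposes.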

reddit.jpg

Users at reddit.com have been doing a lot of subjective testing

We will be applying Frame Rating to our testing today of the GTX 970 and its memory issues - does the division of memory pools introduce additional stutter into game play? Let's take a look at a couple of examples.

Continue reading our look at GTX 970 Performance Testing using Frame Rating!