Author:
Manufacturer: ASUS

A slightly smaller MARS

The NVIDIA GeForce GTX 760 was released in June of 2013.  Based on the same GK104 GPU as the GTX 680, GTX 670 and GTX 770, the GTX 760 disables a couple more of the GPU's processor clusters to offer impressive performance at a lower cost than we had seen previously.  My review of the GTX 760 was very positive, as NVIDIA had priced it aggressively against the competing products from AMD. 

As for ASUS, they have a storied history with the MARS brand.  Typically an over-built custom PCB with two of the highest end NVIDIA GPUs stapled together, the ASUS MARS cards have been limited edition products with a lot of cachet around them.  The first MARS card was a dual GTX 285 product that was the first card to offer 4GB of memory (though 2GB per GPU, of course).  The MARS II took a pair of GTX 580 GPUs and pasted them on a HUGE card, with just 1000 sold worldwide.  It was heavy, expensive and fast; blazing fast.  But at a price of $1200+ it wasn't on the radar of most PC gamers.

IMG_9023.JPG

Interestingly, a MARS iteration for the GTX 680 never materialized, and why that is the case is still a matter of debate.  Some point the finger at poor sales, while others think that NVIDIA restricted ASUS' engineers from being as creative as they needed to be.

Today's release of the ASUS ROG MARS 760 is a bit different - this is still a high end graphics card but it doesn't utilize the fastest single-GPU option on the market.  Instead ASUS has gone with a more reasonable design that combines a pair of GTX 760 GK104 GPUs on a single PCB with a PCI Express bridge chip between them.  The MARS 760 is significantly smaller and less power hungry than previous MARS cards but it is still able to pack a punch in the performance department as you'll soon see.

Continue reading our review of the ASUS ROG MARS 760 Dual GPU Graphics Card!!

Author:
Manufacturer: NVIDIA

Quality time with G-Sync

Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology.  When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers.  This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off, along with the tear-free images normally reserved for gamers who enable V-Sync. 

IMG_8938.JPG

NVIDIA's Prototype G-Sync Monitor

We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics.  All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.

Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed.  This led up to an explanation of how G-Sync works, including its integration via extended VBLANK signals, and detailed how NVIDIA is enabling the graphics card to retake control over the entire display pipeline.

In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync.  In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better.  It is a 1+ hour long video, but I selfishly believe that it is the most concise and well put together collection of information about G-Sync for our readers.

The story today is more about extensive hands-on testing with the G-Sync prototype monitors.  The displays that we received this week were modified versions of the 144Hz ASUS VG248QE gaming panel, the same one that end users should, in theory, be able to upgrade themselves sometime in the future.  These are 1920x1080 TN panels and, though they have incredibly high refresh rates, they aren't usually regarded as the highest image quality displays on the market.  However, the story about what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user. 

Continue reading our tech preview of NVIDIA G-Sync!!

Author:
Manufacturer: EVGA

EVGA Brings Custom GTX 780 Ti Early

Reference cards for new graphics card releases are very important for a number of reasons.  Most importantly, these are the cards presented to the media and reviewers that judge the value and performance of these cards out of the gate.  These various articles are generally used by readers and enthusiasts to make purchasing decisions, and if first impressions are not good, it can spell trouble.  Also, reference cards tend to be the first cards sold in the market (see the recent Radeon R9 290/290X launch) and early adopters get the same technology in their hands; again the impressions reference cards leave will live in forums for eternity.

All that being said, retail cards are where partners can differentiate and keep the various GPUs relevant for some time to come.  EVGA is probably the best known NVIDIA partner and is clearly their biggest outlet for sales.  The ACX cooler is one we saw popularized with the first GTX 700-series cards, and the company has quickly adapted it to the GTX 780 Ti, released by NVIDIA just last week.

evga780tiacx.jpg

Normally I would have a full review for you as soon as possible, but thanks to a couple of upcoming trips that will keep me away from the GPU test bed, that will take a little while longer.  However, I thought a quick preview was in order to show off the specifications and performance of the EVGA GTX 780 Ti ACX.

gpuz.png

As expected, the EVGA ACX design of the GTX 780 Ti is overclocked.  While the reference card runs at a base clock of 875 MHz and a typical boost clock of 928 MHz, this retail model has a base clock of 1006 MHz and a boost clock of 1072 MHz.  This means that all 2,880 CUDA cores are going to run somewhere around 15% faster on the EVGA ACX model than the reference GTX 780 Ti SKUs. 
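If you want to check that math yourself, the percentage is just the ratio of EVGA's factory clocks to NVIDIA's reference clocks; here is a quick Python sketch using only the figures quoted above:

```python
# Clock figures quoted above; the ~15% estimate is simply the ratio of
# EVGA's factory clocks to NVIDIA's reference clocks.
reference = {"base": 875, "boost": 928}    # MHz, reference GTX 780 Ti
evga_acx = {"base": 1006, "boost": 1072}   # MHz, EVGA GTX 780 Ti ACX

for clock in ("base", "boost"):
    gain = (evga_acx[clock] / reference[clock] - 1) * 100
    print(f"{clock} clock: +{gain:.1f}%")   # base: +15.0%, boost: +15.5%
```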

We should note that though the cooler is custom built by EVGA, the PCB design of this GTX 780 Ti card remains the same as the reference models. 

Continue reading our preview of the EVGA GeForce GTX 780 Ti ACX custom-cooled graphics card!!

NVIDIA strikes back!

Subject: Graphics Cards | November 8, 2013 - 04:41 PM |
Tagged: nvidia, kepler, gtx 780 ti, gk110, geforce

Here is a roundup of the reviews of what is now the fastest single GPU card on the planet, the GTX 780 Ti, which uses a fully enabled GK110 chip.  Its 7 GHz GDDR5 is clocked faster than AMD's memory but rides on a 384-bit memory bus, narrower than the R9 290X's 512-bit bus, which leads to some interesting questions about the performance of this card at high resolutions.  Are you willing to pay quite a bit more for better performance and a quieter card? Check out the performance deltas at [H]ard|OCP and see if that changes your mind at all.
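That memory trade-off is easy to quantify, since peak bandwidth is just the per-pin data rate multiplied by the bus width.  A minimal sketch using the published figures (7 Gbps on a 384-bit bus for the GTX 780 Ti, 5 Gbps on a 512-bit bus for the R9 290X):

```python
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width in bytes."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gb_s(7.0, 384))  # GTX 780 Ti: 336.0 GB/s
print(peak_bandwidth_gb_s(5.0, 512))  # R9 290X:   320.0 GB/s
```

In raw terms the faster GDDR5 just edges out the wider bus, which is why the high-resolution results are the ones to watch.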

You can see how it measures up in Ryan's review as well.

1383802230J422mbwkoS_1_10_l.jpg

"NVIDIA's fastest single-GPU video card is being launched today. With the full potential of the Kepler architecture and GK110 GPU fully unlocked, how will it perform compared to the new R9 290X with new drivers? Will the price versus performance make sense? Will it out perform a TITAN? We find out all this and more."


Source: [H]ard|OCP
Author:
Manufacturer: NVIDIA

GK110 in all its glory

I bet you didn't realize that October and November were going to bring the onslaught of graphics card releases they have.  I know I did not, and I tend to have a better background on these things than most of our readers.  Starting with the release of the AMD Radeon R9 280X, R9 270X and R7 260X in the first week of October, it has pretty much been a non-stop battle between NVIDIA and AMD for the hearts, minds, and wallets of PC gamers. 

Shortly after the Tahiti refresh came NVIDIA's move into display technology with G-Sync, a variable refresh rate feature that will work with upcoming monitors from ASUS and others as long as you have a GeForce Kepler GPU.  The technology was damned impressive, but I am still waiting for NVIDIA to send over some panels for extended testing. 

Later in October we were hit with the R9 290X, the Hawaii GPU that brought AMD back into the world of ultra-class single GPU performance.  It produced stellar benchmarks and undercut the prices (then at least) of the GTX 780 and GTX TITAN.  We tested it in both single and multi-GPU configurations and found that AMD had made some impressive progress in fixing its frame pacing issues, even with Eyefinity and 4K tiled displays. 

NVIDIA dropped a driver release with ShadowPlay that allows gamers to record gameplay locally without a hit on performance.  I posted a roundup of R9 280X cards which showed alternative coolers and performance ranges.  We investigated the R9 290X Hawaii GPU and the claims that performance is variable and configurable based on fan speeds.  Finally, the R9 290 (non-X model) was released this week to more fanfare than the 290X thanks to its nearly identical performance and $399 price tag. 

IMG_1862.JPG

And today, yet another release.  NVIDIA's GeForce GTX 780 Ti takes the performance of the GK110 and fully unlocks it.  The GTX TITAN uses one fewer SMX and the GTX 780 has three fewer SMX units so you can expect the GTX 780 Ti to, at the very least, become the fastest NVIDIA GPU available.  But can it hold its lead over the R9 290X and validate its $699 price tag?

Continue reading our review of the NVIDIA GeForce GTX 780 Ti 3GB GK110 Graphics Card!!

NVIDIA Drops GTX 780, GTX 770 Prices, Announces GTX 780 Ti Price

Subject: Graphics Cards | October 28, 2013 - 09:29 AM |
Tagged: nvidia, kepler, gtx 780 ti, gtx 780, gtx 770, geforce

A lot of news coming from the NVIDIA camp today, including some price drops and price announcements. 

First up, the high-powered GeForce GTX 780 is getting dropped from $649 to $499, a $150 savings that brings the GTX 780 in line with the competition from AMD's new Radeon R9 290X, launched last week.

Next, the GeForce GTX 770 2GB is going to drop from $399 to $329 to help it compete more closely with the R9 280X. 

r9290x.JPG

Even if you weren't excited about the R9 290X, you have to be excited by competition.

In a surprising turn of events, NVIDIA is now the company with the great bundle deal with GPUs as well!  Starting today you'll be able to get a free copy of Batman: Arkham Origins, Splinter Cell: Blacklist and Assassin's Creed IV: Black Flag with the GeForce GTX 780 Ti, GTX 780 and GTX 770.  If you step down to the GTX 760 or 660 you'll lose out on the Batman title.

SHIELD discounts are available as well: $100 off if you buy one of the upper tier GPUs and $50 off if you buy the lower tier. 

UPDATE: NVIDIA just released a new version of GeForce Experience that enables ShadowPlay, the ability to use Kepler GPUs to record game play in the background with almost no CPU/system overhead.  You can see Scott's initial impressions of the software right here; it seems like it's going to be a pretty awesome feature.

bundle.png

Need more news?  The yet-to-be-released GeForce GTX 780 Ti is also getting a price - $699 based on the email we just received.  And it will be available starting November 7th!!

With all of this news, how does it change our stance on the graphics market?  Quite a bit in fact.  The huge price drop on the GTX 780, coupled with the 3-game bundle means that NVIDIA is likely offering the better hardware/software combo for gamers this fall.  Yes, the R9 290X is likely still a step faster, but now you can get the GTX 780, three great games and spend $50 less. 

The GTX 770 is now poised to make a case for itself against the R9 280X as well with its $70 drop.  The R9 280X / HD 7970 GHz Edition was definitely a better option with its $100 price delta but with only $30 separating the two competing cards, and the three free games, again the advantage will likely fall to NVIDIA.

Finally, the price point of the GTX 780 Ti is interesting - if NVIDIA is smart they are pricing it based on comparable performance to the R9 290X from AMD.  If that is the case, then we can guess the GTX 780 Ti will be a bit faster than the Hawaii card, while likely being quieter and using less power too.  Oh, and again, the three game bundle. 

NVIDIA did NOT announce a GTX TITAN price drop which might surprise some people.  I think the answer as to why will be addressed with the launch of the GTX 780 Ti next month but from what I was hearing over the last couple of weeks NVIDIA can't make the cards fast enough to satisfy demand so reducing margin there just didn't make sense. 

NVIDIA has taken a surprisingly aggressive stance here in the discrete GPU market.  The need to address and silence critics who think the GeForce brand is being damaged by the AMD console wins is obviously potent inside the company.  The good news for us, and the gaming community as a whole, is that this just means better products and better value for graphics card purchases this holiday.

NVIDIA says these price drops will be live by tomorrow.  Enjoy!

Source: NVIDIA

NVIDIA Launches WHQL Drivers for Battlefield and Batman

Subject: General Tech, Graphics Cards | October 23, 2013 - 12:21 AM |
Tagged: nvidia, graphics drivers, geforce

Mid-June kicked up a storm of poop across the internet when IGN broke the story of AMD's optimizations for Frostbite 3. It was reported that NVIDIA would not receive sample code for those games until after they launched. The article was later updated with a statement from AMD: "... the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release."

Now, I assume, the confusion was caused by then-not-announced Mantle.

nvidia-geforce.png

And, as it turns out, NVIDIA did receive the code for Battlefield 4 prior to launch. Monday, the company launched their 331.58 WHQL-certified drivers, which are optimized for Batman: Arkham Origins and Battlefield 4. According to the release notes, you should even be able to use SLI out of the gate. If, on the other hand, you are a Civilization V player: HBAO+ should enhance your shadowing.

They also added a DX11 SLI profile for Watch Dogs... awkwarrrrrd.

Check out the blog at GeForce.com for a bit more information, read the release notes, or just head over to the drivers page. If you have GeForce Experience installed, it probably already asked you to update.

Source: NVIDIA

NVIDIA Announces G-Sync, Variable Refresh Rate Monitor Technology

Subject: Graphics Cards | October 18, 2013 - 10:52 AM |
Tagged: variable refresh rate, refresh rate, nvidia, gsync, geforce, g-sync

UPDATE: I have posted a more in-depth analysis of the new NVIDIA G-Sync technology: NVIDIA G-Sync: Death of the Refresh Rate.  Thanks for reading!!

UPDATE 2: ASUS has announced the G-Sync enabled version of the VG248QE will be priced at $399.

During a gaming event being held in Montreal, NVIDIA unveiled a new technology for GeForce gamers that the company is hoping will revolutionize the PC and displays.  Called NVIDIA G-Sync, this new feature will combine changes to the graphics driver as well as changes to the monitor to alter the way refresh rates and Vsync have worked for decades.

gsync.jpg

With standard LCD monitors, gamers are forced to choose between the tear-free experience of enabling Vsync or playing with substantial visual anomalies in order to get the best and most efficient frame rates.  G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz or 144 Hz, etc. without the horizontal tearing normally associated with turning off Vsync.  Essentially, G-Sync allows a properly equipped monitor to run at a variable refresh rate, which will improve the experience of gaming in interesting ways.

gsync2.jpg

This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work.  The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display and, as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes.  In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design - so you can obviously expect this to only function with NVIDIA GPUs. 

DisplayPort is the only input option currently supported. 

It turns out NVIDIA will actually be offering retrofit kits for current owners of the VG248QE at a yet-to-be-disclosed cost.  The first retail G-Sync monitors will ship as a monitor + retrofit kit combination, as production is just a bit behind.

Using a monitor with a variable refresh rates allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing.  It can also display 133 FPS at 133 Hz without tearing.  Anything below the 144 Hz maximum refresh rate of this monitor will be running at full speed without the tearing associated with the lack of vertical sync.
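To put rough numbers on that benefit, here is a small illustrative sketch (not NVIDIA code): under fixed-refresh V-Sync a finished frame waits for the next 60 Hz boundary, while a variable refresh panel scans it out as soon as it is ready, limited only by its 144 Hz maximum rate.

```python
# Illustrative only: when does a finished frame actually reach the screen?
import math

REFRESH_60HZ = 1000 / 60    # ms between refreshes on a fixed 60 Hz panel
MIN_INTERVAL = 1000 / 144   # fastest interval a 144 Hz G-Sync panel allows

def vsync_display_time(render_done_ms: float) -> float:
    """Frame appears at the next fixed refresh boundary after it finishes."""
    return math.ceil(render_done_ms / REFRESH_60HZ) * REFRESH_60HZ

def gsync_display_time(render_done_ms: float, prev_display_ms: float) -> float:
    """Frame appears when ready, but no sooner than the panel's max rate."""
    return max(render_done_ms, prev_display_ms + MIN_INTERVAL)

# A frame that takes 18.2 ms to render (~55 FPS):
done = 18.2
print(f"V-Sync: {vsync_display_time(done):.1f} ms")       # 33.3 ms, a full extra refresh late
print(f"G-Sync: {gsync_display_time(done, 0.0):.1f} ms")  # 18.2 ms, shown when ready
```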

vsync04.jpg

The technology that NVIDIA is showing here is impressive when seen in person; and that is really the only way to understand the difference.  High speed cameras and captures will help, but much like 3D Vision, this is a feature that needs to be seen to be appreciated.  How users will react to that roadblock remains to be seen. 

Features like G-Sync show the gaming world that, without the restrictions of consoles, there are quite a few revolutionary steps that can be taken to maintain the PC gaming advantage well into the future.  4K displays were a recent example and now NVIDIA G-Sync adds to the list. 

Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we will be joined in-studio by NVIDIA's Tom Petersen to discuss G-Sync, how it was developed and the various ramifications the technology will have on PC gaming.  You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!

NVIDIA G-Sync Live Stream

11am PT / 2pm ET - October 21st

PC Perspective Live! Page

Source: NVIDIA
Author:
Manufacturer: Various

Summary of Events

In January of 2013 I revealed a new testing methodology for graphics cards that I dubbed Frame Rating.  At the time I was only able to talk about the process, using capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months started to release data and information using this technology.  I followed up the story in January with a collection of videos that displayed some of the capture video and what kind of performance issues and anomalies we were able to easily find. 

My first full test results were published in February to quite a stir, and then finally in late March I released Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing, which dramatically and permanently changed the way graphics cards and gaming performance are discussed and evaluated. 

Our testing proved that AMD CrossFire was not improving gaming experiences in the same way that NVIDIA SLI was.  Also, we showed that other testing tools like FRAPS were inadequate in showcasing this problem.  If you are at all unfamiliar with this testing process or the results it showed, please check out the Frame Rating Dissected story above.
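For readers who want a feel for how the capture analysis works: an overlay tints each rendered frame with the next color in a known repeating sequence, so the recorded video can be decoded back into per-frame screen time.  Here is a toy sketch of that decoding step, with invented colors and scanline counts rather than our actual tooling:

```python
# Toy illustration of the decoding idea (not PC Perspective's actual tooling):
# each rendered frame is tinted with the next color in a known repeating
# sequence, so counting contiguous color bands in one captured 60 Hz frame
# reveals how many rendered frames reached the screen and how much of each.
from itertools import groupby

# Hypothetical per-scanline overlay colors read from one captured frame.
captured_scanlines = ["red"] * 40 + ["lime"] * 35 + ["blue"] * 3 + ["red"] * 30

bands = [(color, len(list(run))) for color, run in groupby(captured_scanlines)]
for color, height in bands:
    # A very short band (the 3-line "blue" sliver here) is a runt frame:
    # rendered and scanned out, but contributing almost nothing visible.
    label = "runt" if height < 10 else "full"
    print(f"{color:>5}: {height:3d} scanlines ({label})")
```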

At the time, we tested 5760x1080 resolution using AMD Eyefinity and NVIDIA Surround but found there were too many issues with our scripts and the results they were presenting to give reasonably assured performance metrics.  Running AMD + Eyefinity was obviously causing some problems but I wasn't quite able to pinpoint what they were or how severe they might have been.  Instead I posted graphs like this:

01.png

We were able to show NVIDIA GTX 680 performance and scaling in SLI at 5760x1080 but we only were giving results for the Radeon HD 7970 GHz Edition in a single GPU configuration.

 

Since those stories were released, AMD has been very active.  At first they were hesitant to believe our results and called into question our processes and the ability of gamers to really see the frame rate issues we were describing.  However, after months of work and pressure from quite a few press outlets, AMD released a 13.8 beta driver with a Frame Pacing option in its 3D controls that evenly spaces out frames in multi-GPU configurations, producing a smoother gaming experience.

02.png

The results were great!  The new AMD driver produced very consistent frame times and put CrossFire on a similar playing field to NVIDIA's SLI technology.  There were limitations, though: the driver only fixed DX10/11 games and only addressed resolutions of 2560x1440 and below.
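To illustrate what "evenly spaced" actually means in the numbers, here is a hypothetical sketch with invented frame times (not our capture data): both runs below average roughly 60 FPS, but the unpaced run alternates short and long frames, which is exactly the stutter this kind of testing exposes.

```python
# Hypothetical numbers for illustration; real data comes from the capture rig.
from statistics import mean, pstdev

unpaced = [5.1, 28.4, 4.9, 29.0, 5.3, 28.1]     # ms, alternating short/long frames
paced = [16.5, 16.9, 16.4, 17.0, 16.6, 16.8]    # ms, evenly spaced frames

for label, times in (("unpaced CrossFire", unpaced), ("frame-paced", paced)):
    # Both runs average ~16.7 ms (~60 FPS), but frame-to-frame variation differs hugely.
    print(f"{label}: mean {mean(times):.1f} ms, stdev {pstdev(times):.1f} ms")
```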

But the story won’t end there.  CrossFire and Eyefinity are still very important in a lot of gamers minds and with the constant price drops in 1920x1080 panels, more and more gamers are taking (or thinking of taking) the plunge to the world of Eyefinity and Surround.  As it turns out though, there are some more problems and complications with Eyefinity and high-resolution gaming (multi-head 4K) that are cropping up and deserve discussion.

Continue reading our investigation into AMD Eyefinity and NVIDIA Surround with multi-GPU solutions!!

Holy Free Non-Free Free To Play NVIDIA!

Subject: General Tech, Graphics Cards | August 31, 2013 - 04:30 PM |
Tagged: geforce, F2P, bundle

If you read Jeremy's post yesterday and actually are a fan of in-game currency, then how about a new bundle? If you pick up a GeForce GTX 650 or a participating GTX 700M-based notebook, you will receive a total of $75 split between three Free-to-Play (F2P) titles. One of the titles, Warframe, is also optimized for NVIDIA hardware with the inclusion of PhysX support, primarily for particle effects it would seem.

The bundle includes:

  • Warframe - 465 Platinum (Normally $25)
  • Dungeons and Dragons: NeverWinter - 1,000,000 Astral Diamonds (Normally $25)
  • Marvel Heroes - 2600 Gold (Normally $25)

NVIDIA stresses that you must make your purchase (be it a system containing a GTX 650, a discrete add-in GTX 650, or a laptop containing a GTX 700M) from one of the participating merchants. Codes will not be provided if the retailer, or 'e-tailer', is not a partner for this program.

Press blast after the break.

Source: NVIDIA