Manufacturer: EVGA

EVGA Brings Custom GTX 780 Ti Early

Reference cards for new graphics card releases are very important for a number of reasons.  Most importantly, these are the cards presented to the media and reviewers that judge the value and performance of these cards out of the gate.  These various articles are generally used by readers and enthusiasts to make purchasing decisions, and if first impressions are not good, it can spell trouble.  Also, reference cards tend to be the first cards sold in the market (see the recent Radeon R9 290/290X launch) and early adopters get the same technology in their hands; again the impressions reference cards leave will live in forums for eternity.

All that being said, retail cards are where partners can differentiate and keep the various GPUs relevant for some time to come.  EVGA is probably the best-known NVIDIA partner and is clearly NVIDIA's biggest sales outlet.  The ACX cooler is one we saw popularized with the first GTX 700-series cards, and the company has quickly adapted it to the GTX 780 Ti, released by NVIDIA just last week.

evga780tiacx.jpg

I would normally have a full review for you as soon as possible, but a couple of upcoming trips will keep me away from the GPU test bed, so that will take a little while longer.  However, I thought a quick preview was in order to show off the specifications and performance of the EVGA GTX 780 Ti ACX.

gpuz.png

As expected, the EVGA ACX design of the GTX 780 Ti is overclocked.  While the reference card runs at a base clock of 875 MHz and a typical boost clock of 928 MHz, this retail model has a base clock of 1006 MHz and a boost clock of 1072 MHz.  This means that all 2,880 CUDA cores are going to run somewhere around 15% faster on the EVGA ACX model than on the reference GTX 780 Ti SKUs.
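That ~15% figure falls straight out of the clock speeds quoted above; a quick sanity check:

```python
# Quick check of the advertised clock bump (numbers from the text above).
ref_base, ref_boost = 875, 928      # reference GTX 780 Ti, MHz
acx_base, acx_boost = 1006, 1072    # EVGA ACX model, MHz

base_gain = (acx_base / ref_base - 1) * 100
boost_gain = (acx_boost / ref_boost - 1) * 100

print(f"Base clock gain:  {base_gain:.1f}%")   # ~15.0%
print(f"Boost clock gain: {boost_gain:.1f}%")  # ~15.5%
```

Both the base and boost deltas land right around 15%, matching the claim.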

We should note that though the cooler is custom built by EVGA, the PCB design of this GTX 780 Ti card remains the same as the reference models. 

Continue reading our preview of the EVGA GeForce GTX 780 Ti ACX custom-cooled graphics card!!

NVIDIA strikes back!

Subject: Graphics Cards | November 8, 2013 - 04:41 PM |
Tagged: nvidia, kepler, gtx 780 ti, gk110, geforce

Here is a roundup of reviews of what is now the fastest single-GPU card on the planet, the GTX 780 Ti, which uses a fully enabled GK110 chip.  Its 7 GHz GDDR5 is clocked faster than AMD's memory but sits on a 384-bit memory bus, narrower than the R9 290X's 512-bit bus, which raises some interesting questions about this card's performance at high resolutions.  Are you willing to pay quite a bit more for better performance and a quieter card?  Check out the performance deltas at [H]ard|OCP and see if that changes your mind at all.
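The faster-memory-but-narrower-bus trade-off is easy to quantify: peak bandwidth is just data rate per pin times bus width.  A small sketch, assuming the R9 290X's commonly cited 5 Gbps effective GDDR5 rate (the exact AMD figure is not given in the text above):

```python
def mem_bandwidth_gb_s(effective_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth = data rate per pin * bus width / 8 bits per byte."""
    return effective_gbps * bus_width_bits / 8

gtx_780_ti = mem_bandwidth_gb_s(7.0, 384)   # 7 Gbps GDDR5 on a 384-bit bus
r9_290x    = mem_bandwidth_gb_s(5.0, 512)   # assumed 5 Gbps GDDR5 on a 512-bit bus

print(f"GTX 780 Ti: {gtx_780_ti:.0f} GB/s")  # 336 GB/s
print(f"R9 290X:    {r9_290x:.0f} GB/s")     # 320 GB/s
```

Under those assumptions the faster memory more than makes up for the narrower bus on paper, though high-resolution behavior still depends on more than peak bandwidth.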

You can see how it measures up in Ryan's review as well.

1383802230J422mbwkoS_1_10_l.jpg

"NVIDIA's fastest single-GPU video card is being launched today. With the full potential of the Kepler architecture and GK110 GPU fully unlocked, how will it perform compared to the new R9 290X with new drivers? Will the price versus performance make sense? Will it out perform a TITAN? We find out all this and more."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP
Manufacturer: NVIDIA

GK110 in all its glory

I bet you didn't realize that October and November were going to become the onslaught of graphics card releases they have been.  I know I did not, and I tend to have a better background on these things than most of our readers.  Starting with the release of the AMD Radeon R9 280X, R9 270X, and R7 260X in the first week of October, it has pretty much been a non-stop battle between NVIDIA and AMD for the hearts, minds, and wallets of PC gamers.

Shortly after the Tahiti refresh came NVIDIA's move into display technology with G-Sync, a variable refresh rate feature that will work with upcoming monitors from ASUS and others as long as you have a GeForce Kepler GPU.  The technology was damned impressive, but I am still waiting for NVIDIA to send over some panels for extended testing. 

Later in October we were hit with the R9 290X, the Hawaii GPU that brought AMD back into the world of ultra-class single-GPU performance.  It has produced stellar benchmarks and undercut the prices (then, at least) of the GTX 780 and GTX TITAN.  We tested it in both single and multi-GPU configurations and found that AMD had made some impressive progress in fixing its frame pacing issues, even with Eyefinity and 4K tiled displays.

NVIDIA dropped a driver release with ShadowPlay that allows gamers to record playback locally without a hit on performance.  I posted a roundup of R9 280X cards which showed alternative coolers and performance ranges.  We investigated the R9 290X Hawaii GPU and the claims that performance is variable and configurable based on fan speeds.  Finally, the R9 290 (non-X model) was released this week to more fanfare than the 290X thanks to its nearly identical performance and $399 price tag. 

IMG_1862.JPG

And today, yet another release.  NVIDIA's GeForce GTX 780 Ti takes the performance of the GK110 and fully unlocks it.  The GTX TITAN uses one fewer SMX and the GTX 780 has three fewer SMX units so you can expect the GTX 780 Ti to, at the very least, become the fastest NVIDIA GPU available.  But can it hold its lead over the R9 290X and validate its $699 price tag?

Continue reading our review of the NVIDIA GeForce GTX 780 Ti 3GB GK110 Graphics Card!!

NVIDIA Drops GTX 780, GTX 770 Prices, Announces GTX 780 Ti Price

Subject: Graphics Cards | October 28, 2013 - 09:29 AM |
Tagged: nvidia, kepler, gtx 780 ti, gtx 780, gtx 770, geforce

A lot of news coming from the NVIDIA camp today, including some price drops and price announcements. 

First up, the high-powered GeForce GTX 780 is being dropped from $649 to $499, a $150 savings that brings the GTX 780 into line with AMD's new Radeon R9 290X, launched last week.

Next, the GeForce GTX 770 2GB is going to drop from $399 to $329 to help it compete more closely with the R9 280X. 

r9290x.JPG

Even if you weren't excited about the R9 290X, you have to be excited by competition.

In a surprising turn of events, NVIDIA is now the company with the great bundle deal with GPUs as well!  Starting today you'll be able to get a free copy of Batman: Arkham Origins, Splinter Cell: Blacklist and Assassin's Creed IV: Black Flag with the GeForce GTX 780 Ti, GTX 780 and GTX 770.  If you step down to the GTX 760 or 660 you'll lose out on the Batman title.

SHIELD discounts are available as well: $100 off if you buy one of the upper-tier GPUs and $50 off if you buy a lower-tier card.

UPDATE: NVIDIA just released a new version of GeForce Experience that enables ShadowPlay, the ability to use Kepler GPUs to record gameplay in the background with almost no CPU/system overhead.  You can see Scott's initial impressions of the software right here; it seems like it's going to be a pretty awesome feature.

bundle.png

Need more news?  The yet-to-be-released GeForce GTX 780 Ti is also getting a price - $699 based on the email we just received.  And it will be available starting November 7th!!

With all of this news, how does it change our stance on the graphics market?  Quite a bit, in fact.  The huge price drop on the GTX 780, coupled with the three-game bundle, means that NVIDIA is likely offering the better hardware/software combo for gamers this fall.  Yes, the R9 290X is likely still a step faster, but now you can get the GTX 780, three great games, and spend $50 less.

The GTX 770 is now poised to make a case for itself against the R9 280X as well with its $70 drop.  The R9 280X / HD 7970 GHz Edition was definitely the better option with a $100 price delta, but with only $30 now separating the two competing cards, plus the three free games, the advantage again likely falls to NVIDIA.

Finally, the price point of the GTX 780 Ti is interesting - if NVIDIA is smart they are pricing it based on comparable performance to the R9 290X from AMD.  If that is the case, then we can guess the GTX 780 Ti will be a bit faster than the Hawaii card, while likely being quieter and using less power too.  Oh, and again, the three game bundle. 

NVIDIA did NOT announce a GTX TITAN price drop, which might surprise some people.  I think the answer as to why will come with the launch of the GTX 780 Ti next month, but from what I have been hearing over the last couple of weeks, NVIDIA can't make the cards fast enough to satisfy demand, so reducing margin there just didn't make sense.

NVIDIA has taken a surprisingly aggressive stance here in the discrete GPU market.  The need to address and silence critics who think the GeForce brand is being damaged by the AMD console wins is obviously potent inside the company.  The good news for us, and the gaming community as a whole, is that this just means better products and better value for graphics card purchases this holiday.

NVIDIA says these price drops will be live by tomorrow.  Enjoy!

Source: NVIDIA

NVIDIA Launches WHQL Drivers for Battlefield and Batman

Subject: General Tech, Graphics Cards | October 23, 2013 - 12:21 AM |
Tagged: nvidia, graphics drivers, geforce

Mid-June kicked up a storm of poop across the internet when IGN broke the AMD optimizations for Frostbite 3. It was reported that NVIDIA would not receive sample code for those games until after they launched. The article was later updated with a statement from AMD: "... the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release."

Now, I assume, the confusion was caused by the then-unannounced Mantle.

nvidia-geforce.png

And, as it turns out, NVIDIA did receive the code for Battlefield 4 prior to launch. Monday, the company launched its 331.58 WHQL-certified drivers, which are optimized for Batman: Arkham Origins and Battlefield 4. According to the release notes, you should even be able to use SLI out of the gate. If, on the other hand, you are a Civilization V player, HBAO+ should enhance your shadowing.

They also added a DX11 SLI profile for Watch Dogs... awkwarrrrrd.

Check out the blog at GeForce.com for a bit more information, read the release notes, or just head over to the drivers page. If you have GeForce Experience installed, it has probably already asked you to update.

Source: NVIDIA

NVIDIA Announces G-Sync, Variable Refresh Rate Monitor Technology

Subject: Graphics Cards | October 18, 2013 - 10:52 AM |
Tagged: variable refresh rate, refresh rate, nvidia, gsync, geforce, g-sync

UPDATE: I have posted a more in-depth analysis of the new NVIDIA G-Sync technology: NVIDIA G-Sync: Death of the Refresh Rate.  Thanks for reading!!

UPDATE 2: ASUS has announced the G-Sync enabled version of the VG248QE will be priced at $399.

During a gaming event held in Montreal, NVIDIA unveiled a new technology for GeForce gamers that the company hopes will revolutionize the PC and displays.  Called NVIDIA G-Sync, this new feature combines changes to the graphics driver with changes to the monitor to alter the way refresh rates and Vsync have worked for decades.

gsync.jpg

With standard LCD monitors, gamers are forced to choose between a tear-free experience with Vsync enabled or playing with substantial visual anomalies in order to get the best and most efficient frame rates.  G-Sync changes that by allowing a monitor to run at refresh rates other than 60 Hz, 120 Hz, 144 Hz, etc. without the horizontal tearing normally associated with turning off Vsync.  Essentially, G-Sync allows a properly equipped monitor to run at a variable refresh rate, which will improve the gaming experience in interesting ways.

gsync2.jpg

This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work.  The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display, and as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes.  In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design, so you can obviously expect this to function only with NVIDIA GPUs.

DisplayPort is the only input option currently supported. 

It turns out NVIDIA will actually be offering retrofit kits for current owners of the VG248QE at a yet-to-be-disclosed cost.  The first retail G-Sync units will ship as a monitor plus retrofit kit, as production was just a bit behind.

Using a monitor with a variable refresh rate allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing.  It can also display 133 FPS at 133 Hz without tearing.  Anything below the monitor's 144 Hz maximum refresh rate will run at full speed without the tearing associated with the lack of vertical sync.
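The numbers above are just frame-time arithmetic.  A short sketch of why a fixed 60 Hz panel with Vsync hurts a 55 FPS game while a variable refresh panel does not:

```python
import math

def frame_time_ms(fps: float) -> float:
    """Time each frame needs on screen for perfectly smooth animation."""
    return 1000.0 / fps

# With G-Sync the panel simply refreshes when the frame is ready:
print(f"{frame_time_ms(55):.1f} ms")   # 18.2 ms -> panel runs at 55 Hz
print(f"{frame_time_ms(133):.1f} ms")  # 7.5 ms  -> panel runs at 133 Hz

# With classic Vsync on a fixed 60 Hz panel, a frame can only change at a
# 16.7 ms refresh boundary, so a frame that takes 18.2 ms to render is held
# until the *second* refresh:
refresh = 1000.0 / 60
waited = math.ceil(frame_time_ms(55) / refresh) * refresh
print(f"{waited:.1f} ms")  # 33.3 ms, i.e. the game visibly drops to 30 FPS
```

That quantization to 16.7 ms multiples is exactly the stutter-versus-tearing trade-off G-Sync is meant to eliminate.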

vsync04.jpg

The technology NVIDIA is showing here is impressive in person, and that is really the only way to understand the difference.  High-speed cameras and captures will help, but much like 3D Vision, this is a feature that needs to be seen to be appreciated.  How users will react to that roadblock remains to be seen.

Features like G-Sync show the gaming world that, without the restrictions of consoles, there are quite a few revolutionary steps that can be taken to maintain the PC gaming advantage well into the future.  4K displays were a recent example, and now NVIDIA G-Sync adds to the list.

Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we will be joined in-studio by NVIDIA's Tom Petersen to discuss G-Sync, how it was developed, and the various ramifications the technology will have in PC gaming.  You'll find it all on our PC Perspective Live! page on Monday, but you can sign up for our "live stream mailing list" as well to get notified in advance!

NVIDIA G-Sync Live Stream

11am PT / 2pm ET - October 21st

PC Perspective Live! Page

Source: NVIDIA
Manufacturer: Various

Summary of Events

In January of 2013 I revealed a new testing methodology for graphics cards that I dubbed Frame Rating.  At the time I was only able to talk about the process, which uses capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months we started to release data and information using this technology.  I followed up the story in January with a collection of videos that displayed some of the captured footage and the kinds of performance issues and anomalies we were able to easily find.

My first full test results were published in February to quite a bit of stir, and then in late March we finally released Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing, which dramatically changed the way graphics cards and gaming performance are discussed and evaluated.

Our testing proved that AMD CrossFire was not improving gaming experiences in the same way that NVIDIA SLI was.  Also, we showed that other testing tools like FRAPS were inadequate for exposing this problem.  If you are at all unfamiliar with this testing process or the results it showed, please check out the Frame Rating Dissected story above.
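The core idea behind the capture approach can be sketched in a few lines.  This is an illustrative model, not PCPer's actual analysis code, and the threshold value is a made-up cutoff: each captured frame occupies some number of scanlines on screen, and frames that get only a sliver of the display ("runts") inflate FRAPS-style FPS numbers without contributing any visible animation.

```python
# Illustrative cutoff for a "runt" frame, in scanlines; not PCPer's real value.
RUNT_THRESHOLD = 21

def effective_frames(scanlines_per_frame):
    """Count frames that actually contribute animation, excluding runts."""
    return sum(1 for lines in scanlines_per_frame if lines >= RUNT_THRESHOLD)

# Alternating full/runt frames, the pattern seen in some CrossFire captures:
captured = [530, 10, 540, 8, 525, 12]
print(len(captured), effective_frames(captured))  # 6 captured, 3 effective
```

A FRAPS-style counter would report all six frames; the capture-based count shows the player effectively saw half that.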

At the time, we tested the 5760x1080 resolution using AMD Eyefinity and NVIDIA Surround but found there were too many issues with our scripts and the results they were presenting to give reasonably assured performance metrics.  Running AMD + Eyefinity was obviously causing some problems, but I wasn't quite able to pinpoint what they were or how severe they might have been.  Instead I posted graphs like this:

01.png

We were able to show GeForce GTX 680 performance and scaling in SLI at 5760x1080, but we gave results for the Radeon HD 7970 GHz Edition only in a single-GPU configuration.


Since those stories were released, AMD has been very active.  At first they were hesitant to believe our results and called into question both our processes and gamers' ability to really see the frame rate issues we were describing.  However, after months of work and pressure from quite a few press outlets, AMD released a 13.8 beta driver with a Frame Pacing option in its 3D controls that evenly spaces out frames in multi-GPU configurations, producing a smoother gaming experience.

02.png

The results were great!  The new AMD driver produced very consistent frame times and put CrossFire on a similar playing field to NVIDIA's SLI technology.  There were limitations, though: the driver only fixed DX10/DX11 games and only addressed resolutions of 2560x1440 and below.
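The mechanism behind frame pacing is simple to model.  This is an illustrative sketch, not AMD's driver code: instead of presenting each frame the instant it finishes rendering, the driver delays fast frames so that presentation intervals approach a steady target.

```python
def pace(render_done_ms, target_interval_ms):
    """Return paced presentation timestamps given render-complete times."""
    paced, last = [], None
    for t in render_done_ms:
        # Present no sooner than one target interval after the previous frame.
        present = t if last is None else max(t, last + target_interval_ms)
        paced.append(present)
        last = present
    return paced

# AFR CrossFire without pacing: frames finish in an uneven 5/28 ms cadence.
done = [0, 5, 33, 38, 66, 71]
paced = pace(done, target_interval_ms=16.5)
intervals = [round(b - a, 1) for a, b in zip(paced, paced[1:])]
print(intervals)  # a steady 16.5 ms between every presented frame
```

The average throughput is unchanged; only the spacing improves, which is exactly why FRAPS-style FPS numbers looked fine before and after the fix while the experience differed dramatically.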

But the story doesn't end there.  CrossFire and Eyefinity are still very important in a lot of gamers' minds, and with constant price drops on 1920x1080 panels, more and more gamers are taking (or thinking of taking) the plunge into the world of Eyefinity and Surround.  As it turns out, there are some more problems and complications with Eyefinity and high-resolution gaming (multi-head 4K) cropping up that deserve discussion.

Continue reading our investigation into AMD Eyefinity and NVIDIA Surround with multi-GPU solutions!!

Holy Free Non-Free Free To Play NVIDIA!

Subject: General Tech, Graphics Cards | August 31, 2013 - 04:30 PM |
Tagged: geforce, F2P, bundle

If you read Jeremy's post yesterday and actually are a fan of in-game currency, then how about a new bundle? If you pick up a GeForce GTX 650 or a participating GTX 700M-based notebook, you will receive a total of $75 split between three free-to-play (F2P) titles. One of the titles, Warframe, is also optimized for NVIDIA hardware with the inclusion of PhysX support, primarily for particle effects, it would seem.

The bundle includes:

  • Warframe - 465 Platinum (Normally $25)
  • Dungeons and Dragons: Neverwinter - 1,000,000 Astral Diamonds (Normally $25)
  • Marvel Heroes - 2600 Gold (Normally $25)

NVIDIA stresses that you must make your purchase (be it a system containing a GTX 650, a discrete add-in GTX 650, or a laptop containing a GTX 700M) from one of the participating merchants. Codes will not be provided if the retailer, or 'e-tailer', is not a partner for this program.

Press blast after the break.

Source: NVIDIA
Manufacturer: MSI

A New TriFrozr Cooler

Graphics cards are by far the most interesting topic we cover at PC Perspective.  Between the battles between NVIDIA and AMD and the competition among board partners like EVGA, ASUS, MSI, and Galaxy, there is rarely a moment when we don't have a different GPU product of some kind on an active test bed.  Both NVIDIA and AMD release reference cards (for the most part) with each new product launch, and it then takes some time for board partners to really put their own stamp on the designs.  Other than the figurative stamp that is the sticker on the fan.

IMG_9886.JPG

One of the companies that has recently become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls under the Lightning brand.  As far back as the MSI GTX 260 Lightning and as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, custom power design, and a good amount of over-engineering to produce a card with few rivals.

Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May.  Based on the same GK110 GPU as the GTX TITAN, with two fewer SMX units, the GTX 780 is easily the second fastest single-GPU card on the market.  MSI is hoping to make enthusiasts even more excited about the card with the Lightning design, which brings a brand new TriFrozr cooler, an impressive power design, and overclocking capabilities that both basic users and LN2 junkies can take advantage of.  Just what DO you get for $750 these days?

Continue reading our review of the MSI GeForce GTX 780 Lightning graphics card!!

New NVIDIA 326.41 Beta Graphics Drivers Add Shield PC Game Streaming Support

Subject: Graphics Cards | August 2, 2013 - 02:50 AM |
Tagged: graphics drivers, nvidia, shield, pc game streaming, gaming, geforce

NVIDIA recently released a new set of beta GeForce graphics card drivers targeted at the 400, 500, 600, and 700 series GPUs. The new version 326.41 beta drivers feature the same performance tweaks as the previous 326.19 drivers while baking in beta support for streaming PC games to NVIDIA’s Shield gaming portable from a compatible GeForce graphics card (GTX 650 or better). The new beta release is also the suggested version for those running the Windows 8.1 Preview.

NVIDIA has included the same performance tweaks as version 326.19. The tweaks offer up to 19% performance increases, depending on the particular GPU setup. For example, users running a GTX 770 will see as much as 15% better performance in Dirt: Showdown and 6% in Tomb Raider. Performance improvements are even higher for GTX 770 SLI setups, with boosts in Dirt: Showdown and F1 2012 of 19% and 11% respectively. NVIDIA has also added SLI profiles for Splinter Cell: Blacklist and Batman: Arkham Origins.

The NVIDIA Shield launched recently and reviews are making the rounds around the Internet. One of the exciting features of the Shield gaming handheld is the ability to stream PC games from a PC with NVIDIA graphics card to the Shield over Wi-Fi.

The 326.41 drivers improve performance across several games on the GTX 770.

The other major change is improved support for tiled 4K displays: monitors with 4K resolutions that are essentially made of two separate panels and even show up to the OS as two separate displays despite being a single physical monitor. Using DisplayPort MST and tiled displays allows monitor manufacturers to deliver 4K at higher refresh rates.

Interested GeForce users can grab the latest beta drivers from the NVIDIA website or via the links below:

Source: Tech Spot