NVIDIA Crates the GeForce GTX 690

Subject: Graphics Cards | April 30, 2012 - 03:17 PM |
Tagged: nvidia, kepler, GTX 690, geforce

There has been a lot of excitement building around the GeForce GTX 690 dual-GPU graphics card, peaking with CEO Jen-Hsun Huang's keynote this weekend that revealed all kinds of details.  We got our review sample of the new graphics beast this morning and needless to say NVIDIA felt the need to give this $999 video card a special ride.

box01.JPG

With the imprint of "Caution: Weapons Grade Gaming Power" on the outside of the crate, NVIDIA obviously wanted to give us a chance to use the pry bar sent last week.  And use it we did.

box04.5.JPG

There isn't much more we can say about the card itself but I can tell you that the fit and finish of the design is just impressive to see in person. 

box15.JPG

I have included quite a few more photos of the unboxing and the card itself if you continue to the full post right here!!

NVIDIA Announces GeForce Experience Cloud Service for Quality Presets

Subject: Graphics Cards | April 29, 2012 - 01:25 PM |
Tagged: nvidia, geforce experience, geforce

After spilling the beans about the new GeForce GTX 690 card at the GeForce LAN in Shanghai, NVIDIA's CEO Jen-Hsun Huang also let loose on the GeForce Experience, a cloud-based service that promises to simplify the configuration of game settings based on your hardware.

The idea is incredibly simple but equally impressive in its scope: based on your particular hardware configuration including the processor, memory capacity, storage speed and of course the graphics card, the NVIDIA tool will set the optimal in-game settings and resolution.  The breadth of being able to cover ALL the available hardware in the enthusiast market and even the mobile field is enormous, but NVIDIA is confident that they have the personnel and testing systems in place to cover it all.

nvlogo.png

The process is pretty straightforward - when a user opens a game for the first time they will be presented with a screen that shows the default or current game settings side by side with the settings recommended by NVIDIA's GeForce Experience.  You can simply hit apply, the configuration files will be updated to the new settings and you are ready to start gaming.  Of course, users can simply use the GFE settings as a "base" and then modify them as they see fit.  
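To make the idea concrete, here is a minimal sketch of how a service like this might map detected hardware to a recommended preset.  The GPU scores, preset table and scoring logic are purely illustrative assumptions on our part - NVIDIA has not detailed how GeForce Experience actually makes its decisions.

```python
# Hypothetical sketch of a GeForce Experience-style recommendation step.
# The scores, thresholds and presets below are made up for illustration;
# they are not NVIDIA's actual data or logic.

# (minimum hardware score) -> recommended settings for one particular game
PRESETS = [
    (80, {"resolution": "2560x1600", "quality": "Ultra",  "aa": "4x MSAA"}),
    (60, {"resolution": "1920x1080", "quality": "High",   "aa": "FXAA"}),
    (40, {"resolution": "1920x1080", "quality": "Medium", "aa": "Off"}),
    (0,  {"resolution": "1366x768",  "quality": "Low",    "aa": "Off"}),
]

# Illustrative relative performance scores for a few GPUs
GPU_SCORES = {"GTX 680": 90, "GTX 560 Ti": 55, "GT 540M": 25}

def recommend_settings(gpu_name, cpu_cores, ram_gb):
    """Pick the highest preset the detected hardware clears."""
    score = GPU_SCORES.get(gpu_name, 0)
    # Crude penalty when the rest of the system might hold the GPU back
    if cpu_cores < 4 or ram_gb < 4:
        score -= 10
    for threshold, preset in PRESETS:
        if score >= threshold:
            return preset
    return PRESETS[-1][1]  # fall back to the lowest preset

print(recommend_settings("GT 540M", cpu_cores=2, ram_gb=4))
# -> {'resolution': '1366x768', 'quality': 'Low', 'aa': 'Off'}
```

The real service will obviously be far more granular - per-game, per-setting and per-driver - but the recommend-then-apply flow is the same one described above.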

This will be particularly useful for the mobile market, which is usually never addressed by the in-game auto configurations from companies like Valve, resulting in incredibly low frame rates or a horrible 800x600-style experience.  Now users with machines running the somewhat unknown GT 540M will have the option to get some more modern, and hopefully realistic, settings easily applied.  

Obviously the goal is to make gaming on the PC as simple as gaming on consoles, and this type of service will definitely move the industry in the right direction.  With the beta supposedly starting in June, this is going to demand a long-term vision and constant, vigilant updates, as new games, new driver revisions and user upgrades will constantly change what the "optimal" settings will be.  We wish NVIDIA the best of luck to be sure, and we will be testing out the service in the not too distant future.  

NVIDIA Announces dual-GPU Kepler GeForce GTX 690

Subject: Graphics Cards | April 28, 2012 - 11:55 PM |
Tagged: nvidia, kepler, jen-hsun huang, hd 7990, GTX 690, gtx 680, geforce, 7990

During a keynote presentation at GeForce LAN 2012 being held in Shanghai, NVIDIA's CEO Jen-Hsun Huang unveiled what many of us have been theorizing would be coming soon: the dual-GPU variant of the Kepler architecture, the GeForce GTX 690 graphics card.

Though reviews aren't going to be released yet, Huang unveiled pretty much all of the information we need to figure it out.  With the full specifications listed as well as details about the stunning new design of the card and cooler, the GTX 690 is without a doubt going to be the fastest graphics card on the market when it goes on sale next month.

GeForce_GTX_690_style-299-processed.jpg

The GeForce GTX 690 4GB card is based on a pair of GK104 chips, each sporting 1536 CUDA cores, basically identical to the ones used in the GeForce GTX 680 2GB cards released in March.  The base clock speed of these parts is slightly lower at 915 MHz but the "typical" Boost clock is set as high as 1019 MHz, pushing it pretty close to the performance of the single GPU solutions.  With a total of 3072 processing cores, the GTX 690 will have insane amounts of compute horsepower.

Each GPU will have access to 2GB of independent frame buffer still running at 6 Gbps, for a grand total of 4GB on the card.  
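For a rough sense of scale, the quick math below works out the theoretical throughput from the numbers above.  The two-operations-per-core-per-clock factor (a fused multiply-add) and the 256-bit memory bus per GPU are carried over from the GTX 680, not figures NVIDIA quoted for the GTX 690, so treat this as a back-of-the-envelope estimate.

```python
# Back-of-the-envelope math from the GTX 690 numbers above.
# Assumptions (not quoted in the announcement): 2 FLOPs per core per clock
# and a 256-bit memory bus per GPU, same as the GTX 680.

cores_per_gpu = 1536
gpus = 2
base_clock_ghz = 0.915      # 915 MHz base clock
mem_rate_gbps = 6           # effective data rate per memory pin
mem_bus_bits = 256          # assumed, matching the GTX 680

sp_gflops = cores_per_gpu * gpus * 2 * base_clock_ghz   # ~5622 GFLOPS
mem_bw_gbs = mem_rate_gbps * mem_bus_bits / 8           # ~192 GB/s per GPU

print(f"Theoretical single precision: {sp_gflops:,.0f} GFLOPS at base clock")
print(f"Memory bandwidth: {mem_bw_gbs:.0f} GB/s per GPU")
```

Even at the base clock that works out to more than 5.6 TFLOPS of single precision compute on one board, before GPU Boost kicks in.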

GeForce_GTX_690_3qtr_bare_PCB.jpg

Sitting between the two GPUs will be a PCI Express 3.0 capable bridge chip from PLX supporting full x16 lanes to each GPU and a full x16 back to the host system. 

In terms of power requirements, the GTX 690 will use a pair of 8-pin connectors and will have a TDP of 300 watts - actually not that high considering the TDP of the GTX 680 is 195 watts on its own.  It is obvious that NVIDIA is going to be pulling the very best chips for this card, those that can run at clock speeds over 1 GHz with minimal leakage.  
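That connector layout also leaves some breathing room.  The per-source limits below (75 watts from the slot, 150 watts per 8-pin plug) are the usual PCI Express specification figures rather than numbers NVIDIA quoted, but they show roughly how much headroom the board has for GPU Boost and overclocking.

```python
# Quick power-budget check for the connector layout described above.
# The 75 W slot and 150 W per 8-pin figures are the standard PCI Express
# spec limits, not values taken from NVIDIA's announcement.

slot_w = 75
eight_pin_w = 150
connectors = 2
board_tdp_w = 300

available_w = slot_w + connectors * eight_pin_w   # 375 W total input budget
headroom_w = available_w - board_tdp_w            # 75 W above the rated TDP

print(f"Available: {available_w} W, TDP: {board_tdp_w} W, headroom: {headroom_w} W")
```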

Continue reading for more details and photos of the NVIDIA GeForce GTX 690 4GB Graphics Card!!

Author:
Manufacturer: Galaxy

Retail Ready

When the NVIDIA GeForce GTX 680 launched in March we were incredibly impressed with the performance and technology that the GPU was able to offer while also being power efficient.  Fast forward nearly a month and we are still having problems finding the card in stock - a HUGE negative towards the perception of the card and the company at this point.

Still, we are promised by NVIDIA and its partners that they will soon have more on shelves, so we continue to look at the performance configurations and prepare articles and reviews for you.  Today we are taking a look at the Galaxy GeForce GTX 680 2GB card - their most basic model that is based on the reference design.

If you haven't done all the proper reading about the GeForce GTX 680 and the Kepler GPU, you should definitely check out my article from March that goes into a lot more detail on that subject before diving into our review of the Galaxy card.

The Card, In Pictures

01.jpg

The Galaxy GTX 680 is essentially identical to the reference design with the addition of some branding along the front and top of the card.  The card is still a dual-slot design, still requires a pair of 6-pin power connections and uses a very quiet fan in relation to the competition from AMD.

Continue reading our review of the Galaxy GeForce GTX 680 2GB Graphics Card!!

PC Perspective Live Review Recap: NVIDIA GeForce GTX 680

Subject: Graphics Cards | March 24, 2012 - 05:51 PM |
Tagged: video, tom petersen, nvidia, live review, gtx 680, geforce

On the day of the GeForce GTX 680 launch, we hosted a "Live Review" to discuss the new product features and performance while also taking questions from a live chat room and via Twitter.  NVIDIA's own Tom Petersen stopped by the offices to talk with us and to show off the hardware features with some live demos of GPU Boost, overclocking and quite a bit more.

If you haven't seen the video yet, you should definitely do so; Tom does a great job explaining the new technology involved with the Kepler GPU.  One caveat: the recording process was a bit off and the video actually starts a few minutes AFTER we began the live stream.  Sorry!

For more information on other upcoming events you can follow PC Perspective on Twitter, Facebook or Google+ or just check http://pcper.com/live for the latest schedule!

Author:
Subject: Mobile
Manufacturer: Nvidia

Introduction, GT 640M Basics

timelimeultra1.jpg

About two months ago I wrote a less than enthusiastic editorial about ultrabooks that pointed out several weaknesses in the format. One particular weakness in all of the products we’ve seen to date is graphics performance. Ultrabooks so far have lacked the headroom for a discrete graphics component and have instead been saddled with a low-performance version of the already so-so Intel HD 3000 IGP.

This is a problem. Ultrabooks are expensive, yet they so far are less capable of displaying rich 3D graphics than your typical smartphone or tablet. Casual gamers will notice this and take their gaming time and dollars in that direction. Early leaked information about Ivy Bridge indicates that there has been a substantial increase in graphics capability, but the information available so far is centered on the desktop. The version that will be found in ultrabooks is unlikely to be as quick.

Today we’re looking at a potential solution - the Acer Aspire Timeline Ultra M3 equipped with Nvidia’s new GT 640M GPU. This is the first laptop to launch with a Kepler based GPU. It is also an ultrabook, albeit one with a 15.6” display. Otherwise, it isn’t much different from other products on the market, as you can see below.

aceraspiretimelimeultram3specs.png

This is likely to be the only Kepler based laptop on the market for a month or two. The reason for this is Ivy Bridge - most of the manufacturers are waiting for Intel’s processor update before they go to the trouble of designing new products.

Continue reading our review of the NVIDIA Kepler GT 640M GPU in an Ultrabook!!

Author:
Subject: Editorial
Manufacturer: NVIDIA

Quarter Down but Year Up

Yesterday NVIDIA released their latest financial results for Q4 2012 and FY2012.  There was some good and bad mixed in the results, but overall it was a very successful year for NVIDIA.

Q4 saw gross revenue top $953.2 million US with a net income of $116 million US.  This is about $53 million less in gross revenue and $62 million down in net income as compared to last quarter.  There are several reasons why this happened, but the majority of it appears to be due to the hard drive shortage affecting add-in sales.  Simply put, the increase in hard drive prices caused most OEMs to take a good look at the price points of the entire system, and oftentimes they would cut out the add-in graphics and just use integrated.

tegra3.jpg

Tegra 3 promises a 50% increase in revenue for NVIDIA this coming year.

Two other reasons for the lower than expected quarter were the start of the transition to 28 nm products based on Kepler and the yields on that process.  They are ramping up production on 28 nm and slowing down 40 nm, but yields on 28 nm are not where they expected them to be, and there is also a shortage of wafer starts for that line.  This had a pretty minimal effect overall on Q4, but it will be one of the prime reasons why revenue looks like it will be down in Q1 2013. 

Read the rest of the article here.

Author:
Manufacturer: Galaxy

Four Displays for Under $70

Running multiple displays on your PC is becoming a trend that everyone is trying to jump on board with thanks in large part to the push of Eyefinity from AMD over the past few years.  Gaming is a great application for multi-display configurations but in truth game compatibility and game benefits haven't reached the level I had hoped they would by 2012.  But while gaming still has a way to go, the consumer applications for having more than a single monitor continue to expand and cement themselves in the minds of users.

Galaxy is the only NVIDIA partner that is really taking this market seriously with an onslaught of cards branded as MDT, Multiple Display Technology.  Using non-NVIDIA hardware in conjunction with NVIDIA GPUs, Galaxy has created some very unique products for consumers like the recently reviewed GeForce GTX 570 MDT.  Today we are going to be showing you the new Galaxy MDT GeForce GT 520 offering that brings support for a total of four simultaneous display outputs to a card with a reasonable cost of under $120.

The Galaxy MDT GeForce GT 520

Long time readers of PC Perspective already likely know what to expect based on the GPU we are using here but the Galaxy MDT model offers quite a few interesting changes.

01.jpg

02.jpg

The retail packaging clearly indicates the purpose of this card for users looking at running more than two displays.  The GT 520 is not an incredibly powerful GPU when it comes to gaming but Galaxy isn't really pushing the card in that manner.  Here are the general specs of the GPU for those that are interested, with a quick bandwidth estimate after the list:

  • 48 CUDA cores
  • 810 MHz core clock
  • 1GB DDR3 memory
  • 900 MHz memory clock
  • 64-bit memory bus width
  • 4 ROPs
  • DirectX 11 support
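From that spec list, a rough memory bandwidth estimate is easy to work out.  Treating the 900 MHz figure as the DDR command clock (so roughly 1800 MT/s effective) is our assumption; if 900 MHz were already the effective rate, the result would be half as large.

```python
# Rough bandwidth estimate from the GT 520 spec list above.
# Assumption: 900 MHz is the DDR command clock, so ~1800 MT/s effective.

mem_clock_mhz = 900
effective_mts = mem_clock_mhz * 2       # DDR transfers twice per clock
bus_width_bits = 64

bandwidth_gbs = effective_mts * 1e6 * bus_width_bits / 8 / 1e9
print(f"~{bandwidth_gbs:.1f} GB/s of memory bandwidth")   # ~14.4 GB/s
```

Either way, that is a small fraction of what a gaming card offers, which reinforces the point that this card is about driving desktop displays rather than frame rates.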

Continue reading our review of the Galaxy MDT GeForce GT 520 graphics card!!

Author:
Manufacturer: Gigabyte

Guess what? Overclocked.

The NVIDIA GTX 580 GPU, based on the GF110 Fermi architecture, is old but it isn't forgotten.  Released in November of 2010, it held the single GPU performance crown for NVIDIA for more than a year before it was usurped by AMD and the Radeon HD 7970 just this month.  Still, the GTX 580 is a solid high-end enthusiast graphics card that has widespread availability and custom designed, overclocked models from numerous vendors, making it a viable option.

Gigabyte sent us this overclocked and custom cooled model quite a while ago but we had simply fallen behind with other reviews until just after CES.  In today's market the card has a bit of a different role to fill - it surely won't be able to surpass the new AMD Radeon HD 7970, but can it fight the good fight and keep NVIDIA's current lineup of GPUs more competitive until Kepler finally shows itself?

The Gigabyte GTX 580 1.5GB Super Overclock Card

Given the age of the GTX 580 design, Gigabyte has had plenty of time to perfect its PCB and cooler.  This model, the Super Overclock (GV-N580SO-15I), comes in well ahead of the standard reference speeds of the GTX 580 but sticks to the same 1.5 GB frame buffer.

01.jpg

The clock speed is set at 855 MHz core and 1025 MHz memory, compared to the 772 MHz core and 1002 MHz memory clocks of the reference design.  That is a very healthy 10% core clock difference that should equate to nearly that big of a gap in gaming performance where the GPU is the real bottleneck.  
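For reference, the quick arithmetic behind those figures:

```python
# Overclock margins of the Gigabyte Super Overclock versus reference clocks.
core_oc, core_ref = 855, 772     # MHz
mem_oc, mem_ref = 1025, 1002     # MHz

core_gain = (core_oc - core_ref) / core_ref * 100   # ~10.8 %
mem_gain = (mem_oc - mem_ref) / mem_ref * 100       # ~2.3 %

print(f"Core: +{core_gain:.1f} %, Memory: +{mem_gain:.1f} %")
```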

Continue reading our review of the Gigabyte GTX 580 1.5GB Super Overclock graphics card!!

Battlefield 3 Frame Rate Drop Issue with GeForce GPUs

Subject: Graphics Cards | December 28, 2011 - 09:24 PM |
Tagged: gtx, geforce, bf3

Every once in a while we come across a gaming issue where, when we approach those responsible for it - NVIDIA, AMD, the game developer - they seem as lost as we do.  For the last few days I have been banging my head on the table trying to figure out an issue with GeForce GTX graphics cards and Battlefield 3, and I am hoping that some of YOU might have seen it and can confirm.

While testing our new X79-based GPU test bed we continued to find that while playing Battlefield 3, frame rates would drop from 30+ to ~10 while running at 2560x1600 and Ultra quality presets.  It could happen when walking down an empty hallway or in the middle of a huge dramatic shootout with some enemies.  And sometimes, the issue would reverse and the frame rate would again jump back up to 30+ FPS.

08.jpg

A 10 frame per second tank?  No thanks...

Even more odd, and something the normal user doesn't monitor, the power consumption of the system would drop significantly during this time.  At 30+ FPS the power draw might be 434 watts, while at the ~10 FPS level it would draw 100 watts less!  The first theory was that this was the GPU going into a lower "p-state" due to overheating or some other bug, but when monitoring our GPU-Z logs we saw no clock speed decreases and temperatures never went above 75C - pretty tame for a GPU.
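If you want to run the same sanity check on your own system, something like the sketch below will do it.  GPU-Z can log its sensors to a CSV file; the column names vary by version, so the ones used here are illustrative assumptions and may need adjusting for your log.

```python
# Minimal sketch: scan a GPU-Z sensor log (CSV) for any samples where the
# core clock fell or the temperature spiked, which would point to p-state
# throttling.  Column names are assumptions - check your own log's header.

import csv

def find_throttle_events(log_path, min_clock_mhz=700, max_temp_c=85):
    """Return (timestamp, clock, temp) rows that look like throttling.

    700 MHz sits just below the GTX 580's 772 MHz stock clock, so any
    sample under it would mean the GPU left its full-speed state.
    """
    suspects = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            clock = float(row["GPU Core Clock [MHz]"])
            temp = float(row["GPU Temperature [C]"])
            if clock < min_clock_mhz or temp > max_temp_c:
                suspects.append((row["Date"], clock, temp))
    return suspects
```

In our case a scan like this comes back empty - the clocks hold steady and the temperatures stay at or below 75C even while the frame rate and power draw tank, which is exactly why this bug is so puzzling.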

To demonstrate this phenomenon we put together a quick video. 

In the video, you are seeing the "tearing" of Vsync in a much more dramatic fashion because of our capture method.  We actually were outputting a 2560x1600 signal (!!) to an external system to be recorded locally at a very high bit rate.  Unfortunately, we could only muster a ~30 FPS capture frame rate which, coupled with the 60 Hz signal being sent, results in a bit of a double up on the tearing you might usually see.  Still, the FRAPS-reported frame rates are accurate, and we use an external system to capture the video to remove the possibility of any interference on performance during the capture process.

The hardware used in this video was actually based on an ASUS X58 motherboard and a Nehalem Core i7-965 processor.  But wasn't I just talking about an X79 rig?  Yes, but I rebuilt our old test bed to make sure this bug was NOT related to X79 or Sandy Bridge-E.  The systems that exhibited the issue were:

  • Intel Core i7-3960X
  • ASUS P9X79 Pro
  • 16GB DDR3-1600
  • 600GB VelociRaptor HDD
  • Windows 7 x64 SP1
  • GeForce GTX 580 (two different cards tested)
  • 290.53 Driver

Also:

  • Intel Core i7-965
  • ASUS X58 WS 
  • 6GB DDR3-1600
  • 600GB VelociRaptor HDD
  • Windows 7 x64 SP1
  • GeForce GTX 580 (two different cards tested)
  • 290.53 Driver

For me, this is only occurring at 2560x1600 though I am starting to see more reports of the issue online.

  • Another 560 ti and BF3 FPS Low Or Drop!
    • Well I just Installed my 2nd evga 560 ti DS running SLI and When I play battlefield 3 i get about 60 to 90 fps then drops at
      20 to 30. Goes Up and down, I look at the evga precision looks like each gpu is running at 40% each and changes either up or down.
      Temp. is under 60 degrees c.
  • GTX 560 Ti dramatic FPS drops on BF3 only
    • "having any setting on Ultra will cue dramatic and momentary fps drops into the 30's. if i set everything to High, i will stay above 70 fps with the new beta 285.79 drivers released today (which i thought would fix this problem but didn't). i've been monitoring things with Afterburner and i've noticed that GPU usage will also drop at the same time these FPS drops happen. nothing is occurring in the game or on the screen to warrant these drops, FPS will just drop even when nothing is going on or exploding and i'm not even moving or looking around, just idle. they occur quite frequently as well."
  • BF3 Frame Drops
    • "When i use 4xAA i get abnormal framedrops, even while nothing is going on, on the screen.
      The weird thing is that, when it drops, it always drops to 33/32fps, not higher, not lower.
      It usually happens for a few seconds."
  • BF3 @ 2560x1600 Ultra Settings Preset Unplayable
    • "I know its a beta, but i haven't heard any problems yet about framedrops.
      Sometimes my frames drop from 75fps way back to 30/20 fps, even when nothing is going on, on the screen."

So what gives?  Is this a driver issue?  Is it a Battlefield 3 issue?  Many of these users are running at resolutions other than the 2560x1600 that I am seeing it at - so either there is another problem for them or it affects different cards at different quality levels.  It's hard to say, but doing a search for "radeon bf3 frame drop" pulls up much less incriminating evidence that gamers on that side of the fence are having similar discussions.  

I have been talking with quite a few people at NVIDIA about this and while they are working hard to figure out the source of the frame rate inconsistencies, those of us with GeForce GTX cards may just want to back off and play at a lower resolution or lower settings until the fix is found.