Author:
Manufacturer: Galaxy

Four Displays for Under $70

Running multiple displays on a PC has become a trend that more and more users are jumping on, thanks in large part to AMD's push of Eyefinity over the past few years.  Gaming is a great application for multi-display configurations, but in truth game compatibility and in-game benefits haven't reached the level I had hoped they would by 2012.  While gaming still has a way to go, though, the consumer applications for having more than a single monitor continue to expand and cement themselves in the minds of users.

Galaxy is the only NVIDIA partner really taking this market seriously, with an onslaught of cards branded MDT, for Multiple Display Technology.  Using non-NVIDIA hardware in conjunction with NVIDIA GPUs, Galaxy has created some unique products for consumers, like the recently reviewed GeForce GTX 570 MDT.  Today we are showing you the new Galaxy MDT GeForce GT 520, which brings support for a total of four simultaneous display outputs to a card with a reasonable cost of under $120.

The Galaxy MDT GeForce GT 520

Long-time readers of PC Perspective likely already know what to expect based on the GPU in use here, but the Galaxy MDT model offers quite a few interesting changes.

01.jpg

02.jpg

The retail packaging clearly indicates that this card is aimed at users looking to run more than two displays.  The GT 520 is not an incredibly powerful GPU when it comes to gaming, but Galaxy isn't really pushing the card in that manner.  Here are the general specs of the GPU for those who are interested, with a quick bandwidth estimate after the list:

  • 48 CUDA cores
  • 810 MHz core clock
  • 1GB DDR3 memory
  • 900 MHz memory clock
  • 64-bit memory bus width
  • 4 ROPs
  • DirectX 11 support

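Those figures also tell you roughly how much memory bandwidth the card has to work with. The quick estimate below assumes the 900 MHz figure is the base DDR3 clock (so 1800 MT/s effective), which is how these specs are typically listed.

```python
# Rough peak memory bandwidth for the GT 520's 64-bit DDR3 interface.
# Assumes the listed 900 MHz is the base clock, doubled for DDR.
bus_width_bits = 64
base_clock_mhz = 900
effective_rate_mts = base_clock_mhz * 2              # 1800 MT/s

bandwidth_gbs = effective_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9
print(f"Peak memory bandwidth: {bandwidth_gbs:.1f} GB/s")   # ~14.4 GB/s
```
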
Continue reading our review of the Galaxy MDT GeForce GT 520 graphics card!!

Author:
Manufacturer: Gigabyte

Guess what? Overclocked.

The NVIDIA GTX 580, based on the GF110 Fermi architecture, is old but not forgotten.  Released in November of 2010, the card let NVIDIA hold the single-GPU performance crown for more than a year before it was usurped by AMD's Radeon HD 7970 just this month.  Still, the GTX 580 is a solid high-end enthusiast graphics card with widespread availability and custom-designed, overclocked models from numerous vendors, making it a viable option.

Gigabyte sent us this overclocked and custom-cooled model quite a while ago, but we had simply fallen behind with other reviews until just after CES.  In today's market the card has a bit of a different role to fill - it surely won't be able to catch the new AMD Radeon HD 7970, but can it fight the good fight and keep NVIDIA's current lineup of GPUs competitive until Kepler finally shows itself?

The Gigabyte GTX 580 1.5GB Super Overclock Card

Given the age of the GTX 580, Gigabyte has had plenty of time to perfect its PCB and cooler design.  This model, the Super Overclock (GV-N580SO-15I), comes in well ahead of the reference speeds of the GTX 580 but sticks with the same 1.5 GB frame buffer.

01.jpg

The clock speed is set at 855 MHz core and 1025 MHz memory, compared to the 772 MHz core and 1002 MHz memory clocks of the reference design.  That is a very healthy core overclock of nearly 11% that should translate into almost as large a gap in gaming performance wherever the GPU is the real bottleneck.
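
A quick check of those two sets of numbers (both taken straight from the paragraph above):

```python
# Factory overclock vs. GTX 580 reference clocks quoted above.
ref_core_mhz, soc_core_mhz = 772, 855
ref_mem_mhz,  soc_mem_mhz  = 1002, 1025

core_oc = (soc_core_mhz / ref_core_mhz - 1) * 100   # ~10.8%
mem_oc  = (soc_mem_mhz  / ref_mem_mhz  - 1) * 100   # ~2.3%
print(f"Core overclock: {core_oc:.1f}%  Memory overclock: {mem_oc:.1f}%")
```

Note that the memory overclock is barely two percent, so scenes limited by memory bandwidth rather than the GPU core will see less of a gain.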

Continue reading our review of the Gigabyte GTX 580 1.5GB Super Overclock graphics card!!

Battlefield 3 Frame Rate Drop Issue with GeForce GPUs

Subject: Graphics Cards | December 28, 2011 - 09:24 PM |
Tagged: gtx, geforce, bf3

Every once in a while we come across a gaming issue where those responsible for it - NVIDIA, AMD, the game developer - seem as lost as we are when we bring it to them.  For the last few days I have been banging my head on the table trying to figure out an issue with GeForce GTX graphics cards and Battlefield 3, and I am hoping that some of YOU might have seen it and can confirm.

While testing our new X79-based GPU test bed we kept finding that in Battlefield 3, frame rates would drop from 30+ to ~10 when running at 2560x1600 with the Ultra quality preset.  It could happen when walking down an empty hallway or in the middle of a huge, dramatic shootout with enemies.  And sometimes the issue would reverse and the frame rate would jump back up to 30+ FPS.

08.jpg

A 10 frame per second tank?  No thanks...

Even more odd, and something the normal user doesn't monitor, is that the power consumption of the system would drop significantly during this time.  At 30+ FPS the power draw might be 434 watts, while at the ~10 FPS level it would draw 100 watts less!  The first theory was that the GPU was dropping into a lower "p-state" due to overheating or some other bug, but our GPU-Z logs showed no clock speed decreases and temperatures never went above 75C - pretty tame for a GPU.

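If you want to keep an eye out for this behavior on your own system without GPU-Z, here is a minimal monitoring sketch using the NVML Python bindings; it assumes the pynvml package is installed and that your card and driver expose clock, temperature, P-state and power queries (the power reading in particular is not available on every board).

```python
# Minimal GPU monitor: log clocks, temperature, P-state and power draw
# once per second so a sudden drop like the one described above shows up
# in the output. Assumes the pynvml bindings (pip install nvidia-ml-py)
# and an NVIDIA driver with NVML support.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU in the system

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        pstate = pynvml.nvmlDeviceGetPerformanceState(gpu)
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports mW
        print(f"core {core} MHz | mem {mem} MHz | {temp} C | P{pstate} | {watts:.0f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```
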
To demonstrate this phenomenon we put together a quick video. 

In the video, you are seeing screen "tearing" in a much more dramatic fashion than normal because of our capture method.  We were actually outputting a 2560x1600 signal (!!) to an external system to be recorded locally at a very high bit rate.  Unfortunately, we could only muster a ~30 FPS capture frame rate which, coupled with the 60 Hz signal being sent, results in a bit of a double-up on the tearing you might usually see.  Still, the FRAPS-reported frame rates are accurate, and we use an external capture system specifically to remove the possibility of any interference with performance during the capture process.
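
For readers who want to quantify drops like this rather than eyeball an on-screen counter, the frametimes log that FRAPS writes during a benchmark run is easy to post-process. The sketch below assumes the usual FRAPS frametimes CSV layout (a header row, then a frame index and a cumulative time in milliseconds per line); the filename and FPS threshold are placeholders, not values from our testing.

```python
# Flag one-second windows in a FRAPS "frametimes" log whose frame count
# (i.e. average FPS) falls below a threshold. Assumes the usual layout:
# header row, then "frame, cumulative time in ms" per line.
import csv

LOG_FILE = "bf3 frametimes.csv"   # placeholder filename
FPS_FLOOR = 25                    # anything below this gets flagged

with open(LOG_FILE, newline="") as f:
    reader = csv.reader(f)
    next(reader)                                   # skip the header row
    times_ms = [float(row[1]) for row in reader if row]

# Count how many frames land in each one-second bucket of the run.
frames_per_second = {}
for t in times_ms:
    bucket = int(t // 1000)
    frames_per_second[bucket] = frames_per_second.get(bucket, 0) + 1

for second in sorted(frames_per_second):
    fps = frames_per_second[second]
    if fps < FPS_FLOOR:
        print(f"{second:4d}s: {fps} FPS  <-- drop")
```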

The hardware used in this video was actually based on an ASUS X58 motherboard and a Nehalem Core i7-965 processor.  But wasn't I just talking about an X79 rig?  Yes, but I rebuilt our old test bed to make sure this bug was NOT related to X79 or Sandy Bridge-E.  The systems that exhibited the issue were:

  • Intel Core i7-3960X
  • ASUS P9X79 Pro
  • 16GB DDR3-1600
  • 600GB VelociRaptor HDD
  • Windows 7 x64 SP1
  • GeForce GTX 580 (two different cards tested)
  • 290.53 Driver

Also:

  • Intel Core i7-965
  • ASUS X58 WS 
  • 6GB DDR3-1600
  • 600GB VelociRaptor HDD
  • Windows 7 x64 SP1
  • GeForce GTX 580 (two different cards tested)
  • 290.53 Driver

For me this is only occurring at 2560x1600, though I am starting to see more reports of the issue online:

  • Another 560 ti and BF3 FPS Low Or Drop!
    • "Well I just Installed my 2nd evga 560 ti DS running SLI and When I play battlefield 3 i get about 60 to 90 fps then drops at
      20 to 30. Goes Up and down, I look at the evga precision looks like each gpu is running at 40% each and changes either up or down.
      Temp. is under 60 degrees c."
  • GTX 560 Ti dramatic FPS drops on BF3 only
    • "having any setting on Ultra will cue dramatic and momentary fps drops into the 30's. if i set everything to High, i will stay above 70 fps with the new beta 285.79 drivers released today (which i thought would fix this problem but didn't). i've been monitoring things with Afterburner and i've noticed that GPU usage will also drop at the same time these FPS drops happen. nothing is occurring in the game or on the screen to warrant these drops, FPS will just drop even when nothing is going on or exploding and i'm not even moving or looking around, just idle. they occur quite frequently as well."
  • BF3 Frame Drops
    • "When i use 4xAA i get abnormal framedrops, even while nothing is going on, on the screen.
      The weird thing is that, when it drops, it always drops to 33/32fps, not higher, not lower.
      It usually happens for a few seconds."
  • BF3 @ 2560x1600 Ultra Settings Preset Unplayable
    • "I know its a beta, but i haven't heard any problems yet about framedrops.
      Sometimes my frames drop from 75fps way back to 30/20 fps, even when nothing is going on, on the screen."

So what gives?  Is this a driver issue?  Is it a Battlefield 3 issue?  Many of these users are running at resolutions other than the 2560x1600 where I am seeing it - so either there is another problem for them or it affects different cards at different quality levels.  It's hard to say, but a search for "radeon bf3 frame drop" turns up far less evidence of gamers on that side of the fence having similar discussions.

I have been talking with quite a few people at NVIDIA about this and while they are working hard to figure out the source of the frame rate inconsistencies, those of us with GeForce GTX cards may just want to back off and play at a lower resolution or lower settings until the fix is found.  

Author:
Manufacturer: NVIDIA

A Temporary Card with a Permanent Place in Our Heart

Today NVIDIA and its partners are announcing availability of a new graphics card that bridges the gap between the $230 GTX 560 Ti and the $330 GTX 570 currently on the market.  The new card promises to offer performance right between those two units with a price to match but with a catch: it is a limited edition part with expected availability only through the next couple of months.

When we first heard rumors about this product back in October I posited that the company would be crazy to simply call this the GeForce GTX 560 Ti Special Edition.  Well...I guess this makes me the jackass.  This new ~$290 GPU will be officially called the "GeForce GTX 560 Ti with 448 Cores". 

Seriously.

The GeForce GTX 560 Ti 448 Core Edition

The GeForce GTX 560 Ti with 448 cores is actually not a GTX 560 Ti at all; in fact, it is not even built on a GF114 GPU.  Instead we are looking at a GF110 GPU (the same chip used on the GeForce GTX 580 and GTX 570 graphics cards) with one more SM disabled than the GTX 570.

block_580.jpg

GeForce GTX 580 Diagram

The above diagram shows a full GF110 GPU sporting 512 CUDA cores and the full 16 SMs (streaming multiprocessors), along with all the bells and whistles that go with that $450 card.  This includes a 384-bit memory bus and a 1.5 GB frame buffer, all of which adds up to what is still the top-performing single-GPU graphics card on the market today.
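
Since the naming gets confusing fast, a little arithmetic keeps the GF110 family straight; the 32 cores-per-SM figure falls directly out of the 512-core, 16-SM layout described above.

```python
# GF110: 512 CUDA cores spread across 16 SMs -> 32 cores per SM.
CORES_PER_SM = 512 // 16

gf110_variants = {
    "GeForce GTX 580": 16,              # full chip
    "GeForce GTX 570": 15,              # one SM disabled
    "GTX 560 Ti with 448 Cores": 14,    # two SMs disabled
}

for name, sms in gf110_variants.items():
    print(f"{name}: {sms} SMs x {CORES_PER_SM} = {sms * CORES_PER_SM} CUDA cores")
```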

Continue reading our review of the GeForce GTX 560 Ti 448 Core Graphics Card!!

Author:
Manufacturer: EVGA

EVGA Changes the Game Again

Introduction

Dual-GPU graphics cards are becoming an interesting story.  While both NVIDIA and AMD have offered their own reference dual-GPU designs for quite some time, it is the custom-built models from board vendors like ASUS and EVGA that really pique our interest because of their unique nature.  Earlier this year EVGA released the GTX 460 2Win, the world's first (and only) graphics card with a pair of GTX 460 GPUs on board.

ASUS has released dual-GPU options as well, including the ARES dual Radeon HD 5870 last year and the MARS II dual GTX 580 just this past August, but both were prohibitively rare and expensive.  The EVGA "2Win" series - which we can call a series now that there are two of them - is still expensive but much more in line with the performance per dollar of the rest of the graphics card market.  When the company approached us last week about the new GTX 560 Ti 2Win, we jumped at the chance to review it.

The EVGA GeForce GTX 560 Ti 2Win 2GB

The new GTX 560 Ti 2Win from EVGA follows directly in the footsteps of the GTX 460 model - we are essentially looking at a pair of GTX 560 Ti GPUs on a single PCB running in SLI multi-GPU mode.  Clock speeds, memory capacity, performance - it should all be pretty much the same as if you were running a pair of GTX 560 Ti cards independently.

01.jpg

Just as with the GTX 460 2Win, EVGA is the very first company to offer such a product.  NVIDIA didn't design a reference platform and pass it along to everyone like they did with the GTX 590 - this is all EVGA.

Continue reading our review of the EVGA GeForce GTX 560 Ti 2Win!!!

NVIDIA Upgrading GTX 560 to 448 CUDA Cores?

Subject: Graphics Cards | October 27, 2011 - 07:41 PM |
Tagged: nvidia, GTX 560, geforce

A rumor that I read over at Guru3D suggests that the GeForce GTX 560 Ti, a card that has been very successful in the ~$230 graphics market, might be getting an upgrade just in time for the pending holiday buying season.  According to the report, the new version would move from the current 384 CUDA cores to 448, a jump of 64 cores - more than another full SM (streaming multiprocessor) on the existing GF114 design.

gtx560ti.jpg

A collection of current GTX 560 Ti cards...

Guru3D notes though that the "new" GTX 560 Ti would be based on the GF110 GPU (same as the GTX 580 and GTX 570) simply because the GTX 560 Ti uses all the available processing cores of the GF114 design.  The GPU on this new card would be a GTX 580 with two SMs disabled, rather than the single SM disabled on the GTX 570.

Here are the report's other details:

It features 14 active SMs, which include 448 SP / CUDA Cores and 56 TMUs; 320-bit memory and 40 ROPs - a very similar configuration to the old GTX 470. Along with increased performance, power consumption is expected to rise over the 384 SP GTX 560 Ti. A benefit to using GF110 means the revised 560 Ti will feature 2 x SLI connectors, enabling 3-way SLI.
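
Those numbers hang together if you work backward from the standard GF110 building blocks - 32 CUDA cores and 4 texture units per SM, plus 64-bit memory controllers carrying 8 ROPs each (six of them on the full chip). A quick consistency check (the per-unit figures are GF110 basics, not something pulled from the Guru3D report):

```python
# Derive the rumored spec from GF110 building blocks.
active_sms = 14                      # of 16 on the full GF110
active_mem_controllers = 5           # of 6, each 64-bit with 8 ROPs

cuda_cores = active_sms * 32              # -> 448
tmus = active_sms * 4                     # -> 56
bus_width = active_mem_controllers * 64   # -> 320-bit
rops = active_mem_controllers * 8         # -> 40

print(cuda_cores, tmus, f"{bus_width}-bit", rops)
```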

While I believe this part could definitely exist, I wouldn't think NVIDIA would simply remove the current GTX 560 Ti and replace it; instead I would imagine the company going the "GTX 565" route, or something similar to it.  Maybe a GTX 560 Ultra.  Either way, a new card that fits into the price slot somewhere between the GTX 560 Ti ($230) and the GTX 570 ($340) would be a welcome addition, especially with games like Battlefield 3 and Skyrim set to take advantage of that horsepower.

Source: Guru3D

Almost Time! Battlefield 3 Release, Is Your Hardware Ready?

Subject: Graphics Cards | October 24, 2011 - 03:06 PM |
Tagged: radeon, nvidia, geforce, bf3, amd

I know that you might have Battlefield 3 overload by now, but I wanted to make sure you all remembered to take a look at our BF3 Performance Guide from a couple of weeks back to make sure your PC is ready for what might be one of the most anticipated and talked-about PC titles in years.

14.jpg

Here is a summary of the content we have written based on the game - make sure you know ALL of it so you can get your system prepared for the pending battle!!

Keep checking back at PC Perspective as we are planning on doing some more fun live streaming of our BF3 matches and be sure to sign up for the official PCPer "Fragging Frogs" platoon in Battlelog!

Author:
Manufacturer: id Software

RAGE Performs...well

RAGE is not as dependent on your graphics hardware as it is on your CPU and storage subsystem (which may be an industry first); we will get to the reason why when we discuss the texture pop-up issue on the next page.

Introduction

The first game designed by id Software since the release of Doom 3 in August of 2004, RAGE has a lot riding on it.  Not only does it introduce the idTech 5 game engine, it is also the culmination of more than four years of development and the first new IP from the developer since the creation of Quake.  And ever since the first discussions and demonstrations of Carmack's MegaTexture technology, gamers have been expecting a lot as well.
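
MegaTexture is, at its heart, a virtual texturing scheme: the world's texture data lives in one enormous texture that is streamed in small pages from disk as the camera needs them, with only a cache of recently used pages resident in memory. The toy sketch below is emphatically not id's implementation - just an illustration of why storage speed, rather than the GPU, ends up being the limiting factor; the page size and cache budget are made-up numbers.

```python
# Toy demand-paged ("virtual") texture cache: pages are read from disk
# only when first needed and kept in a small LRU cache. NOT idTech 5
# code - just a sketch of the general idea with made-up numbers.
from collections import OrderedDict

PAGE_BYTES = 128 * 128 * 4     # hypothetical 128x128 RGBA page
CACHE_PAGES = 4096             # hypothetical resident-page budget

class PageCache:
    def __init__(self):
        self.pages = OrderedDict()   # page_id -> bytes, in LRU order
        self.disk_reads = 0

    def fetch(self, page_id):
        if page_id in self.pages:            # hit: cheap, no stall
            self.pages.move_to_end(page_id)
            return self.pages[page_id]
        self.disk_reads += 1                 # miss: go to storage
        data = self._read_from_disk(page_id) # slow path - pop-in happens here
        self.pages[page_id] = data
        if len(self.pages) > CACHE_PAGES:    # evict least recently used page
            self.pages.popitem(last=False)
        return data

    def _read_from_disk(self, page_id):
        # Stand-in for a real disk read; a slow HDD stalls the streamer
        # here, which is why blurry textures linger on screen.
        return bytes(PAGE_BYTES)
```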

02-screen.jpg

Would this game be impressive enough on the visuals to warrant all the delays we have seen?  Would it push today's GPUs in a way that few games are capable of?  It looks like we have answers to both of those questions and you might be a bit disappointed.  

Performance Characteristics

First, let's get to the heart of the performance question: will your hardware play RAGE?  Chances are, very much so.  I ran through some tests of RAGE on a variety of hardware including the GeForce GTX 580, 560 Ti and 460 1GB and the Radeon HD 6970, HD 6950, HD 6870 and HD 5850.  The test bed included an Intel Core i7-965 Nehalem CPU, 6GB of DDR3-1333 memory and a 600GB VelociRaptor hard drive.  Here are the results from our performance tests running at 1920x1080 with 4x AA enabled in the game options:

Continue reading our initial look at RAGE performance and image quality!!

PCPer Live! Battlefield 3 Beta Party and Discussion @ 10:30pm EST

Subject: Editorial, General Tech, Graphics Cards | September 29, 2011 - 06:51 PM |
Tagged: radeon, pcper live, nvidia, geforce, bf3, battlefield 3, amd

Look, there isn't any football on tonight, so what else are you going to do?  Rather than watch a rerun of Seinfeld, why not stop by here at http://pcper.com/live to hang out and not only watch us play some Battlefield 3 but also participate yourself by calling in on Skype, using Google+ Hangouts or even just chatting on our IRC server (welcome back to 1998, amiright?).

We are going to start the show at 10:30pm EST and plan on running until midnight at least, but it all depends on participation levels from YOU! 

What all do we have on the agenda?  Well, it is going to be pretty informal as we are still getting our feet wet with this whole "live" thing but here is what we are planning:

  • Watch Ryan and Ken get destroyed over and over in Battlefield 3 and watch as we attempt to sneak our way into the password protected Caspian levels with vehicles, 64 players and a new game type (oh my!).
  • Skype call ins from readers and gamers that want to talk about BF3 and their experiences with the game so far.  How does it run on your hardware?  Discussion like this will help others that might not have the beta figure out what they might want to upgrade in the near future.  (Be prepared to give us your Skype handle in the chat so we can call you!)
  • Trying some group games via our PC Perspective Platoon, the Fragging Frogs!  (Head over and apply to join or become a fan!)  You can also find me on EA Origin or in the Battlelog system as "ryanshrout".
  • We will discuss our brainstorming sessions for what hardware we will recommend for certain gaming resolutions and specific image quality settings in a future article...all live!
  • Maybe some surprise guests from the PC Perspective staff and beyond...??  

We will be streaming the festivities live on our Justin.tv channel (embedded below) and will have a chat widget here as well for those of you that would rather use IRC than the integrated Justin.tv chat.

In short, we are planning on having a good time playing some games and talking hardware so if you are into that, then I think you should be sure to stop by and say hello!!  Let us know in the comments if you have anything else you want to see or any more ideas for our live show.  Thanks!!!


Source: PCPer

More Battlefield 3 Beta Performance Results: GTX 460, Radeon HD 5850 and 9800 GT!

Subject: Graphics Cards | September 29, 2011 - 05:54 PM |
Tagged: bf3, battlefield 3, nvidia, amd, geforce, radeon, gtx 460, hd 5850, 9800 gt

I just wanted to drop a quick note here on the home page to let you know that we haven't been just playing around on Battlefield 3 all day - instead we have been playing around on BF3 all day in order to bring you some more information about performance in the game!  Earlier this afternoon I updated my Battlefield 3 Beta Performance article with a new page that focuses on results from the GeForce GTX 460 1GB, the AMD Radeon HD 5850 1GB and the GeForce 9800 GT 512MB card for those of you that are really behind the curve in the upgrade cycle.  

bf3caspian.jpg

What's this?  The Caspian map?  Performance results soon!

I would recommend you check out that new content to see where your hardware might fall, and hopefully over the weekend we are going to be putting together a system guide similar to our own Hardware Leaderboard but targeted at the BF3 gaming experience.  Stay tuned!!

Source: PCPer