Subject: Graphics Cards | June 8, 2012 - 01:23 PM | Tim Verry
Tagged: tahiti, graphics, gpu, computex, binning, amd, 7970 ghz edition
AMD is having a string of successes with its 28nm 7000 series graphics cards. While it was dethroned by NVIDIA's GTX 680, the AMD Radeon HD 7970 is easier to get hold of, and the company certainly seems to be having a much easier time manufacturing its GPUs than NVIDIA is with its Kepler cards. AMD has been cranking out HD 7970s for a few months now and has refined its binning process to the point that a good number of chips come off the line with healthy headroom above the 7970's stock speeds.
And so enters Tahiti 2. Tahiti 2 is GPU silicon that not only bins at HD 7970 speeds, but can sustain a higher default clock speed at a lower voltage. As a result, the GPUs can run faster while staying within the same TDP as current 7970 cards.
But how much faster? Well, SemiAccurate is reporting that AMD is seeing as much as a 20% clock speed improvement over current Radeon HD 7970 graphics cards. This means that cards are able to run at clock speeds up to approximately 1075MHz – quite a bit above the current reference clock speed of 925MHz!
The AMD 7970 3GB card. Expect Tahiti 2 to look exactly the same but run at higher clock speeds.
They further report that, because the TDP has not changed, no cooler, PCB, or memory changes will be needed. That should make it much easier for add-in board partners to get the updated reference-based cards out quickly and with minimal cost increases (we hope). You can likely count on board partners capitalizing on the 1,000MHz+ speeds by branding the new cards "GHz Edition," much as the Radeon 7770 has enjoyed.
With 7970 chips binning higher than needed and carrying extra headroom, an updated, lower-power refresh may also be in order for AMD's 7950 "Tahiti Pro" graphics cards. Heck, maybe AMD can refresh the entire lineup with better-binned silicon at the same clock speeds and reduce power consumption across all of its cards.
Subject: Graphics Cards | February 6, 2012 - 06:23 PM | Tim Verry
Tagged: nvidia, kepler, graphics, gpu
Although there were quite a few rumors leading up to AMD's Radeon 7000 series launch, the Internet has been very quiet on the greener side of the graphics market. Finally, however, we have some rumors to share with you on the NVIDIA front. As always, take these numbers with more than your average grain of salt.
Specifically, EXP Review managed to uncover two charts that supposedly detail specifics about a range of GeForce 600 series Kepler cards from the number of stream processors to the release date. Needless to say, it's a lot of rumored information to take in all at once.
Anyway, without further ado, let's dive into the two leaked charts.
(First chart: Model, Code Name, Die Size, Core Clock (TBD) MHz, Shader Clock (TBD) GHz, Stream Processors, SM Count, ROPs, Memory Clock (effective) GDDR5, and Memory Bus Width. The row data did not survive.)
From the chart above, we can see the entire lineup of Kepler cards, from the NVIDIA GTX 640 to the dual-GPU GTX 690. The die size of the higher-end GeForce cards is approximately 50% larger than that of the AMD Radeon HD 7970, but not much bigger than the GTX 580's. If only we knew the TDP of these cards! The next chart shows an alleged performance comparison against the AMD competition.
| Model | Bus Interface | Frame Buffer | Transistors (Billion) | Price Point | Release Date | Performance Scale |
|---|---|---|---|---|---|---|
| GTX 690 | PCI-E 3 x16 | 2x 1.75 GB | 2x 6.4 | $999 | Q3 2012 | |
| GTX 680 | PCI-E 3 x16 | 2 GB | 6.4 | $649 | April 2012 | ~45% > HD 7970 |
| GTX 670 | PCI-E 3 x16 | 1.75 GB | 6.4 | $499 | April 2012 | ~20% > HD 7970 |
| GTX 660 Ti | PCI-E 3 x16 | 1.5 GB | 6.4 | $399 | Q2/Q3 2012 | ~10% > HD 7950 |
| GTX 660 | PCI-E 3 x16 | 2 GB | 3.4 | $319 | April 2012 | ~GTX 580 |
| GTX 650 Ti | PCI-E 3 x16 | 1.75 GB | 3.4 | $249 | Q2/Q3 2012 | ~GTX 570 |
| GTX 650 | PCI-E 3 x16 | 1.5 GB | 1.8 | $179 | May 2012 | ~GTX 560 |
| GTX 640 | PCI-E 3 x16 | 2 GB | 1.8 | $139 | May 2012 | ~GTX 550 Ti |
If these numbers hold true, NVIDIA will handily beat AMD's current offerings; however, I would wait for reviews before making any purchasing decisions. One interesting aspect is the amount of GDDR5 memory: it seems NVIDIA is sticking with 2GB frame buffers (or less) per GPU while AMD has really started upping the RAM. It will be interesting to see how this affects gaming in NVIDIA Surround and/or at high resolutions.
What do you think about these numbers? Do you think Kepler will live up to the alleged performance scale figures?
Subject: Graphics Cards | August 2, 2011 - 10:46 AM | Tim Verry
Tagged: graphics, gpu, galaxy
Galaxy, a popular maker of NVIDIA graphics cards, recently announced that it is extending the warranty on its graphics card products to three years. "Galaxy has listened to the enthusiast market and we are glad to move from a 2 year warranty to a 3 year warranty by registration." The new extended warranty applies to all graphics cards purchased after August 1st, 2011 that are then registered with Galaxy. Qualifying products will also bear the seal shown below to let customers know the card is covered.
Seeing warranties being extended is always a good thing, especially in a world where the once-popular lifetime warranty is rare. What do you think of the extended warranty? Will it be enough to push you toward a Galaxy-branded card on your next purchase?
Subject: Motherboards | July 6, 2011 - 04:36 PM | Tim Verry
Tagged: PCI-E 3.0, msi, graphics
MSI recently unveiled a new motherboard supporting the PCI-Express 3.0 standard. The Intel LGA 1155 CPU socket and Z68 chipset are also features of the upcoming motherboard, dubbed the Z68A-GD80 (G3).
The new MSI board joins ASRock's announcement as one of the first PCI-Express 3.0 motherboards, and it is loaded with features. The Z68 chipset naturally supports Intel Sandy Bridge processors, and the board adds PCI-E 3.0, a UEFI BIOS, OC Genie II, and MSI's signature MIL-STD-810 military class components. The PCI-E 3.0 slots keep AMD CrossFireX and NVIDIA SLI multi-GPU setups fed with plenty of bandwidth. Rear I/O includes a PS/2 port, USB 3.0, USB 2.0, HDMI, DVI, 7.1 audio, dual Gigabit Ethernet, eSATA, and FireWire. On-board I/O includes three PCI-E 3.0 slots, two PCI slots, two PCI-E x1 slots, the LGA 1155 CPU socket, and four DDR3 DIMM slots.
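For perspective on what PCI-E 3.0 buys you, the jump over 2.0 comes from both a higher transfer rate and a more efficient line encoding. Here is a rough sketch of the per-slot math, using the published spec rates rather than any MSI figures:

```python
# Approximate one-direction bandwidth of a PCI-E x16 slot per generation.
# Gen2 uses 8b/10b encoding (20% overhead); Gen3 uses 128b/130b (~1.5%).

def pcie_x16_bandwidth_gbs(transfer_rate_gts, payload_bits, line_bits):
    """Bandwidth of an x16 link in GB/s, after encoding overhead."""
    per_lane_gbs = transfer_rate_gts * (payload_bits / line_bits) / 8  # bits -> bytes
    return per_lane_gbs * 16

gen2 = pcie_x16_bandwidth_gbs(5.0, 8, 10)     # PCI-E 2.0: 5 GT/s per lane
gen3 = pcie_x16_bandwidth_gbs(8.0, 128, 130)  # PCI-E 3.0: 8 GT/s per lane
print(f"x16 Gen2: {gen2:.1f} GB/s, Gen3: {gen3:.1f} GB/s")
```

That works out to roughly 8 GB/s for a Gen2 x16 slot versus nearly 16 GB/s for Gen3, which is why multi-GPU setups stand to benefit.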
What do you think of the new board; are you ready for PCI-E 3.0?
Subject: Graphics Cards | May 29, 2011 - 11:35 PM | Tim Verry
Tagged: mobile gaming, gtx 560m, graphics, computex
At Computex 2011, NVIDIA plans to showcase the latest addition to its mobile graphics lineup, the GTX 560M GPU. A mobile derivative of the 500 series desktop GPUs, the card brings support for NVIDIA's Optimus, 3D Vision, and PhysX technologies. At launch there will be two notebooks, from Asus and Toshiba respectively: the G74sx and a Qosmio gaming laptop, with many more to follow.
The Asus G74sx and Toshiba Gaming Notebook
From a performance standpoint, the GTX 560M purports to deliver twice the performance of the current GT 540M mobile chip. According to GeForce.com, in Crysis: Warhead the GTX 560M pulls a respectable 30-40 FPS at 1080p with "Gamer" detail settings, whereas the older GT 540M can only maintain 30-40 frames per second at 1080p at the lowest detail settings. In 3DMark Vantage, the GTX 560M scored 10,000 points while the older 540M pulled off approximately 4,200. Andrew Coonrad of NVIDIA's technical marketing department further stated that the graphics card would play both The Witcher 2 and Duke Nukem Forever at approximately 50 frames per second.
GeForce states that if you are a mobile gamer looking for an easy-to-carry gaming notebook offering Optimus' battery-saving technology and 3D Vision's gaming features, laptops with the GTX 560M are the way to go, as the older GTX 480M is not nearly as power efficient (and thus less portable). Laptops with the new graphics card are in stock now at several online retailers.
Subject: Graphics Cards | May 21, 2011 - 03:04 AM | Tim Verry
Tagged: nvidia, kfa2, GTX 560, graphics
Not to be left out of the slew of NVIDIA GeForce GTX 560 releases, KFA2 announced two new additions to its graphics card lineup. Both are based on the GeForce GTX 560 GPU; one is overclocked and fitted with an aftermarket heatsink-and-fan combo, while the other uses a standard single, centered, shrouded fan design. Labeled the KFA2 GeForce GTX 560 1GB 256bit and the KFA2 GeForce GTX 560 EX OC 1GB 256bit, the DirectX 11 cards offer the following specifications:
| | GeForce GTX 560 1GB 256bit | GeForce GTX 560 EX OC 1GB 256bit |
|---|---|---|
| GPU Clock | 810 MHz | 905 MHz |
| Shader Clock | 1620 MHz | 1810 MHz |
| Memory Clock | 2004 MHz | 2004 MHz |
| Memory | 1 GB GDDR5 on 256-bit bus | 1 GB GDDR5 on 256-bit bus |
| Memory Bandwidth | 128.3 GB/s | 128.3 GB/s |
| Texture Fill Rate | 45.3 Billion/s | 50.6 Billion/s |
Specification-wise, the two new cards seem to be positioned between purely reference cards and competitors' highest-clocked GTX 560s. Street price will ultimately determine whether they are worth picking up versus other brands with higher clocks, or reference-clocked cards with aftermarket cooling. KFA2 states that the cards will be available online and in retail stores throughout Europe, backed by a two-year warranty.
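For the curious, the bandwidth and fill-rate figures in the table fall straight out of the clocks. A minimal sketch of the arithmetic (the 56 texture-unit count is an assumption based on the reference GTX 560, not a number KFA2 quotes):

```python
def mem_bandwidth_gbs(effective_mts, bus_bits):
    """Memory bandwidth in GB/s: transfer rate (MT/s) times bus width in bytes."""
    return effective_mts * (bus_bits / 8) / 1000

def texture_fill_gtexels(core_mhz, texture_units):
    """Bilinear texture fill rate in billions of texels per second."""
    return core_mhz * texture_units / 1000

# KFA2 lists a 2004 MHz memory clock; GDDR5 moves data at double that, 4008 MT/s.
bandwidth = mem_bandwidth_gbs(4008, 256)     # ~128.3 GB/s, matching the table
stock_fill = texture_fill_gtexels(810, 56)   # ~45.3 Billion/s for the stock card
oc_fill = texture_fill_gtexels(905, 56)      # ~50.6 Billion/s for the EX OC
```

The identical memory bandwidth on both cards follows from the identical memory clocks; only the core and shader overclocks separate the EX OC from its sibling.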
Subject: Graphics Cards | May 19, 2011 - 04:33 AM | Tim Verry
Tagged: nvidia, GTX 560, graphics
Coinciding with the NDA lift on the NVIDIA GeForce GTX 560, Gigabyte announced its enthusiast-class Overclock Edition graphics card based on the new GTX 560 GPU.
The new Overclock Edition replaces the reference design's cooler with Gigabyte's own WindForce 2X variant, which the company claims reduces the card's noise under full load to 31 dB. Further, the heatsink uses direct-contact heat pipe technology, meaning the heat pipes that carry heat away from the GPU to the fins physically touch the GPU itself. Both fans produce 30.5 CFM of airflow to quickly dissipate the heat of the overclocked GPU, and Gigabyte was able to ship the card at an 830 MHz GPU clock and a 4008 MHz memory clock from the factory. Gigabyte also claims a 10% to 30% improvement in overclocking capability thanks to its "Ultra Durable" copper PCB technology and power-switching enhancements.
The full specifications of the GeForce GTX 560 Overclock Edition are as follows:
| Specification | Value |
|---|---|
| Core Clock | 830 MHz |
| Shader Clock | 1660 MHz |
| Memory Amount | 1 GB |
| Memory Bus | 256 bit |
| Card Bus | PCI-E 2.0 |
| Process Technology | 40 nm |
| Card Dimensions | 43mm (h) x 238mm (l) x 130mm (w) |
| Power Requirements | Minimum 500 Watt PSU required |

Display outputs:

- 1x HDMI and DisplayPort via adapter(s)
- 1x mini HDMI
- 1x VGA (via adapter)
Gigabyte is a popular motherboard manufacturer for enthusiasts and it seems that they are striving to gain that same level of consumer brand loyalty with their graphics cards. Do you have a Gigabyte graphics card in your rig?
Subject: Graphics Cards | May 17, 2011 - 01:43 PM | Tim Verry
Tagged: nvidia, hardware, graphics
The current GTX 590
VR-Zone reports that NVIDIA is gearing up to deliver a revised GTX 590 in June to combat the overheating problems that some overclockers fell victim to when using certain drivers. PC Perspective did not run into the issue when overclocking its card; however, VR-Zone stated in an earlier article that:
"NVIDIA has sent out a cautionary to their partners regarding possible component damage due to high temperature when running Furmark 1.9 as it bypasses the capping detection. . . . This is something not able to fix through drivers nor it is just applicable to GeForce GTX 590."
Fortunately for overclockers, NVIDIA is planning to re-engineer aspects of the design, including new inductors, which should help with the over-current protection issues. The new design will also affect the size and dimensions of the PCB, which means that third-party heatsinks and water blocks made for the current GTX 590 will not fit.
It is nice to see NVIDIA sticking by its technology and updating its hardware to fix issues. Overclockers especially will benefit from this updated model.
Subject: General Tech, Graphics Cards | May 13, 2011 - 05:30 PM | Tim Verry
Tagged: nvidia, GTX560, graphics
The NVIDIA GeForce GTX 560 is due to release on May 17th. As the release date approaches, speculation and rumors have flooded the Internet. GeForce.com has stepped up to preview what the card looks like and how it fares in three soon-to-be-released PC games versus the 9800 GT at the popular 1080p resolution. GeForce chose the 9800 GT for comparison because it found the card to be one of the most popular in use on Steam. With games becoming more graphically advanced and 1080p monitors more common, they wanted to show what the GTX 560 is capable of versus a card many people are familiar with.
While they were unable to share exact hardware specifications and performance numbers (due to NDA), they were able to show which graphics detail settings the card could run at 1080p while holding at least 35 frames per second. The stated "Optimal Playable Settings" for the GTX 560 were then compared to the 9800 GT in three games, each chosen to showcase high resolution, high PhysX detail, or NVIDIA Surround. The GTX 560 handled all three features with ease, whereas the older but popular 9800 GT ran into issues playing games with those features smoothly. The system configuration used to test both cards is as follows:
| Component | Configuration |
|---|---|
| Motherboard | ASUS P8P67 WS Revolution |
| CPU | Intel Core i7 2600K @ 3.4GHz |
| RAM | 8GB DDR3 1333MHz, 9-9-9-24 |
| Operating System | Windows 7 x64 |
The first game they showcased was Duke Nukem Forever, which GeForce states will support both NVIDIA 3D Vision and PhysX. The graphics details they were able to achieve in Duke Nukem Forever are:
| Setting | Value |
|---|---|
| Shadows | World & Characters |
| Post Special Effects | On |
At these settings the GTX 560 managed to pull at least 35fps. Conversely, the game was not playable at these settings on the 9800 GT; in particular, the 3D feature was not practical.
Alice: Madness Returns was the second game GeForce showed off. One interesting aspect of Alice is its usage of PhysX: graphics quality is much improved by the textures and particles PhysX adds, as you can see in the comparison screenshot below.
The GTX 560 managed to run the game at the following settings:
The 9800 GT they compared against was a "slide show" by comparison, with the demands of PhysX especially responsible for the reduced performance. The 9800 GT simply was not capable of processing both high-resolution graphics and heavy PhysX calculations. The GTX 560, however, was capable of running the game at maxed-out settings (at 1080p).
GeForce finally showcased the GTX 560 running Dungeon Siege III. In this test, they utilized three monitors in an NVIDIA Surround configuration. The graphical settings they were able to get out of the GTX 560 included:
| Setting | Value |
|---|---|
| Visual Effects Quality | High |
Their results are as follows:
"On these settings, which were near maximum aside from anti-aliasing which tops off at 16x, the average framerate was again consistently smooth and playable. Here, the ultra-wide experience allowed us to immerse ourselves into some deep dungeon crawling. Unfortunately for the 9800 GT, the GPU in SLI does not support NVIDIA Surround, making it impossible to play at the 5760x1080 resolution. "
The GeForce GTX 560 is reported to be positioned between the GeForce GTX 460 and GTX 560 Ti on the NVIDIA side, and between the 6870 and 6950 (1GB) on the AMD side. At 1080p, it has so far been a toss-up for many DIY enthusiasts between the AMD 6950 (2GB) and the NVIDIA GTX 560 Ti for maximum performance. If GeForce's preview holds true for other games, the GTX 560 may well provide another option for enthusiasts after bang-for-the-buck price and performance at 1080p resolutions.
As for speculation and rumors about the graphics card's hardware, plenty have floated around the Internet. For example, Tech Connect states that the GTX 560 will feature 336 CUDA cores, 56 texture units, and 1GB of GDDR5 memory on a 256-bit bus, and maintains that the card is rumored to be priced at approximately $200. Given NVIDIA's statement that the card will sit between the GTX 460 and GTX 560 Ti in performance, the GPU will likely be clocked somewhere between the GTX 460's 675MHz and the GTX 560 Ti's 820MHz, with the memory clocked slightly below the GTX 560 Ti's 4008MHz.
Unfortunately, until the NDA is lifted only NVIDIA can tell us the real specifications of the GTX 560, and they are not talking. You can, however, find further details as well as a video of the soon-to-be-released card in action over at GeForce.com, and PC Perspective will have a review up with benchmarks galore and the official hardware specifications as soon as the NDA lifts on May 17th.
Will the GTX 560 power your next gaming rig?
Subject: General Tech, Graphics Cards | April 28, 2011 - 11:36 AM | Ryan Shrout
Tagged: graphics, battlefield 3, battlefield
Got 12 minutes to spare? No? How about at least a few? I promise you'll be pretty impressed by what you see. I think I am late to the game in posting this, but EA released a 12-minute video showing gameplay of the upcoming Battlefield 3. While we don't know anything about the game's minimum specifications yet, we can assume that it will indeed take advantage of some of the latest graphics hardware.
Take a look! And if you are really tight on time, just skip to the 2:30 mark.
Considering that many people were disappointed in the appearance of the recently released Crysis 2, the considerable work being put into the Battlefield 3 PC game is going to be well appreciated. In fact, this quote from developer DICE really proves our point that "building for consoles" is stupid:
"So for our target of what we want to hit, we are now using the more powerful platform to try and prove what we see gaming being in the future rather than using the lowest common denominator, instead of developing it for the consoles and then just adding higher resolution textures and anti-aliasing for the PC version. We do it the other way around, we start with the highest-end technology that we can come up with and then scale it back to the consoles."