Subject: General Tech | March 25, 2014 - 09:46 PM | Tim Verry
Tagged: gtx titan z, gtx titan, GTC 2014, CUDA
During the opening keynote, NVIDIA showed off several pieces of hardware that will be available soon. On the desktop and workstation side of things, researchers (and consumers chasing the ultra high end) have the new GTX Titan Z to look forward to. This new graphics card is a dual GK110 GPU monster that offers up 8 TeraFLOPS of number crunching performance for an equally impressive $2,999 price tag.
Specifically, the GTX TITAN Z is a triple slot graphics card that marries two full GK110 (big Kepler) GPUs for a total of 5,760 CUDA cores, 448 TMUs, and 96 ROPs with 12GB of GDDR5 memory (6GB on a 384-bit bus per GPU). NVIDIA has yet to release clock speeds, but the two GPUs will run at the same clocks with a dynamic power balancing feature. For the truly adventurous, it appears possible to SLI two GTX Titan Z cards using the single SLI connector. Display outputs include two DVI, one HDMI, and one DisplayPort connector.
NVIDIA is cooling the card using a single fan and two vapor chambers. Air is drawn inwards and exhausted out of the front exhaust vents.
In short, the GTX Titan Z is NVIDIA's new number crunching king and should find its way into servers and workstations running big data analytics and simulations. Personally, I'm looking forward to seeing someone slap two of them into a gaming PC and watching the screen catch on fire (not really).
What do you think about the newest dual GPU flagship?
Stay tuned to PC Perspective for further GTC 2014 coverage!
Subject: Graphics Cards | January 21, 2014 - 08:49 PM | Ryan Shrout
Tagged: rumor, nvidia, kepler, gtx titan black, gtx titan, gtx 790
How about some fresh graphics card rumors for your Tuesday afternoon? The folks at VideoCardz.com have collected information about two potential NVIDIA GeForce cards that are going to hit your pocketbook hard. If you thought the mid-range GPU market was crowded, wait until you see the changes NVIDIA might have in store for the high end.
First up is the NVIDIA GeForce GTX TITAN Black Edition, a card that will actually have the same specifications as the GTX 780 Ti but with full performance double precision floating point and a move from 3GB to 6GB of memory. The all-black version of the GeForce GTX 700-series cooler is particularly awesome looking.
Image from VideoCardz.com
The new TITAN would sport the same GPU as the GTX 780 Ti, only the TITAN BLACK would have higher double precision computing performance, thus more enabled FP64 CUDA cores. The GTX TITAN Black Edition is also said to feature a 6GB memory buffer. This is twice as much as the GTX 780 Ti, and it pretty much confirms we won't be seeing any 6GB Ti's.
The rest is pretty much well known, TITAN BLACK has 2880 CUDA cores, 240 TMUs and 48 ROPs.
VideoCardz.com says this will come in at $999. If true, this is a pure HPC play as the GTX 780 Ti would still offer the same gaming performance for enthusiasts.
Second, there looks to be an upcoming dual-GPU graphics card using a pair of GK110 GPUs that will be called the GeForce GTX 790. The specifications VideoCardz.com says it has indicate that each GPU will have 2,496 enabled CUDA cores and a smaller 320-bit memory interface with 5GB designated for each GPU. Cutting back on the memory interface, shader counts, and even clock speeds would allow NVIDIA to manage power consumption at the targeted 300 watt level.
Image from VideoCardz.com
Head over to VideoCardz.com for more information about these rumors but if all goes as they expect, you'll hear about these products quite a bit more in February and March.
What do you think? Are these new $1000 graphics cards something you are looking forward to?
Computex 2013: Gigabyte Preparing Custom GTX Titan With WindForce 450W Cooler... Some Assembly Required
Subject: Graphics Cards | June 7, 2013 - 05:20 PM | Tim Verry
Tagged: windforce 450w, windforce, gtx titan, gk110, gigabyte
Back in April, Gigabyte showed off its new custom WindForce 450W GPU HSF, but did not name which specific high end graphics cards it would be used with. So far, NVIDIA's highest-end single GPU solution, the GTX Titan, has been off limits for GPU manufacturers as far as putting custom air coolers on the cards (NVIDIA has restricted designs to its reference cooler or factory installed water blocks).
It seems that Gigabyte has found a solution to the cooler restriction, however. The company will be selling a GTX TITAN with model number GV-NTITAN-6GDB later this year that will come with NVIDIA's reference cooler pre-installed along with a bundled WindForce 3X 450W cooler and instructions for switching out the coolers.
Gigabyte is showing off the custom GTX Titan at Computex, as discovered by TechPowerUp.
Users that do take Gigabyte up on its offer to switch to the custom WindForce cooler will still be covered under the company's standard warranty policy, which is a good thing. The kit is likely to be more expensive than a standard TITAN, though, as Gigabyte has to sell the card with two coolers and shoulder increased support costs. On the other hand, users could swap out the coolers and then sell the unused TITAN reference cooler to offset some of the cost of the kit.
Gigabyte is actually showing off the new graphics card with the WindForce 3X 450W cooler at Computex this week. The triple slot WindForce cooler is said to keep a GTX 680 2°C cooler and 23.3 dB quieter than the reference cooler when running the Furmark benchmark. The major benefit of the WindForce is not raw cooling capacity but its three large fans, which can spin at lower RPMs to match the cooling performance of the reference NVIDIA design at a much lower noise level. Should you be looking to push the TITAN to the extreme, a water block would still be your best bet, but I think the allure of a quieter air-cooled TITAN may be enough for Gigabyte to snag a few adventurous enthusiasts willing to put up with assembling the new card themselves.
More information on the WindForce 3X 450W cooler can be found here.
Subject: Cases and Cooling | June 6, 2013 - 02:31 PM | Tim Verry
Tagged: evga, minibox, mini-itx, gtx titan, gk110, gaming, computex, computex 2013
First shown off at CES 2013, the EVGA Minibox is a small form factor chassis for Mini-ITX systems that can accommodate large graphics cards, letting users pack a lot of hardware into a tiny footprint. As a demonstration of the case's capabilities, EVGA showed off the latest version at Computex this week in Taipei with a full system build featuring a Core i7-4770K and GTX TITAN internals.
The Minibox chassis itself is a dark brushed metal case with two USB 3.0 ports on the front IO and space for a slot loading optical drive. It further features a motherboard tray that supports Mini-ITX boards, two 2.5" SATA hard drive bays and – best of all – enough room to install full size GPUs. In order to support lengthy graphics cards, EVGA is including a small form factor 500W power supply that is mounted on the floor of the case.
HEXUS reporters spot the EVGA Minibox at Computex 2013. Look how small it is!
There will be at least two SKUs of the Minibox, depending on whether you want to go with air or water cooling. According to Bit-Tech.net, the air cooled version will use two 92mm fans in the top of the case and one 80mm fan for the bottom-mounted PSU. The water cooled SKU will be slightly larger but have enough room for a water cooling radiator (likely 240mm). Beyond that, details are scarce, but the air cooled version is said to be available as soon as next month with water cooled options becoming available later this year.
The Minibox looks to be one of the better Mini-ITX cases out there (although the price is still unknown), and should be popular among enthusiasts wanting a small box that does not sacrifice gaming potential.
Zotac has announced a new GTX TITAN graphics card that will fall under the company’s AMP! Edition branding. This new Titan graphics card will feature factory overclocks on both the GPU and GDDR5 memory. However, due to NVIDIA’s restrictions, the Zotac GeForce GTX TITAN AMP! Edition does not feature a custom cooler or PCB.
The Zotac TITAN AMP! Edition card features a single GK110 GPU with 2,688 CUDA cores clocked at 902MHz base and 954MHz boost. That is a healthy bump over the reference TITAN's 836MHz base and 876MHz boost clock speeds. Further, while Zotac's take on the TITAN keeps the reference specification's 6GB of GDDR5 memory, it is impressively overclocked to 6,608 MHz (especially since Zotac has not changed the cooler). Many reference cards may be able to match the GPU clocks through manual overclocking, though; for example, Ryan managed to get his card up to a 992MHz boost in his review of the NVIDIA GTX TITAN.
The card has two DL-DVI, one HDMI, and one DisplayPort video outputs. The cooler, PCB, and PCI-E power specifications are still the same as the reference design. You can find more details on the heatsink in the TITAN review. Not allowing vendors to use custom coolers is disappointing and may limit the factory GPU overclocks they are able (or willing) to offer and support, but within that restriction the Zotac AMP! Edition looks to be a decent card, so long as the (not yet announced) price premium over the $999 NVIDIA reference card is minimal.
Our 4K Testing Methods
You may have recently seen a story and video on PC Perspective about a new TV that made its way into the office. Of particular interest is the fact that the SEIKI SE50UY04 50-in TV is a 4K television; it has a native resolution of 3840x2160. For those unfamiliar with the upcoming TV and display standards, 3840x2160 is exactly four times the resolution of current 1080p TVs and displays. Oh, and this TV only cost us $1,300.
In that short preview we validated that both NVIDIA and AMD current generation graphics cards can output to this TV at 3840x2160 over an HDMI cable. You might be surprised to learn that HDMI 1.4 can carry 4K resolutions, but only at 30 Hz, half the refresh rate of most TVs and monitors (4K TVs accepting 60 Hz input most likely won't be available until 2014). That doesn't mean we are limited to 30 FPS of performance, though, far from it. As you'll see in our testing on the coming pages, we were able to push out much higher frame rates using some very high end graphics solutions.
I should point out that I am not a TV reviewer and I don't claim to be one, so I'll leave the technical merits of the display itself to others. Instead I will only report on my experiences with it while using Windows and playing games: it's pretty freaking awesome. The only downside I have found in my time with the TV as a gaming monitor thus far involves the 30 Hz refresh rate with Vsync disabled. Because you are seeing fewer screen refreshes over the same amount of time than you would with a 60 Hz panel, all else being equal, twice as many "frames" of the game are being pushed to the monitor each refresh cycle. This means that the horizontal tearing associated with disabling Vsync will likely be more apparent than it would otherwise.
I would likely recommend enabling Vsync for a tear-free experience on this TV once you are happy with performance levels, but obviously for our testing we wanted to keep it off to gauge performance of these graphics cards.
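The numbers behind the 30 Hz discussion above are easy to sanity-check. Here is a quick sketch; the 60 FPS figure is just a hypothetical render rate for illustration, not a measured result:

```python
# 3840x2160 ("4K") has exactly four times the pixels of 1080p.
uhd_pixels = 3840 * 2160
fhd_pixels = 1920 * 1080
print(uhd_pixels // fhd_pixels)  # -> 4

# On a 30 Hz panel, a game rendering at a hypothetical 60 FPS completes
# two frames per refresh; with Vsync off, pieces of both can land in a
# single refresh, which is why tearing is more visible than at 60 Hz.
refresh_hz = 30
render_fps = 60
print(render_fps / refresh_hz)  # -> 2.0
```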
A very early look at the future of Catalyst
Today is a very interesting day for AMD. It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to the CrossFire technology that will improve animation performance across the board. Both stories are incredibly interesting, and as it turns out they feed off each other in a very important way: the HD 7990 depends on CrossFire, and CrossFire depends on this driver.
If you already read our review (or any review that is using the FCAT / frame capture system) of the Radeon HD 7990, you likely came away somewhat unimpressed. The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have been an incredibly exciting release for us and would likely have become the single fastest graphics card on the planet. That didn't happen, though, and our results clearly show why: AMD CrossFire technology has some serious issues with animation smoothness, runt frames, and giving users what they are promised.
Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February. Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what implementing frame metering technology would mean to gamers. We followed that story up with several more that showed the current state of performance on the GPU market using Frame Rating, which painted CrossFire in a very negative light. Even though some outlets accused us of being biased or claimed AMD wasn't doing anything incorrectly, we stuck by our results, and as it turns out, so does AMD.
Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered.
If you are just catching up on the story, you really need some background information. The best place to start is our article published in late March that goes into detail about how game engines work, how our completely new testing methods work and the problems with AMD CrossFire technology very specifically. From that piece:
It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand. As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, and in many cases in a nearly perfect alternating pattern. Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case. Obviously the story would become then, “In Battlefield 3, does it even make sense to use a CrossFire configuration?” My answer based on the below graph would be no.
An example of a runt frame in a CrossFire configuration
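To make the runt-frame problem concrete, here is a toy sketch in Python. The frame times and the 3 ms cutoff are invented for illustration; the actual FCAT analysis works on captured scanline heights, not FRAPS-style times:

```python
# Hypothetical per-frame display times (ms) showing the alternating
# normal/runt pattern described above for CrossFire.
frame_times_ms = [15.2, 1.1, 14.8, 0.9, 15.5, 1.0, 15.0, 1.2]
RUNT_THRESHOLD_MS = 3.0  # invented cutoff for this sketch

runts = [t for t in frame_times_ms if t < RUNT_THRESHOLD_MS]

# A naive average-FPS calculation counts every frame...
naive_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
# ...but discarding runts approximates what the user actually perceives.
perceived_fps = 1000 * (len(frame_times_ms) - len(runts)) / sum(frame_times_ms)

print(f"{len(runts)} runts, naive {naive_fps:.0f} FPS, perceived {perceived_fps:.0f} FPS")
```

With half the frames being runts, the naive number is roughly double the perceived one, which is exactly the "useless second GPU" situation described in the quote above.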
NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enables smoother, more consistent frame times and thus smoother animation on the screen. For GeForce cards, frame metering began as a software solution but was actually integrated as a hardware function in the Fermi design, taking some load off of the driver.
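Conceptually, frame metering can be modeled as a tiny scheduler that delays presentation to smooth the cadence. This is my own simplified model for illustration, not NVIDIA's actual algorithm:

```python
def meter(finish_times_ms, target_interval_ms):
    """Hold each finished frame until at least target_interval_ms after the
    previous presentation, smoothing the on-screen cadence."""
    presented = []
    next_slot = 0.0
    for t in finish_times_ms:
        present_at = max(t, next_slot)  # a frame can't be shown before it's done
        presented.append(present_at)
        next_slot = present_at + target_interval_ms
    return presented

# Bursty AFR-style completions: the two GPUs finish their frames almost
# back-to-back (times in ms, invented for this sketch).
raw = [0.0, 2.0, 33.0, 35.0, 66.0, 68.0]
metered = meter(raw, target_interval_ms=16.5)
gaps = [round(b - a, 1) for a, b in zip(metered, metered[1:])]
print(gaps)  # -> [16.5, 16.5, 16.5, 16.5, 16.5]
```

The raw gaps alternate between 2 ms and 31 ms; after metering, every frame arrives a steady 16.5 ms apart, at the cost of a small added latency on the "early" frame of each pair.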
Subject: Graphics Cards | April 14, 2013 - 11:59 PM | Tim Verry
Tagged: windforce, nvidia, gtx titan, gtx 680, gpu cooler, gigabyte
Earlier this week, PC component manufacturer Gigabyte showed off its new graphics card cooler at its New Idea Tech Tour event in Berlin, Germany. The new triple slot cooler is built for this generation's highest-end graphics cards. It is capable of cooling cards with up to 450W TDPs while keeping them cooler and quieter than reference heatsinks.
The Gigabyte WindForce 450W cooler is a triple slot design that combines a large heatsink with three 80mm fans. The heatsink features two aluminum fin arrays connected to the GPU block by three 10mm copper heatpipes. Gigabyte stated during the cooler's reveal that it keeps an NVIDIA GTX 680 graphics card 2°C cooler and 23.3 dB quieter during a Furmark benchmark run. Further, the cooler will allow high end cards, like the GTX Titan, to achieve higher (stable) boost clocks.
ComputerBase.de was on hand at Gigabyte's event in Berlin to snap shots of the upcoming GPU cooler.
The company has not announced which graphics cards will use the new cooler or when it will be available, but a Gigabyte GTX 680 and a custom-cooled Titan seem likely candidates considering both cards were mentioned in the examples given during the presentation. Note that NVIDIA has so far prohibited AIB partners from putting custom coolers on the Titan, but other rumored Titan graphics cards with custom coolers suggest that the company will allow custom-cooled Titans to be sold at retail at some point. Beyond the top-end NVIDIA cards, I think a GTX 670 or GTX 660 Ti using this cooler would also be great, as it would likely be one of the quieter options available (you could spin the three 80mm fans much slower than the single reference fan and still get the same temperatures).
What do you think about Gigabyte's new 450W GPU cooler? You can find more photos over at Computer Base (computerbase.de).
Summary Thus Far
Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons. Here is the schedule:
- 3/27: Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing
- 3/27: Radeon HD 7970 GHz Edition vs GeForce GTX 680 (Single and Dual GPU)
- 3/30: AMD Radeon HD 7990 vs GeForce GTX 690 vs GeForce GTX Titan
- 4/2: Radeon HD 7950 vs GeForce GTX 660 Ti (Single and Dual GPU)
- 4/5: Radeon HD 7870 GHz Edition vs GeForce GTX 660 (Single and Dual GPU)
Welcome to the second in our initial series of articles focusing on Frame Rating, our new graphics and GPU performance testing methodology that drastically changes how the community looks at single and multi-GPU performance. In this article we are focusing on a different set of graphics cards: the highest performing single-card options on the market, including the GeForce GTX 690 4GB dual-GK104 card and the GeForce GTX Titan 6GB GK110-based monster, as well as the Radeon HD 7990, though in an emulated form. The HD 7990 was only recently officially announced by AMD at this year's Game Developers Conference, but its specifications are going to closely match what we have on the testbed today: a pair of retail Radeon HD 7970s in CrossFire.
Will the GTX Titan look as good in Frame Rating as it did upon its release?
If you are just joining this article series today, you have missed a lot! If nothing else you should read our initial full release article that details everything about the Frame Rating methodology and why we are making this change to begin with. In short, we are moving away from using FRAPS for average frame rates or even frame times; instead we use a secondary hardware capture system to record all the frames of our game play as they would be displayed to the gamer, then perform post-process analysis on that recorded file to measure real-world performance.
Because FRAPS measures frame times at a different point in the game pipeline (closer to the game engine), its results can vary dramatically from what is presented to the end user on their display. Frame Rating solves that problem by recording video through a dual-link DVI capture card that emulates a monitor to the test system. By applying a unique overlay color to each frame the game produces, we can gather a new kind of information that tells a very unique story.
The capture card that makes all of this work possible.
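The analysis step works roughly like this sketch (the color names and the 12-scanline "capture" are my own stand-ins; the real tool scans the overlay color bar on every scanline of the recorded video):

```python
def frame_heights(scanline_colors):
    """Group consecutive scanlines carrying the same overlay color into
    frames and return each frame's height in scanlines."""
    heights = []
    prev = None
    for color in scanline_colors:
        if color == prev:
            heights[-1] += 1  # same frame continues on the next scanline
        else:
            heights.append(1)  # overlay color changed: a new frame begins
            prev = color
    return heights

# A made-up 12-scanline slice of one refresh: a normal frame, a runt
# squeezed in by the second GPU, then another normal frame.
column = ["red"] * 5 + ["lime"] * 1 + ["blue"] * 6
print(frame_heights(column))  # -> [5, 1, 6]
```

A frame that occupies only a scanline or two of the output, like the 1-line "lime" frame here, is exactly what the runt-frame analysis flags.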
I don't want to spend too much time on this part of the story here as I already wrote a solid 16,000 words on the topic in our first article and I think you'll really find the results fascinating. So, please check out my first article on the topic if you have any questions before diving into these results today!
| Test System Setup | |
|---|---|
| CPU | Intel Core i7-3960X Sandy Bridge-E |
| Motherboard | ASUS P9X79 Deluxe |
| Memory | Corsair Dominator DDR3-1600 16GB |
| Hard Drive | OCZ Agility 4 256GB SSD |
| Graphics Cards | NVIDIA GeForce GTX TITAN 6GB; NVIDIA GeForce GTX 690 4GB; AMD Radeon HD 7970 CrossFire 3GB |
| Graphics Drivers | AMD: 13.2 beta 7; NVIDIA: 314.07 beta (GTX 690); NVIDIA: 314.09 beta (GTX TITAN) |
| Power Supply | Corsair AX1200i |
| Operating System | Windows 8 Pro x64 |
On to the results!
Subject: General Tech | February 28, 2013 - 08:45 PM | Ken Addison
Tagged: video, titan, sli, R5000, podcast, nvidia, H90, H110, gtx titan, frame rating, firepro, crossfire, amd
PC Perspective Podcast #240 - 02/28/2013
Join us this week as we discuss GTX TITAN Benchmarks, Frame Rating, Tegra 4 Details and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano
Program length: 1:24:28
Podcast topics of discussion:
- 0:01:18 PCPer Podcast BINGO!
- Week in Reviews:
- 0:40:30 This Podcast is brought to you by MSI!
News items of interest:
- 0:41:45 New Offices coming for NVIDIA
- 0:45:00 Chromebook Pixel brings high-res to high-price
- 0:48:00 GPU graphics market updates from JPR
- 0:55:45 Tegra 4 graphics details from Mobile World Congress
- 1:01:00 Unreal Engine 4 on PS4 has reduced quality
- 1:04:10 Micron SAS SSDs
- 1:08:25 AMD FirePro R5000 PCoIP Card
- 1:13:35 Hardware / Software Pick of the Week
- Ryan: NOT this 3 port HDMI switch
- Jeremy: Taxidermy + PICAXE, why didn't we think of this before?
- Josh: Still among my favorite headphones
- Allyn: Cyto
- 1-888-38-PCPER or email@example.com
- http://twitter.com/ryanshrout and http://twitter.com/pcper