Podcast #248 - AMD HD 7990, CrossFire Frame Rating improvements, 4K TVs and more!

Subject: General Tech | April 25, 2013 - 02:13 PM |
Tagged: video, Xe, seiki, raidr, podcast, nvidia, Never Settle, hd 7990, GA-Z77N-WiFi, frame rating, crossfire, amd, 4k

PC Perspective Podcast #248 - 04/25/2013

Join us this week as we discuss AMD HD 7990, CrossFire Frame Rating improvements, 4K TVs and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 1:16:34

  1. 0:01:20 Update on Indiegogo: You guys rock!
  2. Week in Review:
  3. News items of interest:
    1. Ryan: Seiki 4K TV - more support from enthusiasts! and wet puppies
    2. Jeremy: This is not news, people; NFC is a feature, but if you are paranoid you can check with this app
    3. Allyn: Put your bits on an ioSafe. Put your 'papers' here.
    4. Tim: BT Sync, it's in public alpha now so go grab it!
  4. 1-888-38-PCPER or podcast@pcper.com
  5. Closing/outro

 

NVIDIA's plans for Tegra and Tesla

Subject: General Tech | April 24, 2013 - 01:38 PM |
Tagged: Steve Scott, nvidia, HPC, tesla, logan, tegra

The Register had a chance to sit down with Steve Scott, once CTO of Cray and now CTO of NVIDIA's Tesla projects, to discuss the future of their add-in cards as well as that of x86 in the server room.  They discussed Tegra and why it is not receiving the same amount of attention at NVIDIA as Tesla is, as well as some of the fundamental differences between the chips both currently and going forward.  NVIDIA plans to unite GPU and CPU on both families of chips, likely with a custom interface as opposed to placing them on the same die, though both will continue to be designed for very different functions.  A lot of the article focuses on Tegra, its memory bandwidth and, most importantly, its networking capabilities, as it seems NVIDIA is focused on the server room and on providing hundreds or thousands of interconnected Tegra processors to compete directly with x86 offerings.  Read on for the full interview.

ELreg_nvidia_echelon_system.jpg

"Jen-Hsun Huang, co-founder and CEO of Nvidia has been perfectly honest about the fact that the graphics chip maker didn't intend to get into the supercomputing business. Rather, it was founded by a bunch of gamers who wanted better graphics cards to play 3D games. Fast forward two decades, though, and the Nvidia Tesla GPU coprocessor and the CUDA programming environment have taken the supercomputer world by storm."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register
Author:
Manufacturer: Various

A very early look at the future of Catalyst

Today is a very interesting day for AMD.  It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to the CrossFire technology that will improve animation performance across the board.  Both stories are incredibly interesting and, as it turns out, both feed off each other in a very important way: the HD 7990 depends on CrossFire and CrossFire depends on this driver. 

If you already read our review (or any review that is using the FCAT / frame capture system) of the Radeon HD 7990, you likely came away somewhat unimpressed.  The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have been an incredibly exciting release for us and would likely have become the single fastest graphics card on the planet.  That didn't happen though, and our results clearly state why: AMD CrossFire technology has some serious issues with animation smoothness, runt frames and giving users what they are promised. 

Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February.  Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what it would mean to gamers to implement frame metering technology.  We followed that story up with several more that showed the current state of performance on the GPU market using Frame Rating, and they painted CrossFire in a very negative light.  Even though some outlets accused us of being biased, or claimed that AMD wasn't doing anything incorrectly, we stuck by our results, and as it turns out, so does AMD. 

Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered. 

If you are just catching up on the story, you really need some background information.  The best place to start is our article published in late March that goes into detail about how game engines work, how our completely new testing methods work and the problems with AMD CrossFire technology very specifically.  From that piece:

It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand.  As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, in many cases in a nearly perfect alternating pattern.  Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case.  Obviously the question then becomes, “In Battlefield 3, does it even make sense to use a CrossFire configuration?”  My answer based on the below graph would be no.

runt.jpg

An example of a runt frame in a CrossFire configuration

NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enabled smoother, more consistent frame times and thus smoother animations on the screen.  For GeForce cards, frame metering began as a software solution but was actually integrated as a hardware function on the Fermi design, taking some load off of the driver.
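NVIDIA has not published the internals of frame metering, but the core idea is simple: hold back frames that finish rendering too early so that presentation intervals stay close to the recent average. The sketch below is a hypothetical illustration of that idea (the function name, the moving-average target, and the sample numbers are our assumptions), not NVIDIA's actual algorithm:

```python
# Illustrative frame-metering sketch (NOT NVIDIA's actual algorithm).
# Frames from two GPUs in AFR often arrive in a fast/slow alternating
# pattern; metering holds early frames back so that presentation
# intervals approach the recent average frame time.

def meter_frames(render_times, window=4):
    """Given the times (ms) at which frames finish rendering, return
    the times at which a metered driver would present them."""
    present_times = []
    intervals = []
    for t in render_times:
        if not present_times:
            present_times.append(t)
            continue
        # Target interval: moving average of recent presentation gaps.
        recent = intervals[-window:]
        target = sum(recent) / len(recent) if recent else 0.0
        earliest = present_times[-1] + target
        present = max(t, earliest)  # never present before the frame exists
        intervals.append(present - present_times[-1])
        present_times.append(present)
    return present_times

# Alternating 5 ms / 28 ms arrivals, a classic AFR runt pattern:
arrivals = [0.0, 5.0, 33.0, 38.0, 66.0, 71.0, 99.0]
metered = meter_frames(arrivals)
```

Run on the alternating pattern above, the raw presentation gaps (5, 28, 5, 28, ...) smooth out toward an even cadence, which is the whole point of metering: the same number of frames, but with far more consistent on-screen timing.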

Continue reading our article on the new prototype driver from AMD to address frame pacing issues in CrossFire!!

New GeForce Game-Ready Drivers Just in Time for 'Dead Island: Riptide,' 'Star Trek', 'Neverwinter'; Boost Performance up to 20%

Subject: Graphics Cards | April 23, 2013 - 03:53 PM |
Tagged: nvidia, graphics drivers, geforce, 320.00 beta

NVIDIA rolled out a new set of beta drivers that provide up to 20% faster performance and have been optimized for a handful of new titles, including Dead Island: Riptide, Star Trek, and Neverwinter.

GeForce 320.00 beta drivers are now available for automatic download and installation using GeForce Experience, the easiest way to keep your drivers up to date.

With a single click in GeForce Experience, gamers can also optimize the image quality of top new games like Dead Island: Riptide and have it instantly tuned to take full advantage of their PC’s hardware.

Here are examples of the performance increases in GeForce 320.00 drivers (measured with GeForce GTX 660):

  • Up to 20% in Dirt: Showdown
  • Up to 18% in Tomb Raider
  • Up to 8% in StarCraft II
  • Up to 6% in other top games like Far Cry 3

For more details, refer to the release highlights on the driver download pages and read the GeForce driver article on GeForce.com.

Enjoy the new GeForce Game Ready drivers and let us know what you think.

nvidia-geforce-320-00-beta-drivers-gtx-680-performance.png

Windows Vista/Windows 7 Fixed Issues

The Windows 7 Magnifier window flickers. [1058231]

Games default to stereoscopic 3D mode after installing the driver. [1261633]

[GeForce 330M][Notebook]: The display goes blank when rebooting the notebook after installing the driver. [1239252]

[Crysis 3]: There are black artifacts in the game. [1251495]

[Dirt 3]: When ambient occlusion is enabled, there is rendering corruption in the game while in split-screen mode. [1253727]

[3DTV Play][Mass Effect]: The NVIDIA Control Panel “override antialiasing” setting does not work when stereoscopic 3D is enabled. [1220312]

[Microsoft Flight Simulator]: Level D Simulations add-on aircraft gauges are not drawn correctly. [899771]

[GeForce 500 series][Stereoscopic 3D][Two World 2]: The application crashes when switching to windowed mode with stereoscopic 3D enabled. [909749]

[GeForce 660 Ti][All Points Bulletin (APB) Reloaded]: The game crashes occasionally, followed by a black/grey/red screen. [1042342]

[Geforce GTX 680][Red Orchestra 2 Heroes of Stalingrad]: Red-screen crash occurs after exiting the game. [1021046]

[GeForce 6 series][Final Fantasy XI]: TDR crash occurs in the game when using the Smite of Rage ability. [1037744]

[SLI][Surround][GeForce GTX Titan][Tomb Raider]: There is corruption in the game and the system hangs when played at high resolution and Ultra or Ultimate settings. [1254359]

[3D Surround, SLI][GeForce 500 Series]: With Surround enabled, all displays may not be activated when selecting Activate All Displays from the NVIDIA Control Panel -> Set SLI Configuration page. [905544]

[SLI][Starcraft II][3D Vision]: The game crashes when run with 3D Vision enabled. [1253206]

[SLI][GeForce GTX 680][Tomb Raider (2013)]: The game crashes and TDR occurs while running the game at Ultra settings. [1251578]

[SLI][Call of Duty: Black Ops 2]: The player emblems are not drawn correctly.

Source: NVIDIA

NVIDIA Bundles Metro: Last Light with GTX 660 and Higher

Subject: Graphics Cards | April 16, 2013 - 10:24 PM |
Tagged: nvidia, metro last light, Metro

Late this evening we got word from NVIDIA about an update to its game bundle program for GeForce GTX 600 series cards.  Replacing the previously running Free to Play bundle, which included $50 in credit each for World of Tanks, Hawken, and Planetside 2, NVIDIA is moving back to a AAA game bundle with Metro: Last Light.

metro.jpg

Metro: Last Light is the sequel to the surprise hit from 2010, Metro 2033, and I am personally really looking forward to the game and to seeing how it can stress PC hardware like the first did. 

metro2.jpg

This bundle is only good for GTX 660 cards and above with the GTX 650 Ti sticking with the Free to Play $75 credit offer.

NVIDIA today announced that gamers who purchase an NVIDIA GeForce GTX 660 or above will also receive a copy of the highly anticipated Metro: Last Light, published by Deep Silver and the sequel to the multi-award-winning Metro 2033. Metro: Last Light will be available May 14, 2013 within the US and May 17, 2013 across Europe.

The deal is already up and running on Newegg.com but with the release date of Metro: Last Light set at May 14th, you'll have just about a month to wait before you can get your hands on it.

How do you think this compares to AMD's currently running bundle with Bioshock Infinite and more?  Did NVIDIA step up its game this time around?

Source: NVIDIA

NVIDIA Rumored To Release 700-Series GeForce Cards At Computex 2013

Subject: Graphics Cards | April 15, 2013 - 03:34 PM |
Tagged: rumor, nvidia, kepler, gtx 700, geforce 700, computex

Recent rumors seem to suggest that NVIDIA will release its desktop-class GeForce 700 series of graphics cards later this year. The new cards will reportedly be faster than the currently-available GTX 600 series, but will likely remain based on the company's Kepler architecture.

NVIDIA GeForce Logo.jpg

According to the information presented during NVIDIA's GTC keynote, its Kepler architecture will dominate 2012 and 2013, and the company will then follow up with Maxwell-based cards in 2014. Notably absent from the slides are product names, meaning the publicly-available information at least leaves open the possibility of a refreshed Kepler GTX 700 lineup in 2013.

Fudzilla further reports that NVIDIA will release the cards as soon as May 2013, with an official launch as soon as Computex. Having actual cards available for sale by Computex is a bit unlikely, but a summer launch could be possible if the new 700 series is merely a tweaked Kepler-based design with higher clocks and/or lower power usage. The company is rumored to be accelerating the launch of the GTX 700 series in the desktop space in response to AMD's heavy game-bundle marketing, which seems to be working well at persuading gamers to choose the red team.

What do you make of this rumor? Do you think a refreshed Kepler is coming this year?

Source: Fudzilla

A New Gigabyte WindForce 450W GPU Cooler May Be Coming to a GTX 680 and Titan Near You

Subject: Graphics Cards | April 14, 2013 - 07:59 PM |
Tagged: windforce, nvidia, gtx titan, gtx 680, gpu cooler, gigabyte

Earlier this week, PC component manufacturer Gigabyte showed off its new graphics card cooler at its New Idea Tech Tour event in Berlin, Germany. The new triple slot cooler is built for this generation's highest-end graphics cards. It is capable of cooling cards with up to 450W TDPs while keeping them cooler and quieter than reference heatsinks.

The Gigabyte WindForce 450W cooler is a triple slot design that combines a large heatsink with three 80mm fans. The heatsink features two aluminum fin arrays connected to the GPU block by three 10mm copper heatpipes. Gigabyte stated during the cooler's reveal that it keeps an NVIDIA GTX 680 graphics card 2°C cooler and 23.3 dB quieter during a Furmark benchmark run. Further, the cooler will allow these high end cards, like the GTX Titan, to achieve higher (stable) boost clocks.

Gigabyte WindForce 450W GPU Cooler for NVIDIA GTX Titan and GTX 680 Graphics Cards.jpg

ComputerBase.de was on hand at Gigabyte's event in Berlin to snap shots of the upcoming GPU cooler.

The company has not announced which graphics cards will use the new cooler or when it will be available, but a Gigabyte GTX 680 and a custom-cooled Titan seem to be likely candidates considering these cards were mentioned in the examples given in the presentation. Note that NVIDIA has prohibited AIB partners from putting custom coolers on the Titan thus far, but other rumored Titan graphics cards with custom coolers seem to suggest that the company will allow custom-cooled Titans to be sold at retail at some point. In addition to using it for the top-end NVIDIA cards, I think a GTX 670 or GTX 660 Ti using this cooler would also be great, as it would likely be one of the quieter running options available (because you could spin the three 80mm fans much slower than the single reference fan and still get the same temperatures).

What do you think about Gigabyte's new 450W GPU cooler? You can find more photos over at Computer Base (computerbase.de).

SECO Introduces mITX GPU Devkit for CUDA Programmers

Subject: General Tech | April 12, 2013 - 02:08 AM |
Tagged: SECO, nvidia, mini ITX, kepler, kayla, GTC 13, GTC, CUDA, arm

Last month, NVIDIA revealed its Kayla development platform, which combines a quad core Tegra System on a Chip (SoC) with an NVIDIA Kepler GPU. Kayla will be out later this year, but that has not stopped other board makers from putting together their own solutions. One such solution, which began shipping earlier this week, is the mITX GPU Devkit from SECO.

The new mITX GPU Devkit is a hardware platform for developers to program CUDA applications for mobile devices, desktops, workstations, and HPC servers. It combines an NVIDIA Tegra 3 processor, 2GB of RAM, and 4GB of internal storage (eMMC) on a Qseven module with a Mini-ITX form factor motherboard. Developers can then plug their own CUDA-capable graphics card into the single PCI-E 2.0 x16 slot (which actually runs at x4 speeds). Additional storage can be added via an internal SATA connection, and cameras can be hooked up using the CIC headers.

SECO mITX GPU DEVKIT.jpg

Rear IO on the mITX GPU Devkit includes:

  • 1 x Gigabit Ethernet
  • 3 x USB
  • 1 x OTG port
  • 1 x HDMI
  • 1 x Display Port
  • 3 x Analog audio
  • 2 x Serial
  • 1 x SD card slot

The SECO platform is proving to be popular for GPGPU in the server space, especially with systems like Pedraforca. The intention of using these types of platforms in servers is to save power by using a low power ARM chip for inter-node communication and basic tasks while the real computing is done solely on the graphics cards. With Intel’s upcoming Haswell-based Xeon chips getting down to 13W TDPs, though, systems like this are going to be more difficult to justify. SECO is mostly positioning this platform as a development board, however. One use in that respect is to begin optimizing GPU-accelerated code for mobile devices. With future Tegra chips set to get CUDA-compatible graphics, new software development and optimization of existing GPGPU code for smartphones and tablets will be increasingly important.

SECO mITX GPU DEVKIT box.jpg

Either way, the SECO mITX GPU Devkit is available now for 349 EUR or approximately $360 (in both cases, before any taxes).

Source: SECO
Author:
Manufacturer: PC Perspective

What to look for and our Test Setup

Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons.  Here is the schedule:

 

Today marks the conclusion of our first complete round up of Frame Rating results, the culmination of testing that was started 18 months ago.  Hopefully you have caught our other articles on the subject at hand, and you really will need to read up on the Frame Rating Dissected story above to truly understand the testing methods and results shown in this article.  Use the links above to find the previous articles!

To round out our Frame Rating testing in this iteration, we are looking at more cards further down the product stack in two different sets.  The first comparison will look at the AMD Radeon HD 7870 GHz Edition and the NVIDIA GeForce GTX 660 graphics cards in both single and dual-card configurations.  Just like we saw with our HD 7970 vs GTX 680 and our HD 7950 vs GTX 660 Ti testing, evaluating how the GPUs compare in our new and improved testing methodology in single GPU configurations is just as important as testing in SLI and CrossFire.  The GTX 660 ($199 at Newegg.com) and the HD 7870 ($229 at Newegg.com) are the closest matches in terms of pricing, though both cards have some interesting game bundle options as well.

7870.jpg

AMD's Radeon HD 7870 GHz Edition

Our second set of results will only be looking at single GPU performance numbers for lower cost graphics cards: the AMD Radeon HD 7850 and Radeon HD 7790, and from NVIDIA the GeForce GTX 650 Ti and GTX 650 Ti BOOST.  We didn't include multi-GPU results on these cards simply due to time constraints internally and because we are eager to move on to further Frame Rating testing and input testing. 

gtx660.jpg

NVIDIA's GeForce GTX 660


If you are just joining this article series today, you have missed a lot!  If nothing else you should read our initial full release article that details everything about the Frame Rating methodology and why we are making this change to begin with.  In short, we are moving away from using FRAPS for average frame rates. We are using a secondary hardware capture system to record each frame of game play as the monitor would receive them. That recorded video is then analyzed to measure real world performance.

Because FRAPS measures frame times at a different point in the game pipeline (closer to the game engine), its results can vary dramatically from what is presented to the end user on their display.  Frame Rating solves that problem by recording video through a dual-link DVI capture card that emulates a monitor to the testing system.  By applying a unique overlay color to each frame the game produces, we can gather a new kind of information that tells a unique story.
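To make the idea concrete, here is a rough sketch of how per-frame overlay data like this could be turned into statistics. The input format, the scanline threshold for runts, and the function names are our illustrative assumptions; the real FCAT-style pipeline extracts this information from the captured video itself:

```python
# Hypothetical post-processing sketch of Frame Rating-style data.
# Assume the capture step has already reduced each rendered frame to
# the number of scanlines its overlay color occupied on screen. A
# frame that occupies only a sliver of the display is a "runt": it
# inflates a FRAPS-style frame rate but adds almost nothing to the
# animation the user actually sees.

RUNT_THRESHOLD = 21  # scanlines; frames shorter than this count as runts (our assumption)

def classify_frames(scanlines_per_frame, refresh_ms=16.7, total_lines=1080):
    """Convert per-frame scanline counts into on-screen display times
    (ms) and flag runt frames."""
    display_times = [n / total_lines * refresh_ms for n in scanlines_per_frame]
    runts = [n < RUNT_THRESHOLD for n in scanlines_per_frame]
    return display_times, runts

# Alternating tall/runt pattern like the CrossFire captures described above:
sample = [1070, 10, 1065, 15, 1072, 8]
times, runts = classify_frames(sample)

reported_fps = 1000.0 * len(times) / sum(times)                  # counts every frame
effective_fps = 1000.0 * (len(times) - sum(runts)) / sum(times)  # discards runts
```

For this sample, half the frames are runts, so the frame rate a frame counter would report is roughly double the effective rate once runts are discarded, which is exactly the gap between "FPS on paper" and perceived smoothness that Frame Rating is designed to expose.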

card1.jpg

The capture card that makes all of this work possible.

I don't want to spend too much time on this part of the story here as I already wrote a solid 16,000 words on the topic in our first article and I think you'll really find the results fascinating.  So, please check out my first article on the topic if you have any questions before diving into these results today!

Test System Setup

  • CPU: Intel Core i7-3960X Sandy Bridge-E
  • Motherboard: ASUS P9X79 Deluxe
  • Memory: Corsair Dominator DDR3-1600 16GB
  • Hard Drive: OCZ Agility 4 256GB SSD
  • Sound Card: On-board
  • Graphics Cards: NVIDIA GeForce GTX 660 2GB, AMD Radeon HD 7870 2GB, NVIDIA GeForce GTX 650 Ti 1GB, NVIDIA GeForce GTX 650 Ti BOOST 2GB, AMD Radeon HD 7850 2GB, AMD Radeon HD 7790 1GB
  • Graphics Drivers: AMD 13.2 beta 7, NVIDIA 314.07 beta
  • Power Supply: Corsair AX1200i
  • Operating System: Windows 8 Pro x64

On to the results! 

Continue reading our review of the GTX 660 and HD 7870 using Frame Rating!!

Factory Overclocked ASUS GTX 660 Ti Dragon Pictured

Subject: Graphics Cards | April 3, 2013 - 11:24 AM |
Tagged: nvidia, kepler, gtx 660 Ti, 660 ti

Two new photos recently popped up on Cowcotland, showing off an unreleased "Dragon Edition" GTX 660 Ti graphics card from ASUS. The new card boasts some impressive factory overclocks on both the GPU and memory as well as a beefy heatsink and a new blue and black color scheme.

ASUS-nvidia-gtx-660-ti-dragon.jpg

The ASUS GTX 660 Ti Dragon will feature a custom cooler with two fans and an aluminum heatsink. The back of the card includes a metal backplate to secure the cooler and help dissipate a bit of heat itself. However, there is also a cutout in the backplate to allow for (likely) additional power management circuitry. The card also features the company's power phase technology, NVIDIA's GK104 GPU, and 2GB of GDDR5 memory. The graphics core is reportedly clocked at 1150MHz (no word on whether that is the base or boost figure) while the memory is overclocked to 6,100MHz. For comparison, the reference GTX 660 Ti clocks are 915MHz base, 980MHz boost, and 6,000MHz memory. The new card will support DVI, DisplayPort, and HDMI video outputs.

asus-nvidia-gtx-660-ti-dragon-1.jpg

There is no word on pricing or availability, but the Dragon looks like it will be one of the fastest GTX 660 Ti cards available when (if?) it is publicly released!

Source: Cowcotland