Subject: General Tech | April 24, 2013 - 01:38 PM | Jeremy Hellstrom
Tagged: Steve Scott, nvidia, HPC, tesla, logan, tegra
The Register had a chance to sit down with Steve Scott, once CTO of Cray and now CTO of NVIDIA's Tesla business, to discuss the future of the company's add-in cards as well as that of x86 in the server room. They discussed Tesla and why it is not receiving the same amount of attention at NVIDIA as Tegra is, as well as some of the fundamental differences between the chips, both currently and going forward. NVIDIA plans to unite GPU and CPU in both families of chips, likely with a custom interface rather than placing them on the same die, though the two will continue to be designed for very different functions. A lot of the article focuses on Tegra, its memory bandwidth, and most importantly its networking capabilities, as NVIDIA seems focused on the server room and on providing hundreds or thousands of interconnected Tegra processors to compete directly with x86 offerings. Read on for the full interview.
"Jen-Hsun Huang, co-founder and CEO of Nvidia has been perfectly honest about the fact that the graphics chip maker didn't intend to get into the supercomputing business. Rather, it was founded by a bunch of gamers who wanted better graphics cards to play 3D games. Fast forward two decades, though, and the Nvidia Tesla GPU coprocessor and the CUDA programming environment have taken the supercomputer world by storm."
Here is some more Tech News from around the web:
- AMD pins future growth to embedded marketplace @ The Register
- AMD announces new embedded G-series SoC @ DigiTimes
- TSMC captures almost 50 percent of foundry market thanks to 28nm demand @ The Inquirer
- $45 BeagleBone Black Keeps Eyes on the Pi's @ Linux.com
- BlackBerry OS 10.1 leaks its secret goo over all the web @ The Register
- Samsung MV900F Wi-Fi 16.3MP Digital Camera Review @ ModSynergy
- i’m Watch: A Smartwatch Review @ TechwareLabs
A very early look at the future of Catalyst
Today is a very interesting day for AMD. It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to CrossFire technology that will improve animation performance across the board. Both stories are incredibly interesting, and as it turns out they feed off each other in a very important way: the HD 7990 depends on CrossFire, and CrossFire depends on this driver.
If you already read our review (or any review that used the FCAT / frame capture system) of the Radeon HD 7990, you likely came away somewhat unimpressed. The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have made for an incredibly exciting release for us, and the card would likely have become the single fastest graphics card on the planet. That didn't happen, though, and our results clearly show why: AMD CrossFire technology has some serious issues with animation smoothness, runt frames, and giving users what they are promised.
Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February. Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what implementing frame metering technology would mean for gamers. We followed that story up with several more that showed the current state of performance on the GPU market using Frame Rating, and they painted CrossFire in a very negative light. Even though some outlets accused us of being biased, or insisted that AMD wasn't doing anything incorrectly, we stuck by our results, and as it turns out, so does AMD.
Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered.
If you are just catching up on the story, you really need some background information. The best place to start is our article published in late March that goes into detail about how game engines work, how our completely new testing methods work and the problems with AMD CrossFire technology very specifically. From that piece:
It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand. As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, in many cases in a nearly perfect alternating pattern. Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case. Obviously the question then becomes, “In Battlefield 3, does it even make sense to use a CrossFire configuration?” My answer, based on the below graph, would be no.
An example of a runt frame in a CrossFire configuration
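The runt-frame idea above can be sketched in a few lines of Python. This is a hypothetical illustration, not our actual FCAT tooling: the 21-scanline threshold and the sample data are made up, but they show why an alternating full/runt pattern means the second GPU adds almost nothing to perceived performance.

```python
# Hypothetical sketch: flagging "runt" frames from capture data.
# Input: the number of visible scanlines each rendered frame occupied in
# the captured video. The threshold value is illustrative only.

RUNT_THRESHOLD = 21  # scanlines; frames shorter than this count as runts

def find_runts(frame_heights, threshold=RUNT_THRESHOLD):
    """Return indices of frames too small to contribute to animation."""
    return [i for i, h in enumerate(frame_heights) if h < threshold]

def observed_fps(frame_heights, raw_fps, threshold=RUNT_THRESHOLD):
    """Discount runts from a raw FPS figure: only full frames count."""
    full = sum(1 for h in frame_heights if h >= threshold)
    return raw_fps * full / len(frame_heights)

# An alternating full/runt pattern, as seen in some CrossFire captures:
heights = [520, 8, 515, 11, 498, 9, 530, 12]
print(find_runts(heights))          # -> [1, 3, 5, 7]
print(observed_fps(heights, 80.0))  # -> 40.0
```

In other words, a tool like FRAPS would report 80 FPS for this capture, while only half of those frames were large enough on screen to matter.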
NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enables smoother, more consistent frame times and thus smoother animation on the screen. For GeForce cards, frame metering began as a software solution but was later integrated as a hardware function in the Fermi design, taking some load off of the driver.
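The core idea behind frame metering can be sketched as follows. This is an illustrative model, not NVIDIA's actual implementation: it simply delays frames that arrive too early so that the gap between displayed frames tracks a running average rather than the raw, uneven render times.

```python
# Illustrative frame-metering sketch (NOT NVIDIA's real algorithm).
# Frames that finish rendering too soon after the previous one are held
# back, so presentation intervals stay close to a smoothed average gap.

def meter_frames(render_times, smoothing=0.9):
    """Given timestamps (ms) when each frame finished rendering, return
    the timestamps at which a metering driver would present them."""
    presented = [render_times[0]]
    avg_gap = render_times[1] - render_times[0]
    for t in render_times[1:]:
        raw_gap = t - presented[-1]
        # Blend the raw gap toward the running average gap.
        avg_gap = smoothing * avg_gap + (1 - smoothing) * raw_gap
        # A frame can never be shown before it finishes rendering.
        presented.append(max(t, presented[-1] + avg_gap))
    return presented

# Alternate-frame rendering often delivers frames in uneven pairs:
raw = [0.0, 10.0, 12.0, 30.0, 32.0, 50.0]   # gaps of 10, 2, 18, 2, 18 ms
metered = meter_frames(raw)                 # gaps all land near 8-12 ms
```

The trade-off, as the article notes, is that metering adds a small amount of latency between the game engine and the display in exchange for smoother animation.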
New GeForce Game-Ready Drivers Just in Time for 'Dead Island: Riptide,' 'Star Trek', 'Neverwinter'; Boost Performance up to 20%
Subject: Graphics Cards | April 23, 2013 - 03:53 PM | Jeremy Hellstrom
Tagged: nvidia, graphics drivers, geforce, 320.00 beta
GeForce 320.00 beta drivers are now available for automatic download and installation using GeForce Experience, the easiest way to keep your drivers up to date.
With a single click in GeForce Experience, gamers can also optimize the image quality of top new games like Dead Island: Riptide and have it instantly tuned to take full advantage of their PC’s hardware.
Here are examples of the performance increases in GeForce 320.00 drivers (measured with GeForce GTX 660):
- Up to 20% in Dirt: Showdown
- Up to 18% in Tomb Raider
- Up to 8% in StarCraft II
- Up to 6% in other top games like Far Cry 3
For more details, refer to the release highlights on the driver download pages and read the GeForce driver article on GeForce.com.
Enjoy the new GeForce Game Ready drivers and let us know what you think.
Windows Vista/Windows 7 Fixed Issues
The Windows 7 Magnifier window flickers. 
Games default to stereoscopic 3D mode after installing the driver. 
[GeForce 330M][Notebook]: The display goes blank when rebooting the notebook after installing the driver. 
[Crysis 3]: There are black artifacts in the game. 
[Dirt 3]: When ambient occlusion is enabled, there is rendering corruption in the game while in split-screen mode. 
[3DTV Play][Mass Effect]: The NVIDIA Control Panel “override antialiasing” setting does not work when stereoscopic 3D is enabled. 
[Microsoft Flight Simulator]: Level D Simulations add-on aircraft gauges are not drawn correctly. 
[GeForce 500 series][Stereoscopic 3D][Two Worlds 2]: The application crashes when switching to windowed mode with stereoscopic 3D enabled. 
[GeForce 660 Ti][All Points Bulletin (APB) Reloaded]: The game crashes occasionally, followed by a black/grey/red screen. 
[GeForce GTX 680][Red Orchestra 2 Heroes of Stalingrad]: Red-screen crash occurs after exiting the game. 
[GeForce 6 series][Final Fantasy XI]: TDR crash occurs in the game when using the Smite of Rage ability. 
[SLI][Surround][GeForce GTX Titan][Tomb Raider]: There is corruption in the game and the system hangs when played at high resolution and Ultra or Ultimate settings. 
[3D Surround, SLI], GeForce 500 Series: With Surround enabled, all displays may not be activated when selecting Activate All Displays from the NVIDIA Control Panel- > Set SLI Configuration page. 
[SLI][Starcraft II][3D Vision]: The game crashes when run with 3D Vision enabled. 
[SLI][GeForce GTX 680][Tomb Raider (2013)]: The game crashes and TDR occurs while running the game at Ultra settings. 
[SLI][Starcraft II][3D Vision]: The game crashes when played with 3D Vision and SLI enabled. 
[SLI][Call of Duty: Black Ops 2]: The player emblems are not drawn correctly.
Subject: Graphics Cards | April 16, 2013 - 10:24 PM | Ryan Shrout
Tagged: nvidia, metro last light, Metro
Late this evening we got word from NVIDIA about an update to its game bundle program for GeForce GTX 600 series cards. Replacing the previously running Free to Play bundle, which included $50 in credit for each of World of Tanks, Hawken, and Planetside 2, NVIDIA is moving back to a AAA game with Metro: Last Light.
Metro: Last Light is the sequel to the surprise 2010 hit Metro 2033, and I am personally really looking forward to the game and to seeing how it can stress PC hardware like the first one did.
This bundle is only good for GTX 660 cards and above with the GTX 650 Ti sticking with the Free to Play $75 credit offer.
NVIDIA today announced that gamers who purchase an NVIDIA GeForce GTX 660 or above will also receive a copy of the highly anticipated Metro: Last Light, published by Deep Silver and the sequel to the multi-award-winning Metro 2033. Metro: Last Light will be available May 14, 2013 within the US and May 17, 2013 across Europe.
The deal is already up and running on Newegg.com but with the release date of Metro: Last Light set at May 14th, you'll have just about a month to wait before you can get your hands on it.
How do you think this compares to AMD's currently running bundle with Bioshock Infinite and more? Did NVIDIA step up its game this time around?
Subject: Graphics Cards | April 15, 2013 - 03:34 PM | Tim Verry
Tagged: rumor, nvidia, kepler, gtx 700, geforce 700, computex
Recent rumors seem to suggest that NVIDIA will release its desktop-class GeForce 700 series of graphics cards later this year. The new card will reportedly be faster than the currently-available GTX 600 series, but will likely remain based on the company's Kepler architecture.
According to the information presented during NVIDIA's GTC keynote, its Kepler architecture will dominate 2012 and 2013, with Maxwell-based cards following in 2014. Notably absent from the slides are product names, meaning the publicly available information at least leaves open the possibility of a refreshed Kepler GTX 700 lineup in 2013.
Fudzilla further reports that NVIDIA will release the cards as soon as May 2013, with an official launch as soon as Computex. Having actual cards available for sale by Computex is a bit unlikely, but a summer launch could be possible if the new 700 series is merely a tweaked Kepler-based design with higher clocks and/or lower power usage. The company is rumored to be accelerating the launch of the GTX 700 series in the desktop space in response to AMD's heavy game-bundle marketing, which seems to be working well at persuading gamers to choose the red team.
What do you make of this rumor? Do you think a refreshed Kepler is coming this year?
Subject: Graphics Cards | April 14, 2013 - 07:59 PM | Tim Verry
Tagged: windforce, nvidia, gtx titan, gtx 680, gpu cooler, gigabyte
Earlier this week, PC component manufacturer Gigabyte showed off its new graphics card cooler at its New Idea Tech Tour event in Berlin, Germany. The new triple slot cooler is built for this generation's highest-end graphics cards. It is capable of cooling cards with up to 450W TDPs while keeping them cooler and quieter than reference heatsinks.
The Gigabyte WindForce 450W cooler is a triple slot design that combines a large heatsink with three 80mm fans. The heatsink features two aluminum fin arrays connected to the GPU block by three 10mm copper heatpipes. Gigabyte stated during the cooler's reveal that it keeps an NVIDIA GTX 680 graphics card 2°C cooler and 23.3 dB quieter during a Furmark benchmark run. Further, the cooler will allow high-end cards like the GTX Titan to achieve higher (stable) boost clocks.
ComputerBase.de was on hand at Gigabyte's event in Berlin to snap shots of the upcoming GPU cooler.
The company has not announced which graphics cards will use the new cooler or when it will be available, but a Gigabyte GTX 680 and a custom-cooled Titan seem to be likely candidates, considering these cards were mentioned in the examples given during the presentation. Note that NVIDIA has prohibited AIB partners from putting custom coolers on the Titan thus far, but other rumored Titan graphics cards with custom coolers seem to suggest that the company will allow custom-cooled Titans to be sold at retail at some point. In addition to its use on top-end NVIDIA cards, I think a GTX 670 or GTX 660 Ti using this cooler would also be great, as it would likely be one of the quieter running options available (because you could spin the three 80mm fans much slower than the single reference fan and still get the same temps).
What do you think about Gigabyte's new 450W GPU cooler? You can find more photos over at Computer Base (computerbase.de).
Subject: General Tech | April 12, 2013 - 02:08 AM | Tim Verry
Tagged: SECO, nvidia, mini ITX, kepler, kayla, GTC 13, GTC, CUDA, arm
Last month, NVIDIA revealed its Kayla development platform, which combines a quad core Tegra System on a Chip (SoC) with an NVIDIA Kepler GPU. Kayla will be out later this year, but that has not stopped other board makers from putting together their own solutions. One such solution that began shipping earlier this week is the mITX GPU Devkit from SECO.
The new mITX GPU Devkit is a hardware platform for developers to program CUDA applications for mobile devices, desktops, workstations, and HPC servers. It combines a NVIDIA Tegra 3 processor, 2GB of RAM, and 4GB of internal storage (eMMC) on a Qseven module with a Mini-ITX form factor motherboard. Developers can then plug their own CUDA-capable graphics card into the single PCI-E 2.0 x16 slot (which actually runs at x4 speeds). Additional storage can be added via an internal SATA connection, and cameras can be hooked up using the CIC headers.
Rear IO on the mITX GPU Devkit includes:
- 1 x Gigabit Ethernet
- 3 x USB
- 1 x OTG port
- 1 x HDMI
- 1 x Display Port
- 3 x Analog audio
- 2 x Serial
- 1 x SD card slot
The SECO platform is proving to be popular for GPGPU in the server space, especially with systems like Pedraforca. The intention of using these types of platforms in servers is to save power by using a low-power ARM chip for inter-node communication and basic tasks while the real computing is done solely on the graphics cards. With Intel’s upcoming Haswell-based Xeon chips getting down to 13W TDPs, though, systems like this are going to be more difficult to justify. SECO is mostly positioning this platform as a development board, however. One use in that respect is to begin optimizing GPU-accelerated code for mobile devices. With future Tegra chips set to get CUDA-capable graphics, new software development and optimization of existing GPGPU code for smartphones and tablets will be increasingly important.
Either way, the SECO mITX GPU Devkit is available now for 349 EUR or approximately $360 (in both cases, before any taxes).
What to look for and our Test Setup
Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons. Here is the schedule:
- 3/27: Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing
- 3/27: Radeon HD 7970 GHz Edition vs GeForce GTX 680 (Single and Dual GPU)
- 3/30: AMD Radeon HD 7990 vs GeForce GTX 690 vs GeForce GTX Titan
- 4/2: Radeon HD 7950 vs GeForce GTX 660 Ti (Single and Dual GPU)
- 4/5: Radeon HD 7870 GHz Edition vs GeForce GTX 660 (Single and Dual GPU)
- 4/16: Frame Rating: Visual Effects of Vsync on Gaming Animation
Today marks the conclusion of our first complete round up of Frame Rating results, the culmination of testing that was started 18 months ago. Hopefully you have caught our other articles on the subject at hand, and you really will need to read up on the Frame Rating Dissected story above to truly understand the testing methods and results shown in this article. Use the links above to find the previous articles!
To round out our Frame Rating testing in this iteration, we are looking at more cards further down the product stack, in two different sets. The first comparison will look at the AMD Radeon HD 7870 GHz Edition and the NVIDIA GeForce GTX 660 graphics cards in both single and dual-card configurations. Just like we saw with our HD 7970 vs GTX 680 and our HD 7950 vs GTX 660 Ti testing, evaluating how the GPUs compare in our new and improved testing methodology in single GPU configurations is just as important as testing in SLI and CrossFire. The GTX 660 ($199 at Newegg.com) and the HD 7870 ($229 at Newegg.com) are the closest matches in terms of pricing, though both cards have some interesting game bundle options as well.
AMD's Radeon HD 7870 GHz Edition
Our second set of results will only be looking at single GPU performance numbers for lower cost graphics cards: the AMD Radeon HD 7850 and Radeon HD 7790, and from NVIDIA the GeForce GTX 650 Ti and GTX 650 Ti BOOST. We didn't include multi-GPU results on these cards simply due to internal time constraints and because we are eager to move on to further Frame Rating testing and input testing.
NVIDIA's GeForce GTX 660
If you are just joining this article series today, you have missed a lot! If nothing else you should read our initial full release article that details everything about the Frame Rating methodology and why we are making this change to begin with. In short,
Because FRAPS measures frame times at a different point in the game pipeline (closer to the game engine) its results can vary dramatically from what is presented to the end user on their display. Frame Rating solves that problem by recording video through a dual-link DVI capture card that emulates a monitor to the testing system and by simply applying a unique overlay color on each produced frame from the game, we can gather a new kind of information that tells a very unique story.
The capture card that makes all of this work possible.
I don't want to spend too much time on this part of the story here as I already wrote a solid 16,000 words on the topic in our first article and I think you'll really find the results fascinating. So, please check out my first article on the topic if you have any questions before diving into these results today!
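To give a rough feel for the overlay technique described above: each rendered frame is tagged with a colored bar from a repeating sequence, and by reading the bar color on every scanline of the captured video you can tell exactly where one frame ends, where the next begins, and how much screen time each one actually received. The sketch below is a simplified illustration under assumed data, not our actual extraction tool:

```python
# Simplified sketch of FCAT-style overlay analysis (colors and frame
# layout are invented for illustration). Each displayed frame slice
# carries a distinct overlay color on its scanlines; collapsing the
# per-scanline colors into runs recovers each frame's on-screen size.

from itertools import groupby

def scanlines_per_frame(scanline_colors):
    """Collapse a column of per-scanline overlay colors into
    (color, scanline_count) runs -- one run per displayed frame slice."""
    return [(color, len(list(run))) for color, run in groupby(scanline_colors)]

# One captured 12-scanline slice of the display containing parts of
# three frames; the 1-scanline 'lime' frame is a runt:
column = ["red"] * 5 + ["lime"] * 1 + ["blue"] * 6
print(scanlines_per_frame(column))
# -> [('red', 5), ('lime', 1), ('blue', 6)]
```

This per-scanline accounting is exactly the kind of information FRAPS cannot see, since it hooks the game pipeline well before the frame reaches the display.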
Test System Setup
- CPU: Intel Core i7-3960X Sandy Bridge-E
- Motherboard: ASUS P9X79 Deluxe
- Memory: Corsair Dominator DDR3-1600 16GB
- Hard Drive: OCZ Agility 4 256GB SSD
- Graphics Cards: NVIDIA GeForce GTX 660 2GB, AMD Radeon HD 7870 2GB, NVIDIA GeForce GTX 650 Ti 1GB, NVIDIA GeForce GTX 650 Ti BOOST 2GB, AMD Radeon HD 7850 2GB, AMD Radeon HD 7790 1GB
- Graphics Drivers: AMD 13.2 beta 7, NVIDIA 314.07 beta
- Power Supply: Corsair AX1200i
- Operating System: Windows 8 Pro x64
On to the results!
Subject: Graphics Cards | April 3, 2013 - 11:24 AM | Tim Verry
Tagged: nvidia, kepler, gtx 660 Ti, 660 ti
Two new photos recently popped up on Cowcotland, showing off an unreleased "Dragon Edition" GTX 660 Ti graphics card from ASUS. The new card boasts some impressive factory overclocks on both the GPU and memory as well as a beefy heatsink and a new blue and black color scheme.
The ASUS GTX 660 Ti Dragon will feature a custom cooler with two fans and an aluminum heatsink. The back of the card includes a metal backplate to secure the cooler and help dissipate a bit of heat itself. However, there is also a cutout in the backplate, likely to allow for additional power management circuitry. The card also features the company's power phase technology, NVIDIA's GK104 GPU, and 2GB of GDDR5 memory. The graphics core is reportedly clocked at 1150MHz (no word on whether that is the base or boost figure) while the memory is overclocked to 6100MHz. For comparison, the reference GTX 660 Ti clocks are 915MHz base, 980MHz boost, and 6,000MHz memory. The new card will support DVI, DisplayPort, and HDMI video outputs.
There is no word on pricing or availability, but the Dragon looks like it will be one of the fastest GTX 660 Ti cards available when (if?) it is publicly released!
Subject: Graphics Cards | April 3, 2013 - 10:14 AM | Tim Verry
Tagged: nvidia, mini-itx, gtx 670, GK104, directcu mini, asus
ASUS has finalized the design for its Kepler-based DirectCU Mini graphics card. The new card combines NVIDIA's GTX 670 GPU and reference PCB with ASUS' own power management technology and a new, much smaller, air cooler. The new ASUS cooler has allowed the company to offer a card that is a mere 17cm long. Compared to traditional GTX 670 graphics cards with coolers at approximately 24cm, the DirectCU Mini is noticeably smaller.
The DirectCU Mini features a GTX 670 GPU clocked at 928MHz base and 1,006MHz boost. It also has 2GB of GDDR5 memory on a 256-bit bus. The card requires a single 8-pin PCI-E power connector. Video outputs include two DVI, one DisplayPort, and a single HDMI port. The ASUS cooler includes a copper vapor chamber and a single CoolTech fan. According to ASUS, the DirectCU Mini is up to 20% cooler and slightly quieter than previous GTX 670 cards despite the smaller form factor.
This new card will be a great addition to Mini-ITX-based systems where saving space any way possible is key. It is nice to know that gamers will soon have the option of powering a small form factor LAN box with a GPU as fast as the GTX 670. Even better, water cooling enthusiasts will be happy to know that the card still uses a reference PCB, meaning it is compatible with existing water blocks made for the current crop of GTX 670 cards.
Pricing and availability have not been announced, but the small form factor-friendly GPU is now official and should be coming sometime soon.