Subject: Graphics Cards | January 13, 2010 - 08:14 PM | Jeremy Hellstrom
AMD's next graphics card will be the HD5830, with performance between the HD5770 and HD5850 and a price to match. eTechnix speculates that it will be a downclocked HD5850 with a 128-bit memory bus using GDDR3 instead of GDDR5.
Hopefully it will sit right around a $200 price point, making it easier to upgrade your card or to slap in a second card for a Crossfire rig.
Subject: Graphics Cards | January 11, 2010 - 04:45 AM | Steve Grever
Our good friends at NVIDIA took us around their booth at the Consumer Electronics Show to show us some new tablets built around their new Tegra 2 CPU, as well as give us a demonstration of their next-gen GF100 graphics processor. Ryan and the rest of the PC Perspective team also got a quick overview of D-Link's new Boxee Box and checked out Electronic Arts' Need for Speed SHIFT using NVIDIA's 3D Vision glasses and the Fermi-based GF100.
Subject: Graphics Cards | January 10, 2010 - 09:38 PM | Ryan Shrout
Even though the next-generation GF100 GPU is not rumored to be available until March, NVIDIA was showcasing the technology and what it can do. We have discussed NVIDIA Surround in a basic manner already and we'll have more later, but I wanted to show our readers another demonstration that we captured on video from the show floor. In it you will see a system based around a Fermi GPU do some impressive GPU compute simulations in a game-style setting.
Subject: Graphics Cards | January 7, 2010 - 06:32 PM | Ryan Shrout
While we may be late thanks to awesome traffic and bad shuttle drivers in Las Vegas, we will be LIVE at the NVIDIA press conference where we plan to hear about NVIDIA Surround, Tegra 2 and more!
Subject: Graphics Cards | January 7, 2010 - 05:51 PM | Jeremy Hellstrom
We have been waiting with bated breath for a LucidLogix Hydra board that can be freely tested and reported on, and now, thanks to AnandTech, we have one. The MSI Big Bang Fuzion is ready for launch, and it seems the delay was unquestionably necessary. The performance numbers are not awful, but neither do they beat the native multi-GPU solutions from NVIDIA and AMD. There were also issues with games that were on the supported list, something that is never endearing to a consumer.
Subject: Graphics Cards | January 7, 2010 - 04:38 PM | Jeremy Hellstrom
LAS VEGAS--(EON: Enhanced Online News)--At the 2010 Consumer Electronics Show (CES), AMD (NYSE: AMD) today announced the shipment of its two millionth DirectX 11-capable graphics processor to its technology partners, cementing AMD graphics as the standard for DirectX 11 computing. This two-million-unit milestone comes just three months after the launch of the ATI Radeon HD 5800 series, the first DirectX 11-capable graphics products from AMD.
Subject: Graphics Cards | January 7, 2010 - 05:35 AM | Jonathan Hung
NVIDIA is putting its efforts into what has emerged as one of the hot technology trends of the year: 3D entertainment.
Subject: Graphics Cards | January 5, 2010 - 06:37 PM | Jeremy Hellstrom
Revisiting the performance of AMD's new Eyefinity on various cards as new games come out certainly helps consumers get an idea of what card (or cards) they will need to pick up in order to play their favourites on three monitors, but that isn't the only reason. Reviewers see a lot of different products, the vast majority of which are improvements on existing technology, only faster or with new features. The chance to see something completely different comes infrequently, and Eyefinity is one of those special technologies.
Subject: Graphics Cards | December 30, 2009 - 07:45 PM | Ryan Shrout
Earlier in December we reported that NVIDIA had more than likely lost the big high-performance computing win with Oak Ridge Labs, and the lack of any kind of response pretty much confirms that.
The Fermi Architecture - 512 cores no more?
Subject: Graphics Cards | December 17, 2009 - 07:51 PM | Ryan Shrout
If true, the reports that NVIDIA has lost the deal with Oak Ridge Labs to power one of its upcoming supercomputers would be a huge blow to the ground NVIDIA has gained in the high-performance computing market. A story over at SemiAccurate reports as follows: