Subject: Graphics Cards | March 21, 2008 - 06:19 PM | Ryan Shrout
We know about as much as you do about this, but word reached us that S3, a company many had forgotten, recently launched a new GPU. It's a $60 graphics card, sold directly through S3, that offers DX10.1 support, runs on PCI Express 2.0 and carries a 256MB frame buffer.
As for performance: S3 compares it to the AMD HD 3450 and the NVIDIA 8400 GPUs:
Subject: Graphics Cards | March 21, 2008 - 05:49 PM | Ryan Shrout
Digitimes is hearing that AMD is going to be releasing a new card in the Radeon HD 3800 series: the HD 3830. Definitely an entry-level card, it is rumored to have the full 320 stream processors, 16 texture units and 16 RBEs of the HD 3850 and 3870, but with a 128-bit memory interface and a maximum of 256MB of memory. Performance is a mystery, but with prices around $129, this could be a GREAT value card for a cheap system upgrade.
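To see why that 128-bit memory interface is the big caveat, remember that peak memory bandwidth scales linearly with bus width. A rough back-of-the-envelope sketch (the effective memory clock here is an assumption for illustration, roughly HD 3850-class GDDR3, not a confirmed HD 3830 spec):

```python
def memory_bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# Hypothetical 1660 MHz effective GDDR3 clock (assumed, not confirmed)
full = memory_bandwidth_gbps(256, 1660)   # HD 3850-class 256-bit bus -> ~53.1 GB/s
half = memory_bandwidth_gbps(128, 1660)   # rumored HD 3830 128-bit bus -> ~26.6 GB/s
```

Halving the bus halves the bandwidth at the same memory clock, which is why a card with the full shader count can still land squarely in the entry-level bracket.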
Subject: Graphics Cards | March 20, 2008 - 05:52 PM | Jeremy Hellstrom
AMD is making real progress in its effort to open up its graphics hardware and source code. As Phoronix reports, microcode for the Radeon R100 through R600 has been released. It is nice to see a company opening up as much as AMD has, and it bodes well for the Linux community.
Subject: Graphics Cards | March 19, 2008 - 02:02 PM | Jeremy Hellstrom
- Watercooled Intel Core 2 Quad QX9650 overclocked to 4GHz - CHECK!
- EVGA 780i SLI that supports Quad SLI - CHECK!
- 4GB OCZ Reaper X PC6400 DDR-2 4-4-3-12 - CHECK!
- WD Raptor 150GB 10,000RPM SATA - CHECK!
- Zalman 1000W Modular PSU - CHECK!
- Forceware 174.53 driver - CHECK!
- Two GeForce 9800 GX2's in SLI - CHECK!
Subject: Graphics Cards | March 18, 2008 - 12:14 PM | Jeremy Hellstrom
The GeForce 9800 GX2 has arrived, with XFX supplying the card that Ryan tested. The big news for NVIDIA is performance: Crysis at 1024x768 with 4xAA ran with a higher average frame rate than any previous card's maximum. This card doesn't beat the competition by a small margin; it is far in the lead. The performance does come with a price, and I don't mean the $600 it goes for on NewEgg.
NVIDIA revisits dual-GPU graphics boards
NVIDIA has classed up its card lineup with a new product that is both powerful for gaming and easy on the eyes, thanks to a unique, completely enclosed cooling design. You might want to leave your credit card with your better half, though, before heading in to read.
Subject: Graphics Cards | March 17, 2008 - 03:07 PM | Jeremy Hellstrom
The Arctic Cooling Accelero S1 Rev. 2 is a great deal, and you can look at it in two ways. If you want to play it safe, the cooler temperatures you can reach on your GPU with this cooler ought to extend its life; or you can use it to overclock the bejesus out of the card. Either way, there is a good chance you will like this cooler as much as [H]ard|OCP does.
Subject: Graphics Cards | March 12, 2008 - 01:46 PM | Ryan Shrout
According to this report on VR-Zone, there is a good chance that NVIDIA's next-generation chip will be a die-shrunk version of the G92 architecture. The current G92 runs on a 65nm process in the 8800 GT and 8800 GTS 512MB cards, and a 55nm revision would likely bring higher clocks and lower power - always a plus.
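The appeal of a shrink is easy to estimate: die area scales roughly with the square of the feature-size ratio, so smaller dies mean more chips per wafer and lower cost. A quick sketch of the idealized math (real-world shrinks fall short of this ideal, since not every structure scales):

```python
def ideal_area_ratio(new_nm, old_nm):
    """Ideal die-area ratio after a process shrink.

    Linear dimensions scale with feature size, so area scales with the square.
    """
    return (new_nm / old_nm) ** 2

ratio = ideal_area_ratio(55, 65)  # ~0.716, i.e. roughly a 28% area reduction
```

Even accounting for structures that don't shrink cleanly, that kind of area savings is why a 55nm G92 revision would be attractive well beyond the clock and power gains.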
Subject: Graphics Cards | March 12, 2008 - 01:46 PM | Jeremy Hellstrom
It's déjà vu all over again: NVIDIA is releasing a 'two cards in one' graphics card, and the initial results aren't encouraging. The performance isn't that much better, and the price is much higher. As The Inquirer points out, the price is actually in the same ballpark as a 3870X2 AND a 3870. Three-way CrossFireX for the same price as single-card SLI ... which would you go with?
Subject: Graphics Cards | March 12, 2008 - 01:15 PM | Ryan Shrout
Could a new graphics card from AMD that uses two HD 3850 GPUs instead of two HD 3870 GPUs be ready for Computex? That is one rumor that is circulating, though the prospect is not as odd as it might first sound. Remember, the HD 3870 and HD 3850 are essentially the same GPU with different clock speeds and, in most cases, different memory configurations. It would be very easy for AMD to simply use two lower-clocked GPUs and cut the memory in half.