Subject: Graphics Cards | August 5, 2012 - 08:33 PM | Tim Verry
Tagged: nvidia, kepler, gtx 660ti, graphics cards, gaming
Update: US-based retailers are starting to list the GTX 660 Ti as well, and at least one card is listed for $299, so there may be some hope despite the $349.99 MSRP. See the $299 PNY GTX 660 Ti graphics card here.
The GTX 660 Ti is an NVIDIA Kepler-based graphics card that has seen several leaks and even a full review ahead of official release. In the leaked review, rumored specifications were confirmed, and the card was shown to be very close to the existing GTX 670 GPU. Sometimes it was merely a couple of frames behind the $400+ GPU.
On the podcast, Ryan, Josh, and Jeremy speculated that, should the GTX 660 Ti land closer to the $300 mark of the rumored $300-400 range, it would be a very desirable gaming graphics card. Hardware-wise, the GTX 660 Ti is nearly identical to the GTX 670, with only a reduction in the memory bus from 256-bit to 192-bit. For $100 less, gamers would get extremely close to the performance of the much more expensive GTX 670 Kepler card.
Unfortunately, it may not be the gaming card that people have been hoping for. According to Tom’s Hardware, a Swedish retailer has listed the GTX 660 Ti on its website for pre-orders at just under $400. At that price point, the GTX 660 Ti is much less desirable, and will be hard to justify versus springing for the GTX 670 for a bit more money.
Here’s hoping that the pre-order pricing is simply higher than the prices people will see once actual cards from NVIDIA and partners are officially released en masse. Do you think that there is still hope for the GTX 660 Ti as the gaming card of choice, or will you be looking elsewhere?
Subject: Graphics Cards | August 3, 2012 - 05:42 PM | Jeremy Hellstrom
Tagged: MSI GTX680 Lightning, LN2 BIOS, factory overclocked, gtx 680, MSI Afterburner, overvolting
[H]ard|OCP recently tested the highly overclocked MSI GTX 680 Lightning, but with the release of MSI Afterburner 2.2.3 they decided to retest and see whether the new tool would raise the ceiling on their maximum overclock. The new version allows voltage control of the GPU, the memory, and the PLL, which ought to help push the card to higher frequencies. That certainly turned out to be the case: they saw noticeable increases to all of the card's clocks, which, more importantly, translated into improvements in gameplay. When they used the LN2 BIOS the improvements were even more impressive. Remember that volt modding will shorten the lifespan of the card, but what a life it will have while it survives.
"Today we are revisiting the MSI GeForce GTX 680 Lightning video card. with its long-promised GPU and RAM voltage tweaking Afterburner software. We test both the stock BIOS and "LN2 BIOS" to find the best possible gaming experience the Lightning has to offer, and determine if the performance justifies the price."
Here are some more Graphics Card articles from around the web:
- EVGA GeForce GTX 680 Classified 4GB with EVBOT @ Guru of 3D
- Gigabyte GTX 680 Super Over Clock 2GB Graphics Card Review @ eTeknix
- Nvidia GeForce GTX 690 SLI @ Hardware.info
- ASUS GeForce GTX 670 DirectCU II TOP @ Bjorn3D
- NVIDIA GEFORCE GTX 660 Ti 2GB @ Tweaktown
- KFA GeForce GTX 680 LTD OC 2 GB @ techPowerUp
- ARCTIC Accelero Hybrid AIO Video Card Cooler @ Tweaktown
- Arctic Accelero Twin Turbo II VGA Cooler Review @ Hardware Secrets
- Arctic Accelero Xtreme III VGA Cooler Review @ Hardware Secrets
- A New Dawn DX11 Demo Compared to the Old Dawn @ [H]ard|OCP
- XFX Radeon HD7850 2GB Black Edition Video Card Review @ Legit Reviews
- Club3D HD 7850 Royal Queen 1 GB @ techPowerUp
- Gigabyte Radeon HD 7970 Super OverClock @ TechSpot
- HIS Radeon HD 7970 X IceQ X2 Turbo 3GB Overclocked @ Tweaktown
- Gigabyte HD 7970 Super OC 3 GB @ techPowerUp
- AMD Radeon HD 7970 Video Card Review @ Madshrimps
- Sapphire TOXIC 7970 GHz 6GB Graphics Card Review @ HardwareHeaven
Subject: General Tech, Graphics Cards, Processors, Systems | July 31, 2012 - 08:35 PM | Scott Michaud
Eurogamer and Digital Foundry believe that a next-generation Xbox developer kit somehow got into the hands of an internet user looking to fence it for $10,000. If the rumors are true, a few interesting features are included in the kit: an Intel CPU and an NVIDIA graphics processor.
A little PC perspective on console gaming news…
If the source and the people who corroborate it are telling the truth, Microsoft has somehow lost control of a single developer's kit for their upcoming Xbox platform. Much like their Cupertino frenemies, who lost an iPhone 4 prototype in a bar only for it to be taken and sold for $5,000 to a tech blog, the current owner of the Durango devkit is looking for a buyer at a mere $10,000. It is unlikely he found it on a bar stool.
In a further twist of irony, the Xbox 360 alpha devkits were repurposed Apple Power Mac G5s.
Image source: DaE as per its own in-image caption.
Alpha developer kits change substantially in appearance before launch, but they often give clues about what to expect internally.
The first Xbox 360 software demonstrations were performed on slightly altered Apple Power Mac G5s. At the time, Apple's lineup was built on a foundation of IBM's PowerPC, while the original Xbox ran Intel hardware. As it turned out, the Xbox 360 was indeed based on the PowerPC architecture.
Huh, looks like a PC.
The leaked developer kit for the next Xbox is said to be running x86 hardware and an NVIDIA graphics processor. The kit reportedly carries 8GB of RAM, although, since devkits typically pack more memory than the retail hardware they target, that only suggests the next Xbox will ship with less than 8GB. With RAM as cheap as it is these days, a great concern for PC gamers is that Microsoft could load the console to the brim with memory and erase the main technical advantage of our platform. Our PCs will still hold that advantage once gamers stop being scared of 64-bit compatibility issues. As a side note, those specifications are nearly identical to the equally nebulous specs rumored for Valve's Steam Box demo kit.
The big story is the return to x86 and NVIDIA.
AMD is not fully ruled out of the equation if they manage to provide Microsoft with a bid they cannot refuse. Practically speaking, though, AMD has only an iceball's chance in Hell of having a CPU presence in the upcoming Xbox -- upgraded from snowball. More likely, Intel, with its superior manufacturing, will pick up the torch that IBM has kept warm for it.
PC gamers might want to pay close attention from this point on…
Contrast the Xbox's switch from PowerPC to x86 with the recent commentary from Gabe Newell and Rob Pardo of Blizzard. As Mike Capps alluded to prior to the launch of Unreal Tournament 3, Epic is concerned about the console mindset coming to the PC. It is entirely possible that Microsoft is positioning the Xbox platform closer to the PC. Perhaps there are plans for cross-compatibility in exchange for closing the platform behind certification and licensing fees?
Moving the Xbox platform closer to the PC in hardware specifications could renew Microsoft's attempts to close the PC as a platform, attempts which already failed once with the Games for Windows Live initiative. What makes the PC platform great is the lack of oversight over what can be created for it and the ridiculously long span of compatibility for what has already been produced.
It might be no coincidence that the two companies who are complaining about Windows 8 are the two companies who design their games to be sold and supported for decades after launch.
And if the worst does happen, PC gaming has been a stable platform despite repeated claims of its death -- but would the user base be stable enough to handle a shift to Linux? I doubt most users understand the implications of proprietary platforms on art well enough to even consider it. And what about Adobe and the other software and hardware tool companies who have yet to treat Linux as a viable platform?
The dark tunnel might have just gotten longer.
Subject: Graphics Cards | July 31, 2012 - 03:03 PM | Tim Verry
Tagged: nvidia, kepler, gtx 670, gtx 660ti, graphics cards
Last week, additional information leaked about the upcoming Kepler-based NVIDIA GTX 660 Ti graphics card. Those rumors suggested that the GPU would be very similar to the one found in the existing GTX 670 (which we recently reviewed).
We speculated that the GTX 660 Ti could be an awesome card, assuming the price was right. While we still do not have any official pricing information (the best guess from the rumors is the $300 to $400 range), thanks to Tweaktown breaking the release date we now know that the latest rumors were true.
The GTX 660 Ti will feature 1344 CUDA cores and 2GB of GDDR5 memory on a 192-bit memory bus. That puts it very close to the current GTX 670 in terms of potential performance, and the leaked benchmarks bear this out: the GTX 660 Ti is only a couple of frames behind the GTX 670 in Just Cause 2 and Dirt 3, for example. Considering this card is likely to use a bit less power and cost less, it is shaping up to be rather desirable. If it ends up at the low end of the $300-400 range (rumors suggest otherwise, however), I suspect many gamers will opt for this new Kepler card rather than the more expensive and only slightly faster GTX 670.
What do you think about the GTX 660 Ti, is it the card you were hoping for?
Subject: General Tech, Graphics Cards | July 28, 2012 - 01:20 AM | Ryan Shrout
Tagged: nvidia, end of nations, beta
Looking for something to do August 10-12th? We have some good news for you!
The wait for End of Nations is over. NVIDIA and PC Perspective are inviting our lucky readers to join the global conflict in Trion's End of Nations closed beta weekend on August 10-12th.
Wage sprawling 56-player battles in this year's most anticipated MMO real-time strategy game for three action-packed days, absolutely free!
To get started, gear up with the ultimate weapon, an NVIDIA GeForce GTX graphics card, then click the link below to get your FREE beta key!
We have 1000 keys up for grabs!! So get one and pass it on to your friends as well!
Subject: General Tech, Graphics Cards, Systems, Mobile | July 27, 2012 - 02:12 PM | Scott Michaud
Tagged: windows 8, winRT, gpgpu
Paul Thurrott of Windows Supersite reports that Windows 8 is finally taking hardware acceleration seriously and will utilize the GPU across all applications. This hardware acceleration should make Windows 8 perform better and consume less power than the same setup running Windows 7. With Microsoft finally willing to adopt modern hardware for performance and battery life, I wonder when they will start using the GPU to accelerate tasks like file encryption.
It is painful when you have the right tool for the job but must use the wrong one.
Windows has, in fact, used graphics acceleration for quite some time albeit in fairly mundane and obvious ways. Windows Vista and Windows 7 brought forth the Windows Aero Glass look and feel. Aero was heavily reliant on Shader Model 2.0 GPU computing to the point that much of it would not run on anything less.
Washington State is not that far away from Oregon.
Microsoft is focusing their hardware acceleration efforts for Windows 8 on what they call mainstream graphics. 2D graphics and animation have traditionally been CPU-based, with a handful of applications such as Internet Explorer 9, Firefox, and eventually Chrome allowing the otherwise idle GPU to lend a helping hand. As such, Microsoft is talking up Direct2D and DirectWrite usage all throughout Windows 8 on a wide variety of hardware.
The driving force that neither Microsoft nor Paul Thurrott seems to directly acknowledge is battery life. Until just recently, graphics processors were considered power hogs by almost anyone assembling a higher-end gaming computer. Despite this, the GPU is actually more efficient than a CPU at certain tasks -- especially true of the GPUs that will go into WinRT devices. The GPU will help the experience be more responsive and smooth while also consuming less battery power. I guess Microsoft finally believes the time is right to bother using what you already have.
There are many more tasks that can be GPU accelerated than just graphics, be it 3D or the new emphasis on 2D acceleration. Hopefully, after Microsoft dips in a toe, they will take the GPU more seriously as an all-around parallel task processor. Maybe now that they are using the GPU in all applications, they can consider using it for all applications -- general-purpose computing, not just drawing.
Subject: Graphics Cards | July 26, 2012 - 02:59 PM | Tim Verry
Tagged: nvidia, gtx 670, gtx 660 Ti, GK104
Earlier this year, we covered rumors on several mid-range NVIDIA Kepler graphics cards. Swedish enthusiast site Sweclockers claims to have launch specifications and pricing on one of those cards: the GTX 660 Ti. The specifications the site has managed to get a hold of reinforce previous rumors except for the amount of RAM. While initial reports suggested the GTX 660 Ti would have either 1.5 GB or 3 GB, Sweclockers has stated that the card will have 2 GB of GDDR5 memory.
Aside from the bump down in the memory interface to a 192-bit bus (from the 256-bit interface of the GTX 670) capable of 144.19 GB/s throughput, the GTX 660 Ti is nearly the same as the currently available GTX 670. Allegedly, the GTX 660 Ti will run at the same GPU and memory clockspeeds as the GTX 670 – 915 MHz base/980 MHz boost and 6008 MHz effective respectively. The reference design will further be a dual-slot design with two DVI, one HDMI, and one DisplayPort output. Allegedly, it will be powered by two six-pin PCI-E power connectors and will have a 150W TDP, which means it needs slightly less cooling than the GTX 670 (which has a 170W TDP).
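The quoted throughput follows directly from bus width and effective transfer rate. A minimal sketch of the arithmetic, using the rumored figures above (the comparison numbers for the GTX 670 assume its 256-bit bus at the same memory clock):

```python
def memory_bandwidth_gbps(bus_width_bits: int, effective_rate_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes moved per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8        # 192-bit bus -> 24 bytes per transfer
    return bytes_per_transfer * effective_rate_mhz * 1e6 / 1e9

# Rumored GTX 660 Ti: 192-bit bus, 6008 MHz effective
print(memory_bandwidth_gbps(192, 6008))  # -> 144.192, i.e. the 144.19 GB/s figure

# GTX 670 for comparison: 256-bit bus at the same memory clock
print(memory_bandwidth_gbps(256, 6008))  # -> 192.256
```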
Interestingly, the site claims that the GTX 660 Ti will be available for purchase on August 16, 2012 for a bit over 3000 SEK including VAT, which is roughly $436; US prices should come in lower, since that figure includes Sweden's VAT. This price is in contrast to our prediction of a $300 to $400 graphics card. It may end up very close to the high-end $400 number, or a bit above, as the card is very similar to the GTX 670. Hopefully the change in the amount of graphics memory means that you will also be able to get custom 4GB GTX 660 Ti cards.
You can find more information about NVIDIA’s latest “Kepler” GK104 cards in our recent GTX 670 review. Are you ready for the mid-range NVIDIA cards? Which ones are you planning to get, should the rumored specs hold true?
Subject: Graphics Cards | July 25, 2012 - 08:55 PM | Scott Michaud
Tagged: A New Dawn, video, nvidia, kepler
NVIDIA has released a demo for their latest GPUs which might be a little familiar for long-time PC gamers. A New Dawn is a remake of their Dawn demo for the GeForce FX line of graphics processors -- now with less fake Subsurface Scattering.
What better way to gratify your purchases than scantily clad demihumans?
NVIDIA revealed Dawn to attendees of the 2002 Game Developers Conference just months shy of a decade ago. Dawn, named for both its lead character and its setting, was created to promote NVIDIA's GeForce FX line of graphics processors.
Less lensflare than a dance-pop video.
Every day I'm fluttering -- ROFL.
A New Dawn demonstrates the progression between early-stage DirectX 9 through current DirectX 11.1 hardware. Dawn has been remodeled and reskinned with actual subsurface scattering to liven up her skin; fine-grained hair with blur shaders to prevent aliasing; and certainly no extra clothing. Her environment is modeled and tessellated -- a step up from the box she used to reside in.
A New Dawn is available for download from NVIDIA's website if your system meets the minimum specifications -- which are quite forgiving for any system a Kepler GPU would sensibly call home.
Subject: Graphics Cards | July 25, 2012 - 11:56 AM | Tim Verry
Tagged: radeon hd 7990, GCN, dual gpu, amd, 7990
The long-awaited dual GPU Graphics Core Next architecture Radeon HD 7990 has missed its original Computex reveal and will likely miss the July release suggested by previous rumors. Interestingly, VR-Zone China reportedly has some updated information on specifications and release date.
The 7970. Expect the 7990 to have a much larger PCB and heatsink!
The dual GPU 7990 will allegedly not be released until at least late August 2012. Further, it will be powered by four six-pin PCI-E power connectors, and will have 6GB of GDDR5 memory (total, 3GB per GPU). Connecting the two 7970 Tahiti XT GPU cores in CrossFire will be a PLX chip – similar to that found in the dual GPU NVIDIA GTX 690 graphics card. As far as video outputs, you can expect four mini-DisplayPorts and two dual-link DVI connectors.
Additionally, previous rumors suggested that the GPU cores would be clocked at 850 MHz, but that may no longer be the case now that AMD is seeing much better binning with its GHz Edition chips. Also unclear is whether the Radeon HD 7990 will have any sort of PowerTune with Boost technology like the 7970 GHz Edition. Since the card is based on two 7970 GPU cores, you can look forward to 4,096 stream processors, 64 ROP units, and a dual-slot design with three fans cooling the heatsink.
Right now, AMD does not have an answer to the NVIDIA GTX 690 which has been on the market for a while. At this point, you may be better off getting two 7970 GHz Edition graphics cards and putting them in CrossFire. Granted, they are going to take up more space in your case but you can get them today, they will have GPU boost, and will likely cost less to boot. With that said, I do understand the allure of a dual GPU AMD card based on GCN and hope to see it soon.
Stay tuned for more Radeon 7990 coverage as it arrives.
The HAWK Returns
The $300 to $400 range of video cards has become quite crowded as of late. Think back to March, when AMD introduced their HD 7800 series of cards; later that month NVIDIA released the GTX 680. Even though NVIDIA held the price/performance crown, AMD continued to offer their products at prices many considered grossly inflated given the competition. Part of this was justified because NVIDIA simply could not meet demand for their latest cards, which were often unavailable for purchase at MSRP. Eventually AMD started cutting prices, but this led to another issue: the HD 7950 was approaching the price of the HD 7870 GHz Edition. The difference in price between these products was around $20, but the 7950 was around 20% faster than the 7870. This made the HD 7870 (and the slightly higher-priced overclocked models) a very unattractive option.
It seems as though AMD and their partners have finally rectified this situation, and just in time. With NVIDIA finally able to adequately provide stock for both the GTX 680 and GTX 670, the prices on the upper-midrange cards have taken a nice drop to where we feel they should be. We are now starting to see some very interesting products based on the HD 7850 and HD 7870 cards, one of which we are looking at today.
The MSI R7870 HAWK
The R7870 Hawk utilizes the AMD HD 7870 GPU. This chip has a reference speed of 1 GHz, but on the Hawk it is increased to a full 1100 MHz. The GPU has all 20 compute units enabled, for 1280 stream processors. Its 256-bit memory bus runs 2GB of GDDR5 memory at 1200 MHz, which gives a total bandwidth of 153.6 GB/sec. I am somewhat disappointed that MSI did not give the memory speed a boost as well, but at least users can do that for themselves through the Afterburner software.
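GDDR5 transfers data four times per memory-clock cycle, so a 1200 MHz memory clock works out to 4800 MT/s effective. A quick sketch of how the bandwidth falls out of the bus width and clock:

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, memory_clock_mhz: float) -> float:
    """Peak GDDR5 bandwidth in GB/s; GDDR5 performs four data transfers per clock."""
    effective_rate_mhz = memory_clock_mhz * 4      # 1200 MHz -> 4800 MT/s
    return (bus_width_bits / 8) * effective_rate_mhz * 1e6 / 1e9

# R7870 Hawk: 256-bit bus, 2GB GDDR5 at 1200 MHz
print(gddr5_bandwidth_gbps(256, 1200))  # -> 153.6
```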