Subject: Graphics Cards | February 19, 2013 - 01:50 PM | Jeremy Hellstrom
Tagged: nvidia, graphics drivers, geforce, 314.07
Just in time for the arrival of the Titan previews comes the new WHQL 314.07 GeForce driver from NVIDIA. Instead of offering a list of blanket improvements and average frame rate increases, NVIDIA has assembled a list of charts showing performance differences between this driver and the previous one for their four top GPUs in both SLI and single card setups. They also attempt to answer the question "Will it play Crysis 3?" with the chart below, showing the performance you can expect at Very High settings, 1080p resolution and 4x AA. In addition, they provide a link to their GeForce Experience tool, which will optimize your Crysis 3 settings for whatever NVIDIA card(s) you happen to be using. Upgrade now, as the new driver seems to offer improvements across the board.
The new GeForce 314.07 WHQL driver is now available to download. An essential update for gamers jumping into Crysis 3 this week, 314.07 WHQL improves single-GPU and multi-GPU performance in Crytek’s sci-fi shooter by up to 65%.
Other highlights include sizeable SLI and single-GPU performance gains of up to 27% in Assassin’s Creed III, 19% in Civilization V, 14% in Call of Duty: Black Ops 2, 14% in DiRT 3, 11% in Just Cause 2, 10% in Deus Ex: Human Revolution, 10% in F1 2012, and 10% in Far Cry 3.
Rounding out the release is an ‘Excellent’ 3D Vision profile for Crysis 3, an SLI profile for Ninja Theory’s DmC: Devil May Cry, and an updated SLI profile for the free-to-play, third-person co-op shooter Warframe.
You can download the GeForce 314.07 WHQL drivers with one click from the GeForce.com homepage; Windows XP, Windows 7 and Windows 8 packages are available for desktop systems, and for notebooks there are Windows 7 and Windows 8 downloads that cover all non-legacy products.
Subject: General Tech, Graphics Cards | February 19, 2013 - 01:38 PM | Jeremy Hellstrom
Tagged: Q4 2012, nvidia, jon peddie, Intel, amd
Jon Peddie Research has released its findings on the state of the discrete and integrated graphics market, not counting servers, smartphones, or ARM-based systems. While the overall PC market showed a modest gain of 2.8% in the final quarter of 2012, discrete graphics sales saw a decline of 8.2%, which JPR attributes to a noticeable increase in purchases of systems with only an Intel or AMD embedded GPU. When you break the quarter down by manufacturer, the news is not good. For AMD, the last quarter did see an increase of less than 1% in desktop CPUs, but declines of 19% in laptop CPU sales and 13.6% in discrete GPU sales. Intel saw desktop CPU sales up 3% but lost over 6% on laptop sales, with its overall decline compared to the previous quarter sitting at about 3%. NVIDIA was hit the hardest at the end of 2012; with only its discrete GPU sales applying to this survey, a loss of 15% on the desktop and a loss of 18% on mobile GPUs led to an overall decline of 16%.
Compared to the final quarter of 2011, AMD lost 29.4%, Intel 5%, and NVIDIA 4.6%, reflecting the difficulty of making sales in the past year; the total discrete GPU market dropped almost 10%, or about 3 million units. Even with the companies making profits, in some cases significant profits, the entire GPU market is depressed, with ARM-based devices and smartphones starting to erode a market that is already shrinking thanks to Intel and AMD shipping CPUs with embedded GPUs that are good enough for many users' needs.
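For a sense of scale, the JPR figures quoted above let you back out the rough size of the discrete GPU market. This is just our arithmetic on the article's numbers, not JPR data:

```python
# Implied discrete GPU market size from the figures above:
# a drop of "almost 10%" year-over-year equalled about 3 million units.
drop_fraction = 0.10   # approximate; JPR said "almost 10%"
drop_units_m = 3.0     # millions of units

prior_market_m = drop_units_m / drop_fraction     # roughly 30 million units per quarter
current_market_m = prior_market_m - drop_units_m  # roughly 27 million units

print(prior_market_m, current_market_m)
```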
"The news was disappointing for every one of the major players. AMD dropped 13.6%, Intel slipped the least, just 2.9%, and Nvidia declined the most with 16.7% quarter-to-quarter change, this coming on the heels of a spectacular third quarter. The overall PC market actually grew 2.8% quarter-to-quarter while the graphics market declined 8.2% reflecting a decline in double-attach. That may be attributed to Intel's improved embedded graphics, finally making "good enough" a true statement."
Here is some more Tech News from around the web:
- Ubuntu? Fedora? Mint? Debian? We'll find you the right Linux to swallow @ The Register
- HDMI breakout lets you sniff HDCP crypto keys @ Hack a Day
- Nvidia announces Tegra 4i : Tegra 4's smaller sibling @ Hardware.info
- AMD: Star Trek holodecks within reach @ The Register
- Kingston Joint Giveaway @ NikKTech
GK110 Makes Its Way to Gamers
Our NVIDIA GeForce GTX TITAN Coverage Schedule:
- Tuesday, February 19 @ 9am ET: GeForce GTX TITAN Features Preview
- Thursday, February 21 @ 9am ET: GeForce GTX TITAN Benchmarks and Review
- Thursday, February 21 @ 2pm ET: PC Perspective Live! GTX TITAN Stream
Back in May of 2012 NVIDIA released information on GK110, a new GPU that the company was targeting at the HPC (high performance computing) and GPGPU markets, which are eager for more processing power. Almost immediately the questions began about when we might see the GK110 part make its way to consumers and gamers in addition to finding a home in supercomputers like Cray's Titan system, capable of 17.59 petaflops.
Nine months later we finally have an answer - the GeForce GTX TITAN is a consumer graphics card built around the GK110 GPU. Comprised of 2,688 CUDA cores, 7.1 billion transistors and with a die size of 551 mm^2, the GTX TITAN is a big step forward (both in performance and physical size).
From a pure specifications standpoint the GeForce GTX TITAN based on GK110 is a powerhouse. While the full GPU sports a total of 15 SMX units, TITAN will have 14 of them enabled for a total of 2688 shaders and 224 texture units. Clock speeds on TITAN are a bit lower than on GK104 with a base clock rate of 836 MHz and a Boost Clock of 876 MHz. As we will show you later in this article though the GPU Boost technology has been updated and changed quite a bit from what we first saw with the GTX 680.
The bump in the memory bus width is also key: feeding that many CUDA cores definitely required a boost from 256-bit to 384-bit, a 50% increase. Even better, the memory is still running at 6.0 GHz, resulting in total memory bandwidth of 288.4 GB/s.
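The specs above hang together: each Kepler SMX carries 192 CUDA cores, and the bandwidth figure follows directly from the bus width and memory clock. A quick back-of-the-envelope check, using only the figures from this article:

```python
# Sanity-check the GTX TITAN specs quoted above.
SMX_ENABLED = 14        # of the 15 SMX units on the full GK110 die
CORES_PER_SMX = 192     # CUDA cores per Kepler SMX
BUS_WIDTH_BITS = 384
MEM_CLOCK_GBPS = 6.0    # effective data rate per pin

cuda_cores = SMX_ENABLED * CORES_PER_SMX
# bytes per second = (bus width in bytes) * (effective data rate per pin)
bandwidth_gbs = BUS_WIDTH_BITS / 8 * MEM_CLOCK_GBPS

print(cuda_cores)      # 2688
print(bandwidth_gbs)   # 288.0
```

The article's 288.4 GB/s figure suggests the effective memory clock is actually a touch above an even 6.0 GHz (about 6.008 GHz would yield 288.4).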
Subject: Graphics Cards | February 15, 2013 - 01:50 PM | Ryan Shrout
Tagged: southern islands, Solar System, Sea Islands, radeon, oland, mars, holycrapiamtotallyconfused, amd
Remember that story we posted last week and then discussed on the podcast about AMD not releasing any new GPUs in 2013? Today we had a call with AMD that attempted to answer some questions, clear up some confusion and give us some insight into the company's direction. I say 'attempted' because after a 53 minute discussion, we have some answers, but we also have some interesting questions that remain.
First, some definitions. If you have heard code names like "Solar System" and "Sea Islands" you might not know what they refer to. Sea Islands is a new line that will fall into the 8000-series of products; it is a refresh, a slightly different architecture based heavily on the Southern Islands parts you've come to love in the Radeon HD 7000 series. Solar System is the name AMD has given to the sub-category of Sea Islands covering mobile products, the 8000M.
The slide that started this confusion - and our questions.
What might make things even more confusing is that there are some 8000-series parts already shipping in OEM desktops and notebooks that use verbatim HD 7000 GPU specs. So what you have is a combined Radeon HD 8000 series made up of some rebrands and at least a couple of "new" chips thus far. Those two new GPUs, Mars and Oland (Radeon HD 8650 and HD 8670, depending on the mobile or desktop target), are already out and you can find them if you look hard. They are NOT available in the channel or for DIY PC users.
Our readers might be disappointed to learn that Sea Islands is heavily focused on the notebook and mobile markets, though AMD did indicate that there are some good things coming for channel users later in 2013.
We also learned that the HD 7900-series will remain the company's high end parts through the end of 2013, but AMD said that there are new SKUs set to be released in this series sometime this year as well. Will that be the elusive HD 7990 dual-GPU product or maybe just something in the mainstream 7800 segment? They wouldn't tell us, but we are definitely hoping for higher performance parts. You might also expect these new 7000-series parts to use Sea Islands silicon...
The Radeon HD 7970 looks like it will stay a focus for AMD throughout 2013.
Many readers might be wondering why AMD is breaking its standard cadence of near-yearly GPU releases. The answer came from AMD's Roy Taylor, VP of Channel Sales, who said that "7000 series parts are continuing to ramp UP, sales are increasing" so it is premature for AMD, as a company intending to make money, to introduce a new series or architecture.
In fact, Roy was very emphatic about clearing up any potential ambiguity.
We have products, we have a road map. We are not announcing them now because we want to reposition the ones we have now. We are not sitting still, we do not lack resources, we do not lack imagination.
So what can you expect for the future? Sea Islands chips will continue to be released, eventually reaching the desktop and channel markets; some of them will be branded as 7000-series parts and some as 8000-series parts. They wouldn't tell us whether you'll see BIGGER chips (which we would assume would be faster) than the current HD 7900 cards or whether they would all sit in the mainstream segment.
AMD thinks its partnerships with key games like Crysis 3 will help keep momentum in 2013.
The residual message from this call was that AMD wants everyone to know that they have the best products on the market today and to maintain that momentum, AMD will enhance drivers, establish big partnerships with gaming companies and developers and release SOME new GPUs.
AMD was cagey again when asked about the possibility of a new architecture by the end of 2013, but based on the reactions of AMD reps I tend to believe we will see it, though probably very, very close to the end of that time. (Update: AMD did in fact say that an entire new product stack would be released by the end of 2013.)
That all clear now?
Subject: General Tech, Graphics Cards | February 15, 2013 - 01:43 PM | Jeremy Hellstrom
Tagged: UNIGINE, valley benchmark
Move over Heaven, there is an uncanny new benchmark in town from UNIGINE called Valley, which takes your GPU on a journey to Siberia and forces it to labour on wide open spaces with full DX11 scenery.
Valley Benchmark is a new GPU stress-testing tool from the developers of the very popular and highly acclaimed Heaven Benchmark. The forest-covered valley surrounded by vast mountains amazes with its scale from a bird's-eye view and is extremely detailed down to every leaf and flower petal. This non-synthetic benchmark powered by the state-of-the art UNIGINE Engine showcases a comprehensive set of cutting-edge graphics technologies with a dynamic environment and fully interactive modes available to the end user.
Editions of Valley Benchmark
Alongside a completely free Basic edition, Valley Benchmark provides in-depth performance reviews in the Advanced and Pro editions for hardware manufacturers, graphics driver developers, industry professionals and all individuals involved with video card stability testing.
Targeted mainly towards overclockers and hardware reviewers, the Advanced Edition allows for stress-testing under different conditions and thorough reports outputted into a flexible format.
The Advanced Edition exclusive features:
- Command line automation for full control over run tests
- Stress testing mode (benchmark looping)
- Highly customizable reports in CSV format
The Professional Edition is a comprehensive benchmarking tool for hardware manufacturers and graphics driver developers as it is bestowed with the complexity of top-level gaming technology.
The Professional Edition exclusive features include:
- Licensed for commercial use (for one PC, site licensing option is available on request)
- Command line automation for full control over run tests
- Stress testing mode (benchmark looping)
- Highly customizable reports in CSV format
- Per-frame deep analysis
- Rendering of a specified frame
- Software rendering mode in DirectX 11 for reference purposes
- Technical support
Subject: Graphics Cards | February 12, 2013 - 05:33 PM | Jeremy Hellstrom
Tagged: galaxy, GTX 660 GC, factory overclocked, nvidia
For those unable or unwilling to spend over $200 on a GPU, the non-Ti Galaxy GTX 660 GC comes with a nice factory overclock of 6 GHz on its 2GB of RAM and a core of 1006 MHz with a boost of 1074 MHz, as well as a custom dual-fan cooler. You are not going to be maxing out Crysis 3 with it; at this level of power perhaps online gaming is the way to go, in which case NVIDIA's new bundle of in-game currency might make a lot of sense for you. [H]ard|OCP tested it against the similarly priced HD 7850 as well as the slightly more expensive HD 7870. In their tests the 660 GC beat the HD 7850 by enough that the AMD card is not really worth your consideration, and it traded wins with the slightly more expensive HD 7870. In this particular case it might be the bundles that decide for you: do you want in-game currency or free full games?
"GALAXY has a factory overclocked NVIDIA GeForce GTX 660 complete with a custom cooler. Today, we have it on our test bench to run against an AMD Radeon HD 7870 GHz Edition and an AMD Radeon HD 7850 to see which is the go-to card at the $200 price point now in the latest games with the latest drivers."
Here are some more Graphics Card articles from around the web:
- GeForce GTX 660 Graphics Cards Roundup @ X-bit Labs
- Inno3D GeForce GTX 670 iChill, Inno3D GeForce GTX 660 Ti Graphics Cards @ iXBT Labs
- Arctic Accelero Hybrid VGA Cooler Review @ Hi Tech Legion
- 3DMark 2013 review: 52 graphic cards tested with the new benchmarks @ Hardware.info
- AMD Radeon Gallium3D Starting To Out-Run Catalyst In Some Cases @ Phoronix
- Workstation Graphics Card Comparison Guide @ TechARP
- Desktop Graphics Card Comparison Guide @ TechARP
- A Trio from HIS: 7970 IceQ X² GHz Edition, 7950 IceQ X² Boost Clock and 7850 IceQ Turbo X Graphics Cards @ X-bit Labs
- Club 3D Radeon HD 7990 6GB @ Hardware.info
Subject: Graphics Cards | February 11, 2013 - 12:33 PM | Ryan Shrout
Tagged: world of tanks, planetside 2, nvidia, Hawken, gtx, geforce, bundle
AMD has definitely been winning the "game" of game bundles and bonus content with graphics cards purchases, as is evident from the recent Never Settle Reloaded campaign that includes titles like Crysis 3, Bioshock Infinite and Tomb Raider. I made comments that NVIDIA was falling behind and may even start to look like they have moved away from a focus on PC gamers since they hadn't made any reply over the last year...
After losing a bidding war with AMD over Crysis 3, today NVIDIA is unveiling a bundle campaign that attacks from a different angle; rather than bundling retail games, NVIDIA is working with free-to-play titles. How do you give gamers bonuses with free-to-play games? Credits! Cold hard cash!
Starting today, if you pick up any GeForce GTX graphics card you'll be eligible for free in-game credit to use in each of the three free-to-play titles partnering with NVIDIA. A GTX 650 or GTX 650 Ti will net you $25 in each game for a total bonus of $75, while buying a GTX 660 or higher, all the way up to the GTX 690, results in $50 per game for a total of $150.
Also, after asking NVIDIA about it, this is a PER CARD bundle so if you get an SLI pair of anything, you'll get double the credit. A pair of GeForce GTX 660s for an SLI rig results in $100 per game, $300 total!
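The bundle math above is simple enough to sketch. The table and helper below are our own illustration of the rules as described, not an NVIDIA tool:

```python
# In-game credit per title, per card, as described in the article.
CREDIT_PER_GAME = {
    "GTX 650": 25, "GTX 650 Ti": 25,
    "GTX 660": 50, "GTX 660 Ti": 50,
    "GTX 670": 50, "GTX 680": 50, "GTX 690": 50,
}
NUM_TITLES = 3  # Planetside 2, World of Tanks, Hawken

def total_credit(model, num_cards=1):
    """Total bonus across all three titles; credit is awarded per card."""
    return CREDIT_PER_GAME[model] * NUM_TITLES * num_cards

print(total_credit("GTX 650"))     # 75
print(total_credit("GTX 690"))     # 150
print(total_credit("GTX 660", 2))  # 300 for an SLI pair
```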
This is a very interesting approach that NVIDIA has decided to take and I am eager to get feedback from our readers on the differences between AMD's and NVIDIA's bundles. I have played quite a bit of Planetside 2 and definitely enjoyed it; it is a graphics showcase as well with huge and expansive levels and hundreds of people per server. World of Tanks and Hawken I am less familiar with but they also are extremely popular.
Leave us your comments below! Do you think NVIDIA's new GeForce GTX gaming bundle of free-to-play game credits can be successful?
If you are looking for a new GeForce GTX card today and this bundle convinced you to buy, feel free to use the links below.
- GeForce GTX 690 - $999
- GeForce GTX 680 - $459
- GeForce GTX 670 - $359
- GeForce GTX 660 Ti - $279
- GeForce GTX 660 - $219
- GeForce GTX 650 Ti - $149
- GeForce GTX 650 - $114
Subject: Graphics Cards | February 8, 2013 - 10:03 PM | Ryan Shrout
Tagged: amd, radeon
In a report first spotted by Rage3D from source website 4gamer.net, news is filtering out that AMD may in fact have no new discrete graphics card releases for the remainder of 2013! While talking with the APAC media about the fantastic Never Settle Reloaded game bundle, they showed THIS slide.
That seems to indicate that at the very least through the 3rd quarter of 2013, AMD has no plans to update or add to its discrete graphics card roadmap. We had heard whispers of this fact while at CES in January but this pretty much puts a cap on it. And with the wording of "throughout 2013" it could indicate we won't see new product until 2014.
Also shown, this product comparison between AMD and NVIDIA, put together by AMD, is a bit lopsided and less than 100% accurate in my eyes. With the release of the new 3DMark Fire Strike benchmark AMD has a distinct advantage and it seems the slide here is based completely on that....blech.
Regardless, what does it mean if AMD actually has no new discrete, enthusiast class cards for 2013? We know the rumors are swirling about the NVIDIA GeForce Titan based on the GK110 and sporting 2688 CUDA cores and it will likely take the place as the fastest single GPU card on the market. AMD has been depending on its partners to build multi-GPU options based on Southern Islands like the ASUS ARES II and Powercolor Devil 13 but they have been pretty low volume. Our original review of the HD 7970 launched in December 2011....this could be quite a drought.
Subject: General Tech, Graphics Cards, Motherboards | February 4, 2013 - 06:36 PM | Scott Michaud
Tagged: msi, 3dmark
Do you have a beastly system with MSI parts, intense overclocking knowledge, and a desire for even more high-end parts? In honor of the new 3DMark's release, the motherboard and graphics card manufacturer is letting users of their parts enter in a contest for the highest 3DMark scores.
In a partnership with the benchmarking leaderboard site, HWBot, MSI wants to see top scores for the Fire Strike test on the newly released 3DMark. The contest will run until March 3rd for entries looking to post top ranks. Beyond that, anyone with an MSI Z77 motherboard who enters before February 10th will be entered in a “Lucky Draw” for the MSI Z77A-GD55 Motherboard.
Winners of the leader contest will receive the MSI R7970 Lightning Boost Edition card for first place and an MSI Z77A-GD80 for second place. Note that we are not affiliated with this contest, we just think that our readers might like to know.
The Ice Storm Test
Love it or hate it, 3DMark has a unique place in the world of PC gaming and enthusiasts. Since 3DMark99 was released...in 1998...with a target on DirectX 6, Futuremark has been developing benchmarks on a regular basis, in time with major API changes and also major hardware changes. The most recent release, 3DMark 11, has been out since late 2010 and has been a regular part of many of our graphics card reviews on PC Perspective.
Today Futuremark is not only releasing a new version of the benchmark but is also taking a fundamentally different approach to performance testing and platforms. The new 3DMark, just called "3DMark", will target not only high-end gaming PCs but also integrated graphics platforms and even tablets and smartphones.
We interviewed the President of Futuremark, Oliver Baltuch, over the weekend and asked some questions about this new direction for 3DMark, how mobile devices were going to affect benchmarks going forward and asked about the new results patterns, stuttering and more. Check out the video below!
Make no bones about it, this is a synthetic benchmark and if you have had issues with that in the past because it is not a "real world" gaming test, you will continue to have those complaints. Personally I see the information that 3DMark provides to be very informative though it definitely shouldn't be depended on as the ONLY graphics performance metric.