Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM | Ryan Shrout
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3
Impressively, NVIDIA just showed the new SHIELD, powered by Tegra X1, running versions of both Doom 3 and Crysis 3 natively on Android! The games ran at surprisingly high quality and performance levels.
I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games by these videos. They were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.
Doom 3 is said to run at a full 1920x1080 and 60 FPS, while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system with a total power draw of only 15 watts!
While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the legwork to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek itself; hopefully the final port is as gorgeous as this first look.
Subject: Graphics Cards | March 3, 2015 - 02:44 PM | Sebastian Peak
Tagged: video cards, nvidia, gtx 960, geforce, 4GB
They said it couldn't be done, but where there are higher density chips there's always a way. Today EVGA and Inno3D have both announced new versions of GTX 960 graphics cards with 4GB of GDDR5 memory, placing the cards in a more favorable mid-range position depending on the launch pricing.
EVGA's new 4GB NVIDIA GTX 960 SuperSC
Along with the expanded memory capacity, EVGA's card features their ACX 2.0+ cooler, which promises low noise and better cooling. The SuperSC is joined by a standard ACX and the higher-clocked FTW variant, which pushes Base/Boost clocks to 1304/1367MHz out of the box.
Inno3D's press release provides fewer details, and the company appears to be launching a single new model featuring 4GB of memory which looks like a variant of their existing GTX 960 OC card.
The existing Inno3D GTX 960 OC card
The current 2GB version of the GTX 960 can be found starting at $199, so expect these expanded versions to carry a price bump. Despite having only 1024 CUDA cores (half the count of a GTX 980) and a 128-bit memory interface, the GTX 960 has been a very good performer, with much better numbers than last year's GTX 760, and it is very competitive with AMD's R9 280/285. (It's a great overclocker, too.) The AMD/NVIDIA debate rages on, and NVIDIA's partners adding 4GB offerings to the mix will certainly add to the conversation, particularly as a 4GB version of the GTX 960 was originally said to be unlikely.
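That 128-bit memory interface is the spec that raised the most eyebrows at launch. As a back-of-the-envelope sketch (not an official formula from NVIDIA, and assuming the reference 7.0 Gbps effective GDDR5 data rate), peak bandwidth works out like this:

```python
# Rough peak-bandwidth math: bus width in bits / 8 gives bytes per transfer,
# multiplied by the per-pin effective data rate in Gbps.
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a GDDR5 bus."""
    return bus_width_bits / 8 * data_rate_gbps

gtx960 = memory_bandwidth_gbps(128, 7.0)  # GTX 960: 112.0 GB/s
gtx980 = memory_bandwidth_gbps(256, 7.0)  # GTX 980: 224.0 GB/s
print(f"GTX 960: {gtx960:.1f} GB/s, GTX 980: {gtx980:.1f} GB/s")
```

Doubling the memory to 4GB does not change this bandwidth figure at all; it only raises the amount of data the card can keep resident before spilling over the PCIe bus.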
Subject: General Tech | February 26, 2015 - 11:29 PM | Scott Michaud
Tagged: nvidia, hearthstone, esports
Professional and amateur players of Hearthstone: Heroes of Warcraft can compete for a share of the $25,000 prize pool and other perks in a tournament hosted by NVIDIA. Once the pool of players is whittled down to the sixteen invited pros and the top sixteen non-professionals, they will compete in a playoff format. The 32 players at that stage will each receive an NVIDIA SHIELD Tablet, the top 16 will receive money, and the top eight will earn Blizzard World Championship qualifier points, which may either start their career or bring them even closer to an invitation to the autumn finals.
Breaking down the above into a little more detail:
| Place | Prize Money | Qualification Points | SHIELD Tablet |
|---|---|---|---|
| 3rd & 4th | $1,500 | Some | ✔ |
| 5th - 8th | $750 | Some | ✔ |
| 9th - 16th | $500 | - | ✔ |
| 17th - 32nd | - | - | ✔ |
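As a sanity check on the listed payouts against the $25,000 pool (the 1st and 2nd place prizes are not listed above, so this only computes the remainder they would have to share between them):

```python
# Sum the payouts listed in the table; whatever is left of the $25,000 pool
# must be split between the unlisted 1st and 2nd place finishers.
payouts = [
    (2, 1500),  # 3rd & 4th place
    (4, 750),   # 5th - 8th place
    (8, 500),   # 9th - 16th place
]
listed_total = sum(count * prize for count, prize in payouts)
remainder_for_top_two = 25_000 - listed_total
print(listed_total, remainder_for_top_two)  # 10000 15000
```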
NVIDIA will be streaming a four-hour broadcast every week consisting of group-stage highlights. Registration will close on March 19th at noon (EST). The actual playoffs will take place on May 30th and 31st, also streamed on NVIDIA's Twitch channel.
Quiet, Efficient Gaming
The last few weeks have been dominated by talk about the memory controller of the Maxwell based GTX 970. There are some very strong opinions about that particular issue, and certainly NVIDIA was remiss in actually informing consumers about how that particular product handles its memory. While that debate rages, we have somewhat lost track of other products in the Maxwell range. The GTX 960 was released during this particular firestorm and, while it shares the outstanding power/performance qualities of the Maxwell architecture, it is considered a little overpriced relative to the performance of other cards in its price class.
It is easy to forget that the original Maxwell based product to hit shelves was the GTX 750 series of cards. They were released a year ago to some very interesting reviews. The board was one of the first mainstream cards in recent memory to draw under 75 watts while still playing games at good quality settings at 1080p. Ryan covered this very well, and it turned out to be a perfect gaming card for many pre-built systems that do not have extra power connectors (or a power supply that can support 125+ watt graphics cards). These are relatively inexpensive cards and very easy to install, producing a big jump in performance compared to the integrated graphics of modern CPUs and APUs.
The GTX 750 and GTX 750 Ti have proven to be popular cards due to their overall price, performance, and extremely low power consumption. They also tend to produce a relatively low amount of heat, due to solid cooling combined with that low power consumption. The Maxwell architecture also introduced some new features, but the major changes are to the overall design of the architecture as compared to Kepler. Instead of 192 cores per SMX, there are now 128 cores per SMM. NVIDIA has done a lot of work to improve performance per core as well as lower power consumption in a fairly dramatic way. An interesting side effect is that the CPU hit with Maxwell is a couple of percentage points higher than with Kepler. NVIDIA does lean a bit more on the CPU to improve overall GPU power, but most of this performance hit is covered up by some really good realtime compiler work in the driver.
Asus has taken the GTX 750 Ti and applied their STRIX design and branding to it. While there are certainly faster GPUs on the market, there are none that exhibit the power characteristics of the GTX 750 Ti. The combination of this GPU and the STRIX design should result in an extremely efficient, cool, and silent card.
Subject: Graphics Cards | February 23, 2015 - 04:12 PM | Scott Michaud
Tagged: nvidia, geforce, GTX 970
So apparently NVIDIA and a single AIB partner, Gigabyte, are facing a class action lawsuit because of the GeForce GTX 970 4GB controversy. I am not sure why they singled out Gigabyte, but I guess that is the way things go in the legal world. Unlucky for them, and seemingly lucky for the rest.
For those who are unaware, the controversy is based on NVIDIA claiming that the GeForce GTX 970 has 4GB of RAM, 64 ROPs, and 2048KB of L2 cache. In actuality, it has 56 ROPs and 1792KB of L2 cache. The main talking point is that the RAM is segmented into two partitions, one of 3.5GB and another of 0.5GB. All 4GB are present on the card, though, and accessible (unlike the disabled L2 cache and ROPs). Then again, I cannot see anything in that class action lawsuit's exhibits that claims an incorrect number of ROPs or amount of L2 cache.
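The advertised-versus-actual numbers from NVIDIA's post-launch disclosure can be laid out in a few lines (a simple summary of the figures above, not any official spec sheet format):

```python
# GTX 970 back-end resources: what was advertised at launch vs. what the
# chip actually has enabled. All 4GB of VRAM are addressable, but only
# 3.5GB sits on the full-speed path.
advertised = {"vram_gb": 4.0, "rops": 64, "l2_kb": 2048}
actual     = {"vram_gb": 4.0, "rops": 56, "l2_kb": 1792}

fast_segment_gb = 3.5
slow_segment_gb = actual["vram_gb"] - fast_segment_gb   # 0.5GB slower segment
l2_disabled_kb = advertised["l2_kb"] - actual["l2_kb"]  # 256KB of L2 disabled
print(slow_segment_gb, l2_disabled_kb)
```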
Again, the benchmarks that you saw when the GeForce GTX 970 launched are still valid. Since the issue came up, Ryan has also tried various configurations of games in single- and multi-GPU systems to find conditions that would make the issue appear.
Subject: General Tech, Graphics Cards | February 21, 2015 - 04:23 PM | Scott Michaud
Tagged: wddm 2.0, nvidia, geforce 349.65, geforce, dx12
Update 2: Outside sources have confirmed to PC Perspective that this driver contains DirectX 12 as well as WDDM 2.0. They also claim that Intel and AMD have DirectX 12 drivers available through Windows Update as well. After enabling integrated graphics on my i7-4790K, the Intel HD 4600 received a driver update, which also reports as WDDM 2.0 in DXDIAG. I do not have a compatible AMD GPU to test against (just a couple of old Windows 7 laptops), but the source is probably right, and some AMD GPUs will be updated to DX12 too.
So it turns out that if your motherboard dies during a Windows Update reboot, then you are going to be spending several hours reinstalling software and patches, but that is not important. What is interesting is the installed version number for NVIDIA's GeForce Drivers when Windows Update was finished with its patching: 349.65. These are not available on NVIDIA's website, and the Driver Model reports WDDM 2.0.
It looks like Microsoft pushed out NVIDIA's DirectX 12 drivers through Windows Update. Update 1 Pt. 1: The "Runtime" reporting 11.0 is confusing though, perhaps this is just DX11 with WDDM 2.0?
I am hearing online that these drivers support the GeForce 600 series and later GPUs, and that there are later, non-public drivers available (such as 349.72 whose release notes were leaked online). NVIDIA has already announced that DirectX 12 will be supported on GeForce 400-series and later graphics cards, so Fermi drivers will be coming at some point. For now, it's apparently Kepler-and-later, though.
So with OS support and, now, released graphics drivers, all that we are waiting on is software and an SDK (plus any NDAs that may still be in effect). With Game Developers Conference (GDC 2015) coming up in a little over a week, I expect that we will get each of these very soon.
Update 1 Pt. 2: I should note that the release notes for 349.72 specifically mention DirectX 12. As mentioned above, it is possible that 349.65 contains only WDDM 2.0 and not DX12, but it contains at least WDDM 2.0.
Subject: Graphics Cards | February 21, 2015 - 12:18 PM | Ryan Shrout
Tagged: radeon, nvidia, marketshare, market share, geforce, amd
One of the perennial firms that measures GPU market share, Jon Peddie Research, has come out with a report on Q4 of 2014 this weekend, and the results are eye-opening. According to the data, NVIDIA and AMD each saw dramatic swings from Q4 of 2013 to Q4 of 2014.
| | Q4 2014 | Q3 2014 | Q4 2013 | Year-to-year Change |
|---|---|---|---|---|
| AMD | 24% | - | 35% | -11 points |
| NVIDIA | 76% | - | 64.9% | +11.1 points |
Data source: Jon Peddie Research
Here is the JPR commentary to start us out:
JPR's AIB Report tracks computer add-in graphics boards, which carry discrete graphics chips. AIBs are used in desktop PCs, workstations, servers, and other devices such as scientific instruments. They are sold directly to customers as aftermarket products, or are factory installed. In all cases, AIBs represent the higher end of the graphics industry using discrete chips and private high-speed memory, as compared to the integrated GPUs in CPUs that share slower system memory.
The news was encouraging and seasonally understandable: quarter-to-quarter, the market decreased 0.68% (compared to the desktop PC market, which decreased 3.53%).
On a year-to-year basis, we found that total AIB shipments during the quarter fell 17.52%, which is more than desktop PCs, which fell 0.72%.
However, in spite of the overall decline, somewhat due to tablets and embedded graphics, the PC gaming momentum continues to build and is the bright spot in the AIB market.
NVIDIA's Maxwell GPU
The overall PC desktop market increased quarter-to-quarter, including double attach (the adding of a second or third AIB to a system with integrated processor graphics) and, to a lesser extent, dual AIBs in performance desktop machines using either AMD's Crossfire or NVIDIA's SLI technology.
The attach rate of AIBs to desktop PCs has declined from a high of 63% in Q1 2008 to 36% this quarter.
The year-to-year change that JPR is reporting is substantial, showing a 20+ point swing in market share in favor of NVIDIA over AMD. According to this data, AMD's market share has dropped from 35% at the end of 2013 to just 24% at the end of 2014. Meanwhile, NVIDIA continues to truck forward, going from 64.9% at the end of 2013 to 76% at the end of 2014.
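The "20+ point swing" figure follows directly from the JPR numbers quoted above; a quick sketch of the arithmetic:

```python
# Year-to-year share changes from the JPR Q4 2013 / Q4 2014 figures.
amd_q4_2013, amd_q4_2014 = 35.0, 24.0
nv_q4_2013, nv_q4_2014 = 64.9, 76.0

amd_change = amd_q4_2014 - amd_q4_2013  # -11.0 points
nv_change = nv_q4_2014 - nv_q4_2013     # +11.1 points

# The relative swing toward NVIDIA is the gap between the two changes.
swing = nv_change - amd_change
print(round(swing, 1))  # 22.1
```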
The Radeon R9 285 release didn't have the impact AMD had hoped
Clearly the release of NVIDIA's Maxwell GPUs (the GeForce GTX 750 Ti, GTX 970, and GTX 980) has impacted the market even more than we initially expected. In recent weeks the GTX 970 has been getting a lot of negative press over the memory issue, and I will be curious to see what effect this has on sales in the near future. But the 12-month swing that you see in the table above is the likely cause of the sudden departures of John Byrne, Collette LaForce, and Raj Naik.
AMD has good products, even better pricing, and a team of PR and marketing folks who are talented and aggressive. So how can the company recover from this? Products, people: new products. Will the rumors circling the Radeon R9 390X develop into such a product?
Hopefully 2015 will provide it.
Subject: Graphics Cards | February 20, 2015 - 02:08 PM | Jeremy Hellstrom
Tagged: gigabyte, nvidia, GTX 980 G1 GAMING, windforce, maxwell, factory overclocked
If you want the full complement of Maxwell streaming multiprocessors and ROP units, as well as that last 500MB of RAM running at full speed, then you will need to pay for a GTX 980. One choice is Gigabyte's GTX 980 G1 Gaming, which will cost you $580, around $240 more than a GTX 970, but the premium can be worth it if you need the power. [H]ard|OCP took the already overclocked card from a boost GPU frequency of 1329MHz and memory at 7GHz all the way to a boost of 1513MHz with the memory topping out at 8.11GHz. That overclock had a noticeable effect on performance and helped the card garner an Editor's Choice award. See it in action here.
"Today we have the GIGABYTE GTX 980 G1 GAMING, which features the WINDFORCE 600W cooling system and a high factory overclock. We will make comparisons to the competition, find out how fast it is compared to a GTX 980, and you won't believe the overclock we achieved! We will make both performance and price comparisons."
Here are some more Graphics Card articles from around the web:
- ASUS GTX 980 Matrix 4 GB @ techPowerUp
- OcUK GTX 970 @ HardwareHeaven
- Palit GTX960 Super JetStream @ Kitguru
- PowerColor PCS+ R9 290X 8GB Review @ TechwareLabs
- Sapphire Vapor-X R9 290x 8GB Tri-X Video Card @ Modders-Inc
Subject: General Tech, Mobile | February 20, 2015 - 07:00 AM | Scott Michaud
Tagged: shieldtuesday, shield, Saints Row IV, nvidia, metro last light, gridtuesday, grid, alan wake
Once again, NVIDIA brings some really good games to their GRID service, which is currently free for all SHIELD owners. The concept is that NVIDIA will compute the graphics at their server farms, accept your input, and return an audio/video stream of the result. This is a very convenient way to access content, but it cannot replace actual ownership for guaranteed access to specific art that you find intrinsically valuable. It can help you discover new content, though.
This week, Saints Row IV is available to be played on the GRID gaming service. Its predecessor, Saints Row: The Third, was published on GRID earlier this month. It would be good to play them in order, and they are both worth your time. I did find that the campaign of Saints Row IV was a bit less unique because the majority of its missions were a handful of side-missions strung together, while Saints Row: The Third had more scenario-based objectives, with the side-missions as an option to build up stats (or just be fun) between them. On the other hand, the movement mechanics in IV were genius. Play them both.
Looking ahead, next Tuesday will be Alan Wake. This is a survival-horror title from Remedy that makes you appreciate just how long your batteries last in real life. Basically, electricity is light and light is a vulnerability for the monsters that want to destroy you. The week after, the third of March, is Metro: Last Light Redux. This is one of the most visually demanding games available, and it is still used as a GPU benchmark at this site.
Saints Row IV went live last Tuesday, while Alan Wake arrives on the 24th and Metro: Last Light comes in last, on March 3rd.
Subject: Graphics Cards, Mobile | February 19, 2015 - 03:58 PM | Ryan Shrout
Tagged: nvidia, notebooks, mobile, gpu
After a week or so of debate surrounding NVIDIA's decision to disable overclocking on mobility GPUs, we have word that the company has reconsidered and will be re-enabling the feature in next month's driver release:
As you know, we are constantly tuning and optimizing the performance of your GeForce PC.
We obsess over every possible optimization so that you can enjoy a perfectly stable machine that balances game, thermal, power, and acoustic performance.
Still, many of you enjoy pushing the system even further with overclocking.
Our recent driver update disabled overclocking on some GTX notebooks. We heard from many of you that you would like this feature enabled again. So, we will again be enabling overclocking in our upcoming driver release next month for those affected notebooks.
If you are eager to regain this capability right away, you can also revert back to 344.75.
Now, I don't want to brag here, but we did just rail on NVIDIA for this decision on last night's podcast... and then the decision was posted on NVIDIA's forums just four hours ago... I'm not saying, I'm just saying!
All kidding aside, this is great news! NVIDIA desperately needs to pay attention to what consumers are asking for in order to make up for some poor decisions made over the last several months. Now (or at least soon), you will be able to return to your mobile GPU overclocking!