Subject: General Tech, Graphics Cards, Motherboards, Cases and Cooling | September 28, 2014 - 12:25 AM | Ryan Shrout
Tagged: X99, video, maxwell, live, GTX 980, GTX 970, evga
UPDATE: If you missed the live stream with myself and Jacob, you can catch the entire event in the video below. You won't want to miss out on seeing the first ever GTX 980 water block as well as announcements on new Torq mice!
EVGA has been a busy company recently. It has continued to innovate with new coolers for the recent GTX 980 and GTX 970 card releases, its newer power supplies offer unique features along with improved quality and power output, it has a new line of X99 chipset motherboards including a Micro ATX variant and hey, the company even released a line of high-performance mice this year! PC Perspective has covered basically all of these releases (and will continue to do so with pending GPU and MB reviews) but there is a lot that needs explaining.
To help out, an industry and community favorite will be stopping by from EVGA to the PC Perspective offices: Jacob Freeman. You might know him as @EVGA_JacobF on Twitter or have seen him on countless forums, but he will be making an in-person appearance on Friday, September 26th on PC Perspective Live! We plan on discussing the brand new ACX 2.0 cooler on the Maxwell GPUs released last week, going over some of the highlights of the new X99 motherboards and even touching on power supplies and the Torq mice line as well.
EVGA GTX 980/970, X99, PSU and Torq Live Stream featuring Jacob Freeman
3pm ET / 12pm PT - September 26th
EVGA has been a supporter of PC Perspective for a long time and we asked them to give back to our community during this live stream - and they have stepped up! Look at this prize list:
- 1 x EVGA GeForce GTX 980 SC
- 1 x EVGA GeForce GTX 970 SC ACX 2.0
- 1 x EVGA X99 Classified
- 1 x EVGA X99 FTW
- 1 x EVGA X99 SLI
- 1 x EVGA SuperNOVA 750 G2 PSU
- 4 x Torq Mice
How can you participate and win these awesome pieces of hardware? Just be here at 3pm ET / 12pm PT on http://www.pcper.com/live and we'll be announcing winners as we go for those that tune in. It really couldn't be simpler!
If you have questions you want to ask Jacob about EVGA, or any of its line of products, please leave them in the comments section below and we'll start compiling a list to address on the live stream Friday. Who knows, we may even save some prizes for some of our favorite questions!
To make sure you don't miss our live stream events, be sure you sign up for our spam-free PC Perspective Live! Mailing List. We email that group a couple hours before each event gets started.
Subject: Graphics Cards | September 27, 2014 - 07:24 PM | Ryan Shrout
Tagged: nvidia, maxwell, gsync, g-sync, freesync, adaptive sync
During an interview that we streamed live with NVIDIA's Tom Petersen this past Thursday, it was confirmed that NVIDIA is not currently working on, and has no current plans to add, support for the VESA-based and AMD-backed Adaptive Sync portion of the DisplayPort 1.2a specification. To quote directly:
"There is no truth [to that rumor of NVIDIA Adaptive Sync support] and we have made no official comments about Adaptive Sync. One thing I can say is that NVIDIA as a company is 100% dedicated to G-Sync. We are going to continue to invest in G-Sync and it is a way we can make the gaming experience better. We have no need for Adaptive Sync. We have no intention of [implementing it]."
To be clear, the Adaptive Sync portion of DP 1.2a and 1.3+ is an optional part of the VESA spec that is not required for future graphics processors or even future display scaler chips. That means that upcoming graphics cards from NVIDIA could still be DisplayPort 1.3 compliant without implementing support for the Adaptive Sync feature. Based on the comments above, I fully expect that to be the case.
The ASUS ROG Swift PG278Q G-Sync monitor
With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users with Radeon cards. (Where Intel falls into this is up in the air.) Clearly that isn't ideal for a completely open ecosystem, but NVIDIA has made the point, over and over, that what it has developed with G-Sync is a hard problem, not something solved by the comparatively blunt instrument that is Adaptive Sync. NVIDIA has a history of producing technologies and then keeping them in-house, focusing on development specifically for GeForce owners and fans. The dream of having a VRR monitor that will run on both vendors' GPUs appears to be dead.
When asked about the possibility of seeing future monitors that can support both NVIDIA G-Sync technology as well as Adaptive Sync technology, Petersen stated that while not impossible, he "would not expect to see such a device."
The future of G-Sync is still in development. Petersen stated:
"Don't think that we're done. G-Sync is not done. Think of G-Sync as the start of NVIDIA solving the problems for gamers that are related to displays...G-Sync is our first technology that makes games look better on displays. But you can start looking at displays and make a lot of things better."
Diagram showing how G-Sync affects monitor timings
So now we await the first round of prototype FreeSync / Adaptive Sync monitors to hit our labs. AMD has put a lot of pressure on itself for this release by claiming, numerous times, that FreeSync will be just as good an experience as G-Sync, and I am eager to see if the company can meet that goal. Despite any ill feelings that some users might have about NVIDIA and some of its policies, it typically does a good job of maintaining a high quality user experience with these custom technologies. AMD will have to prove that what it has developed is on the same level. We should know more about that before we get too much further into fall.
You can check out our stories and reviews covering G-Sync here:
- PCPer Live! NVIDIA Maxwell, GTX 980, GTX 970 Discussion with Tom Petersen, Q&A
- Acer XB280HK 28-in 4K G-Sync Monitor Review
- NVIDIA G-Sync Surround Impressions: Using 3 ASUS ROG Swift Displays
- PCPer Live! Recap - NVIDIA G-Sync Surround Demo and Q&A
- ASUS ROG Swift PG278Q 27-in Monitor Review - NVIDIA G-Sync at 2560x1440
Subject: General Tech, Graphics Cards | September 27, 2014 - 02:59 AM | Scott Michaud
Tagged: rage, pc gaming, consolitis
Shinji Mikami has been developing a survival horror game, which makes sense given a good portion of his portfolio. He created Resident Evil and much of the following franchise. The Evil Within is about to release, having recently gone gold. At around this time, publishers begin to release system requirements and Bethesda does not disappoint in that regard.
Are the requirements... RAGE-inducing?
A case could be made for disappointing requirements, themselves, though.
Basically, Bethesda did not release minimum requirements. Instead, they said "This is what we recommend. It will run on less. Hope it does!" This would not be so problematic if one of their requirements weren't a "GeForce GTX 670 with 4GBs of VRAM".
They also recommend a quad-core Core i7, 4GB of system memory, 50GB of hard drive space, and a 64-bit OS (Windows 7 or Windows 8.x).
Before I go on, I would like to mention that The Evil Within is built on the RAGE engine. Our site dealt extensively with that technology when it first came out in 2011. While I personally did not have many showstopping performance problems with that game, it did have a history of texture streaming issues. Keep that in mind as you continue to read.
A typical GTX 670 does not have 4GB of VRAM. In fact, even the GTX 780 Ti does not have 4GB of VRAM. Thankfully, both of the newly released Maxwell GPUs, the GTX 970 and the GTX 980, have at least 4GB. Basically, Bethesda is saying, "I really hope you bought the custom model from your AIB vendor". They literally say:
Note: We do not have a list of minimum requirements for the game. If you’re trying to play with a rig with settings below these requirements (you should plan to have 4 GBs of VRAM regardless), we cannot guarantee optimal performance.
Each time I read, "You should plan to have 4 GBs of VRAM regardless", it becomes more difficult for me to form an opinion about it. That is a lot of memory. Personally, I would wait for reviews and benchmarks, specifically for the PC, before purchasing the title. These recommended settings could be fairly loose, to suit the vision of the game developers, or the game could be a revival of RAGE, this time without the engine's original architect on staff.
The Evil Within launches on October 14th.
Subject: Graphics Cards | September 26, 2014 - 02:22 PM | Jeremy Hellstrom
Tagged: asus, ROG, gtx 780 ti, MATRIX Platinum, DirectCU II
With the release of the new Maxwell cards comes an opportunity for those with a smaller budget to still get a decent upgrade for their systems. Early adopters will often sell their previous GPUs once they've upgraded, allowing you to get a better card than your budget would usually allow, though with the risk of ending up with a bum card. The ASUS ROG GTX 780 Ti MATRIX Platinum is a good example: it has a DirectCU II air cooler for general usage, but the LN2 switch will also allow more extreme cooling methods for those looking for something a little more impressive. The factory overclock is not bad at 1006/1072MHz core and 7GHz effective memory, but the overclock [H]ard|OCP managed, 1155/1220MHz and 7.05GHz, pushes the performance above that of the R9 290X of the same family. If you can find this card used at a decent price it could give you more of an upgrade than you thought you could afford.
"In today's evaluation we are breaking down the ASUS ROG GTX 780 Ti MATRIX Platinum video card. We put this head-to-head with the ASUS ROG R9 290X MATRIX Platinum. Which provides a better gaming experience, best overclocking performance, and power and temperature? Which one provides the best value?"
Here are some more Graphics Card articles from around the web:
- MSI GTX 980 OC @ HardwareHeaven
- Taking It To The Limit: Overclocking NVIDIA’s GeForce GTX 970 & 980 @ Techgage
- Gigabyte G1 Gaming Geforce GTX 980 Review @ HiTech Legion
- Palit GTX 970 JetStream 4 GB @ techPowerUp
- ASUS Strix Edition GeForce GTX 970 Graphics Card Review @ Techgage
- ASUS GTX 970 STRIX OC Review @ Hardware Canucks
- Palit GTX970 JetStream OC @ Kitguru
- Testing Nvidia’s GeForce GTX 980 4GB Graphics Cards In SLI @ eTeknix
- Gigabyte G1 Gaming GeForce GTX 980 4GB @ eTeknix
- ASUS STRIX GTX 970 DirectCU II OC 4GB Review @HiTech Legion
- Nvidia GeForce GTX 980 4GB @ eTeknix
- MSI GTX 970 Gaming 4G Review @HiTech Legion
- Nvidia Quadro K5200, K4200 and K2200 Professional Graphics Cards @ X-bit Labs
- Gigabyte R7 250X OC Performance Review @ Neoseeker
- Sapphire R9 285 ITX Compact v MSI GTX760 Gaming Mini ITX @ Kitguru
Subject: Graphics Cards | September 26, 2014 - 12:14 PM | Ryan Shrout
Tagged: vxgi, video, tom petersen, nvidia, mfaa, maxwell, livestream, live, GTX 980, GTX 970, dsr
UPDATE: If you missed the live stream yesterday, I have good news: the interview and all the information/demos provided are available to you on demand right here. Enjoy!
Last week NVIDIA launched GM204, otherwise known as Maxwell and now branded as the GeForce GTX 980 and GTX 970 graphics cards. You should, of course, have already read the PC Perspective review of these two GPUs, but undoubtedly there are going to be questions and thoughts circulating through the industry.
To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Thursday afternoon where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Thursday, September 25th at 4pm ET / 1pm PT to discuss the new Maxwell GPU, the GTX 980 and GTX 970, new features like Dynamic Super Resolution, MFAA, VXGI and more! You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA Maxwell Live Stream
1pm PT / 4pm ET - September 25th
We also want your questions!! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event but often time there is a lot of noise to deal with.
So be sure to join us on Thursday afternoon!
UPDATE: We have confirmed at least a handful of prizes for those of you that tune into the live stream today. We'll giveaway an NVIDIA SHIELD as well as several of the brand new SLI LED bridges that were announced for sale this week!
Subject: General Tech, Graphics Cards | September 26, 2014 - 02:03 AM | Scott Michaud
Tagged: steam, precisionx 16, precisionx, overclocking, nvidia, evga
If you were looking to download EVGA Precision X recently, you were likely disappointed. For a few months now, the software was unavailable because of a disagreement between the add-in board (AIB) partner and Guru3D (and the RivaTuner community). EVGA maintains that it was a completely original work, and references to RivaTuner are a documentation error. As a result, they pulled the tool just a few days after launching X 15.
This new version, besides probably cleaning up all of the existing issues mentioned above, adds support for the new GeForce GTX 900-series cards, a new interface, an "OSD" for inside applications, and Steam Achievements (??). You can earn a permanent badge on your Steam account for breaking 1200 MHz on your GPU, taking a screenshot, or restoring settings to default. I expect that the latter badge is meant as a badge of shame, like the Purple Heart in Battlefield, though it is not actually a bad thing and says nothing about your overclocking skills. Seriously, save yourself some headache and just press default if things do not seem right.
PrecisionX 16 is free, available now, and doesn't require an EVGA card (just a site sign-up).
Subject: General Tech, Graphics Cards | September 24, 2014 - 02:41 AM | Scott Michaud
Tagged: sli, nvidia
SLI Bridges are thrown in with compatible motherboards and there is usually little reason to want anything else. They work. There is no performance advantage for getting a "better" one, unless it does not connect with your specific arrangement of two-to-four cards. Today, NVIDIA gives another reason: a soft, beautiful glow to match the green "GeForce GTX" on the cards themselves.
Mind you, this is not the first glowing SLI Bridge. EVGA even provided us with a few of their own for a giveaway last year.
NVIDIA has three models, depending on the layout of your cards. 3-way SLI will need to be arranged as a series of two-wide with no gaps, using the "3-Way SLI Bridge". 2-way configurations have the choice of two empty slots between the two-wide cards, or no gap; the former would purchase the "2-Way Spaced SLI Bridge" and the latter, the "2-Way SLI Bridge". They each require GeForce GTX 770 cards, or better, as well as a recent GeForce Experience (1.7+). Certain non-reference designs may be incompatible.
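The layout rules above boil down to a simple lookup. A minimal sketch (the function name and error handling are my own, only the bridge names and slot gaps come from NVIDIA's lineup as described):

```python
# Illustrative picker for NVIDIA's LED SLI bridge lineup: the right model
# depends on the number of cards and the empty slots between two-wide cards.

def pick_sli_bridge(num_cards: int, empty_slots_between: int) -> str:
    """Return the bridge model for a given card layout (names per the lineup)."""
    if num_cards == 3:
        if empty_slots_between != 0:
            raise ValueError("3-way layouts must be stacked with no gaps")
        return "3-Way SLI Bridge"
    if num_cards == 2:
        if empty_slots_between == 0:
            return "2-Way SLI Bridge"
        if empty_slots_between == 2:
            return "2-Way Spaced SLI Bridge"
        raise ValueError("only no-gap or two-slot-gap layouts are covered")
    raise ValueError("the LED bridges cover 2- and 3-way layouts only")

print(pick_sli_bridge(2, 2))  # → 2-Way Spaced SLI Bridge
```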
The SLI Bridges are available now. Both 2-Way bridges are $29.99 and the 3-Way is $39.99.
NVIDIA GeForce 344.11 Driver and GeForce Experience 2.1.2 Released Alongside Maxwell-based GTX 980 and GTX 970
Subject: General Tech, Graphics Cards | September 20, 2014 - 12:56 PM | Scott Michaud
Tagged: nvidia, maxwell, graphics drivers, geforce experience
Update: There is also the 344.16 for the GTX 970 and GTX 980, resolving an issue specific to them.
When it releases a new graphics card, especially one with a new architecture, NVIDIA will have software ready to support it. First and most obvious, Maxwell comes with the GeForce 344.11 drivers, which are the first to support only Fermi and later GPUs. Mostly, the driver's purpose is to support the new graphics cards and optimize for Borderlands: The Pre-Sequel, The Evil Within, F1 2014, and Alien: Isolation. It also supports multi-monitor G-Sync, which was previously impossible, even with three single-DisplayPort Kepler cards.
At the same time, NVIDIA launched a new GeForce Experience with more exciting features. First, and to me least expected, it allows the SHIELD Wireless Controller to be connected to a PC, though only wired via its provided USB cable. This also means that you cannot use the controller without a GeForce graphics card.
If you have a GeForce GTX 900-series add-in board, you will be able to use Dynamic Super Resolution (DSR) and record in 4K video with ShadowPlay. Performance when recording on a PC in SLI mode has been improved also, apparently even for Kepler-based cards.
Subject: General Tech, Graphics Cards | September 20, 2014 - 12:06 PM | Scott Michaud
Tagged: unreal engine 4, nvidia, microsoft, maxwell, DirectX 12, DirectX
Microsoft and NVIDIA have decided to release some information about DirectX 12 (and DirectX 11.3) alongside the launch of the Maxwell-based GeForce GTX 980 and GeForce GTX 970 graphics cards. Mostly, they announced that Microsoft teamed up with Epic Games to bring DirectX 12 to Unreal Engine 4. They currently have two demos, Elemental and Infiltrator, that are up and running with DirectX 12.
Moreover, they have provided a form for developers who are interested in "early access" to apply for it. They continually discuss it in terms of Unreal Engine 4, but they do not explicitly say that other developers cannot apply. UE4 subscribers will get access to the Elemental demo in DX12, but it does not look like Infiltrator will be available.
DirectX 12 is expected to target games for Holiday 2015.
Subject: Graphics Cards | September 19, 2014 - 04:57 PM | Jeremy Hellstrom
Tagged: asus, strix, STRIX GTX 970, STRIX GTX 980, maxwell
The ASUS STRIX series comes with a custom DirectCU II cooler that is capable of running at 0dB when not under full load; in fact, you can choose the temperature at which the fans activate using the included GPU Tweak application. The factory overclock is modest, but thanks to that cooler and the 10-phase power design you will be able to push the card even further. The best news is the price: you get all of these extras for almost the same price as the reference cards are selling at!
Fremont, CA (19th September, 2014) - ASUS today announced the STRIX GTX 980 and STRIX GTX 970, all-new gaming graphics cards packed with exclusive ASUS technologies, including DirectCU II and GPU Tweak for cooler, quieter and faster performance. The STRIX GTX 980 and STRIX GTX 970 are factory-overclocked at 1279MHz and 1253MHz respectively and are fitted with 4GB of high-speed GDDR5 video memory operating at speeds up to 7010MHz for the best gameplay experience.
Play League of Legends and StarCraft in silence!
The STRIX GTX 980 and STRIX GTX 970 both come with the ASUS-exclusive DirectCU II cooling technology. With a 10mm heatpipe to transport heat away from the GPU core, operating temperatures are 30% cooler and operation is 3X quieter than reference designs. Efficient cooling and lower operating temperatures allow STRIX graphics cards to incorporate an intelligent fan-stop mode that can handle games such as League of Legends and StarCraft passively, making both cards ideal for gamers who prefer high-performance, low-noise PCs.
Improved stability and reliability with Digi+ VRM technology
STRIX GTX 980 and STRIX GTX 970 graphics cards include Digi+ VRM technology. The 10-phase power design in the STRIX GTX 980 and 6-phase design in the STRIX GTX 970 use a digital voltage regulator to reduce power noise by 30% and enhance energy efficiency by 15% – increasing long term stability and reliability. The STRIX GTX 970 is designed to use a single 8-pin power connector for clean and easy cable management.
Real-time monitoring and control with GPU Tweak software
The STRIX GTX 980 and STRIX GTX 970 come with GPU Tweak, an exclusive ASUS tool that enables users to squeeze the very best performance from their graphics card. GPU Tweak provides the ability to finely control GPU speeds, voltages and video memory clock speeds in real time, so overclocking is easy and can be carried out with high confidence.
GPU Tweak also includes a streaming tool that lets users share on-screen action over the internet in real time, meaning others can watch live as games are played. It is even possible to add a title to the streaming window along with scrolling text, pictures and webcam images.
AVAILABILITY & PRICING
ASUS STRIX GTX 980 and GTX 970 graphics cards will be available at ASUS authorized resellers and distributors starting on September 19, 2014. Suggested US MSRP pricing is $559 for the STRIX GTX 980 and $339 for the STRIX GTX 970.
Subject: Graphics Cards | September 19, 2014 - 02:17 PM | Jeremy Hellstrom
Tagged: vr direct, video, nvidia, mfaa, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr
The answers to the two most important questions are as follows: the GTX 980 will cost you around $560 compared to $500 for an R9 290X, and the GTX 970 an attractive $330 compared to $380 for an R9 290. Availability is hard to predict, but the cards will be shipping soon and you can pre-order your choice of card by following the links on the last page of Ryan's review. Among all the new features that have been added to this new GPU, one of the most impressive is the power draw: as you can see in [H]ard|OCP's review, this card pulls 100W less than the 290X at full load, although it did run warmer than the 290X Double Dissipation card which [H] compared it to, something that may change with a 980 bearing a custom cooler. Follow those links to see the benchmarking results of this card, both synthetic and in-game.
"Today NVIDIA launches its newest Maxwell GPU. There will be two new GPUs, the GeForce GTX 980 and GeForce GTX 970. These next generation GPUs usher in new features and performance that move the gaming industry forward. We discuss new features, architecture, and evaluate the gameplay performance against the competition."
Here are some more Graphics Card articles from around the web:
- Nvidia's GeForce GTX 980 and 970 @ The Tech Report
- Maxwell is here! GTX 980 is now revealed! @ Bjorn3D.com
- Nvidia unveils second-gen Maxwell GPUs, the Geforce GTX 970 and GTX 980 @ The Inquirer
- NVIDIA GeForce GTX 980 4GB Review @HiTech Legion
- NVIDIA GeForce GTX 980 Review @ Neoseeker
- NVIDIA GTX 980 Review @ OCC
- NVIDIA GeForce GTX 980 Maxwell Video Card Review @ Legit Reviews
- MSI GTX 970 Gaming 4G GPU Review @ Modders-Inc
- GeForce GTX 980 @ HardwareHeaven
- NVIDIA GeForce GTX 980 @ Benchmark Reviews
- NVIDIA GeForce GTX 980 Review: Does Maxwell Bring Maximum Gameplay? @ Techgage
- ASUS Strix GTX 970 OC 4 GB @ techPowerUp
- NVIDIA GeForce GTX 980 Performance Review @ Hardware Canucks
- EVGA GTX 970 SC ACX 4 GB @ techPowerUp
- NVIDIA GeForce GTX 970 SLI @ techPowerUp
- NVIDIA GeForce GTX 980 4 GB @ techPowerUp
- Gigabyte G1 Gaming GTX 970 & GTX 980 @ Legion Hardware
- Nvidia Geforce GTX 980 @ Kitguru
- Asus GTX970 STRIX OC @ Kitguru
- MSI GTX 970 Gaming 4G @ Kitguru
Subject: Graphics Cards, Displays | September 18, 2014 - 06:52 PM | Scott Michaud
Tagged: amd, freesync, DisplayPort, adaptive sync
MStar, Novatek, and Realtek, three vendors of scaler units for use in displays, have announced support for AMD's FreeSync. Specifically, for the Q1'15 line of monitors, these partners will provide scaler chips that use DisplayPort Adaptive-Sync and, when paired with a compatible AMD GPU, will support FreeSync.
The press release claims that these scaler chips will either support 1080p and 1440p monitors at up to 144Hz, or drive 4K displays at up to 60Hz. While this is promising, at least compared to the selection at G-Sync's launch a year earlier, it does not mean that this variety of monitors will be available -- just that internal components will be available for interested display vendors. It does, however, suggest that there are probably interested display vendors.
AMD and partners "intend to reveal" displays via a "media review program" in Q1. This is a little later than what we expected from Richard Huddy's "next month" statements, but it is possible that "Sampling" and "Media Review Program" are two different events. Even if it is "late", this is the sort of thing that is forgivable to me (missing a few months while relying on a standards body and several, independent companies).
Subject: Graphics Cards | September 18, 2014 - 03:56 PM | Jeremy Hellstrom
Tagged: DirectCU II, asus, STRIX GTX 750 Ti OC, factory overclocked
The ASUS STRIX GTX 750 Ti OC sports the custom DirectCU II cooling system, which not only improves the temperatures on the card but also reduces the noise produced by the fans. It comes out of the box with an overclocked GPU base clock of 1124MHz and a boost clock of 1202MHz, with the 2GB of VRAM at the stock speed of 5.4GHz; [H]ard|OCP managed to increase that to an impressive 1219/1297MHz and 6.0GHz on the VRAM without increasing voltages. Unfortunately, even with that overclock it lagged behind the Sapphire Radeon R9 270 Dual-X, which happens to be about the same price at $170.
"Rounding out our look at ASUS' new STRIX technology we have another STRIX capable video card on our test bench today, this time based on the GTX 750 Ti GPU. We will take the ASUS STRIX GTX 750 Ti OC Edition and test it against an AMD Radeon R9 270 and AMD Radeon R9 265 to see what reigns supreme under $200."
Here are some more Graphics Card articles from around the web:
- ASUS GTX750 Ti STRIX OC Edition @ Kitguru
- NZXT Kraken G10 Review @ OCC
- ASUS ROG R9 290X MATRIX Platinum @ [H]ard|OCP
- Radeon R9-285 @ HardwareHeaven
- Sapphire R9 285 2GB ITX Compact OC Edition @ eTeknix
- How Low Should You Go? ASUS Radeon R7 250X Graphics Card Review @ Techgage
- Sapphire ITX Compact R9 285 OC Review @ OCC
- XFX Radeon R9 285 DD @ Benchmark Reviews
Subject: General Tech, Graphics Cards | September 15, 2014 - 08:02 PM | Scott Michaud
Tagged: nvidia, geforce, GTX 980
Details and photographs of the GeForce GTX 980 are leaking on various forums and websites. Based on the Maxwell architecture, it is supposed to be faster and more efficient than Kepler while being manufactured on an identical, 28nm fab process. While we were uncertain before, it now looks like the GTX 980 will be its formal name, as seen in leaked photographs, below.
Image Credit: Videocardz
As expected, the cooler is a continuation of NVIDIA's reference cooler, as seen on recent high-end graphics cards (such as the GeForce Titan). Again, this is not a surprise. The interesting part is that it is rated for about 250W whereas Maxwell is rumored to draw 180W. While the reference card has two six-pin PCIe power connectors, I am curious to see if the excess cooling will lead to interesting overclocks. That is not even mentioning what the AIB partners can do.
Image Credit: Videocardz
Beyond its over-engineering for Maxwell's TDP, it also includes a back plate.
Its display connectors have been hotly anticipated. As you can see above, the GTX 980 has five outputs: three DisplayPort, one HDMI, and one DVI. Which version of HDMI? Which version of DisplayPort? No clue at the moment. There has been some speculation regarding HDMI 2.0, and the DisplayPort 1.3 standard was just released to the public today, but I would be pleasantly surprised if even one of these made it in.
Check out Videocardz for a little bit more.
Subject: General Tech, Graphics Cards, Cases and Cooling | September 15, 2014 - 05:50 PM | Scott Michaud
Tagged: amd, R9, r9 390x, liquid cooler, liquid cooling, liquid cooling system, asetek
Less than a year after the launch of AMD's R9 290X, we are beginning to hear rumors of a follow-up. What is being called the R9 390X (because if it is called anything else, then that was a very short-lived branding scheme) might be liquid cooled. This would be the first single-processor reference graphics card to have an integrated water cooler. That said, the public evidence is not as firm as I would normally like.
Image Credit: Baidu Forums
According to Tom's Hardware, Asetek is working on a liquid-cooled design for "an undisclosed OEM". The product is expected to ship during the first half of 2015 and the press release claims that it will "continue Asetek's success in the growing liquid cooling market". Technically, this could be a collaboration with an AIB partner, not necessarily a GPU developer. That said, the leaked photograph looks like a reference card.
We don't really know anything more than this. I would expect that it will be a refresh based on Hawaii, but that is pure speculation. I have no evidence to support that.
Personally, I would hope that a standalone air-cooled model would be available. While I have no experience with liquid cooling, it seems like a bit of an extra burden that not all purchasers of a top-of-the-line single-GPU add-in board would want to bear. Specifically, placing the radiator, if their case even supports one. That said, having a high-performing reference card will probably make the initial benchmarks look extra impressive, which could be a win in itself.
Subject: General Tech, Graphics Cards | September 11, 2014 - 07:46 PM | Scott Michaud
Tagged: quad sli, quad crossfire, nvidia, amd
Psst. AMD fans. Don't tell "Team Green" but Linus decided to take four R9 290X graphics cards and configure them in Quad Crossfire formation. They did not seem to have too much difficulty setting it up, although they did have trouble with throttling and setting up Crossfire profiles. When they finally were able to test it, they got a 3D Mark Fire Strike Extreme score of 14979.
Psst. NVIDIA fans. Don't tell "Team Red" but Linus decided to take four GeForce Titan Black graphics cards and configure them in Quad SLI formation. He had a bit of a difficult time setting up the machine at first, requiring a reshuffle of the cards (how would reordering PCIe slots for identical cards do anything?) and a few driver crashes, but it worked. Eventually, they got a 3D Mark Fire Strike Extreme score of around 13,300 (give or take a couple hundred).
Subject: General Tech, Graphics Cards | September 8, 2014 - 05:03 PM | Jeremy Hellstrom
Tagged: leak, nvidia, GM204, GTX 980, GTX 980M, GTX 970, GTX 970M
Please keep in mind that this information has been assembled via research done by WCCF Tech and Videocardz off of 3DMark entries of unreleased GPUs; we won't get the official numbers until the middle of this month. That said, rumours and guesswork about new hardware are a favourite pastime of our readers, so here is the information we've seen so far about the upcoming GM204 chip from NVIDIA. On the desktop side are the GeForce GTX 980 and GeForce GTX 970, which should both have 4GB of GDDR5 on a 256-bit bus with GPU clock speeds ranging from 1127 to 1190 MHz. The performance shown in 3DMark has the GTX 980 beating the 780 Ti and R9 290X, and the GTX 970 performing similarly to the plain GTX 780 while falling behind the 290X. SLI scaling looks rather attractive, with a pair of GTX 980s coming within a hair of the performance of the R9 295X2.
On the mobile side things look bleak for AMD: the GTX 980M and GTX 970M surpass the current GTX 880M, which in turn benchmarks far better than AMD's M290X chip. Again, the scaling in SLI systems will be impressive, assuming that the leaks, which you can see in depth here, are accurate. It won't be too much longer before we know one way or the other, so you might want to keep your finger off the Buy button for a short while.
Subject: General Tech, Graphics Cards, Processors | September 6, 2014 - 05:25 PM | Scott Michaud
Tagged: iris pro, iris, intel hd graphics, Intel
I was originally intending to test this with benchmarks but, after a little while, I realized that Ivy Bridge was not supported. This graphics driver starts and ends with Haswell. While I cannot verify their claims, Intel advertises up to 30% more performance in some OpenCL tasks and a 10% increase in games like Batman: Arkham City and Sleeping Dogs. They even claim double performance out of League of Legends at 1366x768.
Intel is giving gamers a "free lunch".
The driver also tunes Conservative Morphological Anti-Aliasing (CMAA). They claim it looks better than MLAA and FXAA, "without performance impact" (their whitepaper from March showed a ~1-to-1.5 millisecond cost on Intel HD 5000). Intel recommends disabling it after exiting games to prevent it from blurring other applications, and they automatically disable it in Windows, Internet Explorer, Chrome, Firefox, and Windows 8.1 Photo.
Adaptive Rendering Control was also added in this driver. It reduces the work spent redrawing identical frames by comparing each frame it does draw against the previously drawn one, and adjusts the frame rate accordingly. This is most useful for games like Angry Birds, Minesweeper, and Bejeweled LIVE. It is disabled when not on battery power, or when the driver is set to "Maximum Performance".
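The frame-comparison idea can be sketched in a few lines. This is a hypothetical illustration of the concept, not Intel's implementation; the class and its names are mine, and a real driver would compare frames far more cheaply than hashing the whole framebuffer:

```python
# Conceptual sketch: hash each rendered frame and skip presenting it when
# nothing has changed since the last one, throttling redundant redraws.
import hashlib

class FrameSkipper:
    def __init__(self):
        self._last_digest = None
        self.presented = 0
        self.skipped = 0

    def submit(self, framebuffer: bytes) -> bool:
        """Present the frame only if it differs from the previous one."""
        digest = hashlib.sha1(framebuffer).digest()
        if digest == self._last_digest:
            self.skipped += 1   # identical frame: no need to present again
            return False
        self._last_digest = digest
        self.presented += 1
        return True

# A static menu screen repeats; only the changes get presented.
skipper = FrameSkipper()
for frame in [b"menu", b"menu", b"menu", b"gameplay"]:
    skipper.submit(frame)
print(skipper.presented, skipper.skipped)  # → 2 2
```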
Subject: General Tech, Graphics Cards | September 3, 2014 - 09:28 PM | Scott Michaud
Tagged: amd, R9, r9 295x2, price cut
While not fully in effect yet, AMD is cutting $500 off of the R9 295X2 price tag, down to $999 USD. Currently, there are two models available on Newegg USA at the reduced price, and one at Amazon for $1200. We expect to see other SKUs reduced soon as well. This puts the water-cooled R9 295X2 just below the cost of two air-cooled R9 290X graphics cards.
If you were interested in this card, now might be the time (if one of the reduced units is available).
Subject: General Tech, Graphics Cards | September 3, 2014 - 06:15 PM | Scott Michaud
Tagged: Matrox, firepro, cape verde xt gl, cape verde xt, cape verde, amd
Matrox, along with S3, develops GPU ASICs for use with desktop add-in boards, alongside AMD and NVIDIA. Last year, Matrox sold fewer than 7,000 units in a quarter according to my math (a market share that rounds to 0.0% implies under 0.05% of the total market, which worked out to about 7,000 units that quarter). Today, Matrox Graphics Inc. announced that it will use an AMD GPU on its upcoming product line.
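For the curious, the back-of-the-envelope bound works like this (the ~14 million total is the figure implied by the 7,000-unit result above, used here as an assumption):

```python
# A reported share of 0.0% means the true share is below the rounding
# threshold of 0.05%. Multiplying by total quarterly add-in-board
# shipments gives an upper bound on units sold.

total_units = 14_000_000      # assumed total market that quarter
rounding_threshold = 0.0005   # 0.05%: anything lower rounds to 0.0%

max_units = total_units * rounding_threshold
print(f"fewer than {max_units:,.0f} units")  # → fewer than 7,000 units
```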
While they do not mention a specific processor, they note that "the selected AMD GPU" will be manufactured at a 28nm process with 1.5 billion transistors. It will support DirectX 11.2, OpenGL 4.4, and OpenCL 1.2. It will have a 128-bit memory bus.
Basically, it kind-of has to be Cape Verde XT (or XT GL) unless it is a new, unannounced GPU.
If it is Cape Verde XT, it would have about 1.0 to 1.2 TFLOPs of single precision performance (depending on the chosen clock rate). Whatever clock rate is chosen, the chip contains 640 shader processors. It was first released in February 2012 with the Radeon HD 7770 GHz Edition. Again, this is assuming that AMD will not release a GPU refresh for that category.
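That 1.0 to 1.2 TFLOP range falls out of the standard peak-throughput estimate: shader count, times two operations per clock (a fused multiply-add), times clock rate. The specific clock values below are my assumptions to bracket the article's range:

```python
# Peak single-precision estimate for a 640-shader GPU such as Cape Verde XT:
# FLOPS = shaders x 2 ops per clock (FMA) x clock rate.

SHADERS = 640  # Cape Verde XT stream processors

def peak_tflops(clock_mhz: float) -> float:
    return SHADERS * 2 * clock_mhz * 1e6 / 1e12

print(round(peak_tflops(800), 3))   # → 1.024 TFLOPs at an assumed 800 MHz
print(round(peak_tflops(950), 3))   # → 1.216 TFLOPs at an assumed 950 MHz
```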
Matrox will provide its PowerDesk software to configure multiple monitors. It will work alongside AMD's professional graphics drivers. It is sad to see a GPU ASIC manufacturer throw in the towel, at least temporarily, but hopefully they can use AMD's technology to remain in the business with competitive products. Who knows: maybe they will make a return when future graphics APIs reduce the burden of driver and product development?