Subject: Graphics Cards | October 28, 2013 - 09:29 AM | Ryan Shrout
Tagged: nvidia, kepler, gtx 780 ti, gtx 780, gtx 770, geforce
A lot of news is coming out of the NVIDIA camp today, including some price drops and a new price announcement.
First up, the high-powered GeForce GTX 780 is dropping from $649 to $499, a $150 savings that brings the GTX 780 in line with the competition from AMD's new Radeon R9 290X, launched last week.
Next, the GeForce GTX 770 2GB is going to drop from $399 to $329 to help it compete more closely with the R9 280X.
Even if you weren't excited about the R9 290X, you have to be excited by competition.
In a surprising turn of events, NVIDIA is now the company with the great GPU bundle deal as well! Starting today you'll be able to get a free copy of Batman: Arkham Origins, Splinter Cell: Blacklist and Assassin's Creed IV: Black Flag with the GeForce GTX 780 Ti, GTX 780 and GTX 770. If you step down to the GTX 760 or 660 you'll lose out on the Batman title.
SHIELD discounts are available as well: $100 off if you buy one of the upper tier GPUs and $50 off if you buy the lower tier.
UPDATE: NVIDIA just released a new version of GeForce Experience that enables ShadowPlay, the ability to use Kepler GPUs to record game play in the background with almost no CPU/system overhead. You can see Scott's initial impressions of the software right here; it seems like it's going to be a pretty awesome feature.
Need more news? The yet-to-be-released GeForce GTX 780 Ti is also getting a price - $699 based on the email we just received. And it will be available starting November 7th!!
With all of this news, how does it change our stance on the graphics market? Quite a bit in fact. The huge price drop on the GTX 780, coupled with the 3-game bundle means that NVIDIA is likely offering the better hardware/software combo for gamers this fall. Yes, the R9 290X is likely still a step faster, but now you can get the GTX 780, three great games and spend $50 less.
The GTX 770 is now poised to make a case for itself against the R9 280X as well with its $70 drop. The R9 280X / HD 7970 GHz Edition was definitely the better option with a $100 price delta, but with only $30 now separating the two competing cards, plus the three free games, the advantage will again likely fall to NVIDIA.
Finally, the price point of the GTX 780 Ti is interesting - if NVIDIA is smart they are pricing it based on comparable performance to the R9 290X from AMD. If that is the case, then we can guess the GTX 780 Ti will be a bit faster than the Hawaii card, while likely being quieter and using less power too. Oh, and again, the three game bundle.
NVIDIA did NOT announce a GTX TITAN price drop, which might surprise some people. I think the reason why will be addressed with the launch of the GTX 780 Ti next month, but from what I was hearing over the last couple of weeks, NVIDIA can't make the cards fast enough to satisfy demand, so reducing margin there just didn't make sense.
NVIDIA has taken a surprisingly aggressive stance here in the discrete GPU market. The need to address and silence critics who think the GeForce brand is being damaged by AMD's console wins is obviously potent inside the company. The good news for us, and for the gaming community as a whole, is that this just means better products and better value for graphics card purchases this holiday.
NVIDIA says these price drops will be live by tomorrow. Enjoy!
- NVIDIA GeForce GTX 780 (Newegg.com)
- NVIDIA GeForce GTX 770 (Newegg.com)
- AMD Radeon R9 290X (Newegg.com)
- AMD Radeon R9 280X (Newegg.com)
ShadowPlay is NVIDIA's latest addition to their GeForce Experience platform. This feature allows their GPUs, starting with Kepler, to record game footage locally or, in a later update, stream it online through Twitch.tv. It requires Kepler GPUs because it is accelerated by that hardware. The goal is to constantly record game footage without any noticeable impact on performance; that way, the player can keep it running forever and have the opportunity to save moments after they happen.
Also, it is free.
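The "keep it running forever and save the moment after it happens" behavior is essentially a rolling capture buffer. Here is a minimal sketch of that idea in Python; the class name, buffer length, and save trigger are illustrative assumptions, not how NVIDIA actually implements it (ShadowPlay does its encoding in hardware on the GPU):

```python
from collections import deque

class RollingCapture:
    """Keep only the last `seconds` of frames in memory; dump them on demand.
    Illustrative sketch of the rolling-buffer idea behind ShadowPlay, not its
    actual implementation."""

    def __init__(self, seconds=300, fps=60):
        # Old frames fall off the front automatically once the window is full.
        self.buffer = deque(maxlen=seconds * fps)

    def on_frame(self, encoded_frame: bytes):
        # Called once per rendered frame; cheap because the frame is already encoded.
        self.buffer.append(encoded_frame)

    def save_last_moments(self, path="highlight.bin"):
        # The user hit the hotkey *after* something cool happened: flush what we have.
        with open(path, "wb") as f:
            for frame in self.buffer:
                f.write(frame)
```

In practice you would feed `on_frame()` from the render loop and call `save_last_moments()` from a hotkey handler; the point is that nothing is written to disk until the player asks for it.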
I know that I have several gaming memories which come unannounced and leave undocumented. A solution like this is very exciting to me. Of course, a feature on paper is not the same as functional software in the real world. Thankfully, at least in my limited usage, ShadowPlay mostly lives up to its claims. I do not feel its impact on gaming performance. I am comfortable leaving it on at all times. There are issues, however, that I will get to soon.
This first impression is based on my main system running the 331.65 (Beta) GeForce drivers recommended for ShadowPlay.
- Intel Core i7-3770, 3.4 GHz
- NVIDIA GeForce GTX 670
- 16 GB DDR3 RAM
- Windows 7 Professional
- 1920 x 1080 @ 120Hz.
- 3 TB USB3.0 HDD (~50MB/s file clone).
The two games tested are Starcraft II: Heart of the Swarm and Battlefield 3.
Subject: Graphics Cards | October 24, 2013 - 02:38 PM | Jeremy Hellstrom
Tagged: radeon, R9 290X, kepler, hawaii, amd
If you didn't stay up to watch our live release coverage of the R9 290X after the podcast last night, you missed a chance to have your questions answered, but you will be able to watch the recording later on. The R9 290X arrived today, bringing 4K and CrossFire reviews as well as single GPU testing on many a site, including PCPer of course. You don't just have to take our word for it; [H]ard|OCP was also putting together a review of AMD's Titan killer. Their benchmarks included some games we haven't adopted yet, such as ARMA III. Check out their results and compare them to ours; AMD really has a winner here.
"AMD is launching the Radeon R9 290X today. The R9 290X represents AMD's fastest single-GPU video card ever produced. It is priced to be less expensive than the GeForce GTX 780, but packs a punch on the level of GTX TITAN. We look at performance, the two BIOS mode options, and even some 4K gaming."
Here are some more Graphics Card articles from around the web:
- AMD's Radeon R9 290X graphics card @ The Tech Report
- AMD Radeon R9 290X 4GB Video Card Review @ Legit Reviews
- AMD Radeon R9 290X @ Hardware.info
- 4K Gaming Showdown - AMD R9 290X & R9 280X Vs Nvidia GTX Titan & GTX 780 @ eTeknix
- AMD Radeon R9 290X 4GB @ eTeknix
- AMD Radeon R9 290X @ Legit Reviews
- AMD Radeon R9 290X 4GB Review @ Hardware Canucks
- AMD Radeon R9 290X CrossFire @ techPowerUp
- AMD Radeon R9 290X @ Techspot
- AMD R9 290X @ Kitguru
- AMD Radeon R9 290X 4 GB @ techPowerUp
A bit of a surprise
Okay, let's cut to the chase here: it's late, we are rushing to get our articles out, and I think you all would rather see our testing results NOW rather than LATER. The first thing you should do is read my review of the AMD Radeon R9 290X 4GB Hawaii graphics card which goes over the new architecture, new feature set, and performance in single card configurations.
Then, you should continue reading below to find out how the new XDMA, bridge-less CrossFire implementation actually works in both single panel and 4K (tiled) configurations.
A New CrossFire For a New Generation
CrossFire has caused a lot of problems for AMD in recent months (and a lot of problems for me as well). But, AMD continues to make strides in correcting the frame pacing issues associated with CrossFire configurations and the new R9 290X moves the bar forward.
Without the CrossFire bridge connector on the 290X, all of the CrossFire communication and data transfer occurs over the PCI Express bus that connects the cards to the rest of the system. AMD claims that this new XDMA interface was designed for Eyefinity and UltraHD resolutions (which were the focus of our most recent article on the subject). By accessing the memory of the GPU through PCIe, AMD claims that it can alleviate the bandwidth and sync issues that were causing problems with Eyefinity and tiled 4K displays.
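To put a rough number on why the PCIe bus is a plausible path for this, consider a back-of-the-envelope estimate of the frame traffic involved. The figures below are standard published resolution and PCIe 3.0 numbers, not anything AMD has disclosed about XDMA specifically:

```python
# Rough bandwidth needed to ship finished frames from one GPU to the other over
# PCI Express, versus what the bus offers. Illustrative estimate only.

def frame_traffic_gbs(width, height, fps, bytes_per_pixel=4):
    """GB/s of raw frame data at the given resolution and frame rate."""
    return width * height * bytes_per_pixel * fps / 1e9

for name, (w, h) in {"2560x1440": (2560, 1440), "3840x2160 (4K)": (3840, 2160)}.items():
    print(f"{name} @ 60 FPS: ~{frame_traffic_gbs(w, h, 60):.1f} GB/s of frame data")

# PCIe 3.0 x16 offers roughly 15.75 GB/s per direction, so even a 4K frame stream
# (~2.0 GB/s) consumes only a small fraction of the link.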
Even better, this updated version of CrossFire is said to be compatible with the frame pacing updates to the Catalyst driver to improve multi-GPU performance experiences for end users.
When an extra R9 290X accidentally fell into my lap, I decided to take it for a spin. And if you have followed my graphics testing methodology in the past year then you'll understand the importance of these tests.
A slightly new architecture
Last month AMD brought media, analysts, and customers out to Hawaii to talk about a new graphics chip coming out this year. As you might have guessed based on the location, the code name for this GPU was, in fact, Hawaii. It was targeted at the high end of the discrete graphics market to take on the likes of the GTX 780 and GTX TITAN from NVIDIA.
Earlier this month we reviewed the AMD Radeon R9 280X, R9 270X, and the R7 260X. None of these were based on that new GPU. Instead, these cards were all rebrands and repositionings of existing hardware in the market (albeit at reduced prices). Those lower prices made the R9 280X one of our favorite GPUs of the moment, as it offers a price-to-performance ratio currently unmatched by NVIDIA.
But today is a little different, today we are talking about a much more expensive product that has to live up to some pretty lofty goals and ambitions set forward by the AMD PR and marketing machine. At $549 MSRP, the new AMD Radeon R9 290X will become the flagship of the Radeon brand. The question is: to where does that ship sail?
The AMD Hawaii Architecture
To be quite upfront about it, the Hawaii design is very similar to that of the Tahiti GPU from the Radeon HD 7970 and R9 280X cards. Based on the same GCN (Graphics Core Next) architecture AMD assured us would be its long term vision, Hawaii ups the ante in a few key areas while maintaining the same core.
Hawaii is built around Shader Engines, of which the R9 290X has four. Each of these includes 11 CUs (compute units), and each CU holds four 16-lane SIMD arrays. Doing the quick math brings us to a total stream processor count of 2,816 on the R9 290X.
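As a quick sanity check, the count falls straight out of that hierarchy (using the publicly documented GCN SIMD width):

```python
# Hawaii / R9 290X shader math under GCN: 16-lane SIMDs, 4 SIMDs per CU.
shader_engines = 4
cus_per_engine = 11
simds_per_cu = 4
lanes_per_simd = 16

stream_processors = shader_engines * cus_per_engine * simds_per_cu * lanes_per_simd
print(stream_processors)  # 2816
```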
Subject: General Tech, Graphics Cards | October 23, 2013 - 07:30 PM | Scott Michaud
Tagged: amd, firepro
Currently AMD holds 18% market share with their FirePro line of professional GPUs. This compares to NVIDIA, which owns 81% with Quadro. I assume the "other" category is the sum of S3 and Matrox who, together, command the remaining 1% of the professional market (just the professional market).
According to Jon Peddie of JPR, as reported by X-Bit Labs, AMD intends to wrestle back the revenue it has left unguarded for NVIDIA. "After years of neglect, AMD’s workstation group, under the tutorage of Matt Skyner, has the backing and commitment of top management and AMD intends to push into the market aggressively." They have already gained share this year.
During AMD's 3rd Quarter (2013) earnings call, CEO Rory Read outlined the importance of the professional graphics market.
We also continue to make steady progress in another of growth businesses in the third quarter as we delivered our fifth consecutive quarter of revenue and share growth in the professional graphics area. We believe that we can continue to gain share in this lucrative part of the GPU market based on our product portfolio, design wins in flight, and enhanced channel programs.
On the same conference call (actually before and after the professional graphics sound bite), Rory noted their renewed push into the server and embedded SoC markets with 64-bit x86 and 64-bit ARM processors. They will be the only company manufacturing both x86 and ARM solutions which should be an interesting proposition for an enterprise in need of both. Why deal with two vendors?
Either way, AMD will probably be refocusing on the professional and enterprise markets for the near future. For the rest of us, this hopefully means that AMD has a stable (and confident) roadmap in the processor and gaming markets. If that is the case, a profitable Q3 is definitely a good start.
Subject: Graphics Cards | October 23, 2013 - 06:20 PM | Jeremy Hellstrom
Tagged: amd, overclocking, asus, ASUS R9 280X DirectCU II TOP, r9 280x
Having already seen what the ASUS R9 280X DirectCU II TOP can do at default speeds, the obvious next step, once they had time to fully explore the options, was for [H]ard|OCP to see just how far this GPU can overclock. To make a long story short, they went from a default clock of 1070MHz up to 1230MHz and pushed the RAM from 6.4GHz to 6.6GHz, though the voltage needed to be bumped from 1.2v to 1.3v. The actual frequencies are nowhere near as important as the effect on gameplay though; to see those results you will have to click through to the full article.
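For context, here is what those figures work out to in relative terms (just arithmetic on the clocks quoted above, nothing more from the article):

```python
# Relative gains from the reported ASUS R9 280X DirectCU II TOP overclock.
core_stock, core_oc = 1070, 1230   # MHz
mem_stock, mem_oc = 6.4, 6.6       # GHz effective
volt_stock, volt_oc = 1.2, 1.3     # volts

print(f"Core:    +{(core_oc / core_stock - 1) * 100:.1f}%")   # ~+15.0%
print(f"Memory:  +{(mem_oc / mem_stock - 1) * 100:.1f}%")     # ~+3.1%
print(f"Voltage: +{(volt_oc / volt_stock - 1) * 100:.1f}%")   # ~+8.3%
```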
"We take the new ASUS R9 280X DirectCU II TOP video card and find out how high it will overclock with GPU Tweak and voltage modification. We will compare performance to an overclocked GeForce GTX 770 and find out which card comes out on top when pushed to its overclocking limits."
Here are some more Graphics Card articles from around the web:
- Gigabyte R9 280X OC 3 GB @ techPowerUp
- HIS R9 280X iPower IceQ X2 Turbo Boost Clock 3GB Video Card Review @ Madshrimps
- AMD Radeon R9 290X Versus NVIDIA GeForce GTX 780 Benchmarks @ Legit Reviews
- XFX R9 280X Black OC Edition @ Kitguru
- AMD Radeon R9 280X Video Card Review w/ ASUS, XFX and MSI @ Legit Reviews
- HIS R9 280X iPower IceQ X² Turbo and R9 270X IceQ X² Turbo @ Legion Hardware
- Sapphire Radeon R9 270X Vapor-X @ Benchmark Reviews
- MSI R9 270X Hawk Review @ OCC
- Asus Matrix R9 280X Platinum @ LanOC Reviews
- ASUS R9 280X DirectCU II TOP 3 GB @ techPowerUp
- HIS Radeon R9 280X IceQ X2 @ Benchmark Reviews
- Sapphire Toxic Edition R9 270X Video Card Review @HiTech Legion
- MSI Radeon R9 270X Gaming Video Card Review @ Ninjalane
- Sapphire Toxic R9 270X @ LanOC Review
- AMD Radeon 7000 and Radeon R200 Series Mixed CrossFire Testing @ Legit Reviews
- AMD Radeon R9 270X Graphics Card Review @ Techgage
- MSI R9 270X HAWK 2 GB @ techPowerUp
- HIS Radeon R9 270X IceQ X2 Turbo Boost @ Benchmark Reviews
- Asus R9 270X DirectCU II Top @ LanOC Reviews
- Diamond Multimedia Radeon 7870 7870PE52GV Review @ HCW
- AMD Radeon R9 270X On Linux @ Phoronix
- ASUS GTX760 DirectCU Mini OC @ Hardware.info
- Gigabyte GeForce GTX 770 OC / GTX 780 OC @ Hardware.info
- MSI N660 Gaming Review: affordable and silent GeForce GTX 660 @ Hardware.info
- Asus GTX 670 Direct CU Mini @ LanOC Reviews
- NVIDIA GeForce GTX 650 On Linux @ Phoronix
Subject: General Tech, Graphics Cards | October 23, 2013 - 12:21 AM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce
Mid-June kicked up a storm of poop across the internet when IGN broke the story of AMD's optimizations for Frostbite 3. It was reported that NVIDIA would not receive sample code for those games until after they launched. The article was later updated with a statement from AMD: "... the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release."
Now, I assume, the confusion was caused by the then-unannounced Mantle.
And, as it turns out, NVIDIA did receive the code for Battlefield 4 prior to launch. Monday, the company launched their 331.58 WHQL-certified drivers, which are optimized for Batman: Arkham Origins and Battlefield 4. According to the release notes, you should even be able to use SLI out of the gate. If, on the other hand, you are a Civilization V player, HBAO+ should enhance your shadowing.
They also added a DX11 SLi profile for Watch Dogs... awkwarrrrrd.
Check out the blog at GeForce.com for a bit more information, read the release notes, or just head over to the drivers page. If you have GeForce Experience installed, it probably already asked you to update.
Subject: Graphics Cards, Displays | October 20, 2013 - 02:50 PM | Ryan Shrout
Tagged: video, tom petersen, nvidia, livestream, live, g-sync
UPDATE: If you missed our live stream today that covered NVIDIA G-Sync technology, you can watch the replay embedded below. NVIDIA's Tom Petersen stops by to talk about G-Sync in both high level and granular detail while showing off some demonstrations of why G-Sync is so important. Enjoy!!
Last week NVIDIA hosted press and developers in Montreal to discuss a couple of new technologies, the most impressive of which was NVIDIA G-Sync, a new monitor solution that looks to solve the eternal debate of smoothness against latency. If you haven't read about G-Sync and how impressive it was when we first tested it on Friday, you should check out my initial write up, NVIDIA G-Sync: Death of the Refresh Rate, which not only covers that but also dives into the reason the technology shift was necessary in the first place.
G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the vertical front and back porch and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor: 60 Hz, 120 Hz, etc. What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired and send it when one of two criteria is met (see the sketch after this list):
- A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time tells the screen to grab data from the card and display it immediately.
- A substantial amount of time has passed and the currently displayed image needs to be refreshed to avoid brightness variation.
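Put another way, the driver and firmware boil down to a small decision loop around those two triggers. Here is a minimal sketch of that logic, with an illustrative timeout value; this is conceptual pseudologic under my own assumptions, not NVIDIA's actual firmware:

```python
import time

MAX_HOLD_SECONDS = 1 / 30  # illustrative: refresh the panel at least this often

def gsync_loop(frame_ready, send_vblank):
    """Send vBlank when a new frame lands in the front buffer, or when the panel
    has gone too long without a refresh. `frame_ready` and `send_vblank` are
    assumed callbacks supplied by the caller."""
    last_refresh = time.monotonic()
    while True:
        if frame_ready():
            send_vblank()          # scan out the new frame immediately
            last_refresh = time.monotonic()
        elif time.monotonic() - last_refresh > MAX_HOLD_SECONDS:
            send_vblank()          # re-scan the current frame to avoid brightness variation
            last_refresh = time.monotonic()
```

The key difference from a fixed refresh rate is that the display is driven by the render loop rather than by a clock, with the timeout acting only as a safety net.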
In current display timing setups, the submission of the vBlank signal has been completely independent from the rendering pipeline. The result was varying frame latency and either horizontal tearing or fixed refresh frame rates. With NVIDIA G-Sync creating an intelligent connection between rendering and frame updating, the display of PC games is fundamentally changed.
Every person that saw the technology, including other media members and even developers like John Carmack, Johan Andersson and Tim Sweeney, came away knowing that this was the future of PC gaming. (If you didn't see the panel that featured those three developers on stage, you are missing out.)
But it is definitely a complicated technology and I have already seen a lot of confusion about it in our comment threads on PC Perspective. To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Monday afternoon where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we discuss G-Sync, how it was developed and the various ramifications the technology will have in PC gaming. You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA G-Sync Live Stream
11am PT / 2pm ET - October 21st
We also want your questions!! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but often there is a lot of noise to deal with.
So be sure to join us on Monday afternoon!
Subject: Graphics Cards | October 18, 2013 - 07:55 PM | Ryan Shrout
Tagged: video, tim sweeney, nvidia, Mantle, john carmack, johan andersson, g-sync, amd
If you weren't on our live stream from the NVIDIA "The Way It's Meant to be Played" tech day this afternoon, you missed a hell of an event. After the announcement of NVIDIA G-Sync variable refresh rate monitor technology, NVIDIA's Tony Tamasi brought one of the most intriguing panels of developers on stage to talk.
John Carmack, Tim Sweeney and Johan Andersson talked for over an hour, taking questions from the audience and even getting into debates among themselves in some instances. Topics included NVIDIA G-Sync of course, AMD's Mantle low-level API, the hurdles facing PC gaming, and the direction each luminary is currently headed for future development.
If you are a PC enthusiast or gamer you are definitely going to want to listen and watch the video below!
Get notified when we go live!