Subject: Graphics Cards | August 13, 2014 - 06:11 PM | Jeremy Hellstrom
Tagged: factory overclocked, sapphire, R9 290X, Vapor-X R9 290X TRI-X OC
As far as factory overclocks go, the 1080MHz core and 5.64GHz memory on the new Sapphire Vapor-X 290X are impressive, taking the prize for the highest factory overclock [H]ard|OCP has seen on this card yet. That didn't stop them from pushing it to 1180MHz and 5.9GHz with a little work, which is even more impressive. At both the factory and manual overclocks the card handily beat the reference model, and the manually overclocked benchmarks could meet or beat those of the overclocked MSI GTX 780 Ti GAMING 3G OC card. Speed is not its only good feature: Intelligent Fan Control keeps two of the three fans from spinning while the GPU is under 60C, which vastly reduces the noise this card produces. It is currently selling for $646, well under the $710 the GeForce currently commands.
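For a sense of scale, here is a quick back-of-envelope comparison (a minimal Python sketch, not from the review) against AMD's reference clocks of up to 1000 MHz core and 5.0 GHz effective memory:

```python
# Back-of-envelope overclock gains versus AMD's reference R9 290X clocks
# (up to 1000 MHz core, 5.0 GHz effective GDDR5).
def pct_gain(new_mhz, reference_mhz):
    return (new_mhz - reference_mhz) / reference_mhz * 100.0

print(f"factory: core +{pct_gain(1080, 1000):.1f}%, memory +{pct_gain(5640, 5000):.1f}%")
print(f"manual:  core +{pct_gain(1180, 1000):.1f}%, memory +{pct_gain(5900, 5000):.1f}%")
# factory: core +8.0%, memory +12.8%
# manual:  core +18.0%, memory +18.0%
```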
"We take a look at the SAPPHIRE Vapor-X R9 290X TRI-X OC video card which has the highest factory overclock we've ever encountered on any AMD R9 290X video card. This video card is feature rich and very fast. We'll overclock it to the highest GPU clocks we've seen yet on R9 290X and compare it to the competition."
Here are some more Graphics Card articles from around the web:
- Sapphire Radeon R7 260X CrossFire Review @HiTech Legion
- ASUS Radeon R9 270X DirectCU II TOP @ [H]ard|OCP
- ASUS R9 270 Direct CU II OC 2 GB Video Card Review @ Madshrimps
- EKWB ASUS GTX 780 Ti DCII OC Full Cover Water Block Review @ Madshrimps
- Zotac GTX 750 Zone Edition @ Hardware Heaven
- Palit GTX 750 Ti KalmX 2 GB @ techPowerUp
- Palit GTX750 Ti KalmX @ Kitguru
- PNY GTX 780 Ti XLR8 OC Single & SLI Review @ Hardware Canucks
- Gigabyte GeForce GTX Titan Black GHz Edition @ X-bit Labs
- ASUS Republic of Gamers Striker Platinum GTX 760 4GB SLI @ eTeknix
- PNY GTX 750 Ti XLR8 OC @ [H]ard|OCP
The Waiting Game
NVIDIA G-Sync was announced at a media event held in Montreal way back in October, and promised to revolutionize the way the display and graphics card worked together to present images on the screen. It was designed to remove hitching, stutter, and tearing -- almost completely. Since that fateful day in October of 2013, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.
In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit allowing users who already owned it to upgrade to G-Sync compatibility. It worked, and I even came away impressed. I noted in my conclusion that, “there isn't a single doubt that I want a G-Sync monitor on my desk” and, “my short time with the NVIDIA G-Sync prototype display has been truly impressive…”. That was nearly seven months ago, and I don't think anyone at the time really believed it would be THIS LONG before real monitors began showing up in the hands of gamers around the world.
Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.
That doesn't change the product we are reviewing today, of course. The ASUS ROG Swift PG278Q, a 27-in WQHD display with a 144 Hz refresh rate, is truly an awesome monitor. What did change is the landscape between NVIDIA's original announcement and today.
Subject: General Tech, Graphics Cards | August 6, 2014 - 01:34 PM | Jeremy Hellstrom
Tagged: radeon, Gallium3D, catalyst 14.6 Beta, linux, ubuntu 14.04
The new Gallium3D code is up against the closed-source Catalyst 14.6 Beta, running under Ubuntu 14.04 on both the 3.14 and 3.16 Linux kernels, giving Phoronix quite a bit of testing to do. They used numerous cards ranging from an HD 6770 to an R9 290, though unfortunately there are no Gallium3D results for the R9 290 as it will not function until the release of the Linux 3.17 kernel. Overall the gap is closing: the Catalyst 14.6 Beta remains the best performer, but the open source alternative is catching up quickly.
"After last week running new Nouveau vs. NVIDIA proprietary Linux graphics benchmarks, here's the results when putting AMD's hardware on the test bench and running both their latest open and closed-source drivers. Up today are the results of using the latest Radeon Gallium3D graphics code and Linux kernel against the latest beta of the binary-only Catalyst driver."
Here is some more Tech News from around the web:
- Phison announces new quad-core SATA 6Gb/s SSD controller chip @ DigiTimes
- Microsoft KILLS Windows 8.1 Update 2 and Patch Tuesday @ The Register
- Yes, we know Active Directory cloud sync is a MESS, says Microsoft @ The Register
- Paypal ignores bug discovery that lets anyone bypass two factor authentication @ The Inquirer
- How To Emulate Rare and Retro Platforms on the Raspberry Pi @ MAKE:Blog
- European Rosetta Space Craft About To Rendezvous With Comet @ Slashdot
- How To Give Adobe Photoshop A Performance Boost With Your GPU @ Tech ARP
- How to Set up Server-to-Server Sharing in ownCloud 7 on Linux @ Linux.com
- It's official: You can now legally carrier-unlock your mobile in the US @ The Register
- Facebook goes down, people dial 911 @ The Register
- NikKTech And XSPC Worldwide Giveaway
Experience with Silent Design
In the stretches between major GPU releases, companies like ASUS have the opportunity to really dig down and engineer truly unique products. With the expanded time between major releases from either NVIDIA or AMD, these products have continued to evolve, offering better features and experiences than any graphics card before them. The ASUS Strix GTX 780 is exactly one of those solutions, taking a GTX 780 GPU originally released in May of last year and reworking it into a new design that offers better cooling, better power delivery, and lower noise levels.
With the Strix GTX 780, ASUS set out to create a card that is perfect for high-end PC gamers without crossing into the realm of bank-breaking prices. They chose NVIDIA's GeForce GTX 780 GPU, a significant price drop from the GTX 780 Ti with only a modest drop in performance. They doubled the reference memory capacity from 3GB to 6GB of GDDR5 to assuage any buyer's worry that 3GB isn't enough for multi-screen Surround or 4K gaming. And they changed the cooling solution to offer near-silent operation when running “low impact” gaming titles.
The ASUS Strix GTX 780 Graphics Card
The ASUS Strix GTX 780 is a pretty large beast, both in physical size and in performance. The cooler is a slightly modified version of the very popular DirectCU II thermal design used on many of ASUS' custom-built graphics cards. It has a heat dissipation area more than twice that of the reference NVIDIA cooler and uses larger fans that can spin slower (and quieter) while still delivering the improved cooling capacity.
Out of the box, the ASUS Strix GTX 780 runs at an 889 MHz base clock and 941 MHz Boost clock, a fairly modest increase over the 863/900 MHz rates of the reference card. Obviously, with much better cooling and a lot of work done on the PCB of this custom design, users will have plenty of headroom to overclock on their own, but I continue to implore companies like ASUS and MSI to up the ante out of the box! One area where ASUS does impress is the memory: the Strix card features a full 6GB of GDDR5 running at 6.0 GHz, twice the capacity of the reference GTX 780 (and even GTX 780 Ti) cards. If you had any concerns about Surround or 4K gaming, know that memory capacity will not be the problem. (Though raw compute power may still be.)
Subject: General Tech, Graphics Cards | August 3, 2014 - 04:59 PM | Scott Michaud
Tagged: nvidia, maxwell, gtx 880
Just recently, we posted a story claiming that NVIDIA was preparing to launch high-end Maxwell in the October/November time frame. Apparently, that was generous. The graphics company is now said to be announcing its GeForce GTX 880 in mid-September, with availability coming later in the month. It is expected to be based on the GM204 chip (which previous rumors claim is 28nm).
It is expected that the GeForce GTX 880 will be available with 4GB of video memory, with an 8GB version possible at some point. As someone who runs multiple (five) monitors, I can tell you that 2GB is not enough for my use case. Windows 7 says the same: it kicks me out of applications to tell me that it does not have enough video memory. That alone would be reason enough for me to get more GPU memory.
We still do not know how many CUDA cores will be present in the GM204 chip, or whether the GeForce GTX 880 will have all of them enabled (though I would be surprised if it didn't). Without any way to derive its theoretical performance, we cannot compare it against the GTX 780 or 780 Ti. It could be significantly faster, it could be marginally faster, or it could be somewhere in between.
But we will probably find out within two months.
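For reference, once the core count does leak, the usual back-of-envelope comparison is simple: peak FP32 throughput is CUDA cores × clock × 2 operations (one fused multiply-add) per clock. A minimal sketch using the known Kepler parts:

```python
# Peak FP32 throughput: CUDA cores x clock x 2 ops (FMA) per clock.
def theoretical_gflops(cuda_cores, clock_mhz):
    return cuda_cores * clock_mhz * 2 / 1000.0

print(f"GTX 780:    {theoretical_gflops(2304, 863):.0f} GFLOPS")  # ~3977 at base clock
print(f"GTX 780 Ti: {theoretical_gflops(2880, 875):.0f} GFLOPS")  # ~5040 at base clock
# Without the GTX 880's core count, its row stays blank.
```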
Subject: General Tech, Graphics Cards, Displays | July 29, 2014 - 09:02 PM | Scott Michaud
Tagged: vesa, nvidia, g-sync, freesync, DisplayPort, amd
Dynamic refresh rates have two main purposes: save power by only forcing the monitor to refresh when a new frame is available, and increase animation smoothness by synchronizing to draw rates (rather than "catching the next bus" at 16.67ms, on the 16.67ms, for 60 Hz monitors). Mobile devices prefer the former, while PC gamers are interested in the latter.
Obviously, the video camera nullifies the effect.
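To make the "bus" metaphor concrete, here is a minimal sketch, with made-up frame times and no claim to model the actual hardware, of how a fixed 60 Hz refresh delays finished frames while a dynamic refresh presents them immediately:

```python
FIXED_INTERVAL_MS = 1000.0 / 60.0  # a 60 Hz panel refreshes every ~16.67 ms

def present_fixed(frame_done_ms):
    # A finished frame waits for the next refresh "bus" to come along.
    return [(int(t // FIXED_INTERVAL_MS) + 1) * FIXED_INTERVAL_MS for t in frame_done_ms]

frames = [14.0, 29.5, 42.0, 60.0, 71.5]  # hypothetical completion times (ms)
for done, fixed in zip(frames, present_fixed(frames)):
    # With a dynamic refresh, the panel updates the moment the frame is ready.
    print(f"ready at {done:5.1f} ms -> fixed shows it at {fixed:5.2f} ms, dynamic at {done:5.1f} ms")
```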
NVIDIA was first to make this public with G-Sync. AMD responded with FreeSync, starting with a proposal that was later ratified by VESA as DisplayPort Adaptive-Sync. AMD, then, took up "Project FreeSync" as an AMD "hardware/software solution" to make use of DisplayPort Adaptive-Sync in a way that benefits PC gamers.
Today's news is that AMD has just released an FAQ which explains the standard much more thoroughly than they have in the past. For instance, it clarifies the distinction between DisplayPort Adaptive-Sync and Project FreeSync. Prior to the FAQ, I thought that FreeSync became DisplayPort Adaptive-Sync, and that was that. Now, it is sounding a bit more proprietary, just built upon an open, VESA standard.
If interested, check out the FAQ at AMD's website.
Subject: General Tech, Graphics Cards | July 29, 2014 - 08:27 PM | Scott Michaud
Tagged: nvidia, geforce, graphics drivers, shield tablet, shield
Alongside the NVIDIA SHIELD Tablet launch, the company has released its GeForce 340.52 drivers. This version allows compatible devices to use GameStream and is also optimized for Metro: Redux and Final Fantasy XIV (China).
The driver supports GeForce 8-series graphics cards and later. As a reminder, for GPUs that are not based on the Fermi architecture (or later), 340.xx will be your last driver version. NVIDIA does intend to provide extended support for 340.xx (and earlier) drivers until April 1st, 2016. But, when Fermi, Kepler, and Maxwell move on to 343.xx, Tesla and earlier will not. That said, most of the content of this driver is aimed at Kepler and later. Either way, the driver itself is available for those pre-Fermi cards.
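Condensed into a lookup, my reading of that support matrix looks like this (an illustrative summary, not an NVIDIA-published table):

```python
# My condensed reading of the driver support policy described above;
# the keys and strings are illustrative, not an official NVIDIA table.
LAST_DRIVER_BRANCH = {
    "tesla_and_earlier": "340.xx (extended support until April 1st, 2016)",
    "fermi":   "343.xx and onward",
    "kepler":  "343.xx and onward",
    "maxwell": "343.xx and onward",
}

print(LAST_DRIVER_BRANCH["tesla_and_earlier"])
```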
I should also mention that a user on Anandtech's forums noted the removal of Miracast from NVIDIA's documentation. NVIDIA has yet to comment, though admittedly it has only been a short while at this point.
Subject: Graphics Cards | July 29, 2014 - 02:27 PM | Jeremy Hellstrom
Tagged: asus, gtx 780, R9 290X DC2 OC, sli, crossfire, STRIX GTX 780 OC 6GB, R9 290X
We have seen [H]ard|OCP test ASUS' STRIX GTX 780 OC 6GB and R9 290X DirectCU II before, but this time they have been overclocked and paired up for a 4K showdown. For a change, NewEgg gives the price advantage to AMD: $589 versus $599 at the time of writing (with odd blips in prices on Amazon). The GTX 780s were set to 1.2GHz core and 6.6GHz memory while the 290Xs ran at 1.1GHz and 5.6GHz; keep in mind dual-GPU setups may not reach the same frequencies as single cards. Read on for their conclusions and decide whether you would rather brag about a higher overclock or have better overall performance.
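That overclock-versus-performance question is easiest to see in the memory numbers: the 290X runs a slower memory clock on a wider bus. A little arithmetic, using the cards' stock bus widths, shows who actually moves more data:

```python
# Effective memory bandwidth at the overclocks quoted above, using the
# cards' stock bus widths (384-bit GTX 780, 512-bit R9 290X).
def bandwidth_gb_s(effective_mhz, bus_width_bits):
    return effective_mhz * bus_width_bits / 8 / 1000.0  # GB/s

print(f"GTX 780 @ 6.6 GHz: {bandwidth_gb_s(6600, 384):.1f} GB/s")  # 316.8
print(f"R9 290X @ 5.6 GHz: {bandwidth_gb_s(5600, 512):.1f} GB/s")  # 358.4
```

So the lower-clocked AMD cards still come out ahead on raw memory bandwidth.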
"We take the ASUS STRIX GTX 780 OC 6GB video card and run two in SLI and overclock both of these at 4K resolutions to find the ultimate gameplay performance with 6GB of VRAM. We will also compare these to two overclocked ASUS Radeon R9 290X DirectCU II CrossFire video cards for the ultimate VRAM performance showdown."
Here are some more Graphics Card articles from around the web:
- ASUS GTX 780 STRIX OC 6GB Review @ Hardware Canucks
- Gigabyte GeForce GTX 770 (GV-N770OC-2GD) vs. ASUS Matrix Platinum (R9280X-P-3GD5) Video Card Review @ Hardware Secrets
- Ice-cold Temperature Killa: Arctic Accelero Hybrid II-120 GPU Cooler Review @ Techgage
- Examining AMD’s Driver Progress Since Launch Drivers: R9 290X & HD 7970 @ eTeknix
- HIS Radeon R7 250X and 260X iCooler @ Funky Kit
- Sapphire Dual-X R9 280 3GB OC Video Card Review @ Legit Reviews
- AMD Radeon R9 280X Round-up @ Legion Hardware
- HIS R9 280 IceQ X2 OC 3GB GDDR5 Video Card Review @ Madshrimps
- Sapphire Radeon R9 290 Vapor-X 4 GB @ techPowerUp
Subject: General Tech, Graphics Cards | July 28, 2014 - 09:00 AM | Scott Michaud
Tagged: raptr, pc game streaming
Raptr seems to be gaining in popularity. Total playtime recorded by the online service was up 15% month-over-month, from May to June. The software is made up of a few features that are designed to make the lives of PC gamers easier and better, ranging from optimizing game settings to recording gameplay. If you have used a recent version of GeForce Experience, then you probably have a good idea of what Raptr does.
Today, Raptr has announced a new, major update. The version's headlining feature is hardware-accelerated video recording and streaming for both AMD and NVIDIA GPUs. Raptr claims that their method causes essentially no performance loss, regardless of which GPU vendor is used. Up to 20 minutes of previous gameplay can be captured retroactively, and video of unlimited length can be streamed on demand.
Notice the recording overlay in the top left.
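The "record the last 20 minutes" trick is presumably a rolling buffer of encoded video held in memory until you ask for it. Here is a minimal sketch of the idea; the class and method names are my own invention, not Raptr's actual API:

```python
# A rolling window of encoded chunks: nothing touches disk until the user
# asks, at which point whatever is still in the window becomes the clip.
from collections import deque

class RetroactiveRecorder:
    def __init__(self, window_seconds, chunk_seconds=1.0):
        # Only the most recent window_seconds of chunks are kept.
        self.chunks = deque(maxlen=int(window_seconds / chunk_seconds))

    def on_chunk_encoded(self, chunk):
        # Once the window is full, the oldest chunk falls off automatically.
        self.chunks.append(chunk)

    def save_clip(self):
        # Everything still in the window becomes the saved clip.
        return b"".join(self.chunks)

rec = RetroactiveRecorder(window_seconds=20 * 60)  # a 20-minute window
for i in range(5):
    rec.on_chunk_encoded(f"chunk{i};".encode())
print(rec.save_clip())  # b'chunk0;chunk1;chunk2;chunk3;chunk4;'
```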
The other major feature of this version is enhanced sharing of said videos. They can be uploaded to Raptr.com and shared to Facebook and Twitter, complete with hashtags (#BecauseYolo?).
If interested, check out Raptr at their website.
Subject: General Tech, Graphics Cards | July 24, 2014 - 07:32 PM | Scott Michaud
Tagged: nvidia, gtx 880
Many of our readers were hoping to drop one (or more) Maxwell-based GPUs into their systems for use with their 4K monitors, 3D, or whatever else they need the performance for. That has not happened, nor do we even know, for sure, when it will. The latest rumors claim that the NVIDIA GeForce GTX 870 and 880 desktop GPUs will arrive in October or November. More interestingly, they are expected to be based on GM204, built on the current 28nm process.
The recent GPU roadmap, as of GTC 2014
NVIDIA has not commented on the delay, at least not that I know of, but we can tell something is up from their significantly revised roadmap. We can also make a fairly confident guess by paying attention to the industry as a whole. TSMC has been struggling to keep up with 28nm production, having increased wait times by six extra weeks in May according to DigiTimes, and whatever 20nm capacity they had was reportedly gobbled up by Apple until just recently. At around the same time, NVIDIA inserted Pascal between Maxwell and Volta, bringing 3D memory, NVLink, and some unified memory architecture (which I don't believe they have elaborated on yet).
The previous roadmap. (Source: Anandtech)
And, if this rumor is true, Maxwell was pushed from 20nm to a wholly 28nm architecture. Maxwell, not Pascal, was originally supposed to be the host of unified virtual memory. If I had to make a safe guess, I would assume that NVIDIA needed to redesign their chip for 28nm and, especially with the extra delays at TSMC, cannot get the volume they need until autumn.
Lastly, going by the launch of the GTX 750 Ti, Maxwell will basically be a cleaned-up Kepler architecture. Its compute units were reorganized into power-of-two partitions, reducing the die area spent on scheduling logic (and so forth). That said, NVIDIA has been known to stash a few features into each generation, sometimes revealing them well after retail availability, so Maxwell may yet be more than just "a more efficient Kepler".
I expect its fundamental architecture should be pretty close, though.
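For a concrete sense of what those power-of-two partitions look like, compare the published Kepler SMX layout with the GM107 SMM that shipped in the GTX 750 Ti; a quick back-of-envelope look:

```python
# Published shader layouts: a Kepler SMX shares 192 cores among 4 warp
# schedulers, while a first-gen Maxwell SMM (GM107) gives each of its 4
# schedulers a dedicated, power-of-two block of 32 cores.
smx_cores, smx_schedulers = 192, 4
smm_cores, smm_schedulers = 128, 4

print(smx_cores / smx_schedulers)  # 48.0 cores per scheduler (shared, not 2^n)
print(smm_cores / smm_schedulers)  # 32.0 cores per scheduler (dedicated, 2^5)
```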