Subject: Graphics Cards | December 12, 2013 - 05:20 PM | Ryan Shrout
Tagged: video, amd, radeon, hawaii, r9 290, R9 290X, bitcoin, litecoin, mining
If you already listened to this week's PC Perspective Podcast, then feel free to disregard this post. For the rest of you - subscribe to our damned weekly podcast already, would you?!?
In any event, I thought it might be interesting to extract the six-minute discussion we had during last night's live-streamed podcast about how the emergence of Litecoin mining operations is driving up prices of GPUs, particularly the compute-capable R9 290 and R9 290X Hawaii-based cards from AMD.
Check out these prices currently on Amazon!
- Radeon R9 290X - $725+
- Radeon R9 290 - $499+
- Radeon R9 280X - $429+
- GeForce GTX 770 - $409+
- GeForce GTX 780 - $509+
- GeForce GTX 780 Ti - $699+
The price of the GTX 770 is a bit higher than it should be, while the GTX 780 and GTX 780 Ti are priced in the same range they have been for the last month or so. The same cannot be said for the AMD cards listed here: the R9 280X is selling for at least $130 more than its expected MSRP, and you'll see quite a few going for much higher on Amazon, eBay (thanks TR) and elsewhere. The Radeon R9 290 has an MSRP of $399 from AMD, but the lowest price we found on Amazon was $499, and Newegg.com listings show the same price but are sold out. The R9 290X is even more obnoxiously priced, when you can find one at all.
Do you have any thoughts on this? Do you think Litecoin mining is really causing these price inflations and what does that mean for AMD, NVIDIA and the gamer?
Quality time with G-Sync
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of running with V-Sync off along with the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed. This led up to an explanation of how G-Sync works, including how it extends the VBLANK interval, and detailed how NVIDIA is enabling the graphics card to retake control over the entire display pipeline.
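If the VBLANK trick is hard to picture, here is a toy timing sketch in C. It is very much not NVIDIA's implementation, and it ignores the panel's minimum/maximum refresh limits and render back-pressure; it only shows why quantizing to a fixed 60 Hz scan produces stutter that a variable refresh avoids: a 20 ms frame has to wait for the next scan under V-Sync, but is scanned out the moment it is ready under G-Sync.

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    const double scan_ms = 1000.0 / 60.0;             // fixed 60 Hz scan interval
    double render_ms[] = { 14.0, 20.0, 24.0, 26.0 };  // invented per-frame GPU times
    double done = 0.0, last_scan = 0.0;
    for (int i = 0; i < 4; i++) {
        done += render_ms[i];  // frame finishes rendering here
        // V-Sync: wait for the next fixed scan (and never reuse a scan).
        double scan = ceil(done / scan_ms) * scan_ms;
        if (scan <= last_scan) scan = last_scan + scan_ms;
        last_scan = scan;
        // G-Sync: the display holds VBLANK until the frame is ready.
        printf("frame %d ready at %5.1f ms | v-sync shows %5.1f | "
               "g-sync shows %5.1f\n", i, done, scan, done);
    }
    return 0;
}
```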
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today and how the G-Sync technology changes gaming for the better. It is over an hour long, but I selfishly believe it is the most complete and well-organized collection of information about G-Sync available for our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays we received this week were modified versions of the 144Hz ASUS VG248QE gaming panel, the same model that end users should, in theory, be able to upgrade themselves sometime in the future. These are 1920x1080 TN panels and, despite their incredibly high refresh rates, are not usually regarded as the best displays on the market in terms of image quality. However, the story of what you get with G-Sync is really about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.
Subject: General Tech, Graphics Cards | December 11, 2013 - 05:58 PM | Scott Michaud
Tagged: frame pacing, frame rating, amd, southern islands, 4k, eyefinity, crossfire, microstutter
The frame pacing issue has been covered on our website for almost a year now. It stems from the original "microstutter" problem, which dates back more than a year before we could quantify it. We use the term "Frame Rating" to denote the testing methodology we now use for our GPU tests.
AMD fared worse in these tests than NVIDIA (although even NVIDIA had problems in certain configurations). AMD has since dedicated a lot of man-hours to the problem, resulting in driver updates for certain scenarios, but one area those fixes did not cover was CrossFire combined with Eyefinity or 4K MST. The issue has been addressed in hardware on Hawaii, and AMD asserted that previous cards would get a software fix soon.
The good news is that we have just received word from AMD that they plan on releasing a beta driver for Southern Islands and earlier GPUs (AMD believes it should work for anything that's not "legacy"). As usual, until it ships, anything could change, but it looks good for now.
The beta "frame pacing" driver addressing Crossfire with 4K and Eyefinity, for supported HD-series and Southern Islands-based Rx cards, is expected to be public sometime in January.
Subject: General Tech, Graphics Cards | December 9, 2013 - 03:03 AM | Scott Michaud
Tagged: R9 290X, DirectCU II, asus
The AMD Radeon R9 290X is a very good graphics processor whose reference design is marred by a few famous design choices. AMD specs the GPU to run at a maximum of 95C, perpetually, and it will push its frequency up to 1 GHz if it can stay at or under that temperature. In its typical "Quiet" default setting, the cooler is generally unable to hold that frequency for more than a handful of minutes. This led to countless discussions about what it means to be a default and what the component's actual specifications are.
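To make that behavior concrete, here is a loose illustration in C of that kind of feedback loop. This is not AMD's actual PowerTune algorithm, and the thermal model is invented; it only shows how a cooler that removes slightly less heat than a 1 GHz clock produces will force the clock to sag over time.

```c
// Invented thermal model: heat input scales with clock; the cooler
// removes a fixed amount per tick (what a "Quiet" fan profile manages).
// The governor holds 1000 MHz until the 95 C cap is hit, then steps down.
#include <stdio.h>

int main(void) {
    double temp_c = 92.0, clock_mhz = 1000.0;
    for (int tick = 0; tick < 12; tick++) {
        temp_c += clock_mhz / 250.0 - 3.2;  // heat in minus heat removed
        if (temp_c > 95.0)
            clock_mhz -= 25.0;              // throttle to stay under the cap
        else if (clock_mhz < 1000.0)
            clock_mhz += 25.0;              // climb back when there is room
        printf("t=%2d  %5.1f C  %4.0f MHz\n", tick, temp_c, clock_mhz);
    }
    return 0;
}
```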
All along we have noted that custom designs from add-in board (AIB) partners could change everything.
ASUS seems to be the first to tease its custom solution. This card, based on their DirectCU II design, uses two fans and multiple 10mm nickel-plated heat pipes sitting directly atop the processor. The two fans should be able to move more air at a slower rate of rotation and thus be more efficient per decibel. The heatsink itself should also pull heat away from the GPU more quickly. I am hoping ASUS provisioned the part to remain at a stable 1 GHz under default settings, or perhaps even higher!
The real test for Hawaii will be when the wave of custom editions washes ashore. We know the processor is capable of some pretty amazing performance figures when it can really open up. This, along with other partner boards, could make for possibly the most interesting AIB round-up we have ever had.
No word, yet, on pricing or availability.
Subject: General Tech, Graphics Cards | December 5, 2013 - 03:17 AM | Scott Michaud
Tagged: linux, nvidia surround, eyefinity
Could four 1080p monitors be 4K on the cheap? Probably not... but keep reading.
Image Credit: Phoronix
Phoronix published an article for users interested in quad-monitor gaming on Linux. Sure, you might think this is a bit excessive, especially considering the bezels at the center of your view. On the other hand, imagine you are playing a four-player split-screen game. That would definitely get some attention. Each player would be able to tilt their screen out of the view of their opponents while only using a single computer.
In his 8-page editorial, Michael Larabel tests the official and popular open source drivers for both AMD and NVIDIA. The winner was NVIDIA's proprietary driver, although the open source solution, Nouveau, fared the worst of the batch. This is the typical trade-off with NVIDIA. It was only recently that The Green Giant opened up documentation for the other chefs in the kitchen... so these results may change soon.
If you are interested in gaming on Linux, give the article a read.
Subject: General Tech, Graphics Cards, Motherboards | December 4, 2013 - 12:02 AM | Scott Michaud
Tagged: uppercase, msi, mini-itx
MSI is calling these products "Mini, but Mighty". These components are designed for the mini-ITX form factor, which measures less than 7 inches in length and width. That size makes them very useful for home theater PCs (HTPCs) and other builds where discretion is valuable. You also want these machines to be quiet, which MSI claims this product series is.
The name is also written in full uppercase so you imagine yourself yelling every time you read it.
The MSI Z87I GAMING AC motherboard comes with an Intel 802.11ac (hence "GAMING AC", I assume) wireless adapter. For wired connections, it includes a Killer E2205 Ethernet adapter from Qualcomm's BigFoot Networks (even small PCs can be BigFoot). Also included is an HDMI 1.4 output capable of 4K video (HDMI 1.4 is limited to 30Hz output at 2160p).
Good features to have, especially for an HTPC build.
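A quick sanity check on that HDMI 1.4 caveat: HDMI 1.4 tops out at a 340 MHz TMDS clock, while the standard 2160p timings need a 297 MHz pixel clock at 30 Hz and 594 MHz at 60 Hz. Plugging in the numbers makes the 30Hz ceiling obvious:

```c
// Why HDMI 1.4 stops at 4K30: compare the pixel clocks the standard
// 2160p timings require against the spec's 340 MHz TMDS ceiling.
#include <stdio.h>

int main(void) {
    const double hdmi14_max = 340.0;  // HDMI 1.4 max TMDS clock, MHz
    const double p30 = 297.0;         // 3840x2160 @ 30 Hz pixel clock, MHz
    const double p60 = 594.0;         // 3840x2160 @ 60 Hz pixel clock, MHz
    printf("2160p30 needs %.0f MHz: %s\n", p30,
           p30 <= hdmi14_max ? "fits" : "too fast for HDMI 1.4");
    printf("2160p60 needs %.0f MHz: %s\n", p60,
           p60 <= hdmi14_max ? "fits" : "too fast for HDMI 1.4");
    return 0;
}
```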
The other launch is the GTX 760 GAMING ITX video card. This card is a miniature GeForce GTX 760 designed to fit in mini-ITX cases. If your box is a home theater PC, expect it to run just about any game at 1080p.
No information on pricing and availability yet. Check out the press release after the break.
Subject: General Tech, Graphics Cards, Processors | December 3, 2013 - 04:12 AM | Scott Michaud
Tagged: Kaveri, APU, amd
The launch and subsequent availability of Kaveri is scheduled for the CES time frame. The APU unites Steamroller x86 cores with several Graphics Core Next (GCN) compute units. The high-end offering, the A10-7850K, is capable of 856 GFLOPS of compute power (most of it, of course, from the GPU).
Image/Leak Credit: Prohardver.hu
We now know about two SKUs: the A10-7850K and the A10-7700K. Both parts are quite similar, except that the higher model gets a 200 MHz CPU bump (3.8 GHz to 4.0 GHz) and 33% more GPU compute units (6 to 8).
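As a back-of-the-envelope check on that 856 GFLOPS figure, the math works out if you assume the rumored configuration: 8 GCN compute units (512 shaders) at 720 MHz, plus four Steamroller cores at a 3.7 GHz base clock, each retiring 8 single-precision FLOPs per cycle via a 128-bit FMA. Treat the clocks and shader counts here as leak-level numbers, not confirmed specifications.

```c
// Peak single-precision GFLOPS estimate for the rumored A10-7850K.
// All clocks and unit counts are assumptions from the leak coverage.
#include <stdio.h>

int main(void) {
    double gpu = 512 * 2 * 0.720;  // shaders x (mul+add per clock) x GHz
    double cpu = 4 * 8 * 3.7;      // cores x SP FLOPs-per-clock x GHz
    printf("GPU:   %6.1f GFLOPS\n", gpu);        // ~737
    printf("CPU:   %6.1f GFLOPS\n", cpu);        // ~118
    printf("Total: %4.0f GFLOPS\n", gpu + cpu);  // ~856
    return 0;
}
```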
But how does this compare? The original source (prohardver.hu) claims that Kaveri will achieve an average of 28 FPS in Crysis 3 on Low at 1680x1050, a 12% increase over Richland. It also achieved an average of 53 FPS in Sleeping Dogs on Medium, which is 26% more than Richland.
These are healthy increases over the previous generation, and they do not even account for HSA advantages. I am really curious what will happen if integrated graphics becomes accessible enough that game developers decide to target it for general compute applications. The reduction in latency (time currently semi-wasted bouncing memory between compute devices) might open this architecture up to workloads where it can really shine.
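To see the copy that HSA wants to eliminate, here is a compressed OpenCL 1.2 sketch in C. CL_MEM_USE_HOST_PTR is the closest pre-HSA analogue to truly shared memory; on an APU the runtime can service such a buffer without a physical copy, though whether a given driver actually avoids it is implementation-dependent.

```c
// Minimal OpenCL sketch (assumes an OpenCL 1.2 runtime; link with
// -lOpenCL). Contrasts an explicit host-to-device copy with a buffer
// that aliases host memory -- the transfer HSA aims to make unnecessary.
#include <CL/cl.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    cl_platform_id plat; cl_device_id dev; cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    size_t size = 64 * 1024 * 1024;
    float *host_data = malloc(size);

    // Discrete-GPU style: allocate device memory, then copy over the bus.
    cl_mem dev_buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, size, NULL, &err);
    clEnqueueWriteBuffer(q, dev_buf, CL_TRUE, 0, size, host_data, 0, NULL, NULL);

    // APU style: let the buffer alias host RAM; on shared-memory parts
    // the runtime can skip the physical copy entirely.
    cl_mem shared = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_USE_HOST_PTR,
                                   size, host_data, &err);

    printf("buffers created: %p %p\n", (void *)dev_buf, (void *)shared);
    clReleaseMemObject(dev_buf); clReleaseMemObject(shared);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    free(host_data);
    return 0;
}
```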
We will do our best to keep you up to date on this part especially when it launches at CES.
Subject: General Tech, Graphics Cards | December 2, 2013 - 03:16 PM | Scott Michaud
Tagged: nvidia, ShadowPlay
They grow up so fast these days...
GeForce Experience is NVIDIA's software package, often bundled with their driver updates, to optimize the experience of their customers. This could be adding interesting features, such as GPU-accelerated game video capture, or just recommending graphics settings for popular games.
Version 1.8 adds many desired features lacking from the previous version. I always found it weird that GeForce Experience would recommend one good baseline of settings for a game, and set it for you, but force you to then go into the game and tweak from there. It would be nice to see multiple presets, but that is not what we get; instead, we are now able to tweak the settings from within GeForce Experience. The baseline tries to provide a solid 40 FPS at the computationally most difficult moments, and you can tune the familiar performance-versus-quality slider from there.
You are also able to set resolutions up to 3840x2160 and select whether you would like to play in windowed (including "borderless") mode.
Also, with ShadowPlay, Windows 7 users will now be able to "shadow" the last 20 minutes of gameplay like their Windows 8 neighbors. You will also be able to combine your microphone audio with the in-game audio, should you select it. I can see that feature being very useful for shoutcasters; apparently it allows capturing VoIP communication, not just your microphone itself.
Still no streaming to Twitch.tv yet, but it is coming.
For now, you can download GeForce Experience from NVIDIA's GeForce website. If you want a little more detail first, you can check out their (much longer) blog post.
Subject: General Tech, Graphics Cards, Processors | November 28, 2013 - 03:30 AM | Scott Michaud
Tagged: Intel, Xeon Phi, gpgpu
Intel was testing the waters with their Xeon Phi co-processor. Based on the architecture designed for the original Pentium processors, it was released in six products ranging from 57 to 61 cores and 6 to 16GB of RAM. This led to double-precision performance of between 1 and 1.2 TFLOPS. It was fabricated using their 22nm tri-gate technology. All of this was under the Knights Corner initiative.
In 2015, Intel plans to have Knights Landing ready for consumption. A modified Silvermont architecture will replace the many simple (basically 15-year-old) cores of the previous generation; up to 72 Silvermont-based cores (each with 4 threads), in fact. It will introduce the AVX-512 instruction set, which allows applications to vectorize operations across 8 64-bit (double-precision float or long integer) or 16 32-bit (single-precision float or standard integer) values.
In other words, packing a bunch of related problems into a single instruction.
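For the curious, this is roughly what that packing looks like at the source level. The snippet below is a sketch using Intel's published AVX-512F intrinsic names; it needs a compiler and CPU with AVX-512 support (e.g. building with -mavx512f), which for now means waiting on Knights Landing-era hardware.

```c
// One AVX-512 instruction operating on eight doubles at once.
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    double a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    double b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    double c[8];

    __m512d va = _mm512_loadu_pd(a);    // load 8 doubles into one register
    __m512d vb = _mm512_loadu_pd(b);
    __m512d vc = _mm512_add_pd(va, vb); // 8 additions in a single instruction
    _mm512_storeu_pd(c, vc);

    for (int i = 0; i < 8; i++) printf("%.0f ", c[i]); // prints all 9s
    printf("\n");
    return 0;
}
```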
The most interesting part? Two versions will be offered: add-in boards (AIBs) and a standalone CPU. Thanks to its x86 heritage, it will not require a host CPU if your application is entirely suited to the MIC architecture; unlike a Tesla, it is bootable with existing and common OSes. It can also be paired with standard Xeon processors if you would like a few strong threads alongside the 288 (72 x 4) that the Xeon Phi provides.
And, while I doubt Intel would want to cut anyone else in, VR-Zone notes that this opens the door for AIB partners to make non-reference cards and manage some level of customer support. I'll believe a non-Intel branded AIB only when I see it.
Subject: Graphics Cards | November 27, 2013 - 04:44 PM | Jeremy Hellstrom
Tagged: sapphire, radeon, R9 290X, hawaii, amd, 290x
Ryan is not the only one who felt it necessary to investigate the reports of differing performance between retail R9 290X cards and the ones sent out for review. Legit Reviews also ordered a retail card made by Sapphire and tested it against the card sent to them by AMD. As with our results, ambient temperature had more effect on the frequency of the retail card than it did on the press sample, with a 14% difference being common. Legit had another idea after noticing that, while the BIOS version was the same on both cards, the part numbers differed. Find out what happened when they flashed the retail card to exactly match the press sample.
"The AMD Radeon R9 290X and R9 290 have been getting a ton of attention lately due to a number of reports that the retail cards are performing differently than the press cards that the media sites received. We have been following these stories for the past few weeks and finally decided to look into the situation ourselves."
Here are some more Graphics Card articles from around the web:
- HIS R9 270X IceQ X² Turbo Boost 2 GB @ techPowerUp
- Sapphire Toxic Edition R9 280X Video Card Review @ HiTech Legion
- ASUS R9 270 Direct CU II OC 2 GB @ techPowerUp
- Powercolor Radeon R9-270X Devil @ Bjorn3D
- AMD Radeon R9 290 Review On Linux @ Phoronix
- PowerColor Devil R9 270X 2GB @ Custom PC Review
- 2560×1600: GeForce GTX 780 Ti vs Radeon R9 290X @ Benchmark Reviews
- ASUS GTX 760 MARS @ Kitguru
- Gigabyte GeForce GTX 760 4GB Video Card Review – 2GB or 4GB of VRAM @ Legit Reviews
- NVIDIA GeForce GTX 780 Ti Steams Ahead On Linux @ Phoronix
- Palit GTX 780 Ti JetStream OC @ Kitguru
- EVGA GTX 780 Ti SC ACX Review @ Hardware Canucks
- NVIDIA GeForce GTX TITAN: Windows 8.1 vs. Ubuntu 13.10 @ Phoronix