AMD Expects Beta Eyefinity Fix in January

Subject: General Tech, Graphics Cards | December 11, 2013 - 05:58 PM |
Tagged: frame pacing, frame rating, amd, southern islands, 4k, eyefinity, crossfire, microstutter

The frame pacing issue has been covered at our website for almost a year now. It stems from the original "microstutter" problem, which dates back more than a year before we could quantify it. We use the term "Frame Rating" to denote the testing methodology we now apply to our GPU tests.
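To give a sense of what "quantifying" stutter means, the sketch below is a minimal illustration, not our actual Frame Rating capture pipeline: it assumes a hypothetical plain-text log with one frame time per line (in milliseconds) and reports average FPS alongside the average frame-to-frame variation, which is where microstutter shows up.

```c
/* Minimal sketch: estimate "microstutter" from a log of per-frame render
 * times (one value per line, in milliseconds). This is NOT the Frame Rating
 * capture pipeline; it only illustrates that smoothness is about
 * frame-to-frame consistency, not average frame rate alone. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main(int argc, char **argv)
{
    FILE *f = (argc > 1) ? fopen(argv[1], "r") : stdin;
    if (!f) { perror("fopen"); return EXIT_FAILURE; }

    double ms, prev = -1.0, total_ms = 0.0, jitter_ms = 0.0;
    long frames = 0;

    while (fscanf(f, "%lf", &ms) == 1) {
        total_ms += ms;
        if (prev >= 0.0)
            jitter_ms += fabs(ms - prev);   /* delta between consecutive frames */
        prev = ms;
        frames++;
    }

    if (frames > 1) {
        printf("Average FPS:              %.1f\n", 1000.0 * frames / total_ms);
        printf("Avg frame-to-frame delta: %.2f ms\n", jitter_ms / (frames - 1));
    }
    return EXIT_SUCCESS;
}
```

Two runs with identical average FPS can report very different frame-to-frame deltas, and the higher one is the run that feels like microstutter.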

amd-crossfire.JPG

AMD fared worse in these tests than NVIDIA (although even NVIDIA had problems in certain configurations). AMD has dedicated a lot of man-hours to the problem, resulting in driver updates for certain scenarios. Crossfire combined with Eyefinity or 4K MST was one area those updates did not cover. That issue has been addressed in hardware with Hawaii, and AMD asserted that previous cards would get a software fix soon.

The good news is that we have just received word from AMD that they plan on releasing a beta driver for Southern Islands and earlier GPUs (AMD believes it should work for anything that is not "legacy"). As usual, until it ships anything could change, but it looks good for now.

The beta "frame pacing" driver addressing Crossfire with 4K and Eyefinity, for supported HD-series and Southern Islands-based Rx cards, is expected to be public sometime in January.

Source: AMD

ASUS Teases R9 290X DirectCU II

Subject: General Tech, Graphics Cards | December 9, 2013 - 03:03 AM |
Tagged: R9 290X, DirectCU II, asus

The AMD Radeon R9 290X is a very good graphics processor whose reference design is marred by a few now-famous design choices. AMD specs the GPU to run perpetually at a maximum of 95C, and the card will push its frequency up to 1 GHz as long as it can stay at or under that temperature. In the typical "Quiet" default setting, the reference cooler is generally unable to hold that frequency for more than a handful of minutes. This led to countless discussions about what it means to be a default and what the component's actual specifications are.

ASUS-R9290X-DirectCU2.jpg

All along we have noted that custom designs from add-in board (AIB) partners could change everything.

ASUS seems to be the first to tease its custom solution. This card, based on the company's DirectCU II design, uses two fans and multiple 10mm nickel-plated heatpipes sitting directly atop the processor. The two fans should be able to move more air at a slower rate of rotation and thus be more efficient per decibel. The heatsink itself should also be able to pull heat away from the GPU more quickly. I am hoping that ASUS provisioned the part to remain at a stable 1 GHz under default settings, or perhaps even higher!

The real test for Hawaii will come when the wave of custom editions washes ashore. We know the processor is capable of some pretty amazing performance figures when it can really open up. This card, along with the other partner boards, could make for possibly the most interesting AIB round-up we have ever had.

No word yet on pricing or availability.

(Phoronix) Four Monitors on Linux: AMD and NVIDIA

Subject: General Tech, Graphics Cards | December 5, 2013 - 03:17 AM |
Tagged: linux, nvidia surround, eyefinity

Could four 1080p monitors be 4K on the cheap? Probably not... but keep reading.

phoronix-linux-4monitor.jpg

Image Credit: Phoronix

Phoronix published an article for users interested in quad-monitor gaming on Linux. Sure, you might think this is a bit excessive, especially considering the bezels that end up at the center of your view. On the other hand, imagine you are playing a four-player split-screen game. That would definitely get some attention. Each player would be able to tilt their screen out of the view of their opponents while using only a single computer.

In his 8-page editorial, Michael Larabel tests the official and popular open-source drivers for both AMD and NVIDIA. The winner was NVIDIA's proprietary driver, while the open-source solution, Nouveau, seemed to fare the worst of the batch. This is the typical trade-off with NVIDIA. It was only recently that The Green Giant opened up documentation for the other chefs in the kitchen... so these results may change soon.

If you are interested in gaming on Linux, give the article a read.

Source: Phoronix

MSI Z87I GAMING AC Motherboard and GTX 760 GAMING ITX Video Card

Subject: General Tech, Graphics Cards, Motherboards | December 4, 2013 - 12:02 AM |
Tagged: uppercase, msi, mini-itx

MSI is calling these products "Mini, but Mighty". They are designed for the mini-ITX form factor, which measures less than 7 inches in length and width. That size makes them very useful for home theater PCs (HTPCs) and other builds where discretion is valuable. You also want these machines to be quiet, which MSI claims this product series is.

The name is also written in all caps, so you can imagine yourself yelling every time you read it.

msi-GAMING-moboandgpu.png

The MSI Z87I GAMING AC motherboard comes with an Intel 802.11ac (hence "GAMING AC", I assume) wireless adapter. If you prefer a wired connection, there is a Killer E2205 Ethernet adapter from Qualcomm's BigFoot Networks (even small PCs can be BigFoot). Also included is an HDMI 1.4 output capable of 4K video (HDMI 1.4 is limited to 30Hz output at 2160p).
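That 30Hz ceiling comes straight from the interface's bandwidth budget. As a rough illustration, the sketch below compares the pixel clock required for 2160p at 30Hz and 60Hz against the HDMI 1.4 TMDS limit; the blanking totals and the 340 MHz ceiling are approximate industry figures used for illustration, not MSI specifications.

```c
/* Rough sketch: why HDMI 1.4 tops out at 30Hz for 3840x2160.
 * The horizontal/vertical totals (active + blanking) and the 340 MHz TMDS
 * clock ceiling are approximate figures used for illustration. */
#include <stdio.h>

int main(void)
{
    const double h_total = 4400.0;           /* 3840 active + blanking (approx.) */
    const double v_total = 2250.0;           /* 2160 active + blanking (approx.) */
    const double hdmi14_tmds_limit_mhz = 340.0;

    for (double hz = 30.0; hz <= 60.0; hz += 30.0) {
        double pixel_clock_mhz = h_total * v_total * hz / 1e6;
        printf("2160p @ %2.0f Hz needs ~%3.0f MHz pixel clock -> %s\n",
               hz, pixel_clock_mhz,
               pixel_clock_mhz <= hdmi14_tmds_limit_mhz
                   ? "fits within HDMI 1.4"
                   : "exceeds HDMI 1.4");
    }
    return 0;
}
```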

Good features to have, especially for an HTPC build.

The other launch is the GTX 760 GAMING ITX video card. This is a miniature GeForce GTX 760 designed to fit in mini-ITX cases. If your box is a home theater PC, expect it to run just about any game at 1080p.

No information on pricing and availability yet. Check out the press release after the break.

Source: MSI

AMD A10-7850K and A10-7700K Kaveri Leaks Including Initial GPU Benchmarks

Subject: General Tech, Graphics Cards, Processors | December 3, 2013 - 04:12 AM |
Tagged: Kaveri, APU, amd

The launch and subsequent availability of Kaveri are scheduled for the CES time frame. The APU unites Steamroller x86 cores with several Graphics Core Next (GCN) compute units. The high-end offering, the A10-7850K, is rated for 856 GFLOPS of compute power (most of which, of course, comes from the GPU).

amd-kaveri-slide.png

Image/Leak Credit: Prohardver.hu

We now know about two SKUs: the A10-7850K and the A10-7700K. Both parts are quite similar, except that the higher model is given a 200 MHz CPU bump, from 3.8 GHz to 4.0 GHz, and 33% more GPU compute units, 8 versus 6.
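As a sanity check, that 856 GFLOPS figure lines up with simple peak-throughput arithmetic: GPU shaders x clock x 2 FLOPs per cycle (fused multiply-add), plus the CPU cores' contribution. The sketch below reproduces a number in that ballpark, but note that the GPU clock (~720 MHz) and CPU base clock (~3.7 GHz) are our own assumptions for illustration, not figures from this leak.

```c
/* Back-of-the-envelope peak GFLOPS for a Kaveri-class APU.
 * The shader count follows from 8 GCN compute units x 64 lanes; the clock
 * speeds and CPU FLOPs-per-cycle below are assumptions for illustration,
 * not official A10-7850K specifications. */
#include <stdio.h>

int main(void)
{
    const double gpu_shaders         = 8 * 64;  /* 8 CUs x 64 lanes = 512 */
    const double gpu_clock_ghz       = 0.72;    /* assumed GPU clock */
    const double cpu_cores           = 4;
    const double cpu_clock_ghz       = 3.7;     /* assumed CPU base clock */
    const double cpu_flops_per_cycle = 8;       /* per core, single precision */

    double gpu_gflops = gpu_shaders * gpu_clock_ghz * 2.0;  /* FMA = 2 FLOPs */
    double cpu_gflops = cpu_cores * cpu_clock_ghz * cpu_flops_per_cycle;

    printf("GPU: ~%.0f GFLOPS, CPU: ~%.0f GFLOPS, total: ~%.0f GFLOPS\n",
           gpu_gflops, cpu_gflops, gpu_gflops + cpu_gflops);
    return 0;
}
```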

But how does this compare? The original source (prohardver.hu) claims that Kaveri achieves an average of 28 FPS in Crysis 3 on Low at 1680x1050, a 12% increase over Richland. It also achieved an average of 53 FPS in Sleeping Dogs on Medium, which is 26% more than Richland.

These are healthy increases over the previous generation, and they do not even account for HSA advantages. I am really curious what will happen if integrated graphics become accessible enough that game developers decide to target them for general compute applications. The reduction in latency (the semi-wasted time spent bouncing memory between compute devices) might open this architecture up to workloads where it can really shine.

We will do our best to keep you up to date on this part especially when it launches at CES.

Source: ProHardver

NVIDIA Launches GeForce Experience 1.8

Subject: General Tech, Graphics Cards | December 2, 2013 - 03:16 PM |
Tagged: nvidia, ShadowPlay

They grow up so fast these days...

GeForce Experience is NVIDIA's software package, often bundled with its driver updates, designed to optimize the experience of its customers. That can mean adding interesting features, such as GPU-accelerated game video capture, or simply recommending graphics settings for popular games.

geforce-experience-1-8-adjusted-optimal-settings-overclock.png

Version 1.8 adds many desired features that were lacking from the previous version. I always found it weird that GeForce Experience would recommend a single baseline of settings for a game, and set it for you, but force you to go into the game and tweak from there. It would be nice to see multiple presets, but that is not what we get; instead, we can now tweak the settings from within GeForce Experience itself. The baseline tries to provide a solid 40 FPS at the most computationally difficult moments, and you can then tune the familiar performance-versus-quality slider from there.

You are also able to set resolutions up to 3840x2160 and select whether you would like to play in windowed (including "borderless") mode.

geforce-experience-1-8-shadowplay-recording-time.png

Also, with ShadowPlay, Windows 7 users will now be able to "shadow" the last 20 minutes of gameplay like their Windows 8 neighbors. You can also combine your microphone audio with the in-game audio, should you select it; I can see that feature being very useful for shoutcasters. Apparently it allows capturing VoIP communication, not just your microphone itself.

There is still no streaming to Twitch.tv, but it is coming.

For now, you can download GeForce Experience from NVIDIA's GeForce website. If you want a little more detail first, you can check out their (much longer) blog post.

Intel Xeon Phi to get Serious Refresh in 2015?

Subject: General Tech, Graphics Cards, Processors | November 28, 2013 - 03:30 AM |
Tagged: Intel, Xeon Phi, gpgpu

Intel tested the waters with its Xeon Phi co-processor. Based on the architecture designed for the original Pentium processors, it was released in six products ranging from 57 to 61 cores and 6 to 16GB of RAM, which yields double precision performance of between 1 and 1.2 TFLOPS. It was fabricated using Intel's 22nm tri-gate technology. All of this fell under the Knights Corner initiative.

Intel_Xeon_Phi_Family.jpg

In 2015, Intel plans to have Knights Landing ready for consumption. A modified Silvermont architecture will replace the many simple (basically 15-year-old) cores of the previous generation; up to 72 Silvermont-based cores (each with 4 threads), in fact. It will also introduce the AVX-512 instruction set, which lets applications operate on 8 64-bit (double-precision float or long integer) or 16 32-bit (single-precision float or standard integer) values at once.

In other words, packing a bunch of related problems into a single instruction.
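For a concrete picture of what that means in code, here is a small sketch using the AVX-512 foundation intrinsics. This is generic AVX-512F code, not anything Knights Landing-specific, and it assumes a compiler and CPU with AVX-512 support (e.g. building with -mavx512f on GCC or Clang).

```c
/* Sketch: adding 8 double-precision values with one AVX-512 instruction.
 * Requires AVX-512F support in both the compiler and the CPU. */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    double a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    double b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
    double c[8];

    __m512d va = _mm512_loadu_pd(a);     /* load 8 doubles into a 512-bit register */
    __m512d vb = _mm512_loadu_pd(b);
    __m512d vc = _mm512_add_pd(va, vb);  /* one instruction, eight additions */
    _mm512_storeu_pd(c, vc);

    for (int i = 0; i < 8; i++)
        printf("%g ", c[i]);
    printf("\n");
    return 0;
}
```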

The most interesting part? Two versions will be offered: add-in boards (AIBs) and a standalone CPU. Because of its x86 heritage, the standalone version will not require a host CPU if your application is entirely suited to the MIC architecture; unlike a Tesla, it can boot existing and common OSes. It can also be paired with standard Xeon processors if you would like a few strong threads alongside the 288 (72 x 4) that the Xeon Phi provides.

And, while I doubt Intel would want to cut anyone else in, VR-Zone notes that this opens the door for AIB partners to make non-reference cards and manage some level of customer support. I'll believe a non-Intel branded AIB only when I see it.

Source: VR-Zone

Controversy continues to erupt over AMD's new GPU

Subject: Graphics Cards | November 27, 2013 - 04:44 PM |
Tagged: sapphire, radeon, R9 290X, hawaii, amd, 290x

Ryan is not the only one who felt it necessary to investigate the reports of differing performance between retail R9 290X cards and the ones sent out for review.  Legit Reviews also ordered a retail card made by Sapphire and tested it against the card sent to them by AMD.  As with our results, ambient temperature had more effect on the frequency of the retail card than it did on the press sample, with a 14% difference being common.  Legit had another idea after noticing that, while the BIOS version was the same on both cards, the part numbers differed.  Find out what happened when they flashed the retail card to exactly match the press sample.

290x-cards-645x485.jpg

"The AMD Radeon R9 290X and R9 290 have been getting a ton of attention lately due to a number of reports that the retail cards are performing differently than the press cards that the media sites received. We have been following these stories for the past few weeks and finally decided to look into the situation ourselves."


Graphics Cards

Author:
Manufacturer: Sapphire

Another retail card reveals the results

Since the release of the new AMD Radeon R9 290X and R9 290 graphics cards, we have been very curious about the latest implementation of AMD's PowerTune technology and how it scales clock frequency in response to the thermal state of each graphics card.  In the first article covering this topic, I addressed the questions from AMD's point of view: is this really a "configurable" GPU, as AMD claims, or are there issues that need to be addressed by the company?

The biggest problems I found were the highly variable clock speeds from game to game and from a "cold" GPU to a "hot" GPU.  This affects the way many people in the industry test and benchmark graphics cards, as running a game for just a couple of minutes can produce average and reported frame rates that are much higher than what you see 10-20 minutes into gameplay.  This was rarely something that had to be dealt with before (especially on AMD graphics cards), so it caught many off-guard.
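A trivial way to see whether a card is drifting is to compare the early part of a benchmark log against the whole run. The sketch below is only an illustration of that sanity check, assuming a hypothetical log of "seconds fps" pairs on standard input; it is not our Frame Rating tooling.

```c
/* Sketch of the "cold vs. warm" GPU problem: compare average FPS over the
 * first two minutes of a run against the average over the entire run.
 * Input format (hypothetical): one "seconds fps" pair per line on stdin. */
#include <stdio.h>

int main(void)
{
    double t, fps, early_sum = 0.0, full_sum = 0.0;
    long early_n = 0, full_n = 0;

    while (scanf("%lf %lf", &t, &fps) == 2) {
        if (t <= 120.0) { early_sum += fps; early_n++; }  /* first two minutes */
        full_sum += fps;
        full_n++;
    }

    if (early_n > 0 && full_n > 0)
        printf("First 2 minutes: %.1f FPS, full run: %.1f FPS\n",
               early_sum / early_n, full_sum / full_n);
    return 0;
}
```

A large gap between those two numbers is exactly the kind of result that makes short benchmark runs misleading.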

02.jpg

Because of the new PowerTune technology, as I have discussed several times before, clock speeds start off quite high on the R9 290X (at or near the quoted 1000 MHz) and then slowly drift down over time.

Another wrinkle occurred when Tom's Hardware reported that the retail graphics cards it had seen were showing markedly lower performance than the reference samples sent to reviewers.  As a result, AMD quickly released a new driver that attempted to address the problem by normalizing to fan speed (RPM) rather than fan duty cycle (percentage).  The result was consistent fan speeds across different cards and thus much closer performance.

However, with all that being said, I was still testing retail AMD Radeon R9 290X and R9 290 cards that were PURCHASED rather than sampled, to keep tabs on the situation. 

Continue reading our article on retail variance in R9 290X clock speeds and performance!!

So Apparently Some R9 290 Cards Can Flash into a 290X?

Subject: General Tech, Graphics Cards | November 26, 2013 - 03:18 AM |
Tagged: R9 290X, r9 290, amd

Multiple sites are reporting that some of AMD's Radeon R9 290 cards can be software-unlocked into 290Xs with a simple BIOS update. While the difference in performance is minor, free extra shader processors might be tempting for some existing owners.

"Binning" is when a manufacturer increases yield by splitting one product into several based on how they test after production. Semiconductor fabrication, specifically, is prone to constant errors and defects. Maybe only some of your wafers are not stable at 4 GHz but they can attain 3.5 or 3.7 GHz. Why throw those out when they can be sold as 3.5 GHz parts?

amd-gpu14-06.png

This is especially relevant to multi-core CPUs and GPUs. Hawaii XT has 2816 stream processors; a compelling product could be made even with a few of those shut down. The R9 290, for instance, enables 2560 of these cores. The remaining cores have been laser-cut or, at least, should have been.

Apparently, certain batches of Radeon R9 290s shipped with fully functional Hawaii XT chips that were merely software-locked to 290 specifications. Several users, with cards from multiple OEMs, report being able to flash a new BIOS to unlock the extra cores. Other batches, however, seem to be properly locked.

This could be interesting for lucky and brave users but I wonder why this happened. I can think of two potential causes:

  • Someone (OEMs or AMD) had too many 290X chips, or
  • The 290 launch was just that unprepared.

Either way, newer shipments should be properly locked even from affected OEMs. Again, not that it really matters given the performance differences we are talking about.

Source: WCCFTech