AMD Gaming Evolved App with Redeemable Prizes

Subject: General Tech, Graphics Cards | February 18, 2014 - 09:01 PM |
Tagged: raptr, gaming evolved, amd

The AMD Gaming Evolved App updates your drivers, optimizes your game settings, streams your gameplay to Twitch, accesses some social media platforms, and now gives out prizes. Points are awarded for playing games through the app, optimizing game settings, and so forth, and can be exchanged for rewards ranging from free games to Sapphire R9-series graphics cards.

amd-raptr.jpg

This program has been in beta for a little while now, without the ability to redeem points. The system has been restructured to encourage using the entire app by lowering the accumulation rate for playing games and adding other goals. Beta participants do not lose all of their points; rather, their balances are rescaled to be more in line with the new system.

The Gaming Evolved prize program has launched today.

Press release after the teaser.

Source: raptr

NVIDIA Releases GeForce TITAN Black

Subject: General Tech, Graphics Cards | February 18, 2014 - 06:03 AM |
Tagged: nvidia, gtx titan black, geforce titan, geforce

NVIDIA has just announced the GeForce GTX Titan Black. Based on the full high-performance Kepler (GK110) chip, it is mostly expected to be a lower cost development platform for GPU processing applications. All 2,880 single precision (FP32) CUDA Cores and 960 double precision (FP64) CUDA Cores are unlocked, yielding 5.1 TeraFLOPS of single precision and 1.3 TeraFLOPS of double precision floating point performance. The chip contains 1536 KB of L2 cache and will be paired with 6GB of video memory on the board.
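As a rough sanity check on those numbers, peak throughput for a GPU of this generation works out to core count × 2 FLOPs per clock (one fused multiply-add) × clock speed. A quick sketch, assuming the 889 MHz base clock from public GK110 specifications (the clock is not stated in this article):

```python
# Peak throughput ≈ cores × 2 FLOPs per clock (fused multiply-add) × clock.
# The 889 MHz base clock is an assumption from public GK110 specs,
# not a figure given in this article.
def peak_tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000.0  # GFLOPS down to TFLOPS

fp32 = peak_tflops(2880, 0.889)
print(round(fp32, 1))  # ~5.1 TFLOPS single precision
```

The 960 FP64 units are exactly a third of the 2,880 FP32 units, which matches GK110's 1:3 double precision design ratio.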

nvidia-titan-black-2.jpg

The original GeForce GTX Titan launched last year, almost to the day. Also based on the GK110 design, it featured full-rate double precision performance with only one SMX disabled. Of course, no component at the time contained a fully-enabled GK110 processor. The first product with all 15 SMX units active was not realized until the Quadro K6000, announced in July but only available in the fall. It was followed by the GeForce GTX 780 Ti (with a fraction of its FP64 performance) in November, and the fully powered Tesla K40 less than two weeks after that.

nvidia-titan-black-3.jpg

For gaming applications, this card is expected to have comparable performance to the GTX 780 Ti... unless you can find a use for the extra 3GB of memory. Games see little benefit from the extra 64-bit (double precision) floating point performance because the vast majority of their calculations are done at 32-bit precision.

The NVIDIA GeForce GTX Titan Black is available today at a price of $999.

Source: NVIDIA

AMD Radeon R9 290X Hits $900 on Newegg. Thanks *coin

Subject: General Tech, Graphics Cards | February 14, 2014 - 03:02 PM |
Tagged: supply shortage, shortage, R9 290X, podcast, litecoin, dogecoin, bitcoin

UPDATE (Feb 14th, 11pm ET): As a commenter has pointed out below, suddenly, as if by magic, Newegg has lowered prices on the currently in stock R9 290X cards by $200.  That means you can currently find them for $699 - only $150 over the expected MSRP.  Does that change anything about what we said above or in the video?  Not really.  It only lowers the severity.

I am curious to know if this was done by Newegg voluntarily due to pressure from news stories such as these, lack of sales at $899 or with some nudging from AMD...

If you have been keeping up with our podcasts and reviews, you will know that AMD cards are great compute devices for their MSRP. Cryptocurrency mining puts a price on that capability: run enough hashing work and you are rewarded with newly created tokens (or a fee from validated transactions). Some people evidently value these GPUs for that purpose above their MSRP, so retailers raise prices and people keep buying them.

amd-shortage-900.png

Currently, the cheapest R9 290X is selling for $900, a 64% increase over AMD's intended $549 MSRP. AMD is not even the one receiving this extra money!

This shortage also affects other products such as Corsair's 1200W power supply. Thankfully, only certain components are necessary for mining (mostly GPUs and a lot of power) so at least we are not seeing the shortage spread to RAM, CPUs, APUs, and so forth. We noted a mining kit on Newegg which was powered by a Sempron processor. This line of cheap and low-performance CPUs has not been updated since 2009.

We have tracked GPU shortages before; we did semi-regular availability checks during the GeForce GTX 680 and 690 launch windows. The former was out of stock for over two months after its launch, and those cards also sometimes strayed slightly from their MSRPs.

Be sure to check out the clip (above) for a nice, 15-minute discussion.

Pitcairn rides again in the R7 265

Subject: Graphics Cards | February 13, 2014 - 11:31 AM |
Tagged: radeon, r7 265, pitcairn, Mantle, gpu, amd

Some time in late February or March you will be able to purchase the R7 265 for around $150, a decent price for an entry-level GPU that will benefit those currently dependent on the graphics portion of an APU.  That raises the question of its performance, and whether this Pitcairn refresh will really benefit a gamer on a tight budget.  Hardware Canucks tested it against the two NVIDIA cards closest in price: the GTX 650 Ti Boost, which is almost impossible to find, and the GTX 660 2GB, which costs $40 more than the R7 265's MSRP.  The GTX 660 is faster overall, but on price-to-performance the R7 265 is the more attractive offering.  Of course, with NVIDIA's Maxwell release just around the corner, this could change drastically.

If you already caught Ryan's review, you might have missed the short video he just added on the last page.

slides04.jpg

Crowded house

"AMD's R7 265 is meant to reside in the space between the R7 260X and R9 270, though performance is closer to its R9 sibling. Could this make it a perfect budget friendly graphics card?"

Here are some more Graphics Card articles from around the web:

Graphics Cards

AMD Launches Radeon R7 250X at $99 - HD 7770 Redux

Subject: Graphics Cards | February 9, 2014 - 09:00 PM |
Tagged: radeon, R7, hd 7770, amd, 250x

With the exception of the R9 290X, the R9 290, and the R7 260X, AMD's recent branding campaign with the Radeon R7 and R9 series of graphics cards is really just a reorganization and rebranding of existing parts.  When we reviewed the Radeon R9 280X and R9 270X, both were well-known entities, though this time with lower price tags to sweeten the pot.  

Today, AMD is continuing the process of building the R7 graphics card lineup with the R7 250X.  If you were looking for a new ASIC, maybe one that includes TrueAudio support, you are going to be let down.  The R7 250X is essentially the same part that was released as the HD 7770 in February of 2012: Cape Verde.

02.jpg

AMD calls the R7 250X "the successor" to the Radeon HD 7770, and it is targeting the 1080p gaming landscape at the $99 price point.  For those keeping track at home, the Radeon HD 7770 GHz Edition parts are currently selling for the same price.  The R7 250X will be available in both 1GB and 2GB variants with a 128-bit GDDR5 memory bus running at 4.5 GHz.  The card requires a single 6-pin power connector and we expect a TDP of 95 watts.  

Here is a table that details the current product stack of GPUs from AMD under $140.  It's quite crowded as you can see.

                    Radeon R7 260X   Radeon R7 260   Radeon R7 250X   Radeon R7 250   Radeon R7 240
GPU Code name       Bonaire          Bonaire         Cape Verde       Oland           Oland
GPU Cores           896              768             640              384             320
Rated Clock         1100 MHz         1000 MHz        1000 MHz         1050 MHz        780 MHz
Texture Units       56               48              40               24              20
ROP Units           16               16              16               8               8
Memory              2GB              2GB             1 or 2GB         1 or 2GB        1 or 2GB
Memory Clock        6500 MHz         6000 MHz        4500 MHz         4600 MHz        4600 MHz
Memory Interface    128-bit          128-bit         128-bit          128-bit         128-bit
Memory Bandwidth    104 GB/s         96 GB/s         72 GB/s          73.6 GB/s       28.8 GB/s
TDP                 115 watts        95 watts        95 watts         65 watts        30 watts
Peak Compute        1.97 TFLOPS      1.53 TFLOPS     1.28 TFLOPS      0.806 TFLOPS    0.499 TFLOPS
MSRP                $139             $109            $99              $89             $79
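Two of the derived rows above can be sanity-checked with simple arithmetic: peak compute is stream processors × 2 FLOPs per clock (multiply-add) × clock speed, and GDDR5 bandwidth is bus width in bytes × effective memory clock. A quick sketch using the R7 250X column as an example:

```python
# Verify the R7 250X's table entries from its raw specs.
def peak_tflops(cores, clock_mhz):
    # cores × 2 FLOPs/clock × MHz gives MFLOPS; divide down to TFLOPS.
    return cores * 2 * clock_mhz / 1e6

def bandwidth_gbs(bus_bits, mem_clock_mhz):
    # (bus width in bytes) × effective memory clock in GHz.
    return bus_bits / 8 * mem_clock_mhz / 1000

print(peak_tflops(640, 1000))    # 1.28 TFLOPS, matching the Peak Compute row
print(bandwidth_gbs(128, 4500))  # 72.0 GB/s, matching the Memory Bandwidth row
```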

The current competition from NVIDIA rests in the hands of the GeForce GTX 650 and the GTX 650 Ti, a GPU that was itself released in late 2012.  Since we already know what performance to expect from the R7 250X because of its pedigree, the AMD-provided numbers below aren't really that surprising.

01.jpg

AMD did leave the GTX 650 Ti out of the graph above... but no matter; we'll be doing our own testing soon enough, once our R7 250X cards find their way into the PC Perspective offices.  

The AMD Radeon R7 250X will be available starting today but if that is the price point you are looking at, you might want to keep an eye out for sales on those remaining Radeon HD 7770 GHz Edition parts.

Source: AMD

Linus Brings SLI and Crossfire Together

Subject: General Tech, Graphics Cards | February 7, 2014 - 12:54 AM |
Tagged: sli, crossfire

I will not even call this a thinly-veiled rant. Linus admits it. To make a point, he assembled a $5000 PC running a pair of NVIDIA GeForce GTX 780 Ti cards alongside a pair of AMD Radeon R9 290X graphics cards. While Bitcoin mining would likely utilize all four video cards well enough, games will not. Of course, he did not even mention the former application (thankfully).

No, his complaint was about vendor-specific features.

Honestly, he's right. One of the reasons why I am excited about OpenCL (and its WebCL companion) is that it simply does not care about devices. Your host code manages the application but, when the work gets heavy, it enlists help from an available accelerator by telling it to run a kernel (think of it like a function) and share the resulting chunk of memory.

This can be an AMD GPU. This can be an NVIDIA GPU. This can be an x86 CPU. This can be an FPGA. If the host has multiple, independent tasks, it can be several of the above (and in any combination). OpenCL really does not care.
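That device-agnostic model can be sketched loosely in plain Python. This is an analogy, not real OpenCL code: the executor stands in for whatever accelerator happens to be available, and the host logic neither knows nor cares what it is.

```python
from concurrent.futures import ThreadPoolExecutor

# A "kernel": a pure function applied across a buffer, loosely analogous
# to an OpenCL kernel. saxpy (a*x + y) is a classic example workload.
def saxpy(args):
    a, x, y = args
    return a * x + y

def run_kernel(executor, kernel, buffer):
    # The host only enqueues work and collects the resulting buffer;
    # the executor could wrap threads, processes, or anything else.
    return list(executor.map(kernel, buffer))

buffer = [(2.0, float(i), 1.0) for i in range(4)]
with ThreadPoolExecutor() as pool:
    print(run_kernel(pool, saxpy, buffer))  # [1.0, 3.0, 5.0, 7.0]
```

The point of the indirection is that `run_kernel` never names a backend, just as OpenCL host code enqueues kernels to whichever device the platform reports.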

The only limitation is whether tasks can effectively utilize all accelerators present in a machine. This might be the future we are heading for. This is the future I envisioned when I started designing my GPU-accelerated software rendering engine. In that case, I also envisioned the host code being abstracted into JavaScript - because when you jump into platform agnosticism, jump in!

Obviously, to be fair, AMD is very receptive to open platforms. NVIDIA is less so, and they are honest about that, but they conform to standards when doing so benefits their users more than their proprietary alternatives would. I know that point can be taken multiple ways, and several will be hotly debated, but I really cannot find the words to properly narrow it.

Despite the fragmentation in features, there is one thing to be proud of as a PC gamer. You may have different experiences depending on the components you purchase.

But, at least you will always have an experience.

AMD Radeon R7 250X Spotted

Subject: General Tech, Graphics Cards | February 6, 2014 - 05:54 PM |
Tagged: amd, radeon, R7 250X

The AMD Radeon R7 250X has been mentioned on a few different websites over the last day, one of which was tweeted by AMD Radeon Italia. The SKU, which bridges the gap between the R7 250 and the R7 260, is expected to have a graphics processor with 640 Stream Processors, 40 TMUs, and 16 ROPs. It should be a fairly silent launch, with 1GB and 2GB versions appearing soon for an expected price of around 90 Euros, including VAT.

AMD-Sapphire-R7-250X-620x620.jpg

Image Credit: Videocardz.com

The GPU is expected to be based on the 28nm Oland chip design.

While it may seem like a short, twenty-Euro jump from the R7 250 to the R7 260, single-precision FLOP performance actually doubles, from around 800 GFLOPS to around 1550 GFLOPS. If that metric is indicative of overall performance, there is quite a large gap to place a product within.

We still do not know official availability.

Source: Videocardz

In a Galaxy far far away?

Subject: General Tech, Graphics Cards | February 6, 2014 - 10:44 AM |
Tagged:

*****update*******

We have more news and it is good for Galaxy fans.  The newest update states that they will be sticking around!

galaxy2.png

Good news, GPU fans: the rumours that Galaxy's GPU team is leaving the North American market might be somewhat exaggerated, at least according to their PR team. 

Streisand.png

This post appeared on Facebook and was quickly taken off again, perhaps for rewording or perhaps it is a perfect example of the lack of communication that [H]ard|OCP cites in their story.  Stay tuned as we update you as soon as we hear more.

box.jpg

Party like it's 2008!

[H]ard|OCP has been following Galaxy's business model closely for the past year, having seen hints that the reseller just didn't get the North American market.  Their concern grew as they tried and failed to contact Galaxy at the end of 2013: emails went unanswered and advertising campaigns seemed to all but disappear.  Even with this reassurance that Galaxy is not planning to leave the North American market, a lot of what [H] says rings true; given the stock and delivery issues Galaxy seemed to have over the past year, something is going on behind the scenes.  Still, it is not worth abandoning them completely and turning this into a self-fulfilling prophecy; they have been in this market for a long time and may just be getting ready to move forward in a new way.  On the other hand, you might be buying a product that will not have warranty support in the future.

"The North American GPU market has been one that is at many times a swirling mass of product. For the last few years though, we have seen the waters calm in that regard as video card board partners have somewhat solidified and we have seen solid players emerge and keep the stage. Except now we seen one exit stage left."

Here is some more Tech News from around the web:

Tech Talk

Source: [H]ard|OCP

Focus on Mantle

Subject: General Tech, Graphics Cards | February 5, 2014 - 11:43 AM |
Tagged: gaming, Mantle, amd, battlefield 4

Now that the new Mantle-enabled driver has been released, several sites have had a chance to try out the new API and see what effect it has on Battlefield 4.  [H]ard|OCP took a stock XFX R9 290X paired with an i7-3770K and tested both single-player and multiplayer BF4 performance; the pattern they saw led them to believe Mantle is more effective at relieving CPU bottlenecks than ones caused by the GPU.  The performance increases they saw were greater at lower resolutions than at high resolutions.  The Tech Report paired another XFX R9 290X with an A10-7850K and an i7-4770K and compared the systems' performance under D3D as well as Mantle.  To make the tests even more interesting they also tested D3D with a 780 Ti, which you should fully examine before deciding which performs best.  Their findings were in line with [H]ard|OCP's, and they observed that Mantle is going to offer the greatest benefits to lower-powered systems, with not a lot to be gained by high-end systems on the current version of Mantle.  Legit Reviews performed similar tests but also brought the Star Swarm demo into the mix, using an R7 260X for their GPU.  You can catch all of our coverage by clicking on the Mantle tag.

bf4-framegraph.jpg

"Does AMD's Mantle graphics API deliver on its promise of smoother gaming with lower-spec CPUs? We take an early look at its performance in Battlefield 4."

Here is some more Tech News from around the web:

Gaming

NitroWare Tests AMD's Photoshop OpenCL Claims

Subject: General Tech, Graphics Cards, Processors | February 4, 2014 - 11:08 PM |
Tagged: photoshop, opencl, Adobe

Adobe has recently enhanced Photoshop CC to accelerate certain filters via OpenCL. AMD contacted NitroWare with this information and claims of 11-fold performance increases with "Smart Sharpen" on Kaveri, specifically. The computer hardware site decided to test these claims on a Radeon HD 7850 using the test metrics that AMD provided them.

Sure enough, he noticed a 16-fold gain in performance. Without OpenCL, the filter's loading bar was on screen for over ten seconds; with it enabled, there was no bar.

Dominic from NitroWare is careful to note that an HD 7850 is significantly higher performance than an APU (barring some weird scenario involving memory transfers or something). This might mark the beginning of Adobe's road to sensible heterogeneous computing outside of video transcoding. Of course, this will also be exciting for AMD. While they cannot keep up with Intel, thread per thread, they are still a heavyweight in terms of total performance. With Photoshop, people might actually notice it.

AMD Catalyst 14.1 Beta Available Now. Now, Chewie, NOW!

Subject: General Tech, Graphics Cards | February 1, 2014 - 08:29 PM |
Tagged: Mantle, BF4, amd

AMD has released the Catalyst 14.1 Beta driver (even for Linux) but you should, first, read Ryan's review. This is a little less than what he expects in a Beta from AMD. We are talking about crashes to desktop and freezes while loading a map on a single-GPU configuration - and Crossfire is a complete wash in his experience (although AMD acknowledges the latter in their release notes). According to AMD, there is even the possibility that the Mantle version of Battlefield 4 will render with your APU and ignore your dedicated graphics.

amd-bf4-mantle.jpg

If you are determined to try Catalyst 14.1, however, it does make a first step into the promise of Mantle. Some situations show slightly lower performance than DirectX 11, albeit with a higher minimum framerate, while other results impress with double-digit percentage gains.

Multiplayer in BF4, where the CPU is more heavily utilized, seems to benefit the most (thankfully).

If you understand the risk (in terms of annoyance and frustration), and still want to give it a try, pick up the driver from AMD's support website. If not? Give it a little more time for AMD to whack-a-bug. At some point, there should be truly free performance waiting for you.

Press release after the break!

Source: AMD

Video Perspective: Free to Play Games on the A10-7850K vs. Intel Core i3 + GeForce GT 630

Subject: Graphics Cards, Processors | January 31, 2014 - 01:36 PM |
Tagged: 7850k, A10-7850K, amd, APU, gt 630, Intel, nvidia, video

As a follow up to our first video posted earlier in the week that looked at the A10-7850K and the GT 630 from NVIDIA in five standard games, this time we compare the A10-7850K APU against the same combination of the Intel and NVIDIA hardware in five of 2013's top free to play games.

UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing.  Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface.  You can see a comparison of the three current GT 630 options on NVIDIA's website here.

If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.

Need the Double D? XFX has the R9 290X for you!

Subject: Graphics Cards | January 30, 2014 - 12:15 PM |
Tagged: xfx, double d, R9 290X

The only thing more fun than an XFX Double Dissipation R9 290X is two of them in CrossFire, which is exactly what [H]ard|OCP just tested.  These cards sport the familiar custom cooler, though they are not overclocked; nor is [H] testing overclocking in this review, though they will revisit the card in the future to do exactly that.  This review is about the out-of-the-box CrossFire performance of these cards, and it is rather impressive.  When [H] tested 4K performance they could feel the frame-pacing improvements the new driver brings, and they saw these cards outperform SLI'd GTX 780 Ti cards in every test, though not always by a huge margin.  The current selling price of these cards is about $100 above MSRP, but they still come in cheaper than the current NVIDIA card; these particular cards really show off what Hawaii can be capable of.

1390541180GVyEsgrEO9_1_1.jpg

"Take two custom XFX R9 290X Double Dissipation Edition video cards, enable CrossFire, and let your jaw hit the floor. We will test this combination against the competition in a triple-display Eyefinity setup as well as 4K Ultra HD display gaming. We will find out if custom cards hold any advantage over the reference designed R9 290X."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Video Perspective: 2013 Games on the A10-7850K vs. Intel Core i3 + GeForce GT 630

Subject: Graphics Cards, Processors | January 29, 2014 - 12:44 PM |
Tagged: video, nvidia, Intel, gt 630, APU, amd, A10-7850K, 7850k

The most interesting aspect of the new Kaveri-based APUs from AMD, in particular the A10-7850K, is how they improve mainstream gaming performance.  AMD has always claimed that these APUs shake up the need for low-cost discrete graphics, and when we got the new APU in the office we ran a couple of quick tests to see how much validity there is to that claim.

In this short video we compare the A10-7850K APU against a combination of the Intel Core i3-4330 and GeForce GT 630 discrete graphics card in five of 2013's top PC releases.  I think you'll find the results pretty interesting.

UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing.  Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface.  You can see a comparison of the three current GT 630 options on NVIDIA's website here.

If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.

Curious where all of those AMD R9 graphics cards are going?

Subject: Graphics Cards | January 29, 2014 - 12:00 PM |
Tagged: R9 290X, r9 290, r9 270, mining, litecoin, dogecoin, amd

I know we have posted about this a few times on PC Perspective and have discussed it on the PC Perspective Podcast as well, but if you are curious as to why the prices of AMD's latest generation of R9 graphics cards have skyrocketed, look no further than this enterprising consumer and his/her Dogecoin mining rig.  

riser1.jpg

riser2.jpg

What you are looking at are six MSI Gaming series R9 270 cards running through the aid of PCI Express to USB 3.0 riser cards.  

riser3.jpg

You can see the rest of the photos here, and if you want to see more of this kind of graphics card abuse you can also check out the Litecoin Mining subreddit where this was sourced.

Source: Imgur

These Aren't the Drivers You're Looking For. AMD 13.35 Leak.

Subject: General Tech, Graphics Cards | January 28, 2014 - 04:00 PM |
Tagged: Mantle, BF4, amd

A number of sites have reported on Toshiba's leak of the Catalyst 13.35 BETA driver. Mantle and TrueAudio support highlight its rumored changelog. Apparently Ryan picked it up, checked it out, and found that it does not have the necessary DLLs included. I do not think he has actual Mantle software to test against, and I am not sure how he knew what libraries Mantle requires, but this package apparently does not include them. Perhaps it was an incomplete build?

amd-not-drivers.jpg

Sorry folks, unlike the above image, these are not the drivers you are looking for.

The real package should be coming soon, however. Recent stories which reference EA tech support (at this point we should all know better) claim that the Mantle update for Battlefield 4 will be delayed until February. Fans reached out to AMD's Robert Hallock who responded that it was, "Categorically not true". It sounds like AMD is planning on releasing at least their end of the patch before Friday ends.

This is looking promising, at least. Something is being done behind the scenes.

Is custom air cooling enough for the R9 290X?

Subject: Graphics Cards | January 23, 2014 - 03:01 PM |
Tagged: amd, asus, R9 290X DC2 OC, overclocking

[H]ard|OCP has had a chance to really see how well the R9 290X can overclock; since frequencies drop as heat increases, a quick gaming session is not enough to truly represent the performance of this new GPU.  The ASUS R9 290X DirectCU II OC offers a custom cooler that demonstrated the overclocking potential of this GPU on air cooling, or at least of this specific GPU, as we have seen solid evidence of performance variability among 28nm Hawaii GPUs.  You should read the full review to truly understand what they saw when overclocking, but the good news is that once they found a sweet spot for fan speed and voltage, the GPU remained at the frequency they chose.  Unfortunately, the 1115 MHz overclock they managed was only 75 MHz higher than the card's default speed, and while that could beat a stock GTX 780 Ti, the NVIDIA product overclocked higher and proved the superior card. 

1389621638LtCMz6ijr3_1_1.jpg

"We will take the ASUS R9 290X DC2 OC custom AMD R9 290X based video card and for the first time see how well the 290X can overclock. We will also for the first time compare it to an overclocked GeForce GTX 780 Ti video card head-to-head and see who wins when overclocking is accounted for."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Even More NVIDIA ShadowPlay Features with 1.8.2

Subject: General Tech, Graphics Cards | January 23, 2014 - 12:29 AM |
Tagged: ShadowPlay, nvidia, geforce experience

NVIDIA has been upgrading their GeForce Experience just about once per month, on average. Most of their attention has been focused on ShadowPlay, their video capture and streaming service for DirectX-based games. GeForce Experience 1.8.1 brought streaming to Twitch and the ability to overlay the user's webcam.

This time they add a little bit more control in how ShadowPlay records.

nvidia-shadowplay-jan2014.png

Until this version, users could choose between "Low", "Medium", and "High" quality presets. GeForce Experience 1.8.2 adds "Custom", which allows manual control over resolution, frame rate, and bit rate. NVIDIA wants to make it clear: frame rate controls the number of images per second and bit rate controls the file size per second. Reducing the frame rate without adjusting the bit rate will result in a file of the same size (just with better quality per frame).
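The arithmetic behind that point is simple: file size depends only on bit rate × duration, while frame rate just determines how those bits are divided among frames. A quick sketch with hypothetical numbers:

```python
# File size is bit rate × duration; frame rate does not enter into it.
def file_size_mb(bitrate_mbps, seconds):
    return bitrate_mbps * seconds / 8  # megabits down to megabytes

# The bit budget each individual frame receives.
def bits_per_frame(bitrate_mbps, fps):
    return bitrate_mbps * 1e6 / fps

print(file_size_mb(50, 60))  # 375.0 MB for one minute, at any frame rate
# Halving the frame rate doubles the bits available to each frame:
print(bits_per_frame(50, 30) / bits_per_frame(50, 60))  # 2.0
```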

Also with this update, NVIDIA allows users to set a push-to-talk key. I expect this will be mostly useful for Twitch streaming in a crowded dorm or household. Only transmitting your voice when you have something to say prevents someone else from accidentally transmitting theirs globally and instantaneously.

GeForce Experience 1.8.2 is available for download at the GeForce website. Users with a Fermi-based GPU will no longer be pushed GeForce Experience (because it really does not do anything for those graphics cards). The latest version can always be manually downloaded, however.

Is AMD Showing Decent Recovery?

Subject: General Tech, Graphics Cards, Processors | January 22, 2014 - 06:41 PM |
Tagged: amd

AMD had a decent quarter and came close to a profitable year as a whole. For the quarter ending on December 28th, the company managed $89 million in profit, after accounting for interest payments on loans and everything else. The whole year averaged out to a $103 million gain in operating income, although that still works out to a $74 million loss for the year, all things considered. Still: set a quarterly gain of $89 million against an annual loss of $74 million, and one more quarter like this would forgive the whole year.

amd-new2.png

This is a hefty turn-around from their billion dollar operating loss of last year.

This gain was led by Graphics and Visual Solutions. While Computing Solutions revenue has declined, the graphics team has steadily increased both revenue and profits. Graphics and Visual Solutions covers graphics processors as well as revenue from the game console manufacturers. Even then, the processor division is floating just below profitability.

Probably the best news for AMD is that they expect each of the next four quarters to be profitable. Hopefully this means there are no foreseen hurdles in the middle of their marathon.

Source: Ars Technica

(Phoronix) Intel Haswell iGPU Linux Performance in a Slump?

Subject: Editorial, General Tech, Graphics Cards | January 21, 2014 - 11:12 PM |
Tagged: linux, intel hd graphics, haswell

Looking through this post by Phoronix, it would seem that Intel had a significant regression in performance on Ubuntu 14.04 with the Linux 3.13 kernel. In some tests, HD 4600 only achieves about half of the performance recorded on the HD 4000. I have not been following Linux iGPU drivers and it is probably a bit late to do any form of in-depth analysis... but yolo. I think the article actually made a pretty big mistake and came to the exact wrong conclusion.

Let's do this!

7-TuxGpu.png

According to the article, in Xonotic v0.7, Ivy Bridge's Intel HD 4000 scores 176.23 FPS at 1080p on low quality settings. When you compare this to Haswell's HD 4600 and its 124.45 FPS result, this seems bad. However, even though they claim this as a performance regression, they never actually post earlier (and supposedly faster) benchmarks.

So I dug one up.

Back in October, the same test was performed with the same hardware. The Intel HD 4600 was not significantly faster back then; it was actually a bit slower, with a score of 123.84 FPS. The Intel HD 4000 managed 102.68 FPS. Haswell did not regress between that time and Ubuntu 14.04 on Linux 3.13; Ivy Bridge received a 71.63% increase over the same period.
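The percentage quoted above is just a standard percent-change calculation over the two snapshots:

```python
# Percent change between the October results and the Ubuntu 14.04 / Linux 3.13
# numbers quoted above (Xonotic v0.7, 1080p, low quality).
def pct_change(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps * 100

print(round(pct_change(102.68, 176.23), 2))  # HD 4000 (Ivy Bridge): 71.63
print(round(pct_change(123.84, 124.45), 2))  # HD 4600 (Haswell): 0.49
```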

Of course, there could have been a performance increase between October and now and that recently regressed for Haswell... but I could not find those benchmarks. All I can see is that Haswell has been quite steady since October. Either way, that is a significant performance increase on Ivy Bridge since that snapshot in time, even if Haswell had a rise-and-fall that I was unaware of.

Source: Phoronix