Author:
Manufacturer: ARM

ARM is Serious About Graphics

Ask most computer users from 10 years ago who ARM is, and very few would have given the correct answer.  Some well-informed people might have mentioned “Intel” along with “StrongARM” or “XScale”, but ARM remained a shadowy presence until the rise of the smartphone.  Since then, ARM has built up its brand, much to the chagrin of companies like Intel and AMD.  Partners such as Samsung, Apple, Qualcomm, MediaTek, Rockchip, and NVIDIA have all worked with ARM to produce chips based on the ARMv7 architecture, with Apple being the first to release ARMv8 (64-bit) SoCs.  The multitude of ARM-based designs are likely the most shipped chips in the world, ranging from very basic processors to the very latest Apple A7 SoC.

t700_01.jpg

The ARMv7 and ARMv8 architectures are very power efficient, yet provide enough performance to handle the vast majority of tasks performed on smartphones and tablets (as well as a handful of laptops).  With the growth of visual computing, ARM has also dedicated itself to designing competent graphics portions of its chips.  The Mali architecture aims to be an affordable option for licensees without graphics design groups of their own (unlike NVIDIA and Qualcomm), while remaining competitive with others willing to license out their IP (Imagination Technologies).

ARM was in fact one of the first to license out its very latest graphics technology to partners, in the form of the Mali-T600 series of products.  These modules were among the first to support OpenGL ES 3.0 (compatible with 2.0 and 1.1) and DirectX 11.  The T600 architecture is very comparable to Imagination Technologies’ Series 6 and Qualcomm’s Adreno 300 series of products.  Currently NVIDIA does not have a unified mobile architecture in production that supports OpenGL ES 3.0/DX11, but it is adapting the Kepler architecture to mobile and will license it to interested parties.  Qualcomm does not license out Adreno, having bought that group from AMD (Adreno is an anagram of Radeon).

Click to read the entire article here!

(Nitroware) AMD Radeon R9 290X Discussion

Subject: General Tech, Graphics Cards | October 28, 2013 - 10:21 PM |
Tagged: R9 290X, amd

Hawaii launches and AMD sells its inventory (all of it, in many cases). The Radeon R9 290X brings performance approaching the GTX Titan to the $550-600 USD price point. Near and dear to our website, AMD also took the opportunity to address many of the CrossFire and Eyefinity frame pacing issues.

amd-gpu14-06.png

Nitroware also took a look at the card... from a distance, because they did not receive a review unit. Dominic's analysis was based on concepts, such as revisions to AMD's design over the life of the Graphics Core Next architecture. The discussion goes back to the ATI Rage series of fixed-function hardware and ends with a comparison between the Radeon HD 7900 "Tahiti" and the R9 290X "Hawaii".

Our international viewers (or even curious North Americans) might also like to check out the work Dominic undertook compiling regional pricing and comparing those values to currency conversion data. There is more to an overview (or review) than benchmarks.

Source: NitroWare

NVIDIA Drops GTX 780, GTX 770 Prices, Announces GTX 780 Ti Price

Subject: Graphics Cards | October 28, 2013 - 09:29 AM |
Tagged: nvidia, kepler, gtx 780 ti, gtx 780, gtx 770, geforce

A lot of news coming from the NVIDIA camp today, including some price drops and price announcements. 

First up, the high-powered GeForce GTX 780 is getting dropped from $649 to $499, a $150 savings that brings the GTX 780 in line with AMD's new Radeon R9 290X, launched last week.

Next, the GeForce GTX 770 2GB is going to drop from $399 to $329 to help it compete more closely with the R9 280X. 

r9290x.JPG

Even if you weren't excited about the R9 290X, you have to be excited by competition.

In a surprising turn of events, NVIDIA is now the company with the great GPU bundle deal as well!  Starting today you'll be able to get a free copy of Batman: Arkham Origins, Splinter Cell: Blacklist, and Assassin's Creed IV: Black Flag with the GeForce GTX 780 Ti, GTX 780, and GTX 770.  If you step down to the GTX 760 or 660 you'll lose out on the Batman title.

SHIELD discounts are available as well: $100 off if you buy one of the upper-tier GPUs and $50 off if you buy a lower-tier card. 

UPDATE: NVIDIA just released a new version of GeForce Experience that enables ShadowPlay, the ability to use Kepler GPUs to record gameplay in the background with almost no CPU/system overhead.  You can see Scott's initial impressions of the software right here; it seems like it's going to be a pretty awesome feature.

bundle.png

Need more news?  The yet-to-be-released GeForce GTX 780 Ti is also getting a price - $699 based on the email we just received.  And it will be available starting November 7th!!

With all of this news, how does it change our stance on the graphics market?  Quite a bit, in fact.  The huge price drop on the GTX 780, coupled with the 3-game bundle, means that NVIDIA is likely offering the better hardware/software combo for gamers this fall.  Yes, the R9 290X is likely still a step faster, but now you can get the GTX 780 and three great games while spending $50 less. 

The GTX 770 is now poised to make a case for itself against the R9 280X as well with its $70 drop.  The R9 280X / HD 7970 GHz Edition was definitely the better option when a $100 price delta separated the cards, but with only $30 between the two competitors, plus the three free games, the advantage will again likely fall to NVIDIA.

Finally, the price point of the GTX 780 Ti is interesting - if NVIDIA is smart they are pricing it based on comparable performance to the R9 290X from AMD.  If that is the case, then we can guess the GTX 780 Ti will be a bit faster than the Hawaii card, while likely being quieter and using less power too.  Oh, and again, the three game bundle. 

NVIDIA did NOT announce a GTX TITAN price drop, which might surprise some people.  I think the answer as to why will be addressed with the launch of the GTX 780 Ti next month, but from what I was hearing over the last couple of weeks, NVIDIA can't make the cards fast enough to satisfy demand, so reducing margin there just didn't make sense. 

NVIDIA has taken a surprisingly aggressive stance here in the discrete GPU market.  The need to address and silence critics who think the GeForce brand is being damaged by the AMD console wins is obviously potent inside the company.  The good news for us, and the gaming community as a whole, is that this just means better products and better value for graphics card purchases this holiday.

NVIDIA says these price drops will be live by tomorrow.  Enjoy!

Source: NVIDIA
Manufacturer: NVIDIA

It impresses.

ShadowPlay is NVIDIA's latest addition to their GeForce Experience platform. This feature allows their GPUs, starting with Kepler, to either record game footage locally or stream it online through Twitch.tv (in a later update). It requires Kepler GPUs because it is accelerated by that hardware. The goal is to constantly record game footage without any noticeable impact on performance; that way, the player can keep it running forever and have the opportunity to save moments after they happen.

Also, it is free.
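
Conceptually, the "save moments after they happen" behavior is a rolling ring buffer of encoded frames: the recorder keeps only the last few minutes in memory and dumps them to disk on demand. The Python sketch below illustrates that general idea only; the frame source and buffer sizing are hypothetical, and this is in no way NVIDIA's actual implementation.

    from collections import deque

    class RollingRecorder:
        """Keep only the most recent few minutes of encoded frames in memory."""
        def __init__(self, seconds=300, fps=60):
            # Old frames fall off the front automatically once the buffer is full.
            self.buffer = deque(maxlen=seconds * fps)

        def push(self, encoded_frame):
            # Called once per frame by a (hypothetical) capture/encode pipeline.
            self.buffer.append(encoded_frame)

        def save_clip(self, path):
            # "Shadow" save: write whatever is currently buffered out to disk.
            with open(path, "wb") as f:
                for frame in self.buffer:
                    f.write(frame)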

shadowplay-vs.jpg

I know that I have several gaming memories which come unannounced and leave undocumented. A solution like this is very exciting to me. Of course, a feature on paper is not the same as functional software in the real world. Thankfully, at least in my limited usage, ShadowPlay mostly lives up to its claims. I do not feel its impact on gaming performance, and I am comfortable leaving it on at all times. There are issues, however, that I will get to soon.

This first impression is based on my main system running the 331.65 (Beta) GeForce drivers recommended for ShadowPlay.

  • Intel Core i7-3770, 3.4 GHz
  • NVIDIA GeForce GTX 670
  • 16 GB DDR3 RAM
  • Windows 7 Professional
  • 1920 x 1080 @ 120Hz
  • 3 TB USB3.0 HDD (~50MB/s file clone)

The two games tested are Starcraft II: Heart of the Swarm and Battlefield 3.

Read on to see my thoughts on ShadowPlay, the new Experience on the block.

Fall of a Titan, check out the R9 290X

Subject: Graphics Cards | October 24, 2013 - 02:38 PM |
Tagged: radeon, R9 290X, kepler, hawaii, amd

If you didn't stay up to watch our live coverage of the R9 290X release after the podcast last night, you missed a chance to have your questions answered, but you will be able to watch the recording later on.  The R9 290X arrived today, bringing 4K and CrossFire reviews as well as single-GPU testing on many a site, including PCPer of course.  You don't just have to take our word for it; [H]ard|OCP was also putting together a review of AMD's Titan killer.  Their benchmarks included some games we haven't adopted yet, such as ARMA III.  Check out their results and compare them to ours; AMD really has a winner here.

1382088059a47QS23bNQ_4_8_l.jpg

"AMD is launching the Radeon R9 290X today. The R9 290X represents AMD's fastest single-GPU video card ever produced. It is priced to be less expensive than the GeForce GTX 780, but packs a punch on the level of GTX TITAN. We look at performance, the two BIOS mode options, and even some 4K gaming."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP
Author:
Manufacturer: AMD

A bit of a surprise

Okay, let's cut to the chase here: it's late, we are rushing to get our articles out, and I think you all would rather see our testing results NOW rather than LATER.  The first thing you should do is read my review of the AMD Radeon R9 290X 4GB Hawaii graphics card, which goes over the new architecture, new feature set, and performance in single card configurations. 

Then, you should continue reading below to find out how the new XDMA, bridge-less CrossFire implementation actually works in both single panel and 4K (tiled) configurations.

IMG_1802.JPG

 

A New CrossFire For a New Generation

CrossFire has caused a lot of problems for AMD in recent months (and a lot of problems for me as well).  But AMD continues to make strides in correcting the frame pacing issues associated with CrossFire configurations, and the new R9 290X moves the bar forward.

Without the CrossFire bridge connector on the 290X, all of the CrossFire communication and data transfer occurs over the PCI Express bus that connects the cards to the rest of the system.  AMD claims that this new XDMA interface was designed with Eyefinity and UltraHD resolutions in mind (the focus of our most recent article on the subject).  By accessing the GPU's memory directly over PCIe, AMD claims it can alleviate the bandwidth and sync issues that were causing problems with Eyefinity and tiled 4K displays.
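
Some rough arithmetic (our own back-of-the-envelope numbers, not figures supplied by AMD) shows why the old external bridge struggles at these resolutions while PCI Express has headroom to spare:

    # Back-of-the-envelope: bandwidth needed to pass finished frames at 4K
    width, height, bytes_per_pixel = 3840, 2160, 4      # 32-bit color
    frame_mb = width * height * bytes_per_pixel / 1e6   # ~33 MB per frame
    fps = 60
    needed_gbs = frame_mb * fps / 1e3                   # ~2.0 GB/s

    # The legacy CrossFire bridge carries well under 1 GB/s, while a PCIe 3.0
    # x16 link offers roughly 15-16 GB/s, so moving frame transfers onto the
    # bus leaves plenty of headroom (interface figures are published specs,
    # not AMD-supplied numbers).
    print(f"{frame_mb:.0f} MB/frame -> {needed_gbs:.1f} GB/s at {fps} fps")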

Even better, this updated version of CrossFire is said to be compatible with the frame pacing updates to the Catalyst driver that improve the multi-GPU experience for end users.

IMG_1800.JPG

When an extra R9 290X accidentally fell into my lap, I decided to take it for a spin.  And if you have followed my graphics testing methodology over the past year, then you'll understand the importance of these tests.

Continue reading our article Frame Rating: AMD Radeon R9 290X CrossFire and 4K Preview Testing!!

Author:
Manufacturer: AMD

A slightly new architecture

Note: We also tested the new AMD Radeon R9 290X in CrossFire and at 4K resolutions; check out that full Frame Rating story right here!!

Last month AMD brought media, analysts, and customers out to Hawaii to talk about a new graphics chip coming out this year.  As you might have guessed based on the location, the code name for this GPU was, in fact, Hawaii.  It was targeted at the high end of the discrete graphics market to take on the likes of the GTX 780 and GTX TITAN from NVIDIA. 

Earlier this month we reviewed the AMD Radeon R9 280X, R9 270X, and R7 260X. None of these were based on that new GPU.  Instead, these cards were all rebrands and repositionings of existing hardware in the market (albeit at reduced prices).  Those lower prices made the R9 280X one of our favorite GPUs of the moment, as it offers a price-to-performance ratio currently unmatched by NVIDIA.

But today is a little different; today we are talking about a much more expensive product that has to live up to some pretty lofty goals and ambitions set forth by the AMD PR and marketing machine.  At $549 MSRP, the new AMD Radeon R9 290X will become the flagship of the Radeon brand.  The question is: to where does that ship sail?

 

The AMD Hawaii Architecture

To be quite upfront about it, the Hawaii design is very similar to that of the Tahiti GPU found in the Radeon HD 7970 and R9 280X cards.  Based on the same GCN (Graphics Core Next) architecture AMD assured us would be its long-term vision, Hawaii ups the ante in a few key areas while maintaining the same core.

01.jpg

Hawaii is built around Shader Engines, of which the R9 290X has four.  Each of these includes 11 CUs (compute units), which in turn hold four SIMD arrays each.  Doing the quick math (each GCN SIMD is 16 lanes wide) brings us to a total stream processor count of 2,816 on the R9 290X. 
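
For reference, the quick math works out like this; the 16-lane SIMD width is the standard GCN figure and is our own assumption rather than something spelled out on AMD's slide:

    # R9 290X stream processor count from the Hawaii layout described above
    shader_engines = 4
    cus_per_engine = 11
    simds_per_cu = 4
    lanes_per_simd = 16   # standard GCN SIMD width (assumption)

    print(shader_engines * cus_per_engine * simds_per_cu * lanes_per_simd)  # 2816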

Continue reading our review of the AMD Radeon R9 290X 4GB Graphics Card!!

AMD Aggressively Targets Professional GPU Market

Subject: General Tech, Graphics Cards | October 23, 2013 - 07:30 PM |
Tagged: amd, firepro

Currently AMD holds 18% market share with its FirePro line of professional GPUs. This compares to NVIDIA, which owns 81% with Quadro. I assume the "other" category is the sum of S3 and Matrox, who together command the remaining 1% of the professional market (just the professional market).

According to Jon Peddie of JPR, as reported by X-Bit Labs, AMD intends to wrestle back revenue it has left unguarded for NVIDIA. "After years of neglect, AMD’s workstation group, under the tutorage of Matt Skyner, has the backing and commitment of top management and AMD intends to push into the market aggressively." They have already gained share this year.

W600-card.jpg

During AMD's 3rd Quarter (2013) earnings call, CEO Rory Read outlined the importance of the professional graphics market.

We also continue to make steady progress in another of growth businesses in the third quarter as we delivered our fifth consecutive quarter of revenue and share growth in the professional graphics area. We believe that we can continue to gain share in this lucrative part of the GPU market based on our product portfolio, design wins in flight, and enhanced channel programs.

On the same conference call (actually both before and after the professional graphics sound bite), Rory noted the company's renewed push into the server and embedded SoC markets with 64-bit x86 and 64-bit ARM processors. They will be the only company manufacturing both x86 and ARM solutions, which should be an interesting proposition for an enterprise in need of both. Why deal with two vendors?

Either way, AMD will probably be refocusing on the professional and enterprise markets for the near future. For the rest of us, this hopefully means that AMD has a stable (and confident) roadmap in the processor and gaming markets. If that is the case, a profitable Q3 is definitely a good start.

Taking the ASUS R9 280X DirectCU II TOP as far as it can go

Subject: Graphics Cards | October 23, 2013 - 06:20 PM |
Tagged: amd, overclocking, asus, ASUS R9 280X DirectCU II TOP, r9 280x

Having already seen what the ASUS R9 280X DirectCU II TOP can do at default speeds, the obvious next step, once they had time to fully explore the options, was for [H]ard|OCP to see just how far this GPU can overclock.  To make a long story short, they went from a default clock of 1070MHz up to 1230MHz and pushed the RAM from 6.4GHz to 6.6GHz, though the voltage needed to be bumped from 1.2v to 1.3v.  The actual frequencies are nowhere near as important as the effect on gameplay, though; to see those results you will have to click through to the full article.
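
For a sense of scale, here is that overclock expressed as percentages (our own arithmetic, using only the clocks quoted above):

    # Headroom found on the ASUS R9 280X DirectCU II TOP
    core_stock, core_oc = 1070, 1230   # MHz
    mem_stock, mem_oc = 6.4, 6.6       # GHz effective

    print(f"Core:   +{(core_oc - core_stock) / core_stock * 100:.1f}%")   # ~15.0%
    print(f"Memory: +{(mem_oc - mem_stock) / mem_stock * 100:.1f}%")      # ~3.1%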

1381749747uEpep2oCxY_1_2.gif

"We take the new ASUS R9 280X DirectCU II TOP video card and find out how high it will overclock with GPU Tweak and voltage modification. We will compare performance to an overclocked GeForce GTX 770 and find out which card comes out on top when pushed to its overclocking limits."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

NVIDIA Launches WHQL Drivers for Battlefield and Batman

Subject: General Tech, Graphics Cards | October 23, 2013 - 12:21 AM |
Tagged: nvidia, graphics drivers, geforce

Mid-June kicked up a storm of poop across the internet when IGN broke the story of AMD's optimizations for Frostbite 3. It was reported that NVIDIA would not receive sample code for those games until after they launched. The article was later updated with a statement from AMD: "... the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release."

Now, I assume, the confusion was caused by the then-unannounced Mantle.

nvidia-geforce.png

And, as it turns out, NVIDIA did receive the code for Battlefield 4 prior to launch. On Monday, the company released their 331.58 WHQL-certified drivers, which are optimized for Batman: Arkham Origins and Battlefield 4. According to the release notes, you should even be able to use SLI out of the gate. If, on the other hand, you are a Civilization V player, HBAO+ should enhance your shadowing.

They also added a DX11 SLI profile for Watch Dogs... awkwarrrrrd.

Check out the blog at GeForce.com for a bit more information, check out the release notes, or just head over to the drivers page. If you have GeForce Experience installed, it probably already asked you to update.

Source: NVIDIA