Power Consumption Concerns on the Radeon RX 480

Manufacturer: AMD

Too much power to the people?

UPDATE (7/1/16): I have added a third page to this story that looks at the power consumption and power draw of the ASUS GeForce GTX 960 Strix card. This card was pointed out by many readers on our site and on reddit as having the same problem as the Radeon RX 480. As it turns out...not so much. Check it out!

UPDATE 2 (7/2/16): We have an official statement from AMD this morning.

As you know, we continuously tune our GPUs in order to maximize their performance within their given power envelopes and the speed of the memory interface, which in this case is an unprecedented 8Gbps for GDDR5. Recently, we identified select scenarios where the tuning of some RX 480 boards was not optimal. Fortunately, we can adjust the GPU's tuning via software in order to resolve this issue. We are already testing a driver that implements a fix, and we will provide an update to the community on our progress on Tuesday (July 5, 2016).

Honestly, that doesn't tell us much. And AMD appears to be deflecting slightly by using words like "some RX 480 boards". I don't believe this is limited to a subset of cards, or review samples only. AMD does indicate that the 8 Gbps memory on the 8GB variant might be partially to blame - which is an interesting correlation to test out later. The company does promise a fix for the problem via a driver update on Tuesday - we'll be sure to give that a test and see what changes are measured in both performance and in power consumption.

The launch of the AMD Radeon RX 480 has generally been considered a success. Our review of the new reference card shows impressive gains in architectural efficiency, improved positioning against NVIDIA’s competing parts in the same price range, and VR-ready gaming performance starting at $199 for the 4GB model. AMD has every right to be proud of the new product and should hold this position alone until the GeForce product line brings a Pascal card down into the same price category.

If you read carefully through my review, some interesting data cropped up around the power consumption and delivery on the new RX 480. Looking at our power consumption numbers, measured directly from the card rather than at the wall, the card was drawing slightly more than its advertised 150 watt TDP. This was measured at 1920x1080 in both Rise of the Tomb Raider and The Witcher 3.

When overclocked, the results were even higher, approaching the 200 watt mark in Rise of the Tomb Raider!

A portion of the review over at Tom’s Hardware produced similar results but detailed the power consumption from the motherboard PCI Express connection versus the power provided by the 6-pin PCIe power cable. There has been a considerable amount of discussion in the community about the amount of power the RX 480 draws through the motherboard, whether it is out of spec and what kind of impact it might have on the stability or life of the PC the RX 480 is installed in.

As it turns out, we have the ability to measure the exact same kind of data, albeit through a different method than Tom’s, and wanted to see if the result we saw broke down in the same way.

Our Testing Methods

This is a complex topic so it makes sense to detail the methodology of our advanced power testing capability up front.

How do we do it? It is simple in theory but surprisingly difficult in practice: we intercept the power being sent through the PCI Express bus as well as the ATX power connectors before they reach the graphics card and directly measure power draw with a 10 kHz DAQ (data acquisition) device. A huge thanks goes to Allyn for getting the setup up and running. We built a PCI Express bridge that is tapped to measure both 12V and 3.3V power, and built some Corsair power cables that measure the 12V coming through those as well.
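The arithmetic behind this setup is straightforward: each rail's instantaneous power is voltage times current, and the card's total draw is the sample-by-sample sum across rails. A minimal Python sketch, assuming synchronized voltage and current samples per rail (the function names here are illustrative, not part of any real DAQ API):

```python
# Hypothetical sketch of the per-rail power math behind the DAQ setup.
# Assumes voltage (V) and current (A) samples captured in lockstep per rail.

def rail_power(voltage_samples, current_samples):
    """Instantaneous power (watts) for one rail: P = V * I at each sample."""
    return [v * i for v, i in zip(voltage_samples, current_samples)]

def total_power(*rails):
    """Sum per-rail power traces sample-by-sample for combined card draw."""
    return [sum(samples) for samples in zip(*rails)]

# Made-up two-sample example: slot 12V, slot 3.3V, and 6-pin 12V rails.
slot_12v = rail_power([12.1, 12.0], [4.0, 4.1])
slot_3v3 = rail_power([3.3, 3.3], [1.2, 1.3])
pcie_6pin = rail_power([12.0, 11.9], [5.5, 5.6])
combined = total_power(slot_12v, slot_3v3, pcie_6pin)
```

The "total" line in the graphs below is exactly this kind of per-sample sum over the individually tapped rails.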

The result is data that looks like this.


What you are looking at here is the power measured from the GTX 1080. From time 0 to about 8 seconds the system is idle; from 8 seconds to about 18 seconds Steam is starting up the title; from 18-26 seconds the game is at the menus; we load the game from 26-39 seconds; and then we play through our benchmark run after that.

There are four lines drawn in the graph, the 12V and 3.3V results are from the PCI Express bus interface, while the one labeled PCIE is from the PCIE power connection from the power supply to the card. We have the ability to measure two power inputs there but because the GTX 1080 only uses a single 8-pin connector, there is only one shown here. Finally, the blue line is labeled total and is simply that: a total of the other measurements to get combined power draw and usage by the graphics card in question.

From this we can see a couple of interesting data points. First, the idle power of the GTX 1080 Founders Edition is only about 7.5 watts. Second, under a gaming load of Rise of the Tomb Raider, the card is pulling about 165-170 watts on average, though there are plenty of intermittent spikes. Keep in mind we are sampling the power at 10,000 samples per second, so this kind of behavior is more or less expected.
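Reading idle versus gaming draw off a trace like this amounts to averaging the power samples over each labeled window of the capture. A small sketch, with made-up numbers standing in for a real trace:

```python
# Hypothetical sketch: mean power over labeled phases of a capture,
# mirroring how idle vs. gaming draw is read off the trace above.

def phase_average(trace, fs, start_s, end_s):
    """Mean power (watts) of a sampled trace between two timestamps."""
    lo, hi = int(start_s * fs), int(end_s * fs)
    return sum(trace[lo:hi]) / (hi - lo)

fs = 10_000  # samples per second, matching the 10 kHz DAQ
# Synthetic trace: ~8 s of idle, then a steady gaming load.
trace = [7.5] * (8 * fs) + [168.0] * (31 * fs)
idle = phase_average(trace, fs, 0, 8)
gaming = phase_average(trace, fs, 8, 39)
```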

Different games and applications impose different loads on the GPU and can cause it to draw drastically different power. Even if a game runs slowly, it may not be drawing maximum power from the card if a certain system on the GPU (memory, shaders, ROPs) is bottlenecking other systems.

One interesting note on our data compared to what Tom’s Hardware presents – we are using a second order low pass filter to smooth out the data to make it more readable and more indicative of how power draw is handled by the components on the PCB. Tom’s story reported “maximum” power draw at 300 watts for the RX 480 and, while that is technically accurate, those figures represent instantaneous power draw. That is interesting data in some circumstances, and may actually indicate other potential issues with excessively noisy power circuitry, but to us, it makes more sense to sample data at a high rate (10 kHz) but to filter it and present it in a more readable way that better meshes with the continuous power delivery capabilities of the system.


Image source: E2E Texas Instruments

An example of instantaneous voltage spikes on power supply phase changes

Some gamers have expressed concern over that “maximum” power draw of 300 watts on the RX 480 that Tom’s Hardware reported. While that power measurement is technically accurate, it doesn’t represent the continuous power draw of the hardware. Instead, that measurement is the result of a high frequency data acquisition system that may take a reading at the exact moment a power phase on the card switches. Any DC switching power supply riding close to a certain power level is going to exceed it on the leading edges of phase switches for some minute amount of time. This is another reason why our low pass filter on power data helps represent real-world power consumption accurately. That doesn’t mean the spikes they measure are not a potential cause for concern; that’s just not what we are focused on with our testing.

Continue reading our analysis of the power consumption concerns surrounding the Radeon RX 480!

Setting up the Specification

Understanding complex specifications like PCI Express can be difficult, even for those of us working on hardware evaluation every day. Doing some digging, we were able to find a table that breaks things down for us.


We are dealing with high power PCI Express devices so we are only directly concerned with the far right column of data. For a rated 75 watt PCI Express slot, power consumption and current draw is broken down into two categories: +12V and +3.3V. The +3.3V line has a voltage tolerance of +/- 9% (3.003V – 3.597V) and a 3A maximum current draw. Taking the voltage at the nominal 3.3V level, that results in a maximum power draw of 9.9 watts.

The +12V rail has a tolerance of +/- 8% (11.04V – 12.96V) and a maximum current draw of 5.5A, resulting in a peak +12V power draw of 66 watts. The total for both +12V and +3.3V rails is 75.9 watts but, per footnote 4 at the bottom of the table, the total should never exceed 75 watts, and neither rail may exceed its maximum current draw.
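The slot budget described above can be captured in a few lines. This is a sketch of the spec's arithmetic, not any official tool; the names are ours:

```python
# Sketch of the PCIe slot power budget for a high-power (75 W) slot,
# per the table described above. Limits are spec values at nominal voltage.
RAIL_LIMITS = {
    "+3.3V": {"voltage": 3.3, "max_current": 3.0},   # 9.9 W nominal
    "+12V":  {"voltage": 12.0, "max_current": 5.5},  # 66 W nominal
}
SLOT_TOTAL_LIMIT = 75.0  # watts; footnote 4 caps the combined draw here

def rail_max_power(rail):
    """Maximum power (watts) for a rail at nominal voltage: V * I_max."""
    limits = RAIL_LIMITS[rail]
    return limits["voltage"] * limits["max_current"]

def within_slot_spec(draw_3v3, draw_12v):
    """True if each rail is under its cap AND the 75 W slot total holds."""
    return (draw_3v3 <= rail_max_power("+3.3V")
            and draw_12v <= rail_max_power("+12V")
            and draw_3v3 + draw_12v <= SLOT_TOTAL_LIMIT)
```

So a card drawing, say, 5 W on +3.3V and 60 W on +12V is in spec, while the roughly 80 W slot draw measured later in this article fails the +12V check on its own.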

Diving into the current

Let’s take a look at the data generated through our power testing and step through the information, piece by piece, so we can all understand what is going on. The graphs built by LabVIEW SignalExpress have a habit of switching around the colors of data points, so pay attention to the keys for each image.


Rise of the Tomb Raider (1080p) power draw, RX 480, Click to Enlarge

This graph shows Rise of the Tomb Raider running at 1080p. The yellow line up top is the total combined power consumption (in watts) calculated by adding up the power (12V and 3.3V) from the motherboard PCIe slot and the 6-pin PCIe power cable (12V). The line is hovering right at 150 watts, though we definitely see some spiking above that to 160 watts with an odd hit above 165 watts.

There is a nearly even split between the power draw of the 6-pin power connector and the motherboard PCIe connection. The blue line shows slightly higher power draw of the PCIe power cable (which is forgivable, as PSU 6-pin and 8-pin supplies are generally over-built) while the white line is the wattage drawn from the motherboard directly.

Below that is the red line for 3.3V power (only around 4-5 watts generally) and the green line (unused here; it only comes into play when a GPU has two 6/8-pin power connections).


Rise of the Tomb Raider (1080p) power draw, RX 480, Click to Enlarge

In this shot, we are using the same data but zooming in on a section towards the beginning. It is easier to see our power consumption results, with the highest spike on total power nearly reaching the 170-watt mark. Keep in mind this is NOT with any kind of overclocking applied – everything is running at stock here. The blue line hits 85 watts and the white line (motherboard power) hits nearly 80 watts. PCI Express specifications state that the +12V power output through a motherboard connection shouldn’t exceed 66 watts (actually it is based on current, more on that later). Clearly, the RX 480 is beyond the edge of these limits, but not to a degree where we would be concerned.


The Witcher 3 (1080p) power draw, RX 480, Click to Enlarge

The second game I tested before the controversy blew up was The Witcher 3, and in my testing this was a bigger draw on power than Rise of the Tomb Raider. When playing the game at 1080p it was averaging 155+ watts towards the end of the benchmark run and spiking to nearly 165 watts in a couple of instances.


The Witcher 3 (1080p) power draw, RX 480, Click to Enlarge

Zooming in a bit on the data we get more detail on the individual power draw from the motherboard and the PCIe 6-pin cable. The white line of the MB +12V power is going over 75 watts, but not dramatically so, while the +3.3V power is hovering just under 5 watts, for a total of ~80 watts. Power over the 6-pin connector goes above 80 watts here as well.

July 3, 2016 | 02:10 PM - Posted by Anonymous (not verified)

ITS GONNA POP

July 3, 2016 | 04:21 PM - Posted by WithMyGoodEyeClosed (not verified)

Allyn, Ryan, Josh, Jeremy... team.

How much is the maximum power draw of the fan of the reference card?

Would it be possible to make a bypass/extension from the fan power cord to connect to one of the motherboard's PWM fan connectors instead of the card's PCB one?

If a motherboard BIOS update were also applied to read the temps from the GPU's diode and control this motherboard PWM connector, wouldn't this be a way to avoid an eventual downgrade of the specs via a driver fix? Or a complementary/optional aid to it?

At least, it would be a non-expensive "hardware" solution.

Could you guys please check the power draw of the stock fan?

Just a thought.

We all need AMD.

From Portugal, with respect.

July 3, 2016 | 04:28 PM - Posted by JEREMY O (not verified)

Well, that does it. I'll wait for the Sapphire Nitro custom board. I've got a feeling the only way to fix this in software is to throttle the card's clock speed and/or voltage in the high power state, reducing performance and thereby consumption on reference cards.

This is my educated guess of what amd will do. I do not claim to be an engineer. I'm a math guy..

Thank for explaining Mr. John Pombrio.

Sapphire is making a custom PCB with an eight pin.. I think they will solder the traces right..

July 3, 2016 | 05:19 PM - Posted by WithMyGoodEyeClosed (not verified)

In-depth analysis of 1070/1080 PCBs and RX480 PCB.

FE 10[78]0: https://www.youtube.com/watch?v=OsWJLKlDFCQ

Ref. RX480: https://www.youtube.com/watch?v=qG2e-v94L4M

Nothing to hide.

Very good channel btw.

July 3, 2016 | 09:17 PM - Posted by Anonymous (not verified)

Outdated, watch his twitch stream where he actually verifies what is attached to what.

The 6-pin is out of spec, missing the sense pin. It's actually electrically set up to allow 8-pin power (3 12V and 3 ground).

3 VRM phases are attached to the 6-pin and 3 VRM phases are attached to the PCI-E bus, totally isolated from each other. Even if they lower the power usage on the PCI-E bus so it's not overloading the PCI-E bus contacts, there is plenty of headroom elsewhere: ATX12V 2.2 and newer power supplies and motherboards (the spec dates to around 2006) updated the ATX pins to 9A each from 6A, and 20+4 ATX motherboard power has 2 12V lines (the +4 adds one 12V line, done specifically for GPUs). 2 x 12 x 9 is 216W, a lot more than the 66W allowed for the 5 12V contacts (1.1A each) on the PCI-E bus.

Anyway, back on topic: even if they lower the PCI-E bus power so it does not overload the contacts and then pull 150W from the 6-pin, none of that extra power can go to whatever is attached to the 3 VRM phases tied to the PCI-E bus. They are isolated from each other.

July 3, 2016 | 10:22 PM - Posted by WithMyGoodEyeClosed (not verified)

Yes I see.

Removing the strain from the fan would only alleviate the draw from the PSU.

On this video stream:

https://www.youtube.com/watch?v=E_E2eqtm4Yw

... he explains how the 6-pin is in fact functioning like an 8-pin and why AMD did it like that (a smart guess).

At 13:04 he finds the solution for revision B. I think.

This card is a real beast... with too much "character".

Lisa Su should get this guy "on-board". Or at least offer him a new Fury-X :-)

July 3, 2016 | 07:30 PM - Posted by StephanS

Sabotage or incompetence?
Any way you look at it, someone at AMD let this fiasco happen, and ultimately Raja Koduri should be fired for letting something so easy to catch go unnoticed. AMD worked 5 years to rebuild their image, and this was their one chance to do it... that opportunity is now gone forever. AMD is even more of a joke to the tech community... the reference Polaris board is botched so badly it makes you wonder if anyone at AMD knows what they are doing.

(and 'Wattman'.. lame name and execution... shows AMD's lack of expertise in software design)

BTW, didn't AMD hire a known GPU tester from TechReport not long ago? How come the board was NOT tested by AMD in the past 6 months anticipating the reviews we have... no equipment, or not enough samples to do a simple PCIe voltage test? But TR and Tom's Hardware got the money and enough RX 480s to do it, but not AMD? This situation is borderline criminal on AMD's part...
Seems like a dozen+ people at AMD should be held accountable for this.

From the deep analysis of the reference design, it seems many things went wrong with the RX 480, from the power distribution to the too-limited power delivery.

Now, it almost seems that AMD was expecting something different from GF, but it never materialized and in a panic they had to over-volt Polaris to its limit.
Or AMD was greedy and wanted to sell a $150 card for $200 by overclocking it past its design limits.
It's 1) or 2).. both show AMD management to be incompetent.

I can only guess that AMD's reputation is so horrible that no good engineer wants to work there?

It seems AMD is having the same problem it had 5 years ago.

The result was a drop in sales... and in turn a drop in silicon orders, and in turn GlobalFoundries penalizing AMD with billion-dollar penalty fees.

So AMD is once again facing the same tune... first in line, and will pay fines that will benefit the other GF customers.

It almost seems like it's an "inside job", or utmost incompetence on AMD management's side.

Yet AMD is giving away millions in shares and bonuses to upper management for a "good job"...

From the outside looking in, it seems like Zen is going to flop, really badly, and Vega will fall short by about 30% compared to Pascal, forcing AMD to sell GPUs at razor-thin margins while GlobalFoundries collects its usual fat margin on raw silicon.

July 3, 2016 | 08:32 PM - Posted by Anonymous (not verified)

Makes you wonder if the rumors of Raja Koduri wanting to go to Intel is true and the process needed speeding up.

July 5, 2016 | 09:31 PM - Posted by pdjblum

Can you please provide a link to the source of this rumor?

July 5, 2016 | 06:15 PM - Posted by Dean (not verified)

"BTW, didn't AMD hired a known GPU tester not long ago from TechReport ? How come the board was NOT tested by AMD in the past 6 month anticipating the reviews we have... no equipment, or enough sample to do simple PCIe voltage test ? But TR and TomsHardware got the money and enought RX 480 to do it. but not AMD ? This situation is borderline criminal on AMD part...
Seem like a dozen + people at AMD should be held accountable for this."

I'm erring towards agreeing with much of this. AMD cocked this one up badly.

July 4, 2016 | 03:27 AM - Posted by Anonymous (not verified)

I picked up my RX480 from Akihabara on Saturday and have already gamed the crap out of it. Zero problems so far, even with GTA5.
I'd say if you have a decent motherboard and a decent power supply (coolermaster etc), you are not going to have any problems.

July 4, 2016 | 05:50 AM - Posted by Anonymous (not verified)

... immediately.

July 4, 2016 | 06:03 AM - Posted by Anonymous (not verified)

Yep. Give it anywhere between 3-6 months depending on motherboard quality.

July 4, 2016 | 06:32 AM - Posted by tuklap

Can't wait for AMD's new driver fix release for this.. I think third party partners of AMD will not have the same problems, and I doubt that AMD engineers did not see this coming. Specifications are mere baselines, not limits, as these specifications tend to have higher offsets than what is stated. That is what you call engineering back there; specification is all about being on the conservative side of the design, but you are not held by the specification, it is just your baseline, and you can go further than that at your own cost. Either way, AMD should fix this. And we must also point out that this is the first review of a card on this level, I guess? I was laughing when Ryan kept on saying "Nvidia, nvidia, nvidia" during their video with Allyn. Allyn was like WTF Ryan?! ahahaha!

July 4, 2016 | 09:29 AM - Posted by Anonymous Nvidia User (not verified)

Maybe AMD knew the problem existed and didn't care. Why was the 8 gig version released first? AMD knew it would get greater performance and thus reference benches would be set by it. Bigger numbers mean bigger sales.

If they couldn't even beat the last-gen 970 convincingly, it would have been merely an OK card. They wanted more than merely fanboy sales from those needing an upgrade.

Partner cards are probably going to fix the "current" problems, but cost may put them out of reach for some, at closer to $300 than the $200 price.

This is speculation on my part. AMD also may have had the fix ready to go if they managed to con everyone. Everyone would innocuously lose performance with next driver update. AMD may have given some lame excuse as to why performance went down a few % or not at all. Problem would have never been found. AMD would have gotten away with it if it wasn't for those damn kids at PC Per and Tom's and other sites that measured same way. Sorry couldn't resist a Scooby Doo reference.

July 4, 2016 | 10:00 AM - Posted by Anonymous (not verified)

What drivers were used when testing the power of the GTX 960? The Nvidia 347.25 beta driver is what Tom's used, a driver from over a year ago; more than enough time for the power to be worked out. AMD says they are fixing this with a driver in just a few days. Are you using a driver 1.5 years newer than Tom's?

July 4, 2016 | 01:19 PM - Posted by Anonymous Nvidia User (not verified)

Tom's thought of everything and used an appropriate driver. The 960 Strix came out in January of 2015, so what difference does it make if PC Per used newer drivers? Both found that there was NO issue.

July 4, 2016 | 03:34 PM - Posted by Anonymous Nvidia User (not verified)

Maximum draw was measured at 147 watts under torture. Correct me if I'm wrong, but how does 166-200 watts of RX 480 power draw over the 6-pin (75 watts max) + PCI Express slot (75 watts max) even compare? The 960 Strix isn't even above maximum spec. Normal TDP is 120 watts. With overclocking you can add at most 20%, which would put this at 144 watts. Even if the Strix briefly spikes over 66 watts on the PCI Express slot it won't do damage, as the average is below spec. Sustained draw is where the RX 480 is at. That causes damage over time.

https://www.techpowerup.com/reviews/ASUS/GTX_960_STRIX_OC/27.html

July 4, 2016 | 09:43 PM - Posted by Anonymous (not verified)

I have read this article with a great deal of interest.

Many of the people posting are correct, stating it is not an intermittent surge that will cause damage - it is what you do consistently. However, my gaming son will only leave his PC for food and drink.

Most people using this card will recognise that fitting it into their old system is like putting a Ferrari engine in a truck. It will move - but be severely limited. They are therefore likely to update old hardware at the same time.

When measuring signals like this, you have to be very careful about noise. Connecting your reference point to the wrong connection can give the appearance of surges - for example the very high spikes seen on the oscilloscope graph.

Any software fix will only have one effect - slowing the card down. The only viable long-term solution is to wait for the next version of the card to be released.

I have an AMD card in my main system and an Nvidia card in my 'older' system. Both are excellent for what I use and I owe no allegiance to either manufacturer.

I will have to enjoy this card - later.

July 5, 2016 | 12:32 AM - Posted by anonymous (not verified)

I suspect that AMD tried to get the card to do more than it was spec'd to do late in the game. It is remarkably cheaply made. You could even say it is great engineering: the least cost for the most bang at a $230 price point (8GB). Turn the card back down to the design spec, where it is not the equal of a GTX 970, and all will be good, but the card becomes less compelling. At $260 the card isn't quite as exciting, even though it is still a nice chip. The 4GB card at $199 is a winning 1080p card, but AMD marketing apparently wanted more.

July 5, 2016 | 10:35 AM - Posted by Nicholas Kane (not verified)

Got the Sapphire on Friday, complete system shutdown 3 times playing House of the Dying Sun.
Maybe 5 minutes of actual gameplay.

http://pcpartpicker.com/list/bnq8KZ

July 5, 2016 | 07:50 PM - Posted by Vlad (not verified)

Same here, system shutdowns after just a few minutes of gaming. This is on an ASRock FM2A75 Pro4 mobo.

Returning it while I still can. I'd rather pay another $50-100 for a GTX 1060 that doesn't risk deteriorating my board and has better performance.

July 5, 2016 | 06:09 PM - Posted by Dean (not verified)

I'm not an electrical engineer, but it sounds like the 480 reference card is underpowered (AMD's fault).

July 5, 2016 | 07:16 PM - Posted by schulmaster

So who's surprised AMD failed to deliver their magical perf maintaining power consumption tweak driver on the Tuesday they promised?

July 5, 2016 | 10:03 PM - Posted by pdjblum

unofficial fix:

http://www.overclock.net/t/1604979/a-temporary-fix-for-the-excess-pci-e-...

So it turns out the power drawn through the 6-pin from the PSU and the power drawn from the PCI Express slot can be reallocated and does not have to be split equally between the two, as some YouTube videos suggest.

Great news if you don't feel the need to trash AMD and just want more competition and better and better GPUs, regardless of what fucking company is making them.

Yes, it puts more stress on the 6-pin, but that connection has plenty of headroom according to this and many other sites. In addition, non-reference cards are sure to have 8-pin connectors if needed.

July 5, 2016 | 11:11 PM - Posted by John Pombrio

Just going to say the same thing. AMD goofed by leaving the voltage regulator chip set to its default settings, causing the power to come equally from both the PCIe slot and the 6-pin power connector. The solution is to pull more from the power connector (which should handle it without a problem), but this will heat up the VRMs more, as the three on that side will have to handle the extra burden. AMD will probably send out a BIOS update and let users flash the BIOS to fix the issue. A bad PR issue, but luckily it is not the disaster I was afraid it was going to be. DON'T overclock the card and don't run any benchmarks until the fix is in place. Perhaps it is better just to put in your old card for the week or so until AMD gets the proper settings out to the public.

July 6, 2016 | 03:18 PM - Posted by pdjblum

How come all the other sites have announced the driver fix and PC Per has not as yet? Seems odd considering how much effort was put into making sure the world knew about the shit AMD invited onto themselves. It just seems fair to be as quick about announcing the fix AMD has proposed.

Now I get why some people accuse Ryan of favoring Nvidia. I always thought he was fair, but this does not seem so. No doubt PC Per will post something about it soon, but it is already 1400 CST and they knew about it early this AM. They certainly would ridicule AMD if they didn't respond to one of their accusations in short order. How come it doesn't work both ways? Just seems odd, as I said.

July 6, 2016 | 04:02 PM - Posted by Anonymous (not verified)

they have to cash the check from Nvidia first.

July 6, 2016 | 04:23 PM - Posted by pdjblum

Roger that. It is now 1532 CST and still no post. I am starting to believe it.

July 6, 2016 | 07:47 PM - Posted by Jeremy Hellstrom

We literally mentioned that there would be an announcement about the new driver addressing that back on the 30th ... so ya, already covered. 

The only new info is that the name will be Radeon Software 16.7.1 and that ain't really enough for me to post about until it actually launches.
