GPU Market Share: NVIDIA Gains in Shrinking Add-in Board Market

Subject: Graphics Cards | August 21, 2015 - 11:30 AM |
Tagged: PC, nvidia, Matrox, jpr, graphics cards, gpu market share, desktop market share, amd, AIB, add in board

While we reported recently on the decline of overall GPU shipments, a new report from Jon Peddie Research (JPR) covers the add-in board (AIB) segment, giving us a look at the desktop graphics card market. So how are the big two (sorry, Matrox) doing?

Market share by GPU supplier:

GPU Supplier    This Quarter    Last Quarter    Last Year
AMD             18.0%           22.5%           37.9%
Matrox          0.0%            0.1%            0.1%
NVIDIA          81.9%           77.4%           62.0%

The big news is, of course, AMD's drop of 4.5 percentage points quarter-over-quarter, down to just 18.0% from 37.9% a year ago. There will be many opinions as to why their share has been falling over the last year, but it certainly didn't help that the 300-series GPUs are rebrands of the 200-series, and that the new Fury cards have had very limited availability so far.


The graph from Mercury Research illustrates an almost perfect mirror image: NVIDIA gained roughly 20 points as AMD lost roughly 20, widening the gap between them by about 40 points. Ouch. Meanwhile (not pictured), Matrox didn't have a statistically meaningful quarter, but still managed to appear on the JPR report with 0.1% market share (somehow) last quarter.
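For those who want to check the arithmetic, here is a small Python sketch (a back-of-the-envelope illustration, not part of the JPR report) computing the quarter-over-quarter and year-over-year changes from the table above:

```python
# Market share figures from the JPR table above, in percent.
share = {
    "AMD":    {"this_q": 18.0, "last_q": 22.5, "last_y": 37.9},
    "Matrox": {"this_q": 0.0,  "last_q": 0.1,  "last_y": 0.1},
    "NVIDIA": {"this_q": 81.9, "last_q": 77.4, "last_y": 62.0},
}

for vendor, s in share.items():
    qoq = s["this_q"] - s["last_q"]  # change in percentage points, QoQ
    yoy = s["this_q"] - s["last_y"]  # change in percentage points, YoY
    print(f"{vendor}: {qoq:+.1f} pts QoQ, {yoy:+.1f} pts YoY")

# The gap between NVIDIA and AMD widened from 62.0 - 37.9 = 24.1 points
# a year ago to 81.9 - 18.0 = 63.9 points now: a swing of roughly 40 points.
gap_swing = (81.9 - 18.0) - (62.0 - 37.9)
print(f"NVIDIA-AMD gap swing, year-over-year: {gap_swing:.1f} points")
```

Note that these are changes in percentage points of share, not percentage changes in shipments, which is why they differ from the shipment-decline figures discussed elsewhere in the report.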

The desktop graphics market isn't actually suffering quite as much as the overall PC market, particularly in the enthusiast segment.

"The AIB market has benefited from the enthusiast segment PC growth, which has been partially fueled by recent introductions of exciting new powerful (GPUs). The demand for high-end PCs and associated hardware from the enthusiast and overclocking segments has bucked the downward trend and given AIB vendors a needed prospect to offset declining sales in the mainstream consumer space."

But not all is well: the overall add-in board attach rate with desktops "has declined from a high of 63% in Q1 2008 to 37% this quarter". This is indicative of the industry-wide trend toward integrated GPUs, with AMD APUs and Intel processor graphics, as illustrated by this graphic from the report.


The year-to-year numbers show an overall drop of 18.8% in shipments, and even with its dominant 81.9% market share, NVIDIA still saw its shipments decrease by 12% this quarter. These trends suggest a gloomy future for discrete graphics in the coming years, but for now we in the enthusiast community will continue to keep it afloat. It would certainly be nice to see some gains from AMD soon to keep things interesting, which might help bring prices down from their current lofty $400 - $600 mark for flagship cards.



August 21, 2015 | 01:01 PM - Posted by Anonymous (not verified)

It's going to take more than just the enthusiast community; add to that the professional community and the computing community. Let's hope that Uncle Sam keeps those exascale computing dollars flowing. AMD needs to get its Workstation/HPC APUs on an interposer and land some government design wins, because Nvidia is not standing still. I hope that more money can be secured from government investment in AMD's, IBM's, and Nvidia's efforts at getting GPUs into more systems for GPGPU. AMD should find one of those OpenPower licensees and get them to commission its GPUs as accelerators for their Power8-based systems, at a more affordable price point than Nvidia offers. The custom market for gaming consoles, along with a custom market for Server/HPC GPU accelerators, will get AMD through the market downturn.

AMD, if you can take your semi-custom and integration business into the custom Power8 server market for Google's and Facebook's GPU accelerator needs, they will fund the integration of your GPU IP into their custom Power8-based server products; you just have to give them a better price than Nvidia will. It's not hard to integrate GPU technology into server parts, and Nvidia's NVLink is just another technology derived from IBM's CAPI (Coherent Accelerator Processor Interface), which can be licensed from OpenPower, or AMD can develop its own. Get the Googles and the Facebooks and get them to fund the integration of your GPU IP with Power8; it's not a hard thing to sell, and you have a great custom and semi-custom team at your disposal. Your x86-based Zen Server/HPC/workstation APUs will get wins, but the market is not all about x86; it's about Power8 (from third-party Power8 licensees, maybe Google and Facebook) and ARM-based servers too.

The consumer market is going to become even more difficult, not helped by the terrible options going forward for PC/laptop OSs from one big monopolistic entity. Hopefully Steam OS will be there for those that want Vulkan's close-to-the-metal graphics API, without all that up-in-your-personal-business spying and bloatware of the Windows 10 cloudware catastrophe. I'm definitely waiting for the Steam Boxes now, to see their benchmarks with the Vulkan API when they arrive. I'm not about to let Windows 10 anywhere near my hardware, and they can keep DX12; AMD's GPUs will be even better on Vulkan (Mantle with a different name) than they are on DX12, and I can install Vulkan on 7. This is going to be the year of the holdouts, and most are going to wait and see what Vulkan will bring once it is released as an option.

August 21, 2015 | 05:25 PM - Posted by Anonymous (not verified)

There have been some reports that AMD may be making an HPC APU on an interposer, which would allow them to use multiple CPU and/or GPU dies. Power may be an option for them, but I doubt they would spend the resources. They could have an excellent HPC product with Zen CPU cores combined with their GPU tech and HBM, though. With silicon interposer technology, they can tightly couple CPU cores and GPU-style compute engines, even if they are spread across multiple chips.

August 21, 2015 | 06:12 PM - Posted by renz (not verified)

AMD has always been cheaper than nvidia, but that has hardly helped them toward adoption. Hardware is one thing, but without good software to pair with it, your solution isn't good enough. And in professional work, what's most important is that you can get the job done. Go check the Top 500 list of supercomputers and see how many of them use AMD FirePro.

August 21, 2015 | 07:29 PM - Posted by Anonymous (not verified)

That AMD GPU accelerator deficit in the marketplace could change, and it only takes one semi-custom contract for a Power8, x86, or ARM-based custom server from a company with the money to pay AMD to design the part, and AMD is back in the game. That AMD HPC APU is partially funded by government research grants, too. Old Uncle Sam is a big purchaser of supercomputer kit, and even the government will save giga-bucks by having a second source for GPU accelerator parts, and I'll bet that the research universities and governments would prefer that the contractor use OpenCL and other open source code on those Linux-based supercomputers! The Navy uses all open source code on its Linux-based warships; though the distro is very custom and secret, there are no binary blobs allowed.

There will be loads of Power8 licensees looking for GPU accelerators for their Power8-based server systems. Some will utilize the Tyan motherboard's PCIe lanes as a low-cost way to integrate AMD's GPUs with their Power8-based kit, while others will want higher-speed, coherency-based connections between CPU and GPU, which AMD has developed as well. If AMD has the skills to integrate ARM ISA designs with its graphics, it has the skills to integrate its GPU IP with Power8-based systems too. AMD has a good reputation with custom and semi-custom clients. AMD still owns all the IP from its SeaMicro acquisition, so I'd expect that AMD's recently announced coherent HPC interconnect fabric has a lot of that SeaMicro Freedom Fabric plus more AMD IP inside. SeaMicro may be shuttered, but its IP lives on. AMD has a lot of really interesting IP, and its GPU technology is very competitive and open on the driver side, with all that molten Mantle coming up through the Vulkan-based API.

I'll bet that IBM, being IBM, would welcome AMD into its many potential project bidding partnerships. IBM loves to have a second source of parts, and GPU accelerators are just parts; AMD owes its existence in great part to IBM's love of second-source suppliers (x86 back in the day). So with IBM and the many licensed OpenPower Power8 makers on board with the licensing of Power IP, I'd expect a few would not buy, or could not afford, Nvidia's high-priced GPU accelerator parts.

August 24, 2015 | 10:36 PM - Posted by renz (not verified)

Forget AMD FirePro in HPC when Intel Xeon Phi is used more right now. It's much easier to program than GPGPU and has full support for OpenCL as well. Right now Intel is more of a threat to nvidia than AMD is. In theory GCN was supposed to have a better DP rate than Kepler, but in reality Kepler is kicking GCN in DP compute efficiency. Also, being cheap is pointless when you can only do the job halfway.

August 21, 2015 | 01:45 PM - Posted by Anonymous (not verified)

This is really interesting. I do hate the way this graph was made, though: it starts at 15% and goes to 85%, making the market share look even more skewed than it is.
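The commenter's complaint about the axis is a real effect. As a rough sketch (the 15%-85% range is taken from the comment, and treating the two shares as heights on that axis is a simplification of the actual line graph), here is how much a truncated y-axis exaggerates the visual gap:

```python
# The described graph runs from 15% to 85% on the y-axis.  Compare how
# large NVIDIA's share *looks* relative to AMD's on that axis versus
# the true ratio of the two values.
def plotted_fraction(value, axis_min=15.0, axis_max=85.0):
    """Vertical position of a value as a fraction of the plot height."""
    return (value - axis_min) / (axis_max - axis_min)

amd, nvidia = 18.0, 81.9
true_ratio = nvidia / amd
drawn_ratio = plotted_fraction(nvidia) / plotted_fraction(amd)
print(f"true ratio:  {true_ratio:.2f}x")
print(f"drawn ratio: {drawn_ratio:.2f}x")
```

On a 15%-85% axis, NVIDIA's share is drawn roughly 22 times taller than AMD's, even though the true ratio of the values is closer to 4.6 to 1.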

August 21, 2015 | 02:22 PM - Posted by Anonymous (not verified)

Is the market share based on the number of GPUs sold?

Sorry for the stupid question.

August 21, 2015 | 02:39 PM - Posted by BBMan (not verified)

I'm sad about the drop in AMD's share. However, when I bought my 980 the 200s were out. I had told myself: if I move to northern Alaska, I'll buy AMD.

I'm just one person, but for me the GTX 980 was simply the smarter card to buy at the time.

August 21, 2015 | 03:42 PM - Posted by Anonymous31276 (not verified)

The real news should be that Matrox is up.

August 21, 2015 | 04:16 PM - Posted by Anonymous (not verified)

The real news is that Matrox is still in the graphics card business? I had a Matrox G200 back in the 90's, but they haven't produced their own GPU in a long time. They mostly moved into specializing in cards supporting a massive number of outputs. I don't even know what they offer at the moment.

August 21, 2015 | 08:07 PM - Posted by Anonymous (not verified)

"Matrox to Use AMD GPUs in Their Next Generation Multi-Display Graphics Cards"

http://www.anandtech.com/show/8480/matrox-to-use-amd-gpus-in-their-next-...

August 21, 2015 | 03:47 PM - Posted by Anonymous31276 (not verified)

There will always be high end discrete graphics, it's the low end segments that are going to disappear in the near future.

Sales will decline overall, but profits will be higher on the units moved, so it kind of works out.

August 21, 2015 | 04:09 PM - Posted by StephanS

Most of that market share loss, if not all of it, is due to AMD's driver team.

And for anyone with a 290X: try lowering the clock by just 10%.
This will drop the power usage by over 150W, yep, over ***150W***.
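For scale on that claim: dynamic CMOS power goes roughly as C * V^2 * f, so a clock reduction that also allows a proportional voltage reduction cuts power roughly with the cube of the clock. A first-order sketch (the ~290 W board power figure and the voltage-tracks-clock assumption are illustrative, not measured):

```python
# First-order dynamic power model: P = C * V^2 * f.  If voltage can be
# lowered in proportion to frequency, power scales as clock_factor**3.
def scaled_power(base_watts, clock_factor, voltage_tracks_clock=True):
    """Estimate power after multiplying the clock by clock_factor."""
    if voltage_tracks_clock:
        return base_watts * clock_factor ** 3  # f * V^2, with V ~ f
    return base_watts * clock_factor           # frequency-only scaling

board_power = 290.0  # illustrative 290X-class board power, in watts
downclocked = scaled_power(board_power, 0.90)
print(f"10% downclock with matching undervolt: {downclocked:.0f} W "
      f"(saves {board_power - downclocked:.0f} W)")
```

This first-order estimate saves on the order of 80 W, so a 150 W saving would additionally require undervolting below the stock voltage curve, which is consistent with the commenter's claim that the cards ship overvolted.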

AMD got so stupid that they overvolted/overclocked to get higher numbers in benchmarks so they could raise the card's price by $50.

That extra $50 per card cost them millions of unit sales.

Take a 290X, run it 10% lower, provide decent DX11 drivers, and sell it for $329: it would overtake the GTX 970 with ease.

Good news for Hawaii owners: DX12 will fix AMD's blindness on the driver side, and you can already boost power efficiency and lower heat to unlock the silicon's best power envelope.

Anyway, AMD has no clue how to market its products.

Another case in point: the Fury X. What a waste of state-of-the-art technology...

August 21, 2015 | 05:17 PM - Posted by Anonymous (not verified)

Fury X may be an exceptional card under DX12 rather than just a competitive card under DX11. AMD had to make a lot of trade-offs due to their budget; engineering is always a careful choice between trade-offs. I don't think you are aware of what their options actually were at the time. A lot of choices concerning products on the market now had to be made several years ago. What were your thoughts on the direction these things should have gone two years ago? I suspect the choices of AMD's engineers turned out a lot better than what some enthusiast could come up with.

AMD made a lot of forward-looking choices with the development of Mantle, HSA, and a lot of other technologies. Nvidia and Intel were holding things back because they were perfectly fine with the status quo. Neither of them had much reason to push these next-generation technologies, since doing so would essentially take away their competitive edge. Intel was making lots of money by being the single-thread performance leader; why would they push for a multi-thread-optimized graphics API? Nvidia didn't have much reason to push a Mantle-type API either, since they were reaching good efficiency with DX11. AMD does not have the resources to do that. DX11 seems to require significant driver work tailored to specific engines or even specific games. With the Ashes of the Singularity engine, one of the reviews I read noted that the latest Nvidia driver increased performance significantly, but in DX11, not DX12. It takes huge amounts of resources to optimize on such a case-by-case basis. Nvidia definitely doesn't want HSA since they don't have a marketable CPU. Intel doesn't really want HSA yet either, because their GPU would probably not look good. They are working on it, though.

Microsoft may have been holding back to some extent also. I don't know that they really wanted DX12 for the PC. Microsoft would rather have the Xbox dominate over the PC since they have complete control over the entire Xbox infrastructure. They don't have that with the PC.

Nvidia spent their software R&D developing DX11 drivers which will be obsolete relatively quickly. It looks like DX12 may displace DX11 quite quickly due to Windows 10, Xbox, etc. AMD spent their limited software R&D developing an entirely new API instead of driver optimizations which are going to be obsolete. Personally, I have to give AMD engineers credit for their foresight, although given the position that they are in, they may not have had that many options to choose from.

I am not a fanboy of any of these companies. I have an Intel-based PC with an older Nvidia graphics card and an Apple laptop. It seems ridiculous to me to defend the dominant players. That just leads to stagnation and high prices for consumers. It also leads to questionable business practices by the dominant players to protect their near-monopoly positions. Perhaps a lot of readers weren't around for all of the shenanigans Intel pulled when AMD came out with the K7, which supported DDR memory. Intel used all kinds of questionable, if not illegal, tactics to try to keep AMD and DDR out of the market. Intel was pushing the Pentium 4 with Rambus memory, which was a bit of a disaster, but that is a long story. One of my friends got stuck with a P4 with SDRAM because of that situation. It was easily outperformed by previous-generation Intel parts, not to mention cheaper AMD options.

August 22, 2015 | 12:50 AM - Posted by StephanS

I agree with many points, but not with this one.

"DX11 drivers which will be obsolete relatively quickly"

I'm sure AMD believed this, but DX11 has ruled the PC market for the past 2+ years and will for another year.
Three years is an eternity, and you can't make up 20% to 30% in performance when you and your competition are using the same fabs, and your competition has way more transistor experts than you do.
So if you give away 25% in performance across all your products, you have to lower your price (and clock/voltage) by a massive amount to compensate for not hiring half a dozen software experts (while the AMD execs pocket $40+ million in bonuses and pay).

AMD should be commended for their technological achievement with Mantle; it's the gold standard for the industry. But financially it was a big mistake. Yes, it benefits us all, but not AMD.
AMD would have been much better off tweaking their DX11-class drivers. Yeah, it's a shitty thing to do, it's a dead end, it's lame... but it would have made their cards 20 to 30% faster in the 1080p market.
More money in sales means less firing of R&D team members.

HSA, you say? Personally, I think HSA is a dead end, and Intel got it right with their new Xeon platform. Intel is implementing what "Fusion" represents...
(I also think nvidia's days are numbered in this sector unless they follow in Intel's footsteps, but in their case with ARM cores.)
So if AMD is not making a modern "ARM Xeon"-class architecture while toying with HSA, AMD has just wasted countless resources. HSA is just a stopgap; it reminds me of the days of the separate FPU. It's going to be history, like all that DX11 nonsense.

My take is that AMD spent money on dead ends (SeaMicro being a recent example, over $300 million!), not to mention the lavish management bonuses and pay that almost equal Intel's. Big scandal I exposed two years ago, and only now are people sniffing at AMD exec financials.

More to rant about.. maybe later :)

August 22, 2015 | 05:30 PM - Posted by Anonymous (not verified)

HSA means GPU acceleration: GPUs using their massive ranks of FP resources for general-purpose computation. So who needs a Xeon, with its paltry FP (floating point) resources compared to even an integrated GPU's? The biggest threat to Intel from both AMD and Nvidia/others is the continued GPU microarchitectural improvement that is putting more decision-making circuitry into the ACE units for AMD, and into Nvidia's equivalent.

Take AMD's Carrizo for ray tracing workloads: my ProBook laptop with a Core i7 takes hours to render a heavy ray-traced scene, but with Carrizo and its ACEs, Blender 3D's Cycles rendering now available on GCN cores, and better support for ray tracing and other GPGPU workloads on the GPU, those render times will be cut down to just minutes!

It is finally time to say goodbye to the awful CPU and its limited core FP resources for ray tracing workloads. Who cares about the CPU cores going forward for rendering workloads? CPUs are too overpriced for the amount of FP resources they offer. GPUs are where the real computation happens, on the integrated and on the discrete GPU! Modern gaming itself could not happen without the GPU and all those FP units, ROP units, etc. Even LibreOffice includes GPU acceleration for spreadsheets and the other graphical packages inside its suite of software. And with even more improvements in store for AMD's newer Arctic Islands microarchitecture, expect more decision-making circuitry to be added; ditto for Nvidia's Pascal!

CPUs, and the folks who push only CPUs for computation (and I mean all computation, not just graphics), are in serious trouble. The only detrimental thing about the current Carrizo SKUs is the overabundance of Carrizo laptops that only use the 15 watt parts, or that have a potentially 35 watt Carrizo part shoehorned into a god-awful Ultrabook, artificially thermally constrained to a lifetime in a 15 watt usage envelope. The most detrimental trend in PC/laptop computing is the OEMs' illogical obsession with the thin-and-light/Ultrabook form factor, where it's form over function, and the total gimping of the laptop as the desktop replacement that made so many people adopt laptops as their only computing device.

If there are Carrizo parts that can run at 35 watts, and also run at 15 watts with a simple software/firmware setting, why not put a 35 watt Carrizo part in a regular form factor laptop that can run at 35 watts for high-powered workloads but be dialed back by the user to 15 watts for long battery life? Why condemn the part to a 15 watt life of service just because of the obsession with the thin and light laptop form factor?

The HSA 1.0 compliant Carrizo SKUs will allow more graphics workloads to be done on the GPU, including ray tracing, so who cares about the CPU for any graphics workloads, AMD's or Intel's? The real power of Carrizo is its GPU and that GPU's HSA ability; it's just a shame that the 35 watt rated Carrizo parts are not being used in more regular form factor laptops. I really hate the Ultrabook/thin-and-light, form-over-function, thermally throttled laptop SKUs being offered currently. What useless piles of trash they are!

August 25, 2015 | 04:05 PM - Posted by Anonymous (not verified)

AMD user here: their drivers are terrible!
If I had to buy a graphics card today, the 9XX series is the way to go! No brand loyalty here, just best value!

August 21, 2015 | 06:17 PM - Posted by JohnGR

If the press continues to cover for Nvidia and take shots at AMD at every chance they have (or create), many tech sites in the future will have to add fashion articles, because with two monopolies in the market things will be extremely boring.

AMD is far from perfect. But having to fight the press makes things almost impossible for a company with only debts in its pockets.

August 21, 2015 | 07:15 PM - Posted by siriq

As you said, AMD is far from perfect, but this is where we are on market share. Only the players (AMD, NVIDIA) can change that picture. Simple as that.

August 21, 2015 | 07:58 PM - Posted by Anonymous (not verified)

Actually, the only tech sites that I truly trust take ads only from makers of products that they do not directly review. Those conflicts of interest get in the way of the impartiality of most of the so-called technical press. So yes, get some ads from the makers of adult diapers, or some clothing stores, etc. The press, that is the financial press, will continue to report any of AMD's short-term problems, as that makes news. What you want is for some private equity firm to take AMD private, which would mostly shut up the financial press. As for expecting the majority of the lap-dog "technical press" to bite the hands that feed them and treat AMD fairly and equitably, that is not going to happen; the ad-driven technology press's job is to serve their masters, the ones that pay them to write most of those so-called "reviews"!

August 21, 2015 | 08:10 PM - Posted by siriq

What is this jigalish nonsense.

August 23, 2015 | 02:19 PM - Posted by Anonymous (not verified)

Butthurt much by the truth? Conflicts of interest are endemic in the online tech review industry, whether it's pressure from advertisers or the need to get those review parts (at no cost). Just look at the cherry-picking of benchmarks: lots of little lies of omission, just picking the ones that make the product look good and ignoring the others. The producers of expensive CPU/GPU parts have the upper hand over the reviewers, and reviewers that do not tone down the negative in favor of the positive risk losing access to those free review samples, or having them arrive late, after the sites that are more accommodating to the hands that feed them get theirs.

When you receive and depend on review samples and advertising from the makers of the products that you are reviewing, you are caught up in the economics of not being able to be fully truthful, for fear of the lost income. Any site that gets income this way is caught up in this bargain, and getting more ad revenue from outside the companies whose products it reviews makes for less dependency and less need to watch your step. So it's better to get a few tee-shirt ads, or other types of ad revenue.

It's always good to try to read as many reviews as possible, including the websites where actual posters report the performance of their personal gaming systems, especially with regard to gaming laptops and the thin-and-light pathology that afflicts so many of the PC/laptop OEMs' marketing departments. A lot of gaming laptops have had thermal throttling issues.

Just look at AMD's new "business" oriented laptop APUs: all 15 watt parts for thin and light form factor laptops, while a lot of power business users want more of a desktop-replacement, regular form factor workhorse machine with a full 35 watt, non-thermally-limited APU. If that Carrizo FX-8800P can be made to fit in a 15 watt thermally restricted Ultrabook abomination, why not offer a regular form factor business Carrizo laptop with a slightly better cooling solution that can run the APU at 35 watts, and can be user-profiled to run at 15 watts for longer battery life while still retaining the ability to run at 35 watts when more computational power is needed?

AMD's marketing department is ruining AMD! That, and the glut of thin and light laptop case parts that the suppliers have warehouses full of, which the OEMs can get at loss-leader prices. They are still pushing these thin and light abominations and trying to pawn them off on the consumer, while business users want the regular form factor, desktop-replacement workhorse laptops, and gamers want even more thermal headroom for their gaming laptops, with no possibility of thermal throttling!!!

Those laptop reviewers are such a joke; on more than a few website forums, posters are wondering when laptop SKUs with the full wattage Carrizo part are going to be available, able to run at 35 watts, NOT 15 watts.

August 21, 2015 | 09:38 PM - Posted by Anonymous (not verified)

Do not worry, there will be a K7 time again with Zen.
"It will cover ALL your needs, in the coming years."

I cannot wait to see AMD's Zen APUs take over the world, and reign throughout the galaxy.
What will nVidia do then, come up with some black holes..? To be continued...

I wonder what the next two quarters will look like. I do, however, expect to see a small gain for AMD.)

August 22, 2015 | 01:50 AM - Posted by collie

It is starting to seem like they are fighting for a market that won't exist in a few years. Soon enough Intel will have the best possible graphics, crazy-fast RAM, huge super-fast storage, better sound than the ear can conceive, and every other function you can think of ON THE SAME (multi-layered, 3D, interposed, built on various different processes) CHIP; the motherboard will only matter for connections, and eventually we won't need physical connections (or our civilization will crash, or we will be wiped out as a species). The enthusiast market will die off, just like it always does. Once upon a time the automobile enthusiast tweaked and bought and evolved and pushed for that at-first-huge, then smaller, and finally minuscule improvement, but now they just like cars. And if everything goes exactly the way that we ALL wanted it to go since we fell in love with computers, then it will be the same for us.

August 22, 2015 | 01:40 PM - Posted by johnc (not verified)

Considering just how much Intel has stumbled lately -- especially with 14nm -- and seeing them turn almost all their attention to low-end embedded devices, I'm not sure progress is going to be so quick.

August 22, 2015 | 07:22 AM - Posted by John H (not verified)

Are we surprised, given how long 28nm has reigned in GPU land? Progress is slowing, therefore the replacement rate will slow.

August 22, 2015 | 09:27 PM - Posted by Bhappy (not verified)

AMD's desktop standalone GPU market share now mirrors its processor market share, with no signs of reversal. They will be lucky just to survive as the No. 2 option in both markets in the years to come.

August 23, 2015 | 11:34 AM - Posted by Anonymous (not verified)

Nvidia PR stunts, that's all it is!! (It's very easy to mess with people's minds, especially young people.)

What can I conclude from this? nVidia tries to remove all positive focus from AMD, even though AMD so nicely "uplifts for free" over 95% of someone else's hardware. It also tells me that AMD actually is gaining too, and that their strategy is working towards the era of the Zen APU.)

August 24, 2015 | 11:32 AM - Posted by BBMan (not verified)

Nah. It's because nVidia actually makes a better GPU.

It's up to AMD to trump it now. They need another 9800 Pro type card. If they can hit the window and the tech right, they just might.

Their track record does not leave me with a lot of confidence- but it's up to them to change it.

August 24, 2015 | 06:57 PM - Posted by Anonymous (not verified)

Nope, it's called marketing.)

August 25, 2015 | 09:50 AM - Posted by BBMan (not verified)

If you have fewer and poorer people who can pony up $500 - $800 for a GPU, your marketing had better have more power and performance charts to back it up than anime tits on the box.

August 24, 2015 | 04:20 AM - Posted by Anonymous (not verified)

Nvidia PR is really good at producing graphs that don't start at 0 on the Y-axis...

August 24, 2015 | 09:21 AM - Posted by gamerk2 (not verified)

Ironically, AMD is killing itself with APUs by starving its mid-tier dGPUs. As a result, the people who purchase dGPUs are more likely to go high-end, where the trend favors NVIDIA. So AMD's APU division is eating its GPU division.

August 24, 2015 | 11:42 AM - Posted by collie

I'm just guessing here, but I'd bet that those APUs have a higher profit margin than anything in the GPU market. Just guessing.

August 24, 2015 | 12:46 PM - Posted by Anonymous (not verified)

How about a graph/story that shows the gains in the APU market? It could go like this: AMD is the KING of ALL GAMING "GPUs + CPUs (APUs)" for consoles, they also still hold about 20% of the PC GPU market and are gaining, and nVidia has lost it all in the console market and is about to lose more ground in the GPU market.

The circle is now complete with Nintendo! No wait, what about nVidia's Shield?? Hmm... You can't help somebody who doesn't want help, because they seem to cut off all healthy reasons to work with most other companies to make the gaming industry a fine, nice, useful place for us ALL!

"I believe that nVidia is welcome to buy an APU from AMD" - but that's most likely not going to happen. nVidia will probably say: just follow us and let us decide what gamers need for gaming, and we will make our money game machine WORK... --Can you dig it--.)) -- I say, can you dig it--.))

P.S. AMD's stock only shows that the investors know that AMD's next gen of CPU, GPU and APU is going to be a competitive, clever "kick-ass product". So for them, it's all about waiting as long as possible, to make the most money.
Personally, I buy all the shares/stock my economy can carry. Why shouldn't I? It's the big passion that I have for tech, not to mention my love for the games. Think about it: a company that lives off making tech that actually "WORKS" -dig it- for the gaming industry. Let's just say most of the shareholders were us, the gamers! I can tell you what would happen, but I will let you figure that out yourself...

August 24, 2015 | 02:14 PM - Posted by collie

I like your passion! Stop being Anon and join us!

August 24, 2015 | 07:05 PM - Posted by Anonymous (not verified)

Thanks and likewise.) Maybe I will someday, I feel the Force(no G) is strong with you.)

August 25, 2015 | 02:40 AM - Posted by nobodyspecial (not verified)

You do realize AMD has lost $6B+ in 12yrs correct? You do realize their stock is $1.75 and closing fast on possibly being de-listed right?

There is no chance of them gaining share in cpu or gpu for at least another few quarters. If that ends up being another few 150mil+ losses you'll be looking at a $1-1.25 stock by xmas probably and probably hitting $1 in Feb or so when Q4 earnings are out. If the stock market continues it's free-fall (along with other markets around the world), AMD will drop more without or without products for the next few quarters.

Not sure how you think they'll be gaining share when their chief rivals (Intel/NVIDIA) can simply price them to death to keep share if desired, so AMD makes no money. To make money you need the winning product, so you can charge accordingly. Also note that, due to Zen, they put off the next APU until later.

Consoles are exactly how AMD got to this point (along with other dumb management decisions). All of that R&D should have gone purely into class-leading CPUs and GPUs (no APU until HBM could be used), along with far better DX11 drivers. Instead they thought consoles would do wonders, when at such measly margins they were never going to make AMD rich. They went so low trying to win the deal that it left them with no way to profit until later in the consoles' lifespan. Unfortunately for AMD, they'll be caught by 14nm or 10nm HBM-capable SoCs about halfway through that lifespan, along with much easier Vulkan portability on top. Devs are already going mobile, long before Vulkan hits. They go where 2 billion units are sold and constantly upgraded yearly, not 10-25 million that are difficult to port to or from. It is far easier to sell a $6-10 game than a $60 one, and with an audience that large, they make great money anyway. AMD should have passed on Mantle, consoles, etc., and concentrated on core products and drivers.

You should not buy stocks that lose money ;) You shouldn't buy stocks with dropping R&D, low cash, no profits, shrinking market share, a bad debt ratio, high-interest debt, dilution of shares, 30% of the engineers laid off, bad management, etc. You get the point ;) Do you even read their balance sheets and quarterly reports? I really hope they get bought before they destroy themselves.

August 25, 2015 | 01:28 AM - Posted by nobodyspecial (not verified)

Personally, I think sales of NVIDIA cards are down 12% (and AMD's too) because we are ALL waiting on a die shrink. Only those among us who have a dead card buy RIGHT NOW. Otherwise, I (like others) am more than happy to WAIT for 14nm and the massive performance at lower wattage it will usher in. Even if they max out the power, I can easily opt to downclock to drop the power and heat. Either way I should end up with a far more powerful card, with lower temps here in AZ as a bonus.

Buying today is silly unless you're uninformed (i.e., not reading hardware sites) or have a dead card and are thus forced to buy now. I don't believe for a second that the drop comes from us simply stopping gaming. That is ridiculous. What we are all doing is WAITING for the die shrink, and have been for a whole year (more in my case). I could easily upgrade two cards right now (I have the cash waiting), but no freaking way. I refuse to be stupid with my money even if I can afford it ;) This is not just a new generation we're waiting for; it's HBM2 (which might actually be needed, unlike the Fury X's HBM1), 14nm, AND FinFET. That is just WAY too much for me to pass up. Even if my Radeon 5850 up and died today (yeah, I've been waiting for AGES to buy again), I'd simply turn on my CPU's integrated GPU and wait anyway... LOL. GOG has plenty of old games (and some cheap new ones) to keep me more than busy for six months.

I think you'll see a massive sales increase when the 14nm chips drop, if not from people buying the new generation, then from thrifty shoppers picking up the older cards at a major discount because the new ones are so much better. It's just silly to buy today, period. Most of us have been waiting long enough now that it is easy to wait another quarter or two for the final payoff. Really, I don't care how long it takes at this point, I'll wait! 17B transistors? Buy now? Don't make me laugh.

October 8, 2015 | 02:04 AM - Posted by frank2336

What's the meaning of "DT PC"?
