
Microsoft Introduces us to DX12: My Thoughts on the Industry Moving Forward

Author: Josh Walrath
Subject: Editorial
Manufacturer: Microsoft

What We Know about DX12 So Far

Unfortunately, we know relatively little about DX12 so far.  There was an initial assumption that DX12 would not require new hardware, as it would not introduce features that depend on new hardware; any current DX11-compliant part would therefore be DX12 compliant.  Part of this is true, and the rest is not.  Most current DX11 parts can take advantage of the low-level efficiencies gained from DX12 programming (AMD's GCN, NVIDIA's Fermi and above, and Intel's 4000 series and above), but Microsoft has yet to detail the other changes that WILL require new hardware to be fully DX12 compliant.  Over the next year and a half, we will get bits and pieces of what DX12 offers over DX11 beyond the low-level programming differences between the APIs.


A more granular look at what the driver overhead of DX11 does to performance and how it compares to what Microsoft is introducing with DX12.

Matt Sandy of Microsoft put up a nice little blog post that quickly details three major functions that DX12 will encompass.  These are pipeline state objects, command lists and bundles, and descriptor heaps and tables.  Matt describes these in a cursory but effective manner.  I am afraid that I cannot compress the information any further than he has in his blog.  For a more detailed description, definitely look at his post.
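For a flavor of what those three constructs mean in practice, here is a rough C++ sketch.  Every type and method name below is invented for illustration (Microsoft has not yet published the API itself): the point is simply that state gets baked into one pre-validated object, work gets recorded once and replayed cheaply, and draws reference resources through a table pointing into a heap.

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical names, not the real API.
// A pipeline state object bakes shaders and fixed-function state together,
// so the driver can validate the whole combination once, up front.
struct PipelineState {
    std::string vertexShader;
    std::string pixelShader;
    bool        depthTest;
};

// A descriptor heap holds resource descriptors; a table is a contiguous
// slice of the heap that a draw references by a single base index.
struct DescriptorHeap {
    std::vector<uint32_t> descriptors;  // e.g. texture/buffer handles
};
struct DescriptorTable {
    size_t baseIndex;
    size_t count;
};

// A command list records work for later submission; a bundle is a small,
// reusable command list replayed inside a larger one.
struct Command { std::string op; };
struct CommandList {
    std::vector<Command> commands;
    void setPipeline(const PipelineState&) { commands.push_back({"SetPipeline"}); }
    void setTable(const DescriptorTable&)  { commands.push_back({"SetTable"}); }
    void draw(int /*vertexCount*/)         { commands.push_back({"Draw"}); }
    void executeBundle(const CommandList& b) {
        commands.insert(commands.end(), b.commands.begin(), b.commands.end());
    }
};
```

A frame could then record a bundle once (say, a table bind plus a draw) and replay it several times inside a larger command list, which is where the per-draw CPU savings come from.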

I can try to summarize what Microsoft is aiming at with DX12, however.  Essentially, the rendering pipeline in DirectX has done a lot of the work for engines and applications, but not always efficiently.  This high-level abstraction of operations allows for great portability because DX handles much of the low-level work.  Unfortunately, one of the big issues is that the device driver has to wait for certain operations and state changes to occur before it can complete the current operation and submit more work.  The big hit comes with draw calls, which are negatively impacted by the relatively inefficient abstraction layer and driver interactions.  IHVs such as NVIDIA and AMD work around this on a case-by-case basis, as seen in the very frequent driver updates that improve performance (sometimes dramatically) for individual, recently released applications.
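As a back-of-the-envelope illustration of that draw-call penalty, here is a toy cost model in C++.  All names and numbers are invented, not measured: a DX11-style driver defers validation work to draw time after every state change, while the pipeline-state-object approach pays that cost once, when the object is created.

```cpp
#include <cassert>

// Invented cost model: counts of expensive driver validations per frame.
struct Frame {
    int draws;         // draw calls issued this frame
    int stateChanges;  // shader/state changes interleaved with them
};

// DX11-style: each state change forces the driver to re-validate the
// pipeline combination at the next draw call.
int validationsDx11(const Frame& f) {
    return f.stateChanges;
}

// PSO-style: each unique pipeline object is validated once at creation;
// switching between already-created PSOs costs nothing at draw time.
int validationsWithPsos(int uniquePipelines, bool creationFrame) {
    return creationFrame ? uniquePipelines : 0;
}
```

Under this (invented) model, a frame with 500 state changes pays 500 validations every frame in DX11 terms, versus a one-time cost of 40 for 40 pre-built pipeline objects.  The shape of the argument, not the numbers, is the point.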

DirectX 12 will allow for some interesting work to go on in the background.  It will permit a much lower level of control of the hardware device from applications.  The really smart guys who design engines (Andersson, Sweeney, Yerli, etc.) can optimize their engines to extract the greatest (or nearly greatest) amount of performance out of any hardware configuration.  They can control memory addresses for data assets directly, rather than having to ask DirectX to translate those addresses and handle all memory transactions.  This yields a lot of extra performance, but it is very demanding in terms of programming prowess and manpower.  Not every game developer has programmers who can handle that type of workload, so many rely on engine-level optimizations to achieve good performance.
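That "control memory addresses directly" idea can be sketched with a simple bump allocator: the engine grabs one large heap up front and hands out aligned offsets itself, instead of asking the runtime to place every asset.  This is a generic illustration of the technique, not any actual DX12 interface.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// A minimal bump allocator over one large GPU heap: the engine decides
// where each asset lives, rather than the runtime placing it.
class GpuArena {
public:
    explicit GpuArena(size_t size) : size_(size), offset_(0) {}

    // Returns the byte offset for the asset, or SIZE_MAX if the arena is full.
    // Alignment must be a power of two.
    size_t alloc(size_t bytes, size_t alignment) {
        size_t aligned = (offset_ + alignment - 1) & ~(alignment - 1);
        if (aligned + bytes > size_) return SIZE_MAX;
        offset_ = aligned + bytes;
        return aligned;
    }

    void reset() { offset_ = 0; }  // e.g. recycle per-frame transient data

private:
    size_t size_;
    size_t offset_;
};
```

An engine doing this avoids a round trip through the runtime for every resource, but it also inherits all of the bookkeeping: which is exactly the "programming prowess and manpower" cost described above.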


What exactly a lower-level API such as DX12 (and, theoretically, Mantle) will provide for developers and IHVs.

Even if a game developer does not implement lower-level hardware control, the lower driver overhead and more multi-thread-aware software stack in DX12 will still allow for greater overall performance than was seen in DX11.  A licensed engine developed for DX12 will still implement many low-level operations that take advantage of the much slimmer abstraction layer separating software from the driver and hardware.
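The multi-threading gain can be illustrated the same way: each worker thread records its slice of the frame into its own command list without contending for a single driver-held lock, and the main thread submits the lists in a deterministic order.  Again, the names are illustrative, not a real API, and the example assumes the object count divides evenly among workers for brevity.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <thread>
#include <vector>

// Each worker records its slice of the scene into its own command list,
// with no shared state; the main thread then submits the lists in order.
using CommandList = std::vector<std::string>;

void recordSlice(CommandList& list, int firstObject, int count) {
    for (int i = 0; i < count; ++i)
        list.push_back("draw:" + std::to_string(firstObject + i));
}

CommandList recordFrameInParallel(int objects, int workers) {
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> threads;
    int perWorker = objects / workers;  // assumes even division
    for (int w = 0; w < workers; ++w)
        threads.emplace_back(recordSlice, std::ref(lists[w]),
                             w * perWorker, perWorker);
    for (auto& t : threads) t.join();

    CommandList submission;  // the "GPU queue": one ordered submit pass
    for (const auto& l : lists)
        submission.insert(submission.end(), l.begin(), l.end());
    return submission;
}
```

Because submission order is fixed after the parallel recording phase, the result is identical to single-threaded recording: the threads only parallelize the CPU-side cost of building the work, which is exactly where the DX11 driver was a serial bottleneck.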

Almost all DX11-compliant hardware will work with DX12 (the Radeon HD 5000 and 6000 series excepted).  These performance advantages apply to current hardware, so users do not have to upgrade to see the benefits.  Unfortunately, they will still miss out on the other features, which will require new hardware designed specifically for DX12.  Considering that the first DX12 titles are not expected until the Holiday season of 2015, there will be plenty of chances (and reasons) to upgrade before then.

So what will become of Mantle?  AMD's competitors believe that most development and porting of engines over to Mantle will stop due to the (not so) impending release of DX12.  Personally, I think quite the opposite will happen.  I believe we will see a few more developers start nosing around Mantle, not only for the performance advantages they can get from that API, but also to get their programmers experienced in going that close to the metal.  Porting from Mantle to DX12 should be relatively straightforward, due to the similar ways these APIs address hardware.  They certainly are not identical but, according to what we are hearing from developers, they are similar.  For AMD, Mantle lets them more adequately tune drivers so that, when DX12 finally hits, they have that experience under their belt.  DX12 also takes a load off of the IHVs on the driver side: while performance improvements will continue over time, the onus of providing much of that performance through specialized, per-application drivers will be lifted from them.


Microsoft hinted at a few more upcoming features for DX12 that will not see the light of day until 2015.  IHVs are likely quite pleased, as they can offer new and improved hardware to support these features.

DirectX 12 is a much needed improvement to the line.  It has been four years since DX11 was released with two minor iterations in that time.  We have seen the rise of extreme programmability and parallelism that requires new thinking on the software side in order to harness that type of horsepower.  My guess is that DirectX 12 will improve general computing in GPUs and APUs to a great degree as compared to what we have seen with DX11 and DirectCompute.  The ball has been rolling for some time, but now that the industry has seen the advances that DX12 brings to the table, I am sure it will lead to some interesting developments on both the hardware and software sides.  Good times await enthusiasts of this industry.

April 2, 2014 | 10:02 PM - Posted by Anonymous (not verified)

Hello Josh,
Well, I am kind of disappointed in your article. In my opinion it doesn't encompass all the possibilities, since you do not have any facts; all you have is guessing, which is not a problem until you start making it sound like fact.
" However, with the original timeline for DX12, AMD saw an opportunity to push these features as an exclusive for their hardware, thereby potentially increasing sales and support before the standardized DX12 is released."

The other possibility, which is the most likely given the statements from AMD, DICE, and developers, and the state of the DX12 announcement and how far along it is, goes as follows:
AMD GPUs are good. Yes, to cut prices you sometimes get a crappy cooler, but the architecture itself is often very good, sometimes better than NVIDIA's. NVIDIA has always been in front because their GPUs get better drivers, more often, fixing bugs and slightly increasing FPS.
So AMD and devs asked Microsoft for years to offer a lower-level API, but it never happened, so AMD decided to push the industry in this direction with Mantle:
first, hoping for Mantle to work and force Microsoft's hand to change the standard, or second, if it didn't, to push for a new standard API.
A lower-level API strips NVIDIA of its major tool, drivers, and saves AMD resources and time. AMD securing three major graphics engines to support Mantle shows that they didn't just plan on releasing it with a couple of games for the marketing; they were planning for the long run in case Microsoft didn't give in.
Everything AMD and Johan Andersson said during APU14 supports this theory, and the state of DX12 that Microsoft showed six months after Mantle's announcement also supports it: all the talk about DX12 just meant people talking about the new DX and how it should be, not necessarily that they had any plans to go lower level before Mantle showed up.

April 3, 2014 | 02:09 AM - Posted by wujj123456

I agree this is more likely. I don't see why a company would spend significant resources when they know all they need to do is spend a fraction and wait.

It's also possible that, given the current competitive environment, AMD feels rushing out Mantle both helps later DX12 development and closes the performance gap, if any. However, this assumes Mantle doesn't consume a significant amount of resources, which only AMD knows.

April 3, 2014 | 01:15 PM - Posted by Josh Walrath

It is hard to say, from an outside perspective, what exactly happened in those DirectX meetings with all of the IHV's putting in their $0.02.  I know that developers and IHVs have been unhappy with the state of the abstraction layer, and how much performance it sucks up.  How poorly threaded it is... etc. etc.

Something else to consider is that AMD started developing Mantle quite a while ago as well.  Perhaps talks about DX were not going to their liking, so they started their own little path.  Obviously between here and there things have changed... but yeah, what was the initial impetus for AMD to develop Mantle in the first place?  Would love to hear some more inside scoop from all sides.

BTW, it is entirely appropriate to be disappointed in my article.  Feedback is good, sharing info is better!

April 4, 2014 | 06:52 AM - Posted by Anonymous (not verified)

On a completely unrelated note, AMD's FreeSync is on the way.
VESA has approved the SCR (Specification Change Request) that AMD made in November to change DisplayPort 1.2a. The request was listed as "DP1.2a Extend MSA Tmg Para AMD SCR",
and the change now shows up in the approved list as "DP1.2a Extend MSA Tmg Param Ignore Option AMD".
sources:
http://www.hardware.fr/news/13640/amd-freesync-proposition-adoptee-par-v...

So if I understood correctly, everyone who has a monitor with DisplayPort 1.2a would be able to download firmware for the monitor, plus a driver or software, to enable it.

April 4, 2014 | 07:05 AM - Posted by renz (not verified)

So does FreeSync work in games yet? Last time, when AMD showed the demo, they never demoed a real game running on the demo machine.

April 4, 2014 | 07:32 AM - Posted by Anonymous (not verified)

Well, that's basically the point of it: games. xD
And it works the same way G-Sync does.
The good news is that the VESA change makes it a standard, and a free one at that, so manufacturers will adopt it very fast, because it adds value for free... so better margins.
This is not proprietary technology, so while AMD is busy with Mantle, Intel could offer support for it sooner.
And I am guessing NVIDIA will just wait for devs to add the option in-game; I don't see NVIDIA offering an option to enable it through drivers when they asked people to pay $600 for a monitor for it four months ago.

April 3, 2014 | 03:46 AM - Posted by Anonymous (not verified)

All these PC Perspective write-ups forget or ignore what NVIDIA themselves said about DX12's lower-level abstraction: that it's optional.

April 3, 2014 | 12:44 PM - Posted by Josh Walrath

I think I did sorta cover that.  Perhaps in not so many words... but basically devs have the option of not worrying about the low level stuff and letting DX handle that functionality.

April 3, 2014 | 03:11 PM - Posted by Anonymous (not verified)

What?

DX doesn't automatically do it for them. That's crazy.

There wouldn't be a need to drop it for backwards compatibility because, as you said, it would be handled by DX magically.

The engine and game still have to be coded to take advantage of it, and NVIDIA won't push it because it's on the CPU side, which they don't have at the moment.

April 3, 2014 | 03:46 PM - Posted by Josh Walrath

Whoa, whoa, whoa... I didn't mention anything about magically doing anything.  I am looking at it from a high level when saying "DX will handle" things like memory addressing and transactions.  E.g., the engine wants to access this texture data, but it does not need to define the addresses; the abstraction layer does all of that translation.  Not entirely sure what the rest of your statement refers to, sorry.

April 3, 2014 | 07:59 AM - Posted by Ophelos

I don't see why Microsoft is even going down this road with DX12, for the simple fact that most game engines will be running on Linux. Currently Epic Games is working on getting a port of UE4 working on Linux, and I'm sure Crytek is doing the same thing with CryEngine, which will be supporting Mantle.

So unless Microsoft opens up DX12 to more platforms other than its own products, this will become useless before it gets released to the public.

April 3, 2014 | 12:46 PM - Posted by Josh Walrath

Well, the vast majority of PC gaming still happens on Windows, for better or for worse.  DX12 will be across PC, mobile, and xbox one.  I guess that is their idea of opening up to different platforms...

April 3, 2014 | 02:52 PM - Posted by Anonymous (not verified)

Well, Microsoft doesn't really have a choice. Win8 failed, the Xbox failed, and DX is being seriously threatened; what do you think they can do?
The PS4 sells twice as much, and the Xbox API and tools suck in addition to the spec, so Microsoft needs to somehow attract more devs to its console. A reworked API is a must, since devs hated the current one, and better games and exclusive games need a larger player base. Unifying DX12 across desktop and Xbox lets devs reach a larger player base (console + PC) with easier porting, which could save the Xbox and give a boost to Windows sales. As for expanding to Linux/Mac, that's not going to happen; you don't ask a company that has a monopoly on a market to give it up.

April 3, 2014 | 08:07 AM - Posted by Anonymous (not verified)

""So did Mantle motivate Microsoft to create DX12? No, that project has been in the works for quite some time,""

Bullshit!! how did you know? how did you assume that MS is telling the truth and AMD is liying?

I'm not a direcx or Microsoft hater, but the quality of writing @ PCPER is headed downhill. Which is very sad nonetherless.

April 3, 2014 | 09:07 AM - Posted by renz (not verified)

So what about you? Do you believe MS never had any plans for DX12 before AMD went public with Mantle? Or do you simply not believe either of them?

April 3, 2014 | 09:46 AM - Posted by brisa117

What I think he meant was that it's completely irrational to think that Microsoft started development (from scratch) on DX12 because of Mantle. Mantle more than likely accelerated their plans for developing DX12, but you can't create a new API in six months. They clearly were working on it, at some level, prior to the Mantle announcement.

Also, don't criticize the writing on this website when half of your sentences aren't capitalized, you have way too many quotation marks around the article quote, you misspelled "DirectX", and you use the "@" symbol like the actual word "at". Do you know how much crap the writers would take here if they submitted articles like your post?

And if you want to jab at anyone's writing on this site, it should clearly be directed towards Scott Michaud. Half of his sentences don't make sense. His writing is very emotionally charged and full of weird opinions (but of course, everyone is welcome to their own opinions).

April 3, 2014 | 11:15 AM - Posted by Allyn Malventano

...so you guys think that Microsoft knee jerked after Mantle launched, and somehow managed to get DX12 up and running, along with a game running under it, this quickly? Apparently you guys are not familiar with the (lack of) speed with the Microsoft development cycle...

April 3, 2014 | 01:18 PM - Posted by Anonymous (not verified)

Hello Allyn,
Well, I remember Johan Andersson saying that DICE needed to rewrite the whole of BF4 on Mantle in about four months (even if it took a month's delay); you can go double check. And I am talking about the whole game, playable, not a single map with a single car on a single graphics card or system in a video run whose specifications we don't even know. It's not as if Forza is going to be available on PC anytime soon. Is it possible to set up such a demo in such a short time under those conditions? Yes, I do believe so.
And I do believe they heard about Mantle way before the APU14 announcement; these are big companies, and they have ways of getting info leaks so they can adapt beforehand. But I strongly doubt there was any kind of low-level DX in development in 2012, period, and I won't believe otherwise until they say it loud and clear and explain why devs and major manufacturers were kept in the dark, especially when another party states the opposite. I would rather listen to the party that makes a clear statement than one that waits for me to come to a conclusion they never actually stated.
Just show me where Microsoft stated they were working on a lower-level DX, not talking about it, but developing it. As far as I know, nowhere.
Besides, Microsoft laziness isn't an option here. You are forgetting the failure of the Xbox One; they stand to lose a lot more than just an API, with all the money they spent on the console and on Windows 8. They can't just keep failing. I bet Microsoft is putting everything they've got into saving the Xbox One, i.e. the API; they have nothing more important right now.

April 3, 2014 | 10:25 PM - Posted by Anonymous (not verified)

Bravo. And this is what most people are saying and thinking. It would be so funny if AMD leaked the info to a third party and made NVIDIA and the tech sites look silly, as usual.

April 4, 2014 | 08:30 AM - Posted by Allyn Malventano

...and in your example, that gives Microsoft two months to come up with a new API, as well as implementing it sufficiently for the game developer to start coding for it. It only seems less realistic in your case.

Thinking about these timelines, consider that AMD was more than likely in some of these supposed early meetings with Microsoft and NVIDIA. What if, hypothetically, Microsoft proposed the idea first, or it came up at a group meeting among the three? How does that make AMD look for coming up with their own API that does something very similar, which, aside from making them look good for a few months, could arguably fragment the codebases unnecessarily? Windows developers might have been less likely to adopt Mantle if they knew DX12 was coming earlier (and gave similar or even greater benefits), and if AMD had claimed they were working with Microsoft earlier than the date they released Mantle, it would make them look even more underhanded.

April 4, 2014 | 10:18 AM - Posted by Anonymous (not verified)

Allyn,
Well, what if I said some other company did this: a company that doesn't own an API on the desktop, owns a console with a similar API, and ran a demo of the same game on DX11 at E3 last year?
If we knew the state of development of DX12... there are so many things that make it doable.
The person posting a couple of comments below also gave a valid theory: the game was running on DX11 last year, so they updated the renderer for DX12 this year.
Or what do you really know about the state of development of DX12? Nothing!
We know that DX12 will have an optional close-to-the-metal feature plus traditional DX11, so I could very well rename and tweak DX11 into DX12 and call it a DX12 beta.

And Allyn, do you really believe that NVIDIA wouldn't have used this against AMD? This doesn't seem like sensitive information to keep under NDA; everyone knows about lower level, everyone knows it has been in the talks for over a decade. Just spill out when development started... Well, until then I still believe AMD and DICE, who gave me a clear answer, and I have the right to ignore NVIDIA's vague statement open to interpretation.

April 3, 2014 | 12:49 PM - Posted by Josh Walrath

Everyone is keeping pretty tight lipped about the timeline, but there really is no way to entirely flesh out an API of this nature in the 6 months since AMD released Mantle.  NV mentioned that they started doing driver development on initial DX12 builds about a year ago.  It certainly looks to have been in the pipeline for longer than that, but I do believe that Mantle did help to push MS to accelerate their introduction of DX12.

NVIDIA, AMD, Intel, and others all work extensively with Microsoft to help define and develop DirectX... so they all know what the timelines were for this release.

April 3, 2014 | 10:29 PM - Posted by Anonymous (not verified)

NVIDIA actually said four years (it's on the MS and NVIDIA sites), which I don't believe for one second. Intel and AMD mentioned no time frame.

We have MS and NVIDIA saying four years, and we have Intel and AMD giving no time frame... Hmm, who to believe?

PS: I think NVIDIA and MS are telling ponies.

April 3, 2014 | 02:12 PM - Posted by Anonymous (not verified)

"Bullshit!! how did you know?"

Because this sort of thing isn't developed into a working state in the time frame it took them to "respond" to Mantle. The idea itself isn't new, but making it work takes a lot longer than the few months between Mantle's release and the DX12 Forza demo.

Even if Microsoft were the evil beings we want to label them as, and had this technology working but hidden from the public all along, they have been doing low-level hardware coding for their consoles for far longer than Mantle has existed and have way more experience doing so. Bringing it to PC as a vendor-locked option would be breaking the cease-fire on that front, which AMD isn't guaranteed to win now, but it will be beneficial to PC gaming as a whole.

April 3, 2014 | 03:42 PM - Posted by Anonymous (not verified)

The Forza demo is a joke.

Microsoft showed a Forza demo at E3 on NVIDIA hardware last year. Microsoft got a lot of flak for it, and now conveniently everyone forgets.

It took them four man-months to update the renderer from DX11.x to DX12, and that's all they did: update the renderer for one circuit so they could demo it as a DX12 render tech demo.

There was no HUD and no different views cycled, just one third-person view, and the "constant 60 FPS" wasn't even constant. Throughout the demo the FPS kept dropping in the corners, and the highest FPS came on a straightaway.

How easily this site and people are fooled.

April 3, 2014 | 10:37 PM - Posted by Anonymous (not verified)

This site and everyone else running with the DX12 story aren't fooled whatsoever. What gave you that idea?

April 3, 2014 | 10:35 PM - Posted by Anonymous (not verified)

Exactly! And AMD came out and said no DX12... Suddenly NVIDIA, MS, and Intel went into overdrive, and out pops news of DX12.

I honestly think DX12 will not have as easy a time as in the past. After all, it's just another proprietary API, which this article conveniently forgot to mention alongside Glide and Mantle, etc.

Mantle will work on Windows, Linux, and most others, so I don't see the problem, but I do see problems for DX12.

April 4, 2014 | 07:13 AM - Posted by renz (not verified)

But they only work on AMD GCN cards. AMD said they want to bring Mantle to Linux, but we will see how that goes. DX12 is only for the Windows platform, but it works with any hardware.

April 3, 2014 | 09:26 AM - Posted by JoeJoe (not verified)

I have managed to upgrade my entire system to Haswell, and I am saving up to finally get a new GPU; I was basically waiting for the GTX 800 series to come out. I'm just very gutted that they rushed out there (for the reasons mentioned above, like Mantle and the Xbone) to announce DX12 a whole year ahead and make "news", but haven't considered that they might be hurting their sales.

IN THEORY, DX11-capable cards post-Fermi will be DX12 compliant, but you'll need newer hardware to access some features of the API... With DX12 hitting the shelves with a AAA title an entire year from now, I'm not so sure buying a Maxwell-based GPU is a good idea anymore.

Holiday 2015 is going to be a year after Maxwell comes out and probably a good time for NVIDIA to release a re-iteration of that chip (a la Kepler, 6xx-7xx series). What I'm scared of is that I'll buy the 870, for example, expecting that card to last for at least three years, and instead it gets obsoleted because Maxwell 2.0 is now "FULL DX12", whereas the Maxwell 1.0 I bought a year ago is "PARTIAL DX12".

I don't trust them. API upgrades used to drive card sales biannually prior to DX11; I'm sure they'll be more than happy to move back to that, and with Maxwell around the corner, I'm almost sure they're going to eff up the customer.

Somebody needs to ask nvidia about this.

April 3, 2014 | 09:31 AM - Posted by JoeJoe (not verified)

***edit- prior to dx9, not dx11

April 3, 2014 | 12:55 PM - Posted by Josh Walrath

Just find a good deal on a card now and buy.  With the concerns I have with GPUs actually making it to 20 nm planar, as well as the timeline for 16 nm finfet next year... you might end up waiting a long, long time for any meaningful upgrade as compared to what is available now.  I would just jump in and be happy with a current card/upgrade.

April 3, 2014 | 11:11 AM - Posted by Gregster

Good read Josh. As an nVidia fan, I am looking forward to this and it can't come quick enough. I want OpenGL to be the winner if truth be told but I just can't see that happening. Mantle is doing well in its infancy but it is proprietary for now and not sure if nVidia would take it up... Scrap that, I know nVidia will not take it up.

April 3, 2014 | 12:56 PM - Posted by Josh Walrath

OpenGL development is even slower than DX!

NV will never adopt Mantle.  I'm not sure what kind of royalty arrangement AMD would require in the first place; also, NV wants to focus on current DX11 development as well as future DX12 work.  So yeah, they don't want to split off their software group to tackle Mantle.

April 3, 2014 | 11:45 AM - Posted by Anonymous (not verified)

Fact + fact leading to a theory is fine.
But you ignore a fact (clear statements from devs, and AMD saying they have been asking Microsoft to go lower level for nine years with no results), then add to it an assumption built on a vague statement (Microsoft said they have been working on DX12 for four years, not that they have been working on a lower-level DX; just working on DX12, which is logical, since they didn't stop working on DX when 11 was released, nor will they stop working on or talking about DX13-14 after 12 is released). Then you build a fact on top of an ignored fact plus an assumption to arrive at a Fact, not even a theory. That makes people feel this is directed, biased propaganda.
Why put yourself through this, leading people to think there are personal reasons behind what you write in your articles or say on live blogs? Being on the green team is fine, but it's good to stay neutral when you are a reporter; otherwise people will think you have financial gains from this, stocks or bribes.
Personally, I don't see Microsoft working four years on a lower-level API and then:
1. Never saying anything about it until six months after a competitor showed up on the market and started doing very well.
2. Having worked on it for four years, then deciding to throw everything out the window to start working from scratch on a new version to include it on their console, a console whose specs turned out to be crap and whose API tools are crappier, and having to wait two years for the console to have it; that's basically half the life of this console.
3. Last year it was common knowledge that Microsoft didn't want to go lower level; today it has become "of course they were working on it for four years"!... Give me a break.

Josh, I don't agree with your article, for the simple reason that nothing in it makes sense. All I see is 2+2 = bad AMD, 1+1 = NVIDIA always true.

"Some four years ago, if going by what NVIDIA has said, initial talks were initiated to start pursuing the development of DirectX 12."
You actually based your whole article upon this statement, assuming that NVIDIA meant they started working on a lower-level DX and then AMD jumped in and stole it from them, which is complete bullshit. When I read this, all I see is companies talking about an API, where they probably dismissed all lower-level talk and kept on in the same direction as before, until Mantle shook them up like an Orangina and they started running all over the place.

April 3, 2014 | 01:26 PM - Posted by gamerk2 (not verified)

Companies typically don't reveal their products until they are good and ready. The manufacturers (NVIDIA, AMD, and Qualcomm) were also likely under NDA up until GDC.

Anyone who knows about the development cycle knows it takes a good 3-4 years to define and build up an API, so yes, I believe NVIDIA/MSFT more than I do AMD. There's no way MSFT got the details about Mantle, reverse engineered it, made a new API based off of it, convinced NVIDIA to make a driver that works with it, built a new game engine around it, and demoed it, in just six months.

It's also worth noting that the development of GCN GPUs (the only AMD GPUs that support DX12) would have started about a year into the DX12 development cycle, according to the NVIDIA/MSFT timeline. Coincidence that only their newest cards, made after the spec allegedly began being worked on, support it?

April 3, 2014 | 01:49 PM - Posted by Anonymous (not verified)

Again, you are saying things that neither Microsoft nor NVIDIA have said. They were talking about DX12, not about lower-level abstraction; you are making the link between them yourself. And GCN is not the only compatible architecture; NVIDIA's and Intel's are too, or do you want to cast AMD out of the equation?
Remember that Microsoft already has a lower-level API for the Xbox One; they are not starting from scratch. Can they work on it to support one system with one card and one CPU, to run a demo, a video... Yup, they can.

April 3, 2014 | 10:48 PM - Posted by Anonymous (not verified)

MANTLE<-DX12 (Carbon Copy)

April 3, 2014 | 12:14 PM - Posted by Anonymous (not verified)

If they don't port DX12 to W7 then I think it will be a failure.

April 3, 2014 | 12:58 PM - Posted by Josh Walrath

Yeah... Win7 really could use DX12.  I will be very disappointed if it is Win8 and above only.  Hey, at least MS did port DX11 back to Vista!  Hope for us yet?

April 3, 2014 | 01:37 PM - Posted by Anonymous (not verified)

Devs wouldn't develop games on DX12 with a very low prospective player base; they would have to wait until people migrate to the new OS, just like what happened with DX9-11. Look how many years have passed since DX11's release, and still the majority of titles are DirectX 9, because it took a lot of time for people to migrate. So you have 30% on WinXP (DX9) and 50% on Win7 (DX11), and devs, instead of developing for 50%, would rather do it for 80%; that's why DX9 games are still so popular.
And you have about 5% on Win8. Has anyone heard of DX11.2 games? Not that I know of, and I doubt there will be any.

So here is how it will go if DX12 is not compatible with Win7: DX12 won't come until mid-2016 (because I strongly believe the date announced is just to get some traction, not too far out and not too close), so it will probably be delayed from late 2015 to mid-2016. No dev will make a DX12 game unless Microsoft finances it, so from 2016-2018 the only games you will see are Microsoft ports from the Xbox One or financed games (and they won't do more than a couple a year; even NVIDIA could help with financing, but you still won't have many games).
Guessing the migration to Win9 would reach around 50% (depending on how good or bad it is) from 2018-2020, to me, until 2018, most games will be DirectX 11.1.
So personally I am really not excited about DX12 until they say whether it is Win7 compatible or not, and the partial compatibility is crap. Everyone is talking about the lower-level abstraction; who cares about some other useless features?

April 3, 2014 | 01:48 PM - Posted by Mummy (not verified)

MS didn't port DirectX 11.2 to Windows 7. I wouldn't be surprised if DX12 is a Windows 9 exclusive. The timing seems right: late 2015 = Win9, if MS continues with what has become their normal three-year release cycle.

Even if it is ported to Win8 and not Win7, the recent news that MS is finally bringing back a proper Start Menu to Windows 8 should remove most of the issues folks have with the otherwise decent OS.

IMO, AMD comes out on top (better overall performance from their struggling APUs, and drivers become less of an issue), Intel (worst graphics drivers and best CPUs) in second, with NVIDIA (best graphics drivers but worst perf per dollar and struggling ARM chips) looking to be the biggest loser in all this. Check that: the consumer is the biggest winner!!

April 3, 2014 | 12:55 PM - Posted by idiot101 (not verified)

AMD needed Mantle to stay competitive in the market. After looking at what NVIDIA was able to achieve with its new architecture, AMD is running scared with nothing new immediately on the horizon. The closer-to-the-metal Mantle coding improvements on a dual-card setup, with its ultra-smooth gameplay, might be just what AMD needed with the new dual-GPU card yet to be released. AMD will do anything to stay relevant, and with Mantle definitely benefiting low-performance hardware, it wouldn't be a stretch to see it implemented on the consoles (especially the PS4). This would make the PS4 a much more attractive offering in the next couple of quarters if developers, Sony, and AMD are able to get it right.

The discussion on DX12 right now is not relevant. It is in its infancy and a lot is still needed to make it useful/significant in game development.

My unsolicited 2 cents.

April 3, 2014 | 01:27 PM - Posted by gamerk2 (not verified)

Except there's no reason to use Mantle on the PS4; you already have the libgcm graphics library, which is native to the console. You can't get better than raw HW access.

April 3, 2014 | 01:41 PM - Posted by idiot101 (not verified)

You are right. Stupid me. Scratch the statement regarding Mantle for consoles. Information regarding Mantle's irrelevance for consoles was made available as early as October last year.

April 3, 2014 | 01:58 PM - Posted by Anonymous (not verified)

What's NVIDIA's new architecture? Why do you think Nintendo chose GCN for the Wii U, Sony chose GCN for the PS4, and Microsoft chose GCN for the Xbox One? GCN is far more advanced than what NVIDIA has. Why did these big companies choose AMD GPUs over NVIDIA's? They are the ones with the most expertise on the matter; they spend a huge amount of time researching which architecture is better, and they chose GCN. So idiot101 comes along and says NVIDIA's architecture is better... how?
And it's not for the price; consoles have been known to sell at a loss and make their profit on games. The only reason they pick an architecture is for the features and capabilities it offers, that's about it.

April 3, 2014 | 02:11 PM - Posted by idiot101 (not verified)

The performance per watt on Maxwell is brilliant. AMD needs to change the architecture to match this. All Nvidia needs to do is to drop the price of the 750 and the 750Ti by $20 and you have them flying off the shelves. They can afford to do it too but as they say, make hay while the sun shines.

AMD offered the best price for the hardware in the console. It had the advantage of being able to undercut the competitors by providing both the CPU and GPU together. It is so much easier to deal with one company you are sourcing parts from rather than multiple companies like the last generation, decreasing costs. There were too many things that were attractive to Sony and MS when they dealt with AMD.

April 3, 2014 | 02:17 PM - Posted by idiot101 (not verified)

Plus, I don't think it is AMD that is building the parts for Sony and MS. They have only sourced the tech, and the hardware has been heavily modified by the console makers to extract the best out of it.

April 3, 2014 | 04:04 PM - Posted by Anonymous (not verified)

I knew you would bring up the price; that is completely irrelevant here. These kinds of contracts are worth hundreds of millions, so the margin on the cost isn't really relevant. They are not selling to retailers, so they pick according to what they need, not what's cheaper. Consoles don't sell for margin; as I said, they often sell the console for less than the cost of making it, and the profit comes from the games. So nothing from what you said is relevant here, but you seem to have your mind set, so what can I say. GCN is still the better and more advanced architecture. AMD can rebrand any old low-end GPU and make it more efficient, but that's not what makes an architecture better. For mobiles or even mining that's a more important point, but desktops and consoles hardly make it a top-3 priority.

April 3, 2014 | 04:21 PM - Posted by idiot101 (not verified)

I am not disputing the quality of the GCN architecture, but you can only extract so much performance from it. It is dated. There are more ways to extract juice out of GCN, and one is to code closer to the metal. That is where Mantle comes in: it is trying to breathe life into a 3+ year old architecture.

And this is a nice article to explain AMD's win for console hardware. I am not disputing the technology at all.

http://www.forbes.com/sites/patrickmoorhead/2013/06/26/the-real-reasons-...

April 3, 2014 | 05:27 PM - Posted by Josh Walrath

Wait... GCN is a dated architecture?  I would argue that it is actually more forward looking than NVIDIA's Kepler and Maxwell in terms of overall performance and flexibility.

April 3, 2014 | 06:42 PM - Posted by idiot101 (not verified)

I stand corrected. GCN is good and all. It took me some time to go through all the articles about the different GPU designs (Southern Islands to Volcanic Islands and the significant changes).

Ok, it is not a new architecture/process shrink, but a redesign that is needed as a stopgap measure to counter Maxwell (or am I wrong again?). The evolution has been interesting to read about, but freaking difficult to follow for a mortal like me. I would rather stay away from this as I can barely follow the tech.

April 3, 2014 | 08:06 PM - Posted by Josh Walrath

I'm not sure what AMD's plans are for a refresh of these parts.  Maxwell does take power consumption to another level, and that will be hard for AMD to respond to.  The 20 nm planar process node is not necessarily good for GPUs due to how power and clock speed scale.  There is a reason why Intel went 22 nm Tri-Gate.

I am not entirely sure we will see a major refresh from AMD this year, as they are hanging their hats on the current R5/7/9 series of parts and Hawaii is still relatively new.  It sorta seems to me like these guys are aiming for 16 nm FinFET for their next big generation.  I still have no idea if NV will be using 20 nm for Maxwell, but early indications say they are.  I still don't know.  Maybe 28 nm will be the basis for Maxwell from here on out.

April 3, 2014 | 11:11 PM - Posted by Anonymous (not verified)

WOW, this coming from you, although I am not surprised.

ARM is even more efficient than Maxwell. Does it matter to gamers, though?

Nobody cares about Maxwell, as it can't play games except low-level stuff. See, two can play that game.

When Big Maxwell comes, it will be no different than Kepler. When you see the performance and specs of the GTX 750 Ti, it's nothing special, with every single thing cut down to bare levels.

April 3, 2014 | 08:36 PM - Posted by Anonymous (not verified)

Maxwell is not brilliant; actually, I don't see anything interesting in it worth mentioning.
The low consumption? Of course: they introduced the 750 Ti at the low end. Don't you ask yourself why they didn't release something higher end? Because the consumption you are talking about won't be that significant there. Since when does NVIDIA introduce a new architecture at the low end and then stop?

April 3, 2014 | 11:01 PM - Posted by Anonymous (not verified)

"The performance per watt on Maxwell is brilliant"

The performance per watt on ARM is even better (it annihilates NVIDIA), but who cares, because they are both so useless and slow that nobody (gamers) cares in reality.

Come back and let me know when you can play recent games at decent FPS (and smoothly). Talking about smooth: that's something NVIDIA lacks these days and something AMD is better at, much, much better.

April 4, 2014 | 07:19 AM - Posted by renz (not verified)

Wii U is not based on GCN gpu.

April 4, 2014 | 07:21 AM - Posted by renz (not verified)

Wii U is not based on GCN gpu. even the cpu is not from AMD

April 4, 2014 | 07:22 AM - Posted by renz (not verified)

Wii U is not based on GCN gpu. even the cpu is not from AMD

April 4, 2014 | 07:53 AM - Posted by Anonymous (not verified)

Man, you repeated it 3 times, and if only you were right:

http://www.nintendo.com/wiiu/fr/features/tech-specs/

GPU: AMD Radeon™-based high-definition GPU.

April 3, 2014 | 03:53 PM - Posted by Anonymous (not verified)

Nice article, Josh. You did miss some points that I found in Ryan Smith's DX12 writeup. But still, based on the 2-3 podcasts I have seen, I didn't think you could write such good articles.

April 3, 2014 | 04:13 PM - Posted by Josh Walrath

LMAO!  Not enough innuendoes in this article?

April 3, 2014 | 11:24 PM - Posted by Anonymous (not verified)

I then talked with Ritche Corpus, AMD's Software Alliances and Developer Relations Director. Corpus told me that AMD shared its work on Mantle with Microsoft "from day one" and that parts of Direct3D 12 are "very similar" to AMD's API. I asked if D3D12's development had begun before Mantle's. Corpus' answer: "Not that we know." Corpus explained that, when AMD was developing Mantle, it received no feedback from game developers that would suggest AMD was wasting its time because a similar project was underway at Microsoft. I recalled that, at AMD's APU13 event in November 2013, EA DICE's Johan Andersson expressed a desire to use Mantle "everywhere and on everything." Those are perhaps not the words I would have used if I had known D3D12 was right around the corner.

The day after the D3D12 keynote, I got on the phone with Tony Tamasi, Nvidia's Senior VP of Content and Technology. Tamasi painted a rather different picture than Corpus. He told me D3D12 had been in the works for "more than three years" (longer than Mantle) and that "everyone" had been involved in its development. As he pointed out, people from AMD, Nvidia, Intel, and even Qualcomm stood on stage at the D3D12 reveal keynote. Those four companies' logos are also featured prominently on the current landing page for the official DirectX blog.

Taken from TechReport: Who do you think is telling the truth?

April 4, 2014 | 12:00 AM - Posted by Anonymous (not verified)

Well, here is the point:
AMD is talking about a lower-level API.
NVIDIA never mentioned anything about that in "He told me D3D12 had been in the works for 'more than three years' (longer than Mantle) and that 'everyone' had been involved in its development."
To me, when I read this, it's perfectly logical: after each DX release, Microsoft keeps talking with its partners and developing the next DX. When they released DX9, right afterwards they started talking about the new DX, which would turn out to be DX11; devs and AMD asked for lower-level abstraction, which we know Microsoft didn't follow. The same when DX11 was released: talks and development started on the next DX, like 4 years ago. But does that mean they started developing a lower-level abstraction of DX? No!
You won't find any statement from NVIDIA, Intel, AMD, or Microsoft saying that. You just won't.
What NVIDIA did is use the common knowledge that yes, a next version of DX, which would turn out to be DX12, was in talks or even in development. But why do you go and assume by yourself that it came with the feature Mantle provides, the lower-level part? Could they have been developing DX12 traditionally, like DX11? Yes. That sentence is pretty much open to whatever you want it to be, traditional or close to the metal; it's up to you to believe what you want.
But believe me, if it was lower level, they would have focused on that point so hard. (How could NVIDIA let a chance like this slip to turn the tables on Mantle? Simply because they can't; it never happened the way you assumed.)
So if you are saying that Microsoft was working on a traditional version of DX for 4 years, I will agree.
But if you say that they were working on a lower-level one for 4 years, I would ask you to prove it with something other than "He told me D3D12 had been in the works for 'more than three years' (longer than Mantle) and that 'everyone' had been involved in its development."

April 14, 2014 | 06:10 PM - Posted by drbaltazar (not verified)

Xbox One capability not up to expectations? ROFL, I don't believe that for one second! Why? Where the heck is Halo? From the get-go, Halo was to be available much later. There is one part the Xbox team didn't expect: that they would need to dot the i's and cross the t's of every line of code from devs. MS assumed devs weren't just copying and pasting code? The Xbox team was wrong; they literally had to spoon-feed everything, so in the end MS had to create ALL the tools they didn't expect devs would need (since said devs were supposedly experienced). Having seen Donnybrook in action? Don't sweat it, guys; the Xbox should be good for UHD gaming once MS has shown how. Halo is likely the one that will show that.
