Microsoft Introduces Us to DX12: My Thoughts on the Industry Moving Forward

Subject: Editorial
Manufacturer: Microsoft

Taking it all the way to 12!

Microsoft has been developing DirectX for around 20 years now.  Back in the 90s, the hardware and software scene for gaming was chaotic, at best.  We had wonderful things like “SoundBlaster compatibility” and 3rd party graphics APIs such as Glide, S3D, PowerSGL, RRedline, and ATI CIF.  OpenGL was aimed more towards professional applications, and it took John Carmack and id Software, through GLQuake in early 1997, to get the ball rolling in that particular direction.  There was a distinct need for standards across audio and 3D graphics that would de-fragment the industry for developers.  DirectX was introduced with Windows 95, but the popularity of Direct3D did not really take off until DirectX 3.0 was released in late 1996.

DirectX has had some notable successes, and some notable letdowns, over the years.  DX6 provided a much needed boost to 3D graphics, while DX8 introduced the world to programmable shading.  DX9 was the longest-lived version, thanks to it being the basis for the Xbox 360 console with its extended lifespan.  DX11 added a host of features and made programming much simpler, all while improving performance over DX10.  The low points?  DX10 was pretty dismal due to the performance penalty on hardware that supported some of its advanced rendering techniques.  DirectX 7 was around for little more than a year before giving way to DX8.  DX1 and DX2?  Yeah, those were very unpopular and problematic, due to the myriad changes in a modern operating system (Win95) as compared to the DOS based world that game devs were used to.

Some four years ago, if we go by what NVIDIA has said, initial talks began on the development of DirectX 12.  DX11 was released in 2009 and has been an excellent foundation for PC games.  It is not perfect, though.  A significant amount of potential performance is still lost to a variety of factors, including a fairly inefficient hardware abstraction layer that relies upon fast single threaded performance from a CPU rather than leveraging the power of a modern multi-core/multi-thread unit.  This limits how many objects can be represented on screen, since draw submission and other operations will bottleneck even the fastest CPU thread.
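To make that bottleneck concrete, here is a toy Amdahl's-law model of per-frame CPU time when a fixed slice of the work is stuck on one thread.  The numbers are my own illustrative choices, not measured figures from any API:

```python
# Toy model of CPU-side frame cost: a serial fraction (single-threaded
# API/driver overhead) plus a parallelizable fraction (game logic, culling).
# All numbers are illustrative, not measurements of DX11 or DX12.

def frame_time_ms(serial_fraction, total_work_ms, threads):
    """Amdahl's-law estimate of per-frame CPU time on `threads` cores."""
    serial = total_work_ms * serial_fraction
    parallel = total_work_ms * (1.0 - serial_fraction) / threads
    return serial + parallel

# A DX11-style path where ~60% of CPU work is serialized barely speeds up
# past a few cores; shrinking the serial fraction helps far more.
for threads in (1, 2, 4, 8):
    print(threads,
          round(frame_time_ms(0.6, 20.0, threads), 2),   # roughly 20/16/14/13 ms
          round(frame_time_ms(0.2, 20.0, threads), 2))   # roughly 20/12/8/6 ms
```

With 60% of a 20 ms frame serialized, going from 1 to 8 cores only drops the modeled time to 13 ms; cutting the serial fraction to 20% gets the same 8 cores down to 6 ms, which is the whole pitch behind lower-level APIs.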

Since the release of DX11, there had been very little word from Microsoft about what the next generation of DX would bring.  In fact, there had been rumors that MS had put all DirectX development on the back burner.  The focus was on making the new Xbox One as competent as possible with its own version of DX11.  Now, I do not have any inside information about how the development of DX12 proceeded during these years, but I do think it is rather obvious that it was not a priority.  DX11 was doing great, there was very good support for it from the PC side, and DX9c-level support covered most mobile devices.

Things do not always go as planned.  The Xbox One has some performance issues as compared to the PS4.  Promises that Microsoft made to developers about the capabilities of the XBOne seem to have been far too optimistic for what was actually delivered.  It was likely during final development, with working silicon and the basic OS installed, that Microsoft realized they had some serious issues on their hands.  The APU used in that console contains eight cores running at lower clock speeds than even a modern, low-end PC.  With the single core/thread bias that DX11 exhibits, performance for the XBOne is not going to blow the doors off the competition.  While MS and AMD have done quite a bit of optimizing at the OS level for the XBOne, there are still many improvements to be made.  Consoles typically can be programmed much closer to the metal than a PC, due to the hardware being locked down and identical across every XBOne out there.  It is still a DirectX development platform, though, and that means DX11 is the basis for what we see.

3DMark 11 in DX11 mode shows the bias towards single thread usage.  Other threads are utilized, but not heavily.

The PS4 is in a slightly different scenario with its reliance on Sony's own graphics libraries.  On the PS3 this was PSGL, nearly identical to OpenGL but custom tailored by Sony for the PlayStation, and the PS4 carries on with Sony's own lower-level interfaces.  Sony has been able to move a bit closer to the metal than Microsoft has, because Sony is not constrained by the more general programming environment that Microsoft is with DirectX 11.  We really can only make some general guesses as to actual performance on the PS4 vs. the XBOne, but the PS4 does have a far more robust hardware implementation than what Microsoft fields.  It will be interesting to see more concrete performance figures from these consoles once more cross-platform titles become common.

Next we need to ask where AMD’s Mantle comes in.  With the launch of the R9 290X, AMD also introduced us to TrueAudio and Mantle.  We do not know what Microsoft’s internal schedule was for DirectX 12, but the feeling around the industry was that the release of Mantle helped to accelerate work on that project.  Undoubtedly, Microsoft learned of AMD’s work on Mantle by the spring of 2013 at the latest.  At that point, Microsoft began working in earnest with its partners to flesh out DX12 and get basic support for it into software and drivers, in time to announce it at GDC this year.

So did Mantle motivate Microsoft to create DX12?  No, that project has been in the works for quite some time, but I think we can safely say that Mantle and the XBOne have helped to accelerate the process beyond what Microsoft had originally planned.  This is only speculation on my part, but my gut feeling is that Microsoft was planning to wait a few years before releasing DX12, at which point they could advertise a mid-life performance boost for the XBOne on titles developed for DX12 and that console in particular.  The industry does not stand still, and AMD’s push with Mantle may have forced Microsoft’s hand more than they would have liked.  Obviously AMD and NVIDIA were not pleased at what appears to be a glacial pace for DirectX development since the release of DX11, but AMD took matters into their own hands to offer an alternative that would also accentuate their brand.  Add in Microsoft’s challenges in the mobile market (Surface, Windows Phone) and it is clear that they needed more momentum to capture a larger portion of these growing spaces.  DirectX 12 should help provide some of that momentum.

3DMark 11 in DX12 mode shows much greater multi-thread awareness, as well as lighter use of that first thread, with less CPU time consumed on it.  Less time spent means more potential frames rendered when not GPU bound.

AMD does not want to have to curate an API.  That is, quite honestly, a waste of resources for AMD in the long term.  Well-defined industry standards allow these IHVs to focus on their products rather than gathering support for a home-built API.  However, with the original timeline for DX12, AMD saw an opportunity to push these features as an exclusive for their hardware, thereby potentially increasing sales and support before the standardized DX12 is released.  This also has the added advantage for AMD of exposing developers to a lower-level, close-to-the-metal API that will benefit both the developer and AMD.  In the short term, these developers gain programming experience that will reap benefits once DX12 is released.  In the meantime, AMD gets a nice checkbox on their products for exclusive support of Mantle.

April 2, 2014 | 10:02 PM - Posted by Anonymous (not verified)

Hello Josh,
Well, I am kinda disappointed in your article. In my opinion it doesn't encompass all the possibilities, since you do not have any facts and all you have is guessing, which is not a problem, until you start making it sound like facts.
" However, with the original timeline for DX12, AMD saw an opportunity to push these features as an exclusive for their hardware, thereby potentially increasing sales and support before the standardized DX12 is released."

The other possibility, which is the most likely given the statements of AMD, DICE, and developers, and the state of the DX announcement and how far along it is, goes as follows:
AMD GPUs are good. Yes, to cut prices you sometimes get a crappy cooler, but the architecture itself is often very good, sometimes better than NVIDIA's. But NVIDIA has always been in front, because their GPUs get better drivers, and more often, fixing bugs and slightly increasing fps.
So AMD and devs asked Microsoft for years to offer a lower level API, but it never happened, so AMD decided to push the industry in this direction with Mantle:
1st, hoping for Mantle to work and force Microsoft's hand to change the standard, or 2nd, if it didn't, to push for a new standard API.
A lower level API strips NVIDIA of its major tool, drivers, and saves AMD resources and time, and AMD securing three major graphics engines to support Mantle shows that they didn't just plan on releasing it with a couple of games, just for the marketing; they were planning for the long run in case Microsoft didn't give in.
Everything AMD and Johan Andersson said during APU14 supports this theory, and the state of DX12 that Microsoft showed 6 months after Mantle's announcement also supports it: all the talk about DX12 just meant people talking about the new DX and how it should be, and does not necessarily mean they had any plans to go lower level before Mantle showed up.

April 3, 2014 | 02:09 AM - Posted by wujj123456

I agree this is more likely. I don't see why a company would spend significant resources when they know all they have to do is spend a fraction and wait.

It's also possible that, given the current competitive environment, AMD feels rushing out Mantle is helpful both to later DX12 development and to closing the performance gap, if any. However, this assumes Mantle doesn't consume a significant amount of resources, which only AMD knows.

April 3, 2014 | 01:15 PM - Posted by Josh Walrath

It is hard to say, from an outside perspective, what exactly happened in those DirectX meetings with all of the IHVs putting in their $0.02.  I know that developers and IHVs have been unhappy with the state of the abstraction layer, how much performance it sucks up, how poorly threaded it is... etc.

Something else to consider is that AMD started developing Mantle quite a while ago as well.  Perhaps talks about DX were not going to their liking, so they started their own little path.  Obviously between here and there things have changed... but yeah, what was the initial impetus for AMD to develop Mantle in the first place?  Would love to hear some more inside scoop from all sides.

BTW, it is entirely appropriate to be disappointed in my article.  Feedback is good, sharing info is better!

April 4, 2014 | 06:52 AM - Posted by Anonymous (not verified)

In a completely unrelated note, AMD's FreeSync is on the way.
VESA has validated the SCR (Specification Change Request) made by AMD in November to change DisplayPort 1.2a; the request was listed as "DP1.2a Extend MSA Tmg Para AMD SCR",
and now the change shows up in the approved list as "DP1.2a Extend MSA Tmg Param Ignore Option AMD".

So if I understood correctly, everyone who has a monitor with DisplayPort 1.2a would be able to download firmware for the monitor, and a driver or software to enable it.

April 4, 2014 | 07:05 AM - Posted by renz (not verified)

So does FreeSync work in games yet? The last time AMD made the demo, they never demoed a real game running on the demo machine.

April 4, 2014 | 07:32 AM - Posted by Anonymous (not verified)

Well, that's basically the point of it, games xD
And it works the same way G-Sync does.
The good news is that the VESA change makes it a standard, and a free one at that, so manufacturers will adopt it very fast, because it adds value for better margins.
This is not proprietary technology, so while AMD is busy with Mantle, Intel could offer support for it faster.
And I am guessing NVIDIA will just wait for devs to add the option in-game. I don't see NVIDIA offering an option to enable it through drivers when they asked people to pay $600 for a monitor for it 4 months ago.

April 3, 2014 | 03:46 AM - Posted by Anonymous (not verified)

All these PC Perspective write-ups forget or ignore what NVIDIA said themselves about DX12's lower-level abstraction: that it's optional.

April 3, 2014 | 12:44 PM - Posted by Josh Walrath

I think I did sorta cover that.  Perhaps not in so many words... but basically devs have the option of not worrying about the low level stuff and letting DX handle that functionality.

April 3, 2014 | 03:11 PM - Posted by Anonymous (not verified)

What?

DX doesn't automatically do it for them. That's crazy.

There wouldn't be a need to drop it for backwards compatibility because, as you said, it would be handled by DX magically.

Engine and game still have to be coded to take advantage of it, and NVIDIA won't push it because it's on the CPU side, which they don't have at the moment.

April 3, 2014 | 03:46 PM - Posted by Josh Walrath

Whoa, whoa, whoa... I didn't mention anything about magically doing anything.  I am looking at it from a high level when saying "DX will handle" stuff like memory addressing and transactions.  E.g., hey, the engine wants to access this texture data, but it does not need to define the addresses, so the abstraction layer does all that translation.  Not entirely sure what the rest of your statement actually refers to, sorry.
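To illustrate the kind of translation being described here, a toy sketch follows.  This is purely my own illustration of handle-to-address indirection, not DirectX's actual mechanism; the class and the fake address scheme are invented for the example:

```python
# Illustrative only: the engine refers to resources by opaque handle, and an
# abstraction layer resolves each handle to a backing "GPU address" at use
# time. In a lower-level API the application would manage placement itself.

class ResourceTable:
    def __init__(self):
        self._next = 1
        self._addr = {}          # handle -> fake GPU address

    def create_texture(self, size_bytes, base=0x1000):
        handle = self._next
        self._next += 1
        # The "driver" picks the placement; the engine never sees it.
        self._addr[handle] = base + size_bytes * handle
        return handle

    def resolve(self, handle):
        # The translation step the abstraction layer performs on every access.
        return self._addr[handle]

table = ResourceTable()
tex = table.create_texture(4096)
# The engine just says "use texture `tex`"; address resolution happens here.
print(hex(table.resolve(tex)))
```

The engine only ever holds `tex`; where the data actually lives is the layer's business, which is exactly the convenience (and the overhead) being debated in this thread.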

April 3, 2014 | 07:59 AM - Posted by Ophelos

I don't see why Microsoft is even going down this road with DX12, for the simple fact that most game engines will be running on Linux. Currently Epic Games is working on getting a port of UE4 working on Linux, and I'm sure Crytek is doing the same thing with CryEngine, which will be supporting Mantle.

So unless Microsoft opens up DX12 for more platforms other than its own products, this will become useless before it gets released to the public.

April 3, 2014 | 12:46 PM - Posted by Josh Walrath

Well, the vast majority of PC gaming still happens on Windows, for better or for worse.  DX12 will be across PC, mobile, and xbox one.  I guess that is their idea of opening up to different platforms...

April 3, 2014 | 02:52 PM - Posted by Anonymous (not verified)

Well, Microsoft doesn't really have a choice. Win8 failed, the Xbox failed, and DX is being seriously threatened; what do you think they can do?
Among the consoles, the PS4 sells twice as much, and the Xbox API and tools suck in addition to the spec, so Microsoft needs somehow to attract more devs to its console. A reworked API is a must, since devs hated the current one, and better games and exclusive games need a larger player base. Unifying DX12 across desktop and Xbox allows devs a larger player base (console + PC) with easier porting; it could save the Xbox and give a boost to Windows sales. As for them expanding to Linux/Mac, not gonna happen; you don't ask a company that has a monopoly on a market to give it up.

April 3, 2014 | 08:07 AM - Posted by Anonymous (not verified)

""So did Mantle motivate Microsoft to create DX12? No, that project has been in the works for quite some time,""

Bullshit!! how did you know? how did you assume that MS is telling the truth and AMD is liying?

I'm not a direcx or Microsoft hater, but the quality of writing @ PCPER is headed downhill. Which is very sad nonetherless.

April 3, 2014 | 09:07 AM - Posted by renz (not verified)

So what about you? Do you believe MS never had any plans for DX12 before AMD went public with Mantle? Or do you simply believe neither?

April 3, 2014 | 09:46 AM - Posted by brisa117

What I think he meant was that it's completely irrational to think that Microsoft started development (from scratch) on DX12 because of Mantle. Mantle more than likely accelerated their plans for developing DX12, but you can't create a new API in six months. They clearly were working on it, at some level, prior to the Mantle announcement.

Also, don't criticize the writing on this website when half of your sentences aren't capitalized, you have way too many quotation marks around the article quote, you misspelled "DirectX", and you use the "@" symbol like the actual word "at". Do you know how much crap the writers would take here if they submitted articles like your post?

And if you want to jab at anyone's writing on this site, it should clearly be directed towards Scott Michaud. Half of his sentences don't make sense. His writing is very emotionally charged and full of weird opinions (but of course, everyone is welcome to their own opinions).

April 3, 2014 | 11:15 AM - Posted by Allyn Malventano

You guys think that Microsoft knee-jerked after Mantle launched, and somehow managed to get DX12 up and running, along with a game running under it, this quickly? Apparently you guys are not familiar with the (lack of) speed of the Microsoft development cycle...

April 3, 2014 | 01:18 PM - Posted by Anonymous (not verified)

Hello Allyn,
Well, I remember Johan Andersson saying that DICE needed to rewrite the whole of BF4 on Mantle in about 4 months (even with a month's delay, still). You can go double check, and I am talking about the whole game, playable, not a single map with a single car on a single graphics card or system in a video run whose specifications we don't even know; it's not like Forza is available on PC anytime soon. Is it possible to set up such a demo in such a short time under those conditions? Yes, I do believe so.
And I do believe they heard about Mantle way before the APU14 announcement. These are big companies; they have ways of getting info leaks and adapting beforehand. But I strongly doubt there was any kind of low level DX in development in 2012, period, and I wouldn't believe otherwise until they say it loud and clear, and explain why devs and major manufacturers were kept in the dark, especially when another party states the opposite. I would rather listen to the party that makes a clear statement and lets me come to my own conclusion.
Just show me where Microsoft stated they were working on a lower level DX, not talking about it, but developing it. As far as I know, nowhere.
Besides, Microsoft laziness isn't an option here. You are forgetting the failure of the Xbox One; they stand to lose a lot more than just an API, with all the money they spent on the console and Windows 8. They can't just keep failing. I bet Microsoft is putting everything they've got into saving the Xbox One, i.e. the API; they have nothing more important right now.

April 3, 2014 | 10:25 PM - Posted by Anonymous (not verified)

Bravo. And this is what most people are saying and thinking. It would be so funny if AMD leaked the info to a third party and made NVIDIA and the tech sites look silly, as usual.

April 4, 2014 | 08:30 AM - Posted by Allyn Malventano

...and in your example, that gives Microsoft 2 months to come up with a new API, as well as implementing it sufficiently for the game developer to start coding for it. It only seems less realistic in your case.

Thinking about these timelines, consider that AMD was more than likely in some of these supposed early meetings with Microsoft and NVIDIA. What if, hypothetically, Microsoft proposed the idea first, or it came up at a group meeting among the three? How does that make AMD look for coming up with their own API that does something very similar, which, aside from making them look good for a few months, could arguably fragment the codebases unnecessarily? Windows developers might have been less likely to adopt Mantle if they knew DX12 was coming earlier (and gave similar or even increased benefits), and if AMD has claimed they were working with Microsoft earlier than the date they released Mantle, it makes them look even more underhanded.

April 4, 2014 | 10:18 AM - Posted by Anonymous (not verified)

Well, what if I said some other company did this: a company that doesn't own an API on the desktop, owns a console with a similar API, and ran a demo of the same game on DX11 at E3 last year?
If we knew the state of development of DX12... there are so many things that make it doable.
The person posting a couple of comments below also gave a valid theory: the game was running on DX11 last year, so they updated the renderer for DX12 this year.
Or what do you really know about the state of development of DX12? Nothing!
We know that DX12 will have an optional close-to-metal feature + traditional DX11, so I could very well rename and tweak DX11 into DX12 and call it DX12 beta development.

And Allyn, do you really believe that NVIDIA wouldn't have used this against AMD? This doesn't seem like sensitive information to be under NDA; everyone knows about lower level, everyone knows it was in the talks for over a decade. Just spill out when development started... Well, until then I still believe AMD and DICE, who gave me a clear answer. I have the right to ignore NVIDIA's vague statement open to interpretations.

April 3, 2014 | 12:49 PM - Posted by Josh Walrath

Everyone is keeping pretty tight lipped about the timeline, but there really is no way to entirely flesh out an API of this nature in the 6 months since AMD released Mantle.  NV mentioned that they started doing driver development on initial DX12 builds about a year ago.  It certainly looks to have been in the pipeline for longer than that, but I do believe that Mantle did help to push MS to accelerate their introduction of DX12.

NVIDIA, AMD, Intel, and others all work extensively with Microsoft to help define and develop DirectX... so they all know what the timelines were for this release.

April 3, 2014 | 10:29 PM - Posted by Anonymous (not verified)

NVIDIA actually said 4 years (it's on the MS and NVIDIA sites), which I don't believe for one second. Intel and AMD mentioned no time frame.

We have MS and NVIDIA saying 4 years, and we have Intel and AMD saying no time frame... Hmm, who to believe?

PS: I think NVIDIA and MS are telling ponies.

April 3, 2014 | 02:12 PM - Posted by Anonymous (not verified)

"Bullshit!! how did you know?"

Because this sort of thing isn't developed into a working state in the time frame it took them to "respond" to Mantle. The idea itself isn't new, but making it work takes a lot longer than the few months between Mantle's release and the DX12 Forza demo.

Even if Microsoft were the evil beings we want to label them, and they had this technology working but hidden away from the public all along, they have been doing low level hardware coding for their consoles for way longer than Mantle has existed and have waaay more experience doing so. Bringing it to PC as a vendor-locked option would be breaking the cease-fire on that front, which AMD isn't guaranteed to win, but it will be beneficial to PC gaming as a whole.

April 3, 2014 | 03:42 PM - Posted by Anonymous (not verified)

The Forza demo is a joke.

Microsoft showed a Forza demo at E3 on NVIDIA hardware last year. Microsoft got a lot of flak for it, and now conveniently everyone forgets.

It took them 4 man-months to update the renderer from DX11.x to DX12, and that's all they did: update the renderer for one circuit so they could demo it as a DX12 render tech demo.

There was no HUD, no different views cycled, just one 3rd person view, and the constant 60fps wasn't even constant. Throughout the demo the fps kept dropping in corners, and the biggest fps came on a straightaway.

How easily this site and people are fooled.

April 3, 2014 | 10:37 PM - Posted by Anonymous (not verified)

This site and everyone else running with the DX12 story aren't fooled whatsoever. What gave you that idea?

April 3, 2014 | 10:35 PM - Posted by Anonymous (not verified)

Exactly! And AMD came out and said no DX12... Suddenly NVIDIA, MS, and Intel went into overdrive and out pops news of DX12.

I honestly think DX12 will not have as easy a time as in the past. After all, it's just another proprietary API, which this article conveniently forgot to mention alongside Glide and Mantle etc.

Mantle will work on Windows, Linux, and most others, so I don't see the problem, but I do see problems for DX12.

April 4, 2014 | 07:13 AM - Posted by renz (not verified)

But Mantle only works on AMD GCN cards. AMD said they want to bring Mantle to Linux, but we will see how that goes. DX12 is only for the Windows platform, but it works with any vendor's hardware.

April 3, 2014 | 09:26 AM - Posted by JoeJoe (not verified)

I have managed to upgrade my entire system to Haswell and I am saving up to finally get a new GPU; I was basically waiting for the GTX 800 series to come out. I'm just very gutted that they rushed out there (for the reasons mentioned above, like Mantle and the XBOne) to announce DX12 a whole year ahead and make "news", but haven't considered that they might be hurting their sales.

IN THEORY, DX11 capable cards post-Fermi will be DX12 compliant, but you'll need newer hardware to access some features of the API... with DX12 hitting the shelves with a AAA title an entire year from now, I'm not so sure buying a Maxwell based GPU is a good idea anymore.

Holiday 2015 is gonna be a year after Maxwell comes out and probably a good time for NVIDIA to release a re-iteration of that chip (à la Kepler, 6xx-7xx series). What I'm scared of is that I'll buy the 870, for example, expecting that card to last for at least 3 years, and instead it gets obsoleted because Maxwell 2.0 is now "FULL DX12", whereas the Maxwell 1.0 I bought a year ago is "PARTIAL DX12".

I don't trust them. API upgrades used to drive card sales biannually prior to DX11; I'm sure they'll be more than happy to move back to that, and with Maxwell around the corner, I'm almost sure they're gonna eff up the customer.

Somebody needs to ask NVIDIA about this.

April 3, 2014 | 09:31 AM - Posted by JoeJoe (not verified)

***edit- prior to DX9, not DX11
