Podcast #360 - Intel XPoint Memory, Windows 10 and DX12, FreeSync displays and more!

Subject: General Tech | July 30, 2015 - 02:45 PM |
Tagged: podcast, video, Intel, XPoint, nand, DRAM, windows 10, DirectX 12, freesync, g-sync, amd, nvidia, benq, uhd420, wasabi mango, X99, giveaway

PC Perspective Podcast #360 - 07/30/2015

Join us this week as we discuss Intel XPoint Memory, Windows 10 and DX12, FreeSync displays and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Sebastian Peak

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Manufacturer: PC Perspective

... But Is the Timing Right?

Windows 10 is about to launch and, with it, DirectX 12. Apart from the massive increase in draw calls, Explicit Multiadapter, both Linked and Unlinked, has generated a few pockets of excitement here and there. I am a bit concerned, though. People seem to treat this as a novel concept that gives game developers tools they have never had before. It really isn't. Depending on what you want to do with secondary GPUs, game developers could have used them for years. Years!

Before we talk about the cross-platform examples, we should talk about Mantle. It is the closest analog to DirectX 12 and Vulkan that we have. It served as the base specification for Vulkan, which the Khronos Group modified to use SPIR-V instead of HLSL, among other changes. Some claim that it was also the foundation of DirectX 12, which would not surprise me given what I've seen online and in the SDK. Allow me to show you how the API works.

[Image: AMD's Mantle execution model]

Mantle is an interface that mixes graphics, compute, and DMA (memory access) commands into queues. This parallelizes easily, as each CPU thread can create commands on its own, which is great for multi-core processors. Each queue, a list of commands waiting to be fed to the GPU, can be handled independently, too. An interesting side effect is that, since each device uses standard data structures, such as IEEE 754 floating-point numbers, no one cares where these queues go as long as the work is done quickly enough.

Since each queue is independent, an application can choose to manage many of them. None of these lists needs to know what is happening in any other. As such, they can be pointed at multiple, even wildly different, graphics devices. GPUs of different models and capabilities can work together, as long as they support the core of Mantle.

[Image: Microsoft BUILD 2015 slide of an Unreal Engine 4 frame under DirectX 12]

DirectX 12 and Vulkan adopted this model so their respective developers could use the functionality across vendors. Mantle did not invent the concept, however. What Mantle did was expose this architecture to graphics, which can make use of all the fixed-function hardware that is unique to GPUs. Before AMD applied it to graphics, this was already how GPU compute APIs were designed. Game developers could have spun up an OpenCL workload to process physics, audio, pathfinding, visibility, or even lighting and post-processing effects... on a secondary GPU, even one from a completely different vendor.

Windows Vista's multi-GPU bug might have gotten in the way, but it was possible on Windows 7 and, I believe, on XP too.

Read on to see a couple reasons why we are only getting this now...

Computex 2015: EVGA Builds PrecisionX 16 with DirectX 12 Support

Subject: Graphics Cards | June 1, 2015 - 10:58 AM |
Tagged: evga, precisionx, dx12, DirectX 12

Another interesting bit of news surrounding Computex and the new GTX 980 Ti comes from EVGA and its PrecisionX software. This is easily our favorite tool for overclocking and GPU monitoring, so it's great to see the company continuing to push forward with features and capabilities. EVGA is the first to add full support for DX12 with an overlay.

[Image: EVGA PrecisionX 16]

What does that mean? It means that as DX12 applications find their way out to consumers and media, we will have a tool that can measure performance and monitor GPU speeds and feeds via the PrecisionX overlay. Before this release, we were running in the dark with DX12 demos, so this is great news!

You can download the latest version over on EVGA's website!

Podcast #348 - DirectX 12, New AMD GPU News, Giveaways and more!

Subject: General Tech | May 7, 2015 - 03:17 PM |
Tagged: podcast, video, amd, Fiji, hbm, microsoft, build 2015, DirectX 12, Intel, SSD 750, freesync, gsync, Oculus, rift

PC Perspective Podcast #348 - 05/07/2015

Join us this week as we discuss DirectX 12, New AMD GPU News, Giveaways and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Manufacturer: Microsoft

DirectX 12 Has No More Secrets

The DirectX 12 API is finalized and the last of its features are known. Before the BUILD conference, the list consisted of Conservative Rasterization, Rasterizer Ordered Views, Typed UAV Load, Volume Tiled Resources, and a new Tiled Resources revision for non-volumetric content. When the GeForce GTX 980 launched, NVIDIA claimed it would be compatible with DirectX 12 features. Enthusiasts were skeptical, because Microsoft had not officially finalized the spec at the time.

Last week, Microsoft announced the last feature of the graphics API: Multiadapter.

We already knew that Multiadapter existed, at least to some extent. It is the part of the specification that allows developers to address multiple graphics adapters and split tasks between them. In DirectX 11 and earlier, secondary GPUs would remain idle unless the graphics driver sprinkled some magic fairy dust on them with SLI, CrossFire, or Hybrid CrossFire. The only other way to access this dormant hardware was by spinning up an OpenCL (or similar compute API) context on the side.

Read on to see what DirectX 12 does differently...

Square Enix Announces Deus Ex: Mankind Divided

Subject: General Tech | April 10, 2015 - 07:00 AM |
Tagged: tressfx, square enix, eidos montreal, dx12, DirectX 12, deus ex: mankind divided, deus ex

Deus Ex: Human Revolution came out in 2011 as a prequel to Ion Storm's Deus Ex and Deus Ex: Invisible War. Human Revolution was made after Warren Spector left Ion Storm and Eidos closed down the Austin, Texas developer, leaving the franchise to Eidos Montreal. By the time of Human Revolution's release, Eidos had already been purchased by the Japanese publisher Square Enix. Deus Ex was set in 2052 and Invisible War in 2072. Human Revolution, being a prequel as mentioned earlier, rewound the clock to 2027 and introduced a new main character, Adam Jensen. It explored the rise of machine-human augmentation, which formed much of the lore in the original titles.

[Image: Deus Ex: Mankind Divided artwork]

Timeline and theme established, Square Enix has just announced Deus Ex: Mankind Divided, the sequel to the prequel, with a great-looking (albeit a little bloody) trailer. It is set in 2029, just two years after the events of Human Revolution. It will be coming to the PC as well as the two most-next-gen consoles. As expected, Adam Jensen returns as the main character. Now that Square Enix and its subsidiary Eidos have spent so much building him up as a brand, it makes sense from a business perspective that they would continue to trade on that consumer recognition, although it probably means the franchise will meander less through time. I will leave it up to the reader to decide whether that's good or bad.

AMD Gaming has also tweeted out that Mankind Divided, or its PC version at the very least, will utilize both DirectX 12 and TressFX. I am curious whether TressFX has been updated to take advantage of the new API, given how important GPU compute is to the new graphics standards. No release date has been set.

Source: Square Enix

GDC 15: Intel shows 3DMark API Overhead Test at Work

Subject: Graphics Cards, Processors | March 4, 2015 - 08:46 PM |
Tagged: GDC, gdc 15, API, dx12, DirectX 12, dx11, Mantle, 3dmark, Futuremark

It's probably not a surprise to most that Futuremark is working on a new version of 3DMark around the release of DirectX 12. What might be new for you is that this version will include an API overhead test, used to evaluate how much a hardware configuration's performance changes between the Mantle, DX11, and DX12 APIs.

[Image: 3DMark API Overhead feature test]

While we don't have any results quite yet (those are pending and should arrive very soon), Intel was showing the feature test running at an event at GDC tonight. In what looks like a simple cityscape being rendered over and over, the goal is to see how many draw calls the API and hardware can sustain; in other words, how quickly the CPU can service a game engine's requests.

The test was being showcased on an Intel-powered notebook using a 5th Generation Core processor, code-named Broadwell. This points to the upcoming support for DX12 (though obviously not Mantle) that Intel's integrated GPUs will provide.

It should be very interesting to see how much of an advantage DX12 offers over DX11, even across Intel's wide range of consumer and enthusiast processors.

Ubisoft Discusses Assassin's Creed: Unity with Investors

Subject: General Tech, Graphics Cards | February 15, 2015 - 07:30 AM |
Tagged: ubisoft, DirectX 12, directx 11, assassins creed, assassin's creed, assasins creed unity

During a conference call with investors, analysts, and press, Yves Guillemot, CEO of Ubisoft, highlighted the issues with Assassin's Creed: Unity with an emphasis on the positive outcomes going forward. Their quarter itself was good, beating expectations and allowing them to raise full-year projections. As expected, they announced that a new Assassin's Creed game would be released at the end of the year based on the technology they created for Unity, with “lessons learned”.

[Image: A scene from Assassin's Creed: Unity]

Before optimization, every material on every object is at least one draw call.

Of course, there are many ways to optimize... but that effort works against future titles.

After the prepared remarks, the question period revisited the topic of Assassin's Creed: Unity: how it affected current sales, how it would affect the franchise going forward, and how Ubisoft should respond (Audio Recording - The question starts at 25:20). Yves responded that they redid “100% of the engine”, which was a tremendous undertaking. “When you do that, it's painful for all the group, and everything has to be recalibrated.” He continued: “[...] but the engine has been created, and it is going to help that brand to shine in the future. It's steps that we need to take regularly so that we can constantly innovate. Those steps are sometimes painful, but they allow us to improve the overall quality of the brand, so we think this will help the brand in the long term.”

This makes a lot of sense to me. When the issues first arose, it was speculated that the engine was pushing far too many draw calls, especially for DirectX 11 PCs. At the time, I figured that Ubisoft chose Assassin's Creed: Unity to be the first title to use its new development pipeline, focused on many simple assets rather than batching things together to minimize host-to-GPU and GPU-to-host interactions. Tens of thousands of individual tasks being sent to the GPU will choke a PC, and getting the game to run at all on DirectX 11 might have diverted resources from, or even caused, many of the glitches. Currently, a few thousand draw calls per frame is ideal, although “amazing developers” can raise the ceiling to about ten thousand.

This also means that I expect the next Assassin's Creed title to support DirectX 12, possibly even in the graphics API's launch window. If I am correct, Ubisoft has been preparing for it for a long time. Of course, it is possible that I am simply wrong, but it would align with Microsoft's expectation that the first big-budget titles using the new interface arrive in Holiday 2015, and it would be silly to have done such a big overhaul without planning to switch to DX12 as soon as possible.

Then there is the last concern: If I am correct, what should Ubisoft have done? Is it right for them to charge full price for a title that they know will have unavoidable birthing pains? Do they delay it and risk (or even accept) that it will be unprofitable, and upset fans that way? There does not seem to be a clear answer, with all outcomes being some flavor of damage control.

Source: Gamasutra

NVIDIA Event on March 3rd. Why?

Subject: Graphics Cards, Mobile, Shows and Expos | February 11, 2015 - 03:25 PM |
Tagged: Tegra X1, nvidia, mwc 15, MWC, gdc 2015, GDC, DirectX 12

On March 3rd, NVIDIA will host an event called “Made to Game”. Invitations have gone out to numerous outlets, including Android Police, who published a censored screenshot of it. This suggests that it will have something to do with the Tegra X1, especially since the date is the day after Mobile World Congress starts. Despite all of this, I think it is for something else entirely.

[Image: NVIDIA's "Made to Game" event invitation]

Image Credit: Android Police

Allow me to highlight two points. First, Tech Report claims that the event is taking place in San Francisco, which is about as far away from Barcelona, Spain as you can get. It is close to GDC, however, which also starts on March 2nd. If this was meant to align with Mobile World Congress, you ideally would not want attendees to take a 14-hour flight for a day trip.

Second, the invitation specifically says: “More than 5 years in the making, what I want to share with you will redefine the future of gaming.” Compare that to the DirectX 12 announcement blog post on March 20th of last year (2014): “Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC (2014).”

So yeah, while it might involve the Tegra X1 processor for Windows 10 on mobile devices, which is the only reason I can think of that they would want Android Police there apart from “we're inviting everyone everywhere”, I expect that this event is for DirectX 12. I assume that Microsoft will host its own event involving many partners, but I could see NVIDIA wanting to save a little something for an event of its own. What would that be? No idea.

Podcast #334 - GTX 970 Memory Issues, Samsung 840 Evo Slowdown, GTX 960 and more!

Subject: General Tech | January 29, 2015 - 02:43 PM |
Tagged: windows 10, wetbench, video, Samsung, Primochill, podcast, nvidia, microsoft, GTX 970, gtx 960, DirectX 12, 840 evo

PC Perspective Podcast #334 - 01/29/2015

Join us this week as we discuss GTX 970 Memory Issues, Samsung 840 Evo Slowdown, GTX 960 and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!