Intel and Microsoft Show DirectX 12 Demo and Benchmark

Subject: General Tech, Graphics Cards, Processors, Mobile, Shows and Expos | August 13, 2014 - 09:55 PM |
Tagged: siggraph 2014, Siggraph, microsoft, Intel, DirectX 12, directx 11, DirectX

Along with GDC Europe and Gamescom, Siggraph 2014 is going on in Vancouver, BC, and Intel brought a DirectX 12 demo to their booth. The scene, containing 50,000 asteroids, each drawn with its own draw call, was developed with both Direct3D 11 and Direct3D 12 code paths, and the two could apparently be switched while the demo was running. Intel claims to have measured both power and frame rate.
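
To picture what "each in its own draw call" means on the CPU side, here is a minimal sketch of the naive Direct3D 11 submission loop. This is my own illustration, not Intel's code: the context, constant buffer, shaders, and shared asteroid mesh are assumed to be set up elsewhere.

    // Hypothetical sketch: one draw call per asteroid, Direct3D 11 style.
    // Assumes device, shaders, and shared asteroid geometry already exist.
    #include <d3d11.h>
    #include <DirectXMath.h>
    #include <vector>

    struct Asteroid {
        DirectX::XMFLOAT4X4 world;  // per-object transform
    };

    void DrawAsteroidsNaive(ID3D11DeviceContext* ctx,
                            ID3D11Buffer* perObjectCB,  // holds one world matrix
                            const std::vector<Asteroid>& asteroids,
                            UINT indexCount)            // indices in the shared mesh
    {
        for (const Asteroid& a : asteroids) {
            // Upload this asteroid's constants, then draw. With 50,000
            // asteroids, that is 50,000 update/draw pairs per frame,
            // and nearly all of that cost is CPU-side driver work.
            ctx->UpdateSubresource(perObjectCB, 0, nullptr, &a.world, 0, 0);
            ctx->VSSetConstantBuffers(0, 1, &perObjectCB);
            ctx->DrawIndexed(indexCount, 0, 0);
        }
    }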

intel-dx12-LockedFPS.png

Variable power to hit a desired frame rate, DX11 and DX12.

The test system is a Surface Pro 3 with an Intel HD 4400 GPU. A bit of digging shows this must be the i5-based Surface Pro 3, which narrows it down to the Intel Core i5-4300U: two cores, four threads, a 1.9 GHz base clock, up to 2.9 GHz turbo, 3 MB of cache, and (of course) the Haswell architecture.

While not top-of-the-line, it is also not bottom-of-the-barrel. It is a respectable CPU.

Intel's demo on this processor shows a significant power reduction in the CPU, and even a slight decrease in GPU power, at the same target frame rate. If power is not throttled, the demo jumps from 19 FPS all the way up to a playable 33 FPS.

Intel will discuss more during a video interview tomorrow (Thursday) at 5 PM EDT.

intel-dx12-unlockedFPS-1.jpg

Maximum power in DirectX 11 mode.

For my contribution to the story, I would like to address the first comment on the MSDN article. It claims that this is just an "ideal scenario": a scene that is bottlenecked by draw calls. The thing is: that is the point. Sure, a game developer could optimize the scene by (maybe) instancing objects together, and so forth, but that is unnecessary work. Why should programmers, or worse, artists, need to spend so much of their time structuring art so that it can be batched together into fewer, bigger commands? Would it not be much easier, and all-around better, if the content could be developed as it most naturally comes together?
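
For reference, the batching the commenter is asking for looks something like the sketch below: collapse those 50,000 draws into a single instanced call by streaming the per-asteroid transforms through a second vertex buffer. Again, this is a hedged illustration with assumed setup, not anyone's actual code.

    // Hypothetical sketch of the batched alternative: one instanced draw
    // instead of 50,000 individual ones. Assumes instanceVB was filled
    // with one world matrix per asteroid and that the input layout marks
    // slot 1 as per-instance data.
    #include <d3d11.h>

    void DrawAsteroidsInstanced(ID3D11DeviceContext* ctx,
                                ID3D11Buffer* meshVB,     // shared asteroid vertices
                                ID3D11Buffer* instanceVB, // one world matrix each
                                UINT vertexStride, UINT instanceStride,
                                UINT indexCount, UINT asteroidCount)
    {
        ID3D11Buffer* buffers[2] = { meshVB, instanceVB };
        UINT strides[2] = { vertexStride, instanceStride };
        UINT offsets[2] = { 0, 0 };
        ctx->IASetVertexBuffers(0, 2, buffers, strides, offsets);

        // One call submits every asteroid; the per-call driver overhead
        // that the naive loop pays 50,000 times is paid once.
        ctx->DrawIndexedInstanced(indexCount, asteroidCount, 0, 0, 0);
    }

It is not much code in isolation, but in a real engine this restructuring ripples into tools, asset pipelines, and artist workflows, which is exactly the busywork the paragraph above objects to.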

That, of course, depends on how much performance improvement we will see from DirectX 12 compared to theoretical maximum efficiency. If pushing two workloads through a DX12 GPU takes about the same time as pushing one double-sized workload, then developers can simply implement whatever solution is most direct.

intel-dx12-unlockedFPS-2.jpg

Maximum power when switching to DirectX 12 mode.

If, on the other hand, pushing two workloads is 1000x slower than pushing a single, double-sized one, but DirectX 11 was 10,000x slower, then the improvement matters less, because developers will still need their old tricks in those situations. The closer the overhead gets to zero, the fewer occasions call for strict optimization.

If there are any DirectX 11 game developers, artists, and producers out there, we would like to hear from you. How much would a (let's say) 90% reduction in draw call latency (which is around what Mantle claims) give you, in terms of fewer required optimizations? Can you afford to solve problems "the naive way" now? Some of the time? Most of the time? Would it still be worth it to do things like object instancing and fewer, larger materials and shaders? How often?
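
To put rough numbers on that question, here is a back-of-envelope calculation with entirely made-up per-call costs (no vendor has published such figures), just to show how a 90% reduction changes the frame budget for a 50,000-draw-call scene like Intel's.

    // Back-of-envelope math with hypothetical numbers: if each draw call
    // costs the CPU ~5 microseconds of driver overhead, what does a 90%
    // reduction buy at 50,000 calls per frame?
    #include <cstdio>

    int main() {
        const double drawCalls    = 50000.0;
        const double usPerCallOld = 5.0;                // assumed DX11-ish cost
        const double usPerCallNew = usPerCallOld * 0.1; // 90% cheaper

        const double msOld = drawCalls * usPerCallOld / 1000.0; // 250 ms
        const double msNew = drawCalls * usPerCallNew / 1000.0; //  25 ms

        std::printf("before: %.0f ms/frame (%.0f FPS ceiling)\n", msOld, 1000.0 / msOld);
        std::printf("after:  %.0f ms/frame (%.0f FPS ceiling)\n", msNew, 1000.0 / msNew);
        return 0;
    }

Under those assumed costs, the same unbatched scene goes from a 4 FPS ceiling to a 40 FPS one: the difference between "must optimize" and "good enough to ship."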

NVIDIA 337.50 Driver and GeForce Experience 2.0 Released

Subject: General Tech, Graphics Cards | April 7, 2014 - 09:01 AM |
Tagged: nvidia, geforce experience, directx 11

We knew that NVIDIA had an impending driver update providing DirectX 11 performance improvements. Launched today, 337.50 does indeed claim significant performance increases over the previous 335.23 version. What was a surprise, however, is GeForce Experience 2.0. This version allows both ShadowPlay and GameStream to operate on notebooks. It also allows ShadowPlay to record, and apparently stream to Twitch, your Windows desktop (but not on notebooks), and it enables Battery Boost, discussed previously.

nvidia-shadowplay-desktop.png

Personally, I find desktop capture to be the headlining feature, although I rarely use laptops (and much less for gaming). It is especially useful for OpenGL applications, for games that run in windowed mode, and for the occasional screencast without paying for Camtasia or tinkering with CamStudio. If I were to make a critique, and of course I will, I would like the option to select which monitor gets recorded; as far as I can tell, the current behavior records only the primary monitor.

I should also mention that, in my testing, "shadow recording" is only supported when recording a fullscreen game. I'm guessing that NVIDIA believes their users would prefer not to have their desktops recorded unless they manually start, and likewise stop, the capture. It seems like it had to have been a conscious decision. It does limit the feature's usefulness for OpenGL or windowed games, however.

This driver also introduces GameStream for devices outside of your home, as discussed in the SHIELD update.

nvidia-337-sli.png

This slide shows SLI improvements, driver-to-driver, for the GTX 770 and the 780 Ti.

As for the performance boost, NVIDIA claims up to 64% faster performance in configurations with one active GPU and up to 71% faster in SLI. It will obviously vary on a game-by-game and GPU-by-GPU basis. I do not have any benchmarks, besides a few examples provided by NVIDIA, to share. That said, it is a free driver. If you have a GeForce GPU, download it. It does complicate matters if you are deciding between AMD and NVIDIA, however.

Source: NVIDIA
Subject: Mobile
Manufacturer: ARM


ARM is a company that no longer needs much of an introduction, but this was not always the case. ARM has certainly made a name for themselves among PC, tablet, and handheld consumers. Their primary source of income is licensing CPU designs as well as their ISA. While names like the Cortex A9 and Cortex A15 are fairly well known, not as many people know about the graphics IP that ARM also licenses. Mali is the product name of that graphics IP, and it encompasses an entire range of feature and performance points that third parties can license.

I was able to get a block of time with Nizar Romdhane, Head of the Mali Ecosystem at ARM, and ask a few questions about Mali, ARM's plans to address the increasingly important mobile graphics market, and how they will compete with Imagination Technologies, Intel, AMD, NVIDIA, and Qualcomm.


We would like to thank Nizar for his time, as well as Phil Hughes for facilitating this interview.  Stay tuned, as we are expecting to continue this series of interviews with other ARM employees in the near future.

Futuremark Teases 3DMark DirectX 11 With Tech Demo Video

Subject: General Tech | June 21, 2012 - 10:43 AM |
Tagged: windows 8, windows, Futuremark, directx 11, benchmarking, 3dmark

Popular benchmarking software developer Futuremark recently posted a video of its latest 3DMark tech demo. Premiering in its Windows 8 benchmarking software, the tech demo uses complex volumetric lighting with real-time scattering, tessellation, visible particles, and clouds of smoke. It also uses fluid dynamics, audio by Pedro Macedo Camacho (who also created the 3DMark 11 soundtrack), ambient occlusion, and post-processing. Whew, that’s a lot of shiny graphics!

We posted a few screenshots of the tech demo that showed up online a few weeks ago, and now it seems like the company is ready to show it off in video form. The embedded video below shows a mysterious figure walking through a small town nestled in a canyon with smoke, lava, and a flying robot to keep her company. The graphics are very detailed and the particle and fluid physics look really good. It should do a great job of stressing out your graphics cards when it comes out in the latest 3DMark.

Unfortunately, not much is known about a specific release date, or even whether it will be called 3DMark 12 (or 3DMark for Windows 8). If you are into benchmarking software, though, keep your eyes on Futuremark’s website as they release more details.

Source: Futuremark