A start to proper testing
During all the commotion last week surrounding the release of a new Ashes of the Singularity DX12 benchmark, Microsoft's launch of Gears of War Ultimate Edition on the Windows Store and the company's supposed desire to merge Xbox and PC gaming, a constant source of insight for me was one Andrew Lauritzen. Andrew is a graphics guru at Intel with extensive knowledge of DirectX, rendering, engines, etc., and he has always been willing to teach and educate me on topics as they crop up. The entire DirectX 12 and Universal Windows Platform situation was definitely one such instance.
Yesterday morning Andrew pointed me to a GitHub release for a tool called PresentMon, a small sample of code written by a colleague of his that might be the beginning of a way to properly monitor the performance of DX12 games and even UWP games.
The idea is simple and its implementation even simpler: PresentMon monitors the Windows event tracing stack for Present commands and records data about them to a CSV file. Anyone familiar with the kind of ETW data you can gather will appreciate that PresentMon removes nearly all of the headache of data gathering by distilling the results down to application name/ID, Present call deltas and a bit more.
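That CSV output lends itself to simple post-processing. As a rough illustration, here is a minimal Python sketch that reads a PresentMon-style CSV and reports the average frame time and frame rate; the column name used here ("MsBetweenPresents") is an assumption based on the output described above, not a guaranteed schema.

```python
import csv
from statistics import mean

def summarize(path):
    """Read a PresentMon-style CSV and return (avg frame time in ms, FPS).

    Assumes a "MsBetweenPresents" column holding the delta between
    consecutive Present calls -- an assumption, not a documented schema.
    """
    frame_times = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frame_times.append(float(row["MsBetweenPresents"]))
    avg_ms = mean(frame_times)
    return avg_ms, 1000.0 / avg_ms
```

Because the data is just per-Present deltas, the same handful of lines works for a DX11, DX12, OpenGL or Vulkan capture alike.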
Gears of War Ultimate Edition - the debated UWP version
The "Present" method in Windows is what produces a frame and shows it to the user. PresentMon looks at the Windows events running through the system, takes note of when those present commands are received by the OS for any given application, and records the time between them. Because this tool runs at the OS level, it can capture Present data from all kinds of APIs including DX12, DX11, OpenGL, Vulkan and more. It does have limitations though - it is read only so producing an overlay on the game/application being tested isn't possible today. (Or maybe ever in the case of UWP games.)
What PresentMon offers us at this stage is an early look at a Fraps-like performance monitoring tool. Like Fraps, it watches for Present commands from Windows and records them, at a very similar point in the rendering pipeline. What is important and unique about PresentMon is that it is API independent and thus useful for all types of games and programs.
PresentMon at work
The first and obvious question for our readers is how this performance monitoring tool compares with Frame Rating, the FCAT-based capture benchmarking platform we have used on GPUs and CPUs for years now. To be honest, it's not the same and should not be considered an analog to it. Frame Rating and capture-based testing look for smoothness, dropped frames and performance at the display, while Fraps and PresentMon look at performance closer to the OS level, before the graphics driver really gets the final say in things. I am still working toward universal DX12 Frame Rating testing with exclusive fullscreen capable applications and expect that to be ready sooner rather than later. However, what PresentMon does give us is at least an early universal look at DX12 performance, including games that are locked behind the Windows Store rules.
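Even without display-level capture, present-to-present deltas can be mined for smoothness. One common approach, sketched below as a generic illustration rather than anything PresentMon itself provides, is to look at high-percentile frame times, which expose stutter that a plain average hides:

```python
import math

def percentile_frame_time(frame_times_ms, pct=99.0):
    """Return the pct-th percentile frame time (nearest-rank method).

    A 99th-percentile figure close to the average suggests smooth frame
    delivery; a much larger one indicates intermittent stutter.
    """
    ordered = sorted(frame_times_ms)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]
```

For a run that is mostly 16 ms frames with occasional 50 ms spikes, the average barely moves while the 99th percentile jumps to 50 ms, which is exactly the kind of behavior Frame Rating was built to catch at the display.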
New Components, New Approach
After 20 or so enclosure reviews over the past year and a half and some pretty inconsistent test hardware along the way, I decided to adopt a standardized test bench for all reviews going forward. Makes sense, right? Turns out choosing the best components for a case and cooling test system was a lot more difficult than I expected going in, as special consideration had to be made for everything from form factor to noise and heat levels.
Along with the new components I will also be changing the approach to future reviews by expanding the scope of CPU cooler testing. After some debate as to the type of CPU cooler to employ I decided that a better test of an enclosure would be to use both closed-loop liquid and air cooling for every review, and provide thermal and noise results for each. For CPU cooler reviews themselves I'll be adding a "real-world" load result to the charts to offer a more realistic scenario, running a standard desktop application (in this case a video encoder) in addition to the torture-test result using Prime95.
But what about this new build? It isn't completely done but here's a quick look at the components I ended up with so far along with the rationale for each selection.
CPU – Intel Core i5-6600K ($249, Amazon.com)
The introduction of Intel’s 6th generation Skylake processors provided the perfect opportunity for an upgrade after using an AMD FX-6300 system for the last couple of enclosure reviews. After toying with the idea of the new i7-6700K, and quickly realizing it was likely overkill and (more importantly) completely unavailable for purchase at the time, I went with the more "reasonable" option in the i5. There has long been a debate as to the need for Hyper-Threading in gaming (though this may be changing with the introduction of DX12), but in any case this is still a very powerful processor and, when stressed, should produce a challenging enough thermal load to adequately test both CPU coolers and enclosures going forward.
GPU – XFX Double Dissipation Radeon R9 290X ($347, Amazon.com)
This was by far the most difficult selection. I don’t think of my own use when choosing a card for a test system like this, as it must meet a set of criteria to be a good fit for enclosure benchmarks. If I choose a card that runs very cool and with minimal noise, GPU benchmarks will be far less significant as the card won’t adequately challenge the design and thermal characteristics of the enclosure. There are certainly options that run at greater temperatures and higher noise (a reference R9 290X for example), but I didn’t want a blower-style cooler with the GPU. Why? More and more GPUs are released with some sort of large multi-fan design rather than a blower, and for enclosure testing I want to know how the case handles the extra warm air.
Noise was an important consideration, as levels from an enclosure of course vary based on the installed components. With noise measurements a GPU cooler that has very low output at idle (or zero, as some recent cooler designs permit) will allow system idle levels to fall more on case fans and airflow than a GPU that might drown them out. (This would also allow a better benchmark of CPU cooler noise - particularly with self-contained liquid coolers and audible pump noise.) And while I wanted very quiet performance at idle, at load there must be sufficient noise to measure the performance of the enclosure in this regard, though of course nothing will truly tax a design quite like a loud blower. I hope I've found a good balance here.
Subject: General Tech | April 14, 2015 - 05:42 PM | Jeremy Hellstrom
Tagged: ssd, benchmarking, synthetic
[H]ard|OCP will be resuming their benchmarking of SSDs in the near future and wanted to introduce both their new contributor and his thoughts on benchmarking SSDs. These drives present several challenges when comparing performance that don't exist when benchmarking spinning rust. For instance, some controllers use compression to increase IOPS whenever possible but slow down when incompressible data is written to the drive, making it difficult to fairly compare performance against similar drives that use different compression schemes or none at all. Read through the article to see which synthetic benchmarks will remain, as well as Chris' thoughts on new tests to accurately contrast the performance of SSDs.
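The compression wrinkle is easy to demonstrate: a benchmark that writes all zeros hands a compressing controller a free win, while random data takes that shortcut away. Here is a small Python sketch of generating both kinds of test buffers (the helper names are my own illustration, not from any benchmark suite, and zlib stands in for whatever the controller actually does):

```python
import os
import zlib

def make_buffer(size, incompressible):
    """Return a test buffer of `size` bytes.

    Random bytes defeat controller-side compression; a run of repeated
    bytes invites it, inflating IOPS on compressing controllers.
    """
    return os.urandom(size) if incompressible else b"\x00" * size

def compression_ratio(buf):
    """Rough compressed-size/original-size ratio, as a sanity check
    on how compressible a given test pattern really is."""
    return len(zlib.compress(buf)) / len(buf)
```

A zero-filled buffer compresses to a fraction of a percent of its size, while a random one barely shrinks at all, which is why a credible SSD test has to state which kind of data it writes.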
"Many of our readers embrace our "real world" approach with hardware reviews. We have not published an SSD review for almost 2 years while we have been looking to revamp our SSD evaluation program. Today we wanted to give you some insight as to how we learned to stop worrying and love the real world SSD benchmark."
Here are some more Storage reviews from around the web:
- OCZ Vector 180 480GB SSD @ eTeknix
- Crucial MX200 250GB SSD @ Hardware Canucks
- Kingston HyperX Predator M.2 PCIe SSDs in RAID 0 @ The SSD Review
- Intel SSD 750 Series 1.2TB @ Legion Hardware
- 8 Facts You Never Knew About Western Digital's Hardware Encryption @ Tech ARP
- Western Digital My Passport Ultra Metal / Anniversary Edition (2 TB) @ Tech ARP
- QNAP TS-431+ @ Legion Hardware
- RaidSonic ICY BOX IB-RD3680SU3 External RAID Enclosure Review @ NikKTech
- QNAP TVS-863+ AMD Turbo vNAS Review @ Madshrimps
Subject: General Tech, Graphics Cards | May 28, 2014 - 05:49 PM | Jeremy Hellstrom
Tagged: benchmarking, 3dmark
HELSINKI, FINLAND – May 28, 2014 – Futuremark today announced 3DMark Sky Diver, a new DirectX 11 benchmark test for gaming laptops and mid-range PCs. 3DMark Sky Diver is the ideal test for benchmarking systems with mainstream DirectX 11 graphics cards, mobile GPUs, or integrated graphics. A preview trailer for the new benchmark shows a wingsuited woman skydiving into a mysterious, uncharted location. The scene is brought to life with tessellation, particles and advanced post-processing effects. Sky Diver will be shown in full at Computex from June 3-7, or find out more on the Futuremark website.
Jukka Mäkinen, Futuremark CEO said, "Some people think that 3DMark is only for high-end hardware and extreme overclocking. Yet millions of PC gamers rely on 3DMark to choose systems that best balance performance, efficiency and affordability. 3DMark Sky Diver complements our other tests by providing the ideal benchmark for gaming laptops and mainstream PCs."
3DMark - The Gamer's Benchmark for all your hardware
3DMark is the only benchmark that offers a range of tests for different classes of hardware:
- Fire Strike, for high performance gaming PCs (DirectX 11, feature level 11)
- Sky Diver, for gaming laptops and mid-range PCs (DirectX 11, feature level 11)
- Cloud Gate, for notebooks and typical home PCs (DirectX 11, feature level 10)
- Ice Storm, for tablets and entry level PCs (DirectX 11, feature level 9)
With 3DMark, you can benchmark the full performance range of modern DirectX 11 graphics hardware. Where Fire Strike is like a modern game on ultra high settings, Sky Diver is closer to a DirectX 11 game played on normal settings. This makes Sky Diver the best choice for benchmarking entry level to mid-range systems and Fire Strike the perfect benchmark for high performance gaming PCs.
See 3DMark Sky Diver in full at Computex
3DMark Sky Diver will be on display on the ASUS, MSI, GIGABYTE, Galaxy, Inno3D, and G-Skill booths at Computex, June 3-7.
S.Y. Shian, ASUS Vice President & General Manager of Notebook Business Unit said,
"We are proud to partner with Futuremark to show 3DMark Sky Diver at Computex. Sky Diver helps PC gamers choose systems that offer great performance and great value. We invite everyone to visit our stand to experience 3DMark Sky Diver on a range of new ASUS products."
Sky Diver will be released as an update for all editions of 3DMark, including the free 3DMark Basic Edition.
Subject: General Tech, Graphics Cards, Processors, Mobile | August 23, 2013 - 05:31 AM | Scott Michaud
Tagged: Futuremark, AnTuTu, benchmarking
VR-Zone tossed the bees' nest in a paint shaker and received a fairly sedate outcome.
A little background information is required. AnTuTu, a mobile benchmark developed by AnTuTu Labs, has been accused of producing inaccurate scores and of bias toward specific hardware. Leaked BayTrail-T benchmarks, surpassing our expectations of Intel's capabilities, were harshly refuted on the basis of AnTuTu's credibility. More recently, certain Samsung GPUs have allegedly been recorded self-overclocking during that benchmark but not elsewhere.
Scene from Cloud Gate, latest Windows 3DMark.
Oliver Baltuch, president of Futuremark, accepted an interview with VR-Zone to discuss business and ethics in their marketplace. Futuremark is a direct competitor to AnTuTu and a household name in the benchmarking community. Being self-proclaimed modest Finns, they did not wish to discuss whether AnTuTu was less honest than they are. Futuremark does disagree with AnTuTu's process, however, and has some suggestions for better results.
The design process for 3DMark for Android began with a 25-page specification proposal. Each vendor is given a chance to reply to that proposal, and the responses are compared. Changes to the specification must be reviewed by a committee sitting between the financial department and the engineering department.
Baltuch made the point that all of their finances for the last five years, according to Finnish law, can be reviewed for about $7 USD. Despite being a private company, the law mandates no deals can be made in secret.
On the engineering side of things, drivers are approved only if they follow specific guidelines. Unapproved results are removed from Futuremark's website and leaderboards, followed by a polite conversation with the manufacturer. Drivers are not allowed to detect the benchmark and modify settings based on that information.
Almost every benchmark they release gets negative responses from some upset vendor or vendors.
The relatively short interview wraps up with commentary on iOS benchmarks. Futuremark is nearing completion of its first iOS benchmarking app. Apple requires apps to stay at or below 60 frames per second via vsync, which unnecessarily caps benchmark scores. Working around this, Futuremark developed a method to render frames that are never displayed on screen, keeping the processors from idling once the frame rate cap is reached.
Ryan must love that idea...
This concept has, according to the interview, reached internal QA review and is expected to be released in a few weeks.
Futuremark develops benchmarks for x86 Windows, Windows RT, Android, and iOS. Scores are intended to scale linearly to their metrics and are designed to allow cross-platform performance comparisons.
Subject: General Tech | April 3, 2013 - 10:53 AM | Tim Verry
Tagged: ice storm extreme, ice storm, Futuremark, benchmarking, Android, 3dmark
Futuremark recently unveiled its latest 3DMark benchmarking suite for Android devices. Compatible with over 1,000 devices, the new 3DMark is a free benchmark that incorporates both the Ice Storm and Ice Storm Extreme tests. The benchmark was developed by Futuremark in cooperation with a number of industry companies including Broadcom, Imagination Technologies, Intel, NVIDIA, and Qualcomm. The Ice Storm Extreme test is also coming to the Windows version of 3DMark, and the tests can be used to compare benchmark scores across platforms.
Both benchmarking tests are based on OpenGL ES 2.0. Ice Storm runs through two graphical tests to stress the GPU and one physics test to measure CPU performance. The Ice Storm Extreme benchmark takes things further by bumping the resolution up to 1080p and swapping in higher quality textures and post-processing effects.
The benchmark is compatible with a number of mobile smartphones and tablets running Android 3.1 or higher. It is a free download from the Google Play store.
The iOS and Windows RT versions of 3DMark are still in development. More information can be found in the press release.
Read more about Futuremark's 3DMark benchmarking suite at PC Perspective.
Subject: Graphics Cards | February 2, 2013 - 08:29 AM | Tim Verry
Tagged: gpus, gaming, Futuremark, benchmarking, 3dmark
Futuremark, developer of the popular 3DMark and PCMark computer hardware benchmarks, has announced an official release date for the next version of 3DMark. The company has teased gamers and reviewers with screenshots and hinted that the name would no longer have the release year tacked onto the end, and now the benchmark is finally official.
The new 3DMark will come in several different flavors aimed at Windows PCs, iOS, Android, and Windows RT devices. It will continue the trend of offline benchmarking and scoring paired with a web interface where users can see detailed benchmark run analysis.
The new 3DMark benchmark will include feature tests, a DX10 benchmark called Cloud Gate, and a DX11 benchmark called Fire Strike. Once the benchmark has completed, users will be able to dig into the web interface to access charts and graphs that cover the benchmarking runs from beginning to end. The graphs will track CPU clockspeed and utilization as well as temperatures for both the processor and graphics card(s).
On the mobile side of things, 3DMark will use a graphics test called Ice Storm that is more suited to ARM SoCs with integrated graphics processors. No DX11 goodness here, obviously.
The PC version of 3DMark will be available for download on February 4, 2013 at 18:00 UTC. Unfortunately, there are no official release dates for the mobile versions. Futuremark has indicated that they will be released over the next few weeks as they are finalized.
You can find more information on the next 3DMark benchmark on the Futuremark website.
Subject: General Tech | June 21, 2012 - 02:43 PM | Tim Verry
Tagged: windows 8, windows, Futuremark, directx 11, benchmarking, 3dmark
Popular benchmarking software developer Futuremark recently posted a video of its latest 3DMark tech demo. Premiering in its Windows 8 benchmarking software, the tech demo uses complex volumetric lighting with real-time scattering, tessellation, visible particles and clouds of smoke. It also features fluid dynamics, audio by Pedro Macedo Camacho (who also created the 3DMark 11 soundtrack), ambient occlusion, and post-processing. Whew, that’s a lot of shiny graphics!
We posted a few screenshots of the tech demo that showed up online a few weeks ago, and now it seems like the company is ready to show it off in video form. The embedded video below shows a mysterious figure walking through a small town nestled in a canyon with smoke, lava, and a flying robot to keep her company. The graphics are very detailed and the particle and fluid physics look really good. It should do a great job of stressing out your graphics cards when it comes out in the latest 3DMark.
Unfortunately, not much is known as far as specific release dates, or even if it will be called 3DMark 12 (or 3DMark for Windows 8). If you are into benchmarking software though, keep your eyes on Futuremark’s website as they release more details.
Subject: General Tech | June 5, 2012 - 05:37 PM | Tim Verry
Tagged: windows 8, computex, benchmarking, 3dmark
Popular benchmarking software company Futuremark has announced on its website a new version of its 3DMark application for Windows 8 benchmarking. While not available for download (yet), the application will be used to benchmark the performance of Windows 8 machines. Currently the company is calling the software "3DMark for Windows 8", which breaks the traditional numbered naming scheme.
Not much is known about the particulars yet, but we were able to snag some screenshots from their site which may or may not be publicly available any more. Take a look below the break (there are quite a few). More information should be coming shortly as Computex 2012 marches on.
I will miss the rocket powered airship and big guy with the minigun, but I suppose these benchmarks will be fun to watch as well.
Subject: General Tech | May 13, 2011 - 03:28 PM | Jeremy Hellstrom
Tagged: pcmark, benchmarking, win7
Well, for one thing, the advanced tests have all been renamed to Entertainment, Creativity, Productivity, Computation and Storage, replacing the older benchmark names. There will be three flavours: the already widely available free edition, a $30 Advanced version and a $1000 Professional version, with the $30 version being almost identical to the free version save for the removal of advertisements. Techgage is happy that the benchmark takes less time than the previous version, as the extra time adds up after a few thousand run-throughs.
"Futuremark has launched the latest version of its popular PC benchmarking tool, PCMark, and as its "7" name suggests, it's designed exclusively for use with Windows 7. A couple of notable changes were made to both the test organization of the program, and also its pricing schemes. Join us as we take a quick look to see what's been added or refined."
Here is some more Tech News from around the web:
- Call Interception Demonstrated On New Cisco Phones @ Slashdot
- Five Clever Ways To Make Dropbox More Useful @ TechSpot
- ASUS RT-N56U Dual-band Gigabit Wireless-N Router Review @ ThinkComputers
- Windows 7 malware is camouflaged using unicode filename trickery @ The Inquirer
- Nuovoyance and Samsung demo ultra-high resolution touchscreens @ The Inquirer