What you never knew you didn't know
While researching a few upcoming SD / microSD product reviews here at PC Perspective, I quickly found myself swimming in a sea of ratings and specifications. This write-up was initially meant to explain and clarify these items, but it quickly grew into a reference too large to include in every SD card article, so I have spun it off here as a standalone reference. We hope it is as useful to you as it will be to our upcoming SD card reviews.
SD card speed ratings are a bit of a mess, so I'm going to do my best to clear things up here. I'll start with classes and grades. These are specs that define the *minimum* speed a given SD card should meet when reading or writing (both directions are used for the test). As with all flash devices, the write speed tends to be the more limiting factor. Without getting into gory detail, the tests used assume mostly sequential large writes and random reads occurring at no smaller than the minimum memory unit of the card (typically 512KB). The tests match the typical use case of an SD card: writing larger files (or sequential video streams) with minimal small writes (file table updates, etc.).
In the above chart, we see speed 'Class' 2, 4, 6, and 10. The SD card spec calls out very specific requirements for these specs, but the gist of it is that an unfragmented SD card will be able to write at a minimum MB/s corresponding to its rated class (e.g. Class 6 = 6 MB/s minimum transfer speed). The workload specified is meant to represent a typical media device writing to an SD card, with buffering to account for slower FAT table updates (small writes). With higher bus speed modes (more on that later), we also get higher classes. Older cards that are not rated under this spec are referred to as 'Class 0'.
As we move higher than Class 10, we get to U1 and U3, which are referred to as UHS Speed Grades (contrary to the above table, which states 'Class') in the SD card specification. The changeover from Class to Grade has to do with speed modes, which also relate to the capacity standard of the card being used:
U1 and U3 correspond to 10 and 30 MB/s minimums, but the test conditions are slightly different for these specs (so Class 10 is not *exactly* the same as a U1 rating, even though they both equate to 10 MB/sec). Cards not performing to U1 are classified as 'Speed Grade 0'. One final note here is that a U rating also implies a UHS speed mode (see the next section).
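To keep the class and grade minimums straight, here is a minimal Python sketch of the ratings discussed above. The dictionary and function names are my own, purely illustrative; the MB/s floors are the ones the classes and grades define:

```python
# Minimum sequential write speed (MB/s) implied by each SD rating.
# Values follow the speed class / UHS speed grade definitions above;
# note Class 10 and U1 share a floor but are tested under different
# conditions and bus modes, which this simple table ignores.
MIN_WRITE_MBPS = {
    "Class 2": 2,
    "Class 4": 4,
    "Class 6": 6,
    "Class 10": 10,
    "U1": 10,   # UHS Speed Grade 1 (implies a UHS bus mode)
    "U3": 30,   # UHS Speed Grade 3
}

def meets_rating(measured_mbps: float, rating: str) -> bool:
    """Return True if a measured sequential write speed clears the
    minimum required by the given rating."""
    return measured_mbps >= MIN_WRITE_MBPS[rating]

print(meets_rating(12.5, "U1"))   # True: 12.5 MB/s clears the 10 MB/s floor
print(meets_rating(12.5, "U3"))   # False: U3 demands at least 30 MB/s
```

So a card sustaining 12.5 MB/s writes could honestly wear a U1 badge but not a U3 one, which is exactly the distinction the grades exist to communicate.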
New Components, New Approach
After 20 or so enclosure reviews over the past year and a half, and some pretty inconsistent test hardware along the way, I decided to adopt a standardized test bench for all reviews going forward. Makes sense, right? It turns out choosing the best components for a case and cooling test system was a lot more difficult than I expected going in, as special consideration had to be made for everything from form factor to noise and heat levels.
Along with the new components I will also be changing the approach to future reviews by expanding the scope of CPU cooler testing. After some debate as to the type of CPU cooler to employ I decided that a better test of an enclosure would be to use both closed-loop liquid and air cooling for every review, and provide thermal and noise results for each. For CPU cooler reviews themselves I'll be adding a "real-world" load result to the charts to offer a more realistic scenario, running a standard desktop application (in this case a video encoder) in addition to the torture-test result using Prime95.
But what about this new build? It isn't completely done but here's a quick look at the components I ended up with so far along with the rationale for each selection.
CPU – Intel Core i5-6600K ($249, Amazon.com)
The introduction of Intel’s 6th generation Skylake processors provided the excuse (er, opportunity) for an upgrade after using an AMD FX-6300 system for the last couple of enclosure reviews. After toying with the idea of the new i7-6700K, and quickly realizing it was likely overkill and (more importantly) completely unavailable for purchase at the time, I went with the more "reasonable" option in the i5. There has long been a debate as to the need for Hyper-Threading for gaming (though this may be changing with the introduction of DX12), but in any case this is still a very powerful processor, and when stressed it should produce a challenging enough thermal load to adequately test both CPU coolers and enclosures going forward.
GPU – XFX Double Dissipation Radeon R9 290X ($347, Amazon.com)
This was by far the most difficult selection. I don’t think of my own use when choosing a card for a test system like this, as it must meet a set of criteria to be a good fit for enclosure benchmarks. If I choose a card that runs very cool and with minimal noise, GPU benchmarks will be far less significant as the card won’t adequately challenge the design and thermal characteristics of the enclosure. There are certainly options that run at greater temperatures and higher noise (a reference R9 290X for example), but I didn’t want a blower-style cooler with the GPU. Why? More and more GPUs are released with some sort of large multi-fan design rather than a blower, and for enclosure testing I want to know how the case handles the extra warm air.
Noise was an important consideration, as levels from an enclosure of course vary based on the installed components. For noise measurements, a GPU cooler that has very low output at idle (or zero, as some recent cooler designs permit) allows system idle levels to depend more on case fans and airflow than on a GPU that might drown them out. (This also allows a better benchmark of CPU cooler noise, particularly with self-contained liquid coolers and audible pump noise.) And while I wanted very quiet performance at idle, at load there must be sufficient noise to measure the performance of the enclosure in this regard, though of course nothing will truly tax a design quite like a loud blower. I hope I've found a good balance here.
Subject: Editorial | October 2, 2015 - 12:41 PM | Jeremy Hellstrom
Tagged: google, chromecast, AT&T, apple tv, amd, amazon
There is more discouraging news out of AMD, as another 5% of their roughly 10,000-strong workforce, around 500 employees, will be let go by the end of 2016. That move will hurt their bottom line before the end of this year, costing $42 million in severance, benefit payouts and other costs associated with restructuring, but it should save around $60-70 million by the end of next year. This is on top of the 8% cut to their workforce which occurred earlier this year, and it shows just how deep AMD needs to cut to stay alive; unfortunately, reducing costs is not as effective as raising revenue. Before you laugh, point fingers or otherwise disparage AMD, consider for a moment a world in which Intel has absolutely no competition selling high-powered desktop and laptop parts. Do you really think the already slow product refreshes will speed up, or prices remain the same?
Consider the case of AT&T, who have claimed numerous times that they provide the best broadband service they are capable of, at the lowest price they can sustain. Yet if you live in a city which has been blessed with Google Fiber, AT&T is somehow able to afford to charge $40/month less than in a city which only has the supposed competition of Comcast or Time Warner Cable. It is interesting how the presence of Google in a market has an effect that the other two supposed competitors do not.
There is of course another way to deal with the competition and both Amazon and Apple have that one down pat. Apple removed the iFixit app that showed you the insides of your phone and had the temerity to actually show you possible ways to fix hardware issues. Today Amazon have started to kick both Apple TV and Chromecast devices off of their online store. As of today no new items can be added to the virtual inventory and as of the 29th of this month anything not sold will disappear. Apparently not enough people are choosing Amazon's Prime Video streaming and so instead of making the service compatible with Apple or Google's products, Amazon has opted to attempt to prevent, or at least hinder, the sale of those products.
The topics of competition, liquidity and other market forces are far too complex to be dealt with in a short post such as this but it is worth asking yourself; do you as a customer feel like competition is still working in your favour?
"AMD has unveiled a belt-tightening plan that the struggling chipmaker hopes will get its finances back on track to profitability."
Here is some more Tech News from around the web:
- Microsoft brings LinkedIn to Cortana, and Likes and mentions to Outlook @ The Inquirer
- Microsoft previews less buggy OneDrive for Business client @ The Register
- Toshiba CEO: Yeah, we MAY need to chop some heads @ The Register
- UK scientists create quantum cryptology world record with 'unhackable' data @ The Inquirer
Subject: Editorial, General Tech | September 29, 2015 - 03:30 PM | Jeremy Hellstrom
Tagged: trust, security, rant, microsoft, metadata, fud
Privacy of any nature when you utilize a device connected to the internet is quickly becoming a joke and not a very funny one. Just to name a few, Apple tracks your devices, Google scans every email you send, Lenovo actually has two programs to track your usage and of course there is Windows 10 and the data it collects and sends. Thankfully in some of these cases the programs which track and send your data can be disabled but the fact of the matter is that they are turned on by default.
The Inquirer hits the nail on the head with "Money is simply a by-product of data," a fact which online sites such as Amazon and Facebook have known for a while and which software and hardware providers are now figuring out. In some cases an informed choice to share personal data is made, but this is not always true. When you share to Facebook or post your Fitbit results to the web, you should be aware you are giving companies valuable data; the real question is about the data and metadata you are sharing of which you are unaware.
Should you receive compensation for the data you provide to these companies? Should you always be able to opt out of sharing and still retain use of a particular service? Perhaps the cost of utilizing that service is sharing your data instead of money? There are a lot of questions and even a lot of different uses for this data but there is certainly no one single answer to those questions.
Microsoft have been collecting data from BSoDs for decades, and Windows users have all benefited from it even though there is no opt-out for sending that data. On the other hand, is there a debt incurred towards Lenovo or other companies when you purchase a machine from them? Does the collection of usage patterns benefit Lenovo users in a similar way to the data generated by a Windows BSoD, or does the risk of this monitoring software being corrupted by others for nefarious purposes outweigh any possible benefits?
Of course this is only the tip of the iceberg: the Internet of Things is poised to become a nightmare for those who value their security, and there are numerous exploits to track your cellphone that have nothing to do with your provider. Just read through the Security tag here on PCPer for more examples, if you have a strong stomach.
Please, take some time to think about how much you value your privacy and what data you are willing to share in exchange for products and services. Integrate that concern into your purchasing decisions, social media and internet usage. Hashtags are nice, but nothing speaks as loudly as your money; never forget that.
"MICROSOFT HAS SPOKEN out about its oft-criticised privacy policies, particularly those in the newly released Windows 10, which have provoked a spike in Bacofoil sales over its data collection policies."
Here is some more Tech News from around the web:
- Microsoft preps Azure data lake flood gates for readiness @ The Register
- BlackBerry's tactical capitulation to Google buys time – and possibly a future @ The Register
- Real-Time E-Book Editing With Calibre @ Linux.com
- 3D Printing Has Evolved Two Filament Standards @ Hack a Day
- Advertisers Already Using New iPhone Text Message Exploit @ Slashdot
- Confirmed: Android 6.0 Marshmallow rollout will begin next week @ The Inquirer
- World panics, children cry, workers sigh ... Facebook.com TITSUP @ The Register
Subject: Editorial, Mobile | September 28, 2015 - 09:57 AM | Ryan Shrout
Tagged: iphone 6s, iphone, ios, google, apple, Android
PC Perspective’s Android to iPhone series explores the opinions, views and experiences of the site’s Editor in Chief, Ryan Shrout, as he moves from the Android smartphone ecosystem to the world of the iPhone and iOS. Having been entrenched in the Android smartphone market for 7+ years, he intends the editorial series less as a review of the new iPhone 6s than as an exploration of how the current smartphone market compares to each side’s expectations.
Opening and setting up a new iPhone is still an impressive experience. The unboxing process makes it feel like you are taking part in the reveal of a product worth its cost, and the included accessories are organized and presented well. Having never used an iPhone 6 or iPhone 6 Plus beyond the cursory “let me hold that”, it was immediately obvious to me that the iPhone’s build quality exceeded any of the recent Android-based smartphones I have used, including the new OnePlus 2, LG G4 and Droid Turbo. The rounded edges sparked some debate in terms of aesthetics, but they definitely make the phone FEEL slimmer than other smartphone options. The buttons were firm and responsive, though there is more noise in the click of the home button than I expected.
The setup process for the phone was pretty painless, but Ken, our production editor who has been an iPhone user every generation, did comment that the number of steps you have to go through to get to a working phone has increased quite a bit. Set up Siri, set up Touch ID, set up Wi-Fi, have you heard about iCloud? The list goes on. I did attempt to use the “Move to iOS” application from the Android Play Store on my Droid Turbo, but I was never able to get it to work – the devices kept complaining about a disconnection of some sort in their peer-to-peer network, and after about 8 tries I gave up. I’m hoping to try it again with the incoming iPhone 6s Plus next week to see if it was a temporary issue.
After getting to the iPhone 6s home screen I spent the better part of the next hour doing something that I do every time I get a new phone: installing apps. The process is painful – go to the App Store, search for the program, download it, open it, login (and try to remember login information), repeat. With the Android Play Store I do appreciate the ability to “push” application downloads to a phone from the desktop website, making it much faster to search and acquire all the software you need. Apple would definitely benefit from some version of this that doesn’t require installing iTunes.
I am a LastPass user, and one of the first changes I had to get used to was how that software works differently on Android and iOS. With my Droid Turbo I was able to give LastPass access to system levels lower than you can on iOS, so when using a third-party app like Twitter, LastPass could insert itself into the process and automatically input the username and/or password for the website or service. With the iPhone you don’t have that ability, and there was a lot of password copying and pasting to get everything set up. This is an area where the openness of the Android platform can benefit users.
That being said, the benefits of Apple’s Touch ID were immediately apparent. After going through the setup process, using my fingerprint in place of my 15+ digit Apple ID password is a huge benefit and time saver. Every time I download a new app from the App Store and simply place my thumb on the home button, I grin, knowing this is how it should be for all passwords, everywhere. I was even able to set up my primary LastPass password to utilize Touch ID, removing one of the biggest annoyances of using the password-keeping software on Android. Logging into the phone with your finger or thumb print rather than a pattern or PIN is great too. And though I know new phones like the OnePlus 2 use a fingerprint reader for this purpose, the implementation just isn’t as smooth.
My final step before leaving the office and heading for home was to download my favorite podcasts and get them set up on the phone for the drive. Rather than use the Apple Podcasts app, it was recommended that I try out Overcast, which has been solid so far. I set up the Giant Bombcast, My Brother, My Brother and Me and a couple of others, let them download on Wi-Fi and set out for home. Pairing the iPhone 6s with my Chevy Volt was as easy as any other phone, but I did notice that Bluetooth-based information being passed to the entertainment system (icons, current time stamps, etc.) was more accurate with the iPhone 6s than my Droid Turbo (starting times and time remaining worked when they previously did not). That could be a result of the podcast application itself (I used doubleTwist on Android).
On Saturday, with a bit more free time to setup the phone and get applications installed that I had previously forgotten, I did start to miss a couple of Android features. First, the lack of widgets on the iPhone home screens means the mass of icons on the iPhone 6s is much less useful than the customized screens I had on my Droid Turbo. With my Droid I had a page dedicated to social media widgets I could scroll through without opening up any specific applications. Another page included my current to-do list from Google Keep and my most current 15 items from Google Calendar, all at a glance.
I know that the top drag down menu on iOS with the Today and Notifications tabs is supposed to offer some of that functionality but the apps like Google Keep and Twitter don’t take advantage of it. And though cliché at this point, why in the hell doesn’t the Apple Weather application icon show the current temperature and weather status yet??
The second item I miss is the dedicated “back” button that Android devices have, which is universal across the entire system. Always knowing that you can move to the previous screen, or from the current app back to the home screen or whatever program you just switched from, is a great safety net that is missing in iOS. With only a single “always there” button on the phone, some software puts back functionality in the top left-hand corner while other software has an X or Close button somewhere else. I found myself constantly looking around each new app on the iPhone 6s to find out how to return to a previous screen, and sometimes I would hit the home button out of habit, which obviously isn’t going to have the intended function. Swiping from the left of the screen to the middle works with some applications, but not all.
Also, though my Droid Turbo was about the same size as the iPhone 6s, the screen makes it hard to reach the top when only using one hand. With the Android back button along the bottom of the phone, it was always within reach. Those iOS apps that put the return functionality in the top left of the screen make it much more difficult, often risking a dropped phone as you reposition it in your hand. And double tapping (not clicking) the home button and THEN reaching for the back button in any particular app just seems to take too long.
On Saturday I went camping with my family at an early Halloween event that we attend annually. This made for a great chance to test out the iPhone 6s camera, and without a doubt, it is the best phone camera I have used. The images were clear, the shutter speed was fast, and the ability to take high frame rate video or 4K video is a nice touch. I think that enough people have shown the advantages of the iPhone camera systems over almost anything else on the smartphone market, and as a user of seemingly slow and laggard Android-based phone cameras, the move to the iPhone 6s is a noticeable change. As the parent of a 3-month-old baby girl, these photos are becoming ever more important to me.
The new Live Photos feature, where essentially a few frames before and a few frames after the picture you actually took are captured (with audio included), is pretty much a gimmick but the effect is definitely eye-catching. When flipping through the camera roll you actually see a little bit of movement (someone’s face for example) which caused me to raise an eyebrow at first. It’s an interesting idea, but I’m not sure what use they will have off of the phone itself – will I be able to “play” these types of photos on my PC? Will I be able to share them to other phone users that don’t have the iPhone 6s?
Most of Sunday was spent watching football and using the iPhone 6s to monitor fantasy football and to watch football through our Wi-Fi network when I needed to leave the room for laundry. The phone was able to keep up, as you would expect, with these mostly lightweight tasks without issue. Switching between applications was quick and responsive, and despite the disadvantage that the iPhone 6s has over many Android flagship phones in terms of system memory, I never felt like the system was penalized for it.
Browsing the web through either Safari or Google Chrome did demonstrate a standard complaint about iOS: webpages reload when you come back into the browser application, even if you didn’t navigate away from the page. With Android you are able to load up a webpage and then just…leave it there, for reference later. With the iPhone 6s, even with the added memory this model ships with, a page will reload after some amount of time away from the browser app, as the operating system decides it needs that memory for another purpose.
I haven’t had a battery life crisis with the iPhone yet, but I am worried about the lack of Quick Charge or Turbo Charging support on the iPhone 6s. This was a feature I definitely fell in love with on the Droid Turbo, especially when travelling for work or going on extended outings without access to power. I’ll have to monitor whether this issue rears its head.
Speaking of power and battery life – so far I have been impressed with how the iPhone 6s has performed. As I write this editorial up at 9:30pm on Sunday night, the battery level sits at 22%. Considering I have been using the phone for frequent speed tests (6 of them today) and just general purpose performance and usability testing, I consider this a good result. I only took one 5 minute phone call but texting and picture taking was plentiful. Again, this is another area where this long-term test is going to tell the real story, but for my first impressions the thinness of the iPhone 6s hasn’t created an instant penalty for battery life.
The journey is still beginning – tomorrow is my first full work day with the iPhone 6s and I have the final installment of my summer evening golf league. Will the iPhone 6s act as my golf GPS like my Droid Turbo did? Will it make it through the full day without having to resort to car charging or using an external battery? What other features and capabilities will I love or hate in this transition? More soon!
Subject: Editorial | September 18, 2015 - 01:00 PM | Josh Walrath
Tagged: Zen, raja koduri, lisa su, Jim Keller, bulldozer, amd
2012 was a significant year for AMD. Many of the top executives left, and there were many new and exciting hires at the company. Lisa Su, who would eventually become President and CEO of AMD, was hired in January of that year. Rory Read seemed to be on a roll with many measures to turn around the company. He also convinced some big-name folks to come back to AMD from other lucrative positions. One of these rehires was Jim Keller.
Jim Keller, breakin' it down for AMD. Or doing "The Robot". Or both.
Today it was announced that Jim would be leaving AMD effective Sept. 18th. He was back at AMD for three years, and in that time he headed up the CPU group. He implemented massive changes that would result in the design of the upcoming Zen architecture. There was a full-scale ejection of the Bulldozer concept that has powered AMD processors since the FX-8150's introduction in 2011; the current Excavator core design will last through 2016, with the final product being "Bristol Ridge," expected next summer. Zen will not ship until late 2016, with the first full quarter of revenue in 2017.
Jim helped to develop the K7 and K8 processors at AMD. He was also extremely influential in the creation of the x86-64 ISA, which not only powers AMD’s parts but was also adopted by Intel after their disastrous EPIC/IA64 ISA failed to go anywhere. His past also includes work at DEC on the Alpha processors and, before returning to AMD, at Apple on the A4 and A5 SoCs.
We do not know any of the details about his leaving, and perhaps never will. AMD has released an official statement that “Jim Keller is leaving AMD to pursue other opportunities, effective September 18”. Looking at Jim’s past employment, he seems to move around a bit. Perhaps he enjoys coming into a place, turning things around, implementing some new thinking, but then becomes bored with the daily routine of management, budget, and planning.
In the near future this change will not affect AMD’s roadmaps or product lineups. We still will see Bristol Ridge as the follow-up for Godavari in Summer 2016 and the late 2016 introduction of Zen. What can be said beyond that is hard to quantify. There are a lot of smart and talented people still working at AMD and perhaps this allows someone there to step up and introduce the next generation of architectures and thinking at AMD. Everybody likes the idea of a rockstar designer coming in to shake things up, but time moves on and new people become those rockstars.
We wish Jim well on his new journey and hope that this is not a harbinger of things to come for AMD. Consumers need the competition that AMD brings to the table and we certainly hope we see them continue to release new products and stay on a schedule that will benefit both them and consumers. Perhaps he will join fellow veteran Glenn Henry at VIA/Centaur and produce the next, great X86-64 chip. Perhaps not.
Subject: Editorial | September 9, 2015 - 03:53 PM | Ryan Shrout
Tagged: raja koduri, amd
In a move of outstanding wisdom and forward thinking, AMD has made a personnel move that I can get behind and support. After forming the Radeon Technologies Group to help refocus the company on graphics, it has promoted Raja Koduri to the role of Senior Vice President and Chief Architect of that new group. While this might be a little bit of an "inside baseball" announcement to discuss, Raja is one of the few people in the industry that I have known since day one and he is an outstanding and important person in the graphics world as we know it today.
Koduri recently returned to AMD after a stint with Apple as the mobile SoC vendor's director of graphics architecture, and his return was met with immediate enthusiasm and hope for a company that continues to struggle financially.
In this new role, Koduri will no longer just be responsible for the IP of AMD graphics, adding to his responsibility the entirety of the hardware, software and business direction for Radeon products. From personal experience I can assure readers that Raja is a fantastic leader, has great instincts for what the industry needs and has seen some of AMD's most successful products through development.
This new role and new division of structure at AMD will come with a lot of responsibility, as Koduri will be responsible for finding ways to grow the Radeon brand's shrinking market share, making a play in the mobile IP space, changing the dynamic between developers and AMD, and working out how partnerships with console vendors like Microsoft and Sony make sense going forward. In many ways this is a return to the structure that made ATI so successful as a player in the GPU space, and AMD is definitely hoping this move can turn things around.
Good luck Raja!
To the Max?
Much of the PC enthusiast internet, including our comments section, has been abuzz with “Asynchronous Shader” discussion. Normally, I would explain what it is and then outline the issues that surround it, but I would like to swap that order this time. Basically, the Ashes of the Singularity benchmark utilizes Asynchronous Shaders in DirectX 12, but they disable it (by Vendor ID) for NVIDIA hardware. They say that this is because, while the driver reports compatibility, “attempting to use it was an unmitigated disaster in terms of performance and conformance”.
AMD's Robert Hallock claims that NVIDIA GPUs, including Maxwell, cannot support the feature in hardware at all, while all AMD GCN graphics cards do. NVIDIA has yet to respond to our requests for an official statement, although we haven't poked every one of our contacts yet. We will certainly update and/or follow up if we hear from them. For now though, we have no idea whether this is a hardware or software issue. Either way, it seems more than just politics.
So what is it?
Simply put, Asynchronous Shaders allow a graphics driver to cram workloads into portions of the GPU that are idle but would otherwise go unused. For instance, if a graphics task is hammering the ROPs, the driver would be able to toss an independent physics or post-processing task into the shader units alongside it. Kollock from Oxide Games used the analogy of Hyper-Threading, which allows two CPU threads to be executed on the same core at the same time, as long as it has the capacity for them.
Kollock also notes that compute is becoming more important in the graphics pipeline, and that it is possible to bypass the graphics pipeline altogether. The fixed-function bits may never go away, but at least some engines may eventually skip them entirely -- maybe even their engine, several years down the road.
But, like always, you will not get an infinite amount of performance by reducing your waste. You are always bound by the theoretical limits of your components, and you cannot optimize past that (except for obviously changing the workload itself). The interesting part is: you can measure that. You can absolutely observe how long a GPU is idle, and represent it as a percentage of a time-span (typically a frame).
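That measurement is simple arithmetic once you have the busy intervals from a profiler. Here is a hedged Python sketch (the function, the frame time, and the interval values are all made up for illustration):

```python
def idle_fraction(frame_ms: float, busy_intervals) -> float:
    """Fraction of a frame the GPU sat idle, given non-overlapping
    (start, end) busy intervals in milliseconds."""
    busy = sum(end - start for start, end in busy_intervals)
    return max(0.0, (frame_ms - busy) / frame_ms)

# A hypothetical ~60 fps frame (16.7 ms) where the shader units were
# busy for two stretches totalling 11.7 ms:
frame = 16.7
busy = [(0.0, 7.5), (9.0, 13.2)]
print(f"{idle_fraction(frame, busy):.0%} idle")  # prints "30% idle"
```

That idle percentage is the theoretical ceiling on what filling the gaps with asynchronous work could recover; real gains will be lower once memory bandwidth and scheduling overhead take their cut.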
And, of course, game developers profile GPUs from time to time...
According to Kollock, he has heard of some console developers getting up to 30% increases in performance using Asynchronous Shaders. Again, this is on console hardware, so this amount may increase or decrease on the PC. In an informal chat with a developer at Epic Games (so a massive grain of salt is required), his late-night, ballpark, “totally speculative” guesstimate was that, on the Xbox One, the GPU could theoretically accept a maximum of ~10-25% more work in Unreal Engine 4, depending on the scene. He also said that memory bandwidth gets in the way, which Asynchronous Shaders would be fighting against. It is something that they are interested in and investigating, though.
This is where I speculate on drivers. When Mantle was announced, I looked at its features and said “wow, this is everything that a high-end game developer wants, and a graphics developer absolutely does not”. From the OpenCL-like multiple GPU model taking much of the QA out of SLI and CrossFire, to the memory and resource binding management, this should make graphics drivers so much easier.
It might not be free, though. Graphics drivers might still have a bunch of games to play to make sure that work is stuffed through the GPU as tightly packed as possible. We might continue to see “Game Ready” drivers in the coming years, even though much of that burden has been shifted to the game developers. On the other hand, maybe these APIs will level the whole playing field and let all players focus on chip design and efficient ingestion of shader code. As always, painfully always, time will tell.
Subject: Editorial | August 21, 2015 - 02:28 PM | Ryan Shrout
Tagged: video, Skylake, master system, Intel, 6700k
Sometimes you get weird boxes in the mail and you just know they are going to be up to no good. This time, Intel just launched the Intel Box Master System gaming system...with COLOR!
You really need to watch the video, but if you MUST sneak a peek at what we're talking about, check out the images below!
Visit Intel at http://inte.ly/unbox
It's Basically a Function Call for GPUs
Mantle, Vulkan, and DirectX 12 all claim to reduce overhead and provide a staggering increase in “draw calls”. As mentioned in the previous editorial, the way the graphics card is loaded with tasks changes drastically in these new APIs. With DirectX 10 and earlier, applications would assign attributes to (what they are told is) the global state of the graphics card. After everything is configured and bound, one of a few “draw” functions is called, which queues the task in the graphics driver as a “draw call”.
While this suggests that just a single graphics device is to be defined, which we also mentioned in the previous article, it also implies that one thread needs to be the authority. This limitation was known about for a while, and it contributed to the meme that consoles can squeeze all the performance they have, but PCs are “too high level” for that. Microsoft tried to combat this with “Deferred Contexts” in DirectX 11. This feature allows virtual, shadow states to be loaded from secondary threads, which can be appended to the global state, whole. It was a compromise between each thread being able to create its own commands, and the legacy decision to have a single, global state for the GPU.
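The difference between the two paradigms can be caricatured in a few lines of Python. This is not any real graphics API; the class and method names are mine, and the point is only the locking structure: one shared state that serializes every submitter, versus per-thread command lists that are recorded independently and submitted whole.

```python
from threading import Lock, Thread

class GlobalStateDevice:
    """DX10-era caricature: one global state, so every draw from
    every thread must serialize on the same lock."""
    def __init__(self):
        self._lock = Lock()
        self.submitted = []

    def draw(self, call):
        with self._lock:            # the single-authority bottleneck
            self.submitted.append(call)

class CommandList:
    """Deferred/DX12-style caricature: each thread records into its
    own list with no shared state, so no lock is needed."""
    def __init__(self):
        self.commands = []

    def record(self, call):
        self.commands.append(call)

# Old model: fine, but every call funnels through one lock.
device = GlobalStateDevice()
device.draw("draw legacy.0")

# New model: four threads record in parallel, then the main thread
# appends the finished lists to the queue -- the only serialized step.
lists = [CommandList() for _ in range(4)]
threads = [
    Thread(target=lambda cl=cl, i=i: [cl.record(f"draw {i}.{n}") for n in range(3)])
    for i, cl in enumerate(lists)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
queue = [cmd for cl in lists for cmd in cl.commands]
print(len(queue))  # 12 commands recorded with zero cross-thread locking
```

Deferred Contexts in DirectX 11 sat between these two extremes: secondary threads could record, but the recorded lists still had to be merged into the one global state, which is why the gains were inconsistent.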
Some developers experienced gains, while others lost a bit. It didn't live up to expectations.
The paradigm used to load graphics cards is the problem. It doesn't make sense anymore. A developer might not want to draw a primitive with every poke of the GPU. At times, they might want to shove a workload of simple linear algebra through it, while other requests could simply be pushing memory around to set up a later task (or to read the result of a previous one). More importantly, any thread could want to do this to any graphics device.
The new graphics APIs allow developers to submit their tasks quicker and smarter, and it allows the drivers to schedule compatible tasks better, even simultaneously. In fact, the driver's job has been massively simplified altogether. When we tested 3DMark back in March, two interesting things were revealed:
- Both AMD and NVIDIA are only a two-digit percentage of draw call performance apart
- Both AMD and NVIDIA saw an order of magnitude increase in draw calls