Subject: Editorial, General Tech, Graphics Cards, Systems, Mobile | April 7, 2013 - 10:21 PM | Scott Michaud
Tagged: DirectX, DirectX 12
Microsoft DirectX is a collection of APIs that programmers typically use when developing gaming or entertainment applications. Over time it became synonymous with Direct3D, the portion that handles most graphics processing by offloading those tasks to the video card. At one point, DirectX even handled networking through DirectPlay, although that role has been filled by Games for Windows Live and other APIs since Vista.
AMD Corporate Vice President Roy Taylor was recently interviewed by the German press, "c't magazin". When asked about the future of "Never Settle" bundles, Taylor claimed that games such as Crysis 3 and Bioshock: Infinite keep their consumers happy and also keep the industry innovating.
Keep in mind that the article was translated from German, so my understanding of his argument may not be entirely accurate.
In a slight tangent, he discussed how new versions of DirectX tend to spur demand for new graphics processors with more processing power and more RAM. He has not heard anything about DirectX 12 and, in fact, does not believe there will be one. As such, he is turning to bundled games to keep the industry moving forward.
Neowin, upon seeing this interview, reached out to Microsoft who committed to future "innovation with DirectX".
This exchange has obviously sparked a lot of... polarized... online discussion. One commenter claimed that Microsoft is abandoning the PC to gain a foothold in the mobile market, where it has practically zero share, and that this is why the company is dropping DirectX.
Unfortunately, this does not make sense: DirectX would be one of Microsoft's main advantages in the mobile market. Mobile devices have access to fairly capable GPUs, which can use DirectX to draw web pages and applications much more smoothly and power-efficiently than their CPU counterparts could. If anything, DirectX would only increase in relevance if Microsoft were blindly making a play for mobile.
The major threat to DirectX is still quite far off on the horizon. At some point we might begin to see C++ AMP or OpenCL nibble away at what DirectX does best: offloading highly parallel tasks to specialized processing units.
Still, releases such as DirectX 11.1 are quite focused on back-end tweaks and adjustments. What do you think a DirectX 12 API would even do, that would not already be possible with DirectX 11?
Subject: Editorial | April 6, 2013 - 04:34 AM | Scott Michaud
Tagged: Windows 8.1, windows blue, internet explorer, Internet Explorer 11
Windows Blue, now known to be Windows 8.1, was leaked not too long ago. We reported on the release's illegitimate availability practically as soon as it happened, and we knew that Internet Explorer took the opportunity to increment its version to 11. The recent releases of Internet Explorer each made decent strides to catch the browser up to Google's Chrome and Mozilla's Firefox. Now that it has been thrown to the sharks (read: thoroughly investigated), this release looks to be just as relevant despite how closely its expected release follows Internet Explorer 10.
One of my first thoughts upon realizing that Internet Explorer 11 was an impending "thing": will it make it to Windows 7? Unfortunately, we still have no clue. Thankfully, unlike Windows RT, which disallows rendering engines other than Internet Explorer's Trident, Windows 7 still permits the installation of alternative browsers. If Internet Explorer 11 is unavailable, users can still install Firefox or Chrome.
For those who only use Internet Explorer and can upgrade to 11, you might be pleased to find WebGL support. Microsoft has been quite vocal against WebGL for quite some time, calling it a security threat when facing the wild west of the internet. Then again, to some extent, the internet is a security nightmare in itself. The question is whether WebGL can be sufficiently secured for uses such as these:
- Animation effects (I created this specific demo... not the rest)
- Gorgeous, smooth, and battery-efficient 2D games
- Likewise beautiful 3D experiences
- And of course there's a semi-realtime raytracing demo.
This, to some extent, marks a moment where Microsoft promotes a Khronos standard. With some level of irony, Apple was one of the founding members of the WebGL working group, yet Microsoft might beat Safari to default WebGL support. Of course it could not be that simple: IE11 apparently accepts WebGL shaders (the math which computes the color and position of a pixel) in IESL rather than the standard GLSL. IESL, according to the name of its registry flag, seems to be heavily based on the HLSL seen in DirectX.
I guess they just cannot let Khronos have a total victory?
SPDY also seems to be coming to IE11. SPDY, pronounced "speedy" and not an acronym, is a protocol designed to cut loading latency. Cool stuff.
Last and definitely least, IE11 is furthering its trend of pretending that it is a Mozilla Gecko-like rendering engine in its user agent string. Personally, I envision an IE logo buying a fiery-orange tail at a cosplay booth. They have been doing this for quite some time now.
Subject: Editorial, General Tech, Shows and Expos | March 27, 2013 - 03:25 AM | Scott Michaud
Tagged: battlefield, battlefield 4, GDC, GDC 13
Battlefield 4 is coming; that much has been known since Medal of Honor: Warfighter's release and its promise of beta access. Now the gameplay trailer is already here. Clocking in at just over 17 minutes, "Fishing in Baku" looks amazing from a technical standpoint.
The video has been embedded below. It is a little not-safe-for-work due to language and amputation.
Now that you finished gawking, we have gameplay to discuss. I cannot help but be disappointed with the campaign direction. Surely, the story was in planning prior to the release of Battlefield 3. Still, it seems to face the same generic-character problem which struck the last campaign.
In Battlefield 3, I really could not recognize many characters apart from the lead which made their deaths more confusing than upsetting. Normally when we claim a character is identifiable, we mean that we can relate to them. In this case, when I say the characters were not identifiable, I seriously mean that I probably could not pick them out in a police lineup.
Then again, the leaked promotional image for Battlefield 4 seems to show Blackburn at the helm, so I guess there is some hope. Slim hope, which the trailer does not contribute to. Even the end narration capped off how pointless the character interactions were, all in spite of EA's YouTube description proclaiming this to be human, dramatic, and believable.
Oh well, it went boom good.
Subject: Editorial, General Tech, Processors, Shows and Expos | March 20, 2013 - 06:26 PM | Scott Michaud
Tagged: windows rt, nvidia, GTC 2013
NVIDIA develops processors, but without an x86 license the company can only power ARM-based operating systems. When it comes to Windows, that means Windows Phone or Windows RT. The latter segment of the market has seen disappointing sales according to multiple OEMs, sales that Microsoft blames the OEMs for, but the jolly green GPU company is not crying doomsday.
NVIDIA just skimming the Surface RT, they hope.
As reported by The Verge, NVIDIA CEO Jen-Hsun Huang was optimistic that Microsoft would eventually let Windows RT blossom. He noted how Microsoft very often "gets it right" at some point when they push an initiative. And it is true, Microsoft has a history of turning around perceived disasters across a variety of devices.
They also have a history of, as they call it, "knifing the baby."
I think there is a very real fear for some that Microsoft could consider Intel's latest offerings good enough to stop pursuing ARM. Of course, the more they pursue ARM, the more their business model will rely upon the-interface-formerly-known-as-Metro and likely all of its certification politics. As such, I think it is safe to say that I am watching the industry teeter on a fence with a bear on one side and a pack of rabid dogs on the other. On the one hand, Microsoft jumping back to Intel would allow them to perpetuate the desktop and all of the openness it provides. On the other hand, even if they stick with Intel, they will likely just kill the desktop anyway for the sake of reducing user confusion and reaping the security benefits of certification. We might just have fewer processor manufacturers when they do that.
So it could be that NVIDIA is confident that Microsoft will push Windows RT, or it could be that NVIDIA is pushing Microsoft to continue to develop Windows RT. Frankly, I do not know which would be better... or more accurately, worse.
Subject: Editorial | February 27, 2013 - 02:26 PM | Scott Michaud
Tagged: Podcast Bingo, Bingo
It's Bumpday! No, no I'm not reviving that, at least not yet.
But it is Wednesday and as such we will be gathering for another fine PC Perspective Podcast. As always, if you wish to join us: head down to pcper.com/live or click on the little radio tower in the upcoming events box on the right side of your screen.
Yes, I know, there are two P's. Deal with it.
Starting this week, we have a new activity for our viewers: play along with the first official PC Perspective Podcast Bingo Card! Put us to task by marking off the appropriate square every time we mention "Arc Welding" during an episode.
Be sure to call out the spaces you mark off and whatever Bingo patterns you manage to make out of it.
Our IRC room during the podcast (it's usually active 24/7) is: irc.mibbit.net #pcper
Of course this is just for fun – but fun is fun!
Subject: Editorial, General Tech, Systems | February 26, 2013 - 08:07 PM | Scott Michaud
Tagged: ps4, unreal engine 4
Unreal Engine 4 was present at Sony's Playstation 4 press conference, but that is no surprise. Epic Games has been present at several keynotes for new console launches. Last generation, Unreal Engine 3 kicked off both Xbox 360 and PS3 with demos of Gears of War and Unreal Tournament 2007, respectively. The PS4 received a continuation of the Elemental Demo first released at the end of E3 last June.
All I could think about when I watched the demo was, "This looks pretty bad. What happened?"
If you would like to follow along at home, both demos are available on Youtube:
As you can see from the animated GIF above, particle count appears to have been hit the worst. The PS4 version has none of the particle effects around the eyes of the statue, and overall there appear to be an order of magnitude or two more particles in the PC version. Whole segments of particles are simply not rendered.
In this screenshot, downsampled to 660x355, the loss of physical detail is even more apparent. The big cluster of particles near the leg is not present in the PS4 version, and the cluster that remains is nowhere near as densely packed.
And the lighting, oh the lighting.
On the PS4, everything looks much higher contrast, without a lot of the subtle lighting information. This loss of detail is most apparent in the volcano smoke and the glow of the hammer, but it is also obvious in the character model when viewed in the video.
Despite the 8GB of RAM, some of the textures also seem to be of lower resolution. Everything appears to have much more of a plastic look to it.
Still, while computers continue to look better, at least high-end PC gaming should remain within the realm of scalability for quite some time. We have been hampered by being so far ahead of consoles that it was just not feasible to make full use of the extra power. At least that looks to be changing.
Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | February 26, 2013 - 04:19 AM | Scott Michaud
Tagged: Firefox OS, mozilla, firefox, MWC, MWC 13
Mobile World Congress is under way in Barcelona, and this year sees the official entry of a new contender: Firefox OS.
Mozilla held their keynote speech the day before the official start to the trade show. If there is anything to be learned from CES, it would be that there is an arms race to announce your product before everyone else steals media attention while still being considered a part of the trade show. By the time the trade show starts, most of the big players have already said all that they need to say.
If you have an hour to spare, you should check it out for yourself. The whole session was broadcast and recorded on Air Mozilla.
The whole concept of Firefox OS, as I understand it, is to open up web standards such that a completely functional mobile operating system can be built from them. Specific platforms do not matter; the content will all conform to a platform of standards which anyone would be able to adopt.
I grin for a different reason: should some content exist in the future that is intrinsically valuable to society, its reliance on an open-based platform will allow future platforms to carry it.
Not a lot of people realize that iOS and Windows RT disallow alternative web browsers. Sure, Google Chrome exists as an app for iOS, but it is really a re-skinned Safari. Any web browser in the Windows Store must use Trident as its rendering engine by mandate of Microsoft's certification rules. This allows the platform developer to be choosy about which standards it wishes to support. Microsoft has been very vocal against any web standard backed by Khronos, and you cannot install another browser if you run across a web application requiring one of those technologies.
When you have alternatives, such as Firefox OS, developers are prompted to try new things. The alternative platforms promote standards which generate these new applications and push the leaders to implement those standards too.
And so we creep ever-closer to total content separation from platform.
Taking an Accurate Look at SSD Write Endurance
Last year, I posted a rebuttal to a paper describing the future of flash memory as ‘bleak’. The paper went through great (and convoluted) lengths to paint a tragic picture of flash memory endurance moving forward. Yesterday a newer paper hit Slashdot – this one doing just the opposite, and going as far as to assume production flash memory handling up to 1 Million erase cycles. You’d think that since I’m constantly pushing flash memory as a viable, reliable, and super-fast successor to Hard Disks (aka 'Spinning Rust'), that I’d just sit back on this one and let it fly. After all, it helps make my argument! Well, I can’t, because if there are errors published on a topic so important to me, it’s in the interest of journalistic integrity that I must now post an equal and opposite rebuttal to this one – even if it works against my case.
First I’m going to invite you to read through the paper in question. After doing so, I’m now going to pick it apart. Unfortunately I’m crunched for time today, so I’m going to reduce my dissertation into the form of some simple bulleted points:
- Max data write speed did not take into account 8/10 encoding, meaning 6Gb/sec = 600MB/sec, not 750MB/sec.
- The flash *page* size (8KB) and block size (2MB) chosen more closely resemble those of MLC parts (not SLC – see below for why this is important).
- The paper makes no reference to Write Amplification.
Perhaps the most glaring and significant is that all of the formulas, while correct, fail to consider the most important factor when dealing with flash memory writes – Write Amplification.
Before getting into it, I'll reference the excellent graphic that Anand put in his SSD Relapse piece:
SSD controllers combine smaller writes into larger ones in an attempt to speed up the effective write speed. This falls flat once all flash blocks have been written to at least once. From that point forward, the SSD must play musical chairs with the data on each and every small write. In a bad case, a single 4KB write turns into a 2MB write. For that example, Write Amplification would be a factor of 512, meaning the flash memory is cycled at over 500x the rate calculated in the paper. Sure, that's an extreme example, but the point is that without referencing amplification at all, it is assumed to be a factor of 1, which would only be the case if you were writing nothing but whole 2MB blocks of data to the SSD. This is almost never the case, regardless of Operating System.
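To put numbers on that worst case, here is a minimal Python sketch. The 2MB / 4KB ratio works out to exactly 512, and the 100k P/E rating used below is the SLC figure referenced elsewhere in this piece, included purely for illustration; real drives mitigate amplification with TRIM and over-provisioning, so treat this as a bound, not a prediction:

```python
# Worst-case write amplification from the example above: every 4 KB
# host write forces a full 2 MB flash block to be rewritten.
host_write_kb = 4
block_kb = 2 * 1024                          # 2 MB erase block, in KB

write_amplification = block_kb / host_write_kb
print(write_amplification)                   # 512.0

# Endurance is consumed write_amplification times faster, so a rated
# P/E figure shrinks accordingly in terms of host data written.
rated_pe_cycles = 100_000                    # illustrative SLC rating
effective_host_cycles = rated_pe_cycles / write_amplification
print(effective_host_cycles)                 # 195.3125
```

In other words, any lifetime formula that multiplies capacity by rated cycles must also divide by the amplification factor, which is precisely the term the paper omits.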
After posters on Slashdot called out the author on his assumptions of rated P/E cycles, he went back and added two links to justify his figures. The problem is that the first links to a 2005 data sheet for 90nm SLC flash. Samsung’s 90nm flash was 1Gb per die (128MB). The packages were available with up to 4 dies each, and scaling up to a typical 16-chip SSD, that only gives you an 8GB SSD. Not very practical. That’s not to say 100k is an inaccurate figure for SLC endurance. It’s just a really bad reference to use is all. Here's a better one from the Flash Memory Summit a couple of years back:
The second link was a 2008 PR blast from Micron, based on their proposed pushing of the 34nm process to its limits. “One Million Write Cycles” was nothing more than a tag line for an achievement accomplished in a lab under ideal conditions. That figure was never reached in anything you could actually buy in a SATA SSD. A better reference would be from that same presentation at the Summit:
This shows larger process nodes hitting even beyond 1 million cycles (given sufficient additional error bits used for error correction), but remember it has to be something that is available and in a usable capacity to be practical for real world use, and that’s just not the case for the flash in the above chart.
At the end of the day, manufacturers must balance cost, capacity, and longevity. This forces a push towards smaller processes (for more capacity per cost), with the limit being how much endurance they are willing to give up in the process. In the end they choose based on what the customer needs. Enterprise use leans towards SLC or eMLC, as they are willing to spend more for the gain in endurance. Typical PC users get standard MLC and now even TLC, which are *good enough* for that application. It's worth noting that most SSD failures are not due to burning out all of the available flash P/E cycles. The vast majority are due to infant mortality failures of the controller or even due to buggy firmware. I've never written enough to any single consumer SSD (in normal operation) to wear out all of the flash. The closest I've come to a flash-related failure was when I had an ioDrive fail during testing by excessive heat causing a solder pad to lift on one of the flash chips.
All of this said, I’d love to see a revisit to the author’s well-structured paper – only based on the corrected assumptions I’ve outlined above. *That* is the type of paper I would reference when attempting to make *accurate* arguments for SSD endurance.
Subject: Editorial, General Tech | February 16, 2013 - 02:08 AM | Scott Michaud
Tagged: consoles, consolitis, pc gaming
If you really enjoy an Xbox or Playstation game, better hope your console does not die: it is likely that nothing else will play it. This news comes from a statement made by Blake Jorgensen, CFO of Electronic Arts. Clearly EA is a trusted partner of all console developers and not just an anonymous tipster.
You mean, Devil May Stop Crying?
I tend to rant about this point quite often. For a market so devoted to the opinion that video games are art, the market certainly does not care about its preservation as art. There is always room for consumable and even disposable entertainment, but the difference with art is that it cannot be substituted with another piece of content.
There would be a difference if someone magically replaced every copy of Schindler’s List, including the vaulted masters, with The Boy in the Striped Pajamas. I could safely assume that the vast majority of the audience for either film was not just browsing the Holocaust movie genre. I would expect the viewer was seeking out the one or the other for a specific reason.
This is incompatible with the console ecosystem by its design. The point of the platform is to be disposable, and its content is along for the ride while it lasts. Manufacturers often sell the console for less than the cost of its parts and labor; research, development, and marketing costs come on top of that regardless. The business model is to eliminate as many big fees as possible up front and then jack up the price of everything else ten bucks here and there. Over time you will not be given a bargain; over time you will give them more than they made you think you saved. They then spend this extra money keeping content exclusively under their control, not yours. Also, profits... give or take.
Again, there is always room for consumable entertainment. The consoles are designed to be very convenient, but they are neither cheap nor suitable for timeless art. Really, the only unfortunate element is how these impairments are viewed as assets, all the while examples such as this one dance around in the background, largely shrugged off without being pieced together.
As for your favorite game? Who knows, maybe you will get lucky and it will be remade on some other platform for you to purchase again. You might be lucky, it might even be available on the PC.
Subject: Editorial, General Tech | February 16, 2013 - 01:19 AM | Scott Michaud
There have been some groups opposed to the planned deal to take Dell private and buy out its publicly traded shares. It would seem that for many, a short-term payout of 25 percent over trading price is insufficient and, they believe, undervalues the company. I mean, the price is totally not derived from the value you gave it when you just finished trading stocks at 80 percent of what Dell is offering you or anything. Yes, I am making a joke: some investors were almost certainly going long on Dell. I still suspect that some are just playing hardball, hoping that a quarter-on-the-dollar raise is just a starting bid.
Buckle in; I will separate stockholders' opinions into two categories: investment firms and employees.
Ars Technica clearly had football on the mind when they wrote a very Super Bowl-themed editorial. Early in the month, Southeastern Asset Management sent a letter to Dell management expressing their intent to vote against the deal to go private. The investment firm controls 8.5 percent of Dell, which means their opinion has a fair amount of sway. A few days later, T. Rowe Price stepped up to likewise oppose the deal. This firm owns 4.4 percent of Dell, which means that combined they have roughly a 13 percent vote.
Factor in a bunch of smaller investors and you are looking at almost a fifth of the company wanting to keep it public. That combined voting power slightly overtakes the 16 percent stake owned by Michael Dell and could hamper the festivities.
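For those following along at home, here is the vote math in a quick Python sketch. The combined stake of the unnamed smaller investors is not specified in the coverage, so only the figures quoted above are tallied:

```python
# Publicly stated stakes from the coverage (percent of Dell shares).
southeastern = 8.5   # Southeastern Asset Management
t_rowe_price = 4.4   # T. Rowe Price
michael_dell = 16.0  # Michael Dell's own stake

# The two named firms alone get to roughly 13 percent.
named_opposition = southeastern + t_rowe_price
print(named_opposition)

# "Almost a fifth" implies smaller investors contribute roughly another
# 7 percentage points, which is what edges the bloc past Michael Dell.
print(named_opposition < michael_dell)   # True: the named firms alone fall short
```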
Employees, meanwhile, are upset all the same. Again according to Ars Technica's vigilant coverage, some employees were forced to sell stock acquired as part of their 401(k) at $9 per share – substantially lower than the $13.65 being offered to investors.
There are several other ways which employees get their stake in the company reduced or hampered, but I would direct you to the Ars Technica article so I do not butcher any details.
Unfortunately these sorts of practices are fairly commonplace when it comes to investment deals. It would appear as if this deal trots on common ground instead of taking the high road.
God, I hate mixed metaphors.