Subject: Editorial | April 18, 2013 - 01:55 PM | Scott Michaud
So, news which might excite our readers: we are going to try reviewing video games.
Of course, the first thing that needs to be addressed when reviewing games is our grading system. Games are a particularly artful medium, and as such it does not entirely make sense to quantify their qualities.
The simple answer is, we will not.
Step back and consider how we review hardware: we run some benchmarks, we discuss the features in often numbing detail, and we assign an award badge to the product according to our opinion. A product could receive no award; it could receive a bronze, silver, or gold medal; and the truly extraordinary products receive an Editor's Choice Award. If you think about it, these transfer quite easily to video game reviews.
Our expectation is to apply two ratings to every review: a badge and a number.
A badge is very good at qualifying our assessment of a product whereas numerical scores are very good at quantifying a derivable value. We, collectively as PC gamers, have certain expectations for games and they usually demand more than the impressions of a typical console gamer. Simultaneously, we tend to be an afterthought for a lot of titles; yes, I am being generous even with that statement. Many games are outright broken, crippled by DRM, or otherwise demonstrate in very obvious terms that our money is somehow inferior. On the other hand, there are games which go above and beyond reasonable expectations held by PC gamers, and even some unreasonable ones, and are rarely hailed for it.
We are not able to judge the artistic qualities of a game using a numerical score, but we can judge its technical merits using a numerical rubric.
And so exists our planned review metric. The main point is that there will not be any definite rank order for each game, at least from an artistic standpoint. A game is allowed to do really well in one category and really terribly in another. If you are concerned with the game itself, keep more of an eye toward which award we gave it. If you are concerned about how well the game exists as a PC title, look at the numerical score.
There are of course caveats to this method. A viewer who looks solely at the numerical score will not know much, if anything, about the game itself. The numerical score is just a gauge for the level of effort put into the PC version.
Then again, would you expect any less from a website called "PC Perspective", which reviews products with a blend of qualitative feature discussion and strict quantitative benchmarking?
Lastly, this is not about whether a game is "better" on a PC or on a console. Developers are free to focus on whatever platform they desire. A game designed around a console and ported to the PC will still get a great score if the finished result exhibits a "great" level of care. Likewise, even if your game is PC-exclusive, do not expect us to give it a great score if it cannot alt-tab worth a damn and is wrapped in DRM which roots our system using kernel-mode drivers.
It is not particularly hard to make a great PC experience, all it takes is effort. Fortunately, that is a property that we can assign an honest grade to.
We would really like to hear your feedback on this. Drop a line in the comments below!
Subject: Editorial, General Tech, Graphics Cards | April 14, 2013 - 02:22 AM | Scott Michaud
Tagged: never settle, never settle reloaded, amd, far cry 3
So when AMD reloaded their Never Settle bundles, they left an extra round in the barrel.
Some of my favorite games were given to me in a bundle with some piece of computer hardware. You might remember from the PC Perspective game night that I am a major fan of the Unreal Tournament franchise. My first Unreal Tournament game came as a pleasant surprise when I purchased my first standalone GPU. My 166MHz Pentium computer also came bundled with Mechwarrior 2 and Wipeout.
As we discussed, AMD considers bundle offers a way to keep the software industry rolling forward. The quantity and quality of games participating in the recent Never Settle bundles certainly deserve credit where it is due. Bioshock: Infinite is a game that just about every PC gamer needs to experience, and there are about a half-dozen other great titles as part of the promotion, depending upon which card or cards you purchase.
As it turns out, AMD negotiated with Ubisoft and added Far Cry 3: Blood Dragon to their Never Settle bundle. The coolest part is that AMD will retroactively email codes for this new title to anyone who has redeemed a Never Settle: Reloaded code.
So if you have ever Reloaded your Never Settle in the past, check your email as apparently you can Never Settle your reloads again.
Subject: Editorial, General Tech, Graphics Cards, Systems, Mobile | April 7, 2013 - 10:21 PM | Scott Michaud
Tagged: DirectX, DirectX 12
Microsoft DirectX is a series of interfaces for programmers to utilize typically when designing gaming or entertainment applications. Over time it became synonymous with Direct3D, the portion which mostly handles graphics processing by offloading those tasks to the video card. At one point, DirectX even handled networking through DirectPlay although that has been handled by Games for Windows Live or other APIs since Vista.
AMD Corporate Vice President Roy Taylor was recently interviewed by the German press, "c't magazin". When asked about the future of "Never Settle" bundles, Taylor claimed that games such as Crysis 3 and Bioshock: Infinite keep their consumers happy and also keep the industry innovating.
Keep in mind, the article was translated from German so I might not be entirely accurate with my understanding of his argument.
In a slight tangent, he discussed how new versions of DirectX tend to spur demand for new graphics processors with more processing power and more RAM. He has not heard anything about DirectX 12 and, in fact, does not believe there will be one. As such, he is turning to bundled games to keep the industry moving forward.
Neowin, upon seeing this interview, reached out to Microsoft who committed to future "innovation with DirectX".
This exchange has obviously sparked a lot of... polarized... online discussion. One camp claimed that Microsoft is abandoning the PC to gain a foothold in the mobile market, of which it has practically zero share, and that this is why they are dropping DirectX.
Unfortunately this does not make sense: DirectX would be one of the main advantages Microsoft has in the mobile market. Mobile devices have access to fairly decent GPUs which can use DirectX to draw web pages and applications much more smoothly and power-efficiently than their CPU counterparts. If anything, DirectX would only increase in relevance if Microsoft were blindly making a play for mobile.
The major threat to DirectX is still quite far off on the horizon. At some point we might begin to see C++ AMP or OpenCL nibble away at what DirectX does best: offloading highly parallel tasks to specialized processing units.
Still, releases such as DirectX 11.1 are quite focused on back-end tweaks and adjustments. What do you think a DirectX 12 API would even do, that would not already be possible with DirectX 11?
Subject: Editorial | April 6, 2013 - 04:34 AM | Scott Michaud
Tagged: Windows 8.1, windows blue, internet explorer, Internet Explorer 11
Windows Blue, a.k.a. Windows 8.1, was leaked not too long ago. We reported on the release's illegitimate availability practically as soon as it happened. We knew then that Internet Explorer incremented its version to 11. The recent releases of Internet Explorer each made decent strides to catch the browser up to Google's Chrome and Mozilla's Firefox. Once thrown to the sharks (read: thoroughly investigated), this release aims to be just as relevant despite how closely it follows Internet Explorer 10.
One of my first thoughts upon realizing that Internet Explorer 11 was an impending "thing": will it make it to Windows 7? Unfortunately, we still have no clue. Thankfully, unlike Windows RT, which disallows rendering engines other than Internet Explorer's Trident, Windows 7 still lets users install alternative browsers. If Internet Explorer 11 is unavailable, they can still install Firefox or Chrome.
For those who only use Internet Explorer and can upgrade to 11, you might be pleased to find WebGL support. Microsoft has been quite vocal against WebGL for quite some time, calling it a security threat when facing the wild west of the internet. Then again, to some extent, the internet is a security nightmare in itself. The question is whether WebGL can be sufficiently secured for uses such as:
- Animation effects (I created this specific demo... not the rest)
- Gorgeous, smooth, and battery-efficient 2d games
- Likewise beautiful 3D experiences
- And of course there's a semi-realtime raytracing demo.
This, to some extent, marks a moment where Microsoft promotes a Khronos standard. With some level of irony, Apple was one of the founding members of the WebGL working group, yet Microsoft might beat Safari to default WebGL support. Of course it could not be that simple: IE11 apparently accepts WebGL shaders (the math which computes the color and position of a pixel) in IESL rather than the standard GLSL. IESL, according to the name of its registry flag, seems to be heavily based on the HLSL seen in DirectX.
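To make the dialect gap concrete, here is a rough sketch of the same solid-color pixel shader written in standard WebGL GLSL versus Direct3D-style HLSL. The HLSL-style snippet is an assumption about what IESL might resemble, based purely on its apparent HLSL heritage; the real IESL syntax has not been documented publicly.

```python
# Two versions of a trivial "paint every pixel orange" shader, held as strings.
# GLSL uses vec4 and gl_FragColor; HLSL uses float4 and output semantics.

glsl_fragment = """
precision mediump float;
void main() {
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);  // orange pixel
}
"""

hlsl_style_pixel = """
float4 main() : SV_Target {
    return float4(1.0, 0.5, 0.0, 1.0);  // orange pixel
}
"""

def looks_like_glsl(source: str) -> bool:
    """Naive dialect sniff: GLSL-specific tokens vs. HLSL-specific ones."""
    return "vec4" in source or "gl_FragColor" in source

print(looks_like_glsl(glsl_fragment))     # True
print(looks_like_glsl(hlsl_style_pixel))  # False
```

A standards-compliant WebGL page ships the first form; if IE11 only accepts the second, existing WebGL content breaks until authors ship both.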
I guess they just cannot let Khronos have a total victory?
SPDY also seems to be coming to IE11. SPDY, pronounced "speedy" and not an acronym, is a protocol designed to cut loading latency. Cool stuff.
Last and definitely least, IE11 continues its trend of pretending to be a Mozilla Gecko-like rendering engine in its user agent string. Personally, I envision an IE logo buying a fiery-orange tail at a cosplay booth. They have been doing this for quite some time now.
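For the curious, the disguise looks something like this. The string below is the widely reported IE11-on-Windows-8.1 user agent; the leaked build's exact string may differ. The point is that the historical "MSIE" token is gone and "like Gecko" is appended, which defeats naive browser sniffers:

```python
# IE11 drops "MSIE" and appends "like Gecko", so simple sniffers misfire.
ie11_ua = "Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko"

def sniff(ua: str) -> str:
    """A deliberately naive sniffer of the sort IE11's costume defeats."""
    if "MSIE" in ua:
        return "Internet Explorer"
    if "Gecko" in ua:
        return "Gecko-like"
    return "unknown"

print(sniff(ie11_ua))  # "Gecko-like" -- the cosplay works
```

This is arguably deliberate: sites that serve degraded pages to "MSIE" get tricked into sending IE11 their standards-compliant code paths instead.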
Subject: Editorial, General Tech, Shows and Expos | March 27, 2013 - 03:25 AM | Scott Michaud
Tagged: battlefield, battlefield 4, GDC, GDC 13
Battlefield 4 is coming, that has been known since Medal of Honor: Warfighter's release and its promise of beta access, but the gameplay trailer is already here. Clocking in at just over 17 minutes, "Fishing in Baku" looks amazing from a technical standpoint.
The video has been embedded below. It is a little not-safe-for-work due to language and amputation.
Now that you have finished gawking, we have gameplay to discuss. I cannot help but be disappointed with the campaign direction. Surely the story was in planning prior to the release of Battlefield 3; still, it seems to face the same generic-character problem which struck the last campaign.
In Battlefield 3, I really could not recognize many characters apart from the lead which made their deaths more confusing than upsetting. Normally when we claim a character is identifiable, we mean that we can relate to them. In this case, when I say the characters were not identifiable, I seriously mean that I probably could not pick them out in a police lineup.
Then again, the leaked promotional image for Battlefield 4 seems to show Blackburn at the helm. I guess there is some hope. Slim hope, which the trailer does not contribute to. I mean, even the end narration underscored how pointless the character interactions were. All this in spite of EA's YouTube description proclaiming this to be human, dramatic, and believable.
Oh well, it went boom good.
Subject: Editorial, General Tech, Processors, Shows and Expos | March 20, 2013 - 06:26 PM | Scott Michaud
Tagged: windows rt, nvidia, GTC 2013
NVIDIA develops processors, but without an x86 license they are only able to power ARM-based operating systems. When it comes to Windows, that means Windows Phone or Windows RT. The latter segment of the market has seen disappointing sales according to multiple OEMs, a slump which Microsoft blames on those same OEMs, but the jolly green GPU company is not crying doomsday.
NVIDIA just skimming the Surface RT, they hope.
As reported by The Verge, NVIDIA CEO Jen-Hsun Huang was optimistic that Microsoft would eventually let Windows RT blossom. He noted how Microsoft very often "gets it right" at some point when they push an initiative. And it is true, Microsoft has a history of turning around perceived disasters across a variety of devices.
They also have a history of, as they call it, "knifing the baby."
I think there is a very real fear for some that Microsoft could consider Intel's latest offerings good enough to stop pursuing ARM. Of course, the more they pursue ARM, the more their business model will rely upon the-interface-formerly-known-as-Metro and likely all of its certification politics. As such, I think it is safe to say that I am watching the industry teeter on a fence with a bear on one side and a pack of rabid dogs on the other. On the one hand, Microsoft jumping back to Intel would allow them to perpetuate the desktop and all of the openness it provides. On the other hand, even if they stick with Intel they will likely just kill the desktop anyway, for the sake of reducing user confusion and for the security benefits of certification. We might just have fewer processor manufacturers when they do.
So it could be that NVIDIA is confident that Microsoft will push Windows RT, or it could be that NVIDIA is pushing Microsoft to continue to develop Windows RT. Frankly, I do not know which would be better... or more accurately, worse.
Subject: Editorial | February 27, 2013 - 02:26 PM | Scott Michaud
Tagged: Podcast Bingo, Bingo
It's Bumpday! No, no I'm not reviving that, at least not yet.
But it is Wednesday and as such we will be gathering for another fine PC Perspective Podcast. As always, if you wish to join us: head down to pcper.com/live or click on the little radio tower in the upcoming events box on the right side of your screen.
Yes, I know, there are two P's. Deal with it.
Starting this week, we will have a new activity for our viewers to follow along with. Play along with the first official PC Perspective Podcast Bingo Card! Every time we mention “Arc Welding” during an episode, put us to task and mark off the square.
Be sure to call out the spaces you mark off and whatever Bingo patterns you manage to make out of it.
Our IRC room during the podcast (it's usually active 24/7) is: irc.mibbit.net #pcper
Of course this is just for fun – but fun is fun!
Subject: Editorial, General Tech, Systems | February 26, 2013 - 08:07 PM | Scott Michaud
Tagged: ps4, unreal engine 4
Unreal Engine 4 was present at Sony's Playstation 4 press conference, but that is no surprise. Epic Games has been present at several keynotes for new console launches. Last generation, Unreal Engine 3 kicked off both Xbox 360 and PS3 with demos of Gears of War and Unreal Tournament 2007, respectively. The PS4 received a continuation of the Elemental Demo first released at the end of E3 last June.
All I could think about when I watched the demo was, “This looks pretty bad. What happened?”
If you would like to follow along at home, both demos are available on Youtube:
As you can see from the animated GIF above, particle count appears to have been struck the worst. There appear to be an order of magnitude or two more particles in the PC version than in the PS4 version; around the eyes of the statue, the PS4 version contains no particle effects at all. Whole segments of particles are not even rendered.
In this screenshot, downsampled to 660x355, the loss of physical detail is even more apparent. The big cluster of particles near the leg is not present in the PS4 version, and the clusters that remain are nowhere near as densely packed.
And the lighting, oh the lighting.
On the PS4 everything looks much higher contrast, without a lot of the subtle lighting information. This loss of detail is most apparent with the volcano smoke and the glow of the hammer, but it is also obvious in the character model when viewed in the video.
Despite the 8GB of RAM, some of the textures also seem to be lower resolution. Everything appears to have much more of a plastic look to it.
Still, while computers look better, at least high-end PC gaming will remain within the realm of scalability for quite some time. We have been hampered by being so far ahead of consoles that it was just not feasible to make full use of the extra power. At least that is looking to change.
Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | February 26, 2013 - 04:19 AM | Scott Michaud
Tagged: Firefox OS, mozilla, firefox, MWC, MWC 13
Mobile World Congress is going on in Barcelona, and this year sees the official entry of a new contender: Firefox OS.
Mozilla held their keynote speech the day before the official start to the trade show. If there is anything to be learned from CES, it would be that there is an arms race to announce your product before everyone else steals media attention while still being considered a part of the trade show. By the time the trade show starts, most of the big players have already said all that they need to say.
If you have an hour to spare, you should check it out for yourself. The whole session was broadcast and recorded on Air Mozilla.
The whole concept of Firefox OS as I understand it is to open up web standards such that it is possible to create a completely functional mobile operating system from it. Specific platforms do not matter, the content will all conform to a platform of standards which anyone would be able to adopt.
I grin for a different reason: should some content exist in the future that is intrinsically valuable to society, its reliance on an open, standards-based platform will allow future platforms to carry it.
Not a lot of people realize that iOS and Windows RT disallow alternative web browsers. Sure, Google Chrome the app exists for iOS, but it is really a re-skinned Safari. Any web browser in the Windows Store must use Trident as its rendering engine by mandate of the certification rules. This allows the platform developer to be choosy about which standards they wish to support. Microsoft has been very vocally against any web standard backed by Khronos; you cannot install another browser if you run across a web application requiring one of those technologies.
When you have alternatives, such as Firefox OS, developers are prompted to try new things. The alternative platforms promote standards which generate these new applications and push the leaders to implement those standards too.
And so we creep ever-closer to total content separation from platform.
Subject: Editorial, General Tech | February 16, 2013 - 02:08 AM | Scott Michaud
Tagged: consoles, consolitis, pc gaming
If you really enjoy an Xbox or Playstation game, better hope your console does not die: it is likely that nothing else will play it. This news comes from a statement made by Blake Jorgensen, CFO of Electronic Arts. Clearly EA is a trusted partner of all console developers and not just an anonymous tipster.
You mean, Devil May Stop Crying?
I tend to rant about this point quite often. For a market so devoted to the opinion that video games are art, the market certainly does not care about its preservation as art. There is always room for consumable and even disposable entertainment, but the difference with art is that it cannot be substituted with another piece of content.
There would be a difference if someone magically replaced every copy of Schindler’s List, including the vaulted masters, with The Boy in the Striped Pajamas. I could safely assume that the vast majority of the audience for either film was not just browsing the Holocaust movie genre. I would expect the viewer was seeking out the one or the other for a specific reason.
This is incompatible with the console ecosystem by its design. The point of the platform is to be disposable, and its content is along for the ride while it lasts. Console makers often sell the hardware for less than their parts and labor costs; research, development, and marketing costs pile on regardless. The business model is to eliminate as many big fees as possible up front and then jack up the price of everything else ten bucks here and there. Over time you are not given a bargain; over time you give them more than they made you think you saved. They then spend this extra money keeping content exclusively under their control, not yours. Also, profits... give or take.
Again, there is always room for consumable entertainment. The consoles are designed to be very convenient, but not cheap and not suitable for timeless art. Really, the only unfortunate element is how these impairments are viewed as assets and all the while examples such as this one dance around the background largely shrugged off without being pieced together.
As for your favorite game? Who knows, maybe you will get lucky and it will be remade on some other platform for you to purchase again. You might be lucky, it might even be available on the PC.
Subject: Editorial, General Tech | February 16, 2013 - 01:19 AM | Scott Michaud
There have been some groups opposed to the planned deal to take Dell private and buy out their shares. It would seem that for many, a short-term payout of 25 percent over trading price is insufficient and, they believe, undervalues the company. I mean, the price is totally not derived from the value you gave it when you just finished trading the stock at 80 percent of what Dell is offering you, or anything. Yes, I am making a joke: some investors were almost definitely going long on Dell. I still suspect that some are just playing hardball, hoping that a quarter-on-the-dollar raise is just a starting bid.
Buckle in; I will separate stockholders' opinions into two categories: investment firms and employees.
Ars Technica clearly had football on the mind when they wrote a very Super Bowl-themed editorial. Early in the month, Southeastern Asset Management sent a letter to Dell management expressing their intent to vote against a deal to go private. The investment firm controls 8.5 percent of Dell, which gives their opinion a fair amount of sway. A few short days later, T. Rowe Price stepped up to likewise oppose the deal. This firm owns 4.4 percent of Dell, which means that combined they hold roughly a 13 percent vote.
Factor in a bunch of smaller investors and you are looking at almost a fifth of the company wanting to keep it public. That combined voting power slightly overtakes the 16 percent controlled by Michael Dell and could hamper the festivities.
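The back-of-envelope math works out like this. The two firm stakes are from the reporting above; the smaller-investor figure is an assumed number purely to illustrate how the total reaches "almost a fifth":

```python
# Voting blocs opposed to the go-private deal, in percent of Dell shares.
stakes_opposed = {"Southeastern Asset Management": 8.5, "T. Rowe Price": 4.4}

firm_total = sum(stakes_opposed.values())   # the "roughly 13 percent" vote
smaller_investors = 6.0                     # assumed figure for illustration only
opposed_total = firm_total + smaller_investors

michael_dell = 16.0                         # Michael Dell's reported control

print(round(firm_total, 1))          # 12.9
print(opposed_total > michael_dell)  # True -- the bloc overtakes his stake
```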
Employees, meanwhile, are upset all the same. Again according to Ars Technica's vigilant coverage, some employees were forced to sell stock acquired as part of their 401(k) at $9 per share – substantially lower than the $13.65 being offered to investors.
There are several other ways which employees get their stake in the company reduced or hampered, but I would direct you to the Ars Technica article so I do not butcher any details.
Unfortunately these sorts of practices are fairly commonplace when it comes to investment deals. It would appear as if this deal trots on common ground instead of taking the high road.
God, I hate mixed metaphors.
Subject: Editorial, General Tech | February 9, 2013 - 02:49 AM | Scott Michaud
Tagged: windows blue
Could the sadness Microsoft feels with their OEM partners make the whole company feel just a little Blue?
I have been thinking about this while reading the latest news from Mary Jo Foley over at ZDNet. This has not been the first time that we have mentioned the color. Blue was, and still is, a codename for the first major feature-update of Windows 8. What we learned is that now it seems that “Blue” covers much more.
As many know, Microsoft has shifted their branding into four color-coded divisions: blue is for Windows; red is for Office; green is for Xbox; and yellow has yet to be disclosed. As far as we know, the Windows division encompasses Windows Phone, Internet Explorer, official apps, and so forth. Apparently “Blue”, the codenamed update, will start Microsoft on an annual update schedule for the Windows division. This means that Internet Explorer, as well as the Mail, Calendar, and Bing apps, and other “Windows Services” such as SkyDrive and Hotmail, will shift toward the yearly timer.
As I read Mary Jo's article, I focused on a point buried late in the second act of the column:
Instead of RTMing a new version of Windows once every three or so years, and then hoping/praying OEMs can get the final bits tested and preloaded on new hardware a few months later, Microsoft is going to try to push Blue out to users far more quickly, possibly via the Windows Store, my contact said.
While I have speculated about Microsoft's desire to shift business models to a subscription service for quite some time, I have not considered OEM partners as a prominent reason. Microsoft has been wrestling with their manufacturers; that has recently been made obvious. The release of a new operating system drives users to go out and purchase new hardware. The PC industry bounces forward with software and hardware enhancements chained in lockstep to the three-year Windows cycle, even the enthusiast market to some extent.
Perhaps Microsoft is trying to let the hardware itself drive the market. Instead of pushing the industry forward in big leaps, would it be possible that Microsoft wants the hardware to evolve and a new version of Windows to be there waiting for it?
Subject: Editorial, General Tech | February 5, 2013 - 05:44 PM | Scott Michaud
Tagged: Psychonauts, Notch, Tim Schafer
You cannot knock Tim Schafer: he abides by “Shut up and take my money”.
Last year we reported on the public negotiations between the heads of Mojang and Double Fine for a potential sequel to Psychonauts. The game was supposed to take “a couple” of million to make, which was later clarified to at least $13 million USD. This prompted the famous response from Notch, “Yeah, I can do that.”
Some time later, the deal fell by the wayside.
A storm never came that day, barely a ripple brushed against his wooden canoe.
Recently Notch was on Reddit and commented on the status of the sequel. The final budget ended up being around $18 million USD, which was beyond what Notch felt comfortable investing. It was not for a lack of funds, however; Markus stated that he just did not have the time to be involved in an $18 million deal.
The biggest point I would like to make is how little damage was caused by discussing this out in the open: the game fell through, at least for the moment, and no effigies were burnt. We might be approaching a time and an industry where these sorts of discussions will not need to be performed in strict secrecy.
Congratulations to Markus Persson and Tim Schafer for being brave or eccentric enough to trust the internet. We are sorry it didn't work out, but wish you luck in the future.
Subject: Editorial, General Tech, Systems, Mobile | February 5, 2013 - 05:10 PM | Scott Michaud
Dell, dude, you're getting a Dell!
So it is official: Dell is going private. Michael Dell, CEO, as well as Silver Lake, MSD Capital, several banks, and Dell itself will buy back stock from investors at 25% above the January 11th trading price. The whole deal would be worth $24.4 billion USD.
Going private allows the company to make big shifts in their business without answering to investors on a quarterly basis. We can see how being a publicly traded company seems to hinder businesses once they grow beyond what a cash infusion can help. Even Apple finds it necessary to keep an absolutely gigantic pile of cash to play with, only recently paying dividends to investors.
Also contributing to the buyback, as heavily reported, is a $2 billion USD loan from Microsoft. While it sounds like a lot in isolation, it is only just over 8% of the whole deal. All you really can pull is that it seems like Microsoft supports Dell in their decision and is putting their money where their intentions are.
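For a sense of proportion, the loan-to-deal arithmetic from the figures above:

```python
# Microsoft's loan as a share of the full buyout, per the reported figures.
loan = 2.0    # billions USD, Microsoft's contribution
deal = 24.4   # billions USD, total value of the buyback

share = loan / deal * 100
print(round(share, 1))  # 8.2 -- "just over 8%" of the whole deal
```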
Subject: Editorial, Cases and Cooling | February 5, 2013 - 12:55 PM | Ryan Shrout
Tagged: seasonic, PSU, power supply, m12ii, giveaway, contest
We know that our readers love to win free stuff, and who can blame you? Building PCs can sometimes be a burden on your wallet and we do our best to help that by showing you the best deals and occasionally having contests like this!
Our good friends at Seasonic wanted to offer up a couple of power supplies for our community and we were obviously excited to facilitate! Here are the goods you can win:
Seasonic M12II SS-850AM 850 watt Power Supply ($130 value - Newegg Link)
Seasonic M12II SS-750AM 750 watt Power Supply ($120 value - Newegg Link)
The Seasonic M12II Bronze series has been the ever-popular power supply line of the semi-modular category. Now Seasonic extends the semi-modular series by introducing the M12II Bronze 650/750/850 to provide consumers a larger selection in the entry-level 80 PLUS Bronze certified category.
The new M12II Bronze models build in a full set of protections, including OCP, OPP, OTP, OVP, SCP, and UVP, and meet worldwide safety and environmental standards. The all-new M12II 650/750/850 units are the new leaders in the 80 PLUS Bronze category and another great addition to the Seasonic retail power supply family.
If you are looking to build a new PC or upgrade your current system, either of these two power supplies will make a great backbone for all the other components.
How do you win?
- Visit your favorite PC Perspective pages like our YouTube channel, Facebook page and Twitter account. You should subscribe, like and follow us, you know...if you want to. We'd appreciate it!
- Also, stop by the Seasonic Facebook page and give it a look - they are always posting contests and giveaways there!!
- Leave a comment here on this post telling us what you would be able to do better if your system was powered by one of these power supplies!
We'll pick a winner on Wednesday the 13th of February, so get your entries in NOW! A big thank you goes out to Seasonic for supporting PC Perspective and for supporting our loyal readers!
Subject: Editorial, General Tech | February 2, 2013 - 06:23 PM | Scott Michaud
Tagged: webkit, w3c, microsoft, internet explorer, html5
Microsoft has been doing penance for their sins against web developers of the past two decades. The company does not want developers to target specific browsers, preferring that they opt for W3C implementations of features when available.
Microsoft traditionally fought web standards, forcing developers to implement ActiveX and filters to access advanced features such as opacity. Web developers would program their websites multiple times to account for the... intricacies... of Internet Explorer when compared to virtually every other browser.
Now Google and Apple, rightfully or otherwise (respectively, trollolol), are gaining heavily in popularity. This increase leads to websites implementing features exclusively for WebKit-based browsers. Internet Explorer is no longer the browser which gets targeted for advanced effects. If there is Internet Explorer-specific code in sites, it is usually a workaround for earlier versions of the browser, which only mucks up Microsoft's recent standards compliance by feeding the browser non-standard junk.
It has been an uphill battle for Microsoft to push users to upgrade their browsers and web developers to upgrade their sites. “modern.IE” is a service which checks for typical incompatibilities and allows developers to test their sites across multiple versions of IE.
Even still, several web technologies are absent from Internet Explorer, as they have not been adopted by the W3C. WebGL and WebCL seek to make the web browser into a high-performance platform for applications. Microsoft has been vocal about not supporting these Khronos-backed technologies on the grounds of security. Instead of building out web browsers as a cross-platform application platform, Microsoft is pushing hard to keep their app marketplace from being ignored.
I am not sure which Microsoft should fear more: that their app marketplace will be smothered by their competitors, or that they only manage to win the battle after the war changes theaters. You know what they say: history repeats itself.
Subject: Editorial | January 16, 2013 - 09:41 PM | Josh Walrath
Tagged: ST Ericsson, planar, PD-SOI, L8580, FinFET, FD-SOI, Cortex A9, cortex a15, arm
SOI has been around for some time now, but in partially depleted form (PD-SOI). Quite a few manufacturers have utilized PD-SOI for their products, such as AMD and IBM (probably the two largest producers of SOI-based parts). Oddly enough, Intel has shunned SOI wafers altogether. One would expect Intel to spare no expense to have the fastest semiconductor-based chips on the market, but SOI did not provide enough advantages for the chip behemoth to outweigh the nearly 10% increase in wafer and production costs. There were certainly quite a few interesting properties to PD-SOI, but Intel was able to find ways around bulk silicon’s limitations. These non-SOI improvements include stress and strain, low-K dielectrics, high-K metal gates, and now 3D FinFET technology. Intel simply did not need SOI to achieve the performance they were looking for while still using bulk silicon wafers.
Things started looking a bit grim for SOI as a technology a few years back. AMD was starting to back out of utilizing SOI for sub-32 nm products, and IBM was slowly shifting away from producing chips based on their Power technology. PD-SOI’s days seemed numbered. And they are. That is OK, though, as the technology will see a massive uptake with the introduction of fully depleted SOI (FD-SOI) wafers. I will not go into the technology in full right now, but expect another article in the future. I mentioned in a tweet some days ago that in manufacturing, materials are still king. This looks to hold true with FD-SOI.
Intel had to utilize 3D FinFETs at 22 nm because they simply could not get the performance out of bulk silicon and planar structures. There are advantages and disadvantages to these structures. The advantage is that better power characteristics can be attained without using exotic materials, all the while keeping bins high; the disadvantage is the increased complexity of wafer production with such structures. It is arguable that the increase in complexity completely offsets the price premium of an SOI-based solution. We have also seen with the Intel process that while power consumption is decreased as compared to the previous 32 nm process, the switching performance vs. power consumption is certainly not optimal. Hence we have not seen Intel release Ivy Bridge parts clocked significantly faster than last-generation Sandy Bridge chips.
FD-SOI with planar structures at 22 nm and 20 nm promises to improve power characteristics as compared to bulk/FinFET. It also looks to improve overall power vs. clockspeed as compared to bulk/FinFET. In a nutshell, this means better power consumption as well as a jump in clockspeed as compared to previous generations. Gate-first designs using FD-SOI could be very good, but industry analysts say that gate-last designs could be “spectacular”.
So what does this have to do with ST Ericsson? They are one of the first companies to show a product based on 28 nm FD-SOI technology. The ARM-based NovaThor L8580 is a dual Cortex A9 design with the graphics portion being the IMG SGX544. At first glance we would think that ST is behind the curve, as other manufacturers are releasing Cortex A15 parts which improve IPC by a significant amount. Then we start digging into the details.
The fastest Cortex A9 designs that we have seen so far have been clocked around 1.5 GHz. The L8580 can be clocked up to 2.5 GHz. Whatever IPC improvements we see with A15 are soon washed away by the sheer clockspeed advantage that the L8580 has. While it has been rumored that the Tegra 4 will be clocked up to 2 GHz in tablet form, ST is able to get the L8580 to 2.5 GHz in a smartphone. NVIDIA utilizes a 5th core to improve low power performance, but ST was able to get their chip to run at 0.6v in low power mode. This decrease in complexity combined with what appears to be outstanding electrical and thermal characteristics makes this a very interesting device.
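The clockspeed-versus-IPC trade-off above can be sanity-checked with back-of-envelope math: single-threaded throughput scales roughly with IPC times clock. The A15's IPC uplift (~40%) and a ~1.7 GHz smartphone A15 clock are assumed illustrative figures here, not measured ones:

```javascript
// Crude model: relative throughput ~ IPC x clock.
// IPC figures are normalized to Cortex A9 = 1.0; the 1.4 uplift for
// A15 and the 1.7 GHz smartphone clock are assumptions for illustration.
function relativeThroughput(ipc, clockGHz) {
  return ipc * clockGHz;
}

const a9InL8580 = relativeThroughput(1.0, 2.5); // L8580: Cortex A9 at 2.5 GHz
const a15Phone  = relativeThroughput(1.4, 1.7); // hypothetical A15 at phone clocks

console.log(a9InL8580 > a15Phone); // the clock advantage covers the IPC gap
```

Under these assumptions the 2.5 GHz A9 comes out ahead of an A15 at smartphone clocks; an A15 at the rumored 2 GHz tablet clock would close that gap, which is why the power story matters as much as the headline frequency.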
The Cortex A9 cores are not the only ones to see an improvement in clockspeed and power consumption. The well-known and extensively used SGX544 graphics portion runs at 600 MHz in a handheld device, and is clocked around 20% faster than other comparable parts.
When we add all these things together, we have a product that appears to be head and shoulders above current parts from Qualcomm and Samsung. It also appears that these parts are comparable to, if not slightly ahead of, the announced next generation of parts from the Cortex A15 crowd. So will ST Ericsson run away with the market and be included in every new handheld sold from now until the first 22/20 nm parts are released? Unfortunately for ST Ericsson, no. If there is an Achilles’ heel to the L8580, it is production capability. ST Ericsson started production on FD-SOI wafers this past spring, but it was processing hundreds of wafers a month vs. the thousands that are required for full-scale production. We can assume that ST Ericsson has improved this situation, but they are not exactly a powerhouse when it comes to manufacturing prowess. They simply do not seem to have the FD-SOI production capacity to handle orders from more than a handful of cellphone and tablet manufacturers.
ST Ericsson has a very interesting part, and it certainly looks to prove the capabilities of FD-SOI when compared to competing products being produced on bulk silicon. The NovaThor L8580 will gain some new customers with its combination of performance and power characteristics, even though it is using the “older” Cortex A9 design. FD-SOI has certainly caught the industry’s attention. There are more FD-SOI factoids floating around that I want to cover soon, but those will have to wait. For the time being, ST Ericsson is on the cutting edge when it comes to SOI, and their proof-of-concept L8580 seems to have exceeded expectations.
Subject: Editorial, General Tech, Systems, Shows and Expos | January 9, 2013 - 02:44 PM | Scott Michaud
Tagged: CES, ces 2013, valve, Steam Box
CES opened with excitement from Xi3 Corporation and their announcement of the Piston. When Gabe Newell spoke with Kotaku at the VGAs he said that we would see Big Picture PCs this year. With word that Xi3 received funding from Valve some of us, including myself, wondered if this Piston was Valve’s “Nexus” Steam Box.
Ben Krasnow, hardware engineer at Valve, comments in the video below about whether we have seen Valve’s canonical Steam Box or if there are any planned announcements for 2013.
Thank you, Ben.
There seem to be some discrepancies between statements from Krasnow and Valve Managing Director Gabe Newell. The major deviation concerns whether the official Steam Box will be based on Linux or another operating system. When interviewed by The Verge, Gabe Newell claimed in no uncertain terms that the official box will be based on Linux:
We’ll come out with our own and we’ll sell it to consumers by ourselves. That’ll be a Linux box, [and] if you want to install Windows you can. We’re not going to make it hard. This is not some locked box by any stretch of the imagination.
Krasnow, in an email discussion with Engadget, was somewhat more timid about future plans. The Engadget article was published on the same day as The Verge interview, which makes neither position particularly out of date. His statement:
"The box might be linux-based, but it might not," he continued. "It's true that we are beta-testing Left for Dead 2 on Linux, and have also been public about Steam Big Picture Mode. We are also working on virtual and augmented reality hardware, and also have other hardware projects that have not been disclosed yet, but probably will be in 2013."
At the same time, this might all become irrelevant very quickly. As reported yesterday, Gabe Newell in the very same interview seemed to strongly suggest that post-Kepler GPUs will bring virtualization to the consumer market. If that is the case, then the only barrier between Linux and Windows would be for a company to provide a user-friendly virtual machine. Having one or the other as your host operating system would not particularly matter if you could run gaming applications from the other platform.
PC Perspective's CES 2013 coverage is sponsored by AMD.
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Editorial, General Tech | January 9, 2013 - 04:08 AM | Scott Michaud
Tagged: MSN, MSN Messenger
Windows Live Messenger, formerly MSN Messenger, almost lived to see its fourteenth birthday.
The messaging service will continue to live on in our hearts, minds, and China. A butterfly flapped its wings and stirred up a storm of trouble for AOL with its superior webcam capabilities. The purchase of a competing service by Microsoft should have come as a well Timed Warner of an impending fate similar to the one AIM suffered.
We should remember Messenger as a fighter who swam against the Windows Live Waves. While it now looks down on us from the Skype I imagine it is at peace. It knows that it did all that it could. It knows that Skype fairly beat it at its own game.
Getting progressively worse with each and every version.
So long MSN! My Trillian contact list will feel emptier without you! But please do not be offended by my eulogy, as I say this in the utmost respect: Yahoo!
CES 2013: The Verge Interviews Gabe Newell for Steam Box. Valve's Director Hints Post-Kepler GPUs Can Be Virtualized!
Subject: Editorial, General Tech, Graphics Cards, Networking, Systems, Shows and Expos | January 8, 2013 - 11:11 PM | Scott Michaud
Tagged: valve, gaben, Gabe Newell, ces 2013, CES
So the internet has been in an uproar about the Steam Box, and it probably will eclipse Project Shield as the topic of CES 2013. The Verge scored an interview to converse about the hardware future of the company and got more than it asked for.
Now if only he would have discussed potential launch titles.
Wow! That *is* a beautiful knife collection.
The point which stuck with me most throughout the entire interview was directed at Valve’s opinion of gaming on connected screens. Gabe Newell responded,
The Steam Box will also be a server. Any PC can serve multiple monitors, so over time, the next-generation (post-Kepler) you can have one GPU that’s serving up eight simulateneous [sic] game calls. So you could have one PC and eight televisions and eight controllers and everybody getting great performance out of it. We’re used to having one monitor, or two monitors -- now we’re saying lets expand that a little bit.
This is pretty much confirmation, assuming no transcription errors on the part of The Verge, that Maxwell will support the virtualization features of GK110 and bring them mainstream. It also makes NVIDIA Grid much more sensible in the long term. Perhaps NVIDIA will provide some flavor of a Grid server for households directly?
The concept gets me particularly excited. One of the biggest wastes of money in the tech industry is purchasing redundant hardware. Consoles are a perfect example: not only is the system redundant to your other computational device, which is usually at worst a $200 GPU away from a completely better experience, but you also pay for software tied to that redundant platform, which will eventually disappear along with said software. In fact, many people own multiple redundant consoles because the software they desire is not localized to just one system, so they need redundant redundancies. Oy!
A gaming server should make the redundancy argument more obvious. If you need extra interfaces, then you should only need to purchase the extra interfaces. Share the number crunching, and keep only that up to date.
Also check out the rest of the interview over at The Verge. I decided just to cover a small point with potentially big ramifications.