Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM | Ryan Shrout
Tagged: video, nvidia, live, frame rating, fcat
Update: Did you miss the live stream? Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!!
I know, based solely on the amount of traffic and forum discussion, that our readers have really adopted and accepted our Frame Rating graphics testing methodology. Based on direct capture of GPU output via an external system and a high-end capture card, our new systems have helped users see GPU performance in a more "real-world" light than previous benchmarks would allow.
I also know that there are lots of questions about the process, the technology and the results we have shown. To address these questions and to gather new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.
Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as just talk about the science of animation and input.
The primary part of this live stream will be about education - not about bashing one particular product line or talking up another. And part of that education is your ability to interact with us live, ask questions and give feedback. During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience. The easiest way to get your question addressed, though, will be to leave a comment or inquiry here in this post below. It doesn't require registration, and this will allow us to think about the questions beforehand, giving them a better chance of being answered during the stream.
Frame Rating and FCAT Live Stream
11am PT / 2pm ET - May 9th
So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!
Subject: Editorial, General Tech, Graphics Cards, Processors | May 8, 2013 - 09:32 PM | Scott Michaud
Tagged: Volcanic Islands, radeon, ps4, amd
So the Southern Islands might not be entirely stable throughout 2013 as we originally reported; seismic activity being analyzed suggests the eruption of a new GPU micro-architecture as early as Q4. These Volcanic Islands, as they have been codenamed, should explode onto the scene opposing NVIDIA's GeForce GTX 700-series products.
It is times like these where GPGPU-based seismic computation becomes useful.
The rumor is based upon a source which leaked a fragment of a slide outlining the processor in block diagram form and specifications of its alleged flagship chip, "Hawaii". Of primary note, Volcanic Islands is rumored to be organized with both Serial Processing Modules (SPMs) and a Parallel Compute Module (PCM).
So apparently a discrete GPU can have serial processing units embedded on it now.
Heterogeneous Systems Architecture (HSA) is a set of initiatives to bridge the gap between massively parallel workloads and branching logic tasks. We usually make reference to this in terms of APUs and bringing parallel-optimized hardware to the CPU. In this case, we are discussing it in terms of bringing serial processing to the discrete GPU. According to the diagram, the chip would contain 8 processor modules, each with two processing cores and an FPU, for a total of 16 cores. There does not seem to be any definite indication of whether these cores would be based upon AMD's license to produce x86 processors or their other license to produce ARM processors. Unlike an APU, this is heavily skewed towards parallel computation rather than a relatively even balance between CPU, GPU, and chipset features.
Now of course, why would they do that? Graphics processors can do branching logic but it tends to sharply cut performance. With an architecture such as this, a programmer might be able to more efficiently switch between parallel and branching logic tasks without doing an expensive switch across the motherboard and PCIe bus between devices. Josh Walrath suggested a server containing these as essentially add-in card computers. For gamers, this might help out with workloads such as AI which is awkwardly split between branching logic and massively parallel visibility and path-finding tasks. Josh seems skeptical about this until HSA becomes further adopted, however.
Still, there is a reason why they are implementing this now. I wonder, if the SPMs are based upon simple x86 cores, how the PS4 will influence PC gaming. Technically, a Volcanic Islands GPU would be an oversized PS4 within an add-in card. This could give AMD an edge, particularly in games ported to the PC from the Playstation.
This chip, Hawaii, is rumored to have the following specifications:
- 4096 stream processors
- 16 serial processor cores on 8 modules
- 4 geometry engines
- 256 TMUs
- 64 ROPs
- 512-bit GDDR5 memory interface, much like the PS4.
- 20 nm gate-last silicon fab process
- Unclear if TSMC or "Common Platform" (IBM/Samsung/GLOBALFOUNDRIES)
Softpedia is also reporting on this leak. Their addition claims that the GPU will be designed on a 20nm gate-last fabrication process. While gate-last is often considered not worth the extra production effort, Fully Depleted Silicon On Insulator (FD-SOI) is apparently "amazing" on gate-last at 28nm and smaller nodes. This could mean that AMD is eyeing that technology and making this design with the intent of switching to an FD-SOI process later, without the large redesign that an initially easier gate-first production would require.
Well that is a lot to process... so I will leave you with an open question for our viewers: what do you think AMD has planned with this architecture, and what do you like and/or dislike about what your speculation would mean?
Subject: Editorial, Mobile | May 7, 2013 - 12:07 AM | Scott Michaud
Tagged: unreal engine, firefox, asm.js
If, on the other hand, you wish to see an example of a large application compiled for the browser: would Unreal Engine 3 suffice?
Clearly a computer hardware website would take the time required to run a few benchmarks, and we do not disappoint. Epic Citadel was run in its benchmark mode in Firefox 20.0.1, Firefox 22.0a2, and Google Chrome; true, it was not run for long on Chrome before the tab crashed, but you cannot blame me for trying.
Each benchmark was run at full-screen 1080p "High Performance" settings on a PC with a Core i7 3770, a GeForce GTX 670, and more available RAM than the browser could possibly allocate. The usual Firefox framerate limit was removed; the browser was the only tab open on the same fresh profile; the setting layout.frame_rate.precise was tested in both positions because I cannot keep track of the current state of requestAnimationFrame callback delay; and each scenario was performed twice and averaged.
Firefox 20.0.1
- layout.frame_rate.precise true: 54.7 FPS
- layout.frame_rate.precise false: 53.2 FPS
Firefox 22.0a2 (asm.js)
- layout.frame_rate.precise true: 147.05 FPS
- layout.frame_rate.precise false: 144.8 FPS
Google Chrome 26.0.1410.64
- no result (the tab crashed before a run could complete)
It is very enticing for Epic as well. A little over a month ago, Mark Rein and Tim Sweeney of Epic were interviewed by Gamasutra about HTML5 support for Unreal Engine. Due in part to the removal of UnrealScript in favor of game code being written in C++, Unreal Engine 4 will support HTML5. They are working with Mozilla to make the browser a reasonable competitor to consoles: write once, run on Mac, Windows, Linux, or anywhere compatible browsers can be found. Those familiar with my past editorials know this excites me greatly.
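For the curious, Epic Citadel is Unreal Engine 3 compiled to asm.js via Emscripten. asm.js itself is just a strictly-typed subset of JavaScript: a supporting engine, such as the OdinMonkey compiler in Firefox 22, can validate and compile it ahead of time, while any other browser simply runs it as ordinary (slower) JavaScript. Below is a toy sketch of the pattern with invented names (`MiniModule`, `square`); real Emscripten output is machine-generated and looks nothing like this.

```javascript
// A toy asm.js-style module. The "use asm" pragma invites a supporting
// engine to type-check and compile the body ahead of time; engines without
// asm.js support just execute it as plain JavaScript.
function MiniModule(stdlib) {
  "use asm";
  var imul = stdlib.Math.imul;   // import integer multiply from the standard library

  function square(x) {
    x = x | 0;                   // annotate the parameter as a 32-bit integer
    return imul(x, x) | 0;       // integer math, coerced back to a 32-bit int
  }

  return { square: square };     // export the compiled function
}

var mod = MiniModule({ Math: Math });
console.log(mod.square(12));     // 144
```

The benchmark deltas above come from exactly this mechanism: the same file runs everywhere, but an engine that recognizes the annotations can skip most of the dynamic-language overhead.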
So what do our readers think? Comment away!
Subject: Editorial | April 26, 2013 - 11:40 AM | Ryan Shrout
Tagged: video, pcper, Indiegogo
UPDATE 4/26/13: We are extremely excited to see that we have met our first goal for our Indiegogo project! We are eternally grateful for our fans and readers that are supporting us in this endeavor. We are going to start putting together orders for the set materials and I am very excited about the direction this is pointing us in. There is still room to improve the project though and we have lots of great perks available for those of you that are still looking to contribute to the cause! Oh, and we should have our T-shirt design ready early next week as well. Thank you EVERYONE for reading PC Perspective!!
And for those of you looking for a bit more insight into our total goals, here is another mock up of the set!
Yesterday evening, the team at PC Perspective launched a project that will help us grow and expand our coverage of technology and computer hardware. Using the crowdfunding service Indiegogo, we are running a fund drive to help improve the quality of our video content and enable us to produce new, unique styles of content.
If you are anything like us, you love technology. Motherboards, graphics cards, processors, SSDs, monitors, laptops, tablets, cell phones and more. And you also love reading about them, hearing about them and seeing them, dissecting them and finding out what makes them tick.
I am confident that high quality video content is the future of our medium, and while we have been able to do quite a lot with the basic technology and setup we have here today, my goal is to bring readers regular, high quality video content on all aspects of technology. We want not only video reviews of products, but also near-daily updates on the news of the day, balanced with long-form interviews of the personalities that make the industry function. We have dabbled in some of these content types and the responses have been great, but we need a higher quality setup to really do it right.
Our goal with this project is to build the funds necessary to turn our office in Florence, KY into a high tech video production outlet that starts with a quality set design and better quality equipment.
Other than supporting one of your favorite online outlets, we have also lined up some sweet perks for contributors to our project. We have ad-free versions of the site, T-shirts, and access to the PC Perspective Gold Club, which has some pretty ridiculous giveaways! If you support us even further, you can get some individual time with our team to tell us why you supported us, have your questions answered, or even join us for an episode of the PC Perspective Podcast!
Come visit the offices, join the process of creating a new show or make fun of Josh's laugh in person - it's all possible!
So if you have the means and you want to support our cause, if you have enjoyed any of our articles, podcasts or video reviews, consider helping to fund our project!
Support PC Perspective's Indiegogo Project
Subject: Editorial, General Tech, Displays | April 22, 2013 - 05:34 PM | Scott Michaud
Tagged: LG, ips, hack
Operators are standing by...
Of course Apple is not a primary manufacturer of LCD panels; like everyone else, they buy their panels from someone like LG. Due to how much Apple loves IPS technology, which I cannot blame them for, they in fact do purchase their displays from LG.
If you have an itchy soldering iron, so can you.
According to EmertHacks, the LG part number for Retina iPad screens is LP097QX1-SPA1. The blog post states that he could find the panel for as cheap as $55, but my own digging came up with costs between $60 and $200 plus shipping. These panels are mostly destined for iPad repair shops, but you can give one a better home.
With under $20 of other parts, this panel could be attached to a DisplayPort connection. All said and done, you could have a 2048x1536 9.7" display with an 800:1 static contrast ratio for about $70.
Subject: Editorial, General Tech, Systems | April 20, 2013 - 07:36 PM | Scott Michaud
Tagged: windows, start button, Metro
The latest rumors, based on registry digging and off-the-record testimony, claim that Windows 8.1 will include the option of booting directly into the desktop. A bold claim such as this requires some due diligence. Comically, the attempts to confirm this rumor have unearthed another: the Start Button, but not necessarily the Start Menu, could return. On the record, Microsoft also wants to be more open to customer feedback. Despite these recent insights into the future of Windows, all's quiet on the worst aspect of modernization.
Mary Jo Foley, contributor to ZDNet and a very reliable bullcrap filter for Microsoft rumors, learned from a reliable source that the Start Button might have a place in the modern Windows. Quite the catch while fishing to validate a different rumor; she was originally investigating whether Microsoft would consider allowing users to boot directly to the desktop via recently unearthed registry keys. Allegedly, both are being planned for at least some SKUs of Windows 8.1, namely the Professional and Enterprise editions.
But, as usual for Microsoft, the source emphasized, "Until it ships, anything can change." No-one was clear about the Start Button from a functional standpoint: would it be bound to display the Start Screen? Would it be something more?
Personally, I liked the modern Windows interface. Sure, it is messed up on the modern side when it comes to multi-monitor support, but that can easily be fixed. As you will note, I am still actively boycotting everything beyond Windows 7, and this news will not change my mind. We are bickering over interface elements when the real concern is the deprecation of user control. Outside of the desktop: the only applications you can use are from the Windows Store or Windows Update; the only websites you can browse are ones which Internet Explorer can render; and the only administrator is Microsoft.
Imagine if Microsoft is told by a government that its citizens are not allowed encryption applications.
The Windows Store is clearly modeled on, and about as messed up as, the Xbox Marketplace. Even if your application gets certified, would Microsoft eventually decide that certification fees should be the burden of the developer? That is how it is on the Xbox, with each patch demanding a price tag of about $40,000 after the first-one-free promotion. That would be pretty hard to swallow for an open-source application, or a cute game that a teenager makes for her significant other as a Valentine's gift.
Microsoft's current Chief Financial Officer, Peter Klein, stated in the company's third quarter earnings release that Windows Blue "further advances the vision of Windows 8 as well as responds to customer feedback." Despite how abrupt this change would seem, the recent twitchy nature should not come as a surprise; Microsoft has had a tendency to completely change course on products for quite some time now. Mary Jo mentioned how Microsoft changed course on UAC, but even that is a bad example; a better one is how Microsoft backed away from its initial assertion that the Windows 8 Developer Preview would not be shaped by customer feedback.
A lot has changed between Developer Preview and RTM.
Then again, we can hope that Microsoft associates this pain with love for the desktop. I would be comfortable with the modern Windows if we were given a guarantee that desktop x86 applications would forever be supported. I might even reconsider using and developing applications if they allow loading uncertified metro-style applications and commit to never removing that functionality.
I can get used to a new method of accessing my applications. I can never get used to a middle-man who only says "no". If Microsoft is all ears, I hope we make this point loud and clear.
Subject: Editorial | April 18, 2013 - 01:55 PM | Scott Michaud
So, news which might excite our readers: we are going to try reviewing video games.
Of course, the first thing which needs to be addressed when reviewing games is our grading system. Games, in particular, are a very artful medium, and as such it does not entirely make sense to quantize their qualities.
The simple answer is, we will not.
Step back and consider how we review hardware: we run some benchmarks, we discuss the features in often numbing detail, and we assign an award badge to the product according to our opinion. A product could receive no award; it could receive a bronze, silver, or gold medal; and the truly extraordinary products receive an Editor's Choice Award. If you think about it, these can transfer quite easily to video game reviews.
Our expectation is to apply two ratings to every review: a badge and a number.
A badge is very good at qualifying our assessment of a product whereas numerical scores are very good at quantifying a derivable value. We, collectively as PC gamers, have certain expectations for games and they usually demand more than the impressions of a typical console gamer. Simultaneously, we tend to be an afterthought for a lot of titles; yes, I am being generous even with that statement. Many games are outright broken, crippled by DRM, or otherwise demonstrate in very obvious terms that our money is somehow inferior. On the other hand, there are games which go above and beyond reasonable expectations held by PC gamers, and even some unreasonable ones, and are rarely hailed for it.
We are not able to judge the artistic qualities of a game using a numerical score, but we can judge its technical merits using a numerical rubric.
And so exists our planned review metric. The main point is that there will not be any definite rank order to each game, at least from an artistic standpoint. A game is allowed to do really well in one category and really terribly in another. If you are concerned with the game itself, keep more of an eye toward which award we gave it. If you are concerned about how well the game exists as a PC title, take a look at the numerical score.
There are of course caveats to this method. A viewer who looks solely at the numerical score will not know much, if anything, about the game itself. The numerical score is just a gauge for the level of effort put into the PC version.
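To make the two-part rating concrete, here is a hypothetical sketch of how a badge-plus-score review could be tallied. The criteria, weights, and names below are all invented for illustration; they are not our actual rubric.

```javascript
// Hypothetical sketch: a qualitative badge chosen by the editor, plus a
// technical score computed from a checklist of PC-citizenship criteria.
const BADGES = ["none", "bronze", "silver", "gold", "editors-choice"];

// Invented criteria; each is worth points toward a 0-10 technical score.
const rubric = [
  { name: "stable alt-tab / windowed mode", points: 2 },
  { name: "rebindable keyboard and mouse controls", points: 2 },
  { name: "no invasive kernel-mode DRM", points: 2 },
  { name: "scalable graphics options", points: 2 },
  { name: "launch-day stability", points: 2 },
];

// `passed` is a Set of criterion names the game satisfies.
function technicalScore(passed) {
  return rubric.reduce((sum, c) => sum + (passed.has(c.name) ? c.points : 0), 0);
}

const review = {
  badge: "gold", // artistic and overall merit, judged qualitatively
  score: technicalScore(new Set([
    "stable alt-tab / windowed mode",
    "scalable graphics options",
    "launch-day stability",
  ])),
};
console.log(review); // { badge: 'gold', score: 6 }
```

The point of keeping the two values separate is visible here: the badge and the score are derived from entirely different inputs, so neither can drag the other down.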
Then again, would you expect any less from a website called "PC Perspective" which reviews products with a blend of explanation of its qualitative features mixed in with strict quantitative benchmarking?
Lastly, this is not about whether a game is "better" on a PC or on a console. Developers are free to focus on whatever platform they desire. A game designed around a console and ported to the PC will still get a great score if the finished result exhibits a "great" level of care. Likewise, even if your game is PC-exclusive, do not expect us to give it a great score if it cannot alt-tab worth a damn and is wrapped in DRM which roots our system using kernel-mode drivers.
It is not particularly hard to make a great PC experience; all it takes is effort. Fortunately, that is a property that we can assign an honest grade to.
We would really like to hear your feedback on this. Drop a line in the comments below!
Subject: Editorial, General Tech, Graphics Cards | April 14, 2013 - 02:22 AM | Scott Michaud
Tagged: never settle, never settle reloaded, amd, far cry 3
So when AMD reloaded their Never Settle bundles, they left an extra round in the barrel.
Some of my favorite games were given to me in a bundle with some piece of computer hardware. You might remember from the PC Perspective game night that I am a major fan of the Unreal Tournament franchise. My first Unreal Tournament game was an unexpected surprise when I purchased my first standalone GPU. My 166MHz Pentium computer also came bundled with Mechwarrior 2 and Wipeout.
As we discussed, AMD considers bundle offers a way to keep the software industry rolling forward. The quantity and quality of games which participate in the recent Never Settle bundles certainly deserves credit where it is due. Bioshock: Infinite is a game that just about every PC gamer needs to experience, and there are about a half-dozen other great titles as part of the promotion, depending upon which card or cards you purchase.
As it turns out, AMD negotiated with Ubisoft and added Far Cry 3: Blood Dragon to their Never Settle bundle. The coolest part is that AMD will retroactively email codes for this new title to anyone who has redeemed a Never Settle: Reloaded code.
So if you have ever Reloaded your Never Settle in the past, check your email as apparently you can Never Settle your reloads again.
Subject: Editorial, General Tech, Graphics Cards, Systems, Mobile | April 7, 2013 - 10:21 PM | Scott Michaud
Tagged: DirectX, DirectX 12
Microsoft DirectX is a series of interfaces for programmers to utilize typically when designing gaming or entertainment applications. Over time it became synonymous with Direct3D, the portion which mostly handles graphics processing by offloading those tasks to the video card. At one point, DirectX even handled networking through DirectPlay although that has been handled by Games for Windows Live or other APIs since Vista.
AMD Corporate Vice President Roy Taylor was recently interviewed by the German press, "c't magazin". When asked about the future of "Never Settle" bundles, Taylor claimed that games such as Crysis 3 and Bioshock: Infinite keep their consumers happy and also keep the industry innovating.
Keep in mind, the article was translated from German so I might not be entirely accurate with my understanding of his argument.
In a slight tangent, he discussed how new versions of DirectX tend to spur demand for new graphics processors with more processing power and more RAM. He has not heard anything about DirectX 12 and, in fact, he does not believe there will be one. As such, he is turning to bundled games to keep the industry moving forward.
Neowin, upon seeing this interview, reached out to Microsoft who committed to future "innovation with DirectX".
This exchange has obviously sparked a lot of... polarized... online discussion. One camp claims that Microsoft is abandoning the PC to gain a foothold in the mobile market, of which it has practically zero share; that, supposedly, is why they are dropping DirectX.
Unfortunately, this does not make sense: DirectX would be one of the main advantages Microsoft has in the mobile market. Mobile devices have access to fairly decent GPUs which can use DirectX to draw web pages and applications much more smoothly and power-efficiently than their CPU counterparts. If anything, DirectX would only increase in relevance if Microsoft were blindly making a play for mobile.
The major threat to DirectX is still quite far off on the horizon. At some point we might begin to see C++ AMP or OpenCL nibble away at what DirectX does best: offloading highly parallel tasks to specialized processing units.
Still, releases such as DirectX 11.1 are quite focused on back-end tweaks and adjustments. What do you think a DirectX 12 API would even do, that would not already be possible with DirectX 11?
Subject: Editorial | April 6, 2013 - 04:34 AM | Scott Michaud
Tagged: Windows 8.1, windows blue, internet explorer, Internet Explorer 11
Windows Blue, a.k.a. Windows 8.1, was leaked not too long ago. We reported on the release's illegitimate availability practically as soon as it happened. We knew that Internet Explorer took the opportunity to increment its version to 11. The recent releases of Internet Explorer each made decent strides to catch the browser up to Google's Chrome and Mozilla's Firefox. Once thrown to the sharks (read: thoroughly investigated), this release appears to be pining to be just as relevant, despite how soon after Internet Explorer 10 it is expected to arrive.
One of my first thoughts upon realizing that Internet Explorer 11 was an impending "thing": will it make it to Windows 7? Unfortunately, we still have no clue. Thankfully, unlike Windows RT, which disallows rendering engines other than Internet Explorer's Trident, Windows 7 still allows alternative browsers. If Internet Explorer 11 is unavailable, users can still install Firefox or Chrome.
For those who only use Internet Explorer and can upgrade to 11, you might be pleased to find WebGL support. Microsoft has been quite vocal against WebGL for quite some time, calling it a security threat when facing the wild west of the internet. Then again, to some extent, the internet is a security nightmare in itself. The question is whether WebGL can be sufficiently secured for its usage:
- Animation effects (I created this specific demo... not the rest)
- Gorgeous, smooth, and battery-efficient 2d games
- Likewise beautiful 3D experiences
- And of course there's a semi-realtime raytracing demo.
This, to some extent, marks a moment where Microsoft promotes a Khronos standard. With some level of irony, Apple was one of the founding members of the WebGL group, yet Microsoft might beat Safari to default WebGL support. Of course, it could not be that simple: IE11 apparently accepts WebGL shaders (the math which computes the color and position of a pixel) in IESL rather than the standard GLSL. IESL, according to the name of its registry flag, seems to be heavily based on the HLSL seen in DirectX.
I guess they just cannot let Khronos have a total victory?
SPDY also seems to be coming to IE11. SPDY, pronounced "speedy" and not an acronym, is a protocol designed to cut loading latency. Cool stuff.
Last and definitely least, IE11 is furthering its trend of pretending that it is a Mozilla Gecko-like rendering engine in its user agent string. Personally, I envision an IE logo buying a fiery-orange tail at a cosplay booth. They have been doing this for quite some time now.
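For reference, the commonly reported IE11 desktop user agent string drops the old "MSIE" token entirely and ends in "like Gecko". The little sniffer below (the function names are my own invention) shows how a naive check is fooled while the Trident token still gives the browser away.

```javascript
// The widely documented IE11 desktop UA string: no "MSIE" token, a
// "like Gecko" suffix, and Trident/7.0 + rv:11.0 as the real tells.
const ie11UA = "Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko";

function looksLikeGecko(ua) {
  return /Gecko/.test(ua);                               // what a naive sniffer checks
}

function isActuallyIE11(ua) {
  return /Trident\/7\.0/.test(ua) && /rv:11\.0/.test(ua);
}

console.log(looksLikeGecko(ie11UA));  // true: the costume works
console.log(isActuallyIE11(ie11UA));  // true: but the tail gives it away
console.log(/MSIE/.test(ie11UA));     // false: the old token is gone
```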
Subject: Editorial, General Tech, Shows and Expos | March 27, 2013 - 03:25 AM | Scott Michaud
Tagged: battlefield, battlefield 4, GDC, GDC 13
Battlefield 4 is coming; that has been known since Medal of Honor: Warfighter's release and its promise of beta access. Now the gameplay trailer is already here. Clocking in at just over 17 minutes, "Fishing in Baku" looks amazing from a technical standpoint.
The video has been embedded below. It is a little not-safe-for-work due to language and amputation.
Now that you have finished gawking, we have gameplay to discuss. I cannot help but be disappointed with the campaign direction. Surely, the story was in planning prior to the release of Battlefield 3. Still, it seems to face the same generic-character problem which struck the last campaign.
In Battlefield 3, I really could not recognize many characters apart from the lead which made their deaths more confusing than upsetting. Normally when we claim a character is identifiable, we mean that we can relate to them. In this case, when I say the characters were not identifiable, I seriously mean that I probably could not pick them out in a police lineup.
Then again, the leaked promotional image for Battlefield 4 seems to show Blackburn at the helm. I guess there is some hope. Slim hope, which the trailer does not bolster. Even the end narration underscored how pointless the character interactions were, all in spite of EA's YouTube description proclaiming this to be human, dramatic, and believable.
Oh well, it went boom good.
Subject: Editorial, General Tech, Processors, Shows and Expos | March 20, 2013 - 06:26 PM | Scott Michaud
Tagged: windows rt, nvidia, GTC 2013
NVIDIA develops processors, but without an x86 license they are only able to power ARM-based operating systems. When it comes to Windows, that means Windows Phone or Windows RT. The latter segment of the market has disappointing sales according to multiple OEMs, which Microsoft blames them for, but the jolly green GPU company is not crying doomsday.
NVIDIA just skimming the Surface RT, they hope.
As reported by The Verge, NVIDIA CEO Jen-Hsun Huang was optimistic that Microsoft would eventually let Windows RT blossom. He noted how Microsoft very often "gets it right" at some point when they push an initiative. And it is true, Microsoft has a history of turning around perceived disasters across a variety of devices.
They also have a history of, as they call it, "knifing the baby."
I think there is a very real fear for some that Microsoft could consider Intel's latest offerings good enough to stop pursuing ARM. Of course, the more they pursue ARM, the more their business model will rely upon the-interface-formerly-known-as-Metro and likely all of its certification politics. As such, I think it is safe to say that I am watching the industry teeter on a fence with a bear on one side and a pack of rabid dogs on the other. On the one hand, Microsoft jumping back to Intel would allow them to perpetuate the desktop and all of the openness it provides. On the other hand, even if they stick with Intel, they will likely just kill the desktop anyway, citing user confusion and the security benefits of certification. We might just have fewer processor manufacturers when they do.
So it could be that NVIDIA is confident that Microsoft will push Windows RT, or it could be that NVIDIA is pushing Microsoft to continue to develop Windows RT. Frankly, I do not know which would be better... or more accurately, worse.
Subject: Editorial | February 27, 2013 - 02:26 PM | Scott Michaud
Tagged: Podcast Bingo, Bingo
It's Bumpday! No, no I'm not reviving that, at least not yet.
But it is Wednesday and as such we will be gathering for another fine PC Perspective Podcast. As always, if you wish to join us: head down to pcper.com/live or click on the little radio tower in the upcoming events box on the right side of your screen.
Yes, I know, there are two P's. Deal with it.
Starting this week, we will have a new activity for our viewers to follow along with. Play along with the first official PC Perspective Podcast Bingo Card! Keep track of every time we mention “Arc Welding” by putting us to task and marking off the square.
Be sure to call out the spaces you mark off and whatever Bingo patterns you manage to make out of it.
Our IRC room during the podcast (it's usually active 24/7) is: irc.mibbit.net #pcper
Of course this is just for fun – but fun is fun!
Subject: Editorial, General Tech, Systems | February 26, 2013 - 08:07 PM | Scott Michaud
Tagged: ps4, unreal engine 4
Unreal Engine 4 was present at Sony's Playstation 4 press conference, but that is no surprise. Epic Games has been present at several keynotes for new console launches. Last generation, Unreal Engine 3 kicked off both Xbox 360 and PS3 with demos of Gears of War and Unreal Tournament 2007, respectively. The PS4 received a continuation of the Elemental Demo first released at the end of E3 last June.
All I could think about when I watched the demo was, “This looks pretty bad. What happened?”
If you would like to follow along at home, both demos are available on YouTube:
As you can see from the animated GIF above, particle count appears to have been struck the worst. There are no particle effects around the eyes of the statue in the PS4 version, and the PC version appears to have an order of magnitude or two more particles overall. Whole segments of particles are not even rendered.
In this screenshot, downsampled to 660x355, the loss of physical detail is even more apparent. The big cluster of particles near the leg is not present in the PS4 version, and the regular cluster is nowhere near as densely packed.
And the lighting, oh the lighting.
On the PS4, everything looks much higher contrast, without a lot of the subtle lighting information. This loss of detail is most apparent with the volcano smoke and the glow of the hammer, but it is also obvious in the character model when viewed in the video.
Despite the 8GB of RAM, some of the textures also seem to be lower resolution. Everything appears to have much more of a plastic look to it.
Still, while the PC still looks better, at least high-end PC gaming should remain within the realm of scalability for quite some time. We have been hampered by being so far ahead of consoles that it was just not feasible to make full use of the extra power. At least that is looking to change.
Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | February 26, 2013 - 04:19 AM | Scott Michaud
Tagged: Firefox OS, mozilla, firefox, MWC, MWC 13
Mobile World Congress is underway in Barcelona, and this year sees the official entry of a new contender: Firefox OS.
Mozilla held their keynote the day before the official start of the trade show. If there is anything to be learned from CES, it is that there is an arms race to announce your product before everyone else steals the media's attention, while still being considered part of the trade show. By the time the show actually starts, most of the big players have already said all they need to say.
If you have an hour to spare, you should check it out for yourself. The whole session was broadcast and recorded on Air Mozilla.
The whole concept of Firefox OS, as I understand it, is to open up web standards such that a completely functional mobile operating system can be built from them. Specific platforms do not matter; the content will conform to a platform of standards which anyone is able to adopt.
I grin for a different reason: should some content exist in the future that is intrinsically valuable to society, its reliance on an open platform will allow future platforms to carry it.
Not a lot of people realize that iOS and Windows RT disallow alternative web browsers. Sure, Google Chrome the app exists for iOS, but it is really a re-skinned Safari. Any web browser in the Windows Store must use Trident as its rendering engine by mandate of Microsoft's certification rules. This allows the platform developer to be choosy about which standards they wish to support. Microsoft has been very vocal against any web standard backed by Khronos. You cannot install another browser if you run across a web application requiring one of those technologies.
When you have alternatives, such as Firefox OS, developers are prompted to try new things. The alternative platforms promote standards which generate these new applications and push the leaders to implement those standards too.
And so we creep ever-closer to total content separation from platform.
Subject: Editorial, General Tech | February 16, 2013 - 02:08 AM | Scott Michaud
Tagged: consoles, consolitis, pc gaming
If you really enjoy an Xbox or PlayStation game, you had better hope your console does not die: it is likely that nothing else will play it. This news comes from a statement made by Blake Jorgensen, CFO of Electronic Arts. Clearly EA is a trusted partner of all console developers and not just an anonymous tipster.
You mean, Devil May Stop Crying?
I tend to rant about this point quite often. For a market so devoted to the opinion that video games are art, the market certainly does not care about its preservation as art. There is always room for consumable and even disposable entertainment, but the difference with art is that it cannot be substituted with another piece of content.
There would be a difference if someone magically replaced every copy of Schindler’s List, including the vaulted masters, with The Boy in the Striped Pajamas. I can safely assume that the vast majority of the audience for either film was not just browsing the Holocaust movie genre; I would expect the viewer was seeking out one or the other for a specific reason.
This is incompatible with the console ecosystem by design. The point of the platform is to be disposable, and its content is along for the ride while it lasts. Console makers often sell the hardware for less than their parts and labor costs; research, development, and marketing cost money regardless. The business model is to eliminate as many big up-front fees as possible and then jack up the price of everything else ten bucks here and there. Over time you are not given a bargain; over time you give them more than they made you think you saved. They then spend this extra money keeping content exclusively under their control, not yours. Also, profits... give or take.
Again, there is always room for consumable entertainment. The consoles are designed to be very convenient, but not cheap and not suitable for timeless art. Really, the only unfortunate element is how these impairments are viewed as assets, while examples such as this one dance around in the background, largely shrugged off and never pieced together.
As for your favorite game? Who knows, maybe you will get lucky and it will be remade on some other platform for you to purchase again. If you are really lucky, it might even be available on the PC.
Subject: Editorial, General Tech | February 16, 2013 - 01:19 AM | Scott Michaud
There have been some groups opposed to the planned deal to take Dell private and buy out its shares. It would seem that, for many, a short-term payout of 25 percent over the trading price is insufficient and, they believe, undervalues the company. I mean, the price is totally not derived from the value you gave it when you just finished trading the stock at 80 percent of what Dell is offering you or anything. Yes, I am making a joke: some investors were almost certainly going long on Dell. I still suspect that some are just playing hardball, hoping that a quarter-on-the-dollar premium is just a starting bid.
Buckle in; I will separate stockholders' opinions into two categories: investment firms and employees.
Ars Technica clearly had football on the mind when they wrote a very Super Bowl-themed editorial. Early in the month, Southeastern Asset Management sent a letter to Dell management expressing their intention to vote against a deal to go private. The investment firm controls 8.5 percent of Dell, which means their opinion has a fair amount of sway. A few days later, T. Rowe Price stepped up to likewise oppose the deal. This firm owns 4.4 percent of Dell, which means combined they have roughly a 13 percent vote.
Factor in a bunch of smaller investors and you are looking at almost a fifth of the company wanting to keep it public. That combined voting power slightly overtakes the 16 percent controlled by Michael Dell and could hamper the festivities.
Employees, meanwhile, are upset all the same. Again according to Ars Technica's vigilant coverage, some employees were forced to sell stock acquired as part of their 401(k) at $9 per share – substantially lower than the $13.65 being offered to investors.
There are several other ways in which employees have had their stake in the company reduced or hampered, but I would direct you to the Ars Technica article so I do not butcher any details.
Unfortunately these sorts of practices are fairly commonplace when it comes to investment deals. It would appear as if this deal trots on common ground instead of taking the high road.
God, I hate mixed metaphors.
Subject: Editorial, General Tech | February 9, 2013 - 02:49 AM | Scott Michaud
Tagged: windows blue
Could the sadness Microsoft feels with their OEM partners make the whole company feel just a little Blue?
I have been thinking about this while reading the latest news from Mary Jo Foley over at ZDNet. This is not the first time that we have mentioned the color. Blue was, and still is, a codename for the first major feature update of Windows 8. What we have learned is that “Blue” now seems to cover much more.
As many know, Microsoft has shifted their branding into four color-coded divisions: blue is for Windows, red is for Office, green is for Xbox, and yellow has yet to be disclosed. As far as we know, the Windows division encompasses Windows Phone, Internet Explorer, official apps, and so forth. Apparently “Blue”, the codenamed update, will start Microsoft on an annual update schedule for the Windows division. This means that Internet Explorer, as well as the Mail, Calendar, and Bing apps and other “Windows Services” such as SkyDrive and Hotmail, will shift to the yearly cadence.
As I read Mary Jo's article, I focused on a point buried late in the second act of the column:
Instead of RTMing a new version of Windows once every three or so years, and then hoping/praying OEMs can get the final bits tested and preloaded on new hardware a few months later, Microsoft is going to try to push Blue out to users far more quickly, possibly via the Windows Store, my contact said.
While I have speculated about Microsoft's desire to shift business models to a subscription service for quite some time, I had not considered OEM partners as a prominent reason. Microsoft has been wrestling with their manufacturers; that has recently been made obvious. The release of a new operating system drives users to go out and purchase new hardware. The PC industry bounces forward with software and hardware enhancements chained in lockstep to the three-year Windows cycle, even the enthusiast market to some extent.
Perhaps Microsoft is trying to let the hardware itself drive the market. Instead of pushing the industry forward in big leaps, would it be possible that Microsoft wants the hardware to evolve and a new version of Windows to be there waiting for it?
Subject: Editorial, General Tech | February 5, 2013 - 05:44 PM | Scott Michaud
Tagged: Psychonauts, Notch, Tim Schafer
You cannot knock Tim Schafer: he abides by “Shut up and take my money”.
Last year we reported on the public negotiations between the heads of Mojang and Double Fine for a potential sequel to Psychonauts. The game was supposed to take “a couple” million to make, which was later clarified to at least $13 million USD. This prompted the famous response from Notch: “Yeah, I can do that.”
Some time later, the deal fell by the wayside.
A storm never came that day, barely a ripple brushed against his wooden canoe.
Recently Notch was on Reddit and commented on the status of the sequel. The final budget came in around $18 million USD, which was beyond what Notch felt comfortable investing. It was not for a lack of funds, however; Markus stated that he just did not have the time to be involved in an $18 million deal.
The biggest point I would like to make is how little damage was caused by discussing this out in the open: the game fell through, at least for the moment, and no effigies were burnt. We might be approaching a time and an industry where these sorts of discussions will not need to be performed in strict secrecy.
Congratulations to Markus Persson and Tim Schafer for being brave or eccentric enough to trust the internet. We are sorry it didn't work out, but wish you luck in the future.
Subject: Editorial, General Tech, Systems, Mobile | February 5, 2013 - 05:10 PM | Scott Michaud
Dell, dude, you're getting a Dell!
So it is official: Dell is going private. Michael Dell, CEO, along with Silver Lake, MSD Capital, several banks, and Dell itself, will buy back stock from investors at 25% above the January 11th trading price. The whole deal is worth $24.4 billion USD.
Going private allows the company to make big shifts in its business without answering to investors on a quarterly basis. We can see how being a publicly traded company seems to hinder businesses once they grow beyond the point where a cash infusion helps. Even Apple finds it necessary to keep an absolutely gigantic pile of cash to play with, only recently paying dividends to investors.
Also contributing to the buyback, as heavily reported, is a $2 billion USD loan from Microsoft. While it sounds like a lot in isolation, it is just over 8% of the whole deal. All you can really take from it is that Microsoft supports Dell's decision and is putting its money where its intentions are.