Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM | Scott Michaud
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games
Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental Demo. We noted how the quality seemed to have dropped in the eight months following E3 while the demo was being ported to the console hardware. The most noticeable differences were in the severely reduced particle counts and the non-existent fine lighting details; of course, Epic pumped the contrast in the PS4 version which masked the lack of complexity as if it were a stylistic choice.
Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can push out of hardware at 100% perfect utilization.
Now that we know the specifications of both the PS4 and, as of recently, the Xbox One, it is time to dissect more carefully.
A recent LinkedIn post from EA Executive VP and CTO Rajat Taneja claims that the Xbox One and PS4 are a generation ahead of the highest-end PC on the market. While there are many ways to interpret that statement, in terms of raw performance it is simply not valid.
To the best of our current knowledge, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU alongside an AMD GPU with 18 GCN compute units, for a total of 1152 shader units. Even without knowing clock frequencies, this chip should be slightly faster than the Xbox One's 768 shader units across 12 GCN compute units. Sony claims the PS4 has a theoretical total of 2 teraFLOPs of performance, and the Xbox One is almost certainly slightly behind that.
Back in 2011, Epic Games created the Samaritan Demo to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said, at the time, that the demo would theoretically require 2.5 teraFLOPs of performance to run at 30 FPS at true 1080p; ultimately, it ran on a PC with a single GTX 680, rated at approximately 3.09 teraFLOPs.
This required performance, again approximately 2.5 teraFLOPs, is higher than what is theoretically possible for the consoles, both of which sit below 2 teraFLOPs. The PC may have more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
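As a sanity check on these numbers, theoretical single-precision throughput for a GCN-style GPU works out to shader count × 2 operations per clock (one fused multiply-add) × clock speed. A minimal sketch, with the caveat that the console clock speeds below are assumptions for illustration; only the shader counts were known at the time:

```python
# Theoretical peak throughput: shaders x 2 FLOPs per clock (FMA) x clock (GHz).
def teraflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

# Console clocks are assumed (800 MHz) purely for illustration.
ps4 = teraflops(1152, 0.8)        # ~1.84 TFLOPs at an assumed 800 MHz
xbox_one = teraflops(768, 0.8)    # ~1.23 TFLOPs at the same assumed clock
gtx_680 = teraflops(1536, 1.006)  # ~3.09 TFLOPs at the 1006 MHz base clock

samaritan_target = 2.5  # Epic's estimate for Samaritan at 30 FPS, 1080p
print(ps4 < samaritan_target, xbox_one < samaritan_target)  # True True
print(gtx_680 > samaritan_target)                           # True
```

Even granting both consoles generous clocks and their full theoretical peaks, neither reaches the ~2.5 teraFLOPs Epic quoted for Samaritan, while the GTX 680 clears it comfortably.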
Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.
But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.
Update, 5/24/2013: Mark Rein of Epic Games responds to the statement made by Rajat Taneja of EA. While we do not know his opinion on consoles... we know his opinion on EA's opinion:
— Mark Rein (@MarkRein) May 23, 2013
Subject: Editorial, Graphics Cards | May 23, 2013 - 12:08 PM | Ryan Shrout
Tagged: video, live, gtx 780, gk110
Missed the LIVE stream? You can catch the video replay of it below!
Hopefully by now you have read over our review of the new NVIDIA GeForce GTX 780 graphics card that launched this morning. Taking the GK110 GPU, cutting off some more cores, and setting a price point of $650 will definitely create some interesting discussion.
Join me today at 2pm ET / 11am PT as we discuss the GTX 780, our review and take your questions. You can leave them in the comments below, no registration required.
Subject: Editorial, General Tech | May 22, 2013 - 01:53 AM | Scott Michaud
Tagged: antivirus, antimalware
Antivirus packages might be a good means of guarding you from momentary lapses of judgment, but security is not equivalent to them. You always need to consider how much your system is exposed to untrusted, and even unsolicited, data. Any software which accepts untrusted data presents some attack surface with potential vulnerabilities.
This, inherently, includes software which accepts data to scan it for malware.
Last week was host to Patch Tuesday, and one of its many updates fixed a vulnerability in Microsoft's Malware Protection Engine (MPE). The affected code is only present in applications which run the 64-bit version of the engine. For home users, these applications are: Microsoft Security Essentials (x86-64), Microsoft Malicious Software Removal Tool (x86-64), and all varieties of Windows Defender (x86-64). For enterprise users, MPE is also a part of Forefront and Endpoint applications and suites.
Despite the irony, I will not beat up on Microsoft. As far as I know, vulnerabilities like these are semi-frequently patched in basically every antimalware application. At the very least, Microsoft discloses and remedies problems with reasonable and appropriate policies; they could just as easily have buried this fix and pushed it out silently or, worse, waited until it was actively exploited in the wild.
But, and I realize I am repeating myself at this point, the biggest takeaway from this news: you cannot let the mere presence of antivirus suites permit you to be complacent. No scanner will detect everything, and some might even be the way in.
Subject: Editorial, General Tech, Graphics Cards, Processors, Systems | May 21, 2013 - 05:26 PM | Scott Michaud
Tagged: xbox one, xbox
Almost exactly three months have passed since Sony announced the Playstation 4 and just three weeks remain until E3. Ahead of the event, Microsoft unveiled their new Xbox console: The Xbox One. Being so close to E3, they are saving the majority of games until that time. For now, it is the box itself as well as its non-gaming functionality.
First and foremost, the raw specifications:
- AMD APU (5 billion transistors, 8 core, on-die eSRAM)
- 8GB RAM
- 500GB Storage, Bluray reader
- USB 3.0, 802.11n, HDMI out, HDMI in
The hardware is a definite win for AMD. The Xbox One is based upon an APU which is quite comparable to what the PS4 will offer. Unlike previous generations, there will not be too much differentiation based on available performance; I would not expect to see much of a fork in terms of splitscreen and other performance-sensitive features.
A new version of the Kinect sensor, which developers can depend upon, will also be present with all units. Technically speaking, the camera has a higher resolution and a wider field of view; up to six skeletons can be tracked, with joints able to rotate rather than just hinge. Microsoft is finally also permitting developers to use the Kinect along with a standard controller to, as they imagine, allow a user to raise their controller to block with a shield. That is the hope, but near the launch of the original Kinect, Microsoft filed a patent for sign language recognition: that has not happened yet. Who knows whether the device will be successfully integrated into gaming applications.
Of course, Microsoft is known most for its system software, and the Xbox One runs three lightweight operating environments. In Windows 8, you have the Modern interface, which runs WinRT applications, and you have the desktop, which is x86 compatible.
The Xbox One borrows more than a little from this model.
The home screen for the console, which I am tempted to call the Start Screen, has a very familiar tiled interface. The tiles are not identical to Windows, but they are definitely consistent with it. This interface allows access to Internet Explorer and an assortment of apps. These apps can be pinned to the side of the screen, just like snapped Windows 8 Modern apps. I am expecting there to be "a lot of crossover" (to say the least) between this and the Windows Store; I would not be surprised if it is basically the same API. This works both when viewing entertainment content and within a game.
These three operating systems run at the same time. The main operating system is basically a Hyper-V environment which runs the two other operating systems simultaneously in sort-of virtual machines. These operating systems can be layered with low latency, since all you are doing is compositing them in a different order.
Lastly, they made reference to Xbox Live, go figure. Microsoft is seriously increasing their server capacity and expects developers to utilize Azure infrastructure to offload "latency-insensitive" computation for games. While Microsoft promises that you can play games offline, this obviously does not apply to features (or whole games) which rely upon the back-end infrastructure.
And yes, I know you will all beat up on me if I do not mention the SimCity debacle. Maxis claimed that much of the game requires an online connection due to the complicated server requirements; after a crack allowed offline functionality, it was clear that the game mostly operates fine on a local client. How much will the Xbox Live cloud service offload? Who knows, but that is at least their official word.
Now to tie up some loose ends. The Xbox One will not be backwards compatible with Xbox 360 games, although that is no surprise. Also, Microsoft says they are allowing users to resell and lend games. That said, games will be installed and will not require the disc, from what I have heard. Apart from the concerns about how much you can fit on a single 500GB drive, rumor has it that once the game is installed, loading it elsewhere (the rumor is even more unclear about whether "elsewhere" means other accounts or other machines) will require paying a fee to Microsoft. In other words? Basically not a used game.
Well, that about covers it. You can be sure we will add more as information comes forth. Comment away!
Subject: Editorial | May 17, 2013 - 10:36 AM | PCPer Staff
We often get asked about our favorite gaming notebooks and, despite the somewhat aged styling, we still think Alienware makes some of the best. Today's deal offers up the small M14x R2 with a 1600x900 screen, a Core i7-3630QM quad-core processor, GeForce GT 650M discrete graphics, and 16GB of DDR3 memory. The $1308 price is a nice $300+ discount over the normal price!
Subject: Editorial, General Tech | May 16, 2013 - 03:45 PM | Scott Michaud
Tagged: web browser, Malware, IE10
If you consider your browser security based solely on whether it will allow you to manually download a malicious executable: IE10 is the best browser ever!
Rod Trent over at Windows IT Pro seems to believe this, based on NSS Labs' recently released report, "Socially Engineered Malware Blocking". In this report, Internet Explorer blocked the user from downloading nearly all known malware (clarification: all known malware within the test). Google Chrome came in second place with a little less than a 17% failure rate, and the other browsers were quite far behind with approximately a 90% failure rate.
Based on that one metric alone, Rod Trent used a cutesy chess image to proclaim IE the... king... of the hill. Not only that, he suggests Safari, Opera, and Firefox consider "shuttering their doors." After about a decade of Internet Explorer suffering from countless different and unique vectors of exploitation, now is the time to proclaim a victor for attacks which require explicit user action?
Buckle in, readers, it's a rant.
Firstly, this reminds me a little bit of Microsoft Security Essentials. Personally, I use it, because it provides enough protection for me. Unlike its competitors, MSE has next to no false positives because it almost entirely ignores zero-day exploits. The AV package drew criticism from lab tests which measure zero-day protection, where Microsoft Security Essentials ranked second-worst.
Well, time to shutter your doors Micr... oh wait Rod Trent lauded it as award-winning. Huh...
But while we are on the topic of false positives, how do you weigh those in your grading of a browser? According to the report, and common sense, achieving pure success in this metric is dead simple if you permit your browser to simply block every download, good or bad.
If a 100% false positive rate is acceptable, it is trivial to protect users from all malicious downloads. With just a few lines of code, Firefox, Safari, and Opera could displace Internet Explorer and Chrome as the leaders of protection against socially engineered malware. However, describing every download as "malicious" would break the internet. Finding a balance between accuracy and safety is the real challenge for browsers at the front of protection technology.
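The point can be made concrete with a bit of confusion-matrix arithmetic. The sample counts below are invented purely for illustration; the sketch just shows that a "block every download" policy scores perfectly on malware blocked while also failing on every legitimate file:

```python
def block_rates(blocked_malware, total_malware, blocked_good, total_good):
    """Return (malware block rate, false positive rate) as percentages."""
    return (100.0 * blocked_malware / total_malware,
            100.0 * blocked_good / total_good)

# Hypothetical test set: 500 malware samples, 10000 legitimate downloads.
# A browser that blocks everything "wins" the malware metric outright...
detection, false_positives = block_rates(500, 500, 10000, 10000)
print(detection, false_positives)  # 100.0 100.0 -- and is useless in practice
```

A single block-rate number is meaningless without the false positive rate printed next to it, which is exactly the nuance the "king of the hill" framing drops.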
A browser that is capable of blocking malware without blocking legitimate content would certainly be applause-worthy. I guess time will tell whether Internet Explorer 10 is able to walk the balance, or whether it will just be a nuisance like the first implementations of UAC.
OK, Google did actually release exactly one native Windows application at Google I/O: It's called Android Studio, an application that helps developers create apps that run on Android, Google’s answer to Windows. But don’t worry, Microsoft fans: Internet Explorer (IE) flags the Android Studio download as potential malware.
Ah crap... that was quick.
Now to be fair, Internet Explorer 10 and later have been doing things right. I am glad to see Microsoft support standards and push for an open web after so many years. This feature helps protect users from their own complacency.
Still, be careful when you call checkmate: some places may forfeit your credibility.
Subject: Editorial, General Tech, Cases and Cooling, Processors | May 10, 2013 - 04:23 PM | Scott Michaud
Tagged: c6, c7, haswell, PSU, corsair
I cannot do it, captain! I do not have the power!
We have been discussing the ultra-low power states of Haswell processors for a little over a week, and how they could be detrimental to certain power supplies. Power supply manufacturers never quite expected that a system could draw as little as 0.05 Amps (0.6W) on the 12V rail while still being on. Since then, companies such as Enermax have started to list power supplies which have been tested and are compliant with the new power requirements.
| Series | Model | Tested | Status |
| --- | --- | --- | --- |
| AXi | AX1200i | Yes | 100% Compatible with Haswell CPUs |
| AXi | AX860i | Yes | 100% Compatible with Haswell CPUs |
| AXi | AX760i | Yes | 100% Compatible with Haswell CPUs |
| AX | AX1200 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX860 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX850 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX760 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX750 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX650 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX1050 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX850 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX750 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX650 | Yes | 100% Compatible with Haswell CPUs |
| TX-M | TX850M | Yes | 100% Compatible with Haswell CPUs |
| TX-M | TX750M | Yes | 100% Compatible with Haswell CPUs |
| TX-M | TX650M | Yes | 100% Compatible with Haswell CPUs |
| TX | TX850 | Yes | 100% Compatible with Haswell CPUs |
| TX | TX750 | Yes | 100% Compatible with Haswell CPUs |
| TX | TX650 | Yes | 100% Compatible with Haswell CPUs |
| GS | GS800 | Yes | 100% Compatible with Haswell CPUs |
| GS | GS700 | Yes | 100% Compatible with Haswell CPUs |
| GS | GS600 | Yes | 100% Compatible with Haswell CPUs |
| CX-M | CX750M | Yes | 100% Compatible with Haswell CPUs |
| CX-M | CX600M | TBD | Likely compatible — currently validating |
| CX-M | CX500M | TBD | Likely compatible — currently validating |
| CX-M | CX430M | TBD | Likely compatible — currently validating |
| CX | CX750 | Yes | 100% Compatible with Haswell CPUs |
| CX | CX600 | TBD | Likely compatible — currently validating |
| CX | CX500 | TBD | Likely compatible — currently validating |
| CX | CX430 | TBD | Likely compatible — currently validating |
| VS | VS650 | TBD | Likely compatible — currently validating |
| VS | VS550 | TBD | Likely compatible — currently validating |
| VS | VS450 | TBD | Likely compatible — currently validating |
| VS | VS350 | TBD | Likely compatible — currently validating |
Above is Corsair's slightly incomplete chart, copied from their website at 3:30 PM on May 10th, 2013; so far, everything is coming up good. Their blog should be updated as new products are validated for the new C6 and C7 CPU sleep states.
The best part of this story is just how odd it is given the race to arc-welding supplies (it's not a podcast, so you can't call Bingo!) we have been experiencing over the last several years. Simply put, some companies never thought that component manufacturers such as Intel would race to the bottom of power draw.
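The problematic figure is easy to derive from Ohm's-law-adjacent arithmetic: 0.05 A on the 12 V rail is only 0.6 W, below the minimum load some older supplies need to regulate cleanly. A quick sketch; note that the minimum-load figure below is a hypothetical example for an older unit, not any Corsair specification:

```python
# Power drawn on the 12 V rail at Haswell's deepest C6/C7 sleep states.
rail_voltage = 12.0   # volts
sleep_current = 0.05  # amps, the cited minimum draw
sleep_power = rail_voltage * sleep_current
print(round(sleep_power, 2))  # 0.6 (watts)

# Hypothetical minimum-load requirement of an older PSU, for illustration only.
psu_min_load_amps = 0.5
compatible = sleep_current >= psu_min_load_amps
print(compatible)  # False: such a supply could misbehave at Haswell idle
```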
Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM | Ryan Shrout
Tagged: video, nvidia, live, frame rating, fcat
Update: Did you miss the live stream? Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!!
I know, based solely on the amount of traffic and forum discussion, that our readers have really adopted and accepted our Frame Rating graphics testing methodology. Based on direct capture of GPU output via an external system and a high-end capture card, our new methods have helped users see GPU performance in a more "real-world" light than previous benchmarks allowed.
I also know that there are lots of questions about the process, the technology and the results we have shown. In order to try and address these questions and to facilitate new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.
Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as just talk about the science of animation and input.
The primary part of this live stream will be about education - not about bashing one particular product line or talking up another. And part of that education is your ability to interact with us live, ask questions, and give feedback. During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience. The easiest way to get your question addressed, though, will be to leave a comment or inquiry in this post below. It doesn't require registration, and it will allow us to think about the questions beforehand, giving them a better chance of being answered during the stream.
Frame Rating and FCAT Live Stream
11am PT / 2pm ET - May 9th
So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!
Subject: Editorial, General Tech, Graphics Cards, Processors | May 8, 2013 - 09:32 PM | Scott Michaud
Tagged: Volcanic Islands, radeon, ps4, amd
So the Southern Islands might not be entirely stable throughout 2013 as we originally reported; seismic activity being analyzed suggests the eruption of a new GPU micro-architecture as early as Q4. These Volcanic Islands, as they have been codenamed, should explode onto the scene opposing NVIDIA's GeForce GTX 700-series products.
It is times like these where GPGPU-based seismic computation becomes useful.
The rumor is based upon a source which leaked a fragment of a slide outlining the processor in block diagram form, along with specifications of its alleged flagship chip, "Hawaii". Of primary note, Volcanic Islands is rumored to be organized with both Serial Processing Modules (SPMs) and a Parallel Compute Module (PCM).
So apparently a discrete GPU can have serial processing units embedded on it now.
Heterogeneous Systems Architecture (HSA) is a set of initiatives to bridge the gap between massively parallel workloads and branching logic tasks. We usually make reference to this in terms of APUs and bringing parallel-optimized hardware to the CPU. In this case, we are discussing it in terms of bringing serial processing to the discrete GPU. According to the diagram, the chip would contain 8 serial processing modules, each with two processing cores and an FPU, for a total of 16 cores. There does not seem to be any definite indication of whether these cores would be based upon AMD's license to produce x86 processors or its other license to produce ARM processors. Unlike an APU, this design is heavily skewed towards parallel computation rather than a relatively even balance between CPU, GPU, and chipset features.
Now of course, why would they do that? Graphics processors can do branching logic but it tends to sharply cut performance. With an architecture such as this, a programmer might be able to more efficiently switch between parallel and branching logic tasks without doing an expensive switch across the motherboard and PCIe bus between devices. Josh Walrath suggested a server containing these as essentially add-in card computers. For gamers, this might help out with workloads such as AI which is awkwardly split between branching logic and massively parallel visibility and path-finding tasks. Josh seems skeptical about this until HSA becomes further adopted, however.
Still, there is a reason why they are implementing this now. I wonder, if the SPMs are based upon simple x86 cores, how the PS4 will influence PC gaming. Technically, a Volcanic Islands GPU would be an oversized PS4 on an add-in card. This could give AMD an edge, particularly in games ported to the PC from the PlayStation.
This chip, Hawaii, is rumored to have the following specifications:
- 4096 stream processors
- 16 serial processor cores on 8 modules
- 4 geometry engines
- 256 TMUs
- 64 ROPs
- 512-bit GDDR5 memory interface (GDDR5, as in the PS4)
- 20 nm gate-last silicon fabrication process
- Unclear if TSMC or "Common Platform" (IBM/Samsung/GLOBALFOUNDRIES)
Softpedia is also reporting on this leak. Their addition claims that the GPU will be designed on a 20nm gate-last fabrication process. While gate-last is often considered not worth the extra production effort, Fully Depleted Silicon On Insulator (FD-SOI) is apparently "amazing" on gate-last at 28nm and smaller nodes. This could mean that AMD is eyeing that technology and making this design with the intent of switching to an FD-SOI process, without the large redesign that an initially easier gate-first production would require.
Well that is a lot to process... so I will leave you with an open question for our viewers: what do you think AMD has planned with this architecture, and what do you like and/or dislike about what your speculation would mean?
Subject: Editorial, Mobile | May 7, 2013 - 12:07 AM | Scott Michaud
Tagged: unreal engine, firefox, asm.js
If, on the other hand, you wish to see an example of a large application compiled for the browser: would Unreal Engine 3 suffice?
Clearly a computer hardware website would take the effort required to run a few benchmarks, and we do not disappoint. Epic Citadel was run in its benchmark mode in Firefox 20.0.1, Firefox 22.0a2, and Google Chrome; true, it was not run for long on Chrome before the tab crashed, but you cannot blame me for trying.
Each benchmark was run at full-screen 1080p "High Performance" settings on a PC with a Core i7-3770, a GeForce GTX 670, and more available RAM than the browser could possibly allocate. The usual Firefox framerate limit was removed; the browser was the only tab open on a fresh profile; the setting layout.frame_rate.precise was tested in both positions, because I cannot keep track of the current state of requestAnimationFrame callback delay; and each scenario was performed twice and averaged.
Firefox 20.0.1
- layout.frame_rate.precise true: 54.7 FPS
- layout.frame_rate.precise false: 53.2 FPS
Firefox 22.0a2 (asm.js)
- layout.frame_rate.precise true: 147.05 FPS
- layout.frame_rate.precise false: 144.8 FPS
Google Chrome 26.0.1410.64
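Chrome, as noted, crashed before producing a result. For the two Firefox builds, the speedup implied by these numbers is easy to work out; averaging across the two precise-flag settings (a rough simplification, since they are technically different conditions):

```python
# Reported results above, in frames per second.
firefox_20 = (54.7 + 53.2) / 2        # non-asm.js build, both flag settings
firefox_22_asm = (147.05 + 144.8) / 2  # asm.js-enabled nightly build

speedup = firefox_22_asm / firefox_20
print(round(speedup, 2))  # roughly a 2.7x framerate improvement
```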
It is very enticing for Epic as well. A little over a month ago, Mark Rein and Tim Sweeney of Epic were interviewed by Gamasutra about HTML5 support for Unreal Engine. Due in part to the removal of UnrealScript in favor of game code being written in C++, Unreal Engine 4 will support HTML5. They are working with Mozilla to make the browser a reasonable competitor to consoles: write once, run on Mac, Windows, Linux, or anywhere compatible browsers can be found. Those familiar with my past editorials know this excites me greatly.
So what do our readers think? Comment away!
Subject: Editorial | April 26, 2013 - 11:40 AM | Ryan Shrout
Tagged: video, pcper, Indiegogo
UPDATE 4/26/13: We are extremely excited to see that we have met our first goal for our Indiegogo project! We are eternally grateful for our fans and readers that are supporting us in this endeavor. We are going to start putting together orders for the set materials and I am very excited about the direction this is pointing us in. There is still room to improve the project though and we have lots of great perks available for those of you that are still looking to contribute to the cause! Oh, and we should have our T-shirt design ready early next week as well. Thank you EVERYONE for reading PC Perspective!!
And for those of you looking for a bit more insight into our total goals, here is another mock up of the set!
Yesterday evening, the team at PC Perspective launched a project that will help us grow and expand our coverage of technology and computer hardware. Using the crowdfunding service Indiegogo.com, we are running a fund drive to help improve the quality of our video content and enable us to do more unique styles of content.
If you are anything like us, you love technology. Motherboards, graphics cards, processors, SSDs, monitors, laptops, tablets, cell phones and more. And you also love reading about them, hearing about them and seeing them, dissecting them and finding out what makes them tick.
I am confident that high quality video content is the future of our medium and while we have been able to do quite a lot with the basic technology and setup we have here today, my goal is to be able to bring the readers regular, high quality video content on all aspects of technology. We want to not only have video reviews for products but we want to be able to do near-daily content updates on the news of the day while balancing that with long-form interviews of personalities that make the industry function. We have dabbled in some of these content types and the responses have been great, but we need a higher quality setup to really do it right.
Our goal with this project is to build the funds necessary to turn our office in Florence, KY into a high tech video production outlet that starts with a quality set design and better quality equipment.
Other than supporting one of your favorite online outlets, we have also lined up some sweet perks for contributors to our project. We have ad-free versions of the site, T-shirts, and access to the PC Perspective Gold Club, which has some pretty ridiculous giveaways! If you support us even further, you can get some individual time with our team to tell us why you supported us, have your questions answered, or even join us for an episode of the PC Perspective Podcast!
Come visit the offices, join the process of creating a new show or make fun of Josh's laugh in person - it's all possible!
So if you have the means and you want to support our cause, if you have enjoyed any of our articles, podcasts or video reviews, consider helping to fund our project!
Support PC Perspective's Indiegogo Project
Subject: Editorial, General Tech, Displays | April 22, 2013 - 05:34 PM | Scott Michaud
Tagged: LG, ips, hack
Operators are standing by...
Of course Apple is not a primary manufacturer of LCD panels; like everyone else, they buy their panels from someone like LG. Due to how much Apple loves IPS technology, which I cannot blame them for, they in fact do purchase their displays from LG.
If you have an itchy soldering iron, so can you.
According to EmertHacks, the LG part number for Retina iPad screens is LP097QX1-SPA1. The blog post states that he could find the panel for as cheap as $55, but my own digging came up with costs between $60 and $200 plus shipping. These panels are mostly destined for iPad repair shops, but you can give one a better home.
With under $20 of other parts, this panel could be attached to a DisplayPort connection. All said and done, you could have a 2048x1536 9.7" display with an 800:1 static contrast ratio for about $70.
Subject: Editorial, General Tech, Systems | April 20, 2013 - 07:36 PM | Scott Michaud
Tagged: windows, start button, Metro
The latest rumors, based on registry digging and off-the-record testimony, claim that Windows 8.1 will include the option of booting directly into the desktop. A bold claim such as this requires some due diligence. Comically, the attempts to confirm this rumor have unearthed another: the start button, but not necessarily the start menu, could return. On the record, Microsoft also wants to be more open to customer feedback. Despite these recent insights into the future of Windows, all's quiet on the worst aspect of modernization.
Mary Jo Foley, contributor to ZDNet and very reliable bullcrap filter for Microsoft rumors, learned from a reliable source that the Start Button might have a place in the modern Windows. Quite the catch while fishing to validate a different rumor; she was originally investigating whether Microsoft would consider allowing users to boot direct to desktop via recently unearthed registry keys. Allegedly both are being planned for at least some SKUs of Windows 8.1, namely the Professional and Enterprise editions.
But, as usual for Microsoft, the source emphasized, "Until it ships, anything can change." No-one was clear about the Start Button from a functional standpoint: would it be bound to display the Start Screen? Would it be something more?
Personally, I liked the modern Windows interface. Sure, it is messed up on the modern side when it comes to multiple monitor support, but that can easily be fixed. As you will note, I am still actively boycotting everything beyond Windows 7, and this news will not change my mind. We are bickering over interface elements when the real concern is the deprecation of user control. Outside of the desktop: the only applications you can use are from the Windows Store or Windows Update; the only websites you can browse are ones which Internet Explorer can render; and the only administrator is Microsoft.
Imagine if Microsoft is told by a government that its citizens are not allowed encryption applications.
The Windows Store is clearly modeled after, and about as messed up as, the Xbox Marketplace. Even if your application gets certified, would Microsoft eventually decide that certification fees should be the burden of the developer? That is how it works on the Xbox, where each patch demands a price tag of about $40,000 after the first-one-free promotion. That would be pretty hard to swallow for an open-source application, or a cute game that a teenager makes for her significant other as a Valentine's gift.
Microsoft's current Chief Financial Officer, Peter Klein, stated in his third quarter earnings release that Windows Blue, "Further advances the vision of Windows 8 as well as responds to customer feedback." Despite how abrupt this change would seem, the recent twitchy nature should not come as a surprise; Microsoft has had a tendency to completely change course on products for quite some time now. Mary Jo mentioned how Microsoft changed course on UAC but even that is a bad example; a better one is how Microsoft changed from its initial assertions that Windows 8 Developer Preview would not be shaped by customer feedback.
A lot has changed between Developer Preview and RTM.
Then again, we can hope that Microsoft associates this pain with love for the desktop. I would be comfortable with the modern Windows if we were given a guarantee that desktop x86 applications would forever be supported. I might even reconsider using and developing applications if they allow loading uncertified metro-style applications and commit to never removing that functionality.
I can get used to a new method of accessing my applications. I can never get used to a middle-man who only says "no". If Microsoft is all ears, I hope we make this point loud and clear.
Subject: Editorial | April 18, 2013 - 01:55 PM | Scott Michaud
So, news which might excite our readers: we are going to try reviewing video games.
Of course, the first thing which needs to be addressed when reviewing games is our grading system. Games, in particular, are a very artful medium, and as such it does not entirely make sense to quantize their qualities.
The simple answer is, we will not.
Step back and consider how we review hardware: we run some benchmarks, we discuss the features in often numbing detail, and we assign an award badge to the product according to our opinion. A product could receive no award; it could receive a bronze, silver, or gold medal; finally, the truly extraordinary products receive an Editor's Choice Award. If you think about it, these transfer quite easily to video game reviews.
Our expectation is to apply two ratings to every review: a badge and a number.
A badge is very good at qualifying our assessment of a product whereas numerical scores are very good at quantifying a derivable value. We, collectively as PC gamers, have certain expectations for games and they usually demand more than the impressions of a typical console gamer. Simultaneously, we tend to be an afterthought for a lot of titles; yes, I am being generous even with that statement. Many games are outright broken, crippled by DRM, or otherwise demonstrate in very obvious terms that our money is somehow inferior. On the other hand, there are games which go above and beyond reasonable expectations held by PC gamers, and even some unreasonable ones, and are rarely hailed for it.
We are not able to judge the artistic qualities of a game using a numerical score, but we can judge its technical merits using a numerical rubric.
And so exists our planned review metric. The main point is that there will not be any definite rank-order to each game, at least from an artistic standpoint. A game can do really well in one category and really terribly in another. If you are concerned with the game itself, keep more of an eye toward which award we gave it. If you are concerned about how well the game exists as a PC title, take a look at the numerical score.
There are of course caveats to this method. A viewer who looks solely at the numerical score will not know much, if anything, about the game itself. The numerical score is just a gauge for the level of effort put into the PC version.
Then again, would you expect any less from a website called "PC Perspective," which reviews products with a blend of qualitative feature discussion and strict quantitative benchmarking?
Lastly, this is not about whether a game is "better" on a PC or on a console. Developers are free to focus on whatever platform they desire. A game designed around a console and ported to the PC will still get a great score if the finished result exhibits a "great" level of care. Likewise, even if your game is PC-exclusive, do not expect us to give it a great score if it cannot alt-tab worth a damn and is wrapped in DRM which roots our system using kernel-mode drivers.
It is not particularly hard to make a great PC experience, all it takes is effort. Fortunately, that is a property that we can assign an honest grade to.
We would really like to hear your feedback on this. Drop a line in the comments below!
Subject: Editorial, General Tech, Graphics Cards | April 14, 2013 - 02:22 AM | Scott Michaud
Tagged: never settle, never settle reloaded, amd, far cry 3
So when AMD reloaded their Never Settle bundles, they left an extra round in the barrel.
Some of my favorite games were given to me in a bundle with some piece of computer hardware. You might remember from the PC Perspective game night that I am a major fan of the Unreal Tournament franchise. My first Unreal Tournament game was an unexpected surprise when I purchased my first standalone GPU. My 166MHz Pentium computer also came bundled with Mechwarrior 2 and Wipeout.
As we discussed, AMD considers bundle offers a way to keep the software industry rolling forward. The quantity and quality of games which participate in the recent Never Settle bundles certainly deserve credit where it is due. Bioshock: Infinite is a game that just about every PC gamer needs to experience, and there are about a half-dozen other great titles as a part of the promotion depending upon which card or cards you purchase.
As it turns out, AMD negotiated with Ubisoft and added Far Cry 3: Blood Dragon to their Never Settle bundle. The coolest part is that AMD will retroactively email codes for this new title to anyone who has redeemed a Never Settle: Reloaded code.
So if you have ever Reloaded your Never Settle in the past, check your email as apparently you can Never Settle your reloads again.
Subject: Editorial, General Tech, Graphics Cards, Systems, Mobile | April 7, 2013 - 10:21 PM | Scott Michaud
Tagged: DirectX, DirectX 12
Microsoft DirectX is a series of interfaces for programmers to utilize typically when designing gaming or entertainment applications. Over time it became synonymous with Direct3D, the portion which mostly handles graphics processing by offloading those tasks to the video card. At one point, DirectX even handled networking through DirectPlay although that has been handled by Games for Windows Live or other APIs since Vista.
AMD Corporate Vice President Roy Taylor was recently interviewed by the German press, "c't magazin". When asked about the future of "Never Settle" bundles, Taylor claimed that games such as Crysis 3 and Bioshock: Infinite keep their consumers happy and also keep the industry innovating.
Keep in mind, the article was translated from German so I might not be entirely accurate with my understanding of his argument.
On a slight tangent, he discussed how new versions of DirectX tend to spur demand for new graphics processors with more processing power and more RAM. He has not heard anything about DirectX 12 and, in fact, he does not believe there will be one. As such, he is turning to bundled games to keep the industry moving forward.
Neowin, upon seeing this interview, reached out to Microsoft who committed to future "innovation with DirectX".
This exchange has obviously sparked a lot of... polarized... online discussion. One commenter claimed that Microsoft is abandoning the PC to gain a foothold in the mobile market, of which it has practically zero share, and that this is why they are dropping DirectX.
Unfortunately, this does not make sense: DirectX would be one of the main advantages which Microsoft has in the mobile market. Mobile devices have access to fairly decent GPUs which can use DirectX to draw web pages and applications much more smoothly and power-efficiently than their CPU counterparts. If anything, DirectX would only increase in relevance if Microsoft were blindly making a play for mobile.
The major threat to DirectX is still quite far off on the horizon. At some point we might begin to see C++ AMP or OpenCL nibble away at what DirectX does best: offloading highly parallel tasks to specialized processing units.
Still, releases such as DirectX 11.1 are quite focused on back-end tweaks and adjustments. What do you think a DirectX 12 API would even do, that would not already be possible with DirectX 11?
Subject: Editorial | April 6, 2013 - 04:34 AM | Scott Michaud
Tagged: Windows 8.1, windows blue, internet explorer, Internet Explorer 11
Windows Blue, now known as Windows 8.1, was leaked not too long ago. We reported on the release's illegitimate availability practically as soon as it happened. We knew that Internet Explorer incremented its version to 11. The recent releases of Internet Explorer each made decent strides to catch the browser up to Google's Chrome and Mozilla's Firefox. Once thrown to the sharks (er, thoroughly investigated), this release is pining to be just as relevant despite how near its expected release is to Internet Explorer 10's.
One of my first thoughts upon realizing that Internet Explorer 11 was an impending "thing": will it make it to Windows 7? Unfortunately, we still have no clue. Thankfully, unlike Windows RT, which disallows rendering engines other than Internet Explorer's Trident, Windows 7 still permits alternative browsers. If Internet Explorer 11 is unavailable, Windows 7 users can still install Firefox or Chrome.
For those who only use Internet Explorer and can upgrade to 11, you might be pleased to find WebGL support. Microsoft has been quite vocal against WebGL for quite some time, calling it a security threat when facing the wild west of the internet. Then again, to some extent, the internet is a security nightmare in itself. The question is whether WebGL can be sufficiently secured for uses such as:
- Animation effects (I created this specific demo... not the rest)
- Gorgeous, smooth, and battery-efficient 2d games
- Likewise beautiful 3D experiences
- And of course there's a semi-realtime raytracing demo.
This, to some extent, marks a moment where Microsoft promotes a Khronos standard. With some level of irony, Apple was one of the founding members of the WebGL group, yet Microsoft might beat Safari to default WebGL support. Of course, it could not be that simple: IE11 apparently accepts WebGL shaders (the math which computes the color and position of a pixel) in IESL rather than the standard GLSL. IESL, according to the name of its registry flag, seems to be heavily based on the HLSL seen in DirectX.
I guess they just cannot let Khronos have a total victory?
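To make the distinction concrete, here is a trivial fragment shader in the standard GLSL ES dialect that the WebGL specification mandates, followed (in comments) by a rough HLSL-style equivalent of the sort IESL appears to be modeled on. The exact IESL syntax is undocumented, so the HLSL half is an illustrative assumption, not the actual dialect IE11 accepts:

```glsl
// Standard WebGL fragment shader (GLSL ES 1.00):
// paints every pixel a solid orange.
precision mediump float;
void main() {
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);
}

// A roughly equivalent HLSL-style pixel shader, shown for comparison
// (hypothetical; the real IESL dialect is undocumented):
// float4 main() : SV_Target {
//     return float4(1.0, 0.5, 0.0, 1.0);
// }
```

The two dialects express the same idea, but the differing types (`vec4` versus `float4`), entry-point conventions, and output semantics mean shaders written for every other WebGL implementation would not run unmodified.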
SPDY also seems to be coming to IE11. SPDY, pronounced "speedy" and not an acronym, is a protocol designed to cut loading latency. Cool stuff.
Last and definitely least, IE11 is furthering its trend of pretending that it is a Mozilla Gecko-like rendering engine in its user agent string. Personally, I envision an IE logo buying a fiery-orange tail at a cosplay booth. They have been doing this for quite some time now.
Subject: Editorial, General Tech, Shows and Expos | March 27, 2013 - 03:25 AM | Scott Michaud
Tagged: battlefield, battlefield 4, GDC, GDC 13
Battlefield 4 is coming; that has been known since Medal of Honor: Warfighter's release and its promise of beta access, but the gameplay trailer is already here. Clocking in at just over 17 minutes, "Fishing in Baku" looks amazing from a technical standpoint.
The video has been embedded below. It is a little not-safe-for-work due to language and amputation.
Now that you have finished gawking, we have gameplay to discuss. I cannot help but be disappointed with the campaign direction. Surely the story was in planning prior to the release of Battlefield 3. Still, it seems to face the same generic-character problem which struck the last campaign.
In Battlefield 3, I really could not recognize many characters apart from the lead which made their deaths more confusing than upsetting. Normally when we claim a character is identifiable, we mean that we can relate to them. In this case, when I say the characters were not identifiable, I seriously mean that I probably could not pick them out in a police lineup.
Then again, the leaked promotional image for Battlefield 4 seems to show Blackburn at the helm. I guess there is some hope. Slim hope, which the trailer does not contribute to. I mean, even the end narration underscored how pointless the character interactions were. All this in spite of EA's YouTube description proclaiming this to be human, dramatic, and believable.
Oh well, it went boom good.
Subject: Editorial, General Tech, Processors, Shows and Expos | March 20, 2013 - 06:26 PM | Scott Michaud
Tagged: windows rt, nvidia, GTC 2013
NVIDIA develops processors, but without an x86 license they are only able to power ARM-based operating systems. When it comes to Windows, that means Windows Phone or Windows RT. The latter segment of the market has seen disappointing sales according to multiple OEMs, sales which Microsoft blames on those same OEMs, but the jolly green GPU company is not crying doomsday.
NVIDIA just skimming the Surface RT, they hope.
As reported by The Verge, NVIDIA CEO Jen-Hsun Huang was optimistic that Microsoft would eventually let Windows RT blossom. He noted how Microsoft very often "gets it right" at some point when they push an initiative. And it is true, Microsoft has a history of turning around perceived disasters across a variety of devices.
They also have a history of, as they call it, "knifing the baby."
I think there is a very real fear for some that Microsoft could consider Intel's latest offerings good enough to stop pursuing ARM. Of course, the more they pursue ARM, the more their business model will rely upon the-interface-formerly-known-as-Metro and likely all of its certification politics. As such, I think it is safe to say that I am watching the industry teeter on a fence with a bear on one side and a pack of rabid dogs on the other. On the one hand, Microsoft jumping back to Intel would allow them to perpetuate the desktop and all of the openness it provides. On the other hand, even if they stick with Intel, they will likely just kill the desktop anyway for the sake of reducing user confusion and gaining the security benefits of certification. We might just have fewer processor manufacturers when they do that.
So it could be that NVIDIA is confident that Microsoft will push Windows RT, or it could be that NVIDIA is pushing Microsoft to continue to develop Windows RT. Frankly, I do not know which would be better... or more accurately, worse.
Subject: Editorial | February 27, 2013 - 02:26 PM | Scott Michaud
Tagged: Podcast Bingo, Bingo
It's Bumpday! No, no I'm not reviving that, at least not yet.
But it is Wednesday and as such we will be gathering for another fine PC Perspective Podcast. As always, if you wish to join us: head down to pcper.com/live or click on the little radio tower in the upcoming events box on the right side of your screen.
Yes, I know, there are two P's. Deal with it.
Starting this week, we will have a new activity for our viewers to follow along with. Play along with the first official PC Perspective Podcast Bingo Card! Each episode, keep track of every time we mention “Arc Welding” by putting us to task and marking off the square.
Be sure to call out the spaces you mark off and whatever Bingo patterns you manage to make out of it.
Our IRC room during the podcast (it's usually active 24/7) is: irc.mibbit.net #pcper
Of course this is just for fun – but fun is fun!