Join the PC Perspective Team and Austin Evans LIVE on Tonight's Podcast!

Subject: Editorial | June 4, 2014 - 07:42 PM |
Tagged: video, pcper, live, austin evans

Tonight's live edition of the PC Perspective Podcast is going to have a special guest, the Internet's Austin Evans. You likely know of Austin through his wildly popular YouTube channel or maybe his dance moves.

But seriously, Austin Evans is a great guy with a lot of interesting input on technology. Stop by our live page at http://www.pcper.com/live at 10pm ET / 7pm PT for all the fun!

Make sure you don't miss it by signing up for our PC Perspective Live Mailing List!

pcperlive.png

Source: PCPer Live!

TrueCrypt Taken Offline Doesn't Pass My Smell Test

Subject: Editorial, General Tech | May 29, 2014 - 02:17 AM |
Tagged: TrueCrypt

It should not pass anyone's smell test, but it apparently does, according to tweets and other articles. Officially, the TrueCrypt website (which now redirects to their SourceForge page) claims that, with the end of Windows XP support (??), the TrueCrypt development team wants users to stop using their software. Instead, they suggest switching to BitLocker, Mac OS X's built-in encryption, or whatever random encryption suite comes up when you search your Linux distro's package manager (!?). Not only that, but several versions of Windows (such as 7 Home Premium) do not have access to BitLocker. Lastly, none of these are a good solution for users who want a single encrypted container that works across multiple OSes.

A new version (don't use it!!!), TrueCrypt 7.2, was also released and signed with the developers' private signing key.

TrueCrypt_Logo.png

The developers have not denied the end of support, nor its full-of-crap reasoning. (Seriously, because Microsoft ended Windows XP support almost two months ago, they pull support for a two-year-old version now?)

They have also not confirmed it. They have been missing since at least "the announcement" (or earlier, if they were not the ones who made it). Going missing and unreachable on the day of your supposedly gigantic resignation announcement does not support the validity of that announcement.

To me, that is about as unconfirmed as you can get.

Still, people are believing the claims that TrueCrypt 7.1a is not secure. That version has been around since February 2012 and, beyond people simply reading its source code, has passed a significant portion of a third-party audit. Even if you believe the website, it only says that TrueCrypt will no longer receive security updates. It does not say that TrueCrypt 7.1a is vulnerable to any known attack.

In other words, the version that has been good enough for over two years, and that government agencies have reportedly been unable to penetrate in several known cases, is probably as secure today as it was last week.

"The final version", TrueCrypt 7.2, is a decrypt-only release. It allows users to decrypt existing vaults (although who knows what else it does) so they can move their data to another solution. The source code changes have been published, and they do not seem shady so far, but since we cannot even verify that the developers' private key has not leaked, I would not trust it. A very deep compromise could make finding vulnerabilities very difficult.

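For what it is worth, if you still have a TrueCrypt 7.1a installer around, the least you can do is confirm it matches a copy you (or a mirror you trust) archived before all of this happened. Below is a minimal, hypothetical sketch using Python's hashlib; the known-good hash is a placeholder you would fill in from a trusted, pre-takedown archive, not a real TrueCrypt checksum.

    import hashlib

    # Placeholder: substitute a SHA-256 value recorded from a copy archived
    # before the May 2014 takedown. This is NOT a real TrueCrypt hash.
    KNOWN_GOOD_SHA256 = "<hash recorded from a trusted, pre-takedown archive>"

    def sha256_of(path, chunk_size=1024 * 1024):
        """Hash the file in chunks so large installers need not fit in RAM."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    if __name__ == "__main__":
        actual = sha256_of("TrueCrypt Setup 7.1a.exe")
        print("Matches the archived copy." if actual == KNOWN_GOOD_SHA256
              else "Does NOT match -- do not trust this file.")
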
So what is going on? Who knows. One possibility is that they were targeted for a very coordinated hack, one which completely owned them and their private key, performed by someone(s) who spent a significant amount of time modifying a fake 7.2 version. Another possibility is that they were legally gagged and forced to shut down operations, but they managed to negotiate a method for users to decrypt existing data with a neutered build.

One thing is for sure: if this is a GOG-style publicity stunt, I will flip a couple of tables.

We'll see. ┻━┻ ¯\_(ツ)_/¯ ┻━┻

Source: TrueCrypt

Mozilla Firefox to Implement Adobe DRM for Video

Subject: Editorial, General Tech | May 14, 2014 - 09:56 PM |
Tagged: ultraviolet, mozilla, DRM, Adobe Access, Adobe

Needless to say, DRM is a controversial topic and I am clearly against it. I do not blame Mozilla, though. The non-profit organization responsible for Firefox knew that it could not stand against Chrome, IE, and Safari on this issue while remaining a viable consumer software provider. I do not blame Apple, Google, and Microsoft for their decisions, either. This problem is much bigger, and it comes down to a total misunderstanding of basic mathematics (albeit at a ridiculously abstract and applied level).

22-mozilla-2.jpg

Simply put, piracy figures are meaningless. They are a measure of how many people use content without paying (assuming they are even accurate). You know what is more useful? Sales figures. Piracy figures are measurements, dependent variables, and so is revenue. Measurements cannot influence other measurements. Specifically, measurements cannot influence anything because they are, themselves, the result of influences. That is what "a measure" is.

Implementing DRM is not a measurement, however. It is a controllable action whose influence can be recorded. If you implement DRM and your sales go down, it hurt you. You may notice piracy figures decline. However, you should be too busy to care because you should be spending your time trying to undo the damage you did to your sales! Why are you looking at piracy figures when you're bleeding money?

I have yet to see a DRM implementation that correlated with an increase in sales. I have, however, seen some which correlate with a massive decrease.

The thing is, Netflix might know that, and I am pretty sure some of the web browser companies know it too. They do not necessarily want to implement DRM. What they want is content and, surprise, the people in charge of that content are definitely not enlightened to this logic. I am not even sure they realize that the reason content is pirated before its release date is that those leaks do not come from end users.

But whatever. Technology companies, who want that content available on their products, are stuck finding a way to appease the content owners in a manner that damages their users and shrinks their potential market the least. For Mozilla, this means keeping as much open as possible.

do-not-hurt-2.jpg

Since Mozilla does not have existing relationships with Hollywood, Adobe Access will be the actual mechanism for decrypting and displaying the video. They are careful to note that this only applies to video; they believe their existing relationships in text, images, and games will keep the disease from spreading. The implementation is basically a plug-in architecture wrapped in a sandbox that is open source and as strict as possible.

This sandbox is intended to prevent a security vulnerability from gaining access to the host system, to give Firefox a way to rein in the DRM module if its performance hitches, and to stop the DRM from querying the machine for authentication. That last part is something they wanted to highlight, because it shows their effort to protect the privacy of their users. They also imply there will be a method for users to opt out, but did not go into specifics.

As an aside, Adobe will support its Access DRM software on Windows, Mac, and Linux, and Mozilla is pushing hard for Android and Firefox OS as well. According to Adobe, Access DRM is certified for use with UltraViolet content.

I accept Mozilla's decision to join everyone else, but I am sad that it came to this. I can think of only two reasons for including DRM: legal (felony) "protection" under the DMCA, or making content companies feel better while they slowly sink their own ships chasing numbers which have nothing to do with profits or revenue.

Ultimately, though, they made a compromise. That is always how we stumble and fall down slippery slopes. I am disappointed but I cannot suggest a better option.

Source: Mozilla

Mozilla Makes Suggestions to the FCC about Net Neutrality

Subject: Editorial, General Tech | May 5, 2014 - 08:08 PM |
Tagged: mozilla, net neutrality

Recently, the FCC has been moving to give up on net neutrality. Mozilla, being dedicated to the free (as in speech) and open internet, has offered a compromise. Their proposal is that the FCC classify internet service providers (ISPs) as common carriers on the server side, imposing restrictions that prevent discrimination against traffic headed to customers, while allowing ISPs to remain "information services" on the consumer side.

mozilla-fcc.png

In other words, force ISPs to give services unrestricted access to consumers, without flipping unnecessary tables over the content distribution (TV, etc.) businesses that ISPs also run. Like every proposal so far, however, it could have some consequences.

"Net neutrality" is a hot issue lately. Simply put, the internet gives society an affordable method of sharing information. Just how much turns out to be "just information" is catching numerous industries off guard, including ones which ISPs participate in (such as TV and movie distribution), and that leads to serious tensions.

On the one hand, these companies want to protect their existing business models. They want consumers to continue to select their cable and satellite TV packages, on-demand videos, and other services at controlled profit margins and without the stress and uncertainty of competing.

On the other hand, if the world changes, they want to be the winner in that new reality. Yikes.

mozilla-UP.jpg

A... bad... photograph of Mozilla's "UP" anti-datamining proposal.

Mozilla's proposal is very typical of them. They tend to propose compromises which divide an issue such that both sides get most of what they need. Another good example is "UP", or User Personalization, which tries to cut down on data mining by giving the browser a way to tell websites what they actually want to know (and letting the user tell the browser how much to share). The user compromises by giving the amount of information they find acceptable, so the website can compromise and take only what it needs (rather than developing methods to grab anything and everything it can). It feels like a similar thing is happening here. This proposal gives users what they want, the freedom to choose services without restriction, without tossing ISPs into "Title II" common carrier status altogether.

Of course, this probably comes with a few caveats...

The first issue that pops into my mind is, "What is a service?" I see this causing problems for peer-to-peer applications (including BitTorrent Sync and CrashPlan, excluding CrashPlan Central). Neither endpoint would necessarily be classified as "a server", or at least it would be hard to convince a non-technical lawmaker that one is, and thus ISPs would not need to apply common carrier restrictions to that traffic. This could be a serious issue for WebRTC. Even worse, companies like Google and Netflix would have no incentive to help fight those battles -- they're already legally protected. It would have to be defined, very clearly, what makes "a server".

Every method will get messy for someone. Still, at least the discussion is being had.

Source: Mozilla

Post Tax Day Celebration! Win an EVGA Hadron Air and GeForce GTX 750!

Subject: Editorial, General Tech, Graphics Cards | April 30, 2014 - 10:05 AM |
Tagged: hadron air, hadron, gtx 750, giveaway, evga, contest

Congrats to our winner: Pierce H.! Check back soon for more contests and giveaways at PC Perspective!!

In these good old United States of America, April 15th is a trying day. Circled on most of our calendars is the final deadline for paying up your bounty to Uncle Sam so we can continue to have things like freeway systems and universal Internet access. 

But EVGA is here for us! Courtesy of our long-time sponsor, you can win a post-Tax Day prize pack that includes both an EVGA Hadron Air mini-ITX chassis (reviewed by us here) and an EVGA GeForce GTX 750 graphics card.

evgacontestapril.jpg

Nothing makes paying taxes better than free stuff that falls under the gift limit...

With these components under your belt you are well down the road to PC gaming bliss, upgrading your existing PC or starting a new one in a form factor you might not have otherwise imagined. 

Competing for these prizes is simple and open to anyone in the world, even if you don't suffer the same April 15th fear that we do. (I'm sure you have your own worries...)

  1. Fill out the form at the bottom of this post to give us your name and email address, in addition to the reasons you love April 15th! (Seriously, we need some good ideas for next year to keep our heads up!) Note that leaving a standard comment on this post does not count as an entry, though you are welcome to do that too.
     
  2. Stop by our Facebook page and give us a LIKE (I hate saying that), head over to our Twitter page and follow @pcper, and heck, why not check out our many videos and subscribe to our YouTube channel?
     
  3. Why not do the same for EVGA's Facebook and Twitter accounts?
     
  4. Wait patiently for April 30th, when we will draw and update this news post with the winner's name and tax documentation! (Okay, probably not that last part.)

A huge thanks goes out to our friends and supporters at EVGA for providing the hardware to hand out to you all. If it weren't for sponsors like this, PC Perspective just couldn't happen, so be sure to give them some thanks when you see them around the In-tar-webs!!

Good luck!

Source: EVGA

AMD AM1 Retested on 60 Watt Power Supply

Subject: Editorial | April 23, 2014 - 09:51 PM |
Tagged: TDP, Athlon 5350, Asus AM1I-A, amd, AM1

If I had one regret about my AM1 review that posted a few weeks ago, it was that I used a pretty hefty (relatively speaking) 500 watt power supply for a part that is listed at a 25 watt TDP.  Power supplies really do not hit their rated efficiency numbers until they are under a meaningful load, typically at least 20% of their capacity.  Even the most efficient 500 watt power supply is going to inflate the consumption numbers of the diminutive parts that we are currently testing.

am1p_01.jpg

Keep it simple... keep it efficient.

Ryan had sent along a 60 watt notebook power supply with an ATX cable adapter at around the same time as I started testing the AMD Athlon 5350 and Asus AM1I-A.  I was somewhat roped into running that previously mentioned 500 watt power supply for comparative reasons: I was also testing a 100 watt TDP A10-6790 APU with a pretty loaded Gigabyte A88X based ITX motherboard, and that combination would have likely fried the 60 watt (12V x 5A) notebook power supply under load.

Now that I had a little extra time on my hands, I was able to finally get around to seeing exactly how efficient this little number could get.  I swapped the old WD Green 1 TB drive for a new Samsung 840 EVO 500 GB SSD.  I removed the BD-ROM drive completely from the equation as well.  Neither of those parts uses a lot of wattage, but I am pushing this combination to go as low as I possibly can.

power-idle.png

power-load.png

The results are pretty interesting.  At idle we see the 60 watt supply (sans spinning drive and BD-ROM) hitting 12 watts as measured from the wall.  The 500 watt power supply and those extra pieces added another 11 watts of draw.  At load we see a somewhat similar story, but not nearly as dramatic as at idle.  The 60 watt system draws 29 watts while the 500 watt system sits at 37 watts.
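
To put the load-percentage argument in numbers, here is a rough, purely illustrative Python sketch using the wall readings above. Wall draw includes the supply's own conversion losses, so these percentages are optimistic upper bounds on the actual DC-side load.

    # Rough load percentages for the measured wall draw (upper bounds, since
    # wall draw includes the supply's own conversion losses).
    readings = {
        "60 W adapter, idle": (12, 60),
        "60 W adapter, load": (29, 60),
        "500 W ATX, idle":    (23, 500),   # 12 W + 11 W extra (PSU, HDD, BD-ROM)
        "500 W ATX, load":    (37, 500),
    }

    for label, (watts, capacity) in readings.items():
        print(f"{label}: {watts / capacity:.0%} of rated capacity")

    # The 500 W unit never climbs out of the single digits -- far below the
    # load range where an ATX supply reaches its rated efficiency.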

am1p_02.jpg

So how do you get from a 60 watt notebook power adapter to ATX standard? This is the brains behind the operation.

The numbers for both power supplies are good, but we do see a nice jump in efficiency from using the smaller unit and an SSD instead of a spinning drive.  Either way, the Athlon 5350 and AMD AM1 infrastructure sip power compared to most desktop processors.

Source: AMD

Ars Technica Estimates Steam Sales and Hours Played

Subject: Editorial, General Tech | April 16, 2014 - 01:56 AM |
Tagged: valve, steam

Valve does not release sales or hours played figures for any game on Steam and it is rare to find a publisher who will volunteer that information. That said, Steam user profiles list that information on a per-account basis. If someone, say Ars Technica, had access to sufficient server capacity, say an Amazon Web Services instance, and a reasonable understanding of statistics, then they could estimate.

Oh look, Ars Technica estimated by extrapolating from over 250,000 random accounts.

SteamHW.png

If you are interested, I would definitely look through the original editorial for all of its many findings. Here, if you will let me (and you can't stop me even if you don't), I would like to add my own analysis on a specific topic. The Elder Scrolls V: Skyrim on the PC, according to VGChartz, sold 3.42 million copies at retail, worldwide. The thing is, Steamworks registration was required for every copy, whether sold at retail or online. According to Ars Technica's estimates, 5.94 million copies were registered with Steam.

5.94 minus 3.42 leaves 2.52 million copies sold digitally, meaning over two-fifths of PC sales were made through Steam and other digital distribution platforms. It also means that the PC was the game's second-best selling platform, ahead of the PS3 (5.43 million) and behind the Xbox 360 (7.92 million), minus any digital sales on those platforms, of course. Despite its engine being built on DirectX 9, Skyrim is still a fairly demanding game, so that is a fairly healthy install base of decent gaming PCs.
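
Since the split is easy to lose track of, here is the arithmetic spelled out in a few lines of Python, using only the figures quoted above:

    # Skyrim PC copies, in millions (figures quoted above)
    steam_registered = 5.94   # Ars Technica's Steamworks estimate
    retail_boxed = 3.42       # VGChartz worldwide retail estimate

    digital = steam_registered - retail_boxed
    print(f"Digital copies: {digital:.2f} million")                        # 2.52
    print(f"Digital share of PC sales: {digital / steam_registered:.0%}")  # ~42%

    # Platform totals (millions): PC 5.94, PS3 5.43, Xbox 360 7.92,
    # ignoring any console digital sales.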

Did you discover anything else on your own? Be sure to discuss it in our comments!

Source: Ars Technica

GDC 2014: Shader-limited Optimization for AMD's GCN

Subject: Editorial, General Tech, Graphics Cards, Processors, Shows and Expos | March 30, 2014 - 01:45 AM |
Tagged: gdc 14, GDC, GCN, amd

While Mantle and DirectX 12 are designed to reduce overhead and keep GPUs loaded, the conversation shifts when you are limited by shader throughput. Modern graphics processors are dominated by, sometimes, thousands of compute cores. Video drivers are complex packages of software, and one of their many tasks is converting your scripts, known as shaders, into machine code for the hardware. If that machine code is efficient, it can mean drastically higher frame rates, especially at extreme resolutions and intense quality settings.

amd-gcn-unit.jpg

Emil Persson of Avalanche Studios, probably best known for the Just Cause franchise, published the slides and script of his talk on optimizing shaders. The talk focuses on AMD's GCN architecture, due to its presence in both consoles and PCs, while bringing up older GPUs for comparison. Yes, it has many snippets of GPU assembly code.

AMD's GCN architecture is actually quite interesting, especially dissected as it was in the presentation. It is simpler than its ancestors and much more CPU-like, with resources mapped to memory (and caches of said memory) rather than "slots" (although drivers and APIs often pretend those relics still exist), with vectors mostly treated as collections of scalars, and so forth. Tricks which attempt to pack instructions together into vector operations, such as using dot products, can just put irrelevant restrictions on the compiler and optimizer... because the hardware breaks those vector operations back down into the very same component-by-component ops that you thought you were avoiding.

Basically, and it makes sense coming from GDC, this talk rarely glosses over points. It goes over execution speed of one individual op compared to another, at various precisions, and which to avoid (protip: integer divide). Also, fused multiply-add is awesome.
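
To make the dot-product point concrete, here is a trivial sketch in Python (obviously not GPU code) of what a float3 dot product lowers to on a scalar-oriented architecture like GCN: the "single" vector operation is the same per-component multiply-adds you could have written by hand, so packing work into a dot product saves nothing and can constrain the optimizer. The instruction names in the comments are rough GCN mnemonics, not generated output.

    def dot3(a, b):
        """Roughly what a float3 dot product becomes on a scalar ISA:
        one multiply followed by two (ideally fused) multiply-adds."""
        acc = a[0] * b[0]           # roughly a v_mul
        acc = a[1] * b[1] + acc     # roughly a v_mad / v_fma
        acc = a[2] * b[2] + acc     # roughly a v_mad / v_fma
        return acc

    print(dot3((1.0, 2.0, 3.0), (4.0, 5.0, 6.0)))  # 32.0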

I know I learned.

As a final note, this returns to the discussions we had prior to the launch of the next-generation consoles. Developers are learning how to make their shader code much more efficient on GCN, and that could easily translate to leading PC titles. Especially with DirectX 12 and Mantle lightening the CPU-side bottlenecks, learning how to do more work per FLOP addresses the other side. Everyone was looking at Mantle as AMD's play for success through harnessing console mindshare (and in terms of Intel vs AMD, it might help). Honestly, though, I believe that trends like this presentation will prove more significant... even if they stay behind the scenes. Of course developers were always having these discussions, but now console developers will mostly be talking about a single architecture -- that is a lot of people talking about very few things.

This is not really about reducing overhead; this is about teaching people how to do more work with less, especially in the situations (high resolutions with complex shaders) where the GPU is most relevant.

Mozilla Dumps "Metro" Version of Firefox

Subject: Editorial, General Tech | March 16, 2014 - 03:27 AM |
Tagged: windows, mozilla, microsoft, Metro

If you use the Firefox browser on a PC, you are probably using its "Desktop" application. Mozilla also had a version for "Modern" Windows 8.x that could be used from the Start Screen. You probably did not use it, because fewer than 1,000 people per day did. That is more than four orders of magnitude smaller than the number of users on the Desktop version's pre-release builds.

Yup, less than one ten-thousandth.

22-mozilla-2.jpg

Jonathan Nightingale, VP of Firefox, stated that Mozilla would not be willing to release the product without committing to its future development and support. There was not enough interest to take on that burden and it was not forecast to have a big uptake in adoption, either.

From what we can see, it's pretty flat.

The code will continue to exist in the organization's Mercurial repository. If "Modern" Windows gets a massive influx of interest, they could return to what they had. It should also be noted that there never was a version of Firefox for Windows RT, because Microsoft does not allow third-party rendering engines as part of its Windows Store certification requirements (everything must be based on Trident, the core of Internet Explorer). That said, the same is true of iOS, and Firefox Junior exists within those limitations. It is not truly Firefox, being little more than a re-skinned Safari (as permitted by Apple), but it exists. I have heard talk of a Firefox Junior for Windows RT, Internet Explorer reskinned by Mozilla, but not in any detail. The organization is very attached to its own technology because, if whoever made the underlying engine does not support new features or lags in JavaScript performance, a re-skin has no way to work around it.

Paul Thurrott of WinSupersite does not blame Mozilla for killing "Metro" Firefox. He acknowledges that they gave it a shot and did not see enough pre-release interest to warrant a product. He places some of the blame on Microsoft for the limitations it places on browsers (especially on Windows RT). In my opinion, this is just a symptom of the larger problem of Windows post-7. Hopefully, Microsoft can correct these problems and do so in a way that benefits their users (and society as a whole).

Source: Mozilla

Valve's Direct3D to OpenGL Translator (Or Part of It)

Subject: Editorial, General Tech | March 11, 2014 - 10:15 PM |
Tagged: valve, opengl, DirectX

Late last night, Valve released source code from their "ToGL" transition layer. This bundle of code sits between "[a] limited subset of Direct3D 9.0c" and OpenGL, translating engines designed for the former into the latter. It was pulled out of the DOTA 2 source tree and published standalone... mostly. Basically, it is completely unsupported and probably will not even build without some other chunks of the Source engine.
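
As a purely conceptual illustration of the kind of work such a layer does (this is not actual ToGL code, which is C++), the simplest pieces are straight enumerant translations, for example mapping Direct3D 9 primitive types onto their OpenGL equivalents, before the much harder shader and state translation even begins:

    # Illustrative only -- a tiny slice of the translation tables a
    # D3D9-to-OpenGL layer needs. The real ToGL covers far more than enums.
    D3DPT_TO_GL = {
        "D3DPT_POINTLIST":     "GL_POINTS",
        "D3DPT_LINELIST":      "GL_LINES",
        "D3DPT_LINESTRIP":     "GL_LINE_STRIP",
        "D3DPT_TRIANGLELIST":  "GL_TRIANGLES",
        "D3DPT_TRIANGLESTRIP": "GL_TRIANGLE_STRIP",
        "D3DPT_TRIANGLEFAN":   "GL_TRIANGLE_FAN",
    }

    def translate_primitive(d3d_primitive: str) -> str:
        return D3DPT_TO_GL[d3d_primitive]

    print(translate_primitive("D3DPT_TRIANGLELIST"))  # GL_TRIANGLES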

valve-dx-opengl.jpg

Still, Valve did not need to release this code, but they did. The way a lot of open-source projects work is that someone dumps a starting blob and, if it is sufficient, the community pokes and prods it into a self-sustaining entity. The real question is whether the code Valve provided is sufficient and, as is often the case, time will tell. Either way, this is a good thing that other companies really should embrace: giving out your old code to further the collective. We are just not sure how good.

ToGL is available now at Valve's GitHub page under the permissive, non-copyleft MIT license.

Source: Valve GitHub

The Bigger They Are: The Titan They Fall? 48GB Install

Subject: Editorial, General Tech | February 25, 2014 - 03:43 PM |
Tagged: titanfall, ssd

UPDATE (Feb 26th): Our readers pointed out in the comments, although I have yet to test it, that you can change Origin's install-to directory before installing a game to keep certain games on a separate hard drive from the rest. Not as easy as Steam's method, but it apparently works for games, like this one, that you want somewhere else. I figured Origin would forget about games left in the old directory, but apparently not.

Well, okay. Titanfall will require a significant amount of hard drive space when it is released in two weeks. Receiving the game digitally will push 21GB of content through your modem and unpack to 48GB. Apparently, the next generation has arrived.

titanfall.jpg

Honestly, I am not upset about this. Yes, it basically ignores customers who install their games to SSDs. Origin, at the moment, forces all games to be installed in a single directory (albeit one that can be anywhere), unlike Steam, which allows games to be individually sent to multiple folders. It would be a good idea to keep those customers in mind... but not at the expense of the game itself. As always, both "high-end" and "unoptimized" titles have high minimum specifications; we decide which label applies by considering how effectively the performance is used.

That is something that we will need to find out when it launches on March 11th.

Source: PC Gamer

Kaveri loves fast memory

Subject: Editorial, General Tech | February 25, 2014 - 03:34 PM |
Tagged: ddr3, Kaveri, A10 7850K, amd, linux

You don't often see performance scaling as clean as what Phoronix saw when testing the effect of memory speed on AMD's A10-7850K.  Pick any result and you can clearly see a smooth increase in performance from DDR3-800 to DDR3-2400.  The only place that increase tapers off is between DDR3-2133 and DDR3-2400, with some tests showing little to no gain between those two speeds.  Some tests do still show an improvement; for certain workloads on Linux the extra money is worth it, but in other cases you can save a few dollars and stick with the slightly cheaper DDR3-2133.  Check out the full review here.

image.php_.jpg

"Earlier in the week I published benchmarks showing AMD Kaveri's DDR3-800MHz through DDR3-2133MHz system memory performance. Those results showed this latest-generation AMD APU craving -- and being able to take advantage of -- high memory frequencies. Many were curious how DDR3-2400MHz would fair with Kaveri so here's some benchmarks as we test out Kingston's HyperX Beast 8GB DDR3-2400MHz memory kit."


Source: Phoronix

Irrational Games Implodes with Controlled Demolition

Subject: Editorial, General Tech | February 19, 2014 - 06:15 PM |
Tagged: bioshock infinite

The team behind the original BioShock and BioShock: Infinite has decided to call it quits. After seventeen years, depending on where you start counting, the company dissolved to form another, much smaller studio. Only about fifteen employees will transition to the new team. The rest are being provided financial support, given a bit of time to develop their portfolios, and can attend a recruitment day to be interviewed by other studios and publishers. They may also be offered employment elsewhere within Take-Two Interactive.

bioshock_infinite_sp.jpg

The studio formed by the handful of remaining employees will look to develop games based on narrative, which is definitely their strength. Each game will be distributed digitally and Take-Two will continue to be their parent company.

While any job loss is terrible, I am interested in the future project. BioShock: Infinite sold millions of copies, but I wonder if its sheer size ultimately caused it harm. It was pretty and full of detail, at the expense of requiring a large team. The game had a story which respected your intelligence (you might not understand it, and that was okay), but I have little confidence that it was anywhere close to the team's original vision. From budget constraints to the religious beliefs of development staff, we already know about several aspects of the game that changed significantly. Even Elizabeth, according to earlier statements from Ken Levine, was on the bubble because of the complexity of her AI. I can imagine how difficult it is to resist those changes when staring at man-hour budgets. I cannot, however, imagine BioShock: Infinite without Elizabeth. A smaller team might help them concentrate their effort where it matters and keep the artistic vision from becoming too diluted.

As for BioShock? The second part of the Burial at Sea DLC is said to wrap up the entire franchise. 2K will retain the license if they want to release sequels or spin-offs. I doubt Ken Levine will have anything more to do with it, however.

Oh PCMag, Console vs PC

Subject: Editorial, General Tech, Systems | February 12, 2014 - 10:45 PM |
Tagged: xbox, xbone, ps4, Playstation, pc gaming

PCMag, your source for Apple and gaming console coverage (I joke), wrote up an editorial about purchasing a gaming console. Honestly, they should have titled it, "How to Buy a Game Device" since they also cover the NVIDIA SHIELD and other options.

The entire Console vs PC debate bothers me, though. Neither side handles it well.

PS4-01.png

I will start by highlighting problems with the PC side, before you stop reading. Everyone says you can assemble your own gaming PC to save a little money. Yes, that is true, and it is unique to the platform. The problem is that the public perception then becomes, "You must assemble and maintain your own gaming PC".

No.

No. No. No.

Some people prefer the support system provided by the gaming consoles. If one bricks, which some of them do a lot, you can call up the manufacturer for a replacement in a few weeks. The same can be absolutely true for a gaming PC: there is nothing wrong with purchasing a computer from a system builder, ranging from Dell to Puget Systems.

The point of the gaming PC is that you do not need to do it yourself. You can also deal with a small business. For Canadians, if you purchase all of your hardware through NCIX, you can add $50 to your order and they will ship your parts as a fully assembled PC, with Windows installed (if purchased). You also get a one-year warranty. The downside is that you lose the ability to pick and choose components from other retailers, and you cannot reuse your old parts. Unfortunately, I do not believe NCIX USA offers this. Some local stores may offer similar benefits, though; one around my area assembles systems for free.

The benefit of the PC is, always, choice. You can assemble it yourself (or with a friend). You can have a console-like experience with a large system builder. You can also have something in between with a small business. It is your choice.

Most importantly, your choice of manufacturer does not restrict your choice in content.

5-depressing.png

As for the consoles, I cannot find a rock-solid argument for anything that will always be better on them. If you are thinking about purchasing one, the available content should sway your decision. Microsoft will be the place to get "Halo". Sony will be the place to get "The Last of Us". Nintendo will be the place to get "Mario". Your money should go where the content you want is. That, and wherever your friends play.

But, of course, then you are what made the content exclusive.

Note: Obviously the PC has issues with proprietary platforms, too. Unlike the consoles, though, it could be a temporary issue. The PC business model does not depend upon Windows. If Windows remains a sufficient platform? Great. If not, we have multiple options, ranging from Linux/SteamOS to web standards, for someone to develop a timeless classic on.

Source: PCMag

(Phoronix) Intel Haswell iGPU Linux Performance in a Slump?

Subject: Editorial, General Tech, Graphics Cards | January 22, 2014 - 02:12 AM |
Tagged: linux, intel hd graphics, haswell

Looking through this post by Phoronix, it would seem that Intel had a significant regression in performance on Ubuntu 14.04 with the Linux 3.13 kernel. In some tests, the HD 4600 only achieves about half of the performance recorded on the HD 4000. I have not been following Linux iGPU drivers, and it is probably a bit late to do any form of in-depth analysis... but yolo. I think the article actually made a pretty big mistake and came to exactly the wrong conclusion.

Let's do this!

7-TuxGpu.png

According to the article, in Xonotic v0.7 at 1080p with low quality settings, Ivy Bridge's Intel HD 4000 scores 176.23 FPS. Compared to Haswell's HD 4600 and its 124.45 FPS result, this seems bad. However, even though they claim this as a performance regression, they never actually post the earlier (and supposedly faster) benchmarks.

So I dug one up.

Back in October, the same test was performed with the same hardware. The Intel HD 4600 was not significantly faster back then; it was actually a touch slower, with a score of 123.84 FPS, while the Intel HD 4000 managed 102.68 FPS. In other words, Haswell did not regress between that time and Ubuntu 14.04 on Linux 3.13; rather, Ivy Bridge received a 71.63% increase over the same period.
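
The percentage is easy to double-check from the numbers above (Python, figures as quoted):

    # Xonotic v0.7, 1080p, low quality -- figures quoted above
    hd4000_oct, hd4000_now = 102.68, 176.23   # Ivy Bridge: October vs. Ubuntu 14.04
    hd4600_oct, hd4600_now = 123.84, 124.45   # Haswell: October vs. Ubuntu 14.04

    print(f"HD 4000 change: {hd4000_now / hd4000_oct - 1:+.2%}")  # +71.63%
    print(f"HD 4600 change: {hd4600_now / hd4600_oct - 1:+.2%}")  # about +0.5%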

Of course, there could have been a performance increase between October and now that recently regressed for Haswell... but I could not find those benchmarks. All I can see is that Haswell has been quite steady since October. Either way, that is a significant performance increase for Ivy Bridge since that snapshot in time, even if Haswell had a rise and fall that I was unaware of.

Source: Phoronix

Valve Virtual Reality Project at Steam Dev Days

Subject: Editorial, General Tech | January 20, 2014 - 11:35 PM |
Tagged: valve, virtual reality

Steam Dev Days was last week. At it, Valve announced a redesign of their Steam Controller and the removal of Steam Greenlight, among other notables. This was a press-free event, officially. Of course, due to Twitter and other social media platforms, everyone can decide to be a journalist on a whim. Things are going to leak out.

Other things are going to be officially released, too.

Valve-VR-heroish.jpg

Michael Abrash gave a talk at the event discussing his virtual reality initiative within Valve. Both it and the Steam Machine project were in question when the company let go of Jeri Ellsworth and several other employees. After SteamOS was announced and castAR, Jeri's project at Valve, had its Kickstarter, it was assumed that Valve had given up on augmented reality. Despite this, they still kept Michael Abrash on staff.

I would speculate, completely from an outside position, that two virtual reality groups existed at one point (at least to some extent). The project seems to have been sliced into two parts, one leaving with Jeri and one continuing with Michael. I seriously doubt this had anything to do with the "High School Cliques" that Jeri was referring to, however. She said it was "longtime staff" (Michael was newly hired around the end of Portal 2's development) and not within her hardware team.

valve-vr-specs.jpg

These are the specs that Valve has developed prototypes to.

1K x 1K per eye is about 100x less than they would like, however.

Ooo... 100 megapixels per eye.

I just believe it all shook out to an unfortunate fork in the project.

Politics aside, Michael Abrash sees virtual reality affecting "the entire entertainment industry", and it will be well supported by Steam. I hope this means that Valve will finally pull the trigger on music and movie distribution; I have been expecting that ever since the Steam infrastructure was upgraded back in July 2011. Of course, neither servers nor software will solve content availability, but I am still expecting them to take a shot at it. Remember that Valve is creating movies; could they have plans for virtual reality content?

valve-vr-room.jpg

The latest prototype of the Oculus Rift uses camera tracking for low-latency visibility.

This looks like Valve's solution.

The PDF slide deck is publicly available and each page includes the script he heavily followed. Basically, reading this is like being there, just less fun.

Source: Valve

Corsair Quantifies the Benefits of Overclocking

Subject: Editorial, General Tech, Graphics Cards, Processors, Memory, Systems | January 20, 2014 - 02:40 AM |
Tagged: corsair, overclocking

I rarely overclock anything, and that is for three main reasons. The first is that I have had an unreasonably bad time with computer parts failing on their own; I did not want to tempt fate. The second is that I focused on optimizing the operating system and its running services, which was mostly important during the Windows 98, Windows XP, and Windows Vista eras. The third is that I did not find overclocking valuable enough for the performance you gained.

A game that is too hefty to run is probably not an overclock away from working.

intelupgrade.jpg

Thankfully this never took off...

Today, overclocking is easier and safer than ever with parts that basically do it automatically and back off, on their own, if thermals are too aggressive. Several components are also much less locked down than they have been. (Has anyone, to this day, hacked the locked Barton cores?) It should not be too hard to find a SKU which encourages the enthusiast to tweak some knobs.

But how much of an increase will you see? Corsair has been blogging about using their components (along with an Intel processor, Gigabyte motherboard, and EVGA graphics card, because they obviously do not make those) to overclock a system. The cool part is that they break down the performance gains from raising the frequencies of just the CPU, just the GPU, just the RAM, or all of the above together. This breakdown shows how each of the three categories contributes to the whole. While none of the overclocks are dramatic, Corsair is probably proud of the 5% jump in Cinebench OpenGL performance gained just by overclocking the RAM from 1600 MHz to 1866 MHz, without touching the CPU or GPU.
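
As a quick, back-of-the-envelope sanity check on that memory result (purely illustrative Python, using only the figures above): 1600 MHz to 1866 MHz is roughly a 17% bump in memory frequency, so a 5% gain in the OpenGL test means the benchmark realized about a third of the frequency uplift without any CPU or GPU changes.

    # Rough scaling check for the memory-only overclock mentioned above.
    freq_gain = 1866 / 1600 - 1   # ~16.6% higher memory frequency
    perf_gain = 0.05              # ~5% higher Cinebench OpenGL score

    print(f"Frequency uplift: {freq_gain:.1%}")
    print(f"Performance gained per unit of frequency gained: {perf_gain / freq_gain:.2f}")
    # About 0.3 -- the benchmark realizes roughly a third of the clock increase.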

It is definitely worth a look.

Source: Corsair

Live and Let Dvorak

Subject: Editorial, General Tech | January 19, 2014 - 11:46 PM |
Tagged: Keyboards, keyboard

Peter Bright down at Ars Technica wrote an editorial about the Lenovo ThinkPad X1 Carbon. His opinion is that keyboard developers should innovate only in ways that do not "undermine expectations". Replacing a row of physical keys with a software-controlled touch strip is destructive because, even if the change proved valuable on its own, it would ultimately be inferior by clashing with every other keyboard the user encounters. He then concludes with a statement that really should have directed his thesis:

Lenovo's engineers may be well-meaning in their attempts to improve the keyboard. But they've lost a sale as a result. The quest for the perfect laptop continues.

23-keyboard.jpg

That is the entire point of innovation! You may dislike how a feature interacts with your personal ecosystem, and that will drive you away from the product. Users who purchased the laptop without considering the keyboard have the option of returning it and writing reviews for others (or simply putting up with it). Users who purchased the laptop because of the keyboard are happy.

I mainly disagree with the article because it claims that it is impossible to innovate the keyboard in any way that affects the core layout. I actually disagree with it for two reasons.

My first issue is how vague he is. His primary example of good keyboard innovation is the IBM ThinkPad 701c and its "butterfly" keyboard, which expands to be wider than the laptop itself in order to be more conventional. Conventional for whom? How many people primarily use small laptops with shrunken keyboards, compared to people who touch-type function keys?

The second critique follows from the first. The PC industry became so effective because every manufacturer tries to be a little different with certain SKUs to gain tiny advantages. There could have easily been a rule against touchscreen computers; eventually someone hit it out of the park and found an implementation that was wildly successful with a gigantic market. The QWERTY design has weathered the storm for more than a century, but there is no rule that it cannot shift in the future.

In fact, at some point, someone decided to add an extra row of function keys. This certainly could undermine the expectations of users who have to go between computers and electronic typewriters.

It will be tough, though. Keyboards have settled down and learning their layouts is a significant mental investment. There are several factors to consider when it comes to how successful a keyboard modification will become. Mostly, however, it will come down to someone trying and observing what happens. Do not worry about letting random ideas in because the bad ideas will show themselves out.

Basically the point is: never say never (especially not that vaguely).

Source: Ars Technica

SimCity Modding is Slightly Hypocritical

Subject: Editorial, General Tech | January 11, 2014 - 12:13 AM |
Tagged: SimCity, ea

Maxis and Electronic Arts recognize that a hefty portion of SimCity's popularity as a franchise is due to its mod community, and the current version could use all the help it can get after its unfortunate first year. They have finally let the community take over... to some extent. EA is imposing certain rules upon content creators. Most of them are reasonable, one of them could have unforeseen consequences for the LGBTQ community, and the first rule really should apply to EA's own expansion packs.

Starting at the end, the last three rules (#3 through #5) are mostly reasonable. They protect EA against potential malware and breaches of their EULA and Terms of Service. The fifth rule does begin to dip its toe into potential censorship but it does not really concern me.

harvest-moon-small.jpg

No-one can be "Best Friends" in North America.

The second rule, while mostly condemning illegal activity, also includes the requirement that content remain within ESRB E10+ and PEGI 7 ratings. The problem with any content certification is that it limits the dialog between artists and society. In this case, references to same-sex relationships in games (see: Harvest Moon) may force a minimum T or M rating. A mod which introduces some story element where two Sims of the same gender go on a date or live together (again, like Harvest Moon) might result in interest groups rattling the ESRB's cage until EA draws a firm line on that specific topic.

EA is very good with the LGBTQ community, but this could get unnecessarily messy.

The first rule is a different story. It says that mods which affect the simulation for multiplayer games or features are not allowed (despite multiplayer being the only official mode). They do not want a modification to give players an unfair advantage over the rest of the game's community.

You know, like maybe an airship which boosts "your struggling industry or commercial [districts]" and also brings in tourists and commuters without causing traffic on your connecting highway?

Maxis is still, apparently, exploring options for offline SimCity experiences. Even if they only allowed a server preference that does not affect the global economy, mods could be quarantined to those areas. Great, problem solved. Instead, what is allowed is somewhat left up to interpretation. To make matters worse, the current examples of approved mods are purely cosmetic.

SimCity is nowhere near as bad as Halo 2 Vista in terms of mod functionality (those mod tools were so hobbled that their own tutorial was impossible to complete). It could actually be good. These are just areas for EA to consider and, hopefully, reconsider.

Source: Maxis

CES 2014: Valve And 13 Launch Partners Unveil Slew of Steam Machines

Subject: Editorial, General Tech | January 7, 2014 - 02:25 AM |
Tagged: valve, SteamOS, steambox, opinion, Gabe Newell, CES 2014, CES

Valve co-founder Gabe Newell took the stage at a press conference in Las Vegas last night to introduce SteamOS-powered Steam Machines and the company's hardware partners for the initial 2014 launch. And it has been quite the launch thus far, with 13 companies announcing at least one Steambox PC.

The majority of Steam Machines are living-room-friendly Mini-ITX (or smaller) form factors, but that has not stopped some vendors from going all out with full-tower builds. The 13 hardware partners have all put their own spin on a SteamOS-powered PC, and by the second half of 2014, users will be able to choose from options ranging from $500 SFF cubes and ~$1000 Mini-ITX builds with dedicated graphics to powerhouse desktop PCs with multiple GPUs and MSRPs up to $6,000. In fact, aside from SteamOS and support for the Steam Controller, the systems do not share much, offering up unique options -- which is a great thing.

For the curious, the 13 Steam Machine hardware vendors are listed below.

  1. Alienware
  2. Alternate
  3. CyberPowerPC
  4. Digital Storm
  5. Falcon Northwest
  6. Gigabyte
  7. iBuyPower
  8. Materiel.net
  9. Next
  10. Origin PC
  11. Scan Computers
  12. Webhallen
  13. Zotac

As luck would have it for those eager to compare all of the available options, the crew over at Ars Technica has put together a handy table of the currently-known specifications and pricing of each company's Steam Machines! Some interesting takeaways from the chart include the almost even division between AMD and NVIDIA dedicated graphics, while Intel has a single graphics win with its Iris Pro 5200 (in the Gigabyte BRIX Pro). On the CPU side of things, Intel has the most design wins, with AMD taking as many as 3 versus Intel's 10 (in the best-case scenario). The pricing is also interesting. While there are outliers at both the very expensive and the very affordable ends, the majority of Steam Machines sit closer to the $1,000 mark than to either the $500 or $2,000+ price points -- in other words, about the same amount of money as a mid-range DIY PC. This is not necessarily a bad thing, as users are getting decent hardware for their money, a free OS, and OEM warranty/support (and there is nothing stopping DIYers from building their own Steamboxes).

An SFF Steambox (left) from Zotac and a full-tower SteamOS gaming desktop from Falcon Northwest (right).

So far, I have to say that I'm more impressed than not with the Steam Machine launch, which has gone off better than I had expected. Here's hoping the hardware vendors are able to come through at the announced price points and that Valve is able to continue wrangling developer support (and to improve the planned game streaming functionality from a Windows box). If so, I think Valve and its partners will have a hit on their hands, one that helps bring PC gaming into the living room and (hopefully) puts it at least on par, in the mainstream perspective, with the ever-popular game consoles (which are now using x86 PC architectures themselves).

What do you think about the upcoming crop of Steam Machines? Does SteamOS have a future? Let us know your thoughts and predictions in the comments below!

Coverage of CES 2014 is brought to you by AMD!


Follow all of our coverage of the show at http://pcper.com/ces!

Source: Ars Technica