Gabe Newell gets Steamed up over piracy discussions

Subject: Editorial, General Tech | February 20, 2012 - 08:08 PM |
Tagged: valve, piracy, Gabe Newell

Ben Kuchera of Penny Arcade caught an interview with Valve Software’s managing director and co-founder, Gabe Newell. The topics were quite typical for a Gabe Newell interview: working at Valve, the future of gaming, and DRM. Gabe has also joined the beard club; welcome, Gabe!


Photo Credit: Giant Bomb

A little over halfway through the interview, Penny Arcade asked Gabe whether Valve believes it sidestepped the problems of used games and piracy with Steam. Gabe responded to the premise of the question rather than to the question itself:

You know, I get fairly frustrated when I hear how the issue is framed in a lot of cases. To us it seems pretty obvious that people always want to treat it as a pricing issue, that people are doing this because they can get it for free and so we just need to create these draconian DRM systems or ani-piracy(sic) systems, and that just really doesn’t match up with the data.

This quote echoes a problem I have had with the piracy discussion for quite some time. People wish to frame piracy in a context that seems intuitive to them rather than experimenting to discover what actually occurs. Piracy is seen as a problem which must be controlled. That logic is fundamentally flawed, because piracy is not itself a problem but rather a measurement of potential problems.

Gabe continues with an anecdote about a discussion between himself and a company that used third-party DRM for its title:

Recently I was in a meeting and there’s a company that had a third party DRM solution and we showed them look, this is what happens, at this point in your life cycle your DRM got hacked, right? Now let’s look at the data, did your sales change at all? No, your sales didn’t change one bit. Right? So here’s before and after, here’s where you have DRM that annoys your customers and causing huge numbers of support calls and in theory you would think that you would see a huge drop off in sales after that got hacked, and instead there was absolutely no difference in sales before or after. You know, and then we tell them you actually probably lost a whole bunch of sales as near as we can tell, here’s how much money you lost by bundling that with your product.

Gabe highlights what a business should actually be concerned with: increasing its measurement of revenue and profits, not decreasing its measurement of piracy. A company could simply stop developing products and kill piracy completely, but that would also kill its revenue, as it would have nothing left to sell.

Before we begin to discuss piracy, the very first step is to frame it as what it really is: a measurement. While violating the terms of a license agreement is in fact wrong, a business that focuses on what is right or wrong rather than on what drives its revenue will go broke.

If you believe that there is value in preventing non-paying users from using your product, then you will only hurt yourself (and, if SOPA/PIPA taught us anything, innocent adjacent companies). It is possible that the factors which contribute to piracy also contribute to your revenue, positively as well as negatively. It is also entirely possible that increased piracy is a measurement of a much bigger problem: your business practices.

You know, it’s a really bad idea to start off on the assumption that your customers are on the other side of some sort of battle with you. I really don’t think that is either accurate or a really good business strategy ((…)) we’ve run all of these experiments, you know, this has been going on for many years now and we all can look at what the outcomes are and there really isn’t – there are lots of compelling instances where making customers – you know, giving customers a great experience and thinking of ways to create value for them is way more important than making it incredibly hard for the customers to move their products from one machine to another.

Source: Penny Arcade

Upgrade your PS3 with an SSD... or just play on the PC.

Subject: General Tech, Storage | February 20, 2012 - 05:53 PM |
Tagged: ssd, PS3

There is an interesting article down at Eurogamer which covers the possible benefits of upgrading a PS3 with a solid state drive. Those who know me can guess that I am snickering while crossing another perceived advantage off of my console-versus-PC list. Still, if you want to play a disposable platform's exclusives (titles that are only exclusive because you let them be) and you desire to upgrade your experience, the article is worth checking out.


Isn’t “not needing to do this” the whole reason for having a console?

Console titles are naturally becoming as hard drive-intensive as they are allowed to be due to their abysmally small quantity of RAM. Developers have been using tricks to increase the usefulness of their available RAM, such as disallowing split screen, streaming content as needed, and rendering at low resolutions.

The first Halo, for instance, was famous for its quick load times. That speed was due in part to game assets being copied to multiple locations on the disc, letting the game load whichever copy required the least seek time to access. Having a hard drive helped Halo, too.
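
To illustrate the trick, here is a minimal sketch (not Bungie's actual code; the offsets and function name are made up):

```python
# A sketch of seek-aware asset loading: given the current read-head position
# and the disc offsets of several duplicated copies of an asset, load the
# copy that minimizes seek distance.

def pick_nearest_copy(head_position: int, copy_offsets: list[int]) -> int:
    """Return the offset of the duplicated asset copy with the shortest seek."""
    return min(copy_offsets, key=lambda offset: abs(offset - head_position))

# The same texture mastered at three places on the disc:
copies = [120_000, 480_000, 910_000]       # made-up sector offsets
print(pick_nearest_copy(500_000, copies))  # -> 480000, the shortest seek
```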

The article itself focuses mostly on RAGE and Skyrim due to their harsh issues with lag and pop-in. Skyrim famously had performance that grew progressively worse the longer you played. That issue was mostly corrected in version 2.03, as Eurogamer’s article also demonstrates, making an SSD almost unnecessary; prior to 2.03, however, an SSD surprisingly helped substantially with the problem. It should also be no surprise that throwing faster storage at RAGE helped immensely, just as it does on the PC.

If you were considering upgrading to a faster drive for your Sony console, be sure to check out Eurogamer -- or check out the new Hardware Leaderboard and just play on the PC.

Source: Eurogamer

Don't assume the price dictates the audio quality; try real studio quality headsets from Audio-Technica

Subject: General Tech | February 20, 2012 - 03:02 PM |
Tagged: audiophile, headset, audio, audio-technica, ATH-A900

At an MSRP of $250, the Audio-Technica ATH-A900 headphones are not intended for the casual gamer and, as you can tell by the 1/4" connector, they are designed for someone who owns a high end headphone amp.  On the other hand, if you need studio quality audio and will be wearing the headphones for hours at a time, then the high end features built into them are worth the investment.  The 53mm drivers sit in enclosed earcups, which helps bring the bass up close and personal, and they are built from much sturdier materials than other popular headsets.  To contrast the difference, [H]ard|OCP tried Beats by Dre Studio headphones, which cost more than the ATH-A900s, and in every case they felt the ATH-A900s were vastly superior.  As far as [H] is concerned, the two headsets aren't even in the same class.


"Audio-Technica's open headphones are known to gamers for the wearing comfort and huge soundstage that these provide, but the open back models simply lack bass and isolation. Today, we will see if a pricey pair of the company's closed back audiophile headphones can offer the compromise many of you are looking for in PC audio."



Source: [H]ard|OCP

You can't always write a chord ugly enough to say what you want to say ...

Subject: General Tech | February 20, 2012 - 02:15 PM |
Tagged: DIY, model m, input, frank zappa, blue alps sliders

Sometimes hacks and mods are done to save you time or money, or possibly both, but other times you find yourself stuck in the position of Frank Zappa: you cannot find a giraffe filled with whipped cream, so you have to make it yourself.  Such is the case with the completely custom keyboard described at Hack a Day, in which every part was either custom ordered or made by the designer themselves.  None of the keys seem to be in their accustomed places and your thumbs will get a workout from all of those keys mounted in the centre of the board, but for a programmer this could be the perfect design.  It has taken over a year to build and likely cost more than a mass-produced keyboard, but if you want something done right ...


"[dmw] posted a pseudo-build log over at the geekhack keyboard forums. Every single part of this keyboard is custom-made. The key caps were made by Signature Plastics, the case was made by Shapeways, and the custom PCB for the key switches came directly from Express PCB. The key switches are blue Alps sliders (one of the best key switches available) with a few white Alps switches taken from an old Apple keyboard."



Source: Hack a Day

Intel Inside. Thecus' Next-Gen NAS introduced: N4800, N2800

Subject: General Tech, Systems, Storage | February 20, 2012 - 01:53 PM |
Tagged: Thecus, NAS

Home users are starting to look at Network Attached Storage (NAS) devices to serve their home media needs. Also popular are products which allow you to browse the internet and play media on your TV. Thecus has just announced two NAS devices which fit both roles and many others. The N2800 contains a built-in media card reader, while the N4800 adds a built-in mini Uninterruptible Power Supply (UPS), an OLED status screen, and a second USB 3.0 port.


I hear they're a NASty bunch...

The obvious selling features of the two devices are the inclusion of HDMI output, which enables the roles above, as well as an updated 3rd generation Intel Atom CPU, the D2700. The D2700 is a 2.13GHz dual-core, Hyper-Threaded Intel Atom processor manufactured at 32nm.

Check out the highlights of their press release below.

02/20/2012- As part of the Intel Embedded Alliance, Thecus has precedence and access to a multitude of Intel prototypes and the latest technologies. Working on those products for months now, Thecus is delighted to finally release its Vision Series.

The new N2800 and N4800 are going to be some of the first Intel® Atom™ D2700 based NAS! They will set the standard for what's best in the market to help you build a true multimedia center: USB 3.0, Dual Gigabit Ports, SD Card reader (N2800), Mini-UPS (N4800), etc.

And the most important feature is the HDMI output. With Thecus Local Display module, it's now possible to connect the NAS directly to a monitor and control it through USB mouse/keyboard. Playing HD movies, browsing the web, controlling the NAS... everything is now possible directly from your TV! Thanks to this feature, Thecus is now creating a new standard among the NAS industry.

About Thecus®

Thecus® Technology Corp. specializes in IP Storage Server and Network Video Recorder solutions. The company was established in 2004 with the mission to make technology that is as transparent as it is easy-to-use and products that are not only the best on the market, but are accessible to experts and novices alike. Combining a world-class R&D team highly experienced in storage hardware and software development with a keen customer focus, Thecus® stays close to the market to develop high-quality products to fulfill the storage and surveillance needs of today's world.

Source: Thecus

Of Near Threshold Voltage and Atomic Transistors

Subject: General Tech | February 20, 2012 - 01:53 PM |
Tagged: NTV, near threshold voltage, transistor, pentium, qubit

The eyes of the world are on the 22nm Ivy Bridge chip that Intel has slowly been giving us details of, but there is also something interesting happening at 32nm with the world's most repurposed Intel CPU.  Once again the old Pentium core has been dusted off and modified to showcase new Intel technology, in this case Near Threshold Voltage (NTV) operation.  The threshold here refers to the amount of voltage needed to flip a bit on a processor, what you would be used to seeing as VCC, and it is the reason those dang chips get so toasty.  Much in the way that SpeedStep and other energy-saving technologies reduce the operating frequency of an underloaded processor, Intel has tied the voltage to the frequency and lowers the power requirements along with the chip's speed.

The demonstration model shown to The Register varied from a high end of 1.2 volts at 915MHz down to a mere 280 millivolts at 3MHz, and as low as 2 millivolts in sleep.  By scaling the power consumption, Intel may have found a nice middle ground between performance and TDP to keep ARM from making major inroads into the server room, if they can pull it off with more modern processors.  They also showed off a solar-powered CPU, which might be handy on a cell phone but seems of limited commercial value in the short term.
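
For a rough sense of why chasing the threshold voltage pays off: dynamic CMOS power scales with the square of the voltage times the frequency, roughly P = C·V²·f. A back-of-envelope sketch using the figures above, ignoring leakage (which, in fairness, dominates near threshold):

```python
# Back-of-envelope only: dynamic CMOS power is roughly P = C * V^2 * f.
# The switched capacitance C is factored out, and leakage is ignored.

def relative_dynamic_power(volts: float, freq_hz: float) -> float:
    return volts ** 2 * freq_hz  # arbitrary units

full_speed     = relative_dynamic_power(1.20, 915e6)  # 1.2 V at 915 MHz
near_threshold = relative_dynamic_power(0.28, 3e6)    # 280 mV at 3 MHz

print(f"~{full_speed / near_threshold:,.0f}x less dynamic power at NTV")
# Frequency alone drops ~305x; the V^2 term contributes another ~18x.
```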

Keeping with the theme of small, The Register also has news of research which has created a working transistor out of a single phosphorus atom (an atomic radius of 0.110nm, for those who like some scale with their transistors).  The trick was the temperature: seeing as temperature is a measure of energy expressed as movement (to put it ridiculously simply), you need low temperatures to keep the atoms from moving more than 10nm.  At -196°C the atom was stable enough for its position to be accurately predicted, which is absolutely necessary if you plan to use the atom as a qubit.  Overclocking is going to be difficult.


"The threshold voltage is the point at which transistors turn on and conduct electricity. If you can flip bits near this threshold, instead of using much larger swing that is typically many times this threshold voltage to turn zeros into ones and vice versa, then you can save a lot of power."



Source: The Register

Microsoft Allegedly Overhauling SkyDrive With Increased Paid Storage, Applications, and Other Goodies

Subject: General Tech | February 20, 2012 - 11:12 AM |
Tagged: storage, skydrive, paid storage, free, cloud backup, bitlocker, app integration

Update: Some of the rumors have been confirmed by Microsoft in a blog post, though the individual file size increase was a bit off: Microsoft will be allowing files up to 2 GB in size, compared to the rumored 300 MB limit.

Every so often, I run across a rumor that sounds almost too good to be true. On the other hand, it sounds so good that I just can't stop myself from being excited about it. Over the weekend, I saw an article that talked about Windows Live SkyDrive offering paid storage tiers, and I now really want this to come to fruition.

For those curious, SkyDrive is Microsoft's "cloud storage" service that gives users 25 GB of free storage space to hold files. There are some restrictions on individual file size (which can be worked around if you really want to back up a home movie, for example), but otherwise it is a boatload of space for free, and it saved my butt during the, um, "formatting catastrophe" of 2010 by having most of my digital photos backed up!


SkyDrive as it is now, funny old photos and all!

The service is connected to your Microsoft Live or Hotmail account and can be accessed through the SkyDrive website. There are some usability issues with the service, however, including the fact that it is a pain in the rear to upload more than one or two files. The website doesn't make it easy to batch upload, say, a folder or two; you can only add files one at a time. Further, it is not nearly as easy to manage those files once they are in SkyDrive as it should be. If you use IE, the SkyDrive website will allow you to upload multiple files more easily; the other browsers, however, are left without a way to do it. There is also the aforementioned individual file size limit of 100 MB per file.
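
For a sense of what a batch-upload tool has to do on your behalf, here is a rough sketch; the endpoint URL and auth header are placeholders, not SkyDrive's actual API:

```python
# Walk a local folder and upload every file in it, one request each --
# exactly the drudgery the website makes you do by hand.
import os
import requests

UPLOAD_URL = "https://example.com/skydrive/upload"  # hypothetical endpoint
AUTH = {"Authorization": "Bearer <token>"}          # hypothetical auth header

def upload_folder(folder: str) -> None:
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            with open(path, "rb") as f:
                requests.post(UPLOAD_URL, headers=AUTH,
                              files={"file": (name, f)})
            print(f"uploaded {path}")

upload_folder("Pictures")
```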

The exciting bit about the rumors and (allegedly) leaked screenshots is that, if they hold true, the service is about to get a whole lot better by offering cheap storage and fixing many of the issues people have had with it.


The leaked image

On the storage front, Microsoft is allegedly adding new paid storage tiers and increasing the individual file size limit to 300 MB (from 100 MB). Among the new plans are 20 GB, 50 GB, and 100 GB offerings (in addition to the free 25 GB of space) for $10, $25, and $50 a year respectively. Not a bad price at all in my opinion! Assuming the pricing is accurate, they are vastly undercutting the competition. Dropbox, for example, is currently offering 50 GB for $99 a year and 100 GB for $199 per year. Granted, Dropbox has syncing functionality, no individual file size limit, and is a much easier to use service with an established user base, but at these prices the Microsoft offering is likely to win over many people who just want some cheap off-site backup space!


Paid Storage Space    SkyDrive (Price Per Year)    Dropbox (Price Per Year)
20 GB                 $10                          n/a
50 GB                 $25                          $99
100 GB                $50                          $199

Dropbox pricing is included just for comparison.
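
Put in per-gigabyte terms, the gap is stark. A quick calculation from the table above:

```python
# Dollars per gigabyte per year, straight from the table above.
plans = {
    "SkyDrive 20 GB":  (20, 10),
    "SkyDrive 50 GB":  (50, 25),
    "SkyDrive 100 GB": (100, 50),
    "Dropbox 50 GB":   (50, 99),
    "Dropbox 100 GB":  (100, 199),
}

for name, (gigabytes, dollars) in plans.items():
    print(f"{name}: ${dollars / gigabytes:.2f}/GB/year")
# Every rumored SkyDrive tier works out to $0.50/GB/year;
# Dropbox comes to roughly $2/GB/year.
```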

While there are currently mobile applications for Windows Phone and Apple iOS smartphones, users must turn to third-party Explorer extensions (like SDExplorer) for desktop integration with Windows. More leaked images suggest that Microsoft will be launching applications for Windows and Mac operating systems to better integrate SkyDrive into the OS (and hopefully enable easier cloud file management). SDExplorer is the third-party extension I used to upload all my photos to SkyDrive; it allows mounting your SkyDrive account as a "hard drive" under Windows Explorer. Unfortunately, it costs money to get the full feature set, so hopefully Microsoft can provide similar (or more) features for free with their own client.


In addition, Microsoft will allegedly be adding URL shortening for public and shared SkyDrive file links, as well as the ability to share files to Twitter and Facebook from within the SkyDrive website. For the latter, there are already APIs, and Microsoft is likely just leveraging them to make sharing files a bit more convenient. For the former, Microsoft will be using their own URL shortening domain instead of integrating with an existing service.

As a user of LibreOffice (the fork of what was once OpenOffice), I deal a lot with .odt files, which follow the OpenDocument standard. Users of Microsoft's Office Web Apps have so far been forced to save files in Microsoft's own formats; however, rumors suggest that the service will soon support creating and saving .odt, .odp, and .ods documents. If you are using Office Web Apps, you are likely already fairly invested in the Office universe, and this feature won't mean much. On the other hand, it will help those who need to edit a LibreOffice-created document backed up to their SkyDrive while on the go. Better compatibility is always a step in the right direction for MS, after all.

Last up on the rumor pile for SkyDrive is the ability to store BitLocker recovery keys directly to SkyDrive so that you have a backup should you ever forget your encryption password. The flip side of that convenience feature is that it provides another attack vector should someone attempt to get their hands on your encryption keys, and it is a location that you must depend on someone else to keep secure. As weird as it may sound, you might want to encrypt your encryption key before uploading it to any "cloud" service (heh), just in case. Still, it's good to have options.
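
As an illustration of that "encrypt the key before uploading" idea, here is a minimal sketch using the third-party Python 'cryptography' package (pip install cryptography); the file names are made up, and this is a sketch of the concept rather than a BitLocker-specific tool:

```python
from cryptography.fernet import Fernet

wrapping_key = Fernet.generate_key()  # store THIS offline, not in the cloud
fernet = Fernet(wrapping_key)

# Read the exported recovery key and write an encrypted copy for upload.
with open("bitlocker_recovery_key.txt", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("bitlocker_recovery_key.enc", "wb") as f:
    f.write(ciphertext)

# Upload only the .enc file; without the offline wrapping key, a leaked
# copy of the ciphertext is useless to an attacker.
```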

More photos and information on the alleged leaks can be found here and here.

Needless to say, there was quite the leak this weekend over Microsoft SkyDrive features!  It is a lot to take in, but in my opinion it sounds like they are really giving the service the special attention it needs to get it into fighting form.  If the rumors hold true, it will be much more competitive with other cloud storage and backup options as a result of the overhaul.  I'm excited about this, obviously, but what about you?  Do you use SkyDrive?


Wi-Fi on Rosepoint SoC die. Intel flexes before ARM wrestle.

Subject: General Tech, Processors, Systems, Mobile, Shows and Expos | February 20, 2012 - 01:50 AM |
Tagged: Rosepoint, ISSCC 2012, ISSCC, Intel

If there is one thing that Intel is good at, it is writing a really big check to go in a new direction right when absolutely needed. Intel has released press information on what should be expected from their presence at the International Solid-State Circuits Conference (ISSCC), which runs until the 23rd. The headliner for Intel at this event is their Rosepoint System on a Chip (SoC), which looks to lower power consumption by rethinking the RF transceiver and including it on the die itself. While the research has been underway for over a decade at this point, pressure from ARM has pushed Intel to, once again, throw money at R&D until their problems go away.

Intel could have easily trolled us all and named this SoC "Centrino".

Almost ten years ago, AMD had Intel in a very difficult position. Intel fought to keep clock rates high until AMD changed their numbering scheme to give proper credit to their higher performance-per-clock components. Intel dominated, legally or otherwise, the lower-end market with their Celeron line of processors.

AMD responded with a series of well-timed attacks against Intel, jabbing Intel in the face and punching them in the gut by releasing the Sempron processor line while filing an antitrust suit against Intel to allow AMD to more easily sell its processors in mainstream PCs.

At around this time, Intel decided to pivot their product direction entirely and made plans to take the Netburst architecture behind the shed. AMD has yet to recover from the tidal wave that the Core architectures crashed upon them.


Intel wishes to stop assaulting your battery indicator.

With the surge of ARM processors that have been fundamentally designed for lower power consumption than Intel’s x86-based competition, things look bleak for Intel in the expanding mobile market. Leave it to Intel to, once again, simply cut a gigantic check.

Intel is in the process of cutting power wherever possible in their mobile offerings. To remain competitive with ARM, Intel is not above outside-the-box solutions, including integrating traditionally power-hungry components directly into the main processor. Similar to NVIDIA’s recent integration of touchscreen hardware into their Tegra 3 SoC, Intel will push the traditionally very power-hungry Wi-Fi transceiver into the SoC and supposedly eliminate all analog portions of the component in the process.

I am not too knowledgeable about Wi-Fi transceivers so I am not entirely sure how big of a jump Intel has made in their development, but it appears to be very significant. Intel is said to discuss this technology more closely during their talk on Tuesday morning titled, “A 20dBm 2.4GHz Digital Outphasing Transmitter for WLAN Application in 32nm CMOS.”

This paper is about a WiFi-compliant (802.11g/n) transmitter using Intel’s 32nm process and techniques leveraging Intel transistors to achieve record performance (power consumption per transmitted data better than state-of-the art). These techniques are expected to yield even better results when moved to Intel’s 22nm process and beyond.

What we do know is that the Rosepoint SoC will be manufactured at 32nm and is allegedly quite easy to scale down to smaller processes when necessary. Intel has also stated that while only Wi-Fi is currently supported, other frequencies including cellular bands could be developed in the future.

We will need to wait until later to see how this will affect the real world products, but either way -- this certainly is a testament to how much change a dollar can be broken into.

Source: Intel

Blender 2.62 released -- getting better all the time.

Subject: General Tech | February 18, 2012 - 11:11 PM |
Tagged: Blender

So what do you do if you blew all of your money on a professional workstation and have nothing left for software?

Well, you get a better business strategy.

Occasionally an open source product rivals or exceeds the usability of its packaged competitors. I first started learning 3D modeling on a perpetual educational license of Maya, and 2D art on a combination of Photoshop and GIMP. While I could not manage to eliminate Photoshop from my workflow, I found the switch from Maya to pre-release Blender + Bmesh builds felt like an upgrade, not just a manageable change. Blender is rapidly getting even better with each new bi-monthly version, such as the just-released 2.62 update.


Flower power?

(Photo Credit: Blender Project / Alexey Lugovoy)

Blender decided to introduce many new features throughout the 2.6 series of releases by developing them in parallel and merging branches into the release trunk as they became worthy. This release yielded a new renderer known as “Cycles”, new UV unwrapping tools, reprogrammed Boolean tools, and motion tracking features.

Personally, I am most looking forward to the official 2.63 release scheduled for April. It appears as if the secrecy surrounding the development status of Bmesh has been lifted, and its introduction into the main application is pinned to that release. Prior to the pre-release Bmesh builds, Blender just felt too distant from the style of modeling which I developed over my years of using Maya. Since the addition of Bmesh, Blender has been able to fill all of the essential roles which Maya satisfied, avoiding some of my long-standing gripes with Autodesk’s $3000 package in the process. And I was not even referring to its cost.

By the end of the 2.6 line, I expect that Blender will be an upgrade for users of many 3D applications. Check it out, for free, at their website.

Source: Blender

Intel urges you to program better now, not the same -- later.

Subject: General Tech, Processors, Mobile | February 18, 2012 - 09:06 PM |
Tagged: Intel, mobile, developer

Clay Breshears over at Intel posted about lazy software optimization on the Intel Software Blog. His post is a spiritual resurrection of Herb Sutter’s more than seven-year-old article, “The Free Lunch is Over: A Fundamental Turn Toward Concurrency in Software.” The content is very similar, but the problem is quite different.

The original 2004 article urged developers to heed the calls of the multi-core choo-choo express and not hang around on the single-core platform (train or computing) waiting for performance to get better. The current article takes that same mentality and applies it to power efficiency: rather than waiting for hardware with appropriate power efficiency for your application, learn techniques to bring your application within your desired power requirements.


"I believe your program is a little... processor heavy."

The meat of the article focuses on the development of mobile applications and the concerns that developers should have with battery conservation. Of course there is something to be said about Intel promoting mobile power efficiency. While developers could definitely increase the efficiency of their code, there is still a whole buffet of potential on the hardware side.
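
As a concrete, if simplified, example of the sort of technique being advocated: a loop that wakes the CPU on a fast timer prevents deep sleep states, while one that blocks until work arrives and drains it in bursts does not. A sketch, with invented timings:

```python
import queue
import time

def handle(event):
    pass  # placeholder for real work

def polling_loop(q: queue.Queue):
    """Wasteful: 100 wake-ups per second even when there is nothing to do."""
    while True:
        while not q.empty():
            handle(q.get())
        time.sleep(0.01)  # each wake-up keeps the CPU out of deep sleep

def event_driven_loop(q: queue.Queue):
    """Friendlier: block until work arrives, then drain the burst while awake."""
    while True:
        handle(q.get())  # blocks; the CPU can drop into a deep sleep state
        try:
            while True:
                handle(q.get_nowait())
        except queue.Empty:
            pass
```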

If you are a developer, particularly of mobile or laptop applications, Intel has an education portal for best power efficiency practices on their website. Be sure to check it out and pick up the tab once in a while, okay?

Source: Intel Blog