Steam Family Sharing Is Different From Xbox One

Subject: Editorial, General Tech | September 11, 2013 - 08:31 PM |
Tagged: xbox one, valve, steam

I know there will be comparisons between the recent Steam Family Sharing announcement and what Microsoft proposed, to a flock of airborne tomatoes I might add, for the Xbox One. Steam builds in some level of copy discouragement through accounts which tie a catalog of content to an individual. This account, a user name and password, tends to be more precious to the licensee than a physical disc or a nondescript blob of bits.

The point is not to prevent unauthorized copying, however; the point is to increase sales.

SteamRequestingAccess.jpg

Account information is used not just for authentication, but to add value to the service. If you log in to your account from a friend's computer, you have access to your content and can install it on their machine. Given a fast internet connection, this is slightly more convenient than carrying a DRM-free game on physical storage (unless, of course, paid licenses are revoked or something). Soon, friends will also be able to borrow your library when you are not using it, provided their devices are authorized by your account.
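
As a rough illustration of the lending rules as announced, here is a minimal sketch. The single-borrower, owner-always-wins policy reflects Valve's description of the feature; the class and method names are mine, not Valve's implementation:

```python
# Hypothetical sketch of the announced lending rules; names and
# structure are illustrative, not Valve's actual implementation.

class SharedLibrary:
    def __init__(self, owner):
        self.owner = owner
        self.authorized = set()   # accounts/devices the owner has approved
        self.active_user = None   # whoever is currently playing from it

    def authorize(self, friend):
        self.authorized.add(friend)

    def request_play(self, user):
        if user == self.owner:
            # The owner always wins: a borrower would be prompted to
            # quit (or buy the game) when the owner starts playing.
            self.active_user = self.owner
            return True
        if user in self.authorized and self.active_user in (None, user):
            self.active_user = user
            return True
        return False  # library is busy or user is not authorized

library = SharedLibrary(owner="alice")
library.authorize("bob")
print(library.request_play("bob"))    # True: library is idle
print(library.request_play("alice"))  # True: owner preempts the borrower
```

The owner preempting an active borrower is what distinguishes this from simply handing out your password.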

Microsoft has a similar authentication system through Xbox Live. The Xbox One also proposed a sharing feature, with the caveat that every device would need to check in over the internet, a transfer of just a few kilobytes, once every 24 hours.
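
In policy terms, the check-in amounted to something like this toy model (the 24-hour figure is from Microsoft's announcement; the code structure is purely illustrative):

```python
# Toy model of the proposed 24-hour check-in policy; the interval is
# from Microsoft's announcement, the logic is purely illustrative.
import time

CHECKIN_INTERVAL = 24 * 60 * 60   # seconds

def can_play(last_checkin_epoch, now=None):
    # Offline play is allowed only within 24 hours of the last check-in.
    now = time.time() if now is None else now
    return (now - last_checkin_epoch) < CHECKIN_INTERVAL

print(can_play(time.time() - 3600))    # checked in an hour ago -> True
print(can_play(time.time() - 90000))   # ~25 hours ago -> False
```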

The general public went mental.

The debate (debacle?) between online sharing and online restrictions saw fans of the idea point to the PC platform and how Steam has similar restrictions. Sure, Steam has an offline mode, but it is otherwise just as restrictive; Valve gets away with it, Microsoft should too!

It is true that Microsoft has a more difficult time with public relations than Valve does with Steam. However, like EA with their Origin troubles, Microsoft has shown itself to be less reliable than Valve over time. When a purchase is made on Steam, Valve has kept it available to the best of their abilities. Microsoft, on the other hand, bricked the multiplayer and online features of each and every original Xbox title. Microsoft did a terrible job explaining how the policy benefits customers, and that is the declared reason for the backlash, but had they earned their customers' trust over the years then this might have just blown over. Even so, I find Steam Family Sharing to be a completely different situation from what we just experienced in the console space.

So then, apart from banked good faith, what is the actual difference?

Steam is not the only place to get PC games!

Games can be purchased at retail or from competing online services such as GOG.com. Customers who disagree with the Xbox One license have nowhere else to go. When a game is available only with restrictive DRM, as many are, the publisher and/or developer holds the responsibility. There is little stopping a game from being released DRM-free at launch, as The Witcher 3 will be, trusting users to be ethical with their bits.

Unfortunately for the Xbox division, controlling the point of sale is how Microsoft expects to recover the cost of the subsidized hardware. Their certification and retail policies cannot be circumvented because those policies are their business model: lose some money acquiring customers who then have no choice but to give you money in return.

This is not the case on Windows, Mac, and Linux. It is easy to confuse Steam with "PC gaming", however, because of how ubiquitous it is. Valve was early, they were compelling, but most of all they were consistent. Their trust was earned and, moreover, is not even required to enjoy the PC platform.

Source: Steam
Subject: Editorial
Manufacturer: AMD

Retiring the Workhorses

There is an inevitable shift coming.  Honestly, this has been quite obvious for some time, but it has taken AMD a bit longer to get here than many expected.  Some years back we saw AMD release their new motto, “The Future is Fusion”.  While many thought it somewhat interesting, if trite, it actually foreshadowed the massive shift from monolithic CPU cores to their APUs.  Right now AMD’s APUs are doing “ok” in desktops and are gaining traction in mobile applications.  What most people do not realize is that AMD will be going all APU, all the time, in the very near future.

AMD_Fusion.jpg

We can look over the past few years and see that AMD has been headed in this direction for some time, but they simply have not had all the materials in place to make this dramatic shift.  To get a better understanding of where AMD is heading, how they plan to address multiple markets, and what kind of pressures they are under, we have to look at the two major non-APU markets that AMD is currently hanging onto by a thread.  In some ways, timing has been against AMD, not to mention available process technologies.

Click here to read the entire editorial!

 

PC Perspective Hardware Workshop 2013 @ Quakecon 2013 in Dallas, TX

Subject: Editorial, General Tech | August 6, 2013 - 02:30 PM |
Tagged: workshop, video, streaming, quakecon, prizes, live, giveaways

UPDATE: Did you miss the live event?  Check out the replay below!  Thanks to everyone that helped make it possible and see you next year!

It is that time of year again: another installment of the PC Perspective Hardware Workshop!  Once again we will be presenting on the main stage at Quakecon 2013, held in Dallas, TX, August 1-4.

workshop_logo.png

Main Stage - Quakecon 2013

Saturday, August 3rd, 12:30pm CT

Our thanks go out to the organizers of Quakecon for allowing us and our partners to put together a show that we are proud of every year.  We love giving back to the community of enthusiasts and gamers that drive us to do what we do!  Get ready for two hours of prizes, games, and raffles; the chances are pretty good that you'll take something out with you - really, they are pretty good!

Our thanks for this year's workshop logo go to John Pastor!!

Our primary partners at the event are those that pitched in to make hosting the workshop at Quakecon possible and to supply the hundreds of shirts we have ready to toss out!  Our thanks to NVIDIA, Western Digital and Corsair!!

nvidia_logo_small.png

wdc_logo_small.png

corsair_logo_small.png

Live Streaming

If you can't make it to the workshop - don't worry!  You can still watch the workshop live on our page right here as we stream it over one of several online services.  Just remember this URL: http://pcper.com/workshop and you will find your way!

 

PC Perspective LIVE Podcast and Meetup

We are planning on hosting any fans that want to watch us record our weekly PC Perspective Podcast (http://pcper.com/podcast) on Wednesday evening in our meeting room at the Hilton Anatole.  I don't yet know exactly WHEN or WHERE it will be, but I will update this page accordingly on Wednesday, July 31st when we get the details.  You might also consider following me on Twitter for updates on that status as well.

After the recording, we'll hop over to the hotel bar for a couple drinks and hang out.  We have room for at least 50-60 people to join us, but we'll still be recording if just ONE of you shows up.  :)

 

Prize List (will continue to grow!)

Continue reading to see the list of prizes for the workshop!!!

Qualcomm works out their ARMs and not their core?

Subject: Editorial, General Tech, Processors, Mobile | August 3, 2013 - 07:21 PM |
Tagged: qualcomm, Intel, mediatek, arm

MediaTek, do you even lift?

According to a Taiwan media roundtable transcript discovered by IT World, Qualcomm has no interest, at least at the moment, in developing an octo-core processor. MediaTek, their competitor, recently unveiled an eight-core ARM System on a Chip (SoC) in which all eight cores can be utilized at once. Most other mobile SoCs with eight cores function as a fast quad-core paired with a slower, but more efficient, quad-core, with the most appropriate cluster chosen for the task at hand.
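
For context, the cluster-switching arrangement used by most of those other SoCs (ARM brands it big.LITTLE) behaves roughly like the toy model below; the threshold and cluster labels are invented for illustration and do not describe any particular chip:

```python
# Toy model of big.LITTLE-style cluster switching: the scheduler picks
# the fast or the efficient quad-core cluster based on demand.
# Threshold and cluster names are invented for illustration.

FAST_CLUSTER = "4x Cortex-A15 class (fast, power-hungry)"
SLOW_CLUSTER = "4x Cortex-A7 class (slower, efficient)"

def pick_cluster(cpu_load_percent):
    # Light workloads stay on the efficient cores; heavy ones migrate.
    return FAST_CLUSTER if cpu_load_percent > 60 else SLOW_CLUSTER

for load in (10, 45, 85):
    print(f"{load:3d}% load -> {pick_cluster(load)}")
```

MediaTek's pitch is that all eight cores can run simultaneously instead of one cluster sitting idle.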

qualcomm.png

Anand Chandrasekher of Qualcomm believes it is desperation.

So, I go back to what I said: it's not about cores. When you can't engineer a product that meets the consumers' expectations, maybe that’s when you resort to simply throwing cores together. That is the equivalent of throwing spaghetti against the wall and seeing what sticks. That's a dumb way to do it and I think our engineers aren't dumb.

The moderator, clearly amused by the reaction, requested a firm clarification that Qualcomm will not launch an octo-core product. A firm, but not clear, response was given: "We don't do dumb things". Of course they would not commit to swearing off eight cores for all eternity; at some point they may find core count to be their bottleneck, but that is not the case for the moment. They would also not discuss whether bumping the clock rate is the best option or whether they should focus on graphics performance. Chandrasekher simply maintained that they are focused on the best experience for whatever scenario each product is designed to solve.

And he is confident that Intel, his former employer, still cannot catch them. As we have discussed in the past, Intel is a company that will spend tens of billions of dollars, year over year, to out-research you if they genuinely want to play in your market. Even with his experience at Intel, he continues to take them lightly.

We don't see any impact from any of Intel's claims on current or future products. I think the results from empirical testers on our products that are currently shipping in the marketplace is very clear, and across a range of reviewers from Anandtech to Engadget, Qualcomm Snapdragon devices are winning both on experience as well as battery life. What our competitors are claiming are empty promises and is not having an impact on us.

Qualcomm has a definite lead at the moment and may very well keep it through Bay Trail. AMD, too, kept a lead throughout the entire Athlon 64 generation and believed they could beat anything Intel could develop. They were complacent, much as Qualcomm sounds now, and when Intel caught up, AMD could not float above the sheer volume of money trying to drown them.

Then again, even if you are complacent, you may still be the best. Maybe Intel will never get a Conroe moment against ARM.

Your thoughts?

Source: IT World

NVIDIA CloudLight: White Paper for Lighting in the Cloud

Subject: Editorial, General Tech | August 3, 2013 - 04:03 PM |
Tagged: nvidia, CloudLight, cloud gaming

Trust the cloud... be the cloud.

The executives on stage might as well have waved their hands while reciting that incantation during the announcement of the Xbox One. Why not? The audience would have just assumed Don Mattrick was trying to get some weird Kinect achievement on stage. You know, kill four people with one laser beam while trying to sink your next-generation platform in a ranked keynote. 50 Gamerscore!

cloudlight.png

Microsoft stated, during and after the keynote, that each Xbox One would have access to cloud servers for certain processing tasks. Xbox Live would receive enough servers that each console could access three times its own performance, at launch, to do... stuff. You know, things that are hard to calculate but are not too dependent upon latency. You know what we mean, right?

Apparently Microsoft did not realize that was a detail they were supposed to sell us on.

In the meantime, NVIDIA has been selling us on offloading computation to cloud architectures. We know Global Illumination (GI) is a very complicated problem; much of the last couple of decades of graphics research has been spent progressively removing approximations to what light truly does.

CloudLight is their research project, presented at SIGGRAPH Asia and via Williams College, demonstrating server-processed indirect lighting. In their video, each of the three effects is demonstrated at multiple latencies. The results look pretty good until about 500 ms, at which point the brightest areas are noticeably in the wrong locations.

cloudlight-2.jpg

Again, the video is available here.

The three methods used to generate indirect lighting are: irradiance maps, where lightmaps are continuously recalculated on a server and streamed to clients as H.264 video; photons, where the server raytraces lighting for the scene as previous rays expire and streams only the most current rays to the clients that need them; and voxels, where fully computed frames are streamed to the clients. The most interesting part is that, as you add more users, server processing in most cases remains fairly constant.
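
As a rough sketch of the client's job in the irradiance map case, the idea is to decode the most recent streamed lightmap and combine it with locally computed direct lighting. This is a conceptual sketch with numpy standing in for GPU shading, not NVIDIA's actual code:

```python
# Conceptual sketch of the irradiance-map client from the paper's
# description: indirect lighting arrives as a video stream, direct
# lighting stays local. numpy stands in for GPU shading here; this is
# not NVIDIA's implementation.
import numpy as np

def decode_streamed_lightmap(h264_frame):
    # Stand-in for an H.264 decode of the server-computed irradiance map.
    return np.frombuffer(h264_frame, dtype=np.uint8).astype(np.float32) / 255.0

def shade(albedo, direct_light, indirect_lightmap):
    # Final radiance = albedo * (direct + indirect): the client only
    # computes the cheap, latency-sensitive direct term itself.
    return albedo * (direct_light + indirect_lightmap)

albedo = np.full(16, 0.8)                    # toy 16-texel surface
direct = np.linspace(0.0, 1.0, 16)           # locally computed direct term
frame = bytes(range(16))                     # pretend H.264 payload
indirect = decode_streamed_lightmap(frame)   # server-computed indirect term
print(shade(albedo, direct, indirect).round(2))
```

Because the indirect term changes slowly, a stale frame degrades gracefully instead of breaking the image, which is the whole premise.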

It should be noted, however, that each of these demonstrations only moved the most intense lights slowly. I would expect an effect such as switching on a light in an otherwise dark room to produce visible "pop-in" if the indirect lighting lags too far behind user interaction or the instantaneous dynamic lights.

That said, for a finite number of instant switches, it would be possible for a server to render every outcome in advance and have the client choose the appropriate lightmap (or the appropriate set of pixels from the same, large, lightmap). For an Unreal Tournament 3 mod, I experimented with using a Global Illumination solver to calculate lighting. My intention was to allow users to turn a handful of lights in each team's base on and off. As lights were shot out or activated by a switch, the shader would switch to the appropriate pre-rendered solution. I would expect a similar method to work here.
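
A minimal sketch of that pre-rendered switching scheme, reconstructed from memory rather than taken from the mod's code: pack the light on/off states into a bitmask and use it to index the baked solutions. The bake count grows as 2^N, which is why a handful of lights is the practical limit:

```python
# Sketch of switching between pre-baked GI solutions as lights toggle.
# With N toggleable lights there are 2**N baked lightmaps; the current
# on/off states form a bitmask that selects one. Illustrative only.

def bake_all_solutions(num_lights):
    # Stand-in for the offline GI solver: one "lightmap" per combination.
    return {mask: f"lightmap_{mask:0{num_lights}b}"
            for mask in range(2 ** num_lights)}

def current_mask(light_states):
    mask = 0
    for i, on in enumerate(light_states):
        if on:
            mask |= 1 << i
    return mask

solutions = bake_all_solutions(3)        # 3 switchable lights -> 8 bakes
states = [True, False, True]             # lights 0 and 2 are on
print(solutions[current_mask(states)])   # -> lightmap_101
```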

What other effects do you believe can withstand a few hundred milliseconds of latency?

Source: NVIDIA
Subject: Editorial
Manufacturer: Quakecon

The Densest 2.5 Hours Imaginable

John Carmack again kicked off this year's Quakecon with an extended technical discussion covering nearly every topic bouncing around his head.  These speeches are somewhat legendary for their depth of discussion on what are often esoteric topics, but they typically expose some very important sea changes in the industry, both in terms of hardware and software.  John was a bit more organized and succinct this year, keeping things in check with some 300 lines of notes on topics he thought would be interesting for us.
 
Next Generation Consoles
 
John cut to the chase and started the discussion with the upcoming generation of consoles.  John was both happy and sad that we are moving to a new generation of products.  He feels that developers finally have a good handle on optimizing the previous generation of consoles, extracting every ounce of performance to create some interesting content.  Still, the advantages of a new generation of consoles are very obvious, and that is particularly exciting for John.
 
31978_06_pre_orders_for_next_gen_xbox_one_controllers_and_headsets_now_open_full.jpg
 
The two major consoles are very, very similar.  There are of course differences between the two, but the basis for each is very much the same.  As we well know, the two consoles feature APUs designed by AMD and share a lot of similarities.  The Sony hardware is a bit more robust and has more memory bandwidth, but when all is said and done, the similarities outweigh the differences by a large margin.  John mentioned that this was very good for AMD, as they are still in second place in performance among current architectures, compared to Intel and their world-class process technology.
 
Some years back there was a thought that Intel would in fact take over the next generation of consoles.  Larrabee was an interesting architecture in that it melded x86 CPUs with robust vector units in a high speed fabric on a chip.  With Intel's prowess in process technology, this seemed a logical move for the console makers.  Time has passed, and Intel did not execute on Larrabee as many had expected.  While the technology has been implemented in the current Xeon Phi product, it has never hit the consumer world.
 

Mozilla Labs: Give Advertisers Only... Exactly What They Want

Subject: Editorial, General Tech | July 31, 2013 - 08:03 PM |
Tagged: Privacy, mozilla, DNT

Mozilla Labs is researching a new approach to the problem of privacy and targeted advertising: allow users to provide, directly, the data that honest advertisers would otherwise acquire by tracking their behavior. The hope is that users who manage their own privacy will not have companies try to do it for them.

Internet users are growing concerned about how they are tracked and monitored online. Crowds rally behind initiatives such as Do Not Track (DNT) and neutering the NSA because of an assumed promise of privacy, even if it is just superficial.

do not hurt.jpg

DNT, for instance, is a web developer tool permitting honest sites to be less shy when considering features which make privacy advocates poop themselves and flee to competing pages. Users, who were not the intended audience of the feature, threw a fit because it failed to satisfy their privacy concerns. Internet Explorer, which is otherwise becoming a great browser, decided to break the standard by not providing the default, "user has not specified", value.
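
For reference, the header itself is trivial; the non-trivial part is that compliance is voluntary. A minimal sketch of how an honest server might interpret it (the interpretation logic is mine, though the header values follow the Tracking Preference Expression draft):

```python
# Sketch of how an honest server might interpret the DNT request header.
# Per the Tracking Preference Expression draft: "1" = do not track,
# "0" = tracking permitted, absent = no preference expressed.

def tracking_allowed(headers):
    dnt = headers.get("DNT")
    if dnt == "1":
        return False    # user opted out; honest sites comply
    if dnt == "0":
        return True     # explicit consent
    return True         # no preference: the site's default applies

print(tracking_allowed({"DNT": "1"}))   # False
print(tracking_allowed({}))             # True (no preference expressed)
```

IE's choice to send "1" by default collapses the third case into the first, which is exactly why advertisers argue they can ignore it.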

Of course, all this does is hand honest web developers a broken tool; immoral and arrogantly amoral sites will track anyway.

Mozilla Labs is currently investigating another solution. We could, potentially, at some point, see an addition to Firefox which distills all of the information honest websites would like to know into a summary which users could selectively share. This, much like DNT, will not prevent companies or other organizations from tracking you but rather give most legitimate situations a fittingly legitimate alternative.

All of this data, such as history and geolocation, is already stored by browsers as a result of how they operate. This concept allows users to release some of that information to the sites they visit and, ideally, satisfy both parties. Maybe then those who actually are malicious cannot shrug off their actions as a common industry requirement.
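
In rough terms, the concept amounts to something like the sketch below. Mozilla has published no format, so the field names and structure here are entirely hypothetical:

```python
# Hypothetical sketch of the concept: the browser already holds this
# data; the user chooses which fields to release per site. Field names
# and structure are invented; Mozilla has published no format.

browser_profile = {
    "interests": ["pc hardware", "gaming", "photography"],
    "geolocation": "Toronto, CA",
    "language": "en-CA",
    "history_domains": ["pcper.com", "mozilla.org"],
}

user_grants = {
    "pcper.com": {"interests", "language"},   # share only these fields
}

def disclose(site):
    allowed = user_grants.get(site, set())
    return {k: v for k, v in browser_profile.items() if k in allowed}

print(disclose("pcper.com"))        # interests + language only
print(disclose("tracker.example"))  # {} - nothing granted
```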

Source: Mozilla

Time Warner Cable Increasing Modem Rental Fee

Subject: Editorial | July 30, 2013 - 07:08 AM |
Tagged: time warner, Internet, cable modem, cable isp

According to the Consumerist, cable ISP Time Warner Cable is increasing its modem rental fee to $5.99 per month. The company initially instituted a $3.95 rental fee in October of last year, and it is already ratcheting the fee up further for the same hardware people are already using.

The new rental fee will be $5.99 per month for most accounts (it seems SignatureHome with TV is exempted from the rental fee), which works out to $71.88 per year. For comparison, the previous $3.95 fee worked out to $47.40 a year. As such, Time Warner Cable is looking at a healthy increase in revenue from the higher fee, which ISI analyst Vijay Jayant estimates at around an extra $150 million, according to Reuters. That's a lot of money, and even zooming in to the per-customer numbers it is a large increase. In fact, at $71.88 per year in rental fees, it is now cheaper to buy a new DOCSIS 3 modem outright.

For example, the Motorola SB6121 is on TWC's approved modem list and is $69.45 on Amazon, which would save you about 20 cents a month in the first year and the entire rental fee in later years. I have been using this modem (albeit on Comcast) since February 2012 and I can easily recommend it. The math looks even better if you are on a slower tier and can get by with a DOCSIS 2 modem, some of which can be purchased for around $40 used.
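
The break-even arithmetic is simple enough to sanity-check with the prices quoted above:

```python
# Break-even check on buying a modem outright versus renting.
RENTAL_PER_MONTH = 5.99   # new TWC fee
OLD_RENTAL_PER_MONTH = 3.95
MODEM_PRICE = 69.45       # Motorola SB6121 on Amazon at time of writing

months_to_break_even = MODEM_PRICE / RENTAL_PER_MONTH
print(f"Pays for itself in {months_to_break_even:.1f} months")   # ~11.6

# Yearly totals quoted in the article:
print(f"New fee per year: ${RENTAL_PER_MONTH * 12:.2f}")       # $71.88
print(f"Old fee per year: ${OLD_RENTAL_PER_MONTH * 12:.2f}")   # $47.40
```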

Time Warner Cable Increasing Modem Rental Fees.jpg

Alongside the fee increase announcement, Time Warner Cable claims that it has increased speeds on its most popular internet service tier and added Wi-Fi hotspots for customers over the past year. Unless you are on that "most popular" tier, however, it's hardly enough good news to outweigh the fee increase, and it is likely to only further the outrage from customers who have already attempted class action lawsuits over the $3.95 fee.

Fortunately, bringing your own modem is an option, and it is a simple process, whether you object to the fee increase on principle or just want to save a bit of money.

Source: Consumerist

Google Launches $35 Chromecast Media Streaming Stick

Subject: Editorial, General Tech | July 27, 2013 - 03:39 AM |
Tagged: streaming, media, google, chrome, chromecast, chrome os

Earlier this week, web search giant Google launched a new portable media streaming device called the Chromecast. The Chromecast is a small device, about the size of a large USB flash drive, with a full size HDMI video output, a micro USB power jack, and Wi-Fi connectivity. The device runs Google's Chrome OS and is able to display or play back any web page or media file that the Chrome web browser can.

Chromecast.jpg

The Chromecast is designed to plug into televisions and stream media from the internet. Eventually, users will be able to "cast" embedded media files or web pages from a smartphone, tablet, or PC running Android, iOS, Windows, or Mac OS X with a Chrome web browser over to the Chromecast. The sending device points the Chromecast at the requisite URL where the streaming media or web page resides, along with any authorization tokens needed to access content behind a pay-wall or username/password login. From there, the Chromecast itself reaches out to the Internet over its Wi-Fi radio, retrieves the web page or media stream, and outputs it to the TV over HDMI. Playback controls remain accessible on the sending device, such as an Android smartphone, but it is the Chromecast itself that is streaming the media, unlike solutions such as wireless HDMI, AirPlay, DLNA, or Miracast. As such, the sending device is able to perform other tasks while the Chromecast handles the media streaming.
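
Conceptually, the handoff looks something like the following sketch. Google has not published the wire protocol, so the message structure and function names here are invented for illustration; this is not the actual Cast API:

```python
# Hypothetical sketch of the "cast" handoff described above: the sender
# passes a URL plus auth token; the Chromecast fetches the stream itself.
# Message structure and names are invented; this is not Google's protocol.

def sender_cast(content_url, auth_token):
    # Phone/tablet/PC side: hand off the pointer, then go do other things.
    return {"type": "LOAD", "url": content_url, "token": auth_token}

def chromecast_receive(message):
    # Device side: pull the stream over its own Wi-Fi connection.
    print(f"Fetching {message['url']} (auth: {message['token'][:4]}...)")
    print("Decoding and outputting over HDMI; sender is now free.")

msg = sender_cast("https://example.com/stream.m3u8", "abcd1234")
chromecast_receive(msg)
```

Passing a pointer rather than mirroring pixels is the design choice that separates this from wireless HDMI or Miracast.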

At launch, users will be able to use the Chromecast to stream Netflix, YouTube, and Google Play videos. At some point in the future, Google will add support for additional apps, including Pandora Internet radio streaming. Beyond that (and this feature is still in development), users will be able to share entire Chrome tabs with the Chromecast (some reports indicate that this tab sharing is done using the WebRTC standard). Users will need to download and install the Google Cast extension, which puts a button to the right of the URL bar that, when pressed, will "cast" the tab to the Chromecast for display on the TV. When on a website that implements the SDK, users will have additional options for sharing just the video and using the PC as a remote, along with handy playback and volume controls.

Alternatively, Google is releasing a Chromecast SDK that will allow developers to integrate their streaming media with the Chromecast. Instead of needing to share an entire tab, web developers or mobile app developers will be able to integrate casting functionality that lets users send solely the streaming media to the Chromecast, similar to the upcoming ability to stream just the YouTube or Netflix video itself rather than the entire web page the video is embedded into. Unfortunately, there is currently a caveat: developers must have all their apps (using the Chromecast SDK) approved by Google.

Chromecast Chrome Browser Tab Sharing.jpg

Sharing ("Casting") a Chrome web browser tab to a TV from a PC using the Chromecast.

It should be noted that Wired has reported success in using the tab sharing functionality to play back local media by directing Chrome to open locally-stored video files, but this is not a perfect solution, as Chrome can only play a limited number of formats in a window and audio sync proved tricky at times. With that said, the Chromecast is intended to be an Internet streaming device, and Google is marketing it as such, so it is difficult to fault the Chromecast for local streaming issues. There are better solutions for getting the most out of your LAN-accessible media, after all.

The Chromecast is $35 and will ship as soon as August 7, 2013 from the Google Play Store. Amazon and Best Buy had stock listed on their websites until yesterday, when both e-tailers sold out (though you might be lucky enough to find a Chromecast at a brick and mortar Best Buy store). For $35, you get the Chromecast itself, a rigid HDMI extender that moves the Chromecast closer to the edge of the TV to make installation and removal easier, and a USB power cord. Google was initially also offering three free months of Netflix Instant streaming but has since pulled the promo due to overwhelming demand; if Google can continue to sell out of Chromecasts without spending money on Netflix for each unit, it is going to do that to bolster the profit margin on the inexpensive gadget, despite the PR hit (or at least some disappointed buyers).

The Chromecast does have its flaws, and the launch was not perfect (many OS support and device features are still being worked on), but at $35 it is a simple impulse buy for a device that should only get better from here as the company further fleshes out the software. Even on the off chance that Google abandons the Chromecast, it can still stream Netflix, YouTube, and Google Play for a pittance.

Keep an eye on the Google blog for more information about the Chromecast. The device is currently listed on the Google Play store for pre-order.

Source: Google

Intel Announces Q2 Results: Below Expectations, but Still Good

Subject: Editorial | July 17, 2013 - 09:34 PM |
Tagged: silvermont, quarterly results, money, Lenovo, k900, Intel, atom, 22 nm tri-gate, 14 nm

Intel announced their Q2 results for this year, and they did not quite meet expectations.  When I say expectations, I usually mean “make absolutely obscene amounts of money”.  It seems that Intel was just shy of estimates, and margins were only slightly lower than expected.  That being said, Intel reported revenue of $12.8 billion US and a net income of $2 billion US.  Not… too… shabby.

Analysts were of course expecting more, but it seems the PC slowdown is in fact having a material effect on the market.  Intel cut estimates earlier this quarter, so this was not exactly a surprise.  Margins came in around 58.3%, but they are expected to recover going into Q3.  Intel is certainly still in a strong position, as millions of PCs are shipped every quarter and Intel remains the dominant CPU maker in that market.

Intel-Swimming-in-Money.jpg

Intel has been trying to get into the mobile market, as it still exhibits strong growth not only now but over the next several years as things become more and more connected.  Intel had ignored this market for some time, much to their dismay.  Their Atom-based chips were slow to improve and typically used a last-generation process node for cost savings.  In the face of a strong portfolio of ARM-based products from companies like Qualcomm, Samsung, and Rockchip, the Intel Atom was simply not an effective solution until the latest batch of chips arrived.  Products like the Atom Z2580, which powers the Lenovo K900 phone, were late to market as compared to 28 nm products such as the Snapdragon series from Qualcomm.

Intel expects the next generation of Atom, Silvermont, built on its 22 nm Tri-Gate process, to be much more competitive with the latest offerings from its ARM-based competitors.  Unfortunately for Intel, we do not expect to see Silvermont-based products until later in Q3, with availability in late Q4 or Q1 2014.  Intel needs to move chips, but this will be a very different market than the one they are used to.  These SoCs have decent margins, but they are nowhere near what Intel earns on their traditional notebook, desktop, and server CPUs.

To help cut costs going forward, it seems Intel will be pulling back on its plans for 14 nm production.  Expenditures and floor space/equipment for 14 nm will be cut back from what previous plans had held.  Intel still hopes to start 14 nm production at the end of this year, with the first commercial products hitting at the end of 2014.  There are questions as to how viable 14 nm will be as a fully ramped process in 2014.  Eventually 14 nm will work as advertised, but it appears the kinks are much more complex than anticipated, given how quickly Intel ramped 22 nm.

Intel has plenty of money, a dominant position in the x86 world, and world-class process technology on which to base future products.  I would say that they are still in very, very good shape.  The market is ever-changing, and Intel is still fairly nimble given their size.  They also recognize shifts in the marketplace (albeit sometimes a bit later than expected) and invariably craft a plan of attack to address their shortcomings.  While Intel's revenue seems to have peaked last year, they are aggressively addressing new markets while holding onto their dominant position in the notebook, desktop, and server markets.  Intel expects Q3 to be up, but overall 2013 sales to be flat compared to 2012.  Have I mentioned they still cleared $2 billion in a down quarter?

Source: Intel