PC Perspective Hardware Workshop 2013 @ Quakecon 2013 in Dallas, TX

Subject: Editorial, General Tech | August 6, 2013 - 11:30 AM |
Tagged: workshop, video, streaming, quakecon, prizes, live, giveaways

UPDATE: Did you miss the live event?  Check out the replay below!  Thanks to everyone that helped make it possible and see you next year!

It is that time of year again: another installment of the PC Perspective Hardware Workshop!  Once again we will be presenting on the main stage at Quakecon 2013, held in Dallas, TX, August 1-4.

workshop_logo.png

Main Stage - Quakecon 2013

Saturday, August 3rd, 12:30pm CT

Our thanks go out to the organizers of Quakecon for allowing us and our partners to put together a show that we are proud of every year.  We love giving back to the community of enthusiasts and gamers that drive us to do what we do!  Get ready for 2 hours of prizes, games, and raffles - the chances are pretty good that you'll take something home with you. Really, they are pretty good!

Our thanks for this year's workshop logo go to John Pastor!!

Our primary partners at the event are those who chipped in to make hosting the workshop at Quakecon possible and to supply the hundreds of shirts we have ready to toss out!  Our thanks to NVIDIA, Western Digital and Corsair!!

nvidia_logo_small.png

wdc_logo_small.png

corsair_logo_small.png

Live Streaming

If you can't make it to the workshop - don't worry!  You can still watch the workshop live on our page right here as we stream it over one of several online services.  Just remember this URL: http://pcper.com/workshop and you will find your way!

 

PC Perspective LIVE Podcast and Meetup

We are planning on hosting any fans that want to watch us record our weekly PC Perspective Podcast (http://pcper.com/podcast) on Wednesday evening in our meeting room at the Hilton Anatole.  I don't yet know exactly WHEN or WHERE it will be held, but I will update this page accordingly on Wednesday, July 31st when we get the details.  You might also consider following me on Twitter for updates on that as well.

After the recording, we'll hop over to the hotel bar for a couple of drinks and hang out.  We have room for at least 50-60 people to join us in the room, but we'll still be recording if just ONE of you shows up.  :)

 

Prize List (will continue to grow!)

Continue reading to see the list of prizes for the workshop!!!

Qualcomm works out their ARMs and not their core?

Subject: Editorial, General Tech, Processors, Mobile | August 3, 2013 - 04:21 PM |
Tagged: qualcomm, Intel, mediatek, arm

MediaTek, do you even lift?

According to a Taiwan Media Roundtable transcript, discovered by IT World, Qualcomm has no interest, at least at the moment, in developing an octo-core processor. MediaTek, their competitor, recently unveiled an eight-core ARM System on a Chip (SoC) whose cores can all be used simultaneously. Most other mobile SoCs with eight cores pair a fast quad-core with a slower, but more efficient, quad-core, choosing the most appropriate cluster for the task.
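
To illustrate the difference in code, here is a toy Python sketch of the fast/efficient cluster switching described above; the threshold and names are made up for illustration and are not any vendor's actual scheduler.

# Toy sketch of the heterogeneous eight-core idea: two quad-core
# clusters, with the workload routed to whichever is more appropriate.

def pick_cluster(load, threshold=0.6):
    """Choose which quad-core cluster should run the workload."""
    return "fast quad-core" if load > threshold else "efficient quad-core"

for load in (0.2, 0.9):
    print(f"load {load:.0%} -> {pick_cluster(load)}")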

qualcomm.png

Anand Chandrasekher of Qualcomm believes it is desperation.

So, I go back to what I said: it's not about cores. When you can't engineer a product that meets the consumers' expectations, maybe that’s when you resort to simply throwing cores together. That is the equivalent of throwing spaghetti against the wall and seeing what sticks. That's a dumb way to do it and I think our engineers aren't dumb.

The moderator, clearly amused by the reaction, requested a firm clarification that Qualcomm will not launch an octo-core product. A firm, but not clear, response was given: "We don't do dumb things". Of course they would not commit to swearing off eight cores for all eternity; at some point they may find core count to be their bottleneck, but that is not the case for the moment. They will also not discuss whether bumping the clock rate is the best option or whether they should focus on graphics performance. He simply insists that they are focused on the best experience for whatever scenario each product is designed to solve.

And he is assured that Intel, his former employer, still cannot catch them. As we have discussed in the past: Intel is a company that will spend tens of billions of dollars, year over year, to out-research you if they genuinely want to play in your market. Even with his experience at Intel, he continues to take them lightly.

We don't see any impact from any of Intel's claims on current or future products. I think the results from empirical testers on our products that are currently shipping in the marketplace is very clear, and across a range of reviewers from Anandtech to Engadget, Qualcomm Snapdragon devices are winning both on experience as well as battery life. What our competitors are claiming are empty promises and is not having an impact on us.

Qualcomm has a definite lead, at the moment, and may very well keep ahead through Bay Trail. AMD, too, kept a lead throughout the entire Athlon 64 generation and believed they could beat anything Intel could develop. They were complacent, much as Qualcomm sounds currently, and when Intel caught up AMD could not float above the sheer volume of money trying to drown them.

Then again, even if you are complacent, you may still be the best. Maybe Intel will never get a Conroe moment against ARM.

Your thoughts?

Source: IT World

NVIDIA CloudLight: White Paper for Lighting in the Cloud

Subject: Editorial, General Tech | August 3, 2013 - 01:03 PM |
Tagged: nvidia, CloudLight, cloud gaming

Trust the cloud... be the cloud.

The executives on stage might as well have waved their hands while reciting that incantation during the announcement of the Xbox One. Why not? The audience would have just assumed Don Mattrick was trying to get some weird Kinect achievement on stage. You know, kill four people with one laser beam while trying to sink your next-generation platform in a ranked keynote. 50 Gamerscore!

cloudlight.png

Microsoft stated, during and after the keynote, that each Xbox One would have access to cloud servers for certain processing tasks. Xbox Live would be receiving enough servers such that each console could access three times its performance, at launch, to do... stuff. You know, things that are hard to calculate but are not too dependent upon latency. You know what we mean, right?

Apparently Microsoft did not realize that was a detail they were supposed to sell us on.

In the meantime, NVIDIA has been selling us on offloading computation to cloud architectures. We know Global Illumination (GI) is a very complicated problem; much of the last couple of decades of graphics progress has been spent progressively removing approximations to what light truly does.

CloudLight is their research project, presented at SIGGRAPH Asia and via Williams College, to demonstrate server-processed indirect lighting. In their video, each of the three techniques is demonstrated at multiple latencies. The results look pretty good until about 500 ms, at which point the brightest points are noticeably in the wrong locations.

cloudlight-2.jpg

Again, the video is available here.

The three methods used to generate indirect lighting are: irradiance maps, where lightmaps are continuously calculated on a server and streamed to the client as H.264 video; photons, which raytrace lighting for the scene as previous rays expire and stream only the most current ones to the clients that need them; and voxels, which stream fully computed frames to the clients. The most interesting part is that, as you add more users, server processing remains fairly constant in most cases.

It should be noted, however, that each of these demonstrations only moved the most intense lights slowly. I would expect an effect such as switching a light on in an otherwise dark room to create a "pop-in" effect if it lags too far behind user interaction or the instantaneous dynamic lights.

That said, for a finite number of instant switches, it would be possible for a server to render both results and have the client choose the appropriate lightmap (or the appropriate set of pixels from the same, large, lightmap). For an Unreal Tournament 3 mod, I was experimenting with using a Global Illumination solver to calculate lighting. My intention was to allow users to turn on and off a handful of lights in each team's base. As lights were shot out or activated by a switch, the shader would switch to the appropriate pre-rendered solution. I would expect a similar method to work here.
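
For the curious, the trick reduces to something like this Python sketch (purely illustrative; the CloudLight paper does not prescribe it): with N switchable lights, pre-render a GI solution for each on/off combination and index into them with a bitmask.

# With N switchable lights, the server pre-renders 2**N GI solutions;
# the client picks one by packing light states into a bitmask.

def lightmap_index(light_states):
    index = 0
    for bit, is_on in enumerate(light_states):
        if is_on:
            index |= 1 << bit
    return index

# Three switchable lights; the first and third are on -> solution #5.
print(lightmap_index([True, False, True]))  # 5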

What other effects do you believe can withstand a few hundred milliseconds of latency?

Source: NVIDIA

Mozilla Labs: Give Advertisers Only... Exactly What They Want

Subject: Editorial, General Tech | July 31, 2013 - 05:03 PM |
Tagged: Privacy, mozilla, DNT

Mozilla Labs is researching a new approach to the problem of privacy and targeted advertising: allow the user to provide the data that honest advertisers intend to acquire by tracking behavior. The hope is that if users manage their own privacy, companies will not try to do it for them.

Internet users are growing concerned about how they are tracked and monitored online. Crowds rally behind initiatives, such as Do Not Track (DNT) and neutering the NSA, because of an assumed promise of privacy even if it is just superficial.

do not hurt.jpg

DNT, for instance, is a web developer tool permitting honest sites to be less shy when considering features which make privacy advocates poop themselves and go to competing pages. Users, who were not the intended audience of this feature, threw a fit because it failed to satisfy their privacy concerns. Internet Explorer, which is otherwise becoming a great browser, decided to break the standard by not providing the default, "user has not specified", value.

Of course, all this does is hand honest web developers a broken tool; immoral and arrogantly amoral sites will track anyway.
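
For context, the header itself is trivial. A minimal Python, WSGI-style sketch of how a server might read it, assuming the three-state semantics of the DNT draft, shows why a forced default destroys the signal:

# "1" means opt out, "0" means opt in, absent means no preference.

def dnt_preference(environ):
    value = environ.get("HTTP_DNT")  # WSGI's name for the DNT header
    if value == "1":
        return "opt-out"
    if value == "0":
        return "opt-in"
    return "unspecified"

# A browser that always sends DNT: 1 makes a deliberate opt-out
# indistinguishable from "unspecified", which is the complaint
# against Internet Explorer's default.
print(dnt_preference({"HTTP_DNT": "1"}))  # opt-out
print(dnt_preference({}))                 # unspecified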

Mozilla Labs is currently investigating another solution. We could, potentially, at some point, see an addition to Firefox which distills all of the information honest websites would like to know into a summary which users could selectively share. This, much like DNT, will not prevent companies or other organizations from tracking you but rather give most legitimate situations a fittingly legitimate alternative.

All of this data, such as history and geolocation, is already stored by browsers as a result of how they operate. This concept allows users to release some of this information to the sites they visit and, ideally, satisfy both parties. Maybe then those who actually are malicious cannot shrug off their actions as a common industry requirement.

Source: Mozilla

Time Warner Cable Increasing Modem Rental Fee

Subject: Editorial | July 30, 2013 - 04:08 AM |
Tagged: time warner, Internet, cable modem, cable isp

According to the Consumerist, cable ISP Time Warner Cable is increasing its modem rental fee to $5.99 per month. The company initially instituted a $3.95 rental fee in October of last year, and it is already ratcheting the fee up further for the same hardware people are using.

The new rental fee will be $5.99 per month for most accounts (it seems SignatureHome with TV is exempted from the fee), which works out to $71.88 per year. For comparison, the previous $3.95 fee would be $47.40 a year. As such, Time Warner Cable is looking at a healthy increase in revenue from the higher fee, which ISI analyst Vijay Jayant estimates to be around (an extra) $150 million according to Reuters. That’s a lot of money, but even zooming in to the per-customer numbers, it is a large increase. In fact, at $71.88 per year in rental fees, it is now cheaper to buy a new DOCSIS 3 modem outright.

For example, the Motorola SB6121 is on TWC’s approved modem list and is $69.45 on Amazon, which would save you about 20 cents a month over the first year and the full rental fee every year after that. I have been using this modem (albeit on Comcast) since February 2012 and I can easily recommend it. Things look even better in favor of buying your own modem if you are on a slower tier and can get by with a DOCSIS 2 modem, some of which can be purchased for around $40 used.
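
The break-even math is simple enough to check yourself; here is a quick Python sketch using the prices from this article (the SB6121's street price will, of course, vary):

# Buying versus renting, using the figures above.
modem_price = 69.45    # Motorola SB6121 on Amazon
monthly_rental = 5.99  # TWC's new monthly fee

print(f"Break-even: {modem_price / monthly_rental:.1f} months")         # ~11.6
print(f"First-year savings: ${12 * monthly_rental - modem_price:.2f}")  # $2.43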

Time Warner Cable Increasing Modem Rental Fees.jpg

Alongside the fee increase announcement, Time Warner Cable claims that it has increased speeds on its most popular internet service tier and added Wi-Fi hotspots for customers over the past year. Unless you are on its “most popular” tier, however, it's hardly enough good news to outweigh the fee increase, and it is likely to only further the outrage from customers who have already attempted class action lawsuits over the $3.95 fee.

Fortunately, there are options to bring your own modem and it is a simple process if you cannot justify the fee increase out of principle or just want to save a bit of money.

Source: Consumerist

Google Launches $35 Chromecast Media Streaming Stick

Subject: Editorial, General Tech | July 27, 2013 - 12:39 AM |
Tagged: streaming, media, google, chrome, chromecast, chrome os

Earlier this week, web search giant Google launched a new portable media streaming device called the Chromecast. The Chromecast is a small device, about the size of a large USB flash drive, with a full size HDMI video output, a micro USB power jack, and Wi-Fi connectivity. The device runs Google’s Chrome OS and is able to display or play back any web page or media file that the Chrome web browser can.

Chromecast.jpg

The Chromecast is designed to plug into televisions and stream media from the internet. Eventually, users will be able to “cast” embedded media files or web pages from a smartphone, tablet, or PC running Android, iOS, Windows, or Mac OS X with a Chrome web browser over to the Chromecast. The sending device will point the Chromecast at the requisite URL where the streaming media or web page resides, along with any necessary authorization tokens needed to access content behind a paywall or username/password login. From there, the Chromecast itself will reach out to the Internet over its Wi-Fi radio, retrieve the web page or media stream, and output it to the TV over HDMI. Playback controls will be accessible on the sending device, such as an Android smartphone, but it is the Chromecast itself that is streaming the media, unlike solutions such as wireless HDMI, AirPlay, DLNA, or Miracast. As such, the sending device is able to perform other tasks while the Chromecast handles the media streaming.
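
To make the division of labor concrete, here is a hypothetical Python sketch of the handoff model described above; the message format is invented for illustration and is not Google's actual Cast protocol.

import json

# The sender passes only a URL plus any auth token; the Chromecast
# fetches the stream itself over its own Wi-Fi connection.

def build_cast_request(media_url, auth_token=None):
    request = {"action": "load", "url": media_url}
    if auth_token:
        request["token"] = auth_token  # for paywalled or logged-in content
    return json.dumps(request)

# Once this is sent, the phone or PC is free to do other things.
print(build_cast_request("https://example.com/video.mp4", "abc123"))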

At launch, users will be able to use the Chromecast to stream Netflix, YouTube, and Google Play videos. At some point in the future, Google will be adding support for additional apps, including Pandora Internet radio streaming. Beyond that (and this feature is still in development), users will be able to share entire Chrome tabs with the Chromecast (some reports indicate that this tab sharing is done using the WebRTC standard). Users will need to download and install a Google Cast extension, which will put a button to the right of the address bar that, when pressed, will “cast” the tab to the Chromecast, which will pull it up over its own internet connection and output it to the TV. When on a website that implements the SDK, users will have additional options for sharing just the video and using the PC as a remote, along with handy playback and volume controls.

Additionally, Google is releasing a Chromecast SDK that will allow developers to integrate their streaming media with the Chromecast. Instead of needing to share an entire tab, web developers and mobile app developers will be able to integrate casting functionality that allows users to share solely the streaming media with the Chromecast, similar to the upcoming ability to stream just the YouTube or Netflix video itself rather than the entire web page with the video embedded in it. Unfortunately, there is currently a caveat: developers must have all their apps (using the Chromecast SDK) approved by Google.

Chromecast Chrome Browser Tab Sharing.jpg

Sharing ("Casting") a Chrome web browser tab to a TV from a PC using the Chromecast.

It should be noted that Wired has reported success in using the tab sharing functionality to play back local media by directing Chrome to play locally-stored video files, but this is not a perfect solution, as Chrome supports a limited number of formats in a window and audio sync proved tricky at times. With that said, the Chromecast is intended to be an Internet streaming device, and Google is marketing it as such, so it is difficult to fault the Chromecast for local streaming issues. There are better solutions for getting the most out of your LAN-accessible media, after all.

The Chromecast is $35 and will ship as soon as August 7, 2013 from the Google Play Store. Amazon and Best Buy had stock listed on their websites until yesterday, when both e-tailers sold out (though you might be lucky enough to find a Chromecast at a brick and mortar Best Buy store). For $35, you get the Chromecast itself, a rigid HDMI extender that moves the Chromecast closer to the edge of the TV to make installation/removal easier, and a USB power cord. Google was initially also offering three free months of Netflix Instant streaming, but has since backed away from the promo due to overwhelming demand. If Google can continue to sell out of Chromecasts without spending money on Netflix for each unit, it is going to do that to bolster the profit margin on the inexpensive gadget, despite the PR hit (or at least disappointed buyers).

The Chromecast does have its flaws, and the launch was not perfect (many OS support and device features are still being worked on), but at $35 it is a simple impulse buy on a device that should only get better from here as the company further fleshes out the software. Even on the off-chance that Google abandons the Chromecast, it can still stream Netflix, YouTube, and Google Play for a pittance. 

Keep an eye on the Google blog for more information about the Chromecast. The device is currently listed on the Google Play store for pre-order.

Source: Google

Intel Announces Q2 Results: Below Expectations, but Still Good

Subject: Editorial | July 17, 2013 - 06:34 PM |
Tagged: silvermont, quarterly results, money, Lenovo, k900, Intel, atom, 22 nm tri-gate, 14 nm

Intel announced their Q2 results for this year, and it did not quite meet expectations.  When I say expectations, I usually mean “make absolutely obscene amounts of money”.  It seems that Intel was just shy of estimates and margins were only slightly lower than expected.  That being said, Intel reported revenue of $12.8 billion US and a net income of $2 billion US.  Not… too… shabby.

Analysts were of course expecting higher, but it seems as though the PC slowdown is in fact having a material effect on the market.  Intel cut estimates earlier this quarter, so this was not exactly a surprise.  Margins came in around 58.3%, but these are expected to recover going into Q3.  Intel is certainly still in a strong position, as millions of PCs are shipped every quarter and Intel remains the dominant CPU maker in that market.

Intel-Swimming-in-Money.jpg

Intel has been trying to get into the mobile market as it still exhibits strong growth, not only now but over the next several years as things become more and more connected.  Intel had ignored this market for some time, much to their dismay.  Their Atom based chips were slow to improve and typically used a last generation process node for cost savings.  In the face of a strong ARM based portfolio of products from companies like Qualcomm, Samsung, and Rockchip, the Intel Atom was simply not an effective solution until the latest batch of chips was available from Intel.  Products like the Atom Z2580, which powers the Lenovo K900 phone, were late to market compared to 28 nm products such as the Snapdragon series from Qualcomm.

Intel expects Silvermont, the next generation of Atom built on its 22 nm Tri-Gate process, to be much more competitive with the latest generation offerings from its ARM based competitors.  Unfortunately for Intel, we do not expect to see Silvermont based products until later in Q3, with availability in late Q4 or Q1 2014.  Intel needs to move chips, but this will be a very different market than what they are used to.  These SoCs have decent margins, but they are nowhere near what Intel can achieve with their traditional notebook, desktop, and server CPUs.

To help cut costs going forward, it seems as though Intel will be pulling back on its plans for 14 nm production.  Expenditures and floor space/equipment for 14 nm will be cut back compared to previous plans.  Intel is still hoping to start 14 nm production at the end of this year, with the first commercial products to hit at the end of 2014.  There are questions as to how viable 14 nm is as a fully ramped process in 2014.  Eventually 14 nm will work as advertised, but it appears as though the kinks are much more complex than anticipated, given how quickly Intel ramped 22 nm.

Intel has plenty of money, a dominant position in the x86 world, and a world class process technology on which to base future products.  I would say that they are still in very, very good shape.  The market is ever changing and Intel is still fairly nimble given their size.  They also recognize (albeit sometimes a bit later than expected) shifts in the marketplace, and they invariably craft a plan of attack which addresses their shortcomings.  While Intel revenue seems to have peaked last year, they are addressing new markets aggressively as well as holding onto their dominant position in the notebook, desktop, and server markets.  Intel is expecting Q3 to be up, but overall sales throughout 2013 to be flat as compared to 2012.  Have I mentioned they still cleared $2 billion in a down quarter?

Source: Intel

Xbox Division has a Leader: Julie Larson-Green

Subject: Editorial, General Tech, Systems | July 14, 2013 - 11:09 PM |
Tagged: xbox, xbox one

Two weeks have passed since Steve Ballmer informed all Microsoft employees that Don Mattrick would disembark and pursue a career at Zynga, for one reason or another. Initially, Ballmer himself was set to scab the void for an uncertain amount of time, further unsettling the upcoming Xbox One launch by leaving it without a proper manager to oversee it. His reign was cut short, best measured in days, when he appointed Julie Larson-Green as the head of Microsoft Devices and Studios.

... because a Christmas gift without ribbon would just be a box... one X box.

Larson_web.jpg

Of course the internet, then, erupted with anxiety: some concerns reasonable, even more (predictably) inane. Larson-Green has a long list of successfully shipped products to her name but, apart from the somewhat cop-out answer of Windows 7, nothing which resonates with gamers. Terrible sexism and similar embarrassments boiled over in the gaming community, but crazies will always be crazy, especially those adjacent to Xbox Live subscribers.

The Operating Systems division will be led by Terry Myerson, who rose to power from the Windows Phone division. This could be a sign of things to come for Windows, particularly as Microsoft continues to push for convergence between x86, RT, and Phone. I would not be surprised to see continued pressure from Microsoft to ingrain the Windows Store, and all of its certification pros and woes, into each of their operating systems.

As for Xbox, while Julie is very user experience (UX)-focused, division oversight passed to her long after the flagship product's high-level plans had been defined. If Windows 7 is any indication, she might not stray too far from what was laid out prior to her arrival; likewise, if Windows 8 is any indication, a drastically new direction could spring up without notice.

Source: Microsoft

AT&T Plans To Acquire Leap Wireless (Cricket) For $1.2 Billion

Subject: Editorial, General Tech | July 13, 2013 - 04:24 PM |
Tagged: wireless, spectrum, leap wireless, cricket, AT&T, acquisition, 4g lte


In a counter move to the SoftBank-Sprint-Clearwire merger, AT&T has announced its intentions to buy out Leap Wireless and its Cricket pre-paid cell service brand. AT&T will pay as much as $15 per share, which amounts to a bit under $1.19 billion (79.05 million outstanding shares at $15 per share). Before the announcement, Leap Wireless was trading at less than $8, so the bid is fairly generous. So far, approximately 30% of shareholders have voted to accept the buyout offer.

In the buyout deal, AT&T will acquire Leap Wireless, its Cricket brand in the US, its licenses and spectrum, 3,400 employees, and its retail locations. Cricket currently has a 3G CDMA network and is rolling out a 4G network. The company has about 5 million subscribers. AT&T will get to add a bit more spectrum to its portfolio in the PCS and AWS bands. This spectrum held by Leap Wireless is reportedly complementary to AT&T’s existing licenses.

Cricket Wireless.png

Interestingly, Leap Wireless is not doing very well: it has about $2.8 billion in net debt, which AT&T would also have to assume, and its Cricket service is losing subscribers. Cricket offers pre-paid plans that include unlimited voice calls, texting, and data. AT&T has stated that it would assume control of and maintain the Cricket brand. It will continue to offer service to existing Cricket customers and would also offer up its own 4G LTE network for use by Cricket pre-paid plans (phone hardware permitting). AT&T stated in a press release that it intends to use the Leap Wireless acquisition to “jump start AT&T’s expansion into the highly competitive prepaid segment.”

The buyout deal will need to be approved by Leap Wireless shareholders as well as by the US Department of Justice and the FCC. If it successfully passes through the various regulatory bodies, AT&T expects the deal to close within the next six to nine months.

Personally, I have my doubts that AT&T will continue to maintain the Cricket service as is, especially when it comes to unlimited data. As far as pre-paid expansion goes, AT&T has at least tried to go down this path before with its line of GoPhones. I believe that this deal is mostly about padding out AT&T’s spectrum portfolio in a bid to head off Sprint and maintain its position against T-Mobile and Verizon. The MVNO and pre-paid market is certainly growing and AT&T is going to want a piece of it, but I also think that the last thing AT&T wants to do is cannibalize its own contract offerings by offering up a similar pre-paid service with unlimited everything for half the price. Sure, AT&T will take that versus getting nothing, but the company is going to have a hard time balancing both offerings in a way that does not negatively affect one or both of its pre-paid and post-paid services.

What do you think about the deal? Is this a good thing for Cricket customers? Is AT&T serious about wanting to jump into the pre-paid market?

Source: AT&T

Windows 8 Market Share Outpaces Vista, but Is Still Far Below Windows 7 and Windows XP

Subject: Editorial, General Tech | July 6, 2013 - 08:33 PM |
Tagged: windows 8, Windows 7, microsoft, desktop market share

A recent report by NetMarketShare indicates that Windows 8 is having a difficult time displacing Microsoft's older operating systems. Of the total market, Windows occupies 91.50% across all existing versions. Windows 7 and Windows XP dominate the Windows market share at 44.37% and 37.17% respectively. Microsoft's latest operating system, Windows 8, sits at 5.1%, which barely scratches past Windows Vista at 4.62%. Having more market share than Windows Vista and Windows 98 is good, but Windows 8 is hardly proving to be as popular as Microsoft had hoped.

Desktop Operating System Market Share 2013.jpg

June 2013 Desktop Operating System Market Share, as measured by NetMarketShare.

Granted, Windows 8 is still a new operating system, whereas XP and Windows 7 have had several years to gain users, be included on multiple generations of OEM machines, and be accepted by enterprise customers. The free Windows 8.1 update should alleviate some users' concerns and may help bolster its market share as well. However, Windows XP simply will not die, and Windows 7 (if talk on the Internet is to be believed, hehe) seems to be good enough for the majority of users, so it is difficult to say when (or if) Microsoft's latest OS will outpace the two existing, and entrenched, Windows operating systems.

Year over year, Windows 7 lost 0.33 percentage points of market share while Windows XP lost 6.44. Meanwhile, Windows 8 has been slowly gaining market share each quarter since its release: NetMarketShare reported 1.72% in December of 2012, and in the six months since, the operating system has grown by 3.38 percentage points. There is no direct cause and effect here, but it does suggest that few people are choosing a Windows 8 upgrade path, and that, despite the growth, the lost market share for Windows 7 and XP is not solely from people switching to Windows 8, but also from some small number of people jumping to alternative operating systems such as Mac OS X and Linux. The historical data is neat, but it is difficult to predict how things will look moving forward. If adoption continues at this pace, it is going to take a long time for Windows 8 to dethrone Microsoft's older Windows XP and Windows 7 operating systems.

Have you made the switch to Windows 8, or gotten it on a new machine? Will the Back-to-School shopping season give Windows 8 the adoption rate boost it needs?

Xbox Division Lead, Don Mattrick, Leaves to Join... Zynga? Steve Ballmer, Himself, Scabs the Void.

Subject: Editorial, General Tech | July 2, 2013 - 12:33 AM |
Tagged: xbox one, xbox, microsoft, consolitis

Well that was unexpected...

Don Mattrick, a few months ahead of the Xbox One launch and less than two months after its unveiling, decided to leave his position at Microsoft as president of the Interactive Entertainment Business. The news was first made official by a Zynga press release, which announced his hiring as their new CEO. Steve Ballmer later published an open letter, addressed to all employees of Microsoft and open to the public via their news feed, wishing him luck and outlining the immediate steps to follow.

Mattrick.jpg

Subtle in the email is that no replacement has been planned for after his departure on July 8th. Those who report to Don Mattrick will report directly to Steve Ballmer, himself, seemingly through the launch of the Xbox One. As scary and unsettling as Xbox One PR has been lately, launching your flagship without a captain is a depressingly fitting apex. This likely means one of three things: Don gave minimal notice of his departure; he was abruptly ousted from Microsoft and Zynga just happened to make convenient PR for all parties involved; or there is literally no sense to be made of the situation.

However the situation came about, the Xbox One will likely launch from a team directly led by Steve Ballmer, and Zynga will have a new CEO. Will his goal be to turn the former social gaming giant back on course? Or will he be there to milk blood from the company before it turns to stone?

I wonder whether his new contract favors cash or stock...

Source: Zynga

Google's Web is QUIC and SPDY

Subject: Editorial, General Tech | July 1, 2013 - 11:12 PM |
Tagged: google, spdy, QUIC

It missed being a recursive acronym by a single letter...

TCP is known for being the go-to protocol for stable connections over the internet. There are some things you can guarantee: you will not lose bits of data, packets will arrive in order, corrupt packets will be detected and redelivered, and both endpoints will be roughly metered to the capacity of the slower side. It is easy to develop applications around TCP; it does the hard problems for you.

Being meticulous also makes it slow, relatively speaking.

9-ethernet2.png

UDP, on the other hand, frees its packets in a fountain to hopefully land where they are intended. This protocol is fast, but a pain for applications that need some level of reliability. Quick UDP Internet Connections (QUIC), from Google, leverages UDP to create multiple independent, even encrypted, connections. While TCP could be made faster, doing so is beyond the jurisdiction of web browsers; support is embedded into the operating system itself. This leaves building upon UDP, suffering with TCP, or breaking compatibility with just about every piece of network hardware installed anywhere.

This comes on the heels of SPDY, Google's other open protocol. SPDY is built around HTTP, and both presume a reliable protocol underneath, where TCP is the usual candidate. A large advantage of SPDY is that it allows assets to stream simultaneously over a single connection. TCP will, unfortunately, freeze the entire connection (and thus every stream) when a single stream drops a packet. QUIC, based upon UDP, can therefore be used to accelerate SPDY further by allowing truly independent multiplexing.
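
To make the multiplexing point concrete, here is a minimal Python sketch of the idea; the framing is invented for illustration and bears no resemblance to QUIC's real wire format.

import socket
import struct

# Tag each UDP datagram with a stream ID and sequence number, so one
# lost packet only stalls its own stream. (TCP would stall everything
# queued behind the missing packet until the retransmit arrives.)

def send_chunk(sock, addr, stream_id, seq, payload):
    # 4-byte stream ID + 4-byte sequence number, then the data.
    sock.sendto(struct.pack("!II", stream_id, seq) + payload, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
addr = ("127.0.0.1", 9999)  # hypothetical receiver

# Two independent streams interleaved over the same socket.
send_chunk(sock, addr, 1, 0, b"index.html part 0")
send_chunk(sock, addr, 2, 0, b"style.css part 0")
send_chunk(sock, addr, 1, 1, b"index.html part 1")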

QUIC will be used for "a small percentage of Chrome dev and canary channel traffic to some Google server", for experimentation purposes. The code itself is licensed under BSD and, as such, could migrate to other browsers in due time.

Source: CNet

Microsoft Gives Xbox One Gamers What They Want... Sort Of

Subject: Editorial, General Tech | June 19, 2013 - 06:08 PM |
Tagged: xbox one, gaming, DRM, disc

Microsoft faced a major backlash from users following the unveiling of its latest Xbox One console. Users were rather unnerved at Microsoft’s reveal that the new console would be required to “phone home” at least once every 24 hours in order to authenticate games and allow sharing. With Sony carrying forward the disc traditions of the PS3, and considering the user uproar, Microsoft has reconsidered and issued an update to users via a blog post titled (in part) “Your Feedback Matters.”

Amidst the uncertainty caused by various MS sources issuing statements about functionality and DRM that conflicted with one another, and an air of as-yet-unannounced secrecy pre-E3 in which MS released just enough info about the DRM to get users scared (can you tell the way MS handled this irked me?), the company talked about the Xbox One moving forward and taking advantage of the ‘digital age.’ The new console would require online authentication (and daily check-ins), but would also allow sharing of your game library with up to 10 other people, along with re-downloadable games that could be installed on other consoles (and played) so long as you log into your Xbox Live account (the latter bit is similar in nature to Steam on the PC). Further, disc games could be resold or gifted if the publishers allowed it.

That has changed now, however. Microsoft has reconsidered its position and is going back to the way things work(ed) on the existing Xbox 360. Instead of taking the logical approach of keeping with the plan but removing the daily authentication requirement for games when you keep the game disc in the tray, Microsoft has taken its ball, er, Xbox One controller, and completely backtracked.

Xbox One Logo.jpg

DRM on the Xbox One is now as follows, and these changes go in place of (not in addition to) the previously announced sharing and reselling functionalities.

For physical disc games:

According to Xbox Wire, after initial setup and installation, disc-based games will not require an internet connection to function offline (though multiplayer components will, obviously, need an active connection). Even better, trading and reselling of disc-based games is no longer limited by publishers. Trading, selling, gifting, renting, et al. of physical disc-based games "will work just as it does today on the Xbox 360." Microsoft is also not region locking physical games, which means that you will not have to worry about whether games purchased abroad will work on your console at home.

However, in order to play disc-based games, you will need to keep the game disc in the tray, even if the game is installed on the hard drive.

Changes to downloaded games:

As for downloadable games, Microsoft is restricting these titles such that they cannot be shared or resold. In the previous model, you would have been able to share the titles with your family, but not anymore. You will still be able to re-download the games.

There is no word on whether or not gamers will still lose access to all of the titles in their game library if their Xbox Live accounts are ever banned. It is likely that gamers would lose any downloadable games, though, as those are effectively tied to a single Xbox Live account.

While at first glance it may seem as though gamers won this round, in the end no one really won. Instead of Microsoft working around gamers' concerns for physical media and moving forward together, it is as though Microsoft has thrown up its hands in frustration and tossed out all of the innovative aspects for digital/downloadable titles along with the undesirable daily authentication and other invasive DRM measures that gamers clearly indicated they did not want.

I believe that Microsoft should have kept to the original game plan, but added an exception to the daily check-in rules so long as the console was able to authenticate the game offline by identifying a physical game disc in the tray. That way, gamers who are not comfortable with (or capable of) keeping the Xbox One connected to the internet could continue to play games using discs, while those with always-on Xbox One consoles would enjoy the privileges of sharing their libraries. Doing so would have also helped ease the console gaming populace as a whole into Microsoft's ideal digital age by the time the next Xbox comes out. However, instead of simply toning down the changes, Microsoft has completely backtracked, and now no one wins. Sigh.

What are your thoughts on Microsoft's latest changes to the Xbox One? Was it the right move, or were you looking forward to increased freedom with your digitally-downloaded games?


Source: Xbox Wire

Steam Might Allow Shared Games?

Subject: Editorial, General Tech | June 19, 2013 - 03:33 PM |
Tagged: steam, DRM

You can learn a lot by scanning configuration files, registry entries, and so forth; many have made off with a successful bounty. Most recently, some Steam Beta users dug around in their user interface (UI) files and noticed a few interesting lines, instructing the user that the title they are attempting to launch will kick off a friend it is currently being shared with.

Wait, what?!

Steam-UI.png

"SteamUI_JoinDialog_SharedLicense_Title" "Shared game library"

"SteamUI_JoinDialog_SharedLicenseLocked_OwnerText" "Just so you know, your games are currently in use by %borrower%. Playing now will send %borrower% a notice that it's time to quit."

"SteamUI_JoinDialog_SharedLicenseLocked_BorrowerText" "This shared game is currently unavailable. Please try against later or buy this game for your own library."

Sure, this whole game DRM issue has been flipping some tables around the industry. Microsoft tried permitting users to share games with their family, utilizing about the worst possible PR, and eventually needed to undo that decision. Users would like flexible licensing schemes, but the content industry (including platform owners like Microsoft, Nintendo, and Sony, who receive license fees from game sales) is unwilling to cooperate unless it is assured that users are honest.

Of course, what usually happens is honest users get crapped on and pirates enjoy a better experience, after initial setup.

While Steam and the proposed Xbox One are not much different from a high-level view, there are a number of differences in the details. The obvious one is Steam's offline mode, but probably the larger one is trust. Valve has demonstrated a lot of good faith toward their customers; where Microsoft shuts down access to content people paid for, Valve has shown intentions of both long-term support and consideration for the user's experience.

Ultimately, I feel as if DRM is not a necessary evil, but while it exists, at least there are companies such as Valve who earn trust and wield DRM for users as well as against them. I expect that some day the industry will turn against DRM, whether willingly, by legal intervention, or because companies like cdp.pl use DRM-free as a promotional tool and nibble their way to dominance.

And yes, despite the fact that this will be confused with bias: if you prove that you are untrustworthy before, you will get away with less later regardless of your intentions.

The Witcher 3's DRM Strategy: Still None on PC

Subject: Editorial, General Tech | June 19, 2013 - 03:16 PM |
Tagged: DRM, The Witcher 3, GOG

cdp.pl, formerly CD Projekt, has been one of the last holdouts against DRM. Founders of GOG.com and developer/publisher of The Witcher franchise, they offer a DRM-free platform for users to purchase games. Sure, the games on offer are usually good, old ones, aptly enough, but the company is confident enough to include its most ambitious titles, The Witcher and The Witcher 2.

With The Witcher 3, we will see the title launch without DRM on GOG, trusting that users will purchase the title and be honest.

witcher-drm2.jpg

Apparently, the game will have a world slightly larger than Skyrim.

Hopefully, with very little empty space.

I have long been a proponent of DRM-free media, as you could probably tell. I believe that DRM-free titles end up netting more sales than the same titles would have with encryption; even if that were not true, society is harmed more than enough to justify DRM's non-existence. Sure, we all know unapologetic jerks and they are, indeed, jerks. Just because these jerks exist does not mean your company should, or successfully will, be the alpha a-hole on the a-hole food-chain. Chances are you will just upset your actual customers, now former customers. There are reasons why I never purchased (never pirated either, I just flat-out ignored the entire franchise's existence) another Crysis title after the first one's SecuROM debacle wrecked my camcorder's DVD-authoring software.

So, when The Witcher 3 comes out, back it up on your external hard drive and maybe even keep a copy on your home theater PC. Most importantly, buy it... sometime in 2014.

Source: PC Gamer

E3 2013: Microsoft can ban your Xbox One library

Subject: Editorial, General Tech, Systems, Shows and Expos | June 17, 2013 - 12:16 AM |
Tagged: xbox one, microsoft, ea, E3 13, E3

Update: Microsoft denies the statements from their support account... but this is still one of the major problems with DRM and closed platforms in general. It is stuff like this that you let them do when you buy into a closed platform.

xbox-one-head.jpg

Electronic Arts knows that they need to shake their terrible public image.

Welcome to Microsoft's PR strategy for the Xbox One.

Consumers, whether they acknowledge it or not, fear for the control that platform holders have over their content. It was hard for many to believe that having your EA account banned for whatever reason, even a dispute with a forum moderator, forfeited your license to games you play through that EA account. Sounds like another great idea for Microsoft to steal.

Not stopping there, later on in the thread they were asked what would happen in the event of a security breach. You know, recourse before destroying access to possibly thousands of dollars of content.

While @xboxsupport is not a "verified account," Microsoft acknowledges ownership of it in the account's background image; honestly, there shouldn't have been any doubt that these replies came from actual Microsoft employees.

... Yikes.

At this point, we have definitely surpassed absurdity. Sure, you typically need to do something fairly bad to have Microsoft stop charging you for Xbox Live. Removing access to your entire library of games, to me, is an attempt to limit cheating and the hardware community.

Great, encourage spite from the soldering irons, that works out well.

Don't worry, enthusiasts, you know the PC loves you.

Gaming as a form of entertainment is fundamentally different from gaming as a form of art. When content is entertainment, its message touches you without any intrinsic value and can be replaced with similar content. Sometimes a certain piece of content, itself, has specific value to society. It is in these times that we should encourage efforts by organizations such as GOG, Mozilla and the W3C, Khronos, and many others. Without help, it could be extremely difficult or impossible for content to be preserved for future generations and future civilizations.

It does not even need to get in the way of the industry and its attempt to profit from the gaming medium; a careless industry, on the other hand, can certainly get in the way of our ability to have genuine art. After all, this is the main reason why I am a PC gamer: the platform allows entertainment to co-exist with communities who support themselves when the official channels do not.

That is, of course, unless Windows learns a little something from the Xbox. I guess don't get your Windows Store account banned in the future?

Intel is not slowing down, exclamation exclamation. Haswell-E for Holiday 2014 question mark.

Subject: Editorial, General Tech, Processors | June 15, 2013 - 04:02 PM |
Tagged: Intel, Ivy Bridge-E, Haswell-E

In my analysis of the recent Intel Computex keynote, I noted that the displayed confidence came across more as repressed self-doubt. It did not seem, to me, like Intel wants to abandon the high-end enthusiast, but rather to catch up with their low-performance, high-efficiency competitors; they know they are already secure in the enthusiast market. Of course, we could see mid-range choices dwindle and prices stagnate, but I doubt that Intel wants to exit the enthusiast market despite their silence about Ivy Bridge-E.

Haswell-E1.jpg

All Images, Credit: VR-Zone

And Intel, now, wants to return some confidence to their high-end consumers comma they are not slowing down exclamation point exclamation point.

VR-Zone, the site which published Ivy Bridge-E's lazy release roadmap, is also the one to suggest Haswell-E will come before mainstream Broadwell offerings. Once again, all is right with the world. Slated for release around holiday 2014, just a year after Ivy Bridge-E, Haswell-E will arrive alongside the X99 chipset. Instead of Broadwell, the back-to-school window of 2014 will be filled by a refresh of 22nm Haswell products with a new 9-series chipset.

Seriously, it's like watching the face of Intel's Tick-Tock while a repairman is tweaking the gears.

Haswell-E2.jpg

In terms of specifications, Haswell-E will come in 8- and 6-core offerings with up to 20MB of cache. Apart from the inclusion of DDR4 support, the main advantage of Haswell-E over the upcoming Ivy Bridge-E is supposed to be raw performance; VR-Zone estimates up to 33-50% better computational strength. A depressingly novel area of improvement as of late...

Lastly, with the recent discussion of the awkwardly hobbled K-series parts, our readers might be happy to know that all Haswell-E parts will be unlocked for overclocking. This, again, leads me to believe that Intel is not hoping to suffocate the enthusiast market but rather to sort their users: mid-range consumers will take what they are given and, if they object, be sent on the bus to Funk-E town.

Haswell-E3.jpg

Note, while the headlining slide definitively says "All Processors Unlocked"...

... this slide says "For K and Extreme series products." I will assume the latter is out of date?

Which raises the question: what do our readers think about that potential strategy? It could lead to mainstream performance products being pushed down into BGA territory, but it cements the existence of an enthusiast platform.

Source: VR-Zone

WWDC 13: Dissecting Apple's New Hardware Changes. MacBook Air and the new Mac Pro.

Subject: Editorial, General Tech, Systems, Shows and Expos | June 11, 2013 - 01:06 AM |
Tagged: wwdc 13, MacBook Air, Mac Pro, apple

Sometimes our "Perspective" is needed on Apple announcements because some big points just do not get covered by the usual sources. Other times, portions of the story can be relevant to our readers. This is one of those days where both are true. Either way, you should review our thoughts and analysis of Apple's recent ultrabook and, especially, their upcoming desktop offerings.

The MacBook Air has been, predictably, upgraded to Intel's Haswell processors. Battery life is the first obvious benefit of the CPU, and that has been well reported. The 11-inch MacBook Air gains an extra four hours of battery life, usable for up to 9 hours between charges. The extra space on the 13-inch MacBook Air allows it to last 12 hours between charges.

apple-macbook-air.jpg

Less discussed, both MacBook Airs will contain Intel's HD 5000 iGPU, the same GT3 graphics hardware that carries the Iris brand in higher-power parts. You cannot get Intel HD 5000 graphics without selecting a BGA component, which is soldered in place rather than socketed. While there are several better solutions from competing GPU vendors, Apple will have one of the first shipping implementations of Haswell's canonical graphics processor. The GPU is said to have double the performance of previous generation Ivy Bridge graphics for a fraction of its power consumption.

Also included in the MacBook Air are an 802.11a/b/g/n/ac WiFi network adapter and Bluetooth 4.0. Apple is not typically known to introduce new standards and often lags severely behind what is available on the PC unless they had a hand in trademarking the standard; USB 3.0 is the obvious and recent example.

The specifications will be somewhat customizable; the user is able to select between an i5 and an i7 processor, 4GB or 8GB of RAM, and a 128, 256, or 512GB SSD. It shipped the day it was announced, with base prices of $999 for an entry-level 11-inch and $1099 for an entry-level 13-inch.

But now we move on to the dying industry, desktop PCs, where all innovation has died unless it is to graft a touch interface to anything and everything.

"Can't innovate any more, my ass", grunts Phil Schiller, on the keynote stage.

Whether you like it, or think "innovation" is the best word, it's a legitimate new design some will want.

While the new Mac Pro is not a system that I would be interested in purchasing, for reasons I will outline soon, these devices are what some users really want. I have been a very strong proponent of OEM devices as they highlight the benefit of the PC industry: choice. You can purchase a device, like the new Mac Pro, from a vendor; alternatively, you can purchase the components individually, assemble them yourself, and save a lot of money; otherwise, you can hire a small business computer store or technician to do it for you.

We need more companies, like Apple, to try new devices and paradigms for workstations and other high-performance devices. While it is less than ideal for Apple to be the one coming up with these redesigns, since Apple's platform encourages applications to be vendor-specific (to only run on a Mac), it can still benefit the PC industry by demonstrating that life and demand still exist; trying something new could reap large benefits. Not everyone who wants workstation performance wants a full ATX case with discrete components, and that is okay.

Now when it comes to actual specifications, the typical coverage glossed over what could be easily approximated by a trip to Wikipedia and Google. Sure, some may have been in a rush within the auditorium, but still.

The specifications are:

  • Intel Xeon E5-2600 V2-class CPU, Ivy Bridge-E, 12 cores max (suggests single-socket)
  • 4-channel DDR3 ECC RAM, apparently 4 DIMMs, which suggests 4x16GB (max).
  • Dual FirePro GPUs, 4096 total shaders with 2x6GB GDDR5.
  • PCIe SSD
  • Thunderbolt 2, USB3.0, and WiFi ac (+ a/b/g/n??), Bluetooth 4.0

Now, the downside is that basically anything you wish to add to the Mac Pro needs to be done through Thunderbolt, Bluetooth 4.0, or USB 3.0. When you purchase an all-in-one custom design, you forfeit your ability to reach in and modify the components. There is also no mention of pricing, and for a computer with this parts list you should expect a substantial invoice even without "The Apple Tax", but that is not the point of purchasing a high-end workstation. Apple certainly put in as close to the best-of-the-best as they could.

Now could people stop claiming the PC is dead and work towards sustaining it? I know people love stories of jarring industry shifts, but this is ridiculous.

Source: Apple

Win an AMD A10-6800K Richland APU + SimCity!!

Subject: Editorial, Processors | June 10, 2013 - 07:53 AM |
Tagged: SimCity, Richland, giveaway, contest, APU, amd, a10-6800k

Odd, turns out I found two brand new AMD A10-6800K Richland APUs sitting on my desk this morning.  I called AMD to ask what this was all about and they said that if I didn't need them, I might as well give them away to our readers. 

"Oh, and throw in a free copy of the new SimCity while you're at it," they told me.

apugiveaway.jpg

Who am I to argue?

So let's have a giveaway! 

We are handing out two AMD A10-6800K "Richland" APUs for you to build a brand new PC around, and we are including a key for SimCity with each of them.  If you haven't read Josh's review of the A10-6800K APU, you should definitely do so; it will help educate you on exactly what you are getting - for FREE. 

apubmarks.png

To enter, I need you to leave a comment on this very news post below telling us what you would build with a brand new A10 APU - you don't have to be registered to do so but we'd sure like it if you were.  (Make sure you leave your correct email address so I can get in touch with you if you win.)  Also, feel free to stop by the PC Perspective YouTube channel and either give our videos a gander or subscribe.  I think we put out some great content there and we'd like more of you to see it. 

I will pick one winner on June 17th and another on June 24th so you have two separate weeks to potentially win! 

A big thanks goes out to AMD for supplying the APUs and copies of SimCity for this giveaway.  Good luck!!

Source: AMD

Computex 2013: The Comedic Return of the Ultra GPUs

Subject: Editorial, General Tech, Graphics Cards, Shows and Expos | June 9, 2013 - 11:49 PM |
Tagged: Ultra, geforce titan, computex

So long to Computex 2013, we barely knew thee. You poured stories all over our news feed for more than a whole week. What say you, another story for the... metaphorical road... between here... and... Taipei? Okay, so the metaphorical road is bumpy and unpaved, work with me.

It was substantially more difficult to decipher the name of a video card a number of years ago. Back then, products would be classified by their model numbers and often assigned a suffix like "Ultra", "Pro", or "LE". These suffixes actually meant a lot, with suffixed cards performing noticeably better (or maybe worse) than the suffix-less model and possibly even overlapping with other number classes.

colorful-gtx-titan-ultra-edition,B-V-387931-13.png

Image Credit: zol.com.cn via Tom's Hardware

Just when they were gone long enough for us to miss them, the suffixes might make some measure of a return. On the show floor, Colorful exhibited the NVIDIA GeForce GTX Titan Ultra Edition. This card uses a standard, slightly-disabled GK110-based GeForce GTX Titan GPU with the usual 2688 CUDA cores and 6GB of GDDR5. While the GK110 chip has the potential for 2880 CUDA cores, NVIDIA has not released any product (not even Tesla or Quadro) with more than 2688 CUDA cores enabled. Colorful's Titan Ultra and the reference Titan are electrically identical; this "Ultra" version just adds a water block for cooling and a factory overclock of some amount.

But, this is not the first time we have heard of a Titan Ultra...

Back in April, ExtremeTech found a leak for two official products: the GTX Titan LE and the GTX Titan Ultra. While the LE would be slightly stripped down compared to the full GTX Titan, the GTX Titan Ultra would be NVIDIA's first release of a GK110 part without any CUDA cores disabled.

So if that rumor ends up being true, you could choose between Colorful's GTX Titan Ultra with its partially disabled GK110 based on the full GTX Titan design; or, you could choose the reference GTX Titan Ultra based on a full GK110 GPU unlike the partially disabled GK110 on the full GTX Titan.

If you are feeling nostalgic... that might actually be confusion... as this is why suffixes went away.