Time Warner Cable Increasing Modem Rental Fee

Subject: Editorial | July 30, 2013 - 04:08 AM |
Tagged: time warner, Internet, cable modem, cable isp

According to the Consumerist, cable ISP Time Warner Cable is increasing its modem rental fee to $5.99 per month. The company only instituted the original $3.95 rental fee in October of last year, and it is already ratcheting the fee up further for the same hardware customers have been using all along.

The new rental fee will be $5.99 per month for most accounts (SignatureHome with TV appears to be exempt), which works out to $71.88 per year. For comparison, the previous $3.95 fee came to $47.40 per year. Time Warner Cable is therefore looking at a healthy bump in revenue from the higher fee, which ISI analyst Vijay Jayant estimates at around an extra $150 million, according to Reuters. That is a lot of money in aggregate, and even at the per-customer level it is a significant increase. In fact, at $71.88 per year in rental fees, it is now cheaper to buy a new DOCSIS 3 modem outright.

For example, the Motorola SB6121 is on TWC’s approved modem list and sells for $69.45 on Amazon, which would save you about 20 cents a month over the first year and the entire rental fee in later years. I have been using this modem (albeit on Comcast) since February 2012 and can easily recommend it. The math tilts even further toward buying your own modem if you are on a slower tier and can get by with a DOCSIS 2 modem, some of which can be purchased for around $40 used.
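If you want to check the rent-versus-buy math yourself, here is a quick sketch using the prices quoted above (a rough estimate only; actual prices will drift over time):

```python
# Rough rent-vs-buy math using the figures cited in this post; prices will
# obviously change over time, so treat this as an estimate, not gospel.
monthly_rental = 5.99      # TWC's new rental fee, USD per month
modem_price = 69.45        # Motorola SB6121 price quoted above, USD

months_to_break_even = modem_price / monthly_rental
first_year_savings = 12 * monthly_rental - modem_price

print(f"Break-even after ~{months_to_break_even:.1f} months of service")  # ~11.6 months
print(f"First-year savings: ${first_year_savings:.2f}")                   # ~$2.43
```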

Time Warner Cable Increasing Modem Rental Fees.jpg

Alongside the fee increase announcement, Time Warner Cable claims that it has increased speeds on its most popular internet service tier and added Wi-Fi hotspots for customers over the past year. Unless you are on that “most popular” tier, however, it is hardly enough good news to outweigh the fee increase, and it is likely to further fuel outrage from customers who have already attempted class action lawsuits over the $3.95 fee.

Fortunately, bringing your own modem is an option, and it is a simple process whether you object to the fee increase on principle or just want to save a bit of money.

Source: Consumerist

Google Launches $35 Chromecast Media Streaming Stick

Subject: Editorial, General Tech | July 27, 2013 - 12:39 AM |
Tagged: streaming, media, google, chrome, chromecast, chrome os

Earlier this week, web search giant Google launched a new portable media streaming device called the Chromecast. The Chromecast is a small device about the size of a large USB flash drive, with a full size HDMI video output, a micro USB power jack, and Wi-Fi connectivity. The device runs Google’s Chrome OS and is able to display or play back any web page or media file that the Chrome web browser can.

Chromecast.jpg

The Chromecast is designed to plug into televisions and stream media from the internet. Eventually, users will be able to “cast” embedded media files or web pages from a smartphone, tablet, or PC running Android, iOS, Windows, or Mac OS X with a Chrome web browser over to the Chromecast. The sending device will point the Chromecast at the URL where the streaming media or web page resides, along with any authorization tokens needed to access content behind a paywall or username/password login. From there, the Chromecast itself will reach out to the Internet over its Wi-Fi radio, retrieve the web page or media stream, and output it to the TV over HDMI. Playback controls will be accessible on the sending device, such as an Android smartphone, but it is the Chromecast itself that streams the media, unlike solutions such as wireless HDMI, AirPlay, DLNA, or Miracast. As such, the sending device is free to perform other tasks while the Chromecast handles the media streaming.
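To make that hand-off concrete, here is a minimal, purely illustrative sketch of the sender/receiver split described above. None of the function or message names below come from Google's actual Cast protocol; they are hypothetical stand-ins for the idea that the sender only hands over a URL (plus any auth token) and the Chromecast fetches the media itself.

```python
# A hypothetical sketch of the "cast" hand-off described above -- not
# Google's actual protocol, just an illustration of who does what.
import json

def build_cast_request(media_url, auth_token=None):
    """Sender side (phone, tablet, or PC): package up what should play."""
    return json.dumps({"url": media_url, "token": auth_token})

def handle_cast_request(request_json, fetch=lambda url, token: f"<stream of {url}>"):
    """Receiver side (the Chromecast): pull the content over its own Wi-Fi
    connection and hand it to the HDMI output; the sender is then free."""
    request = json.loads(request_json)
    stream = fetch(request["url"], request.get("token"))
    return f"Playing {stream} on the TV"

# The phone only sends a small message; the heavy lifting happens on the device.
message = build_cast_request("https://example.com/video.mp4", auth_token="abc123")
print(handle_cast_request(message))
```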

At launch, users will be able to use the Chromecast to stream Netflix, YouTube, and Google Play videos. At some point in the future, Google will add support for additional apps, including Pandora Internet radio. Beyond that (and this feature is still in development), users will be able to share entire Chrome tabs with the Chromecast (some reports indicate that this tab sharing is done using the WebRTC standard). Users will need to download and install a Google Cast extension, which adds a button to the right of the address bar that, when pressed, will “cast” the tab to the Chromecast for output on the TV. On websites that implement the SDK, users will have additional options for sharing just the video and using the PC as a remote, with handy playback and volume controls.

Alternatively, Google is releasing a Chromecast SDK that will allow developers to integrate their streaming media with the Chromecast. Instead of sharing the entire tab, web and mobile app developers will be able to add casting functionality that sends only the streaming media to the Chromecast, similar to the ability to stream just the YouTube or Netflix video itself rather than the entire web page with the video embedded in it. Unfortunately, there is currently a caveat: developers must have all of their apps that use the Chromecast SDK approved by Google.

Chromecast Chrome Browser Tab Sharing.jpg

Sharing ("Casting") a Chrome web browser tab to a TV from a PC using the Chromecast.

It should be noted that Wired has reported success in using the tab sharing functionality to play back local media by pointing Chrome at locally stored video files, but this is not a perfect solution, as Chrome can only play a limited number of formats in a window and audio sync proved tricky at times. With that said, the Chromecast is intended as an Internet streaming device, and Google is marketing it as such, so it is difficult to fault it for local streaming issues. There are better solutions for getting the most out of your LAN-accessible media, after all.

The Chromecast is $35 and will ship as soon as August 7, 2013 from the Google Play Store. Amazon and Best Buy had stock listed on their websites until yesterday, when both e-tailers sold out (though you might be lucky enough to find a Chromecast at a brick and mortar Best Buy store). For $35, you get the Chromecast itself, a rigid HDMI extender that moves the Chromecast closer to the edge of the TV for easier installation and removal, and a USB power cord. Google was initially also offering three free months of Netflix Instant streaming but has since pulled the promo due to overwhelming demand; if Google can keep selling out of Chromecasts without paying for Netflix on each unit, it will do so to bolster the margin on the inexpensive gadget, the PR hit (or at least disappointed buyers) notwithstanding.

The Chromecast does have its flaws, and the launch was not perfect (many OS support and device features are still being worked on), but at $35 it is a simple impulse buy on a device that should only get better from here as the company further fleshes out the software. Even on the off-chance that Google abandons the Chromecast, it can still stream Netflix, YouTube, and Google Play for a pittance. 

Keep an eye on the Google blog for more information about the Chromecast. The device is currently listed on the Google Play store for pre-order.

Source: Google

Intel Announces Q2 Results: Below Expectations, but Still Good

Subject: Editorial | July 17, 2013 - 06:34 PM |
Tagged: silvermont, quarterly results, money, Lenovo, k900, Intel, atom, 22 nm tri-gate, 14 nm

Intel announced their Q2 results for this year, and they did not quite meet expectations.  When I say expectations, I usually mean “make absolutely obscene amounts of money”.  It seems that Intel was just shy of estimates, and margins were only slightly lower than expected.  That being said, Intel reported revenue of $12.8 billion US and a net income of $2 billion US.  Not… too… shabby.

Analysts were of course expecting more, but it seems the PC slowdown is in fact having a material effect on the market.  Intel cut estimates earlier this quarter, so this was not exactly a surprise.  Margins came in around 58.3%, but they are expected to recover going into Q3.  Intel is certainly still in a strong position, as millions of PCs ship every quarter and Intel is the dominant CPU maker in that market.
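As a quick aside, the headline figures above mix two different kinds of margin; a back-of-the-envelope calculation (my arithmetic, not Intel's own reporting breakdown) separates the reported gross margin from the net margin implied by revenue and net income:

```python
# Back-of-the-envelope math on the headline figures above; this is my
# arithmetic, not Intel's own breakdown of the quarter.
revenue = 12.8e9       # Q2 revenue, USD
net_income = 2.0e9     # Q2 net income, USD
gross_margin = 0.583   # the reported ~58.3% gross margin

implied_net_margin = net_income / revenue
print(f"Reported gross margin: {gross_margin:.1%}")        # 58.3%
print(f"Implied net margin:    {implied_net_margin:.1%}")  # ~15.6%
```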

Intel-Swimming-in-Money.jpg

Intel has been trying to get into the mobile market, as it still exhibits strong growth not only now but over the next several years as things become more and more connected.  Intel had ignored this market for some time, much to its dismay.  Its Atom based chips were slow to improve and typically used a last generation process node for cost savings.  In the face of strong ARM based portfolios from companies like Qualcomm, Samsung, and Rockchip, the Intel Atom was simply not an effective solution until the latest batch of chips arrived.  Products like the Atom Z2580, which powers the Lenovo K900 phone, were late to market compared to 28 nm products such as the Snapdragon series from Qualcomm.

Intel expects the next generation of Atom, Silvermont, built on its 22 nm Tri-Gate process, to be much more competitive with the latest offerings from its ARM based competitors.  Unfortunately for Intel, we do not expect to see Silvermont based products until later in Q3, with availability in late Q4 or Q1 2014.  Intel needs to move chips, but this will be a very different market than the one it is used to.  These SOCs have decent margins, but they are nowhere near what Intel can command with its traditional notebook, desktop, and server CPUs.

To help cut costs going forward, it seems Intel will be pulling back on its plans for 14 nm production.  Expenditures and floor space/equipment for 14 nm will be scaled back compared to previous plans.  Intel still hopes to start 14 nm production at the end of this year, with the first commercial products hitting at the end of 2014.  There are questions as to how viable 14 nm will be as a fully ramped process in 2014.  Eventually 14 nm will work as advertised, but it appears the kinks are more complex than anticipated, especially given how quickly Intel ramped 22 nm.

Intel has plenty of money, a dominant position in the x86 world, and world class process technology on which to base future products.  I would say that they are still in very, very good shape.  The market is ever changing, and Intel is still fairly nimble given its size.  It also recognizes (albeit sometimes a bit later than expected) shifts in the marketplace and invariably crafts a plan of attack to address its shortcomings.  While Intel revenue seems to have peaked last year, the company is addressing new markets aggressively while holding onto its dominant position in the notebook, desktop, and server markets.  Intel expects Q3 to be up, but overall sales throughout 2013 to be flat compared to 2012.  Have I mentioned they still cleared $2 billion in a down quarter?

Source: Intel

Xbox Division has a Leader: Julie Larson-Green

Subject: Editorial, General Tech, Systems | July 14, 2013 - 11:09 PM |
Tagged: xbox, xbox one

Two weeks have passed since Steve Ballmer informed all Microsoft employees that Don Mattrick would disembark and pursue a career at Zynga, for one reason or another. Initially, Ballmer himself was set to scab the void for an uncertain amount of time, further unsettling the upcoming Xbox One launch by leaving it without a proper manager to oversee it. His reign was cut short, best measured in days, when he appointed Julie Larson-Green as the head of Microsoft Devices and Studios.

... because a Christmas gift without ribbon would just be a box... one X box.

Larson_web.jpg

Of course the internet, then, erupted with anxiety: some concerns reasonable, even more (predictably) inane. Larson-Green has a long list of successfully shipped products to her name but, apart from the somewhat cop-out answer of Windows 7, nothing which resonates with gamers. Terrible sexism and similar embarrassments boiled over in the gaming community, but crazies will always be crazy, especially those adjacent to Xbox Live subscribers.

The Operating Systems group will be led by Terry Myerson, who rose to power from the Windows Phone division. This could be a sign of things to come for Windows, particularly as Microsoft continues to push for convergence between x86, RT, and Phone. I would not be surprised to see continued pressure from Microsoft to ingrain the Windows Store, and all of its certification pros and woes, into each of its operating systems.

As for Xbox, while Julie is very user experience (UX)-focused, division oversight passed to her long after the high-level plans for its flagship product's lifetime had been defined. If Windows 7 is any indication, she might not stray too far from what was laid out prior to her arrival; likewise, if Windows 8 is any indication, a drastically new direction could spring up without notice.

Source: Microsoft

AT&T Plans To Acquire Leap Wireless (Cricket) For $1.2 Billion

Subject: Editorial, General Tech | July 13, 2013 - 04:24 PM |
Tagged: wireless, spectrum, leap wireless, cricket, AT&T, acquisition, 4g lte


In a counter move to the SoftBank-Sprint-Clearwire merger, AT&T has announced its intentions to buy out Leap Wireless and its Cricket pre-paid cell service brand. AT&T will pay as much as $15 per share, which amounts to a bit under $1.19 billion (79.05 million outstanding shares at $15 per share). Before the announcement, Leap Wireless was trading at less than $8, so the bid is fairly generous. So far, approximately 30% of shareholders have voted to accept the buyout offer.
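The headline price is straightforward arithmetic on the share count; here is a quick sanity check using the figures above (my math, not AT&T's filing):

```python
# Sanity check on the deal figures quoted above (my arithmetic, not AT&T's filing).
shares_outstanding = 79.05e6    # Leap Wireless shares outstanding
offer_per_share = 15.00         # AT&T's offer, USD per share
pre_announcement_price = 8.00   # roughly where Leap traded before the news, USD

deal_value = shares_outstanding * offer_per_share
premium = offer_per_share / pre_announcement_price - 1
print(f"Deal value: ${deal_value / 1e9:.2f} billion")      # ~$1.19 billion
print(f"Premium over the ~$8 price: about {premium:.0%}")  # ~88%
```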

In the buyout deal, AT&T will acquire Leap Wireless, including its US Cricket brand, licenses and spectrum, 3,400 employees, and its retail locations. Cricket currently runs a 3G CDMA network and is rolling out a 4G network, and the company has about 5 million subscribers. AT&T will also add a bit more spectrum to its portfolio in the PCS and AWS bands, and the spectrum held by Leap Wireless is reportedly complementary to AT&T’s existing licenses.

Cricket Wireless.png

Interestingly, Leap Wireless is not doing very well: it carries about $2.8 billion in net debt, which AT&T would have to assume, and its Cricket service is losing subscribers. Cricket offers plans that include unlimited voice calls, texting, and data. AT&T has stated that it would assume control of and maintain the Cricket brand. It will continue to offer service to existing Cricket customers and would also open up its own 4G LTE network for use by Cricket pre-paid plans (phone hardware permitting). AT&T stated in a press release that it intends to use the Leap Wireless acquisition to “jump start AT&T’s expansion into the highly competitive prepaid segment.”

The buyout deal will need to be approved by Leap Wireless shareholders as well as by the US Department of Justice and the FCC. If it successfully passes through the various regulatory bodies, AT&T expects the deal to close within the next six to nine months.

Personally, I have my doubts that AT&T will continue to maintain the Cricket service as is, especially when it comes to unlimited data. As far as a pre-paid expansion goes, AT&T has at least tried to go down this path before with its GoPhone line. I believe this deal is mostly about padding out AT&T’s spectrum portfolio in a bid to head off Sprint and maintain its position against T-Mobile and Verizon. The MVNO and pre-paid market is certainly growing, and AT&T wants a piece of it, but the last thing AT&T wants is to cannibalize its own contract offerings by selling a similar pre-paid service with unlimited everything for half the price. Sure, AT&T will take that over getting nothing, but the company is going to have a hard time balancing both offerings in a way that does not negatively affect one or both of its pre-paid and post-paid services.

What do you think about the deal? Is this a good thing for Cricket customers? Is AT&T serious about wanting to jump into the pre-paid market?

Source: AT&T

Windows 8 Market Share Outpaces Vista, but Is Still Far Below Windows 7 and Windows XP

Subject: Editorial, General Tech | July 6, 2013 - 08:33 PM |
Tagged: windows 8, Windows 7, microsoft, desktop market share

A recent report by NetMarketShare indicates that Windows 8 is having a difficult time displacing Microsoft's older operating systems. Windows occupies 91.50% of the total desktop market across all of its versions. Windows 7 and Windows XP dominate at 44.37% and 37.17% respectively. Microsoft's latest operating system, Windows 8, sits at 5.1%, barely scratching past Windows Vista at 4.62%. Having more market share than Windows Vista and Windows 98 is good, but Windows 8 is hardly proving as popular as Microsoft hoped.

Desktop Operating System Market Share 2013.jpg

June 2013 Desktop Operating System Market Share, as measured by NetMarketShare.

Granted, Windows 8 is still a new operating system, whereas XP and Windows 7 have had several years to gain users, be included on multiple generations of OEM machines, and win acceptance from enterprise customers. The free Windows 8.1 update should alleviate some users' concerns and may help bolster market share as well. However, Windows XP simply will not die, and Windows 7 (if talk on the Internet is to be believed, hehe) seems to be good enough for the majority of users, so it is difficult to say when (or if) Microsoft's latest OS will overtake the two existing, and entrenched, Windows operating systems.

Year over year, Windows 7 lost 0.33 points of market share while Windows XP lost 6.44 points. Meanwhile, Windows 8 has been slowly gaining share each quarter since its release: NetMarketShare reported 1.72% in December of 2012, and in six months the operating system has grown by 3.38 points. There is no direct cause and effect here, but the numbers suggest that few people are choosing the Windows 8 upgrade path, and that the share lost by Windows 7 and XP is not coming solely from people switching to Windows 8 but also from a small number of people jumping to alternative operating systems such as Mac OS X and Linux. The historical data is neat, but it is difficult to predict how things will look moving forward. If adoption continues at this pace, it is going to take a long time for Windows 8 to dethrone Microsoft's older Windows XP and Windows 7 operating systems.
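To put "a long time" in rough numbers, here is a naive straight-line projection from the figures above. It assumes the last six months' pace continues and that Windows 7's share stands still, neither of which is guaranteed, so treat it as an illustration rather than a forecast:

```python
# Naive straight-line projection from the NetMarketShare figures quoted above.
# It assumes Windows 8 keeps gaining 3.38 points every six months and that
# Windows 7's share stays frozen -- both generous simplifications.
win8_share = 5.10                    # June 2013, percent
win7_share = 44.37                   # June 2013, percent
gain_per_six_months = 5.10 - 1.72    # = 3.38 points since December 2012

months_to_catch_win7 = (win7_share - win8_share) / gain_per_six_months * 6
print(f"~{months_to_catch_win7:.0f} months (~{months_to_catch_win7 / 12:.1f} years)")
# ~70 months, or almost six years, at this pace
```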

Have you made the switch to Windows 8 or gotten it on a new machine? Will the Back-to-School shopping season give Windows 8 the adoption boost it needs?

Xbox Division Lead, Don Mattrick, Leaves to Join... Zynga? Steve Ballmer, Himself, Scabs the Void.

Subject: Editorial, General Tech | July 2, 2013 - 12:33 AM |
Tagged: xbox one, xbox, microsoft, consolitis

Well that was unexpected...

Don Mattrick, a few months ahead of the Xbox One launch and less than two months after its unveiling, decided to leave his position at Microsoft as president of Interactive Entertainment Business. The news was first made official by a Zynga press release announcing his acquisition as CEO. Steve Ballmer later published an open letter addressed to all Microsoft employees, open to the public via the company's news feed, wishing him luck and outlining the immediate steps to follow.

Mattrick.jpg

While subtle in the email, no replacement has been planned for after his departure on July 8th. Those who report to Don Mattrick will report directly to Steve Ballmer, himself, seemingly through the launch of the Xbox One. As scary and unsettling as Xbox One PR has been lately, launching your flagship without a captain is a depressingly fitting apex. This likely means one of three things: Don gave minimal notice of his departure; he was abruptly ousted from Microsoft and Zynga just happened to make convenient PR for all parties involved; or there is literally no sense to be made of the situation.

However the situation came about, the Xbox One will likely launch from a team directly led by Steve Ballmer, and Zynga will have a new CEO. Will his goal be to turn the former social gaming giant back on course? Or will he be there to milk blood from the company before it turns to stone?

I wonder whether his new contract favors cash or stock...

Source: Zynga

Google's Web is QUIC and SPDY

Subject: Editorial, General Tech | July 1, 2013 - 11:12 PM |
Tagged: google, spdy, QUIC

It missed being a recursive acronym by a single letter...

TCP is known for being the go-to protocol for stable connections over the internet. It guarantees some useful things: you will not lose bits of data, packets will arrive in order, corrupted packets will be detected and redelivered, and both endpoints will be throttled to roughly the capacity of the slower side. It is easy to develop applications on top of TCP; it solves the hard problems for you.

Being meticulous also makes it slow, relatively speaking.

9-ethernet2.png

UDP, on the other hand, sprays its packets out like a fountain in the hope that they land where they are intended. This protocol is fast, but a pain for applications that need some level of reliability. Quick UDP Internet Connections (QUIC), from Google, leverages UDP to create multiple independent, even encrypted, connections. While TCP could be made faster, that is beyond the jurisdiction of web browsers; support is embedded into the operating system itself. That leaves building upon UDP, suffering with TCP, or breaking compatibility with just about every piece of network hardware installed anywhere.

This comes on the heels of SPDY, Google's other open protocol. SPDY is built around HTTP, and both presume a reliable protocol underneath, with TCP as the usual candidate. A large advantage of SPDY is that it allows multiple assets to stream simultaneously over a single connection. TCP will, unfortunately, freeze the entire connection (and thus every stream) when a single stream drops a packet. QUIC, built on UDP, can then be used to accelerate SPDY further by allowing truly independent multiplexing.
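To make that head-of-line blocking concrete, here is a tiny toy simulation (plain Python, not Google's actual QUIC or SPDY code) of two multiplexed streams losing a single packet:

```python
# A toy, self-contained simulation of the head-of-line blocking described
# above -- not QUIC or SPDY itself, just an illustration of the problem.
# Two streams, A and B, are multiplexed over one connection; one packet is lost.
packets = [("A", 0), ("B", 0), ("A", 1), ("B", 1), ("A", 2), ("B", 2)]
LOST = 2               # index of the packet that gets dropped (A1)
RETRANSMIT_DELAY = 5   # extra time units needed to recover the lost packet

def tcp_style():
    # One ordered byte stream: nothing after the lost packet reaches the
    # application until the retransmission arrives, so BOTH streams stall.
    return [i + (RETRANSMIT_DELAY if i >= LOST else 0) for i in range(len(packets))]

def quic_style():
    # Independent streams over UDP: only the stream that lost a packet
    # waits; the other stream keeps delivering on time.
    lost_stream = packets[LOST][0]
    return [i + (RETRANSMIT_DELAY if i >= LOST and s == lost_stream else 0)
            for i, (s, _) in enumerate(packets)]

for name, times in (("TCP-style ", tcp_style()), ("QUIC-style", quic_style())):
    print(name, [f"{s}{n}@t={t}" for (s, n), t in zip(packets, times)])
```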

QUIC will be used for "a small percentage of Chrome dev and canary channel traffic to some Google server", for experimentation purposes. The code itself is licensed under BSD and, as such, could migrate to other browsers in due time.

Source: CNet

Microsoft Gives Xbox One Gamers What They Want... Sort Of

Subject: Editorial, General Tech | June 19, 2013 - 06:08 PM |
Tagged: xbox one, gaming, DRM, disc

Microsoft faced a major backlash from users following the unveiling of its latest Xbox One console. Users were rather unnerved at Microsoft’s reveal that the new console would be required to “phone home” at least once every 24 hours in order to authenticate games and allow sharing. With Sony carrying the PS3's disc traditions forward, and in light of the user uproar, Microsoft has reconsidered and issued an update to users via a blog post titled (in part) “Your Feedback Matters.”

Amidst the uncertainty caused by various MS sources issuing conflicting statements about functionality and DRM, and an air of pre-E3 secrecy in which MS released just enough information about the DRM to scare users (can you tell the way MS handled this irked me?), the company talked about the Xbox One moving forward and taking advantage of the ‘digital age.’ The new console would require online authentication (and daily check-ins), but it would also allow sharing of your game library with up to 10 other people, plus re-downloadable games that can be installed (and played) on other consoles so long as you log into your Xbox Live account (the latter bit is similar in nature to Steam on the PC). Further, disc games could be resold or gifted if the publisher allowed it.

That has changed now, however. Microsoft has reconsidered its position and is going back to the way things work(ed) on the existing Xbox 360. Instead of taking the logical approach of keeping the plan but dropping the daily authentication requirement for games whose disc is in the tray, Microsoft has taken its ball (er, Xbox One controller) and completely backtracked.

Xbox One Logo.jpg

DRM on the Xbox One is now as follows, and these changes go in place of (not in addition to) the previously announced sharing and reselling functionalities.

For physical disc games:

According to Xbox Wire, after a one-time setup and installation, disc-based games will not require an internet connection for offline play (though multiplayer components will, obviously, need an active connection). Even better, trading and reselling of disc-based games is no longer limited by publishers. Trading, selling, gifting, and renting of physical disc-based games "will work just as it does today on the Xbox 360." Microsoft is also not region locking physical games, which means you will not have to worry about whether games purchased abroad will work on your console at home.

However, in order to play disc-based games, you will need to keep the game disc in the tray, even if the game is installed on the hard drive.

Changes to downloaded games:

As for downloadable games, Microsoft is restricting these titles such that they cannot be shared or resold. In the previous model, you would have been able to share the titles with your family, but not anymore. You will still be able to re-download the games.

There is no word on whether or not gamers will still lose access to all of the titles in their game library if their Xbox Live accounts are ever banned. It is likely that gamers would lose any downloadable games, though, as those are effectively tied to a single Xbox Live account.

While at first glance it may seem as though gamers won this round, in the end no one really won. Instead of working around gamers' concerns about physical media and moving forward together, Microsoft has thrown up its hands in frustration and tossed out all of the innovative aspects for digital/downloadable titles along with the undesirable daily authentication and other invasive DRM measures that gamers clearly indicated they did not want.

I believe that Microsoft should have kept to the original game plan but added an exception to the daily check-in rules whenever the console can authenticate a game offline by identifying a physical disc in the tray. That way, gamers who are not comfortable with (or able to keep) the Xbox One connected to the internet could continue to play games from discs, while those with always-on consoles would keep the privilege of sharing their libraries. Doing so would also have helped ease the console gaming populace as a whole into Microsoft's ideal digital age by the time the next Xbox comes out. However, instead of simply toning down the changes, Microsoft has completely backtracked, and now no one wins. Sigh.

What are your thoughts on Microsoft's latest changes to the Xbox One? Was it the right move, or were you looking forward to increased freedom with your digitally-downloaded games?


Source: Xbox Wire

Steam Might Allow Shared Games?

Subject: Editorial, General Tech | June 19, 2013 - 03:33 PM |
Tagged: steam, DRM

You can learn a lot by scanning configuration files, registry entries, and so forth; many have made off with a successful bounty. Most recently, some Steam Beta users dug around in the client's user interface (UI) files and noticed a few interesting lines, informing the user that the title they are attempting to launch will kick off a friend it is currently being shared with.

Wait, what?!

Steam-UI.png

"SteamUI_JoinDialog_SharedLicense_Title" "Shared game library"

"SteamUI_JoinDialog_SharedLicenseLocked_OwnerText" "Just so you know, your games are currently in use by %borrower%. Playing now will send %borrower% a notice that it's time to quit."

"SteamUI_JoinDialog_SharedLicenseLocked_BorrowerText" "This shared game is currently unavailable. Please try against later or buy this game for your own library."

Sure, this whole game DRM issue has been flipping tables around the industry. Microsoft tried permitting users to share games with their family, using about the worst PR possible, and eventually needed to undo that decision. Users would like flexible licensing schemes, but the content industry (including platform owners like Microsoft, Nintendo, and Sony, who receive license fees from game sales) is unwilling to cooperate unless it is assured that users are honest.

Of course, what usually happens is honest users get crapped on and pirates enjoy a better experience, after initial setup.

While Steam and the proposed Xbox One model do not look all that different from a high-level view, there are a number of differences in practice. The obvious one is Steam's offline mode, but probably the larger one is trust. Valve has demonstrated a lot of good faith toward its customers; where Microsoft shuts down access to content people paid for, Valve has shown intentions of both long-term support and consideration for the user's experience.

Ultimately, I do not feel that DRM is a necessary evil, but while it exists, at least there are companies such as Valve who earn trust and wield DRM for users as well as against them. I expect that some day the industry will turn against DRM, whether willingly, by legal intervention, or because companies like cdp.pl use DRM-free as a promotional tool and nibble their way to dominance.

And yes, despite the fact that this will be confused with bias: if you have proven yourself untrustworthy before, you will get away with less later, regardless of your intentions.

The Witcher 3's DRM Strategy: Still None on PC

Subject: Editorial, General Tech | June 19, 2013 - 03:16 PM |
Tagged: DRM, The Witcher 3, GOG

cdp.pl, formerly CD Projekt, has been one of the last holdouts against DRM. Founder of GoG.com and developer/publisher of The Witcher franchise, the company offers a DRM-free platform for users to purchase games. Sure, the catalog is mostly good old games, aptly enough, but they are confident enough to include their most ambitious titles, The Witcher and The Witcher 2.

With The Witcher 3, we will see the title launch without DRM on GoG, trusting that users will purchase the title and be honest.

witcher-drm2.jpg

Apparently, the game will have a world slightly larger than Skyrim.

Hopefully, with very little empty space.

I have long been a proponent of DRM-free media, as you could probably tell. I believe that DRM-free titles end up netting more sales than the same titles would have with DRM; even if that were not true, DRM harms society more than enough to justify doing away with it. Sure, we all know unapologetic jerks and they are, indeed, jerks. Just because these jerks exist does not mean your company should, or successfully will, be the alpha a-hole on the a-hole food-chain. Chances are you will just upset your actual customers, now former customers. There are reasons why I never purchased (never pirated either; I just flat-out ignored the entire franchise's existence) another Crysis title after the first one's SecuROM debacle wrecked my camcorder's DVD-authoring software.

So, when The Witcher 3 comes out, back it up on your external hard drive and maybe even keep a copy on your home theater PC. Most importantly, buy it... sometime in 2014.

Source: PC Gamer

E3 2013: Microsoft can ban your Xbox One library

Subject: Editorial, General Tech, Systems, Shows and Expos | June 17, 2013 - 12:16 AM |
Tagged: xbox one, microsoft, ea, E3 13, E3

Update: Microsoft denies the statements from their support account... but this is still one of the major problems with DRM and closed platforms in general. This is exactly the sort of thing you give them the power to do.

xbox-one-head.jpg

Electronic Arts knows that they need to shake their terrible public image.

Welcome to Microsoft's PR strategy for the Xbox One.

Consumers, whether they acknowledge it or not, fear for the control that platform holders have over their content. It was hard for many to believe that having your EA account banned for whatever reason, even a dispute with a forum moderator, forfeited your license to games you play through that EA account. Sounds like another great idea for Microsoft to steal.

Not stopping there, later on in the thread they were asked what would happen in the event of a security breach. You know, recourse before destroying access to possibly thousands of dollars of content.

While not a "verified account", @xboxsupport is.

They acknowledge ownership of this account in the background image there.

Honestly, there shouldn't have been any doubt that these actually are Microsoft employees.

... Yikes.

At this point, we have definitely surpassed absurdity. Sure, you typically need to do something fairly bad to have Microsoft stop charging you for Xbox Live. Removing access to your entire library of games, to me, reads as an attempt to deter cheating and the hardware-modding community.

Great, encourage spite from the soldering irons; that works out well.

Don't worry, enthusiasts, you know the PC loves you.

Gaming as a form of entertainment is fundamentally different than gaming as a form of art. When content is entertainment, its message touches you without any intrinsic value and can be replaced with similar content. Sometimes a certain piece of content, itself, has specific value to society. It is these times where we should encourage efforts by organizations such as GoG, Mozilla and W3C, Khronos, and many others. Without help, it could be extremely difficult or impossible for content to be preserved for future generations and future civilizations.

It does not even need to get in the way of the industry and its attempt to profit from the gaming medium; a careless industry, on the other hand, can certainly get in the way of our ability to have genuine art. After all, this is the main reason why I am a PC gamer: the platform allows entertainment to co-exist with communities who support themselves when the official channels do not.

That is, of course, unless Windows learns a little something from the Xbox. I guess do not get your Windows Store account banned in the future?

Intel is not slowing down, exclamation exclamation. Haswell-E for Holiday 2014 question mark.

Subject: Editorial, General Tech, Processors | June 15, 2013 - 04:02 PM |
Tagged: Intel, Ivy Bridge-E, Haswell-E

In my analysis of Intel's recent Computex keynote, I noted that the displayed confidence came across more as repressed self-doubt. It did not seem, to me, like Intel wants to abandon the high-end enthusiast, but rather to catch up with its low-performance, high-efficiency competitors; Intel already knows it is secure in the enthusiast market. Of course, we could see mid-range choices dwindle and prices stagnate, but I doubt that Intel wants to exit the enthusiast market despite its silence about Ivy Bridge-E.

Haswell-E1.jpg

All Images, Credit: VR-Zone

And Intel, now, wants to return some confidence to their high-end consumers comma they are not slowing down exclamation point exclamation point.

VR-Zone, the site which published Ivy Bridge-E's lazy release roadmap, is also the one suggesting Haswell-E will arrive before mainstream Broadwell offerings. Once again, all is right with the world. Slated for release around holiday 2014, just a year after Ivy Bridge-E, Haswell-E will come alongside the X99 chipset. Instead of Broadwell, the back-to-school window of 2014 will be filled by a refresh of 22nm Haswell products with a new 9-series chipset.

Seriously, it's like watching the face of Intel's Tick-Tock while a repairman is tweaking the gears.

Haswell-E2.jpg

In terms of specifications, Haswell-E will come in 8- and 6-core offerings with up to 20MB of cache. Apart from the inclusion of DDR4 support, the main advantage of Haswell-E over the upcoming Ivy Bridge-E is supposed to be raw performance; VR-Zone estimates up to 33-50% better computational strength. A depressingly novel area of improvement as of late...

Lastly, given the recent discussion of the awkwardly hobbled K-series parts, our readers might be happy to know that all Haswell-E parts will be unlocked for overclocking. This, again, leads me to believe that Intel is not hoping to suffocate the enthusiast market but rather to sort its users: mid-range consumers will take what they are given and, if they object, Intel will send them on the bus to Funk-E town.

Haswell-E3.jpg

Note, while the headlining slide definitively says "All Processors Unlocked"...

... this slide says "For K and Extreme series products." I will assume the latter is out of date?

Which raises the question: what do our readers think about that potential strategy? It could lead to mainstream performance products being pushed down into BGA territory, but it cements the existence of an enthusiast platform.

Source: VR-Zone

WWDC 13: Dissecting Apple's New Hardware Changes. MacBook Air and the new Mac Pro.

Subject: Editorial, General Tech, Systems, Shows and Expos | June 11, 2013 - 01:06 AM |
Tagged: wwdc 13, MacBook Air, Mac Pro, apple

Sometimes our "Perspective" is needed on Apple announcements because some big points just do not get covered by the usual sources. Other times, portions of the story can be relevant to our readers. This is one of those days where both are true. Either side should review our thoughts and analysis of Apple's recent ultrabook and, especially, their upcoming desktop offerings.

The MacBook Air has, predictably, been upgraded to Intel's Haswell processors. Battery life is the first obvious benefit of the CPU, and that has been well reported. The 11-inch MacBook Air gains an extra four hours of battery life, now lasting up to 9 hours between charges. The extra space of the 13-inch MacBook Air allows it to last 12 hours between charges.

apple-macbook-air.jpg

Less discussed, both MacBook Airs will contain Intel HD 5000 graphics, the GT3 configuration of Haswell's integrated GPU (the same silicon used in the Iris-branded parts). You cannot get Intel HD 5000 graphics outside of BGA packages that are soldered directly to the board. While there are several better solutions from competing GPU vendors, Apple will have one of the first shipping implementations of Haswell's top integrated graphics tier, which is said to offer double the performance of previous generation Ivy Bridge graphics at a fraction of the power consumption.

Also included in the MacBook Air are an 802.11a/b/g/n/ac WiFi network adapter and Bluetooth 4.0. Apple is not typically known to introduce new standards and often lags severely behind what is available on the PC unless it had a hand in the technology itself; USB 3.0 is the obvious and recent example of that lag.

The specifications are somewhat customizable; the user can choose between an i5 and an i7 processor, 4GB or 8GB of RAM, and a 128, 256, or 512GB SSD. It shipped the day it was announced, with base prices of $999 for the entry-level 11-inch and $1099 for the entry-level 13-inch.

But now we move on to the dying industry, desktop PCs, where all innovation has died unless it is to graft a touch interface to anything and everything.

"Can't innovate any more, my ass", grunts Phil Schiller, on the keynote stage.

Whether you like it, or think "innovation" is the best word, it's a legitimate new design some will want.

While the new Mac Pro is not a system that I would be interested in purchasing, for reasons I will outline soon, these devices are what some users really want. I have been a very strong proponent of OEM devices because they highlight the benefit of the PC industry: choice. You can purchase a device, like the new Mac Pro, from a vendor; alternatively, you can purchase the components individually, assemble them yourself, and save a lot of money; otherwise, you can hire a small business computer store or technician to do it for you.

We need more companies, like Apple, to try new devices and paradigms for workstations and other high-performance machines. While it is less than ideal for Apple to be the one coming up with these redesigns, since Apple's platform encourages applications to be vendor-specific (to only run on a Mac), it can still benefit the PC industry by demonstrating that life and demand still exist; trying something new could reap large benefits. Not everyone who wants workstation performance wants a full ATX case with discrete components, and that is okay.

Now when it comes to actual specifications, the typical coverage glossed over what could be easily approximated by a trip to Wikipedia and Google. Sure, some may have been in a rush within the auditorium, but still.

The specifications are:

  • Intel Xeon E5-2600 V2-class CPU, Ivy Bridge-E, 12 cores max (suggests single-socket)
  • 4-channel DDR3 ECC RAM, apparently 4 DIMMs, which suggests 4x16GB (64GB max).
  • Dual FirePro GPUs, 4096 total shaders with 2x6GB GDDR5.
  • PCIe SSD
  • Thunderbolt 2, USB3.0, and WiFi ac (+ a/b/g/n??), Bluetooth 4.0

Now the downside is that basically anything you wish to add to the Mac Pro needs to connect through Thunderbolt, Bluetooth 4.0, or USB 3.0. When you purchase an all-in-one custom design, you forfeit the ability to reach in and modify the components. There is also no mention of pricing, and for a computer with this parts list you should expect a substantial invoice even without "The Apple Tax," but that is not the point of purchasing a high-end workstation. Apple certainly put in as close to the best-of-the-best as it could.

Now could people stop claiming the PC is dead and work towards sustaining it? I know people love stories of jarring industry shifts, but this is ridiculous.

Source: Apple

Win an AMD A10-6800K Richland APU + SimCity!!

Subject: Editorial, Processors | June 10, 2013 - 07:53 AM |
Tagged: SimCity, Richland, giveaway, contest, APU, amd, a10-6800k

Odd, turns out I found two brand new AMD A10-6800K Richland APUs sitting on my desk this morning.  I called AMD to ask what this was all about and they said that if I didn't need them, I might as well give them away to our readers. 

"Oh, and throw in a free copy of the new SimCity while you're at it," they told me.

apugiveaway.jpg

Who am I to argue?

So let's have a giveaway! 

We are handing out two AMD A10-6800K "Richland" APUs for you to build a brand new PC around and including a key for SimCity with each of them.  If you haven't read Josh's review of the A10-6800K APU you should definitely do so; it will help educate you on exactly what you are getting - for FREE. 

apubmarks.png

To enter, I need you to leave a comment on this very news post below telling us what you would build with a brand new A10 APU - you don't have to be registered to do so but we'd sure like it if you were.  (Make sure you leave your correct email address so I can get in touch with you if you win.)  Also, feel free to stop by the PC Perspective YouTube channel and either give our videos a gander or subscribe.  I think we put out some great content there and we'd like more of you to see it. 

I will pick one winner on June 17th and another on June 24th so you have two separate weeks to potentially win! 

A big thanks goes out to AMD for supplying the APUs and copies of SimCity for this giveaway.  Good luck!!

Source: AMD

Computex 2013: The Comedic Return of the Ultra GPUs

Subject: Editorial, General Tech, Graphics Cards, Shows and Expos | June 9, 2013 - 11:49 PM |
Tagged: Ultra, geforce titan, computex

So long to Computex 2013, we barely knew thee. You poured stories all over our news feed for more than a whole week. What say you, another story for the... metaphorical road... between here... and... Taipei? Okay, so the metaphorical road is bumpy and unpaved, work with me.

It was substantially more difficult to decipher the name of a video card a number of years ago. Back then, products were classified by their model numbers and often assigned a suffix like "Ultra", "Pro", or "LE". These suffixes actually meant a lot: the cards performed noticeably better (or maybe worse) than the suffix-less model and could even overlap with other number classes.

colorful-gtx-titan-ultra-edition,B-V-387931-13.png

Image Credit: zol.com.cn via Tom's Hardware

Just when they were gone long enough for us to miss them, the suffixes might make some measure of a return. On the show floor, Colorful exhibited the NVIDIA GeForce GTX Titan Ultra Edition. This card uses a standard, slightly-disabled GK110-based GeForce GTX Titan GPU with the usual 2688 CUDA cores and 6GB of GDDR5. While the GK110 chip has the potential for 2880 CUDA cores, NVIDIA has not released any product (not even Tesla or Quadro) with more than 2688 CUDA cores enabled. Colorful's Titan Ultra and the reference Titan are electrically identical; this "Ultra" version just adds a water block for cooling and ships with some amount of factory overclock.

But, this is not the first time we have heard of a Titan Ultra...

Back in April, ExtremeTech found a leak for two official products: the GTX Titan LE and the GTX Titan Ultra. While the LE would be slightly stripped down compared to the full GTX Titan, the GTX Titan Ultra would be NVIDIA's first release of a GK110 part without any CUDA cores disabled.

So if that rumor ends up being true, you could choose between Colorful's GTX Titan Ultra with its partially disabled GK110 based on the full GTX Titan design; or, you could choose the reference GTX Titan Ultra based on a full GK110 GPU unlike the partially disabled GK110 on the full GTX Titan.

If you are feeling nostalgic... that might actually be confusion... as this is why suffixes went away.

E3 2013: Bludgeon that horse again! Xbox One DRM

Subject: Editorial, General Tech, Systems, Shows and Expos | June 6, 2013 - 05:46 PM |
Tagged: xbox one, E3 13, E3

Heading into E3, Microsoft decided to drop its DRM bombshell early so it would get buried over the next couple of days. In terms of permissiveness, the Xbox One is not nearly as bad as feared; of course, it is still terrible in certain ways.

Microsoft will allow games to be played offline on the Xbox One... for 24 hours. If your internet connection has been down for longer than that period (it is unclear whether the timer starts when the internet goes out or from the last check-in), then your system will be locked to live TV and disc-based movies. Games and apps, even ones with no online functionality, will cease to function until you reconnect with the Xbox servers.

This also means that if the Xbox servers have an outage lasting between 24 hours and "taken offline forever", all gaming and apparently apps will cease to function on the Xbox One.

And people wonder why I freak out about Windows Store.

xbox-one-front.png

It's like if Wall-E grew a Freddie Mercury

But at least they will allow some level of used-game transfer... if the publisher agrees. Check out this statement from Microsoft Studios:

In our role as a game publisher, Microsoft Studios will enable you to give your games to friends or trade in your Xbox One games at participating retailers. Third party publishers may opt in or out of supporting game resale and may set up business terms or transfer fees with retailers. Microsoft does not receive any compensation as part of this. In addition, third party publishers can enable you to give games to friends. Loaning or renting games won’t be available at launch, but we are exploring the possibilities with our partners.

So this will be an interesting experiment: how will revenue and profitability be affected for game publishers who deny used game sales? I honestly expect that used game sales actually promote the purchase of more games, and that initiatives to limit used game transfers will reduce user engagement. Of course Microsoft is now taking all of the flak instead of Sony, who may or may not be considering the same practice, but I am sure Microsoft is at least hoping that everyone will forget this once shiny new trailers erase the collective gamer memory.

In return, however, Microsoft is being fairly permissive when it comes to how many users can be licensed on a single disc. Up to ten family members are allowed access to your collective library.

And, after all, it should not be a surprise that a console game disappears when Microsoft shuts down its servers: consoles were always designed to be disposable. I have been proclaiming that for quite some time. The difference is that now, people cannot really deny it.

Source: Microsoft

Computex 2013 / E3 2013: Unreal Engine 4 Partners Program

Subject: Editorial, General Tech, Shows and Expos | June 6, 2013 - 02:42 PM |
Tagged: unreal engine 4, ue4, E3 13, E3, computex

We are bleeding through the overlap between the Computex and E3 media windows; this news fits somewhat into both. Unreal Engine 4 is coming, and I expect we will see one or more demos and UE4-powered titles over the next week. In fact, I would be fairly shocked if we do not see the end of the Elemental Demo during the Xbox One E3 keynote. We may also see Unreal Engine 4 running on mobile devices, and maybe even HTML5, at some point throughout the tradeshow, either canonically through Epic or via a licensee product.

This morning, Epic opened the Unreal Engine 4 Integrated Partners Program (IPP). Of course, it already has a number of members, most of which were also partners for Unreal Engine 3.

The founding IPP partners are:

  • Wwise from Audiokinetic
    • Manages large databases of sound effects and voice-overs
    • Manages subtitles and multiple dubbings of voice clips
  • Autodesk Gameware from Autodesk
    • Contains multiple packages including Beast, Navigation, and Scaleform
    • Scaleform is a Flash rendering engine for HUDs, menus, etc., developed using Flash Professional in 2D or 3D. It is what StarCraft II, Mass Effect, and Borderlands use.
    • Beast is a lighting toolkit for global illumination, radiosity, etc.
    • Navigation is an AI solver, predominantly for pathfinding.
  • Simplygon from Donya Labs
    • Reduces polygon count of models so they take up less processing resources especially as they get further away from the camera.
  • Enlighten from Geomerics
    • Another Global Illumination solver, most popular usage being Battlefield 3.
  • SpeedTree for Games from IDV
    • Makes a bunch of efficient trees so studios do not need to hire as many minimum wage peons.
  • Intel Threading Building Blocks (TBB) from Intel
    • Helps developers manage C++ threading for multicore systems.
    • Deals with memory management and scheduling tasks
  • morpheme from NaturalMotion
    • Animation and physics software for designers to create animations
    • Works with NVIDIA PhysX
  • euphoria from NaturalMotion
    • Simulates animations based on driving conditions via the CPU, most popular usage being GTA IV.
  • PhysX and APEX from NVIDIA
    • You probably know this one.
    • GPU-based rigid body, soft body, fluid, and cloth solvers.
    • Allows for destructible environments and other complex simulations.
  • Oculus Rift from Oculus VR
  • Bink Video from Rad Game Tools
    • ... is not included! Just kidding, that stuff'll survive a nuclear apocalypse.
    • Seriously, check in just about any DirectX or OpenGL game's credits if it includes pre-rendered video cutscenes or video-textures.
    • I'll wait here.
    • In all seriousness, Rad Game Tools has been licensed in over 15,500 titles. It's been a meme to some extent for game programmers. This should be no surprise.
  • Telemetry Performance Visualizer from Rad Game Tools
    • Allows developers to see graphs of what their hardware is working on over time.
    • Helps developers know what benefits the most from optimization.
  • RealD Developer Kit (RDK) from RealD
    • Helps game developers create stereoscopic 3D games.
  • Umbra 3 from Umbra Software
    • Determines what geometry can be seen by the player and what should be unloaded to increase performance.
    • Sits between artists and programmers so the former do not need to think about optimization, and the latter do not need to claw their eyes out.
  • IncrediBuild-XGE from Xoreax
    • Apparently farms out tasks to idle PCs on your network.
    • I am not sure, but I think it is mostly useful for creating a pre-render farm at a game studio for light-baking and such.

We still have a little while until E3 begins, so we do not know exactly how it will play out, but I fully expect Unreal Engine 4 to be a recurring theme over the next week. Keep coming back to PC Perspective, because you know we have a deep interest in where Epic is headed.

Source: Epic Games

PCPer Live! ASUS Z87 Motherboard and Intel Haswell Live Event!

Subject: Editorial, General Tech, Motherboards, Processors | June 4, 2013 - 07:40 AM |
Tagged: z87, video, overclocking, live, i7-4770k, haswell, ASUS ROG, asus

While we run around with our hair on fire trying to get ready for the Intel Haswell and Z87 product launch this weekend, I wanted to let everyone know about a live stream event we will be holding on Tuesday, June 4th.  JJ from ASUS, a crowd favorite for sure, will be joining us LIVE in studio to talk all about the new lineup of ASUS Z87 motherboards.  We'll also discuss performance and overclocking capabilities of the new processor and platform.

rog1.jpg

ASUS Z87 and Haswell Live Stream

10am PT / 1pm ET - June 4th

PC Perspective Live! Page

Be sure you stop by and join in the show!  Questions will be answered, prizes will be given out and fun will be had!  Who knows, maybe we can break some stuff live as well??  On hand to give away to those of you joining the live stream, we'll have these prizes:

  • 2 x ASUS Z87 Motherboards
  • 1 x ASUS Graphics card

Methods for winning will be decided closer to the event, but if you are watching live, you'll be included.  And we'll ship anywhere in the world!

pcperlive2.png

ASUS and I also want the event to be interactive, so we want your questions.  We'll of course be paying attention to the chat room on our live page, but you'll have better luck if you submit your questions about the ASUS Z87 products and Haswell processors beforehand, in the comments section below.  You don't have to register to ask, and we'll have the ability to read them beforehand! 

I'll update this post with more information after the reviews and stories start to hit, so keep an eye here for more details!!

2013 StarCraft II World Championship Series Season 1 -- Finals (... of Season 1... ) this Weekend!

Subject: Editorial, General Tech | June 4, 2013 - 12:44 AM |
Tagged: WCS, starcraft 2, HoTS

A little eye-rest before another barrage of Computex news...

Blizzard took over the canon StarCraft II tournament scene as of last year. The goal was to create a unified ranking system between every tournament and help participants deal with scheduling, a problem in recent years. Throughout the entire year, Blizzard is hosting the 2013 StarCraft II World Championship Series. They seem to like breaking rankings into seasons and the 2013 series, alone, will incorporate three of them leading to the year's grand finals in November.

One year per series; three seasons and a grand finals per year; three regional tournaments and a finale per season. This season's finals will take place this weekend, June 8th and 9th, in South Korea.

BattleChamp.jpg

Tournaments in Europe, Korea, and North America chose the 16 competitors for the 2013 Season 1 Finals this weekend in Korea. The top five competitors in each tournament (top six for Korea) earned their invite. In all: 3 Protoss, 5 Terrans, and 8 Zerg will be participating. I guess their hearts are only half of the swarm.

If the regional matches were any indication, the seasonal finals should be a very entertaining bridge between Computex coverage and E3 2013. Players are getting much better at the game mechanics while still being able to surprise their opponents, and even the audience, with unusual strategies. Players exploit windows of weakness in their opponents with moments of strength; the entertainment mostly comes from seeing each player attempt to delay or lengthen those windows, all while shifting their own weak periods to times when the opponent cannot reasonably exploit them.

What are your opinions of "eSports"? Good concept, bad name?

Source: Liquipedia