Subject: Editorial, General Tech, Graphics Cards, Processors, Memory, Systems | January 20, 2014 - 07:40 AM | Scott Michaud
Tagged: corsair, overclocking
I rarely overclock anything, and for three main reasons. The first is that I have had an unreasonably bad run of computer parts failing on their own; I did not want to tempt fate. The second is that I focused on optimizing the operating system and its running services, which mattered most during the Windows 98, Windows XP, and Windows Vista eras. The third is that I never found overclocking valuable enough for the performance you gained.
A game that is too hefty to run is probably not an overclock away from working.
Thankfully this never took off...
Today, overclocking is easier and safer than ever with parts that basically do it automatically and back off, on their own, if thermals are too aggressive. Several components are also much less locked down than they have been. (Has anyone, to this day, hacked the locked Barton cores?) It should not be too hard to find a SKU which encourages the enthusiast to tweak some knobs.
But how much of an increase will you see? Corsair has been blogging about overclocking with their components (along with an Intel processor, Gigabyte motherboard, and EVGA graphics card, because they obviously do not make those). The cool part is that they break down performance gains from raising the frequencies of just the CPU, just the GPU, just the RAM, or all of the above together. This breakdown shows how each of the three categories contributes to the whole. While none of the overclocks are dramatic, Corsair is probably proud of the 5% jump in Cinebench OpenGL performance from overclocking the RAM from 1600 MHz to 1866 MHz without touching the CPU or GPU.
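For perspective, that performance gain is well short of the raw clock bump. Here is a quick sketch of the arithmetic; the frequencies and the 5% figure come from Corsair's post, while the "realized scaling" metric is just my own illustration:

```python
# Rough scaling check on Corsair's memory overclocking result. The
# frequencies and the ~5% Cinebench OpenGL gain come from their post;
# the "realized scaling" ratio is my own illustrative metric.
base_mhz, oc_mhz = 1600, 1866
perf_gain = 0.05  # ~5% Cinebench OpenGL improvement

freq_gain = (oc_mhz - base_mhz) / base_mhz  # ~16.6% higher memory clock
realized = perf_gain / freq_gain            # fraction of the bump you keep

print(f"Frequency increase:   {freq_gain:.1%}")
print(f"Performance increase: {perf_gain:.1%}")
print(f"Realized scaling:     {realized:.0%} of the frequency uplift")
```

In other words, roughly a third of the memory clock increase shows up as frames, which is actually quite healthy for a RAM-only tweak.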
It is definitely worth a look.
Subject: Editorial, General Tech | January 20, 2014 - 04:46 AM | Scott Michaud
Tagged: Keyboards, keyboard
Peter Bright down at Ars Technica wrote an editorial about the Lenovo ThinkPad X1 Carbon. His opinion is that keyboard designers should innovate in ways that "doesn't undermine expectations". Replacing a row of physical keys with a software-controlled touch strip is destructive because, even if the change proved invaluable, it clashes with every other keyboard the user encounters. He then concludes with a statement that really should have informed his thesis.
Lenovo's engineers may be well-meaning in their attempts to improve the keyboard. But they've lost a sale as a result. The quest for the perfect laptop continues.
That is the entire point of innovation! You may dislike how a feature interacts with your personal ecosystem, and that will drive you away from the product. Users who purchased the laptop without considering the keyboard have the option of returning it and writing reviews for others (or simply putting up with it). Users who purchased the laptop because of the keyboard are happy.
I mainly disagree with the article because it claims that it is impossible to innovate on the keyboard in any way that affects the core layout. I disagree for two reasons.
My first issue is how vague he is. His primary example of good keyboard innovation is the IBM ThinkPad 701c and its "butterfly keyboard", which expands the keyboard beyond the width of the laptop itself to make typing more conventional. Conventional for whom? How many people primarily use small laptops with shrunken keyboards compared to people who touch-type function keys?
The second critique follows from the first. The PC industry became so effective because every manufacturer tries to be a little different with certain SKUs to gain tiny advantages. There could easily have been a rule against touchscreen computers. Eventually someone hit it out of the park and found an implementation that was wildly successful with a gigantic market. The QWERTY design has weathered the storm for more than a century, but there is no rule that it cannot shift in the future.
In fact, at some point, someone decided to add an extra row of function keys. This certainly could undermine the expectations of users who have to go between computers and electronic typewriters.
It will be tough, though. Keyboards have settled down and learning their layouts is a significant mental investment. There are several factors to consider when it comes to how successful a keyboard modification will become. Mostly, however, it will come down to someone trying and observing what happens. Do not worry about letting random ideas in because the bad ideas will show themselves out.
Basically the point is: never say never (especially not that vaguely).
Subject: Editorial, General Tech | January 11, 2014 - 05:13 AM | Scott Michaud
Tagged: SimCity, ea
Maxis and Electronic Arts recognize that a hefty portion of SimCity's popularity as a franchise is due to its mod community. The current version could use all of the help it can get after its unfortunate first year. They have finally let the community take over... to some extent. EA is imposing certain rules upon content creators. Most of them are reasonable. One of them could have unforeseen consequences for the LGBTQ community. The first rule should apply to their expansion packs.
Starting at the end, the last three rules (#3 through #5) are mostly reasonable. They protect EA against potential malware and breaches of their EULA and Terms of Service. The fifth rule does begin to dip its toe into potential censorship but it does not really concern me.
No-one can be "Best Friends" in North America.
The second rule, while mostly condemning illegal activity, does include the requirement that content remains within ESRB 10+ and PEGI 7. The problem with any content certification is that it limits the dialog between artists and society. In this case, references to same-sex topics (ex: Harvest Moon) in games may force a minimum T or M rating. A mod which introduces some story element where two Sims of the same gender go on a date or live together (again, like Harvest Moon) might result in interest groups rattling the ESRB's cage until EA draws a firm line on that specific topic.
EA is very good with the LGBTQ community, but this could get unnecessarily messy.
The first rule is a different story. It says that mods which affect the simulation for multiplayer games or features are not allowed (despite multiplayer being the game's only official mode). They do not want a modification to give players an unfair advantage over the rest of the game's community.
You know, like maybe an airship which boosts "your struggling industry or commercial [districts]" and also brings in tourists and commuters without causing traffic on your connecting highway?
Maxis is still, apparently, exploring options for offline SimCity experiences. If they allowed a server preference that does not affect the global economy, mods could be quarantined to those servers. Great, problem solved. Instead, what is allowed is somewhat left up to interpretation. To make matters worse, the current examples of mods that we have are purely cosmetic.
SimCity is nowhere near as bad as Halo 2 Vista for its mod functionality (those mod tools were so hobbled that its own tutorial was impossible). It could actually be good. These are just areas for EA to consider and, hopefully, reconsider.
Subject: Editorial, General Tech | January 7, 2014 - 07:25 AM | Tim Verry
Tagged: valve, SteamOS, steambox, opinion, Gabe Newell, CES 2014, CES
Valve Co-Founder Gabe Newell took the stage at a press conference in Las Vegas last night to introduce SteamOS powered Steam Machines and the company's hardware partners for the initial 2014 launch. And it has been quite the launch thus far, with as many as 13 companies launching at least one Steambox PC.
The majority of Steam Machines are living room friendly Mini-ITX (or smaller) form factors, but that has not stopped other vendors from going all out with full-tower builds. The 13 hardware partners have all put their own spin on a SteamOS-powered PC, and by the second half of 2014, users will be able to choose from $500 SFF cubes, to ~$1000 Mini-ITX builds with dedicated graphics, to powerhouse desktop PCs with multiple GPUs and MSRPs up to $6,000. In fact, aside from SteamOS and support for the Steam Controller, the systems do not share much else, offering up unique options, which is a great thing.
For the curious, the 13 Steam Machine hardware vendors are listed below.
- Digital Storm
- Falcon Northwest
- Origin PC
- Scan Computers
As luck would have it for those eager to compare all of the available options, the crew over at Ars Technica have put together a handy table of the currently-known specifications and pricing of each company's Steam Machines! Some interesting takeaways from the chart include the almost even division between AMD and NVIDIA dedicated graphics, while Intel has a single hardware win with its Iris Pro 5200 (Gigabyte BRIX Pro). On the CPU side of things, Intel has the most design wins, with AMD having as many as 3 versus Intel's 10 (in the best case scenario). The pricing is also interesting. While there are outliers at both the very expensive and very affordable extremes, the majority of Steam Machines tend to be closer to the $1000 mark than either the $500 or $2000+ price points. In other words, about the same amount of money as a mid-range DIY PC. This is not necessarily a bad thing, as users are getting decent hardware for their money, a free OS, and OEM warranty/support (and there is nothing stopping the DIYers from making their own Steamboxes).
A SFF Steambox (left) from Zotac and a full-tower SteamOS gaming desktop from Falcon Northwest (right).
So far, I have to say that I'm more impressed than not with the Steam Machine launch, which has gone off better than I expected. Here's hoping the hardware vendors are able to come through at the announced price points and that Valve is able to continue wrangling developer support (and to improve the planned game streaming functionality from a Windows box). If so, I think Valve and its partners will have a hit on their hands that will help bring PC gaming into the living room and (hopefully) put it at least on par, in the mainstream's perspective, with the ever-popular game consoles (which are now using x86 PC architectures).
What do you think about the upcoming crop of Steam Machines? Does SteamOS have a future? Let us know your thoughts and predictions in the comments below!
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Editorial, General Tech | December 8, 2013 - 09:11 AM | Scott Michaud
Tagged: TSMC, GLOBALFOUNDRIES, broadcom
Josh Walrath titled the intro of his "Next Gen Graphics and Process Migration: 20nm and Beyond" editorial: "The Really Good Times are Over". Moore's Law predicts that, with each ~2 year generation, we will be able to double the transistor count of our integrated circuits. It does not, however, set a price.
A look into GlobalFoundries.
"Moore's Law is expensive" remarked Tom Kilroy during his Computex 2013 keynote. Intel spends about $12 billion USD in capital, every year, to keep the transistors coming. It shows. They are significantly ahead of their peers in terms of process technology. Intel is a very profitable company who can squirrel away justifications for these research and development expenses across numerous products and services.
The benefits of a process shrink are typically three-fold: increased performance, decreased power consumption, and lower cost per chip (as a single wafer is better utilized). Chairman and CTO of Broadcom, Henry Samueli, told reporters that manufacturing complexity is pushing chip developers into a situation where one of those three benefits must be sacrificed for the other two.
You are suddenly no longer searching for an overall better solution. You are searching for a more optimized solution in many respects but with inherent tradeoffs.
He expects GlobalFoundries and TSMC to catch up to Intel and "the cost curve should come back to normal". Still, he sees another wall coming up when we hit the 5nm point (you can count the width or height of these transistors, in atoms, using two hands) and even more problems beyond that.
Image Credit: IONAS
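To see why cost per chip normally falls with a shrink, and what Samueli means about trading that benefit away, consider the classic back-of-the-envelope dies-per-wafer estimate. A minimal sketch follows; the wafer cost and die sizes are purely illustrative, since real figures are proprietary:

```python
import math

# Back-of-the-envelope look at why a process shrink normally lowers cost
# per chip. The wafer cost and die sizes below are purely illustrative.
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order estimate: wafer area over die area, minus an
    edge-loss term for partial dies around the circumference."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

wafer_cost = 5000.0  # hypothetical processed 300 mm wafer, USD

for die_area in (200.0, 100.0):  # a die, then the same design at half area
    n = dies_per_wafer(300, die_area)
    print(f"{die_area:5.0f} mm^2 -> {n:4d} dies/wafer, ~${wafer_cost / n:.2f} per die")
```

Halving the die area more than doubles the dies per wafer, so cost per chip drops; Samueli's warning is that soaring wafer and design costs at new nodes are eroding exactly this benefit.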
From my perspective: at some point, we will need to say goodbye to electronic integrated circuits. Theorists are already working on how we can develop integrated circuits from non-electronic materials. For instance, toward the end of my Physics undergraduate degree, my thesis adviser was working on nonlinear optics within photonic crystals: waveguides which transmit optical-frequency light rather than radio-frequency electric waves. I do not believe his research was on optical integrated circuits specifically, but that is not really the point.
Humanity is great at solving problems when its back is against the wall. But which problem will we tackle first?
Power consumption? Cost? Performance?
Subject: Editorial, General Tech | December 5, 2013 - 11:53 PM | Scott Michaud
Tagged: windows, microsoft
Peter Bright at Ars Technica is wondering how many operating systems (OSes) Microsoft actually needs and, for that matter, how many they already have. Three consumer versions of Windows exist (or brands of it do): Windows RT, "full" Windows, and Windows Phone. Then again, it is really difficult to define what a unique operating system even is. All of the aforementioned "OSes" run on the same base kernel, and even app compatibility does not align to that Venn diagram.
In my personal opinion, it really does not matter how many (or which) operating systems Microsoft has. That innate desire to sort things into boxes accomplishes little. At best, it helps you draw relationships between a platform and others, and even those comparisons may not be valid. Sure, from the perspective of Microsoft's marketing team, these categories help convey information about their products to consumers.
... And if recent trends mean anything: very incorrect and confusing information.
So really, and I believe this is what Peter Bright was getting at, who cares how many OSes Microsoft has? The concern should really be what these products mean for consumers. In that sense, I really hope we trend towards the openness of the last couple Internet Explorer versions (and of course Windows 7) and further from the censored nature of Windows RT.
You can have 800 channels or just a single one but that doesn't mean something good is on.
Subject: Editorial, General Tech | December 2, 2013 - 07:23 PM | Scott Michaud
Tagged: microsoft, CEO
The search for a Microsoft CEO has been intensely monitored by journalists and financial analysts alike. The recent acquisition of Nokia (which was just approved by the DOJ, by the way) suggested that its CEO, Stephen Elop, was a front-runner; if you watched the coverage, you would think CEO of Microsoft was his fate, while everyone daydreamed of Alan Mulally.
While not confirmed, it looks like both he and former Skype CEO Tony Bates are out of the running.
The top two candidates are Alan Mulally and Satya Nadella. The former would be an "acquisition" from Ford (more like a stressful retirement from there). His fame arose from turning that company around just prior to the 2008 Financial Crisis, which wrecked the rest of the US auto industry. The latter runs the Cloud and Enterprise group, which successfully evolved as times changed without even a peep of trouble; it is just about the only stable division the company has.
Personally, I must say that those were just about the two best candidates in the pool -- at least from an outsider viewpoint. Their roles as CEO seem quite different but might not be. Both Mulally and Nadella have a track record of successfully navigating a changing landscape; the difference has been the rate and visibility.
This should be good news either way. Journalists will not have as many exciting things to talk about if Satya is chosen, but this is Microsoft's story, not theirs.
Subject: Editorial, General Tech | November 25, 2013 - 06:02 PM | Ryan Shrout
Tagged: halloween, giveaway, gigabyte, corsair, contest
UPDATE: We have our winner!! I would like to introduce you to...The Glorious PC Gaming Master Race!
There were quite a few excellent entries in both photo and video form, but this was far and away the favorite of the PC Perspective voting staff. If you would like to see some of our other favorite entries, just click here and scroll to the bottom. Thanks to everyone who participated and made our holiday at PC Perspective a humorous one. :)
If there is one holiday that I find myself getting more and more attached to as I get older, it is definitely Halloween. The opportunity to dress up like a fool and (legally) scare children and adults in your neighborhood is an occasion that should not be missed! :)
To celebrate this wonderful time of year, PC Perspective has teamed up with our friends at Corsair and Gigabyte to give away a collection of hardware that just about anyone would be envious of. Here is what is up for grabs:
- Corsair Graphite 230T orange case ($80)
- Corsair Hydro H80i with the LED set to orange ($87)
- Corsair Raptor K50 RGB keyboard with LED set to orange ($85)
- Corsair Raptor M40 mouse (red, but you know, close enough) ($60)
- Corsair Vengeance Pro 8GB 1600 MHz memory ($99)
- Corsair Neutron GTX 120GB SSD ($125)
- Corsair RM Series 650 watt power supply ($115)
- Gigabyte Z87X-OC Haswell-ready motherboard (orange and black) ($185)
That's more than enough hardware to get anyone started on a killer gaming rig. Just add in your Intel Core i7/i5 Haswell processor and an AMD Radeon or NVIDIA GeForce graphics card, and you're off and running! Maybe a Zotac GeForce card would fit the bill to maintain the orange/black theme? Total estimated value of this hardware is over $830!!
So how do you enter? We have a few ways, but you are going to have to get creative for this one. We are looking for PC Perspective readers who are able to express their nerdom and hardware fanaticism in costume form. What kind of crazy costumes do you have planned, and how are they associated with gaming, hardware, or technology? Pics or it didn't happen! Here are the rules:
- We want YOUR costume and not some image you stole off Google image search. That means you are going to need to hold up a piece of paper that says "PCPer" or "PC Perspective" on it when you have your snapshot taken. This proves to us that you actually read this before the photo was taken!
- You can submit your entry in a few ways, all of which are equally judged.
- Leave a photo and description on the wall of our PC Perspective Facebook page. This is probably the easiest option.
- Leave a photo and description on the wall of our PC Perspective Google+ page.
- Leave a photo link and description in the comments below.
- If you want to get really creative, you can leave a video response on our PC Perspective YouTube channel!
That's it! We will accept entries starting today and going through November 3rd (to allow for the weekend parties), so get to planning and get creative! We'll pick a few of our favorites to showcase in a news post announcing the winner, then ship the hardware to our favorite. Good luck!
A huge thanks goes to Corsair for the massive amount of hardware they provided for this contest and also to Gigabyte for the Z87X-OC motherboard. If you want, you might even get bonus points if you include Corsair and Gigabyte in your entry somehow...
Subject: Editorial, General Tech, Networking, Processors, Mobile | October 19, 2013 - 05:45 AM | Tim Verry
Tagged: SoC, p5600, MIPS, imagination
Imagination Technologies, a company known for its PowerVR graphics IP, has unleashed its first Warrior P-series MIPS CPU core. The new MIPS core is called the P5600 and is a 32-bit core based on the MIPS Release 5 ISA (Instruction Set Architecture).
The P5600 CPU core can perform 128-bit SIMD computations, provide hardware-accelerated virtualization, and access up to 1TB of memory via virtual addressing. While the MIPS Release 5 ISA provides for 64-bit calculations, the P5600 core is 32-bit only and does not include the 64-bit portions of the ISA.
The MIPS P5600 core can scale up to 2GHz when used in chips built on TSMC's 28nm HPM manufacturing process (according to Imagination Technologies). Further, the Warrior P5600 core can be used in both standalone processors and SoCs: as many as six CPU cores can be combined under a coherence manager and given access to up to 8MB of shared L2 cache. Imagination Technologies is aiming P5600-based processors at mobile devices, networking appliances (routers, hardware firewalls, switches, et al.), and micro-servers.
A configuration of multiple P5600 cores with L2 cache.
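That 1TB figure stands out for a 32-bit core: a flat 32-bit address space covers only 4GB, so the claim implies roughly 40-bit addresses reached through some form of extended addressing. A quick sanity check (my own arithmetic, not from Imagination's materials):

```python
# Quick sanity check on the P5600's memory claim. A flat 32-bit address
# space tops out at 4 GB; reaching 1 TB implies 40-bit addresses, which a
# 32-bit core can only reach through an extended-addressing mechanism.
flat_32bit = 2 ** 32
claimed = 2 ** 40  # 1 TB

print(f"32-bit flat space: {flat_32bit // 2**30} GB")         # 4 GB
print(f"Claimed reach:     {claimed // 2**40} TB (2**40 B)")  # 1 TB
print(f"Extra address bits needed: {40 - 32}")
```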
I first saw a story on the P5600 over at The Tech Report, and found it interesting that Imagination Technologies was developing a MIPS processor aimed at mobile devices. It does make sense to see a MIPS CPU from the company, as it owns the MIPS intellectual property. Also, a CPU core is a logical step for a company with a large graphics IP and GPU portfolio: developing its own MIPS CPU core would allow it to put together an SoC with its own CPU and GPU components. With that said, I found it interesting that the P5600 CPU core is being aimed at the mobile space, where ARM processors currently dominate. ARM is working to increase performance while Intel is working to bring its powerhouse x86 architecture to the ultra low power mobile space. Needless to say, it is a highly competitive market, and Imagination Technologies' new CPU core is sure to have a difficult time establishing itself in consumer smartphone and tablet SoCs. Fortunately, mobile chips are not the only target: the company is also offering up the MIPS Release 5 compatible core for use in processors powering networking equipment, very low power servers, and business appliances, where the MIPS architecture is more commonplace.
In any event, I'm interested to see what else IT has in store for its MIPS IP and where the Warrior series goes from here!
More information on the MIPS P5600 core can be found here.
Subject: Editorial, General Tech, Graphics Cards | October 10, 2013 - 07:28 PM | Ryan Shrout
Tagged: podcast, nvidia, contest, batman arkham origins
UPDATE: We picked our winner for week 1 but now you can enter for week 2!!! See the new podcast episode listed below!!
Back in August NVIDIA announced that they would be teaming up with Warner Bros. Interactive to include copies of the upcoming Batman: Arkham Origins game with select NVIDIA GeForce graphics cards. While that's great and all, wouldn't you rather get one for free next week from PC Perspective?
Great, you're in luck! We have a handful of keys to give out to listeners and viewers of the PC Perspective Podcast. Here's how you enter:
- Listen to or watch episode #272 of the PC Perspective Podcast and listen for the "secret phrase" as mentioned in the show!
- Subscribe to our RSS feed for the podcast or subscribe to our YouTube channel.
- Fill out the form at the bottom of this podcast page with the "secret phrase" and you're entered!
I'll draw a winner before the next podcast and announce it on the show! We'll give away one copy in each of the next two weeks! Our thanks goes to NVIDIA for supplying the Batman: Arkham Origins keys for this contest!!
No restrictions on winning, so good luck!!
Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | September 17, 2013 - 01:15 AM | Scott Michaud
Tagged: Steam Box, LinuxCon, Gabe Newell
Valve Software, as demonstrated a couple of days ago, still believes in Linux as the future of gaming platforms. Gabe Newell discussed this at LinuxCon this morning, which was streamed live over the internet (and which I transcribed after the teaser break at the bottom of the article). Someone decided to rip the stream, not the best quality but good enough, and put it on YouTube. I found it and embedded it below. Enjoy!
Gabe Newell highlights, from the seventh minute straight through to the end, why proprietary platforms look successful and how they (sooner-or-later) fail by their own design. Simply put, you can control what is on it. Software you do not like, or even their updates, can be stuck in certification or even excluded from the platform entirely. You can limit malicious software, at least to some extent, or even competing products.
Ultimately, however, you limit yourself by not feeding in to the competition of the crowd.
If you wanted to get your cartridge made you bought it, you know, FOB in Tokyo. If you had a competitive product, miraculously, your ROMs didn't show up until, you know, 3 months after the platform holder's product had entered market and stuff like that. And that was really where the dominant models for what was happening in gaming ((came from)).
But, not too surprisingly, open systems were advancing faster than the proprietary systems had. There used to be these completely de novo graphics solutions for gaming consoles and they've all been replaced by PC-derived hardware. The openness of the PC as a hardware standard meant that the rate of innovation was way faster. So even though, you would think, that the console guys would have a huge incentive to invest in it, they were unable to be competitive.
Microsoft attempts to exert control over their platform with modern Windows, which has been met by a year-over-year regression in PC sales; at the same time, PC gaming is the industry's hotbed of innovation and is booming as a result. In a time of declining PC hardware sales, Steam saw 76% growth (unclear, but it sounds like revenue) over last year.
Valve really believes the industry will shift toward a model with little divide between creator and consumer. The community has been "an order of magnitude" more productive than the actual staff of Team Fortress 2.
Does Valve want to compete with that?
This will only happen with open platforms. Even the consoles, with systems sold below parts and labor costs to exert control, have learned to embrace the indie developer. The next-gen consoles marketed indie developers, prior to launch, seemingly more than the industry behemoths, their own titles included. They have opened their platforms a little bit, but it might still not be enough to hold off the slow and steady advance of PC gaming, be it through Windows, Linux, or even web standards.
Speaking of which, Linux and web standards are often criticized for being fragmented. Gabe Newell, intentionally or unintentionally, claimed proprietary platforms are more fragmented. Open platforms have multiple bodies pushing and pulling the blob, but it all tends to flow in the same direction. Proprietary platforms are lean bodies with control over where they go, just many of them: you have a dominant platform and a few competitors for each sector, phones and tablets, consoles, desktops, and so forth.
He noted that each device has a web browser and that the browser, because the web is an open standard, is the most unified experience across devices of multiple sectors. Open fragmentation is small compared to the gaps between proprietary silos across sectors. ((As a side note: Windows RT is also designed to be one platform for all platforms but, as we have been saying for a while, you would prefer an open alternative to all RT all the time... and, according to the second and third paragraphs of this editorial, it will probably suffer from all of the same problems inherent to proprietary platforms anyway.))
Everybody just sort of automatically assumes that the internet is going to work regardless of wherever they are. There may be pluses or minuses of their specific environment but nobody says, "Oh I'm in an airplane now, I'm going to use a completely different method of accessing data across a network". We think that should be more broadly true as well. That you don't think of touch input or game controllers or living rooms as being things which require a completely different way for users to interact or acquire assets or developers to program or deliver to those targets.
Obviously if that is the direction you are going in, Linux is the most obvious basis for that and none of the proprietary, closed platforms are going to be able to provide that form of grand unification between mobile, living room, and desktop.
Next week we're going to be rolling out more information about how we get there and what are the hardware opportunities that we see for bringing Linux into the living room and potentially pointing further down the road to how we can get it even more unified in mobile.
Well, we will certainly be looking forward to next week.
Personally, for almost two years I found it weird how Google, Valve, and Apple (if the longstanding rumors were true) were each pushing wearable computing, living-room boxes (Steam Box/Apple TV/Google TV), and content distribution at the same time. I would not be surprised, in the slightest, if Valve added media functionality to Steam and Big Picture to secure a spot in the iTunes and Play Store market.
As for how wearables fit in? I could never quite figure that out but it always felt suspicious.
Subject: Editorial, General Tech | September 12, 2013 - 12:31 AM | Scott Michaud
Tagged: xbox one, valve, steam
I know there will be some comparison between the recent Steam Family Sharing announcement and what Microsoft proposed, to a flock of airborne tomatoes I might add, for the Xbox One. Steam integrates some level of copy discouragement through accounts, which tie a catalog of content to an individual. This account, user name and password, tends to be more precious to the licensee than a physical disc or a nondescript blob of bits.
The point is not to prevent unauthorized copying, however; the point is to increase sales.
Account information is used not just for authentication but to add value to the service. If you log in to your account from a friend's computer, you have access to your content, and it can be installed on their machine. This is slightly more convenient, given a fast internet connection, than carrying a DRM-free game on physical storage (unless, of course, paid licenses are revoked or something). Soon, friends will also be able to borrow your library when you are not using it, provided their devices are authorized by your account.
Microsoft has a similar authentication system through Xbox Live. The Xbox One also proposed a sharing feature, with the caveat that every device would need a small, few-kilobyte internet check-in once every 24 hours.
The general public went mental.
The debate (debacle?) between online sharing and online restrictions saw fans of the idea point to the PC platform and how Steam has similar restrictions. Sure, Steam has an offline mode, but it is otherwise just as restrictive; Valve gets away with it, Microsoft should too!
It is true, Microsoft has a more difficult time with public relations than Valve does with Steam. However, like EA with their Origin troubles, they have shown themselves to be less reliable than Valve over time. When a purchase is made on Steam, it has been kept available to the best of Valve's abilities. Microsoft, on the other hand, bricked the multiplayer and online features of each and every original Xbox title. Microsoft did a terrible job explaining how the policy benefits customers, and that is the declared reason for the backlash, but had they earned trust from their customers over the years then this might have just blown over. Even still, I find Steam Family Sharing to be a completely different situation from what we just experienced in the console space.
So then, apart from banked good faith, what is the actual difference?
Steam is not the only place to get PC games!
Games can be purchased at retail or through competing online services such as GOG.com. Customers who disagree with the Xbox One license have nowhere else to go. In the event that a game is available only with restrictive DRM, which many are, the publisher and/or developer holds responsibility. There is little stopping a game from being released, like The Witcher 3, DRM-free at launch, trusting the user to be ethical with their bits.
Unfortunately for the Xbox division, controlling the point of sale is how they expect to recover the subsidized hardware. Their certification and retail policies cannot be circumvented because that is their business model: lose some money acquiring customers who then have no choice but to give you money in return.
This is not the case on Windows, Mac, and Linux. It is easy to confuse Steam with "PC Gaming", however, due to how common it is. They were early, they were compelling, but most of all they were consistent. Their trust was earned and, moreover, is not even required to enjoy the PC platform.
Subject: Editorial, General Tech | August 6, 2013 - 06:30 PM | Ryan Shrout
Tagged: workshop, video, streaming, quakecon, prizes, live, giveaways
UPDATE: Did you miss the live event? Check out the replay below! Thanks to everyone that helped make it possible and see you next year!
It is that time of year again: another installment of the PC Perspective Hardware Workshop! Once again we will be presenting on the main stage at Quakecon 2013 being held in Dallas, TX August 1-4th.
Main Stage - Quakecon 2013
Saturday, August 3rd, 12:30pm CT
Our thanks go out to the organizers of Quakecon for allowing us and our partners to put together a show that we are proud of every year. We love giving back to the community of enthusiasts and gamers that drive us to do what we do! Get ready for 2 hours of prizes, games and raffles and the chances are pretty good that you'll take something out with you - really, they are pretty good!
Our thanks for this year's workshop logo goes to John Pastor!!
Our primary partners at the event are those that threw in for our ability to host the workshop at Quakecon and for the hundreds of shirts we have ready to toss out! Our thanks to NVIDIA, Western Digital and Corsair!!
If you can't make it to the workshop - don't worry! You can still watch the workshop live on our page right here as we stream it over one of several online services. Just remember this URL: http://pcper.com/workshop and you will find your way!
PC Perspective LIVE Podcast and Meetup
We are planning on hosting any fans that want to watch us record our weekly PC Perspective Podcast (http://pcper.com/podcast) on Wednesday evening in our meeting room at the Hilton Anatole. I don't yet know exactly WHEN it will happen or WHERE the room will be, but I will update this page accordingly on Wednesday, July 31st when we get the details. You might also consider following me on Twitter for updates on that status as well.
After the recording, we'll hop over to the hotel bar for a couple drinks and hang out. We have room for at least 50-60 people to join us, but we'll still be recording even if just ONE of you shows up. :)
Prize List (will continue to grow!)
Subject: Editorial, General Tech, Processors, Mobile | August 3, 2013 - 11:21 PM | Scott Michaud
Tagged: qualcomm, Intel, mediatek, arm
MediaTek, do you even lift?
According to a Taiwan media roundtable transcript discovered by IT World, Qualcomm has no interest, at least at the moment, in developing an octo-core processor. MediaTek, their competitor, recently unveiled an eight-core ARM System on a Chip (SoC) whose cores can all be utilized at once. Most other mobile SoCs with eight cores function as a fast quad-core paired with a slower, but more efficient, quad-core, with the most appropriate cluster chosen for the task.
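As a rough illustration of that quad-plus-quad arrangement, a scheduler routes light work to the efficient cluster and heavy work to the fast one. This is only a toy sketch; real cluster migration lives in the kernel, and the threshold and names below are invented:

```python
# Toy sketch of the quad-plus-quad arrangement described above: light work
# runs on the efficient cluster, heavy work on the fast one. Real cluster
# migration is far more nuanced; the 0.6 threshold is illustrative.
FAST = "fast quad-core cluster"
EFFICIENT = "efficient quad-core cluster"

def pick_cluster(load):
    """Choose a cluster for a task whose load is normalized to 0..1."""
    return FAST if load > 0.6 else EFFICIENT

for task, load in [("background sync", 0.1), ("page render", 0.8)]:
    print(f"{task:16s} -> {pick_cluster(load)}")
```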
Anand Chandrasekher of Qualcomm believes it is desperation.
So, I go back to what I said: it's not about cores. When you can't engineer a product that meets the consumers' expectations, maybe that’s when you resort to simply throwing cores together. That is the equivalent of throwing spaghetti against the wall and seeing what sticks. That's a dumb way to do it and I think our engineers aren't dumb.
The moderator, clearly amused by the reaction, requested a firm clarification that Qualcomm will not launch an octo-core product. A firm, but not clear, response was given: "We don't do dumb things". Of course they would not commit to swearing off eight cores for all eternity; at some point they may find core count to be their bottleneck, but that is not the case at the moment. They will also not discuss whether bumping the clock rate is the best option or whether they should focus on graphics performance. He is just assured that they are focused on the best experience for whatever scenario each product is designed to solve.
And he is assured that Intel, his former employer, still cannot catch them. As we have discussed in the past: Intel is a company that will spend tens of billions of dollars, year over year, to out-research you if they genuinely want to play in your market. Even with his experience at Intel, he continues to take them lightly.
We don't see any impact from any of Intel's claims on current or future products. I think the results from empirical testers on our products that are currently shipping in the marketplace is very clear, and across a range of reviewers from Anandtech to Engadget, Qualcomm Snapdragon devices are winning both on experience as well as battery life. What our competitors are claiming are empty promises and is not having an impact on us.
Qualcomm has a definite lead, at the moment, and may very well keep ahead through Bay Trail. AMD, too, kept a lead throughout the entire Athlon 64 generation and believed they could beat anything Intel could develop. They were complacent, much as Qualcomm sounds currently, and when Intel caught up AMD could not float above the sheer volume of money trying to drown them.
Then again, even if you are complacent, you may still be the best. Maybe Intel will never get a Conroe moment against ARM.
Subject: Editorial, General Tech | August 3, 2013 - 08:03 PM | Scott Michaud
Tagged: nvidia, CloudLight, cloud gaming
Trust the cloud... be the cloud.
The executives on stage might as well have waved their hands while reciting that incantation during the announcement of the Xbox One. Why not? The audience would have just assumed Don Mattrick was trying to get some weird Kinect achievement on stage. You know, kill four people with one laser beam while trying to sink your next-generation platform in a ranked keynote. 50 Gamerscore!
Microsoft stated, during and after the keynote, that each Xbox One would have access to cloud servers for certain processing tasks. Xbox Live would be receiving enough servers such that each console could access three times its performance, at launch, to do... stuff. You know, things that are hard to calculate but are not too dependent upon latency. You know what we mean, right?
Apparently Microsoft did not realize that was a detail they were supposed to sell us on.
In the meantime, NVIDIA has been selling us on offloading computation to cloud architectures. We already knew Global Illumination (GI) was a very complicated problem; most of the last couple of decades has been spent progressively removing approximations of what light truly does.
CloudLight is their research project, presented at SIGGRAPH Asia and via Williams College, to demonstrate server-processed indirect lighting. In their video, each of the three effects is demonstrated at multiple latencies. The results look pretty good until about 500ms, which is where the brightest points are noticeably in the wrong locations.
The three methods used to generate indirect lighting are: irradiance maps, where lightmaps are continuously calculated on a server and streamed via H.264; photons, which raytrace lighting for the scene as previous rays expire and stream only the most current ones to the clients that need them; and voxels, which stream fully computed frames to the clients. The most interesting part is that, as you add more users, server processing remains fairly constant in most cases.
It should be noted, however, that each of these demonstrations only moved the most intense lights slowly. I would expect that an effect such as switching on a light in an otherwise dark room would create a "pop-in" artifact if it lags too far behind user interaction or the instantaneous dynamic lights.
That said, for a finite number of instant switches, it would be possible for a server to render both results and have the client choose the appropriate lightmap (or the appropriate set of pixels from the same, large, lightmap). For an Unreal Tournament 3 mod, I was experimenting with using a Global Illumination solver to calculate lighting. My intention was to allow users to turn on and off a handful of lights in each team's base. As lights were shot out or activated by a switch, the shader would switch to the appropriate pre-rendered solution. I would expect a similar method to work here.
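A minimal sketch of that lightmap-selection idea: bake one solution per combination of toggleable lights, then pick the matching asset at runtime. All names are my own invention, not from the CloudLight paper or from UnrealScript:

```python
from itertools import product

# Toy sketch of the pre-rendered lighting idea above: bake one global
# illumination solution per on/off combination of switchable lights, then
# select the matching lightmap at runtime. A real version would live in
# the engine's shader/material system.
LIGHTS = ("base_main", "base_hall")  # the handful of switchable lights

# Pretend every on/off combination has a pre-baked lightmap asset.
lightmaps = {
    states: "lightmap_" + "_".join("on" if s else "off" for s in states)
    for states in product((False, True), repeat=len(LIGHTS))
}

def active_lightmap(state_by_light):
    """Return the pre-rendered solution matching the current light states."""
    key = tuple(state_by_light[name] for name in LIGHTS)
    return lightmaps[key]

print(active_lightmap({"base_main": True, "base_hall": False}))
# -> lightmap_on_off
```

Note the cost: the number of baked solutions doubles with every switchable light, which is why this only works for "a finite number of instant switches".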
What other effects do you believe can withstand a few hundred milliseconds of latency?
Subject: Editorial, General Tech | August 1, 2013 - 12:03 AM | Scott Michaud
Tagged: Privacy, mozilla, DNT
Mozilla Labs is researching a new approach to the problem of privacy and targeted advertising: allow the user to provide the data that honest advertisers intend to acquire via tracking behavior. The hope is that users who manage their own privacy will not have companies try to do it for them.
Internet users are growing concerned about how they are tracked and monitored online. Crowds rally behind initiatives, such as Do Not Track (DNT) and neutering the NSA, because of an assumed promise of privacy even if it is just superficial.
DNT, for instance, is a web developer tool permitting honest sites to be less shy when considering features which make privacy advocates poop themselves and go to competing pages. Users, who were not the intended audience of this feature, threw a fit because it failed to satisfy their privacy concerns. Internet Explorer, which is otherwise becoming a great browser, decided to break the standard by not providing the default, "user has not specified", value.
Of course, all this does is hand honest web developers a broken tool; immoral and arrogantly amoral sites will track anyway.
Mozilla Labs is currently investigating another solution. We could, potentially, at some point, see an addition to Firefox which distills all of the information honest websites would like to know into a summary which users could selectively share. This, much like DNT, will not prevent companies or other organizations from tracking you, but it gives most legitimate situations a fittingly legitimate alternative.
All of this data, such as history and geolocation, is already stored by browsers as a result of how they operate. This concept allows users to release some of this information to the sites they visit and, ideally, satisfy both parties. Maybe then, those who actually are malicious, cannot shrug off their actions as a common industry requirement.
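Conceptually, the released summary might look something like the sketch below. Mozilla had not published a format at the time, so every field name here is a hypothetical of my own:

```python
# Hypothetical sketch of a user-curated summary, the kind of distilled
# profile Mozilla Labs describes users selectively releasing. All field
# names are invented for illustration.
profile = {
    "interests": ["pc hardware", "gaming"],
    "region": "US",          # coarse location, not precise coordinates
    "share_interests": True,
    "share_region": False,   # the user keeps even coarse location private
}

def shared_view(p):
    """Return only the fields the user has opted to release to a site."""
    released = {}
    if p["share_interests"]:
        released["interests"] = p["interests"]
    if p["share_region"]:
        released["region"] = p["region"]
    return released

print(shared_view(profile))  # -> {'interests': ['pc hardware', 'gaming']}
```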
Subject: Editorial | July 30, 2013 - 11:08 AM | Tim Verry
Tagged: time warner, Internet, cable modem, cable isp
According to the Consumerist, cable ISP Time Warner Cable is increasing its modem rental fee to $5.99 per month. The company initially instituted a $3.95 rental fee in October of last year, and it is already ratcheting up the fees further for the same hardware people are already using.
The new rental fee will be $5.99 per month for most accounts (it seems the SignatureHome tier with TV is exempt from the fee), which works out to $71.88 per year. For comparison, the previous $3.95 fee came to $47.40 per year. As such, Time Warner Cable is looking at a healthy increase in revenue from the higher fee, which ISI analyst Vijay Jayant estimates at around an extra $150 million, according to Reuters. That's a lot of money, but even zooming in to the per-customer numbers, it is a large increase. In fact, at $71.88 per year in rental fees, it is now cheaper to buy a new DOCSIS 3 modem outright.
For example, the Motorola SB6121 is on TWC's approved modem list and sells for $69.45 on Amazon, which would save you about 20 cents a month in the first year and the entire rental fee in later years. I have been using this modem (albeit on Comcast) since February 2012 and I can easily recommend it. Things look even better in favor of buying your own modem if you are on a slower tier and can get by with a DOCSIS 2 modem, some of which can be purchased for around $40 used.
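The break-even math is straightforward; the fee and the SB6121 price are the figures quoted above, and the rest is simple arithmetic:

```python
# Break-even math for buying a modem versus renting at TWC's new fee.
# The $5.99 fee and the SB6121's $69.45 price are the figures quoted above.
monthly_fee = 5.99
modem_price = 69.45

print(f"Annual rental cost: ${12 * monthly_fee:.2f}")                 # $71.88
print(f"Break-even after:   {modem_price / monthly_fee:.1f} months")  # ~11.6
print(f"First-year savings: ${12 * monthly_fee - modem_price:.2f}")   # ~$2.43
```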
Alongside the fee increase announcement, Time Warner Cable claims that it has increased speeds on its most popular internet service tier and added additional Wi-Fi hotspots for customers over the past year. Unless you are on that "most popular" tier, though, it's hardly enough good news to outweigh the fee increase, and it is likely to only further the outrage of customers who have already attempted class action lawsuits over the $3.95 fee.
Fortunately, there are options to bring your own modem, and it is a simple process, whether you object to the fee increase on principle or just want to save a bit of money.
Subject: Editorial, General Tech | July 27, 2013 - 07:39 AM | Tim Verry
Tagged: streaming, media, google, chrome, chromecast, chrome os
Earlier this week, web search giant Google launched a new portable media streaming device called the Chromecast. The Chromecast is a small device, about the size of a large USB flash drive, with a full-size HDMI video output, a micro USB power jack, and Wi-Fi connectivity. The device runs Google's Chrome OS and is able to display or play back any web page or media file that the Chrome web browser can.
The Chromecast is designed to plug into televisions and stream media from the internet. Eventually, users will be able to "cast" embedded media files or web pages from a smartphone, tablet, or PC running Android, iOS, Windows, or Mac OS X with a Chrome web browser over to the Chromecast. The sending device points the Chromecast at the requisite URL where the streaming media or web page resides, along with any authorization tokens needed to access content behind a pay-wall or username/password login. From there, the Chromecast itself reaches out to the Internet over its Wi-Fi radio, retrieves the web page or media stream, and outputs it to the TV over HDMI. Playback controls remain accessible on the sending device, such as an Android smartphone, but it is the Chromecast itself that streams the media, unlike solutions such as wireless HDMI, AirPlay, DLNA, or Miracast. As such, the sending device is able to perform other tasks while the Chromecast handles the media streaming.
At launch, users will be able to use the Chromecast to stream Netflix, YouTube, and Google Play videos. At some point in the future, Google will be adding support for additional apps, including Pandora Internet radio streaming. Beyond that (and this feature is still in development), users will be able to share entire Chrome tabs with the Chromecast (some reports indicate that this tab sharing is done using the WebRTC standard). Users will need to download and install a Google Cast extension, which will put a button to the right of the URL bar that, when pressed, will "cast" the tab to the Chromecast, which will pull it up over its own internet connection and output it to the TV. When on a website that implements the SDK, users will have additional options for sharing just the video and using the PC as a remote, along with handy playback and volume controls.
Alternatively, Google is releasing a Chromecast SDK that will allow developers to integrate their streaming media with the Chromecast. Instead of needing to share an entire tab, web developers or mobile app developers will be able to integrate casting functionality that shares solely the streaming media with the Chromecast, similar to the upcoming ability to stream just the YouTube or Netflix video itself rather than the entire web page with the video embedded in it. Unfortunately, there is currently a caveat: developers must have all their apps (using the Chromecast SDK) approved by Google.
Sharing ("Casting") a Chrome web browser tab to a TV from a PC using the Chromecast.
It should be noted that Wired has reported success in using the tab sharing functionality to play back local media by directing Chrome to play locally-stored video files, but this is not a perfect solution, as Chrome can play back only a limited number of formats in a window and audio sync proved tricky at times. With that said, the Chromecast is intended to be an Internet streaming device, and Google is marketing it as such, so it is difficult to fault it for local streaming issues. There are better solutions for getting the most out of your LAN-accessible media, after all.
The Chromecast is $35 and will ship as soon as August 7, 2013 from the Google Play Store. Amazon and Best Buy had stock listed on their websites until yesterday, when both e-tailers sold out (though you might be lucky enough to find a Chromecast at a brick-and-mortar Best Buy store). For $35, you get the Chromecast itself, a rigid HDMI extender that brings the Chromecast closer to the edge of the TV to make installation and removal easier, and a USB power cord. Google was initially also offering 3 free months of Netflix Instant streaming but has since backed away from the promo due to overwhelming demand; if Google can continue to sell out of Chromecasts without paying for Netflix on each unit, it is going to do that to bolster the margin on the inexpensive gadget, despite the PR hit (or at least disappointed buyers).
The Chromecast does have its flaws, and the launch was not perfect (many OS support and device features are still being worked on), but at $35 it is a simple impulse buy on a device that should only get better from here as the company further fleshes out the software. Even on the off-chance that Google abandons the Chromecast, it can still stream Netflix, YouTube, and Google Play for a pittance.
Subject: Editorial | July 18, 2013 - 01:34 AM | Josh Walrath
Tagged: silvermont, quarterly results, money, Lenovo, k900, Intel, atom, 22 nm tri-gate, 14 nm
Intel announced their Q2 results for this year, and they did not quite meet expectations. When I say expectations, I usually mean "make absolutely obscene amounts of money". It seems that Intel was just shy of estimates and margins were only slightly lower than expected. That being said, Intel reported revenue of $12.8 billion US and a net income of $2 billion US. Not… too… shabby.
Analysts were of course expecting higher, but it seems as though the PC slowdown is in fact having a material effect on the market. Intel cut estimates earlier this quarter, so this was not exactly a surprise. Margins came in around 58.3%, but these are expected to recover going into Q3. Intel is certainly still in a strong position, as millions of PCs are shipped every quarter and it remains the dominant CPU maker in that market.
Intel has been trying to get into the mobile market as it still exhibits strong growth not only now, but over the next several years as things become more and more connected. Intel had ignored this market for some time, much to their dismay. Their Atom based chips were slow to improve and typically used a last generation process node for cost savings. In the face of a strong ARM based portfolio of products from companies like Qualcomm, Samsung, and Rockchip, the Intel Atom was simply not an effective solution until the latest batch of chips were available from Intel. Products like the Atom Z2580, which powers the Lenovo K900 phone, were late to market as compared to other 28 nm products such as the Snapdragon series from Qualcomm.
Intel expects the next generation of Atom, Silvermont, built on its 22 nm Tri-Gate process, to be much more competitive with the latest generation offerings from its ARM based competitors. Unfortunately for Intel, we do not expect to see Silvermont based products until later in Q3, with availability in late Q4 or Q1 2014. Intel needs to move chips, but this will be a very different market than what they are used to. These SoCs have decent margins, but they are nowhere near what Intel can do with their traditional notebook, desktop, and server CPUs.
To help cut costs going forward, it seems as though Intel will be pulling back on its plans for 14 nm production. Expenditures and floor space/equipment for 14 nm will be cut back compared to previous plans. Intel is still hoping to start 14 nm production at the end of this year, with the first commercial products hitting at the end of 2014. There are questions as to how viable 14 nm will be as a fully ramped process in 2014. Eventually 14 nm will work as advertised, but it appears the kinks are much more complex than anticipated, given how quickly Intel ramped 22 nm.
Intel has plenty of money, a dominant position in the x86 world, and a world-class process technology on which to base future products. I would say that they are still in very, very good shape. The market is ever changing, and Intel is still fairly nimble given their size. They also recognize (albeit sometimes a bit later than expected) shifts in the marketplace, and they invariably craft a plan of attack which addresses their shortcomings. While Intel's revenue seems to have peaked last year, they are addressing new markets aggressively as well as holding onto their dominant position in the notebook, desktop, and server markets. Intel expects Q3 to be up, but overall 2013 sales to be flat compared to 2012. Have I mentioned they still cleared $2 billion in a down quarter?
Subject: Editorial, General Tech, Systems | July 15, 2013 - 06:09 AM | Scott Michaud
Tagged: xbox, xbox one
Two weeks have passed since Steve Ballmer informed all Microsoft employees that Don Mattrick would disembark and pursue a career at Zynga, for one reason or another. Initially, Ballmer himself was set to fill the void for an uncertain amount of time, further unsettling the upcoming Xbox One launch without a proper manager to oversee it. His reign was cut short, best measured in days, when he appointed Julie Larson-Green as the head of Microsoft Devices and Studios.
... because a Christmas gift without ribbon would just be a box... one X box.
Of course the internet then erupted with anxiety: some concerns reasonable, even more of them (predictably) inane. Larson-Green has a long list of successfully shipped products to her name but, apart from the somewhat cop-out answer of Windows 7, nothing which resonates with gamers. Terrible sexism and similar embarrassments boiled over in the gaming community, but crazies will always be crazy, especially those adjacent to Xbox Live subscribers.
Operating Systems will be filled by Terry Myerson, who rose to power from the Windows Phone division. This could be a sign of things to come for Windows, particularly as Microsoft continues to push for convergence between x86, RT, and Phone. I would not be surprised to see continued pressure from Microsoft to ingrain Windows Store, and all of its certification pros and woes, into each of their operating systems.
As for Xbox, while Julie is very user experience (UX)-focused, division oversight passed to her long after the flagship product's high-level plans were defined. If Windows 7 is any indication, she might not stray too far from what was laid out prior to her arrival; likewise, if Windows 8 is any indication, a drastically new direction could spring up without notice.