Subject: Editorial, General Tech | February 25, 2014 - 03:43 PM | Scott Michaud
Tagged: titanfall, ssd
UPDATE (Feb 26th): Our readers pointed out in the comments, although I have yet to test it, that you can change Origin's install-to directory before installing a game to keep it on a separate hard drive from the rest. It is not as easy as Steam's method, but it apparently works for games like this that you want somewhere else. I figured Origin would forget games left in the old directory, but apparently not.
Well, okay. Titanfall will require a significant amount of hard drive space when it is released in two weeks. Receiving the game digitally will push 21GB of content through your modem and unpack to 48GB. Apparently, the next generation has arrived.
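As a rough sanity check on those figures, here is the arithmetic: the 21GB and 48GB sizes are from the announcement, while the connection speeds below are illustrative assumptions, not measurements.

```python
# Rough arithmetic on Titanfall's digital download. Sizes are from the
# announcement; the connection speeds are illustrative assumptions.
download_gb = 21.0   # compressed download size
installed_gb = 48.0  # unpacked install size

ratio = installed_gb / download_gb
print(f"Unpacks to roughly {ratio:.1f}x the download size")

# Estimated download time at a few common connection speeds (megabits/s).
for mbps in (5, 25, 100):
    hours = (download_gb * 8 * 1000) / (mbps * 3600)
    print(f"{mbps:4d} Mbps: about {hours:.1f} hours")
```

On a 5 Mbps connection, that 21GB works out to better than nine hours of downloading, which puts the "next generation has arrived" quip in perspective.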
Honestly, I am not upset over this. Yes, this basically ignores customers who install their games to SSDs. Origin, at the moment, forces all games to be installed in a single directory (although that directory can be anywhere), unlike Steam, which allows games to be sent individually to multiple folders. It would be a good idea to keep those customers in mind... but not at the expense of the game itself. As always, both "high-end" and "unoptimized" titles have high minimum specifications; we decide which label applies by considering how effectively the performance is used.
That is something that we will need to find out when it launches on March 11th.
Subject: Editorial, General Tech | February 25, 2014 - 03:34 PM | Jeremy Hellstrom
Tagged: ddr3, Kaveri, A10 7850K, amd, linux
You don't often see performance scaling as clean as what Phoronix saw when testing the effect of memory speed on AMD's A10-7850K. Pick any result and you can clearly see a smooth increase in performance from DDR3-800 to DDR3-2400. The only place the increase tapers off is between DDR3-2133 and DDR3-2400, with some tests showing little to no gain between those two speeds. Other tests do still improve, so for certain workloads on Linux the extra money is worth it; in other cases you can save a few dollars and settle for the slightly cheaper DDR3-2133. Check out the full review here.
"Earlier in the week I published benchmarks showing AMD Kaveri's DDR3-800MHz through DDR3-2133MHz system memory performance. Those results showed this latest-generation AMD APU craving -- and being able to take advantage of -- high memory frequencies. Many were curious how DDR3-2400MHz would [fare] with Kaveri so here's some benchmarks as we test out Kingston's HyperX Beast 8GB DDR3-2400MHz memory kit."
Here are some more Memory articles from around the web:
- DDR3-800MHz To DDR3-2133MHz Memory Testing With AMD's Kaveri @ Phoronix
- G.SKILL Ripjaws 8GB 2133MHz DDR3L SO-DIMM Memory Kit Review @ Legit Reviews
- Patriot Viper 3 16GB 2400MHz Memory Kit @ eTeknix
- Team Group Vulcan 8GB 2400MHz C10 Memory Kit @ eTeknix
- Patriot Viper 8GB DDR3-2133 C11 (Low Profile) Memory @ Funky Kit
- G.SKILL Ripjaws 1866MHz 8GB DDR3L SO-DIMM Memory Kit Review @ Legit Reviews
Subject: Editorial, General Tech | February 19, 2014 - 06:15 PM | Scott Michaud
Tagged: bioshock infinite
The team behind the original BioShock and BioShock: Infinite decided to call it quits. After seventeen years, depending on where you start counting, the company dissolved to form another, much smaller studio. Only about fifteen employees will transition to the new team. The rest are provided financial support, given a bit of time to develop their portfolios, and can attend a recruitment day to be interviewed by other studios and publishers. They may also be offered employment elsewhere in Take-Two Interactive.
The studio formed by the handful of remaining employees will look to develop games based on narrative, which is definitely their strength. Each game will be distributed digitally and Take-Two will continue to be their parent company.
While any job loss is terrible, I am interested in the future project. BioShock: Infinite sold millions of copies, but I wonder if its size ultimately caused it harm. It was pretty and full of detail, at the expense of requiring a large team. The game had a story which respected your intelligence: you might not understand it, and that was okay. Still, I have little confidence that it was anywhere close to the team's original vision. From budget constraints to the religious beliefs of development staff, we already know about several aspects of the game that changed significantly. Even Elizabeth, according to earlier statements from Ken Levine, was on the bubble because of her AI's complexity. I can imagine how difficult it is to resist those changes when staring at man-hour budgets. I cannot, however, imagine BioShock: Infinite without Elizabeth. A smaller team might help them concentrate their effort where it matters and keep the artistic vision from becoming too diluted.
As for BioShock? The second part of the Burial at Sea DLC is said to wrap up the entire franchise. 2K will retain the license if they want to release sequels or spin-offs. I doubt Ken Levine will have anything more to do with it, however.
It wouldn’t be February if we didn’t hear the Q4 FY14 earnings from NVIDIA! NVIDIA does have a slightly odd way of expressing their quarters, but in the end it is all semantics. They are not in fact living in the future, but I bet their product managers wish they could peer into the actual Q4 2014. No, the whole FY14 thing relates back to when they made their IPO and how they started reporting. To us mere mortals, Q4 FY14 actually represents Q4 2013. Clear as mud? Lord love the Securities and Exchange Commission and their rules.
The past quarter was a pretty good one for NVIDIA. They came away with $1.144 billion in gross revenue and a GAAP net income of $147 million, beating the Street's estimate by a pretty large margin. In response, NVIDIA's stock rose in after-hours trading. This has certainly been a trying year for NVIDIA and the PC market in general, but they seem to have come out on top.
NVIDIA beat estimates primarily on the strength of the PC graphics division. Many were focusing on the apparent decline of the PC market and assumed that NVIDIA would be dragged down by lower shipments. On the contrary, it seems as though the gaming market and add-in sales on the PC helped to solidify NVIDIA’s quarter. We can look at a number of factors that likely contributed to this uptick for NVIDIA.
Subject: Editorial, General Tech, Systems | February 12, 2014 - 10:45 PM | Scott Michaud
Tagged: xbox, xbone, ps4, Playstation, pc gaming
PCMag, your source for Apple and gaming console coverage (I joke), wrote up an editorial about purchasing a gaming console. Honestly, they should have titled it, "How to Buy a Game Device" since they also cover the NVIDIA SHIELD and other options.
The entire Console vs PC debate bothers me, though. Neither side handles it well.
I will start by highlighting problems with the PC side, before you stop reading. Everyone says you can assemble your own gaming PC to save a little money. Yes, that is true and it is unique to the platform. The problem is that the public vision then becomes, "You must assemble and maintain your own gaming PC".
No. No. No.
Some people prefer the support system provided by the gaming consoles. If one bricks, which some of them do frequently, you can call up the manufacturer for a replacement in a few weeks. The same can absolutely be true for a gaming PC. There is nothing wrong with purchasing a computer from a system builder, ranging from Dell to Puget Systems.
The point of the gaming PC is that you do not need to. You can also deal with a small business. For Canadians, if you purchase all of your hardware through NCIX, you can add $50 to your order for them to ship your parts as a fully assembled PC, with Windows installed (if purchased). You also get a one-year warranty. The downside is that you lose the ability to pick and choose components from other retailers, and you cannot reuse your old parts. Unfortunately, I do not believe NCIX USA offers this. Some local stores may offer similar benefits, though; one in my area assembled PCs for free.
The benefit of the PC is choice. You can assemble it yourself (or with a friend). You can have a console-like experience with a system builder. You can also have something in between with a small business. It is your choice.
Most importantly, your choice of manufacturer does not restrict your choice in content.
As for the consoles, I cannot find a rock-solid argument that they will always be better. If you are thinking about purchasing one, the available content should sway your decision. Microsoft will be the place to get "Halo". Sony will be the place to get "The Last of Us". Nintendo will be the place to get "Mario". Your money should go where the content you want is. That, and wherever your friends play.
But, of course, then you are part of what makes the content exclusive.
Note: Obviously the PC has issues with proprietary platforms, too. Unlike the consoles, it could also be a temporary issue. The PC business model does not depend upon Windows. If it remains a sufficient platform? Great. If not, we have multiple options which range from Linux/SteamOS to Web Standards for someone to develop a timeless classic on.
Subject: Editorial, General Tech, Graphics Cards | January 22, 2014 - 02:12 AM | Scott Michaud
Tagged: linux, intel hd graphics, haswell
Looking through this post by Phoronix, it would seem that Intel had a significant regression in performance on Ubuntu 14.04 with the Linux 3.13 kernel. In some tests, HD 4600 only achieves about half of the performance recorded on the HD 4000. I have not been following Linux iGPU drivers and it is probably a bit late to do any form of in-depth analysis... but yolo. I think the article actually made a pretty big mistake and came to the exact wrong conclusion.
Let's do this!
According to the article, in Xonotic v0.7, Ivy Bridge's Intel HD 4000 scores 176.23 FPS at 1080p on low quality settings. When you compare this to Haswell's HD 4600 and its 124.45 FPS result, this seems bad. However, even though they claim this as a performance regression, they never actually post earlier (and supposedly faster) benchmarks.
So I dug one up.
Back in October, the same test was performed with the same hardware. The Intel HD 4600 was not significantly faster back then; it was actually a bit slower, with a score of 123.84 FPS. The Intel HD 4000 managed 102.68 FPS. Haswell did not regress between that time and Ubuntu 14.04 on Linux 3.13; rather, Ivy Bridge received a 71.63% increase over the same period.
Of course, there could have been a performance increase between October and now and that recently regressed for Haswell... but I could not find those benchmarks. All I can see is that Haswell has been quite steady since October. Either way, that is a significant performance increase on Ivy Bridge since that snapshot in time, even if Haswell had a rise-and-fall that I was unaware of.
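The arithmetic behind that conclusion is easy to check directly. The FPS figures below are the ones quoted above from the two Phoronix snapshots:

```python
# Xonotic v0.7, 1080p low quality -- FPS figures quoted in the post.
hd4000_october = 102.68  # Ivy Bridge HD 4000, October snapshot
hd4000_current = 176.23  # Ivy Bridge HD 4000, Ubuntu 14.04 / Linux 3.13
hd4600_october = 123.84  # Haswell HD 4600, October snapshot
hd4600_current = 124.45  # Haswell HD 4600, Ubuntu 14.04 / Linux 3.13

def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

print(f"HD 4000: {pct_change(hd4000_october, hd4000_current):+.2f}%")
print(f"HD 4600: {pct_change(hd4600_october, hd4600_current):+.2f}%")
```

Ivy Bridge comes out at +71.63%, matching the figure above, while Haswell moves less than a percent: not a Haswell regression, but an Ivy Bridge improvement.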
Subject: Editorial, General Tech | January 20, 2014 - 11:35 PM | Scott Michaud
Tagged: valve, virtual reality
Steam Dev Days was last week. At it, Valve announced a redesign of their Steam Controller and the removal of Steam Greenlight, among other notables. This was a press-free event, officially. Of course, due to Twitter and other social media platforms, everyone can decide to be a journalist on a whim. Things are going to leak out.
Other things are going to be officially released, too.
Michael Abrash held a speech at the event discussing his virtual reality initiative within Valve. Both it and the Steam Machine project were in question when the company let Jeri Ellsworth and several other employees go. After SteamOS was announced and castAR, Jeri's project at Valve, had its Kickstarter, it was assumed that Valve gave up on augmented reality. Despite this, they still kept Michael Abrash on staff.
I would speculate, completely from an outside position, that two virtual reality groups existed at one point (at least to some extent). The project seems to have been sliced into two parts, one leaving with Jeri and one continuing with Michael. I seriously doubt this had anything to do with the "High School Cliques" that Jeri was referring to, however. She said it was "longtime staff" (Michael was newly hired around the end of Portal 2's development) and not within her hardware team.
These are the specs that Valve has developed prototypes to.
1K x 1K per eye is about 100x less than they would like, however.
Ooo... 100 megapixels per eye.
I just believe it all shook out to an unfortunate fork in the project.
Politics aside, Michael Abrash sees virtual reality affecting "the entire entertainment industry" and believes it will be well supported by Steam. I hope this means that Valve will finally drop the hammer on music and movie distribution. I have been expecting this ever since the Steam infrastructure was upgraded back in July 2011. Of course, neither servers nor software will solve content availability, but I am still expecting them to take a shot at it. Remember that Valve is creating movies; could they have plans for virtual reality content?
The latest prototype of the Oculus Rift uses camera tracking for low-latency visibility.
This looks like Valve's solution.
The PDF slide deck is publicly available and each page includes the script he heavily followed. Basically, reading this is like being there, just less fun.
Subject: Editorial, General Tech, Graphics Cards, Processors, Memory, Systems | January 20, 2014 - 02:40 AM | Scott Michaud
Tagged: corsair, overclocking
I rarely overclock anything, and this is for three main reasons. The first is that I have had an unreasonably bad time with computer parts failing on their own; I did not want to tempt fate. The second was that I focused on optimizing the operating system and its running services, which was mostly important during the Windows 98, Windows XP, and Windows Vista eras. The third is that I did not find overclocking valuable enough for the performance you gained.
A game that is too hefty to run is probably not an overclock away from working.
Thankfully this never took off...
Today, overclocking is easier and safer than ever with parts that basically do it automatically and back off, on their own, if thermals are too aggressive. Several components are also much less locked down than they have been. (Has anyone, to this day, hacked the locked Barton cores?) It should not be too hard to find a SKU which encourages the enthusiast to tweak some knobs.
But how much of an increase will you see? Corsair has been blogging about using their components (along with an Intel processor, Gigabyte motherboard, and EVGA graphics card, because they obviously do not make those) to overclock. The cool part is they break down performance gains in terms of raising the frequencies for just the CPU, just the GPU, just the RAM, or all of the above together. This breakdown shows how each of the three categories contributes to the whole. While none of the overclocks are dramatic, Corsair is probably proud of the 5% jump in Cinebench OpenGL performance achieved just by overclocking the RAM from 1600 MHz to 1866 MHz, without touching the CPU or GPU.
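To put that RAM result in context, compare the size of the frequency bump to the realized gain. The frequencies and the roughly 5% Cinebench OpenGL improvement are from Corsair's post; the "scaling efficiency" framing is just a back-of-the-envelope comparison, not something Corsair published.

```python
# How much of the RAM frequency bump shows up as real performance?
# Frequencies and the ~5% Cinebench OpenGL gain are from Corsair's post;
# the efficiency ratio below is my own back-of-the-envelope framing.
base_mhz, oc_mhz = 1600, 1866
freq_gain = (oc_mhz - base_mhz) / base_mhz * 100  # percent frequency increase
perf_gain = 5.0                                   # reported improvement, percent

print(f"Frequency increase: {freq_gain:.1f}%")
print(f"Realized gain:      {perf_gain:.1f}%")
print(f"Roughly {perf_gain / freq_gain:.0%} of the frequency bump shows up")
```

A 16.6% clock bump yielding a 5% real gain is actually a respectable return for memory, which usually scales far below linearly.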
It is definitely worth a look.
Subject: Editorial, General Tech | January 19, 2014 - 11:46 PM | Scott Michaud
Tagged: Keyboards, keyboard
Peter Bright down at Ars Technica wrote an editorial about the Lenovo ThinkPad X1 Carbon. His opinion is that keyboard developers should innovate in ways that do not "undermine expectations". Replacing a row of physical keys with a software-controlled touch strip is destructive because, even if the change proved invaluable, it would ultimately be inferior because it clashes with every other keyboard the user encounters. He then concludes with a statement that really should have driven his thesis.
Lenovo's engineers may be well-meaning in their attempts to improve the keyboard. But they've lost a sale as a result. The quest for the perfect laptop continues.
That is the entire point of innovation! You may dislike how a feature interacts with your personal ecosystem, and that will drive you away from the product. Users who purchased the laptop without considering the keyboard have the option of returning it and writing reviews for others (or simply putting up with it). Users who purchased the laptop because of the keyboard are happy.
My main disagreement with the article is its claim that it is impossible to innovate the keyboard in any way that affects the core layout. I disagree for two reasons.
My first issue is how vague he is. His primary example of good keyboard innovation is the IBM ThinkPad 701c and its "butterfly keyboard". The idea was to expand the keyboard beyond the width of the laptop itself to make it more conventional. Conventional for whom? How many people primarily use small laptops with shrunken keyboards, compared to people who touch-type function keys?
The second critique follows from the first. The PC industry became so effective because every manufacturer tries to be a little different with certain SKUs to gain tiny advantages. There could just as easily have been a rule against touchscreen computers. Eventually someone hit it out of the park and found an implementation that was wildly successful with a gigantic market. The QWERTY design has weathered the storm for more than a century, but there is no rule that it cannot shift in the future.
In fact, at some point, someone decided to add an extra row of function keys. That certainly could have undermined the expectations of users who had to move between computers and electronic typewriters.
It will be tough, though. Keyboards have settled down and learning their layouts is a significant mental investment. There are several factors to consider when it comes to how successful a keyboard modification will become. Mostly, however, it will come down to someone trying and observing what happens. Do not worry about letting random ideas in because the bad ideas will show themselves out.
Basically the point is: never say never (especially not that vaguely).
Subject: Editorial, General Tech | January 11, 2014 - 12:13 AM | Scott Michaud
Tagged: SimCity, ea
Maxis and Electronic Arts recognize that a hefty portion of SimCity's popularity as a franchise is due to its mod community, and the current version could use all the help it can get after its unfortunate first year. They have finally let the community take over... to some extent. EA is imposing certain rules upon content creators. Most of them are reasonable. One of them could have unforeseen consequences for the LGBTQ community. And the first rule should really apply to EA's own expansion packs.
Starting at the end, the last three rules (#3 through #5) are mostly reasonable. They protect EA against potential malware and breaches of their EULA and Terms of Service. The fifth rule does begin to dip its toe into potential censorship but it does not really concern me.
No-one can be "Best Friends" in North America.
The second rule, while mostly condemning illegal activity, does include the requirement that content remain within ESRB E10+ and PEGI 7. The problem with any content certification is that it limits the dialog between artists and society. In this case, references to same-sex topics in games (e.g., Harvest Moon) may force a minimum T or M rating. A mod which introduces some story element where two Sims of the same gender go on a date or live together (again, like Harvest Moon) might result in interest groups rattling the ESRB's cage until EA draws a firm line on that specific topic.
EA has a good track record with the LGBTQ community, but this could get unnecessarily messy.
The first rule is a different story. It says that mods which affect the simulation for multiplayer games or features are not allowed (despite multiplayer being the game's only official mode). They do not want a modification to give players an unfair advantage over the rest of the game's community.
You know, like maybe an airship which boosts "your struggling industry or commercial [districts]" and also brings in tourists and commuters without causing traffic on your connecting highway?
Maxis is still, apparently, exploring options for offline SimCity experiences. Even if they only added a server preference that does not affect the global economy, mods could be quarantined to those areas. Great, problem solved. Instead, what is allowed is somewhat left up to interpretation. To make matters worse, the current examples of mods that we have are purely cosmetic.
SimCity is nowhere near as bad as Halo 2 Vista for its mod functionality (those mod tools were so hobbled that its own tutorial was impossible). It could actually be good. These are just areas for EA to consider and, hopefully, reconsider.