The Dirty Laggard
It may seem odd, but reviewers are sometimes among the last folks to adopt new technology. This has been the case for me many a time. Yes, we get some of the latest and greatest components, but often we review them and then keep them on the shelf for comparative purposes, all the while our personal systems run last-generation parts that we will never need to re-integrate into a test rig again. In other cases, big-money parts, like the one 30” 2560x1600 LCD that I own, are always being utilized on the testbed and never actually used for things like browsing, gaming, or other personal activities. Don’t get me wrong, this is not a “woe-is-me” rant about the hardships of being a reviewer, but rather an interesting side effect not often attributed to folks who do this type of work. Yes, we get the latest to play with and review, but we don’t often actually use these new parts in our everyday lives.
One of the technologies that I had only ever seen at trade shows is Eyefinity. It was released back in the Fall of 2009 and really gained momentum in 2010. Initially it was incompatible with Crossfire technology, which limited it to a great degree. A single HD 5970 card could push 3 x 1920x1080 monitors in most games, but usually only with details turned down and no AA enabled. Once AMD worked a bit more on the drivers, we were able to see Crossfire setups working in Eyefinity, which allowed users to play games at higher fidelity with the other little niceties enabled. The release of the HD 6900 series of cards also proved to be a boon to Eyefinity, as these new chips had much better Crossfire scaling and were also significantly faster than the earlier HD 5800 series at those price points.
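To put that three-monitor load in perspective, here is a quick back-of-the-envelope pixel count (a sketch assuming the standard 3x1 landscape Eyefinity arrangement described above):

```python
# Pixel-count comparison: one 1080p display vs. a 3x1 Eyefinity group.
single_display = 1920 * 1080          # 2,073,600 pixels
eyefinity_group = 3 * single_display  # 6,220,800 pixels (a 5760x1080 surface)

# The GPU must shade roughly three times the pixels every frame,
# which is why a single card needed reduced detail and no AA.
print(eyefinity_group // single_display)  # -> 3
```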
Continue on to the rest of the story for more on my experiences with AMD Eyefinity.
Subject: Editorial, General Tech, Systems | July 11, 2011 - 05:57 PM | Scott Michaud
Tagged: xbox, pc gaming
Last week we reported on Microsoft rolling their Games for Windows initiative into Xbox.com, and I essentially said that unless Microsoft is trying to roll their now-established Xbox brand into Windows, they are missing the point of PC gaming. This week we hear rumors that, in fact, Microsoft may be trying to roll their now-established Xbox brand into Windows. According to Insideris, Windows 8 will allow you to play Xbox 360 games on your PC. That said, despite the speculation this news has prompted, the report does not state whether the complete catalog or only a subset of 360 games will be compatible with the PC.
Which came first? The console or the Newegg?
What does this mean for PC gaming? I am unsure at this point. A reasonable outcome would be that Xbox becomes a user-friendly brand for Microsoft’s home theatre PC initiatives, which would make a whole lot more sense of the Windows 8 interface outside of the tablet PC space. That would be a very positive outcome for the videogame industry as a whole, since it offers the best of Xbox for those who desire it and the choice of the PC platform.
This, however, runs counter to Microsoft’s excessively strict stance on closed and proprietary video gaming platforms. Could Microsoft have been pushing their proprietary platform to gut the industry norms, knowing that at some point they would roll back into the long-standing, more-open nature of Windows? Could Microsoft be attempting to lock down PCs, meeting somewhere in the middle? We will see, but my hope is that the industry will finally move its art away from proprietary platforms. After all, why have a timeless classic if your platform will reach end-of-life in a half-dozen years at best?
Subject: Editorial, General Tech | July 8, 2011 - 12:29 AM | Scott Michaud
Tagged: mod, battlefield 3
The Battlefield franchise has had a somewhat indecisive history with the mod community. Battlefield 2 was developed in part by a mod team for the first game, Battlefield 1942, and mod tools were provided for several of the franchise’s releases. Recently DICE shifted their focus onto the console spinoff, Bad Company. While the second game in that spinoff was also released for the PC, neither featured mod tools. Now that DICE has returned to the original canon with Battlefield 3, there were hopes that mod tools would return with the franchise, but according to DICE that is not the case.
These tools are hard, just look at the destructibility, you wouldn’t like it…
German gaming site GameStar met up with DICE CEO Patrick Soderlund to discuss Battlefield 3. Soderlund answered an array of questions from the community about the Bad Company 2 friends list, alternatives to the commander mode, and the potential future of Mirror’s Edge. When questioned about mod tools, Soderlund did not rule out the possibility of them appearing in the future, but he might as well have done so. He contends that Frostbite 2 is too difficult for modders to deal with (which historically means: “the tools barely work for us, and we are not going through the effort to polish them for public use”).
Surprisingly, to those who know me, I can agree with DICE’s stance on the issue. If your mod tools do not meet the level of polish you require for release, then do not release them; provided, of course, you do not actively harm the creation of mods. That said, the mod community is what will keep your game flowing with new content for a little upfront cost. If your game’s tail is shorter than you anticipated, this should be the first place to look.
Subject: Editorial, General Tech | July 7, 2011 - 04:06 AM | Scott Michaud
Tagged: starcraft 2, bumpday
This week the second season of Starcraft 2 came to a close, immortalizing whatever rank you happened to hold just prior to the lock. This is the second league freeze that Starcraft 2 has undergone since it launched last July. Starcraft 2 was a highly anticipated launch for Blizzard due to the twelve-year gap between it and its predecessor, Starcraft. Our forums were just one of many places where rabid fans frothed over Starcraft, and I believe we deserve to nostalgia over it.
The calendar says 7/7/2011… … Bumpedaday??
Back in early 2001, just over ten years ago, ccrazy1263 created a very energetic forum thread questioning the coming of a new Starcraft. Blizzard would not even announce the existence of a new Starcraft until over six years later. One thing that comes with Starcraft is that players will constantly tell you their strategies, good or bad, and there were a few instances of that in this thread. In closing, the best advice to be had from the thread is short and sweet: don’t play Hunter or Lost Temple against Hacker. They’ll kill your expansion.
Subject: Editorial, General Tech | July 5, 2011 - 02:06 PM | Scott Michaud
Tagged: windows, games
So I am a very avid supporter of PC gaming: I do not feel that I should pay license fees to a company to actively limit me. I feel that if people keep asking me whether HD-DVD or Blu-ray will win, then there is no reason for eight gaming platforms to co-exist (more if you include Mac, iOS, Android, etc.). I feel that instead of relying on a cardboard coffin that will arrive in 1-2 business weeks, you or your friend should be allowed to fix your own hardware… or choose your own local small-business computer store to deal with in person. I feel that it is much better to buy an extra GTX 560 every four years and have money left over for a meal than to pay for a multiplayer subscription pass that does not even let you run your own dedicated servers with admins (what cheaters? BAN!). So you can guess my reaction when I saw Microsoft roll Games for Windows Marketplace into Xbox.com.
Underlined for your convenience.
Now do not get me wrong, I was never much of a fan of Games for Windows to begin with. Microsoft’s attempt to rid the PC of its stability stereotype was to push certification for all content on Games for Windows Live, which worked very well for Fallout: New Vegas on the Xbox 360. Ironically, the PC version was much more stable just after launch because the patch was stuck in certification on Xbox during the busy holiday season (lols!) The biggest problem with forcing certification is that it would include mods as well (video, 3:10), and that is not something mod developers could really afford. Halo 2 Vista was one such Games for Windows Live game whose mod tools were so neutered that the provided tutorial level was impossible to create, because jump pad assets were neither provided nor able to be created.
Still, it seems odd to me for Microsoft to push so feverishly to roll PC gaming into the Xbox brand when other initiatives like Steam are already showing the entire industry how to do things mostly right. It is possible that at some point Microsoft could roll Direct(X)box back into Windows and simply create a canon home theatre PC (if Valve does not eat that lunch too); but if their plan is to merge Windows into Xbox, then they are completely missing the point of why we would rather play a PC game with an Xbox 360 controller: because we choose to.
Subject: Editorial, General Tech | June 29, 2011 - 11:55 PM | Scott Michaud
Tagged: far cry, bumpday
This week Crytek released their DirectX 11 update to Crysis 2 and immediately hushed the crowing of GPUs everywhere. We have gone through the various advancements in fairly great detail, but that was not the first time our GPUs came Crying to us about the monster that goes Polybump in the night. No, our forums were aflame even before Crysis, back when Far Cry was about to reach its beta phase. I guess it is time to bump it up in our memory.
Back in early 2004 our forum-goers rallied around now-defunct pre-beta screenshots of the best graphics they had ever seen… in 2004. People were surprised at the novelty of having mutant zombies in your game… in 2004. People gathered to play with their fellow amdmb members… in 2004. People were gloating about how they could not show what the beta looked like due to non-disclosure agreements… back in 2004. Of course we now know everything about Far Cry, its official sequel from another developer, and both of its spinoff sequels from its original developer. But obviously, what we know now we did not know then; let us nostalgia at the forum thread long since forgotten.
Subject: Editorial, General Tech | June 27, 2011 - 09:12 PM | Scott Michaud
Tagged: windows 8, leak
Update: 6/28/2011 - One of our commenters suggested that the screenshots were fake. Upon looking at ZDNet's sources, it appears as if at least the first screenshot (the tile screen) is fake, as well as the onscreen keyboard (which we did not link to). The other screenshots we linked follow a completely different aesthetic from the screenshots in the fake portfolio (the shape and color of the close button, for instance), so they do appear to be genuine. Fooled me. -Scott
So Windows 8 was shown off at the All Things Digital D9 conference and, surprise, it was leaked. Naturally Microsoft did not show all aspects of the Windows 8 build at the conference; they must leave some cards hidden that are either not yet ready or otherwise not designed to be shown. Ziff Davis got hold of someone who either had a leaked build of Windows 8 or otherwise had access to screenshots that Microsoft did not intend to show. And what good are screenshots that are not in a slideshow?
Care to take a spin around the leek?
So we start off with the well-known start overlay with the typical tiles, including weather, calendar, Computer, email, and Internet Explorer. The next image makes us feel immediately guilty for exactly half a second. The new interface extends all the way to the installer, where you read the EULA and enter your personalization information. The windowing look and feel has changed, with Windows 8 at least temporarily exaggerating the close button and minimizing the, well, minimize and full-screen buttons. The ribbon UI can also be seen exploding all across the interface, including the file browser. Installations, at least of Windows software, are more integrated into the operating system. Lastly, the task manager is getting a facelift, which may or may not be a bad thing.
What do you think of the leaked build? What would you do differently if you were Microsoft? (Registration not required to comment.)
Subject: Editorial, General Tech, Graphics Cards | June 27, 2011 - 04:44 PM | Scott Michaud
Tagged: dx11, crysis 2
Last Wednesday we reported on the announcement of the Crysis 2 DX11 patch and high-resolution texture pack coming on the 27th of June. Looking at the calendar, it appears as if your graphics card just ran out of time to rule the roost. Clocking in at 546 megabytes for the DirectX 11 update and 1695 megabytes for the high-resolution texture pack, the new updates are not small, especially since that does not include the size of the 1.9 patch itself. The big question is whether these updates will push the limits of your computer, and if so, is it worth it?
Can you run me now? … Hello?
VR-Zone benchmarked the new updates on an Intel Core i7-965 system paired with an NVIDIA GeForce GTX 580. We believe they accidentally swapped the labels on their Extreme Quality and Ultra Quality benchmarks, as Ultra is the more intensive of the two settings; Ultra should also show the biggest difference between DX9 and DX11, since DX11 effects are not enabled at the Extreme settings. (Update: 6/28/2011 - That's exactly what happened. VR-Zone fixed it; it is correct now.) Under that assumption you are looking at approximately 40 FPS for a 1080p experience with that test system and all the eye-candy enabled. That is a drop of approximately 33% from its usual 60 FPS under Extreme settings.
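The frame-rate math is simple enough to check for yourself (a sketch using the approximate numbers from VR-Zone's run, not exact measurements):

```python
# Approximate frame-rate drop moving from Extreme (DX9-era) to Ultra (DX11) settings.
extreme_fps = 60.0  # roughly what the GTX 580 managed at Extreme settings
ultra_fps = 40.0    # roughly what it managed with all DX11 effects enabled

drop_percent = (extreme_fps - ultra_fps) / extreme_fps * 100
print(f"{drop_percent:.0f}% slower")  # -> 33% slower
```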
But how does it look? Read on for all of that detail.
Subject: Editorial, General Tech | June 25, 2011 - 02:09 PM | Scott Michaud
Tagged: mozilla, enterprise
For enterprise users looking to introduce Firefox to their business: you may wish to reconsider. Businesses are notorious for being substantially behind in version numbers, occasionally (or frequently) trading even security for compatibility. Mozilla had a staggered release schedule: a minor version bump was little more than a security update, while a major version bump was a fairly large overhaul. Enterprise users were able to upgrade minor versions and be reasonably assured that compatibility would be maintained. There were no such assurances for a major version, thus requiring rigorous testing before deployment. Mozilla has now ended its policy of supporting back versions with security updates and is also moving between full versions much more rapidly, causing dissension amongst enterprise users.
Moving the world forward, not backwards, and always twirling towards freedom.
Ed Bott took the opportunity to prod Mozilla in his Thursday evening column. He contends that shutting out the enterprise will hasten the impending implosion of Firefox and allow Microsoft and Google to pick up the pieces. I seriously disagree with that statement and applaud Mozilla for staying focused on their goal. True, Mozilla will be vastly less attractive to the enterprise; however, if Microsoft did not have Windows and Office to push with Internet Explorer, would search ad revenue and donations cover the long-term development cost of supporting enterprise users? And really, after covering Microsoft for so long, I would have thought Ed Bott of all people (ok, except maybe Paul Thurrott) would respect a company that can make a decision like Mozilla just did and stick by it.
Subject: Editorial, General Tech | June 21, 2011 - 02:36 PM | Ryan Shrout
Tagged: VIA, sysmark, nvidia, Intel, benchmark, bapco, amd
It seems that all the tech community is talking about today is BAPCo and its benchmarking suite, SYSmark. A new version, SYSmark 2012, was released just recently, and yesterday we found out that AMD, NVIDIA and VIA have all dropped their support of the "Business Applications Performance Corporation". Obviously those companies have a beef with the benchmark as it stands, yet somehow one company stands behind the test: Intel.
Everyone you know of is posting about it. My twitter feed "asplode" with comments like these:
AMD quits BAPCo, says SYSmark is nutso. Nvidia and VIA, they say, also. http://bit.ly/kHvKux
AMD: Voting For Openness: In order to get a better understanding of AMD's press release earlier concerning BAPCO... http://bit.ly/kNtKkj
Ooh, BapCo drama.
Even PC Perspective posted on this drama yesterday afternoon saying: "The disputes centered mostly over the release of SYSmark 2012. For years various members have been complaining about various aspects of the product which they allege Intel strikes down and ignores while designing each version. One major complaint is the lack of reporting on the computer’s GPU performance which is quickly becoming beyond relevant to an actual system’s overall performance. With NVIDIA, AMD, and VIA gone from the consortium, Intel is pretty much left alone in the company: now officially."
Obviously, while cutting the grass this morning, this was the topic swirling through my head; so thanks for that, everyone. My question is this: does it really matter, and how is this any different from how it has been for YEARS? The cynical side of me says that AMD, NVIDIA and VIA all dropped out because their particular products aren't stacking up as well as Intel's when it comes to the total resulting score. Intel makes the world's fastest CPUs, I don't think anyone with a brain will dispute that, and as such, on benchmarks that test the CPU, they are going to have the edge.
We recently reviewed the AMD Llano-based Sabine platform, and in CPU-centric tests like SiSoft Sandra, TrueCrypt and 7zip the AMD APU is noticeably slower. But AMD isn't sending out press releases and posting blogs about how those benchmarks don't show the true performance of a system as the end user will see it. And Intel isn't pondering why we used games like Far Cry 2 and Just Cause 2 to show the AMD APU dominating there. Why? Because these tests are part of a suite of benchmarks we use to show the overall performance of a system. They are tools which competent reviewers wield in order to explain to readers why certain hardware acts a certain way in certain circumstances.
Continue reading for more on this topic...