Subject: Editorial, General Tech | July 5, 2011 - 06:06 PM | Scott Michaud
Tagged: windows, games
So I am a very avid supporter of PC gaming: I do not feel that I should pay license fees to a company whose job is to actively limit me. I feel that if people keep asking me whether HD-DVD or Blu-ray will win, then there is no reason for eight gaming platforms to co-exist (more if you include Mac, iOS, Android, etc.). I feel that instead of relying on a cardboard coffin that will arrive in 1-2 business weeks, you or your friend should be allowed to fix your own hardware… or choose your own local small-business computer store to deal with in person. I feel that it is much better to buy an extra GTX 560 every four years and have money left over for a meal than to buy a multiplayer subscription pass that does not even let you run your own dedicated servers with admins (what cheaters? BAN!). So you can guess my reaction when I saw Microsoft roll Games for Windows Marketplace into Xbox.com.
Underlined for your convenience.
Now do not get me wrong, I was never much of a fan of Games for Windows to begin with. Microsoft's attempt to rid the PC of its stability stereotype was to push certification for all content on Games for Windows Live, which worked very well for Fallout: New Vegas on the Xbox 360. Ironically, the PC version was much more stable just after launch because the patch was stuck in certification on Xbox during the busy holiday season (lols!) The biggest problem with forcing certification is that it would include mods as well (video, 3:10), and that is not something mod developers could really afford. Halo 2 Vista was one such Games for Windows Live game whose mod tools were so neutered that the provided tutorial level was impossible to create: the jump pad assets were neither provided nor able to be created.
Still, it seems odd to me for Microsoft to push so feverishly to roll PC gaming into Xbox branding when other initiatives like Steam are already showing the entire industry how to do things mostly right. It is possible that at some point Microsoft could roll Direct(X)box back into Windows and simply create a canonical home theatre PC (if Valve does not eat that lunch too); but if their plan is to merge Windows into Xbox then they are completely missing the point of why we would rather play a PC game with an Xbox 360 controller: because we choose to.
Subject: Editorial, General Tech | June 30, 2011 - 03:55 AM | Scott Michaud
Tagged: far cry, bumpday
This week Crytek released their DirectX 11 update to Crysis 2 and immediately hushed the crowing of GPUs everywhere. We have gone through the various advancements in fairly great detail, but that was not the first time our GPUs came Crying to us about the monster that goes Polybump in the night. No, our forums were aflame even before Crysis, back when Far Cry was about to reach its beta phase. I guess it is time to bump it up in our memory.
Back in early 2004 our forum-goers rallied around now-defunct pre-beta screenshots of the best graphics they had ever seen… in 2004. People were surprised at the novelty of having mutant zombies in your game… in 2004. People gathered to play with their fellow amdmb members… in 2004. People gloated about how they could not show what the beta looked like due to non-disclosure agreements… back in 2004. Of course we now know everything about Far Cry, its official sequel from another developer, and both of its spinoff sequels from its original developer. But obviously, what we know now we did not know then; let us nostalgia at the forum thread long since forgotten.
Subject: Editorial, General Tech | June 28, 2011 - 01:12 AM | Scott Michaud
Tagged: windows 8, leak
Update: 6/28/2011 - One of our commenters suggested that the screenshots were fake. Upon looking at ZDNet's sources, it appears that at least the first screenshot (the tile screen) is fake, as is their onscreen keyboard (which we did not link to). The other screenshots we linked follow a completely different aesthetic from those in the fake portfolio (the shape and color of the close button, for instance), so they do in fact appear to be genuine. Fooled me. -Scott
So Windows 8 was shown off at the All Things Digital D9 conference and, surprise, it was leaked. Naturally Microsoft did not show all aspects of the Windows 8 build at the conference; they must keep some cards hidden that are either not yet ready or otherwise not designed to be shown. Ziff Davis got hold of someone who either had a leaked build of Windows 8 or otherwise had access to screenshots that Microsoft did not intend to show. And what good are screenshots that are not in a slideshow?
Care to take a spin around the leek?
So we start off with the well-known start overlay with the typical tiles, including weather, calendar, Computer, email, and Internet Explorer. The next image makes us feel immediately guilty for exactly half a second. The new interface extends all the way to the installer, where you read the EULA and enter your personalization information. The windowing look and feel has changed, with Windows 8 (at least temporarily) exaggerating the close button and minimizing the, well, minimize and full screen buttons. The ribbon UI can also be seen exploding all across the interface, including the file browser. Installations, at least of Windows software, are more integrated into the operating system. Lastly, the task manager is getting a facelift, which may or may not be a bad thing.
What do you think of the leaked build? What would you do differently if you were Microsoft? (Registration not required to comment.)
Subject: Editorial, General Tech, Graphics Cards | June 27, 2011 - 08:44 PM | Scott Michaud
Tagged: dx11, crysis 2
Last Wednesday we reported on the announcement of the Crysis 2 DX11 patch and high resolution texture pack due on the 27th of June. Looking at the calendar, it appears as if your graphics card just ran out of time to rule the roost. Clocking in at 546 megabytes for the DirectX 11 update and 1695 megabytes for the high resolution texture pack (roughly 2.2 GB combined), the new updates are not small, especially since that does not include the size of the 1.9 patch itself. The big question is whether these updates will push the limits of your computer, and if so, is it worth it?
Can you run me now? … Hello?
VR-Zone benchmarked the new updates on an Intel Core i7-965 system paired with an NVIDIA GeForce GTX 580. We believe they accidentally swapped the labels on their Extreme Quality and Ultra Quality benchmarks, as Ultra is the more intensive of the two settings; Ultra should also show the biggest difference between DX9 and DX11, since DX11 effects are not enabled at the Extreme setting. (Update: 6/28/2011 - That's exactly what happened; VR-Zone has since corrected the charts.) Under that assumption you are looking at approximately 40 FPS for a 1080p experience with that test system and all the eye candy enabled: a drop of approximately 33% from its usual 60 FPS under Extreme settings.
But how does it look? Read on for all of that detail.
Subject: Editorial, General Tech | June 25, 2011 - 06:09 PM | Scott Michaud
Tagged: mozilla, enterprise
For enterprise users looking to introduce Firefox to their business: you may wish to reconsider. Businesses are notorious for being substantially behind in version numbers, occasionally (or a lot) trading even security for compatibility. Mozilla had a staggered release schedule: a minor version bump was little more than a security update, while a major version bump was a fairly large overhaul. Enterprise users could upgrade minor versions and be reasonably assured that compatibility would be maintained; there were no such assurances for a major version, thus requiring rigorous testing before deployment. Mozilla has now ended its policy of supporting back versions with security updates and is also moving between full versions much more rapidly, causing dissension amongst enterprise users.
Moving the world forward, not backwards, and always twirling towards freedom.
Ed Bott took the opportunity to prod Mozilla in his Thursday evening column. He contends that shutting out the enterprise will assist in the impending implosion of Firefox and allow Microsoft and Google to pick up the pieces. I seriously disagree with that statement and applaud Mozilla for staying focused on their goal. True, Mozilla will be vastly less attractive to the enterprise; however, if Microsoft did not have Windows and Office to push with Internet Explorer, would search ad revenue and donations cover the long-term development cost of supporting enterprise users? And really, after covering Microsoft for so long, I would have thought that Ed Bott of all people (ok, except maybe Paul Thurrott) would respect a company that can make a decision like Mozilla just did and stick by it.
Subject: Editorial, General Tech | June 21, 2011 - 06:36 PM | Ryan Shrout
Tagged: VIA, sysmark, nvidia, Intel, benchmark, bapco, amd
It seems that all the tech community is talking about today is BAPCo and its benchmarking suite called Sysmark. A new version, 2012, was released just recently and yesterday we found out that AMD, NVIDIA and VIA have all dropped their support of the "Business Applications Performance Corporation". Obviously those companies have a beef with the benchmark as it is, yet somehow one company stands behind the test: Intel.
Everyone you know of is posting about it. My twitter feed "asplode" with comments like this:
AMD quits BAPCo, says SYSmark is nutso. Nvidia and VIA, they say, also. http://bit.ly/kHvKux
AMD: Voting For Openness: In order to get a better understanding of AMD's press release earlier concerning BAPCO... http://bit.ly/kNtKkj
Ooh, BapCo drama.
Even PC Perspective posted on this drama yesterday afternoon saying: "The disputes centered mostly over the release of SYSmark 2012. For years various members have been complaining about various aspects of the product which they allege Intel strikes down and ignores while designing each version. One major complaint is the lack of reporting on the computer’s GPU performance which is quickly becoming beyond relevant to an actual system’s overall performance. With NVIDIA, AMD, and VIA gone from the consortium, Intel is pretty much left alone in the company: now officially."
Obviously, while cutting the grass this morning, this was the topic swirling through my head; so thanks for that, everyone. My question is this: does it really matter, and how is this any different than it has been for YEARS? The cynical side of me says that AMD, NVIDIA and VIA all dropped out because their particular products aren't stacking up as well as Intel's when it comes to the total resulting score. Intel makes the world's fastest CPUs, I don't think anyone with a brain will dispute that, and as such, on benchmarks that test the CPU they are going to have the edge.
We recently reviewed the AMD Llano-based Sabine platform and in CPU-centric tests like SiSoft Sandra, TrueCrypt and 7zip the AMD APU is noticeably slower. But AMD isn't sending out press releases and posting blogs about how these benchmarks don't show the true performance of a system as the end user will see. And Intel isn't pondering why we used games like Far Cry 2 and Just Cause 2 to show the AMD APU dominating there. Why? Because these tests are part of a suite of benchmarks we use to show the overall performance of a system. They are tools which competent reviewers wield in order to explain to readers why certain hardware acts in a certain way in certain circumstances.
Continue reading for more on this topic...
Subject: Editorial, General Tech | June 20, 2011 - 07:24 AM | Tim Verry
Tagged: simulator, networking, Internet, cyber warfare
Our world is host to numerous physical acts of aggression every day, and until a few years ago those acts remained in the (relatively) easily comprehensible physical world. However, the millions of connected servers and clients that overlay the nations of the world have rapidly become host to what is known as "cyber warfare": subversion and attacks against another people or nation through electronic means, whether against its people or its electronic and Internet-based infrastructure.
While physical acts of aggression are easier to examine (and gather evidence on) and attribute to the responsible parties, attacks on the Internet are generally the exact opposite. Thanks to the anonymity of the Internet, it is much more difficult to determine the originator of an attack. Further, there is an ethical debate over whether a physical military response to an online attack is ever appropriate.
It seems as though the Pentagon is seeking the answers to the issues of attack attribution and appropriate retaliation methods through the usage of an Internet simulator dubbed the National Cyber Range. According to Computer World, two designs for the simulator are being constructed by Lockheed Martin with a $30.8 million USD grant and Johns Hopkins University Applied Physics Laboratory with a $24.7 million USD grant provided by DARPA.
The National Cyber Range is designed to mimic human behavior in response to various DefCon and InfoCon (Information Operations Condition) levels. It will allow the Pentagon and authorized parties to study the effectiveness of war plan execution as it simulates offensive and defensive actions at the scale of nation-backed cyber warfare. Once DARPA has chosen the final National Cyber Range design from the two competing projects (by Johns Hopkins and Lockheed Martin), the government will be able to construct a toolkit that allows it to easily transfer and conduct cyber warfare testing from any facility.
Image courtesy of Kurtis Scaletta via Flickr Creative Commons.
Subject: Editorial, Graphics Cards, Processors, Mobile, Shows and Expos | June 16, 2011 - 06:41 PM | Ryan Shrout
Tagged: llano, liveblog, fusion, APU, amd, AFDS
The AMD Fusion Developer Summit 2011 is set to begin at 11:30am ET / 8:30am PT and promises to bring some interesting and forward-looking news about the future of AMD's APU technology. We are going to cover the keynotes LIVE right here throughout the week, so if you want to know what is happening AS IT HAPPENS, stick around!
Subject: Editorial | June 16, 2011 - 01:54 AM | Scott Michaud
Tagged: duke nukem, bumpday
Yesterday saw the launch of Duke Nukem Forever. If you have been a member of our forums for a long period of time, you might actually have an odd case of déjà vu: this is actually the second time that we have experienced the launch of Duke Nukem Forever. The first time was a sick joke one April Fools' Day over nine years ago, which is quite fitting for the lewd and crude franchise. Some argue that the current release is a sick joke too, but that is a whole other story.
Duke Nukem: Forever and 3363 days.
Back on April 1st, 2002, ThinkGeek announced the availability of Duke Nukem Forever for Linux, Windows 3.1, WheatoniX, and Plan 9. The general buzz on the forum was about the marvel of being a PC gamer and how much of an impact Duke Nukem 3D had on our lives. So go on: nostalgia at the old-fashioned giant PC gaming boxes and nostalgia at the forum thread long since forgotten.
Subject: Editorial, General Tech, Shows and Expos | June 15, 2011 - 09:58 PM | Ryan Shrout
Tagged: programming, microsoft, fusion, c++, amp, AFDS
During this morning's keynote at the AMD Fusion Developer Summit, Microsoft's Herb Sutter went on stage to discuss the problems and solutions involved around programming and developing for multi-processing systems and heterogeneous computing systems in particular. While the problems are definitely something we have discussed before at PC Perspective, the new solution that was showcased was significant.
C++ AMP (Accelerated Massive Parallelism) was announced as a new extension to Visual Studio and the C++ programming language to help developers take advantage of the highly parallel and heterogeneous computing environments of today and the future. The new programming model uses C++ syntax and will be available in the next version of Visual Studio, with "bits of it coming later this year." Sorry: no harder release date was given when we probed.
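For the curious, the examples Microsoft has shown so far suggest syntax along these lines. This is a sketch based on the preview materials (the header and exact names could change before release), not shipping code; the `restrict(amp)` keyword marks the lambda as eligible to run on an accelerator:

```cpp
#include <amp.h>       // C++ AMP, slated for the next Visual Studio
#include <vector>
using namespace concurrency;

// Element-wise vector addition, offloaded to whatever accelerator
// (discrete GPU, APU, or CPU fallback) the runtime selects.
void add_arrays(int n, const float* a, const float* b, float* sum) {
    array_view<const float, 1> av(n, a);   // wrap existing host memory
    array_view<const float, 1> bv(n, b);
    array_view<float, 1> sv(n, sum);
    sv.discard_data();                     // results only; skip the copy-in

    // restrict(amp) limits the lambda to the subset of C++ the GPU can run.
    parallel_for_each(sv.extent, [=](index<1> i) restrict(amp) {
        sv[i] = av[i] + bv[i];
    });
    sv.synchronize();                      // copy results back into sum[]
}
```

The appeal over OpenCL here is obvious: no separate kernel language, no explicit buffer management, just C++ with one new keyword.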
Perhaps just as significant is the fact that Microsoft announced the C++ AMP standard will be an open specification, and that other compilers will be allowed to integrate support for it. Unlike C#, then, C++ AMP has a chance to become a dominant standard in the programming world as the need for parallel computing expands. While OpenCL was previously the only option that promised developers easy utilization of ALL the computing power in a device, C++ AMP gives them another option with the full weight of Microsoft behind it.
To demonstrate the capability of C++ AMP, Microsoft showed a rigid body simulation that ran on multiple computers and devices from a single executable file and was able to scale in performance from 3 GFLOPS on the x86 cores of Llano, to 650 GFLOPS on the combined power of the APU, to 830 GFLOPS with a pair of discrete Radeon HD 5800 GPUs. The same executable ran on an AMD E-series APU powered tablet at 16 GFLOPS with 16,000 particles. This is the promise of heterogeneous programming languages and the gateway necessary for consumers and businesses to truly take advantage of the processors that AMD (and other companies) are building today.
If you want programs other than video transcoding apps to really push the promise of heterogeneous computing, then the announcement of C++ AMP is very, very big news.