Subject: Editorial, General Tech | July 7, 2011 - 04:06 AM | Scott Michaud
Tagged: starcraft 2, bumpday
This week the second season of Starcraft 2 came to a close, immortalizing whatever rank you happened to hold just prior to the lock. This is the second league freeze that Starcraft 2 has undergone since it launched last July. Starcraft 2 was a highly anticipated launch for Blizzard due to the twelve-year gap between it and its predecessor, Starcraft. Our forums were just one of many places where rabid fans frothed over Starcraft, and I believe we deserve to nostalgia over it.
The calendar says 7/7/2011… … Bumpedaday??
Back in early 2001, just over ten years ago, ccrazy1263 created a very energetic forum thread questioning when a new Starcraft would arrive. Blizzard would not even announce the existence of a new Starcraft until over six years later. One thing that comes with Starcraft is that players will constantly tell you their strategies, good or bad, and there were a few instances of that in this thread. In closing, the best advice to be had from this thread is short and sweet: don’t play Hunter or Lost Temple against Hacker. They’ll kill your expansion.
Subject: Editorial, General Tech | July 5, 2011 - 02:06 PM | Scott Michaud
Tagged: windows, games
So I am a very avid supporter of PC gaming: I do not feel that I should pay license fees to a company to actively limit me. I feel that if people keep asking me whether HD-DVD or Blu-ray will win, then there is no reason for eight gaming platforms to co-exist (more if you include Mac, iOS, Android, etc.). I feel that instead of relying on a cardboard coffin that will arrive in 1-2 business weeks, you or your friend should be allowed to fix your own hardware… or choose your own local small-business computer store to deal with in person. I feel that it is much better to buy an extra GTX 560 every four years and have money left over for a meal than a multiplayer subscription pass that does not even let you run your own dedicated servers with admins (what cheaters? BAN!). So you can guess my reaction when I saw Microsoft roll Games for Windows Marketplace into Xbox.com.
Underlined for your convenience.
Now do not get me wrong, I was never much of a fan of Games for Windows to begin with. Microsoft’s attempt to rid the PC of its instability stereotype was to push certification for all content on Games for Windows Live, which worked very well for Fallout: New Vegas on the Xbox 360. Ironically, the PC version was much more stable just after launch because the patch was stuck in certification on Xbox during the busy holiday season (lols!). The biggest problem with forcing certification is that it would include mods as well (video, 3:10), and that is not something mod developers could really afford. Halo 2 Vista was one such Games for Windows Live game whose mod tools were so neutered that the provided tutorial level was impossible to create, because jump pad assets were neither provided nor able to be created.
Still, it seems odd to me for Microsoft to push so feverishly to roll PC gaming into Xbox branding when other initiatives like Steam are already showing the entire industry how to do things mostly right. It is possible that at some point Microsoft could roll Direct(X)box back into Windows and simply create a canonical home theatre PC (if Valve does not eat that lunch too); but if their plan is to merge Windows into Xbox, then they are completely missing the point of why we would rather play a PC game with an Xbox 360 controller: because we choose to.
Subject: Editorial, General Tech | June 29, 2011 - 11:55 PM | Scott Michaud
Tagged: far cry, bumpday
This week Crytek released their DirectX 11 update for Crysis 2 and immediately hushed the crowing of GPUs everywhere. We have gone through the various advancements in fairly great detail, but that was not the first time our GPUs came Crying to us about the monster that goes Polybump in the night. No, our forums were aflame even before Crysis, back when Far Cry was about to reach its beta phase. I guess it is time to bump it up in our memory.
Back in early 2004 our forum-goers rallied around now-defunct pre-beta screenshots of the best graphics they had ever seen… in 2004. People were surprised at the novelty of having mutant zombies in your game… in 2004. People gathered to play with their fellow amdmb members… in 2004. People were gloating about how they could not show what the beta looked like due to non-disclosure agreements… back in 2004. Of course we now know everything about Far Cry, its official sequel from another developer, and both of its spinoff sequels from its original developer. But obviously, what we know now we did not know then; let us nostalgia at the forum thread long since forgotten.
Subject: Editorial, General Tech | June 27, 2011 - 09:12 PM | Scott Michaud
Tagged: windows 8, leak
Update: 6/28/2011 - One of our commenters suggested that the screenshots were fake. Upon looking at ZDNet's sources, it appears as if at least the first screenshot (the tile screen) is fake, as well as the on-screen keyboard (which we did not link to). The other screenshots we linked follow a completely different aesthetic from the rest of the fake portfolio (the shape and color of the close button, for instance), so they do in fact appear to be genuine. Fooled me. -Scott
So Windows 8 was shown off at the All Things Digital D9 conference and, surprise, it was leaked. Naturally, Microsoft did not show all aspects of the Windows 8 build at the conference; they must leave some cards hidden that are either not yet ready or otherwise not designed to be shown. Ziff Davis got hold of someone who either had a leaked build of Windows 8 or access to screenshots that Microsoft did not intend to show. And what good are screenshots that are not in a slideshow?
Care to take a spin around the leek?
So we start off with the well-known start overlay with the typical tiles, including weather, calendar, Computer, email, and Internet Explorer. The next image makes us feel immediately guilty for exactly half a second. The new interface extends all the way to the installer, where you read the EULA and enter your personalization information. The windowing look and feel has changed, with Windows 8 at least temporarily exaggerating the close button and minimizing the, well, minimize and full-screen buttons. The ribbon UI is also seen exploding all across the interface, including the file browser. Installations, at least of Windows software, are more integrated into the operating system. Lastly, the task manager is getting a facelift, which may or may not be a bad thing.
What do you think of the leaked build? What would you do differently if you were Microsoft? (Registration not required to comment.)
Subject: Editorial, General Tech, Graphics Cards | June 27, 2011 - 04:44 PM | Scott Michaud
Tagged: dx11, crysis 2
Last Wednesday we reported on the announcement of the Crysis 2 DX11 patch and high-resolution texture pack due on the 27th of June. Looking at the calendar, it appears as if your graphics card just ran out of time to rule the roost. Clocking in at 546 megabytes for the DirectX 11 update and 1695 megabytes for the high-resolution texture pack, the new updates are not small, especially since that does not include the size of the 1.9 patch itself. The big question is whether these updates will push the limits of your computer, and if so, is it worth it?
Can you run me now? … Hello?
VR-Zone benchmarked the new updates on an Intel Core i7-965 system paired with an NVIDIA GeForce GTX 580. We believe they accidentally swapped the labels on their Extreme Quality and Ultra Quality benchmarks, as Ultra is the more intensive of the two settings; also, Ultra should show the biggest difference between DX9 and DX11, since the DX11 effects are not enabled at the Extreme settings. (Update: 6/28/2011 - That's exactly what happened. VR-Zone has fixed it; it is correct now.) Under that assumption, you are looking at approximately 40 FPS for a 1080p experience with that test system and all the eye candy enabled. That is a drop of approximately 33% from its usual 60 FPS under Extreme settings.
But how does it look? Read on for all of that detail.
Subject: Editorial, General Tech | June 25, 2011 - 02:09 PM | Scott Michaud
Tagged: mozilla, enterprise
For enterprise users looking to introduce Firefox to their business: you may wish to reconsider. Businesses are notorious for lagging substantially behind in version numbers, occasionally (or often) even trading security for compatibility. Mozilla had a staggered release schedule: a minor version bump was little more than a security update, while a major version bump was a fairly large overhaul. Enterprise users could upgrade across minor versions and be reasonably assured that compatibility would be maintained. There were no such assurances for a major version, which thus required rigorous testing before deployment. Mozilla has now ended its policy of supporting back versions with security updates and is also moving between full versions much more rapidly, causing dissension amongst enterprise users.
Moving the world forward, not backwards, and always twirling towards freedom.
Ed Bott took the opportunity to prod Mozilla in his Thursday evening column. He contends that shutting out the enterprise will hasten the impending implosion of Firefox and allow Microsoft and Google to pick up the pieces. I seriously disagree with that statement and applaud Mozilla for staying focused on their goal. True, Mozilla will be vastly less attractive to the enterprise; however, if Microsoft did not have Windows and Office to push with Internet Explorer, would search ad revenue and donations cover the long-term development cost of supporting enterprise users? And really, after covering Microsoft for so long, I would have thought Ed Bott of all people (ok, except maybe Paul Thurrott) would respect a company that can make a decision like Mozilla just did and stick by it.
Subject: Editorial, General Tech | June 21, 2011 - 02:36 PM | Ryan Shrout
Tagged: VIA, sysmark, nvidia, Intel, benchmark, bapco, amd
It seems that all the tech community is talking about today is BAPCo and its benchmarking suite, SYSmark. A new version, SYSmark 2012, was released just recently, and yesterday we found out that AMD, NVIDIA and VIA have all dropped their support of the "Business Applications Performance Corporation". Obviously those companies have a beef with the benchmark as it stands, yet somehow one company stands behind the test: Intel.
Everyone you know of is posting about it. My twitter feed "asplode" with comments like this:
AMD quits BAPCo, says SYSmark is nutso. Nvidia and VIA, they say, also. http://bit.ly/kHvKux
AMD: Voting For Openness: In order to get a better understanding of AMD's press release earlier concerning BAPCO... http://bit.ly/kNtKkj
Ooh, BapCo drama.
Even PC Perspective posted on this drama yesterday afternoon saying: "The disputes centered mostly over the release of SYSmark 2012. For years various members have been complaining about various aspects of the product which they allege Intel strikes down and ignores while designing each version. One major complaint is the lack of reporting on the computer’s GPU performance which is quickly becoming beyond relevant to an actual system’s overall performance. With NVIDIA, AMD, and VIA gone from the consortium, Intel is pretty much left alone in the company: now officially."
Obviously, while cutting the grass this morning this was the topic swirling through my head; so thanks for that, everyone. My question is this: does it really matter, and how is this any different than it has been for YEARS? The cynical side of me says that AMD, NVIDIA and VIA all dropped out because each company's particular products aren't stacking up as well as Intel's when it comes to the total resulting score. Intel makes the world's fastest CPUs, I don't think anyone with a brain will dispute that, and as such on benchmarks that test the CPU, they are going to have the edge.
We recently reviewed the AMD Llano-based Sabine platform and in CPU-centric tests like SiSoft Sandra, TrueCrypt and 7zip the AMD APU is noticeably slower. But AMD isn't sending out press releases and posting blogs about how these benchmarks don't show the true performance of a system as the end user will see. And Intel isn't pondering why we used games like Far Cry 2 and Just Cause 2 to show the AMD APU dominating there. Why? Because these tests are part of a suite of benchmarks we use to show the overall performance of a system. They are tools which competent reviewers wield in order to explain to readers why certain hardware acts in a certain way in certain circumstances.
Continue reading for more on this topic...
Subject: Editorial, General Tech | June 20, 2011 - 03:24 AM | Tim Verry
Tagged: simulator, networking, Internet, cyber warfare
Our world is host to numerous physical acts of aggression every day, and until a few years ago those acts had remained in the (relatively) easily comprehensible physical world. However, the millions of connected servers and clients that overlay the numerous nations around the world have rapidly become host to what is known as “cyber warfare”: subversion and attacks against another people or nation through electronic means, by attacking its people or its electronic and Internet-based infrastructure.
While physical acts of aggression are easier to examine (and gather evidence on) and attribute to the responsible parties, attacks on the Internet are generally the exact opposite. Thanks to the anonymity of the Internet, it is much more difficult to determine the originator of an attack. Further, there is an open ethical debate over whether physical responses, in the form of military action, are appropriate answers to online attacks.
It seems as though the Pentagon is seeking the answers to the issues of attack attribution and appropriate retaliation methods through the usage of an Internet simulator dubbed the National Cyber Range. According to Computer World, two designs for the simulator are being constructed by Lockheed Martin with a $30.8 million USD grant and Johns Hopkins University Applied Physics Laboratory with a $24.7 million USD grant provided by DARPA.
The National Cyber Range is to be designed to mimic human behavior in response to various DefCon and InfoCon (Information Operations Condition) levels. It will allow the Pentagon and authorized parties to study the effectiveness of war plan execution as it simulates offensive and defensive actions at the scale of nation-backed cyber warfare. Once the final National Cyber Range design has been chosen by DARPA from the two competing projects (by Johns Hopkins and Lockheed Martin), the government would be able to construct a toolkit that would allow it to easily transfer and conduct cyber warfare testing from any facility.
Image courtesy of Kurtis Scaletta via Flickr Creative Commons.
Subject: Editorial, Graphics Cards, Processors, Mobile, Shows and Expos | June 16, 2011 - 02:41 PM | Ryan Shrout
Tagged: llano, liveblog, fusion, APU, amd, AFDS
The AMD Fusion Developer Summit 2011 is set to begin at 11:30am ET / 8:30am PT and promises to bring some interesting and forward looking news about the future of AMD's APU technology. We are going to cover the keynotes LIVE right here throughout the week so if you want to know what is happening AS IT HAPPENS, stick around!!
Subject: Editorial | June 15, 2011 - 09:54 PM | Scott Michaud
Tagged: duke nukem, bumpday
Yesterday saw the launch of Duke Nukem Forever. If you have been a member of our forums for a long period of time, you might actually have an odd case of deja vu: this is actually the second time that we have experienced the launch of Duke Nukem Forever. The first time was a sick joke one April Fools’ Day over nine years ago, which is quite fitting for the lewd and crude franchise. Some argue that the current release is a sick joke too, but that is a whole other story.
Duke Nukem: Forever and 3363 days.
Back on April 1st, 2002, ThinkGeek announced the availability of Duke Nukem Forever for Linux, Windows 3.1, WheatoniX, and Plan 9. The general buzz on the forum was about the marvel of being a PC gamer and how much of an impact Duke Nukem 3D had on our lives. So go on, nostalgia at the old-fashioned giant PC gaming boxes and nostalgia at the forum thread long since forgotten.
Subject: Editorial, General Tech, Shows and Expos | June 15, 2011 - 05:58 PM | Ryan Shrout
Tagged: programming, microsoft, fusion, c++, amp, AFDS
During this morning's keynote at the AMD Fusion Developer Summit, Microsoft's Herb Sutter went on stage to discuss the problems and solutions involved around programming and developing for multi-processing systems and heterogeneous computing systems in particular. While the problems are definitely something we have discussed before at PC Perspective, the new solution that was showcased was significant.
C++ AMP (Accelerated Massive Parallelism) was announced as a new extension to Visual Studio and the C++ programming language to help developers take advantage of the highly parallel and heterogeneous computing environments of today and the future. The new programming model uses C++ syntax and will be available in the next version of Visual Studio, with "bits of it coming later this year." Sorry, no hard release date was given when we probed.
Perhaps just as significant is the fact that Microsoft announced the C++ AMP standard will be an open specification, and that they are going to allow other compilers to integrate support for it. Unlike C#, then, C++ AMP has a chance to become a dominant standard in the programming world as the need for parallel computing expands. While OpenCL was previously the only option that promised developers easy utilization of ALL the computing power in a device, C++ AMP gives them another option with the full weight of Microsoft behind it.
To demonstrate the capability of C++ AMP, Microsoft showed a rigid body simulation program that ran on multiple computers and devices from a single executable file and was able to scale in performance from 3 GFLOPS on the x86 cores of Llano, to 650 GFLOPS on the combined APU, to 830 GFLOPS with a pair of discrete Radeon HD 5800 GPUs. The same executable was run on an AMD E-series APU powered tablet and ran at 16 GFLOPS with 16,000 particles. This is the promise of heterogeneous programming languages and is the gateway necessary for consumers and businesses to truly take advantage of the processors that AMD (and other companies) are building today.
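To give a sense of what "C++ syntax" means here, below is a minimal sketch based on the preview material Microsoft has shown. The `concurrency` namespace, `array_view`, `parallel_for_each`, and `restrict(amp)` come from the announced specification, but the tooling has not shipped yet, so treat this as an illustration of the model rather than final, compilable API:

```cpp
#include <amp.h>      // C++ AMP header expected in the next Visual Studio
#include <vector>
using namespace concurrency;

// Square every element of a vector; the runtime chooses the accelerator
// (a GPU if one is available, with a CPU fallback otherwise).
void square_all(std::vector<float>& data) {
    array_view<float, 1> view(static_cast<int>(data.size()), data);
    parallel_for_each(view.extent, [=](index<1> i) restrict(amp) {
        view[i] = view[i] * view[i];
    });
    view.synchronize();  // copy results back to host memory
}
```

The notable design choice is that the kernel is an ordinary C++ lambda, not a separate kernel language as in OpenCL, which is exactly what lets one executable scale across CPUs, APUs, and discrete GPUs.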
If you want programs other than video transcoding apps to really push the promise of heterogeneous computing, then the announcement of C++ AMP is very, very big news.
Subject: Editorial, Processors, Shows and Expos | June 14, 2011 - 05:09 PM | Ryan Shrout
Tagged: nvidia, Intel, heterogeneous, fusion, arm, AFDS
Before the AMD Fusion Developer Summit started this week in Bellevue, WA, the most controversial speaker on the agenda was Jem Davies, the VP of Technology at ARM. Why would AMD and ARM get together on a stage with dozens of media and hundreds of developers in attendance? There is no partnership between them in terms of hardware or software, so would there be some kind of major announcement about the two companies' future together?
In that regard, the keynote was a bit of a letdown, and if you thought there was going to be a merger between them, or a new AMD APU announced with an ARM processor in it, you left a bit disappointed. Instead we got a bit of background on ARM and how the race of processing architectures has slowly dwindled to just x86 and ARM, as well as a few jibes at the competition NOT named AMD.
As is usually the case, Davies described the state of processor technology with an emphasis on power efficiency and the importance of designing with that future in mind. One of the interesting points was shown in regard to the "bitter reality" of core-type performance and the projected DECREASE we will see from 2012 onward due to leakage concerns as we progress to 10nm and even 7nm technologies.
The idea of dark silicon "refers to the huge swaths of silicon transistors on future chips that will be underused because there is not enough power to utilize all the transistors at the same time," according to this article over at physorg.com. As process technology gets smaller, the areas of dark silicon increase until the portion of the die that can be utilized at any one time might hit as low as 10% in 2020. Because of this, the need to design chips with many task-specific heterogeneous portions is crucial, and both AMD and ARM are on that track.
Those companies not on that path today, NVIDIA specifically and Intel as well, were addressed on the slide below when discussing GPU computing. Davies pointed out that if a company has a financial interest in the immediate success of only the CPU or only the GPU, then benchmarks will be built and shown in a way that makes THAT portion appear the most important. We have seen this from both NVIDIA and Intel in the past couple of years, while AMD has consistently stated they are going to use the best processor for the job.
Amdahl's Law is used in parallel computing to predict the theoretical maximum speedup from using multiple processors. Davies reiterated what we have been told for some time: if only 50% of your application can actually BE parallelized, then no matter how many processing cores you throw at it, it can never run more than twice as fast, because the serial half of the work always remains. The heterogeneous computing products of today and the future can address both the parallel and the serial computing tasks, with improvements in performance and efficiency, and should result in better computing in the long run.
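Amdahl's Law is easy to check numerically. A quick sketch (the function name is ours, not from the keynote):

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Theoretical speedup when only part of a program parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# With half the program parallelizable, the speedup stays below 2x
# no matter how many cores you add.
print(round(amdahl_speedup(0.5, 2), 3))    # 1.333
print(round(amdahl_speedup(0.5, 256), 3))  # 1.992
```

Even with 256 cores the serial half dominates, which is why pairing fast serial cores with wide parallel hardware beats simply adding more of either.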
So while we didn't get the major announcement from ARM and AMD that we might have been expecting, the fact that ARM would come up and share a stage with AMD reiterates the message of the Fusion Developer Summit quite clearly: a combined and balanced approach to processing might not be the sexiest, but it is very much the correct one for consumers.
Subject: Editorial, General Tech | June 9, 2011 - 12:25 AM | Scott Michaud
Tagged: windows 8, silverlight
That interface doesn’t look very silvery, or light.
I think the real message here is that when you invest (time, money, or otherwise) in a proprietary infrastructure, you need to expect that you have no real recourse should the owner work against you; you voided all recourse except for what is explicitly contractually bound to you. In the case of an open, particularly copyleft, platform: should support from the original owners be absent or insufficient, you are legally allowed to take over maintenance yourself, provided you grant that same right to others. Often it may still be worthwhile to invest in proprietary platforms, but remember, you give up your right to maintain your dependencies. All your dependent art relies on your trust in the platform owner, and you have no legal recourse, because you gave it away.
Do you have any comments on this? Discuss below.
Subject: Editorial, General Tech, Shows and Expos | May 29, 2011 - 06:08 PM | Ryan Shrout
Tagged: pcper, computex 2011, computex
It is that time of year again where the staff at PC Perspective wades through the moist air of Taipei, Taiwan to bring you the latest news, videos and updates from the world of technology and computing hardware. This marks my 10th appearance on the island of Taiwan, but Ken's first so it should be an interesting experience for both of us.
Yes, this scary girl is here again too.
This year we have a surprising amount of expected hardware to investigate starting with an onslaught of AMD-based technology. The AMD 990FX chipset will be out and about in force from the likes of ASUS, MSI, ECS and others while the Fusion APUs will be flaunted in notebooks and likely even some "surprise" Llano designs. Intel's Z68 chipset, though already launched, will still be a focus but the company will surprise many with a dramatic move into the world of mobile computing with the Atom line.
Temperatures are high but so is the excitement!!
Subject: Editorial, General Tech | May 27, 2011 - 06:41 PM | Jeremy Hellstrom
Tagged: friday, forum
Our Tech Talk Forum is a great place to hang out and discuss general technology questions, such as the eternal question of which screwdriver to use. Since we can't get hold of a working Sonic Screwdriver, do you have a preferred electric screwdriver, or do you prefer the traditional kind like mpulliam and I do? Perhaps you are more interested in moving to SoCal, where NewEgg is apparently opening up a pick-up location.
Head over to the Processor Forum for speculation on Bulldozer, since the recent leaks have given us a bit more information; or maybe have a laugh at the Apple fanatics and the price they pay for the same hardware we use, especially considering that while some of their designs are nice, they can never match the imagination of a dedicated modder!
There is a lot more inside; take a walk through the forums and see what you can find, or even better ... what you can offer! For a more laid-back way to catch up on the latest tech news, you should catch the new PC Perspective Podcast; #156 even has Ryan in it!
Subject: Editorial | May 27, 2011 - 01:52 PM | John Davis
Tagged: ubuntu, linux, kernel, interview, hardware
In a continuation of our effort to embrace and report on the open-source community, PC Perspective has contacted another very interesting open-source project. This week we selected Ubuntu and the Manager of the Ubuntu Kernel Team, Pete Graner.
Image courtesy of Ubuntu
The self-described beginning of Ubuntu:
Linux was already established as an enterprise server platform in 2004. But free software was still not a part of everyday life for most computer users. That's why Mark Shuttleworth gathered a small team of developers from one of the most established Linux projects, Debian, and set out to create an easy-to-use Linux desktop: Ubuntu.
The vision for Ubuntu is part social and part economic: free software, available free of charge to everybody on the same terms, and funded through a portfolio of services provided by Canonical.
If you would like to learn more about Ubuntu please click here.
Ubuntu also lists its features as the following:
- A fresh look
The launcher: Get easy access to your favourite tools and applications with our lovely new launcher. You can hide and reveal it, add and remove apps and keep track of your open windows.
The dash: Our new dash offers a great way to get to your shortcuts and search for more apps and programs. So you can get fast access to your email, music, pictures and much more.
Workspaces: Our handy workspaces tool gives you a really easy way to view and move between multiple windows and applications.
You can surf in safety with Ubuntu – confident that your files and data will stay protected. A built-in firewall and virus protection come as standard. And if a potential threat appears, we provide automatic updates which you can install in a single click. You get added security with AppArmor, which protects your important applications so attackers can’t access your system. And thanks to Firefox and gnome-keyring, Ubuntu helps you keep your private information private. So whether it’s accessing your bank account or sharing sensitive data with friends or colleagues, you’ll have peace of mind when you need it the most.
Ubuntu works brilliantly with a range of devices. Simply plug in your mp3 player, camera or printer and you’ll be up and running straight away. No installation CDs. No fuss. And it’s compatible with Windows too! So you can open, edit and share Microsoft Office documents stress-free.
Ubuntu loads quickly on any computer, but it's super-fast on newer machines. With no unnecessary programs and trial software slowing things down, booting up and opening a browser takes seconds. Unlike other operating systems that leave you staring at the screen, waiting to get online. And Ubuntu won’t grow sluggish over time. It’s fast. And it stays fast.
Accessibility is central to the Ubuntu philosophy. We believe that computing is for everyone regardless of nationality, race, gender or disability. Fully translated into 25 languages, Ubuntu also includes essential assistive technologies, which are, of course, completely free. We recommend the Ubuntu classic desktop experience for users with particular accessibility requirements.
(Image courtesy of Distrowatch)
I have used Ubuntu almost as long as I have been using Fedora. Ubuntu has been my go-to Linux distribution since Warty Warthog. I have installed Ubuntu on laptops and family members' computers, and I even went 100% Ubuntu for a year. In my experience, any and all of my questions could be answered by the Documentation, the Community, and Launchpad.
Now that you have a brief idea about Ubuntu, let's get to the interview:
(Hit that Read More link for the details!!)
Subject: Editorial, General Tech | May 26, 2011 - 02:04 PM | Ken Addison
Tagged: R6970, podcast, nvidia, Intel, firepro, amd, 990x, 990fx
PC Perspective Podcast #156- 5/26/2011
This week we talk about the AMD FirePro V7900 and V5900, the MSI R6970 Lightning, the Intel Core i7-990X, viewer questions and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Jeremy Hellstrom, Josh Walrath and Allyn Malventano
This Podcast is brought to you by MSI
- 0:00:45 Introduction
- 1-888-38-PCPER or email@example.com
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 0:01:45 AMD FirePro V7900 and V5900 Professional Graphics Review
- 0:05:25 MSI R6970 Lightning Review: A Supercharged AMD Radeon HD 6970
- 0:16:35 This Podcast is brought to you by MSI, and their all new Sandy Bridge Motherboards!
- 0:17:20 Intel Core i7-990X Gulftown Processor and DX58SO2 Motherboard Review
- 0:22:30 NVIDIA GTX 560 Review Coming soon
- 0:24:36 Cray Announces AMD Bulldozer CPU and NVIDIA Tesla GPU Supercomputer Capable of 50 Petaflops
- 0:27:38 Sneak Peek at the MSI 990FXA-GD65
- 0:30:13 ASUS Sabertooth Motherboard Supporting 990FX Chipset Pictured
- 0:32:33 Email from Jeff about Bulldozer leaks
- 0:36:00 Revisiting quad-gpus and the Law of Diminishing Returns
- 0:39:58 Leaking Llano and Bulldozer prices
- 0:43:57 Corsair Hydro H80 and H100 Water Coolers On The Horizon
- 0:46:30 Email from Bernie about desktop standby mode
- 0:51:05 Email from Jesse about overclocking temps
- 0:53:40 Hardware / Software Pick of the Week
- Ryan: Lucid Virtu - it's working
- Jeremy: Remember Action Quake? How 'bout a little Action Half Life 2, still less buggy than your average commercial release and better supported.
- Josh: DiRT 3: Love it.
- 0:59:30 THIS JUST IN: Asus ROG Matrix GeForce GTX 580 Graphics Card Details Leaked
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 1:02:00 Closing
Subject: Editorial, General Tech | May 25, 2011 - 09:22 PM | Scott Michaud
Tagged: Malware, apple
Apple users have been dealing with a bad bout of malware over the last few weeks, ironically called Mac Defender. Its modus operandi involves scaring the Apple user with claims of malware in a phony file browser and offering a magical option to remove all problems. That option is actually the malware, but since users are convinced they are downloading anti-malware, they will often allow it to happen and provide their admin password. At that point, they are prompted to provide their credit card number to actually remove the now-present infection. Apple was actively quiet about the whole affair but has now gone vocal. Also, a new revision of Mac Defender just got substantially harder to avoid.
It should be noted that admin password or not, Apple or not, patch or not, this form of malware strikes the most vulnerable point of any system: the user's complacency. It does not matter how good your antivirus solution is or how well protected your operating system and programs are (though in many cases both of those are lacking as well); you need to be cautious about what you do with any device that accepts information that is not yours. Food for thought: software that jailbreaks an iPhone steals admin privileges from Apple and hands them to you. Even on a locked-down system such as the iPhone, where the user does not have admin rights, what would have happened had you not been the intended recipient of those privileges?
Subject: Editorial, General Tech, Graphics Cards | May 22, 2011 - 09:16 PM | Ryan Shrout
Tagged: gtx 570, giveaway, contest, asus
As you can no doubt tell, PC Perspective recently got a HUGE and much-needed facelift, to what we are internally calling "PC Perspective v4.0". I know there are still some kinks to work out, and we are actively addressing any feedback from our readers in this comment thread.
But we want to celebrate the launch of the new site in style!! Some of our site sponsors have very generously offered up some prizes for us to give out throughout the coming days...
The tenth (!!) prize is a wicked ASUS GeForce GTX 570 DirectCU II card that is a triple-slot design and that supports 3D Vision Surround out of the box!
What do you have to do to win this wonderful piece of hardware?
Couldn't be easier: post a comment on this post thanking ASUS for its sponsorship of PC Perspective and telling us what feature in a graphics card you would most like to see in the future. Be creative! You should probably have a registered account, or at least be sure to include your email address in the appropriate field, so we can contact you!
Subject: Editorial | May 21, 2011 - 08:41 PM | John Davis
Tagged: Red Hat, open-source, open source, linux, Fedora
In a continuation of our effort to embrace and report on the open-source community, PC Perspective has reached out to another very interesting open-source project. This week we selected Fedora and its Project Leader, Jared Smith.
(Image courtesy of Fedora)
Fedora is self-described as:
Fedora is a Linux-based operating system, a collection of software that makes your computer run. You can use Fedora in addition to, or instead of, other operating systems such as Microsoft Windows™ or Mac OS X™. The Fedora operating system is completely free of cost for you to enjoy and share.
The Fedora Project is the name of a worldwide community of people who love, use, and build free software from around the globe. We want to lead in the creation and spread of free code and content by working together as a community. Fedora is sponsored by Red Hat, the world's most trusted provider of open source technology. Red Hat invests in Fedora to encourage collaboration and incubate innovative new free software technologies.
Fedora also lists its features as the following:
- 100% Free & Open Source: Fedora is 100% gratis and consists of free & open source software.
- Thousands of Free Apps!: With thousands of apps across 10,000+ packages, Fedora's got an app for you.
- Virus- and Spyware-Free: No more antivirus and spyware hassles. Fedora is Linux-based and secure.
- Worldwide Community: Built by a global community of contributors, there's a local website for you.
- An Amazingly Powerful OS: Fedora is the foundation for Red Hat Enterprise Linux, a powerful enterprise OS.
- Share it with Friends!: Fedora is free to share! Pass it along to your friends and family, no worry!
- Beautiful Artwork: Compute in style with many open & beautiful wallpapers and themes!
- Millions of Installations: Fedora has been installed millions of times. It's a large community to join!
(Image courtesy of Distrowatch)
I have used Fedora since it was Fedora Core, which makes it almost eight years now. Fedora is a community-supported distribution sponsored by Red Hat, known for being on the leading edge of technology at the time of each release. Its release cycle is every six months, and the project is very transparent about what will be included and excluded. Fedora has a huge community, tries to involve everyone, and encourages participation. If you need help using Fedora or have any questions, answers are available through the Frequently Asked Questions (FAQ), Documentation, IRC, and Mailing Lists.
Now that you have a brief idea about Fedora, let's get to the interview:
(Hit that Read More link for the details!!)