RAGE is not as dependent on your graphics hardware as it is on your CPU and storage system (which may be an industry first); we will discover the reason when talking about the texture pop-up issue on the next page.
The first game designed by id Software since the release of Doom 3 in August of 2004, RAGE has a lot riding on it. Not only is it the introduction of the idTech 5 game engine, it is also the culmination of more than four years of development and the first new IP from the developer since the creation of Quake. And since the first discussions and demonstrations of Carmack's new MegaTexture technology, gamers have been expecting a lot as well.
Would this game's visuals be impressive enough to warrant all the delays we have seen? Would it push today's GPUs in a way that few games can? It looks like we have answers to both of those questions, and you might be a bit disappointed.
First, let's get to the heart of the performance question: will your hardware play RAGE? Chances are, very much so. I ran through some tests of RAGE on a variety of hardware including the GeForce GTX 580, 560 Ti, 460 1GB and the Radeon HD 6970, HD 6950, HD 6870 and HD 5850. The test bed included an Intel Core i7-965 Nehalem CPU and 6GB of DDR3-1333 memory, with the game running off of a 600GB VelociRaptor hard drive. Here are the results from our performance tests running at 1920x1080 resolution with 4x AA enabled in the game options:
Subject: Editorial, General Tech, Graphics Cards | September 29, 2011 - 03:51 PM | Ryan Shrout
Tagged: radeon, pcper live, nvidia, geforce, bf3, battlefield 3, amd
Look, there isn't any football on tonight, so what else are you going to do? Rather than watch a rerun of Seinfeld, why not stop by here at http://pcper.com/live to hang out and not only watch us play some Battlefield 3 but also participate yourself by calling in on Skype, using Google+ Hangouts or even just chatting on our IRC server (welcome back to 1998, amiright?).
We are going to start the show at 10:30pm EST and plan on running until midnight at least, but it all depends on participation levels from YOU!
What all do we have on the agenda? Well, it is going to be pretty informal as we are still getting our feet wet with this whole "live" thing but here is what we are planning:
- Watch Ryan and Ken get destroyed over and over in Battlefield 3 and watch as we attempt to sneak our way into the password protected Caspian levels with vehicles, 64 players and a new game type (oh my!).
- Skype call ins from readers and gamers that want to talk about BF3 and their experiences with the game so far. How does it run on your hardware? Discussion like this will help others that might not have the beta figure out what they might want to upgrade in the near future. (Be prepared to give us your Skype handle in the chat so we can call you!)
- Trying some group games via our PC Perspective Platoon, the Fragging Frogs! (Head over and apply to join or become a fan!) You can also find me on EA Origin or in the Battlelog system as "ryanshrout".
- We will discuss our brainstorming sessions for what hardware we will recommend for certain gaming resolutions and specific image quality settings in a future article...all live!
- Maybe some surprise guests from the PC Perspective staff and beyond...??
We will be streaming the festivities live on our Justin.tv channel (embedded below) and will have a chat widget here as well for those of you that would rather use IRC than the integrated Justin.tv chat.
In short, we are planning on having a good time playing some games and talking hardware so if you are into that, then I think you should be sure to stop by and say hello!! Let us know in the comments if you have anything else you want to see or any more ideas for our live show. Thanks!!!
Subject: Graphics Cards | September 29, 2011 - 02:54 PM | Ryan Shrout
Tagged: bf3, battlefield 3, nvidia, amd, geforce, radeon, gtx 460, hd 5850, 9800 gt
I just wanted to drop a quick note here on the home page to let you know that we haven't been just playing around on Battlefield 3 all day - instead we have been playing around on BF3 all day in order to bring you some more information about performance in the game! Earlier this afternoon I updated my Battlefield 3 Beta Performance article with a new page that focuses on results from the GeForce GTX 460 1GB, the AMD Radeon HD 5850 1GB and the GeForce 9800 GT 512MB card for those of you that are really behind the curve in the upgrade cycle.
What's this? The Caspian map? Performance results soon!
I would recommend you check out that new content to see where your hardware might fall, and hopefully over the weekend we will be putting together a system guide similar to our own Hardware Leaderboard but targeted at the BF3 gaming experience. Stay tuned!!
The Battlefield 3 Beta
Update 2 (9/30/11): We have some quick results from our time on the Caspian Border map as well if you are interested - check them out!
It was an exciting day at PC Perspective yesterday, with much of our time dedicated to finding, installing and playing the new Battlefield 3 public beta. Released on the morning of the 26th to those of you who had pre-ordered BF3 on Origin or had purchased Medal of Honor prior to July 26th, the beta arrived a couple of days early, which should give those of you with more FPS talent than me a leg up once the open beta starts on Thursday the 29th.
My purpose in playing Battlefield 3 yesterday was purely scientific, of course. We wanted to test a handful of cards from both AMD and NVIDIA to see how the beta performed. With all of the talk about needing to upgrade your system and the relatively high recommended system requirements, there is a lot of worry that just about anyone without a current generation GPU is going to need to shell out some cash.
Is that a justified claim? While we didn't have time yet to test EVERY card we have at the office we did put some of the more recent high end, mid-range and lower cost GPUs to the test.
Bulldozer Ships for Revenue
Some months back we covered the news that AMD had released its first revenue shipments of Llano. This was a big deal back then, as it was the first 32 nm based product from AMD, and one which could help AMD achieve power and performance parity with Intel in a number of platforms. Llano has gone on to be a decent seller for AMD, and it has had a positive effect on AMD's marketshare in laptops. Where once AMD was a distant second in overall power and performance in the mobile environment, Llano now allows them to get close to the CPU performance of Intel processors, achieve much greater performance in graphics workloads, and match Intel in overall power consumption.
KY Wong and Marshall Kwait hand off the first box of Bulldozer based Interlagos processors to Cray's Joe Fitzgerald. Photo courtesy of AMD.
Some five months later we are now making the same type of announcement for AMD and their first revenue shipment of the Bulldozer core. The first chips off the line are actually “Interlagos” chips; basically server processors that feature upwards of 16 cores (8 modules, each module containing two integer units and then the shared 256 bit FPU/SSE SIMD unit). The first customer is Cray, purveyor of fine supercomputers everywhere. They will be integrating these new chips into their Cray XE6 supercomputers, which have been purchased by a handful of governmental and education entities around the world.
Subject: Editorial | August 25, 2011 - 12:47 PM | Ken Addison
Tagged: ROG, radeon, podcast, HD6970, GTX580, asus, amd, 6970
PC Perspective Podcast #167 - 8/25/2011
This week we talk about HD6970 shortages, the ASUS ROG Matrix GTX580, HP selling its PC business and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano
This Podcast is brought to you by
- 0:00:38 Introduction
- 1-888-38-PCPER or email@example.com
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 0:01:32 Where Have All the 6970s Gone?
- 0:12:23 ASUS ROG Matrix GTX 580 Platinum 1.5GB Graphics Card Review - Best of the best?
- 0:26:00 This Podcast is brought to you by
, and their all new Sandy Bridge Motherboards!
- 0:27:05 Llano is running short
- 0:33:50 HP conference call this afternoon, could a major division drop?
- 0:35:50 Corsair Unveils Two New 90GB SATA 6Gb/s SSDs, A World's First
- 0:42:02 Battlefield 3: This is what the PC players will be enjoying
- 0:43:59 Intel returns to upgrade cards for more of their crippled parts
- 0:48:20 Gigabyte motherboard with 20GB cache
- 0:50:49 Drobo Improves Storage with new App-Driven Delivery
- 1:00:25 Deus Ex gives beautiful performance on cards costing less than $250
- 1:01:22 Steve Jobs steps down blah blah
- 1:02:15 Emails from Graeme about a SWTOR rig
- 1:06:41 Email from Eric about a new MB/CPU
- 1:11:20 Email from a lot of people - What SSD would Allyn buy?
- 1:14:30 Email from Mark about Matrox TripleHead2Go
- 1:19:32 Hardware / Software Pick of the Week
- Ryan: Blue Icicle
- Jeremy: Space Marine demo on Steam
- Josh: http://www.newegg.com/Product/Product.aspx?Item=N82E16833122326
- Allyn: Unlocker (is finally 64 bit)
- 1-888-38-PCPER or firstname.lastname@example.org
- http://twitter.com/ryanshrout and http://twitter.com/pcper
When building a computer, enthusiasts are likely to combine components from several different manufacturers, especially on the Intel side. AMD, however, is slowly diversifying to provide everything needed to put together an all-AMD system, short of the power supply, hard drive, and accessories. Before today, AMD already had the motherboard, processor, and graphics card (including processor graphics if that's your thing), and today Maximum PC reports that AMD may be moving into the RAM market with its own line of Radeon branded memory. It seems that AMD's future Leo-like platform may resemble a small AMD branded Borg cube.
The new memory in question comes in 2GB DDR3 sticks across three series branded the "Entertainment," "ULTRA PRO Gaming," and "Enterprise." The enthusiastic naming conventions aside, the Entertainment series looks to be the budget modules for those looking for stable DIMMs that get the job done for cheap. They have a rated speed of 1333 MHz with 9-9-9 timings. The next highest series is the "ULTRA PRO Gaming" series, which promises to be overclocking friendly. These DIMMs receive a slight boost in speed to 1600 MHz while taking on slightly looser 11-11-11 timings. The final, and likely most expensive, modules are the Enterprise series. These modules are still somewhat of a mystery as the specifications have yet to be announced by AMD; however, they are likely geared more towards enterprise workstations than servers as they are unbuffered DIMMs.
Further, all three series are rated to run at 1.5V and have a height of 30mm. Unfortunately, there is no word yet on price or availability. There are, however, several photos of Radeon branded memory modules over at PC Watch for you to check out. Do you think AMD's move to enter the DRAM market is a good thing or a bad thing for future profitability?
The Dirty Laggard
It may seem odd, but sometimes reviewers are some of the last folks to implement new technology. This has been the case for me many a time. Yes, we get some of the latest and greatest components, but often we review them and then keep them on the shelf for comparative purposes, all the while our personal systems run last generation parts that we will never need to re-integrate into a test rig again. In other cases, big money parts, like the one 30" 2560x1600 LCD that I own, are always being utilized on the testbed and never actually being used for things like browsing, gaming, or other personal activities. Don't get me wrong, this is not a "woe-is-me" rant about the hardships of being a reviewer, but rather just an interesting side effect not often attributed to folks who do this type of work. Yes, we get the latest to play with and review, but we don't often actually use these new parts in our everyday lives.
One of the technologies that I had only ever seen at trade shows is Eyefinity. It was released back in the Fall of 2009 and really gained some momentum in 2010. Initially it was incompatible with Crossfire technology, which limited it to a great degree. A single HD 5970 card could push 3 x 1920x1080 monitors in most games, but usually only with details turned down and no AA enabled. Only once AMD worked a bit more on the drivers were we able to see Crossfire setups working in Eyefinity, which allowed users to play games at higher fidelity with the other little niceties enabled. The release of the HD 6900 series of cards also proved to be a boon to Eyefinity, as these new chips had much better scaling in Crossfire performance and were also significantly faster than the earlier HD 5800 series at those price points.
Continue on to the rest of the story for more on my experiences with AMD Eyefinity.
How much will these Bitcoin mining configurations cost you in power?
Earlier this week we looked at Bitcoin mining performance across a large range of GPUs, but we had many requests for estimates on the cost of the power to drive them. At the time we were much more interested in the performance of these configurations, but now that we have that information and have started to look at the potential profitability of doing something like this, looking at the actual real-world cost of running a mining machine 24 hours a day, 7 days a week became much more important.
This led us to today's update where we will talk about the average cost of power, and thus the average cost of running our 16 different configurations, in 50 different locations across the United States. We got our data from the U.S. Energy Information Administration website where they provide average retail prices on electricity divided up by state and by region. For use today, we downloaded the latest XLS file (which has slightly more updated information than the website as of this writing) and started going to work with some simple math.
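The "simple math" described above can be sketched out in a few lines. The wattage and rate below are illustrative placeholders, not figures from our test data:

```python
# Sketch of the power-cost math: estimated monthly cost of a mining rig
# drawing a constant load 24/7, given a retail electricity rate.
# The 450 W draw and 11 cents/kWh rate are example values, not measured data.

def monthly_power_cost(watts, cents_per_kwh, hours=24 * 30):
    """Return the dollar cost of drawing `watts` continuously for `hours`."""
    kwh = watts / 1000.0 * hours          # energy used, in kilowatt-hours
    return kwh * cents_per_kwh / 100.0    # convert cents to dollars

# Example: a 450 W mining system at roughly 11 cents/kWh
cost = monthly_power_cost(450, 11.0)
print(f"${cost:.2f} per month")  # 450 W * 720 h = 324 kWh -> $35.64
```

Plugging each state's average retail rate into a formula like this, for each of our 16 measured system power draws, is all the spreadsheet work amounts to.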
Here is how your state matches up:
The first graph shows the rates in alphabetical order by state, the second graph in order from the most expensive to the least. First thing we noticed: if you live in Hawaii, I hope you REALLY love the weather. And maybe it's time to look into that whole solar panel thing, huh? Because Hawaii was SO FAR out beyond our other data points, we are going to be leaving it out of our calculations and instead are going to ask residents and those curious to just basically double one of our groupings.
Keep reading to get the full rundown on how power costs will affect your mining operations, and why it may not make sense to mine AT ALL with NVIDIA graphics cards!
What is a Bitcoin?
This article looking at Bitcoins and the performance of various GPUs at mining them was really a big team effort at PC Perspective. Props go out to Tim Verry for doing the research on the process of mining and helping to explain what Bitcoins are all about. Ken Addison did a great job going through an allotment of graphics cards running our GUIMiner setup and gathering the data you will see presented later. Scott Michaud helped with some graphics and imagery, and I'm the monkey that just puts it all together at the end.
** Update 7/13/11 ** We recently wrote another piece on the cost of the power to run the Bitcoin mining operations used in this performance article. Based on the individual prices of electricity in all 50 states of the US, we found that the cost of the power to run some cards exceeded the value of the Bitcoin currency based on today's exchange rates. I would highly recommend you check out that story as well after giving this performance-based article a thorough reading. ** End Update **
A new virtual currency called Bitcoin has been receiving a great deal of news fanfare, criticism and user adoption. The so-called cryptographic currency uses strong encryption methods to eliminate the need for trust when buying and selling goods over the Internet, in addition to a peer-to-peer distributed timestamp server that maintains a public record of every transaction to prevent double spending of the electronic currency. The aspects of Bitcoin behind the most criticism and its recent large rise in growth are, respectively, its inherent ability to anonymize the real-life identities of users (though the transactions themselves are public) and the ability to make money by supporting the Bitcoin network in verifying pending transactions through a process called "mining." Privacy, security, cutting out the middle man, making it easy for users to do small casual transactions without fees, and the ability to be rewarded for helping to secure the network by mining are all selling points (pun intended) of the currency.
When dealing with a more traditional and physical local currency, there is a need for both parties to trust the currency, but not much need to trust each other, as handing over cash is fairly straightforward. One does not need to trust the other person as much as if it were a check, which could bounce. Once it has changed hands, the buyer cannot go and spend that money elsewhere as it is physically gone. Transactions over the Internet, however, greatly reduce the convenience of that local currency, and due to the series of tubes' inability to carry cash through the pipes, services like Paypal as well as credit cards and checks are likely to be used in its place. While these replacements are convenient, they are also much riskier than cash, as fraudulent charge-backs and disputes are likely to occur, leaving the seller in a bad position. Due to this risk, sellers have to factor a certain percentage of expected fraud into their prices, in addition to collecting as much personally identifiable information as possible. Bitcoin seeks to remedy these risks by bringing the convenience of a local currency to the virtual plane with irreversible transactions, a public record of all transactions, and the ability to trust strong cryptography instead of needing to trust people.
There are a number of security measures inherent in the Bitcoin protocol that assist with these security goals. Foremost, Bitcoin uses strong public and private key cryptography to secure coins to a user. Money is handled by a bitcoin wallet, a program such as the official Bitcoin client that creates public/private key pairs allowing you to send and receive money. You are further able to generate new receiving addresses whenever you want within the client. The wallet.dat file is the record of all your key pairs, and thus your bitcoins, and contains 100 address/key pairs (though you are able to generate new ones beyond that). To send money, one only needs to sign the bitcoin with their private key and send it to the recipient's public key. This creates a chain of transactions secured by these public and private key pairs from person to person. Unfortunately this cryptography alone is not able to prevent double spending, meaning that Person A could sign the bitcoin with his private key over to Person B, but could also do the same to Person C and so on. This issue is where the peer-to-peer and distributed computing aspects of the Bitcoin protocol come into play. By using a peer-to-peer distributed timestamp server, the Bitcoin protocol creates a public record of every transaction that prevents double spending of bitcoins. Once the bitcoin has been signed to a public key (receiving address) with the user's private key and the network confirms this transaction, the bitcoins can no longer be spent by Person A; the network has confirmed that the coin now belongs to Person B, who is the only one that can spend it using their private key.
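The "mining" that confirms those transactions boils down to brute-force hashing: miners repeatedly hash a candidate block header with different nonce values until the double SHA-256 digest falls below a difficulty target. The toy sketch below illustrates only that core loop; the header contents, target encoding, and difficulty here are simplified stand-ins, not the real Bitcoin format:

```python
# Toy illustration of Bitcoin-style proof-of-work mining: search for a
# nonce whose double-SHA-256 hash of (header + nonce) is below a target.
# Real block headers, targets, and difficulties are far more involved.
import hashlib


def mine(header: bytes, difficulty_bits: int = 16):
    """Find a nonce giving a double-SHA-256 digest with `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)  # smaller target = harder search
    nonce = 0
    while True:
        data = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1


nonce, digest = mine(b"example block header")
print(f"nonce={nonce} hash={digest}")
```

Because each attempt is an independent hash, the work parallelizes almost perfectly, which is exactly why GPUs with thousands of simple ALUs dominate CPUs at this task, as the benchmarks later in this article show.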
Keep reading our article that details the theories behind Bitcoins as well as the performance of modern GPUs in mining them!