Subject: Graphics Cards | May 28, 2013 - 11:32 PM | Tim Verry
Tagged: gpu, drivers, catalyst 13.6 beta, beta, amd
AMD has released its Catalyst 13.6 beta graphics driver, and it fixes a number of issues under both Windows 8 and Linux. The new beta driver is also compatible with the existing Catalyst 13.5 CAP1 (Catalyst Application Profile), which improves performance in several PC games.
On the Windows side, Catalyst 13.6 adds OpenCL GPU acceleration support for Adobe's Premiere Pro CC software and enables AMD Wireless Display technology on systems with the company's A-Series APUs and either Broadcom or Atheros Wi-Fi chipsets. AMD has also made a couple of tweaks to its Enduro technology, including correctly identifying when a Metro app idles and offloading the corresponding GPU tasks to integrated graphics instead of the discrete card. The new beta driver also resolves an issue with audio dropouts over HDMI.
On the Linux side of things, Catalyst 13.6 beta adds support for the following when using AMD's A10, A8, A6, and A4 APUs:
- Ubuntu 13.04
- Xserver 1.14
- GLX_EXT_buffer_age
The driver fixes several bugs as well, including black screen and corruption issues in Team Fortress 2, an issue with OpenGL applications and VSYNC, and UVD playback problems in XBMC where the taskbar would disappear and/or the system would experience a noticeable performance drop during UVD-accelerated video playback.
You can grab the new beta driver from the AMD website.
Subject: General Tech | April 3, 2013 - 01:21 PM | Jeremy Hellstrom
Tagged: gpu, DRAM, ddr3, price increase
It has taken a while, but the climbing price of memory is about to have an effect on the price you pay for your next GPU. DigiTimes specifically mentions only DDR3, but as both GDDR4 and GDDR5 are based on DDR3, they will suffer the same price increases. You can expect the new prices to stick around, as part of the reason for the increase in RAM prices is a decrease in sales volume. AMD may be hit harder overall than NVIDIA, as it tends to put more memory on its cards, and buyers of value cards might see the biggest percentage increase since those cards still sport 1GB or more of memory.
"Since DDR3 memory prices have recently risen by more than 10%, the sources believe the graphics cards are unlikely to see their prices return to previous levels within the next six months unless GPU makers decide to offer promotions for specific models or launch next-generation products."
Here is some more Tech News from around the web:
- Memory vendors pile on '3D' stacking standard @ The Register
- History of the GPU, Part 2: 3Dfx Voodoo, the game-changer @ Techspot
- Intel releases OpenCL SDK for upcoming Haswell chips @ The Inquirer
- Linux Foundation Training Prepares the International Space Station for Linux Migration @ Linux.com
- Microsoft releases Exchange 2013 update @ The Register
- Canon PowerShot A2600 Review @ TechReviewSource
- AMD Releases Open-Source UVD Video Support @ Phoronix
- Win An Amazing PC Specialist Gaming System @ eTeknix
Subject: General Tech | March 27, 2013 - 01:21 PM | Jeremy Hellstrom
Tagged: gpu, history, get off my lawn
TechSpot has just published an article looking at the history of the GPU over the past decades, from the first NTSC-capable cards, through the golden 3Dfx years, straight through to the modern GPGPU. There have been a lot of standards over the years, such as MDA, CGA and EGA, as well as different interfaces, from ISA and the graphics-card-specific AGP to our current PCIe standard. The first article in this four-part series takes us from 1976 through to 1995 and the birth of the Voodoo series of accelerators. Read on to bring back memories, or perhaps to encounter some of this history for the first time.
"The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of the 32-bit operating systems and the affordable personal computer. While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavour. Over the next few weeks we'll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, the industry's consolidation at the turn of the century, and today's modern GPGPU."
Here is some more Tech News from around the web:
- The Do-It-Yourself All-In-One Computer Standard from Intel @ Hardware Secrets
- What is Windows Blue? @ TechReviewSource
- Mozilla Firefox OS on Dreamfone video demo @ The Inquirer
- Google Keep hands-on @ The Inquirer
- Whoops! Tiny bug in NetBSD 6.0 code ruins SSH crypto keys @ The Register
In case you missed it...
On one of the last pages of our recent NVIDIA GeForce GTX TITAN graphics card review we included an update on our Frame Rating graphics performance metric that describes the testing method in greater detail and shows results for the first time. Because it was buried so far into the article, I thought it was worth posting this information here as a separate article to solicit feedback from readers and help guide the discussion forward without getting lost in the TITAN shuffle. If you already read that page of our TITAN review, nothing new is included below.
I am still planning a full article based on these results sooner rather than later; for now, please leave me your thoughts, comments, ideas and criticisms in the comments below!
Why are you not testing CrossFire??
If you haven't been following our sequence of stories that investigates a completely new testing methodology we are calling "frame rating", then you are really missing out. (Part 1 is here, part 2 is here.) The basic premise of Frame Rating is that the performance metrics that the industry is gathering using FRAPS are inaccurate in many cases and do not properly reflect the real-world gaming experience the user has.
Because of that, we are working on another method that uses high-end dual-link DVI capture equipment to directly record the raw output from the graphics card with an overlay technology that allows us to measure frame rates as they are presented on the screen, not as they are presented to the FRAPS software sub-system. With these tools we can measure average frame rates, frame times and stutter, all in a way that reflects exactly what the viewer sees from the game.
We aren't ready to show our full sets of results yet (soon!), but the problem lies in the fact that AMD's CrossFire technology shows severe performance degradation when viewed under the Frame Rating microscope that does not show up nearly as dramatically under FRAPS. As such, I decided that it was simply irresponsible of me to present data to readers that I would then immediately refute on the final pages of this review (Editor: referencing the GTX TITAN article linked above); it would be a waste of the reader's time, and people who skip straight to the performance graphs wouldn't see our theory on why the displayed results were invalid.
Many other sites will use FRAPS, will use CrossFire, and there is nothing wrong with that at all. They are simply presenting data that they believe to be true based on the tools at their disposal. More data is always better.
Here are those results and our discussion. I decided to use the most popular game out today, Battlefield 3, and please keep in mind this is NOT the worst-case scenario for AMD CrossFire in any way. I tested the Radeon HD 7970 GHz Edition in single and CrossFire configurations as well as the GeForce GTX 680 in single-card and SLI configurations. To gather results I used two processes:
- Run FRAPS while running through a repeatable section and record frame rates and frame times for 60 seconds
- Run our Frame Rating capture system with a special overlay that allows us to measure frame rates and frame times in post-processing.
Here is an example of what the overlay looks like in Battlefield 3.
Frame Rating capture on GeForce GTX 680s in SLI
The column on the left is actually an overlay that is applied to each and every frame of the game early in the rendering process. A solid color is added to the PRESENT call (more details to come later) for each individual frame. As you know, when you are playing a game, multiple frames can make it to the screen during any single 60 Hz refresh cycle of your monitor, and because of that you see a succession of colors on the left-hand side.
By measuring the pixel height of those colored columns, and knowing beforehand the order in which they should appear, we can gather the same data that FRAPS does, but our results reflect what happens AFTER any driver optimizations and DirectX changes the game might make.
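To make the idea concrete, here is a minimal sketch of how those colored bands could be turned into per-frame screen time. It assumes you already have a 1-pixel-wide column of RGB values from a captured frame; the column extraction, color matching and overlay color order are simplified assumptions, not our exact tooling.

```python
# Sketch: group the overlay column into runs of identical color ("bands"),
# then convert each band's share of the 60 Hz scanout into milliseconds.
import numpy as np

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh, in milliseconds

def band_heights(column: np.ndarray) -> list[tuple[tuple[int, ...], int]]:
    """Given an (H, 3) RGB column, return (color, height_in_pixels) per band."""
    runs = []
    current = tuple(column[0])
    height = 1
    for pixel in column[1:]:
        pixel = tuple(pixel)
        if pixel == current:
            height += 1
        else:
            runs.append((current, height))
            current, height = pixel, 1
    runs.append((current, height))
    return runs

def on_screen_times_ms(column: np.ndarray) -> list[float]:
    """Each band's height as a fraction of the scanout gives its screen time."""
    total = column.shape[0]
    return [REFRESH_MS * h / total for _, h in band_heights(column)]

# Usage (hypothetical): column = captured_frame[:, 0, :3]
# print(on_screen_times_ms(column))
```

The key point is that these times are measured from the display output itself, after the driver and Direct3D have done their work, rather than from a software hook the way FRAPS does it.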
Frame Rating capture on Radeon HD 7970 CrossFire
Here you see a very similar screenshot running in CrossFire. Notice the thin silver band between the maroon and purple bands? That is a complete frame according to FRAPS and most reviews. Not to us; we think that rendered frame is almost useless.
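Building on the band-height sketch above, flagging these nearly invisible "runt" frames could be as simple as counting bands that fall below some minimum height. The 20-scanline threshold below is purely an illustrative assumption, not our actual cutoff.

```python
# Sketch: a band thinner than the threshold contributes almost nothing visible,
# so we count it as a runt rather than a "real" frame.
RUNT_THRESHOLD_PX = 20

def count_runts(column) -> int:
    return sum(1 for _, h in band_heights(column) if h < RUNT_THRESHOLD_PX)
```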
Subject: Graphics Cards | January 31, 2013 - 08:04 AM | Tim Verry
Tagged: PC, gaming, amd, graphics drivers, gpu, Crysis 3, catalyst
The Crysis 3 beta was launched January 29th, and AMD came prepared with its new Catalyst 13.2 beta driver. In addition to the improvements rolled into the Catalyst 13.1 WHQL graphics driver, Catalyst 13.2 beta features performance improvements in a number of games.
Foremost, AMD focused on optimizing the drivers for the Crysis 3 beta. With the new 13.2 beta drivers, gamers will see a 15% performance improvement in Crysis 3 when using high MSAA settings. AMD has also committed itself to future tweaks to improve Crysis 3 performance when using both single and CrossFire graphics configurations. The driver also allows for a 10% improvement in CrossFire performance in Crytek’s Crysis 2 and a 50% performance boost in DMC: Devil May Cry when running a single AMD GPU. Reportedly, the new beta driver also reduces latency issues in Skyrim, Borderlands 2, and Guild Wars 2. Finally, the 13.2 beta driver resolves a texture filtering issue when running DirectX 9.0c games.
Subject: Graphics Cards | January 18, 2013 - 11:33 AM | Tim Verry
Tagged: Radeon HD 7000, gpu, drivers, catalyst 13.1, amd
AMD recently released a new set of Catalyst graphics card drivers with Catalyst 13.1. The new drivers are WHQL (Microsoft certified) and incorporate all of the fixes contained in the 12.11 beta 11 drivers. The Radeon HD 7000 series will see the majority of the performance and stability tweaks with 13.1. Additionally, the Catalyst 13.1 suite includes a new 3D settings interface in Catalyst Control Center that allows per-application profile management. The Linux version of the Catalyst 13.1 drivers now officially support Ubuntu 12.10 as well.
Some of the notable performance tweaks for the HD 7000 series include:
- Improved CrossFire scaling performance in Call of Duty: Black Ops II.
- Up to a 25% performance increase in Far Cry 3 when using 8x MSAA.
- An 8% performance increase in Sleeping Dogs and StarCraft II.
- A 5% improvement in Max Payne 3.
Beyond the performance increases, AMD has fixed several bugs with the latest drivers. Some of the noteworthy fixes include:
- Fixed a system hang on X58 and X79 chipset-based systems using HD 7000-series GPUs.
- Fixed an intermittent hang with HD 7000-series GPUs in CrossFireX and Eyefinity configurations.
- Resolved a system hang in Dishonored on 5000 and 6000 series graphics cards.
- Resolved a video issue with Media Player Classic Home Cinema.
- Added Super Sample Anti-Aliasing support in the OpenGL driver.
AMD has also released a new standalone uninstallation utility that will reportedly clean your system of AMD graphics card drivers to make way for newer versions. That utility can be downloaded here.
If you have a Radeon HD 7000-series card, it would be worth it to update your drivers ASAP. You can download the Catalyst 13.1 drivers on the AMD website.
You can find a full list of the performance tweaks and bug fixes in the Catalyst 13.1 release notes.
In our previous article and video, I introduced you to our upcoming testing methodology for evaluating graphics cards based not only on frame rates but also on frame smoothness and the efficiency of those frame rates. I showed off some of the new hardware we are using for this process and detailed how direct capture of the graphics card output allows us to find interesting frame and animation anomalies using some Photoshop still frames.
Today we are taking that a step further and looking at a couple of captured videos that demonstrate a "stutter" and walking you through, frame by frame, how we can detect, visualize and even start to measure them.
This video takes a couple of examples of stutter in games, DiRT 3 and Dishonored to be exact, and shows what they look like in real time, at 25% speed and then finally in a much more detailed frame-by-frame analysis.
Obviously these are just a couple of instances of stutter, and there are often less apparent in-game stutters that are even harder to see in video playback. Not to worry - this capture method is capable of catching those issues too, and we plan on diving into the "micro" level shortly.
We aren't going to start talking about which card and which driver are being used yet, and I know that there are still a lot of questions to be answered on this topic. You will be hearing more from us quite soon, and I thank you all for your comments, critiques and support.
Let me know below what you thought of this video and any questions that you might have.
A change is coming in 2013
If the new year brings us anything, it looks like it might be the end of using "FPS" as the primary measuring tool for graphics performance on PCs. A long, long time ago we started with simple "time demos" that recorded rendered frames in a game like Quake and then played them back as quickly as possible on a test system. The lone result was a time, in seconds, which was then converted to an average frame rate since the total number of frames recorded was known to start with.
More recently we saw a transition to frame rates over time and the advent of frame time graphs like the ones we have been using in our graphics reviews on PC Perspective. This expanded the amount of data required to get an accurate picture of graphics and gaming performance, but it was indeed more accurate, giving us a clearer picture of how GPUs (and CPUs and systems, for that matter) perform in games.
And even though the idea of frame times has been around just as long, not many people were interested in getting into that level of detail until this past year. A frame time is the amount of time each frame takes to render, usually listed in milliseconds, and can range from 5ms to 50ms depending on performance. For reference, 120 FPS equates to an average of 8.3ms per frame, 60 FPS is 16.6ms and 30 FPS is 33.3ms. But rather than averaging those out over each second of time, what if you looked at each frame individually?
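For anyone who wants to check the arithmetic, the conversion between frame rate and frame time is simply the reciprocal; this tiny snippet reproduces the figures quoted above.

```python
# Frame rate <-> frame time conversion: frame_time_ms = 1000 / fps.
def fps_to_ms(fps: float) -> float:
    return 1000.0 / fps

def ms_to_fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

print(fps_to_ms(120), fps_to_ms(60), fps_to_ms(30))  # ~8.3, ~16.7, ~33.3 ms
```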
Scott over at Tech Report started doing that this past year and found some interesting results. I encourage all of our readers to follow up on what he has been doing as I think you'll find it incredibly educational and interesting.
Through emails and tweets, many PC Perspective readers have been asking for our take on it, why we weren't testing graphics cards in the same fashion yet, etc. I've stayed quiet about it simply because we were working on quite a few different angles on our side and I wasn't ready to share results. I am still not ready to share the bulk of our information yet, but I am ready to start the discussion, and I hope our community finds it compelling and offers some feedback.
At the heart of our unique GPU testing method is this card, a high-end dual-link DVI capture card capable of handling 2560x1600 resolutions at 60 Hz. Essentially this card acts as a monitor for our GPU test bed and allows us to capture the actual display output that reaches the gamer's eyes. This method is the best possible way to measure frame rates, frame times, stutter, runts, smoothness, and any other graphics-related metrics.
Using that recorded footage, which sometimes reaches 400 MB/s of consistent writes at high resolutions, we can then analyze the frames one by one, albeit with the help of some additional software. There are a lot of details that I am glossing over, including the need for perfectly synced frame rates, having absolutely zero dropped frames in the recording, the analysis process, etc., but trust me when I say we have been spending a lot of time on this.
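To give a rough sense of why storage throughput is the hard part, here is a back-of-the-envelope estimate of the raw capture bandwidth. The bytes-per-pixel figures are assumptions about the capture pixel format; the real pipeline may subsample or lightly compress, which is why the observed write rate can differ from these numbers.

```python
# Sketch: uncompressed capture bandwidth = width * height * refresh * bytes per pixel.
def capture_rate_mb_s(width: int, height: int, refresh_hz: int,
                      bytes_per_pixel: float) -> float:
    return width * height * refresh_hz * bytes_per_pixel / 1_000_000

print(capture_rate_mb_s(2560, 1600, 60, 2))  # ~491 MB/s at 2 bytes/pixel (e.g. YUV 4:2:2)
print(capture_rate_mb_s(2560, 1600, 60, 3))  # ~737 MB/s at 3 bytes/pixel (24-bit RGB)
```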
Subject: General Tech | December 27, 2012 - 03:37 PM | Ken Addison
Tagged: video, ssd, podcast, picks of the year, memory, gpu, editors choice, cpu, case, best of the year
PC Perspective Podcast #232 - 12/27/2012
Join us this week as we discuss our picks for Best Products of 2012!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano and Scott Michaud
Program length: 1:40:13
Podcast topics of discussion:
- Welcome to our Best Of 2012 Episode!
- Rules explanation: there are no rules!!
0:07:30 Best CPU
- AMD Trinity A10 APU
- Intel Core i7-3770K
- Intel Ultrabook mobile CPU
- Qualcomm Krait Mobile CPU
0:20:00 Best Motherboard
- MSI Z77 MPower
- ASUS Rampage IV Extreme
- ASUS Z77 Mini ITX
- EVGA dual Xeon board
- ASUS Crosshair Formula-Z
0:31:20 Best GPU
- GeForce GTX 680 2GB
- GeForce GTX 660 Ti 2GB
- Radeon HD 7970 3GB
- Radeon HD 7870 2GB
0:44:00 This Podcast is brought to you by MSI!
0:45:00 Best Storage
- Samsung 840 Pro SSD
- OCZ Vector SSD
- WD Red HDD
- WD 4TB Enterprise
1:05:00 Best Case
- Corsair Carbide 300R
- Corsair Obsidian 550D
- Cooler Master Cosmos II
- Mineral Oil Case
1:12:00 Best Price Drop
- AMD Radeon 7000 GPUs
- Good IPS displays ($199 - $400)
- 2560x1440 27-in panels
- System Memory
1:22:00 Best Newly Popular Technology
- High-res monitors (Korean or otherwise)
- Cherry style keyboards
- Mini ITX Motherboards
- Touch screen?
1:35:00 Best Lenovo Laptop on my Desk
- Thinkpad Twist
- Thinkpad X1 Carbon
- Thinkpad X230
- Yoga 13
- 1-888-38-PCPER or firstname.lastname@example.org
- http://twitter.com/ryanshrout and http://twitter.com/pcper
Subject: General Tech | December 3, 2012 - 01:35 AM | Tim Verry
Tagged: mining, gpu, btc, block reward halving, block reward, bitcoin
Earlier this week, the 50BTC reward given to miners that successfully find blocks of Bitcoin transactions was halved to 25BTC. This means that the gross income of miners is now half of what it has been since the cryptocurrency’s inception. As a result, many miners – especially those using graphics cards – will have to re-evaluate their net earnings to determine if they are still making any profits from mining coins after hardware and electricity costs are taken into consideration.
As an example, when mining bitcoins using a single shader-unlocked AMD Radeon HD 6950 graphics card, I was able to obtain approximately 0.12 BTC per day. Now, with the reward cut in half, I am able to make about 0.06 BTC per day. Unfortunately, that is approximately the same amount of BTC (when converted to USD) as it costs in electricity to run the card, negating profits. Technically, according to Allchains.info, I am (just barely) still profitable, with a net profit of $0.076 USD per day after electricity costs. As the exchange rate has gone up slightly, it is a bit more than that in actuality, but it is still a good estimate of profitability. There are also non-monetary costs associated with mining bitcoins in the form of the extra heat and noise generated by a graphics card running under load 24/7. And at $0.076 a day, it really does not seem worth it anymore. Then again, the card does act as a room heater in the winter and at least subsidizes part of the cost of heating the room (heh), even if it does not pay out much more than it costs to run.
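The profitability math here is straightforward: daily revenue is coins earned times the exchange rate, minus what the card costs to run. The power draw, electricity price and exchange rate in this sketch are illustrative assumptions, not my exact figures.

```python
# Sketch: daily GPU mining profit = (BTC/day * USD/BTC) - electricity cost.
def daily_profit_usd(btc_per_day: float, usd_per_btc: float,
                     card_watts: float, usd_per_kwh: float) -> float:
    revenue = btc_per_day * usd_per_btc
    electricity = (card_watts / 1000.0) * 24.0 * usd_per_kwh  # kWh/day * price
    return revenue - electricity

# e.g. 0.06 BTC/day after the halving, with made-up power and price figures:
print(daily_profit_usd(0.06, 12.50, 220, 0.13))  # pennies per day
```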
It will be interesting to see how miners react, especially once the colder months are behind us. Some Bitcoin miners have moved on to alternative cryptocurrencies such as Litecoin and Terracoin; however, that has caused the difficulty of mining LTC to double without an equal increase in its exchange rate, which has actually made mining LTC less profitable than mining BTC (d'oh!).
The reward halving is not an unexpected event; in fact, it was designed into the Bitcoin protocol from the start to limit inflation. Just as the mining difficulty adjusts every 2,016 blocks, the reward is cut in half every 210,000 blocks (roughly every four years) until mining blocks no longer provides a reward (with the intention that transaction fees will provide all of the incentive to mine from then on). As such, many miners knew about it beforehand and planned accordingly. Some miners sold off their rigs, others (mostly those with free electricity) are continuing to mine, and yet other miners moved to alternative cryptocurrencies. (That's not to mention those that mine for the purpose of securing the network, which is not always a profitable, albeit necessary, pursuit.)
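The schedule itself is simple enough to express in a few lines. This is a simplified sketch of the subsidy rule (the real protocol works in integer satoshis and rounds down), shown here just to illustrate the halving discussed above.

```python
# Sketch: the block subsidy starts at 50 BTC and halves every 210,000 blocks.
def block_reward_btc(block_height: int) -> float:
    halvings = block_height // 210_000
    if halvings >= 64:        # after 64 halvings the subsidy is effectively zero
        return 0.0
    return 50.0 / (2 ** halvings)

print(block_reward_btc(0))        # 50.0
print(block_reward_btc(210_000))  # 25.0 -- the halving this article covers
```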
Interestingly, the difficulty of Bitcoin is estimated to increase rather than decrease with the upcoming adjustment, despite the reward split and a certain number of users moving away from mining. This may be the result of miners speculating that the price (the USD/BTC exchange rate) will increase and/or that other miners will drop off and the difficulty will begin to decrease at some point. Miners with large farms of graphics cards may also be able to hold out despite the block reward halving, as they have enough hashing power to keep profits at a worthwhile level. While my single card is only making about $18 a month now (versus roughly $35+ before), users with more cards can still see sizable returns. So long as profits are there, mining will continue, though newcomers looking to invest in mining rigs are less likely to join in the current climate (seemingly vaporware ASICs notwithstanding). There may also be users that mine solely for BTC that they will hoard or spend with merchants (like WordPress) that accept bitcoins, without concern for exchanging to USD or other national currencies.
In all, there are an astounding number of factors surrounding the block reward halving, along with many theories about what will happen as a result of it. At this point it is still too early to tell, but it will be interesting to see which theories hold true.
Read more about the bitcoin cryptocurrency and how mining works at PC Perspective.
What do you think about the bitcoin reward being cut in half? How will it affect you, and will you continue to mine at the current exchange rate and difficulty? Let us know in the comments below (no registration required).