AMD To Release 2GB DDR3 Radeon Branded DIMMs

Subject: Memory | August 8, 2011 - 12:38 PM |
Tagged: radeon, memory, ddr3, amd

When building a computer, enthusiasts are likely to combine components from several different manufacturers, especially on the Intel side.  Aside from the power supply, hard drive, and accessories, however, AMD is slowly diversifying to provide every component needed to put together an all-AMD system.  Before today, AMD already had the motherboard, processor, and graphics card (including processor graphics if that's your thing), and today Maximum PC reports that AMD may be moving into the RAM market with its own line of Radeon branded memory.  It seems that AMD's future Leo-like platform may resemble a small AMD branded Borg cube.

50094_Chip_frontand_Back228W.png

The new memory in question comes in 2GB DDR3 sticks across three series branded the "Entertainment," "ULTRA PRO Gaming," and "Enterprise."  The enthusiastic naming conventions aside, the Entertainment series looks to be the budget line for those looking for stable DIMMs that get the job done for cheap; these modules are rated at 1333 MT/s with 9-9-9 timings.  The next step up is the "ULTRA PRO Gaming" series, which promises to be overclocking friendly.  These DIMMs receive a boost in speed to 1600 MT/s, though at looser 11-11-11 timings.  The final, and likely most expensive, modules are the Enterprise series.  These modules remain somewhat of a mystery, as AMD has yet to announce their specifications; however, being unbuffered DIMMs, they are likely geared more towards enterprise workstations than servers.

Further, all three series are rated to run at 1.5V and stand 30mm tall.  Unfortunately, there is no word yet on price or availability.  There are, however, several photos of Radeon branded memory modules over at PC Watch for you to check out.  Do you think AMD's move into the DRAM market is a good thing or a bad thing for its future profitability?

Source: AMD
Subject: Editorial
Manufacturer: AMD

The Dirty Laggard

It may seem odd, but sometimes reviewers are among the last folks to implement new technology.  This has been the case for me many a time.  Yes, we get some of the latest and greatest components, but often we review them and then keep them on the shelf for comparative purposes, all the while our personal systems run last-generation parts that we will never need to re-integrate into a test rig again.  In other cases, big-money parts, like the one 30” 2560x1600 LCD that I own, are always being utilized on the testbed and never actually used for things like browsing, gaming, or other personal activities.  Don’t get me wrong, this is not a “woe-is-me” rant about the hardships of being a reviewer, but rather just an interesting side effect not often attributed to folks who do this type of work.  Yes, we get the latest to play with and review, but we don’t often actually use these new parts in our everyday lives.

One of the technologies that I had only ever seen at trade shows is Eyefinity.  It was released back in the Fall of 2009 and really gained some momentum in 2010.  Initially it was incompatible with CrossFire technology, which limited it to a great degree.  A single HD 5970 card could push 3 x 1920x1080 monitors in most games, but usually only with details turned down and no AA enabled.  Once AMD worked a bit more on the drivers, we were able to see CrossFire setups working in Eyefinity, which allowed users to play games at higher fidelity with the other little niceties enabled.  The release of the HD 6900 series of cards also proved to be a boon for Eyefinity, as these new chips had much better CrossFire scaling and were also significantly faster than the earlier HD 5800 series at those price points.

eye_fin.jpg

Continue on to the rest of the story for more on my experiences with AMD Eyefinity.

Manufacturer: General

How much will these Bitcoin mining configurations cost you in power?

Earlier this week we looked at Bitcoin mining performance across a large range of GPUs, but we had many requests for estimates on the cost of the power to drive them.  At the time we were much more interested in the performance of these configurations, but now that we have that information and have started to look at the potential profitability of doing something like this, looking at the actual real-world cost of running a mining machine 24 hours a day, 7 days a week became much more important. 

This led us to today's update, where we will talk about the average cost of power, and thus the average cost of running our 16 different configurations, in 50 different locations across the United States.  We got our data from the U.S. Energy Information Administration website, where they provide average retail prices on electricity divided up by state and by region.  For our purposes today, we downloaded the latest XLS file (which has slightly more updated information than the website as of this writing) and went to work with some simple math. 
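To give an idea of how that math works, here is a minimal Python sketch of the calculation.  The 450 W draw and 11.5 cents/kWh rate below are hypothetical placeholders, not our measured figures; substitute your own card's at-the-wall draw and your state's rate from the EIA spreadsheet:

    # Hypothetical sketch: the wattage and rate below are placeholders,
    # not our measured numbers.
    HOURS_PER_MONTH = 24 * 30  # mining around the clock

    def monthly_power_cost(watts: float, cents_per_kwh: float) -> float:
        # Dollars per month for a load running 24/7.
        kwh = watts / 1000.0 * HOURS_PER_MONTH
        return kwh * cents_per_kwh / 100.0

    # Example: a 450 W mining rig in a state paying 11.5 cents/kWh.
    print(f"${monthly_power_cost(450, 11.5):.2f}/month")  # -> $37.26/month

Multiply that out across our 16 configurations and the per-state rates and you get the comparisons below.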

Here is how your state matches up:

kwh-1.jpg

kwh-2.jpg

The first graph shows the rates in alphabetical order by state, the second in order from the most expensive to the least.  The first thing we noticed: if you live in Hawaii, I hope you REALLY love the weather.  And maybe it's time to look into that whole solar panel thing, huh?  Because Hawaii was SO FAR out beyond our other data points, we are leaving it out of our calculations and will instead ask residents and the curious to just basically double one of our groupings.

Keep reading to get the full rundown on how power costs will affect your mining operations, and why it may not make sense to mine AT ALL with NVIDIA graphics cards! 

Manufacturer: General

What is a Bitcoin?

This article looking at Bitcoins and the performance of various GPUs in mining them was really a big team effort at PC Perspective.  Props go out to Tim Verry for doing the research on the process of mining and helping to explain what Bitcoins are all about.  Ken Addison did a great job working through an allotment of graphics cards, running our GUIMiner and getting the data you will see presented later.  Scott Michaud helped with some graphics and imagery, and I'm the monkey that just puts it all together at the end.

** Update 7/13/11 **  We recently wrote another piece on the cost of the power to run the Bitcoin mining operations used in this performance article.  Based on the individual prices of electricity in all 50 US states, we found that the cost of the power to run some cards exceeded the value of the Bitcoin currency at today's exchange rates.  I would highly recommend you check out that story as well after giving this performance-based article a thorough reading.  ** End Update **

A new virtual currency called Bitcoin has been receiving a great deal of news fanfare, criticism, and user adoption. The so-called cryptographic currency uses strong encryption to eliminate the need for trust when buying and selling goods over the Internet, in addition to a peer-to-peer distributed timestamp server that maintains a public record of every transaction to prevent double spending of the electronic currency. The aspects of Bitcoin that have drawn the most criticism and driven its recent rapid growth are, respectively, its inherent ability to anonymize the real-life identities of users (though the transactions themselves are public) and the ability to make money by supporting the Bitcoin network in verifying pending transactions through a process called “mining.” Privacy, security, cutting out the middle man, making it easy for users to do small casual transactions without fees, and the ability to be rewarded for helping to secure the network by mining are all selling points (pun intended) of the currency.
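As a concrete aside on what “mining” means computationally: miners race to find a nonce that makes the double SHA-256 hash of a block header fall below a network-set target. The Python toy below is a deliberate simplification of the real block-header format, shown only to illustrate the brute-force search:

    import hashlib

    def double_sha256(data: bytes) -> bytes:
        # Bitcoin hashes block headers with two rounds of SHA-256.
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    def toy_mine(header: bytes, difficulty_bits: int) -> int:
        # Brute-force a nonce until the hash has enough leading zero bits;
        # a toy stand-in for the real compact-target comparison.
        target = 1 << (256 - difficulty_bits)
        nonce = 0
        while True:
            attempt = header + nonce.to_bytes(4, "little")
            if int.from_bytes(double_sha256(attempt), "big") < target:
                return nonce
            nonce += 1

    print("found nonce:", toy_mine(b"toy block header", 20))  # ~1M hashes on average

The real network adjusts the target so that, globally, one block is found roughly every ten minutes no matter how much hash power joins. This is exactly the kind of integer-heavy, embarrassingly parallel workload that GPUs chew through far faster than CPUs, which is why the performance results later in this article matter.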

When dealing with a more traditional, physical local currency, there is a need for both parties to trust the currency, but not much need to trust each other, as handing over cash is fairly straightforward. One does not need to trust the other person as much as with a check, which could bounce. Once cash has changed hands, the buyer cannot go and spend that money elsewhere, as it is physically gone. Transactions over the Internet, however, greatly reduce the convenience of that local currency, and due to the series of tubes’ inability to carry cash through the pipes, services like PayPal as well as credit cards and checks are used in its place. While these replacements are convenient, they are also much riskier than cash, as fraudulent charge-backs and disputes are likely to occur, leaving the seller in a bad position. Due to this risk, sellers have to factor a certain percentage of expected fraud into their prices, in addition to collecting as much personally identifiable information as possible. Bitcoin seeks to remedy these risks by bringing the convenience of a local currency to the virtual plane, with irreversible transactions, a public record of all transactions, and the ability to trust strong cryptography instead of needing to trust people.

paymentpal.jpg

There are a number of security measures inherent in the Bitcoin protocol that assist with these security goals. Foremost, Bitcoin uses strong public and private key cryptography to secure coins to a user. Money is handled by a bitcoin wallet, which is a program, such as the official Bitcoin client, that creates public/private key pairs allowing you to send and receive money. You are further able to generate new receiving addresses whenever you want within the client. The wallet.dat file is the record of all your key pairs, and thus your bitcoins; it contains 100 address/key pairs (though you are able to generate new ones beyond that). To send money, one needs only to sign the bitcoins over to the recipient’s public key with one’s own private key. This creates a chain of transactions that are secured by these public and private key pairs from person to person. Unfortunately, this cryptography alone is not able to prevent double spending: Person A could sign the bitcoin over to Person B with his private key, but could also do the same to Person C, and so on. This issue is where the peer-to-peer and distributed computing aspects of the Bitcoin protocol come into play. By using a peer-to-peer distributed timestamp server, the Bitcoin protocol creates a public record of every transaction that prevents double spending of bitcoins. Once the bitcoin has been signed over to a public key (receiving address) with the user’s private key and the network has confirmed the transaction, the bitcoins can no longer be spent by Person A; the network has confirmed that the coin now belongs to Person B, and only Person B can spend it using their private key.
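To make the key-pair mechanics concrete, here is a minimal sketch using the third-party Python ecdsa package over Bitcoin’s secp256k1 curve. This is our illustration, not the official client’s code, and the “address” is reduced to a bare hash rather than the real Base58Check form:

    import hashlib
    from ecdsa import SigningKey, SECP256k1  # third-party `ecdsa` package

    # Person A's key pair: the private key lives in the wallet,
    # the public key is shared freely.
    sk_a = SigningKey.generate(curve=SECP256k1)
    vk_a = sk_a.get_verifying_key()

    # Person B's receiving address, reduced here to a hash of B's public key.
    sk_b = SigningKey.generate(curve=SECP256k1)
    address_b = hashlib.sha256(sk_b.get_verifying_key().to_string()).hexdigest()

    # A toy "transaction": A signs the coin over to B's address.
    tx = f"pay 1 BTC to {address_b}".encode()
    signature = sk_a.sign(tx)

    # Anyone on the network can verify the transfer against A's public key,
    # but only B's private key can sign the coin onward -- hence the chain.
    assert vk_a.verify(signature, tx)

Note that it is the network-wide record of which signed transfer came first, not the signatures themselves, that stops Person A from signing the same coin over to two different people.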

Keep reading our article that details the theories behind Bitcoins as well as the performance of modern GPUs in mining them!  

Manufacturer: AMD

Introducing the AMD FSA

At AMD’s Fusion 11 conference, we were treated to a nice overview of AMD’s next generation graphics architecture.  With the recent change in their lineup from the previous VLIW-5 setup (which powered their graphics chips from the Radeon HD 2900 through the latest “Barts” chip running the HD 6800 series) to the new VLIW-4 (HD 6900), many were not expecting much from AMD in terms of new and unique designs.  The upcoming “Southern Islands” parts were thought to be based on the current VLIW-4 architecture, offering more performance and a few new features thanks to the die shrink to 28 nm.  It turns out that speculation was wrong.

amd_fsa01.jpg

In late Q4 of this year we should see the first iteration of this new architecture, which was detailed today by Eric Demers.  The overview covered some features that will not make it into this upcoming product, but they will eventually all be added in over the next three years or so.  Historically speaking, AMD has placed graphics first, with GPGPU/compute as the secondary functionality of their GPUs.  While we have had compute abilities since the X1800/X1900 series of products, AMD has not been as aggressive with compute as its primary competition.  From the G80 GPUs onward, NVIDIA has pushed compute harder and farther than AMD has.  With its mature CUDA development tools and the compute-heavy Fermi architecture, NVIDIA has been a driving force in this particular market.  Now that AMD has released two APU-based products (Llano and Brazos), they are starting to really push OpenCL, DirectCompute, and the recently announced C++ AMP.

Continue reading for all the details on AMD's Graphics Core Next!

Wii U Revealed To Contain Last Gen PC DirectX 10.1 Capable AMD Radeon GPU

Subject: Systems | June 14, 2011 - 05:08 PM |
Tagged: Wii U, radeon, r770, Nintendo, amd

While the current Nintendo console’s internals are very underpowered compared to the competition from the Xbox 360 and PS3, the company looks to leapfrog those consoles in the graphics department with the upcoming Wii U console. According to Engadget, the new Nintendo offering will come equipped with a GPU much like that of AMD’s 4800 series. The custom R770 chip is DirectX 10.1 and multi-display capable, allowing the console to output up to four SD video streams.

While the proposed chip is last-generation in terms of PC gaming, on the console front it will be the highest-end GPU of the current crop, with the Xbox 360 using a custom ATI X1900-class GPU and the PS3 employing a custom RSX (“Reality Synthesizer”) graphics chip based on NVIDIA’s 7800 GTX PC graphics card.

What do you think about Nintendo’s move to employ the AMD GPU?

Source: Engadget
Manufacturer: MSI Computers

MSI R6970 Lightning: High Speed, Low Drag

MSI has been on a tear as of late with their video card offerings.  The Twin Frozr II and III series have all received positive reviews, people seem to be buying their products, and the company has taken some interesting turns in how they handle overall design and differentiation in a very crowded graphics marketplace.  This did not happen overnight, and MSI has been a driving force in how the video card business has developed.

r6970_box1.jpg

Perhaps a company’s reputation is best summed up by what the competition has to say about them.  I remember well back in 1999 when Tyan was first considering going into the video card business.  Apparently they were going to release an NVIDIA TNT2-based card to the marketplace and attempt to work their way upwards with more offerings.  That particular project was nixed by management.  A few years later Tyan attempted the graphics business again, this time with some ATI Radeon 9000 series cards.  Their biggest seller was their 9200 cards, but they also offered the Tachyon 9700 Pro.  In talking with Tyan about where they were, the marketing guy simply looked at me and said, “You know, if we had pursued graphics back in 1999 we might be in the same position that MSI is in now.”

AMD Radeon HD 6770 and 6750 Launch, add Blu-ray 3D decode acceleration

Subject: Graphics Cards | April 28, 2011 - 12:49 AM |
Tagged: radeon, amd, 6770, 6750, 5770, 5750

After the release of the AMD Radeon HD 6790 graphics card earlier this month, which brought the Barts GPU architecture down to the sub-$150 graphics market, we expected to see something in a similar vein from the updated HD 6770 and HD 6750 cards.  But it was not to be: the Radeon HD 6770 and HD 6750 will continue in nearly identical fashion to the Radeon HD 5770 and HD 5750 as we know them today.

6770-1.png

When released back in October of 2009, the Radeon HD 5770 and HD 5750 were based on the 40nm Juniper GPU, ran at clock speeds of 850 MHz and 700 MHz respectively, and included 1GB of GDDR5 memory running at either 1200 MHz or 1150 MHz.  Today, as the Radeon HD 6770 and HD 6750 see the light of day, we are greeted with basically identical specs:

6770-4.png

Read on for more information!

Source: AMD
Manufacturer: AMD
Tagged: turks, radeon, htpc, amd, 6670, 6570

Introduction and the new Turks GPU

Introduction

It seems that the graphics card wars have really heated up recently.  With the release of the Radeon HD 6990 4GB and the GeForce GTX 590 3GB, it might seem that EVERYONE is spending $600 on their next GPU purchase.  Obviously that isn't the case, and the world of the sub-$100 card, while way less sexy, is just as important.

slide01.jpg

This week AMD has announced a slew of new options to address this market, including the Radeon HD 6670, HD 6570 and even the HD 6450.  Topping out at $99, the Radeon HD 6670 offers good performance, strong HTPC features and low power consumption.  NVIDIA's competition is still reasonable, though, as we examine how the now price-dropped GeForce GTS 450 fits into the stack.