May the Radeon be with You
In celebration of the release of The Force Awakens, as well as the new Star Wars Battlefront game from DICE and EA, AMD sent over some hardware for a system build targeted at getting users up and running in Battlefront with impressive quality and performance, but still on a reasonable budget. Pairing an AMD processor, MSI motherboard and Sapphire GPU with a low-cost chassis, SSD and more, the combined system, including a FreeSync monitor, comes in at around $1,200.
Holiday breaks are MADE for Star Wars Battlefront
Though the holiday is already here and you'd be hard pressed to build this system in time for it, I have a feeling that quite a few of our readers and viewers will find themselves with some cash and gift certificates in hand, just ITCHING for a place to invest in a new gaming PC.
The video above includes a list of components, the build process (in brief), and shows us getting our gaming on with Star Wars Battlefront. Interested in building a system similar to the one above on your own? Here's the hardware breakdown.
| Component | AMD Powered Star Wars Battlefront System |
| --- | --- |
| Processor | AMD FX-8370 - $197 |
| CPU Cooler | Cooler Master Hyper 212 EVO - $29 |
| Motherboard | MSI 990FXA Gaming - $137 |
| Memory | AMD Radeon Memory DDR3-2400 - $79 |
| Graphics Card | Sapphire NITRO Radeon R9 380X - $266 |
| Storage | SanDisk Ultra II 240GB SSD - $79 |
| Case | Corsair Carbide 300R - $68 |
| Power Supply | Seasonic 600 watt 80 Plus - $69 |
| Monitor | AOC G2460PF 1920x1080 144Hz FreeSync - $259 |
| Total Price | Full System (without monitor) - Amazon.com - $924 |
For under $1,000, plus another $250 or so for the AOC FreeSync capable 1080p monitor, you can have a complete gaming rig for your winter break. Let's detail some of the specific components.
AMD sent over the FX-8370 processor for our build, a 4-module / 8-core CPU that runs at 4.0 GHz, more than capable of handling any gaming workload you can toss at it. And if you need to do some transcoding, video work or, heaven forbid, school or productivity work, the FX-8370 has you covered there too.
For the motherboard AMD sent over the MSI 990FXA Gaming board, one of the newer AMD platforms that includes support for USB 3.1, so the build should stay useful for future expansion for a good while. The Cooler Master Hyper 212 EVO was our choice to keep the FX-8370 running cool, and 8GB of AMD Radeon DDR3 memory is enough to keep applications and the Windows 10 operating system happy.
Subject: General Tech | December 22, 2015 - 02:07 PM | Jeremy Hellstrom
Tagged: amd, Samsung, 14nm, rumour
The talk around the watercooler includes a rumour that AMD may use Samsung to produce at least some of their 14nm chips in the coming year. If true, it caps a huge year for Samsung, which already produces NVIDIA chips and recently picked up a contract with Apple to produce some of its A9 SoCs. The rumour still lists GLOBALFOUNDRIES as a source for APUs and GPUs, which would make Samsung a second source for working silicon and, we can hope, alleviate some of AMD's difficulty in keeping products in supply. The deal could also help fund Samsung's development of its 10nm FinFET node, which the company claims should be in production by the end of 2016. As always, take the rumour for what it is, but if you want to learn more about what is being said you can pop over to The Inquirer.
"A report in South Korea's Electronic Times, which cited unknown sources, said that Samsung Electronics will start making new chips for AMD sometime next year."
Here is some more Tech News from around the web:
- Toshiba denies NAND exit report with 'no decision made' comment @ The Register
- 25 years ago: Sir Tim Berners-Lee builds world's first website @ The Register
- Facepalm time: Windows 10 security patch wipes custom Word autotext @ The Register
- Make Show-Stopping Netflix Socks @ MAKE:Blog
Subject: Graphics Cards | December 17, 2015 - 05:49 PM | Jeremy Hellstrom
Tagged: radeon, crimson, amd
That's right folks, the official AMD Radeon Software Crimson Edition 15.12 has just launched for you to install. It includes the fan speed fixes for AMD Overdrive: your settings will now stick, and the fans will revert to normal when you return to the desktop after an intense gaming session. There are multiple fixes for Star Wars Battlefront and Fallout 4, along with several GUI fixes within the software itself. As always a few kinks are still being worked out, but overall it is worth popping over to AMD to grab the new driver. Upgrading from within Crimson should also give you fewer issues after this update.
Subject: General Tech | December 17, 2015 - 02:35 PM | Ken Addison
Tagged: video, Thrustmaster, T300, snapdragon 820, Skylake, qualcomm, podcast, logitech g, Intel, i3-6100, gpuopen, gameworks, arx control, amd
PC Perspective Podcast #379 - 12/17/2015
Join us this week as we discuss the Snapdragon 820, AMD's GPUOpen, Thrustmaster T300 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Josh Walrath, Jeremy Hellstrom, Allyn Malventano, and Sebastian Peak
Program length: 1:13:34
Week in Review:
News item of interest:
Hardware/Software Picks of the Week:
Sebastian: If only you could buy this case.
Open Source your GPU!
As part of the AMD’s recent RTG (Radeon Technologies Group) Summit in Sonoma, the company released information about a new initiative to help drive development and evolution in the world of gaming called GPUOpen. As the name implies, the idea is to use an open source mentality to drivers, libraries, SDKs and more to improve the relationship between AMD’s hardware and the gaming development ecosystem.
When the current generation of consoles was first announced, AMD was riding a wave of positive PR that it hadn't felt in many years. Because AMD Radeon hardware was at the root of the PlayStation 4 and the Xbox One, game developers would become much more adept at programming for AMD's GCN architecture, and those benefits would waterfall down to PC gamers. At least, that was the plan. In practice, though, you'd be hard pressed to find an analyst willing to put their name on a statement claiming that AMD's proclamation actually transpired. It just hasn't happened, but that does not mean it still can't if all the pieces fall into place.
The issue that AMD, NVIDIA, and game developers have to work around is a divided development ecosystem. While on the console side programmers tend to have very close-to-the-metal access to CPU and GPU hardware, that hasn't been the case with PCs until very recently. AMD was the first to make moves in this area with the Mantle API, but now we have DirectX 12, a competing low-level API that will have much wider reach than Mantle or Vulkan (which is what Mantle has become).
AMD also believes, as do many developers, that a “black box” development environment for tools and effects packages is having a negative effect on the PC gaming ecosystem. The black box mentality means that developers don’t have access to the source code of some packages and thus cannot tweak performance and features to their liking.
Subject: Graphics Cards | December 14, 2015 - 03:55 PM | Jeremy Hellstrom
Tagged: amd, asus, STRIX R9 380X DirectCU II OC, overclock
Out of the box the ASUS STRIX R9 380X OC has a top GPU clock of 1030MHz and memory at 5.7GHz, enough to outperform a stock GTX 960 4GB at 1440p but not enough to provide satisfactory performance at that resolution. After spending some time with the card, [H]ard|OCP determined that the best overclock they could coax out of this particular GPU was 1175MHz on the core and 6.5GHz on the memory, so they set about testing 1440p performance again. To keep things fair they also overclocked their STRIX GTX 960 OC 4GB, to 1527MHz and 8GHz. Read the full review for the detailed results; you will see that overclocking your 380X really does increase the value you get for your money.
"We take the new ASUS STRIX R9 380X DirectCU II OC based on AMD's new Radeon R9 380X GPU and overclock this video card to its highest potential. We'll compare performance in six games, including Fallout 4, to a highly overclocked ASUS GeForce GTX 960 4GB video card and find out who dominates 1440p gaming."
Here are some more Graphics Card articles from around the web:
- Sapphire R9 390 Nitro 8GB @ Kitguru
- XFX R9 380X DD XXX OC Review @ OCC
- AMD R9 380X 4GB Graphics Card CrossFire @ eTeknix
- HIS R9 380X IceQ X2 Turbo 4GB Video Card Review @ Madshrimps
What RTG has planned for 2016
Last week the Radeon Technologies Group invited a handful of press and analysts to a secluded location in Sonoma, CA to discuss the future of graphics, GPUs and, of course, Radeon. For those of you that are a bit confused, the RTG (Radeon Technologies Group) was spun up inside AMD to encompass all of the graphics products and IP inside the company. Though today's story is not going to focus on the fundamental changes that RTG brings to the future of AMD, I will note, without commentary, that we did not see a single AMD logo in our presentations or in the signage present throughout the week.
Much of what I learned during the RTG Summit in Sonoma is under NDA and will likely be so for some time. We learned about the future architectures, direction and product theories that will find their way into a range of solutions available in 2016 and 2017.
What I can discuss today is a pair of features that are being updated and improved for current generation graphics cards and for Radeon GPUs coming in 2016: FreeSync and HDR displays. The former is one that readers of PC Perspective should be very familiar with while the latter will offer a new window into content coming in late 2016.
High Dynamic Range Displays: Better Pixels
In just the last couple of years we have seen a spike in resolution for mobile, desktop and notebook displays. We now regularly have 4K monitors on sale for around $500 and very good quality 4K panels going for something in the $1000 range. Couple that with the increase in market share of 21:9 panels with 3440x1440 resolutions and clearly there is a demand from consumers for a better visual experience on their PCs.
But what if the answer isn't just more pixels, but better pixels? We already discuss this weekly when comparing render resolutions in games, 4K at lower image quality settings versus 2560x1440 at maximum IQ settings, for example. But the truth is that panel technology has the ability to make a dramatic change to how we view all content, whether games, movies or productivity, with the introduction of HDR: high dynamic range.
As the slide above demonstrates, there is a wide range of luminance in the real world that our eyes can perceive. Sunlight crosses the 1.6 billion nits mark while basic fluorescent lighting in our homes and offices exceeds 10,000 nits. Compare that to the most modern PC displays, which range from 0.1 nits to 250 nits, and you can already tell where the discussion is heading. Even the best LCD TVs on the market today only span 0.1 to 400 nits.
Subject: Graphics Cards, Processors | December 8, 2015 - 08:07 AM | Scott Michaud
Tagged: hsa, GCC, amd
Phoronix, the Linux-focused hardware website, highlighted patches for the GNU Compiler Collection (GCC) that implement HSA. This will allow newer APUs, such as AMD's Carrizo, to accelerate chunks of code (mostly loops) that have been tagged with a compiler pragma as worthwhile to run on the GPU. While I have done some GPGPU development, many of the low-level specifics of HSA aren't areas where I have much experience.
The patches have been managed by Martin Jambor of SUSE Labs. You can see a slideshow presentation of their work on the GNU website. Even though features froze about a month ago, they are apparently hoping that this will make it into the official GCC 6 release. If so, many developers around the world will be able to target HSA-compatible hardware in the first half of 2016. Technically, anyone can do so regardless, but they would need to specifically use the unofficial branch on the GCC Subversion repository. This probably means compiling it themselves, and it might even be behind on a few features in other branches that were accepted into GCC 6.
Subject: Graphics Cards | December 6, 2015 - 11:29 PM | Scott Michaud
Tagged: gigabyte, cooler master, asetek, amd
AMD and Gigabyte have each received cease and desist letters from Asetek regarding the Radeon Fury X and the GeForce GTX 980 Water Force, respectively, for using a Cooler Master-based liquid cooling solution. The Cooler Master Seiden 120M is a self-contained block and water pump that courts have ruled infringes on one of Asetek's patents. Asetek has been awarded 25.375% of Cooler Master's revenue from all affected products since January 1st, 2015.
This issue obviously affects NVIDIA less than AMD, since on the NVIDIA side it applies to a single product from just one AIB partner. On AMD's side, however, it affects all Fury X products, though obviously not the air-cooled Fury and Fury Nano cards. It's also possible that future SKUs could be affected as well, especially since upcoming top-end GPUs will probably be small packages adjacent to HBM 2.0 memory. This dense form factor lends itself well to direct cooling techniques like closed-loop water.
Even more interesting is that we believe Asetek was expecting to get the Fury X contract. We reported on an Asetek press release that claimed they received their “Largest Ever Design Win” with an undisclosed OEM. We expected it to be the follow-up to the 290X, which we assumed was called 390X because, I mean, AMD just chose that branding, right? Then the Fury X launched and it contained a Cooler Master pump. I was confused. No other candidate for “Largest Ever Design Win” popped up from Asetek, either. I guess we were right? Question mark? The press release of Asetek's design win came out in August 2014 while Asetek won the patent case in December of that year.
Regardless, this patent war has been going on for several months now. If it affects any future products, I'd hope the companies involved have had enough warning at this point.
Subject: Graphics Cards | December 3, 2015 - 10:34 PM | Sebastian Peak
Tagged: Tonga XT, tonga, Radeon R9 380X, Radeon R9 285, Radeon R9 280X, Radeon R9 280, radeon, amd, 384-bit
While it was reported a year ago, in articles sourcing the same PC Watch report, that AMD's Tonga XT GPU had a 384-bit memory bus, the Radeon R9 380X released last month shipped with a Tonga XT GPU using a 256-bit memory interface.
The full Tonga core features a 384-bit GDDR5 memory bus (Credit: PC Watch)
Reports of the upcoming card had consistently referenced the wider 384-bit bus, and tonight we are able to officially confirm that Tonga (not just Tonga XT) has been 384-bit capable all along, though this was never enabled by AMD. The reason? The company never found the right price/performance combination.
AMD confirms 384-bit bus available on Tonga, just not enabled on any product, including 380X. Didn't find a perfect perf/$ slot.
— Ryan Shrout (@ryanshrout) December 4, 2015
AMD's Raja Koduri confirmed Tonga's 384-bit bus tonight, and our own Ryan Shrout broke the news on Twitter.
So does this mean an upcoming Tonga GPU could offer this wider memory bus? Tonga itself was a follow-up to Tahiti (R9 280/280X), which did have a 384-bit bus, but all along the choice had been made to keep the updated core at 256-bit.
Now, more than a year after Tonga's launch, a new part featuring the fully enabled memory bus doesn't seem realistic, but it's still interesting to know that significantly more memory bandwidth is locked away from owners of these cards.