AMD Catalyst 14.7 Release Candidate 3

Subject: Graphics Cards | August 14, 2014 - 04:20 PM |
Tagged: catalyst 14.7 RC3, beta, amd

A new Catalyst Release Candidate has arrived, and as with the previous driver it no longer supports Windows 8.0 or the WDDM 1.2 driver model, so please upgrade to Windows 7 or Windows 8.1 before installing.  AMD will eventually release a driver that supports WDDM 1.1 under Windows 8.0 for those who do not upgrade.

AMD-Catalyst-12-11-Beta-11-7900-Modded-Driver-Crafted-for-Performance.jpg

Feature Highlights of the AMD Catalyst 14.7 RC3 Driver for Windows

  • Includes all improvements found in the AMD Catalyst 14.7 RC driver

  • Display interface enhancements to improve 4k monitor performance and reduce flickering.
  • Improvements apply to the following products:
    • AMD Radeon R9 290 Series
    • AMD Radeon R9 270 Series
    • AMD Radeon HD 7800 Series
  • Even with these improvements, cable quality and other system variables can affect 4k performance. AMD recommends using DisplayPort 1.2 HBR2 certified cables with a length of 2m (~6 ft) or less when driving 4K monitors.
  • Wildstar: AMD Crossfire profile support
  • Lichdom: Single GPU and Multi-GPU performance enhancements
  • Watch Dogs: Smoother gameplay on single GPU and Multi-GPU configurations

Feature Highlights of the AMD Catalyst 14.7 RC Driver for Windows

  • Includes all improvements found in the AMD Catalyst 14.6 RC driver
    • AMD CrossFire and AMD Radeon Dual Graphics profile update for Plants vs. Zombies
    • Assassin's Creed IV - improved CrossFire scaling (3840x2160 High Settings) up to 93%
    • Collaboration with AOC has identified non-standard display timings as the root cause of 60Hz SST flickering exhibited by the AOC U2868PQU panel on certain AMD Radeon graphics cards.
    • A software workaround has been implemented in the AMD Catalyst 14.7 RC driver to resolve the display timing issues with this display. Users are further encouraged to obtain newer display firmware from AOC that will resolve flickering at its origin.
    • Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection.

Feature Highlights of the AMD Catalyst 14.6 RC Driver for Windows

  • Plants vs. Zombies (Direct3D performance improvements):
    • AMD Radeon R9 290X - 1920x1080 Ultra – improves up to 11%
    • AMD Radeon R9 290X - 2560x1600 Ultra – improves up to 15%
    • AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
  • 3DMark Sky Diver improvements:
    • AMD A4-6300 – improves up to 4%
    • Enables AMD Dual Graphics/AMD CrossFire support
  • Grid Auto Sport: AMD CrossFire profile
  • Wildstar: Power Xpress profile
    • Performance improvements for smoother gameplay
    • Performance improves up to 24% at 2560x1600 on the AMD Radeon R9 and R7 Series of products for both single GPU and multi-GPU configurations.
  • Watch Dogs: AMD CrossFire – Frame pacing improvements
  • Battlefield Hardline Beta: AMD CrossFire profile

Known Issues

  • Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
  • Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
  • AMD CrossFire configurations with AMD Eyefinity enabled will see instability with Battlefield 4 or Thief when running Mantle
  • Catalyst Install Manager text is covered by Express/Custom radio button text
  • Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder
Source: AMD

Richard Huddy Discusses FreeSync Availability Timeframes

Subject: General Tech, Displays | August 14, 2014 - 01:59 PM |
Tagged: amd, freesync, g-sync, Siggraph, siggraph 2014

At SIGGRAPH, Richard Huddy of AMD announced the release windows of FreeSync, their adaptive refresh rate technology, to The Tech Report. Compatible monitors will begin sampling "as early as" September. Actual products are expected to ship to consumers in early 2015. Apparently, more than one display vendor is working on support, although names and vendor-specific release windows are unannounced.

amd-freesync1.jpg

As for cost of implementation, Richard Huddy believes that the added cost should be no more than $10-20 USD (to the manufacturer). Of course, the final price to end-users cannot be derived from this - that depends on how quickly the display vendor expects to sell product, profit margins, their willingness to push new technology, competition, and so forth.

If you want to take full advantage of FreeSync, you will need a compatible GPU (look for "gaming" support in AMD's official FreeSync compatibility list). All future AMD GPUs are expected to support the technology.

Source: Tech Report

Podcast #313 - New Kaveri APUs, ASUS ROG Swift G-Sync Monitor, Intel Core M Processors and more!

Subject: General Tech | August 14, 2014 - 12:30 PM |
Tagged: video, ssd, ROG Swift, ROG, podcast, ocz, nvidia, Kaveri, Intel, g-sync, FMS 2014, crossblade ranger, core m, Broadwell, asus, ARC 100, amd, A6-7400K, A10-7800, 14nm

PC Perspective Podcast #313 - 08/14/2014

Join us this week as we discuss new Kaveri APUs, ASUS ROG Swift G-Sync Monitor, Intel Core M Processors and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 1:41:24
 

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Diamond's Xtreme Sound XS71HDU looks good but how does it sound?

Subject: General Tech | August 14, 2014 - 12:00 PM |
Tagged: audio, diamond multimedia, Xtreme Sound XS71HDU, usb sound card, DAC

The Diamond Xtreme Sound XS71HDU could be a versatile $60 solution for those with high-end audio equipment that would benefit from a proper DAC.  With both optical in and out it is capable of more than an onboard solution, not to mention the six 3.5-mm jacks covering stereo headphones and 7.1 surround (rear, sub, side, mic, and line in).  The design and features are impressive; however, the performance failed to please The Tech Report, who felt that similar solutions offer much higher quality sound reproduction.

back.jpg

"We love sound cards here at TR, but they don't fit in every kind of PC. Diamond's Xtreme Sound XS71HDU serves up the same kinds of features in a tiny USB package suitable for mini-PCs and ultrabooks. We took it for a spin to see if it's as good as it looks."

Here is some more Tech News from around the web:

Audio Corner

That Linux thing that nobody uses

Subject: General Tech | August 14, 2014 - 10:31 AM |
Tagged: linux

For many, Linux is a mysterious thing that is either dead or about to die because no one uses it.  Linux.com has put together an overview of what Linux is and where to find it in use.  Much of what they describe in the beginning applies to all operating systems, as they share similar features; it is only in the details that they differ.  If you have only thought of Linux as that OS you can't game on, it is worth reading through the descriptions of the distributions and why people choose to use Linux.  You may never build a box that runs Linux, but if you are considering buying a Steambox when they arrive on the market you will find yourself using a type of Linux, and a basic understanding of the parts of the OS will help with troubleshooting and optimization.  If you already use Linux, then fire up Steam and take a break.

software-center.png

"For those in the know, you understand that Linux is actually everywhere. It's in your phones, in your cars, in your refrigerators, your Roku devices. It runs most of the Internet, the supercomputers making scientific breakthroughs, and the world's stock exchanges."

Here is some more Tech News from around the web:

Tech Talk

Source: Linux.com

Intel and Microsoft Show DirectX 12 Demo and Benchmark

Subject: General Tech, Graphics Cards, Processors, Mobile, Shows and Expos | August 13, 2014 - 06:55 PM |
Tagged: siggraph 2014, Siggraph, microsoft, Intel, DirectX 12, directx 11, DirectX

Along with GDC Europe and Gamescom, Siggraph 2014 is going on in Vancouver, BC. At it, Intel had a DirectX 12 demo at their booth. The scene, containing 50,000 asteroids, each in its own draw call, was developed on both Direct3D 11 and Direct3D 12 code paths, and the two could apparently be switched while the demo is running. Intel claims to have measured both power and frame rate.

intel-dx12-LockedFPS.png

Variable power to hit a desired frame rate, DX11 and DX12.

The test system is a Surface Pro 3 with an Intel HD 4400 GPU. Doing a bit of digging, this would make it the i5-based Surface Pro 3. Removing another shovel-load of mystery, this would be the Intel Core i5-4300U with two cores, four threads, 1.9 GHz base clock, up-to 2.9 GHz turbo clock, 3MB of cache, and (of course) based on the Haswell architecture.

While not top-of-the-line, it is also not bottom-of-the-barrel. It is a respectable CPU.

Intel's demo on this processor shows a significant power reduction in the CPU, and even a slight decrease in GPU power, for the same target frame rate. If power was not throttled, Intel's demo goes from 19 FPS all the way up to a playable 33 FPS.

Intel will discuss more during a video interview, tomorrow (Thursday) at 5pm EDT.

intel-dx12-unlockedFPS-1.jpg

Maximum power in DirectX 11 mode.

For my contribution to the story, I would like to address the first comment on the MSDN article. It claims that this is just an "ideal scenario" of a scene that is bottlenecked by draw calls. The thing is: that is the point. Sure, a game developer could optimize the scene to (maybe) instance objects together, and so forth, but that is unnecessary work. Why should programmers, or worse, artists, need to spend so much of their time developing art so that it can be batched together into fewer, bigger commands? Would it not be much easier, and all-around better, if the content could be developed as it most naturally comes together?

That, of course, depends on how much performance improvement we will see from DirectX 12, compared to theoretical max efficiency. If pushing two workloads through a DX12 GPU takes about the same time as pushing one, double-sized workload, then it allows developers to, literally, perform whatever solution is most direct.
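To make the draw-call argument concrete, here is a toy back-of-the-envelope model of CPU-side submission cost. All numbers here are illustrative assumptions for the sake of the sketch, not measurements from Intel's demo:

```python
# Toy model: CPU-side submission cost scales with the number of draw calls.
def frame_cpu_time_ms(draw_calls, per_call_overhead_us):
    # Total driver/runtime time spent just issuing calls, in milliseconds.
    return draw_calls * per_call_overhead_us / 1000.0

scene_calls = 50_000  # one call per asteroid, as in the demo scene

dx11_ms = frame_cpu_time_ms(scene_calls, 1.0)  # assumed ~1 us per DX11 call
dx12_ms = frame_cpu_time_ms(scene_calls, 0.1)  # assumed 90% cheaper calls

print(f"DX11-style submission: {dx11_ms:.0f} ms of CPU time per frame")
print(f"DX12-style submission: {dx12_ms:.0f} ms of CPU time per frame")
```

Under these assumed costs, 50,000 calls alone would eat around 50 ms of CPU time per frame (under 20 FPS even with an infinitely fast GPU), while calls that are 90% cheaper leave comfortable room in a 33 ms frame budget, which is roughly the shape of the 19-to-33 FPS jump described above.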

intel-dx12-unlockedFPS-2.jpg

Maximum power when switching to DirectX 12 mode.

If, on the other hand, pushing two workloads is 1000x slower than pushing a single, double-sized one, but DirectX 11 was 10,000x slower, then it could be less relevant because developers will still need to do their tricks in those situations. The closer it gets, the fewer occasions that strict optimization is necessary.

If there are any DirectX 11 game developers, artists, and producers out there, we would like to hear from you. How much would a (let's say) 90% reduction in draw call latency (which is around what Mantle claims) give you, in terms of fewer required optimizations? Can you afford to solve problems "the naive way" now? Some of the time? Most of the time? Would it still be worth it to do things like object instancing and fewer, larger materials and shaders? How often?

To boldly go where no 290X has gone before?

Subject: Graphics Cards | August 13, 2014 - 03:11 PM |
Tagged: factory overclocked, sapphire, R9 290X, Vapor-X R9 290X TRI-X OC

As far as factory overclocks go, the 1080MHz core and 5.64GHz RAM on the new Sapphire Vapor-X 290X are impressive, taking the prize for the highest factory overclock [H]ard|OCP has seen on this card yet.  That didn't stop them from pushing it to 1180MHz and 5.9GHz after a little work, which is even more impressive.  At both the factory and manual overclocks the card handily beat the reference model, and the manually overclocked benchmarks could meet or beat the overclocked MSI GTX 780 Ti GAMING 3G OC card.  Speed is not its only good feature; Intelligent Fan Control keeps two of the three fans from spinning when the GPU is under 60C, which vastly reduces the noise produced by this card.  It is currently selling for $646, lower than the $710 the GeForce currently commands as well.

1406869221rJVdvhdB2o_1_6_l.jpg

"We take a look at the SAPPHIRE Vapor-X R9 290X TRI-X OC video card which has the highest factory overclock we've ever encountered on any AMD R9 290X video card. This video card is feature rich and very fast. We'll overclock it to the highest GPU clocks we've seen yet on R9 290X and compare it to the competition."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Podcast Listeners and Viewers: Win an EVGA SuperNOVA 1000 G2 Power Supply

Subject: General Tech | August 13, 2014 - 02:15 PM |
Tagged: supernova, podcast, giveaway, evga, contest

A big THANK YOU goes to our friends at EVGA for hooking us up with another item to give away for our podcast listeners and viewers this week. If you watch tonight's LIVE recording of Podcast #313 (10pm ET / 7pm PT at http://pcper.com/live) or download our podcast after the fact (at http://pcper.com/podcast) then you'll have the tools needed to win an EVGA SuperNOVA 1000 G2 Power Supply!! (Valued at $165 based on Amazon current selling price.) See review of our 750/850G2 SuperNOVA units.

120-G2-1000-XR_XL_1.jpg

How do you enter? Well, on the live stream (or in the downloaded version) we'll give out a special keyword during our discussion of the contest for you to input in the form below. That's it! 

Anyone can enter from anywhere in the world - we'll cover the shipping. We'll draw a random winner on August 20th and announce it on the next episode of the podcast! Good luck, and once again, thanks goes out to EVGA for supplying the prize!

The downwards Arc of flash prices; OCZ releases an SSD at $0.50/GB

Subject: Storage | August 13, 2014 - 11:38 AM |
Tagged: toshiba, ssd, sata, ocz, barefoot 3, ARC

Before even looking at the performance, the real selling point of the new OCZ ARC 100 is the MSRP: the 240GB and 480GB models are slated to be released at $0.50/GB and will likely follow the usual trend of SSD prices and drop from there.  The drives use the Barefoot 3 controller, clocked slightly lower than in the Vertex 460 but still capable of accelerating encryption.  Once The Tech Report set the drive up in their test bed, the performance was almost on par with the Vertex 460 and other mid- to high-end SSDs, especially in comparison to the Crucial MX100.

Make sure to read Al's review as well, not just for the performance numbers but also an explanation of OCZ's warranty on this drive.

DSC04576.JPG

"OCZ's latest value SSD is priced at just $0.50 per gig, but it hangs with mid-range and even high-end drives in real-world and demanding workloads. It's also backed by an upgraded warranty and some impressive internal reliability data provided by OCZ. We take a closer look:"

Here are some more Storage reviews from around the web:

Storage

Unreal Tournament's Training Day, get in on the Pre-Pre-Alpha right now!

Subject: General Tech | August 13, 2014 - 11:02 AM |
Tagged: Unreal Tournament, gaming, Alpha

Feel like (Pre-Pre-)Alpha testing Unreal Tournament without forking money over for early access?  No problem, thanks to Epic and Unreal Forums member ‘raxxy’, who is compiling and updating the (pre)Alpha version of the next Unreal Tournament.  Sure, there may not be many textures, but there is a Flak Cannon, so what could you possibly have to complain about?  There are frequent updates, and a major part of participating is giving feedback to the devs, so please be sure to check into the #beyondunreal IRC channel to get tips and offer feedback.  Rock, Paper, SHOTGUN reports that the servers are massively packed right now, so you may not be able to join in immediately, but it is worth trying.

raxxy would like you to understand "These are PRE-ALPHA Prototype Builds. Seriously. Super early testing. So early it's technically not even pre alpha, it's debug code!"

You can be guaranteed that the Fragging Frogs will be taking advantage of this, as well as revisiting the much beloved UT2K4 so if you haven't joined up yet ... what are you waiting for?

Check out Fatal1ty playing if you can't get on

"Want to play the new Unreal Tournament for free, right this very second? Cor blimey and OMG you totes can! Hero of the people ‘raxxy’ on the Unreal Forums is compiling Epic’s builds and releasing them as small, playable packages that anyone can run, with multiple updates per week. The maps are untextured, the weapons unbalanced, and things change rapidly as everything’s still “pre-alpha” but it’s playable and – more importantly – fun."

Here is some more Tech News from around the web:

Gaming

Meet Tonga, soon to be your new Radeon

Subject: General Tech | August 13, 2014 - 09:58 AM |
Tagged: tonga, radeon, FirePro W7100, amd

A little secret popped out with the release of AMD's FirePro W7100: a new GPU family that goes by the name of Tonga, which is very likely to replace the aging Tahiti chip that has been used since the HD 7900 series.  The specs The Tech Report saw show interesting changes from Tahiti, including a reduction of the memory interface to 256-bit, in line with NVIDIA's current offerings.  The number of stream processors might be reduced to 1792 from 2048, but that is based on the W7100, and the GPUs may yet be released with the full 32 GCN compute units.  Many other features have seen increases: the number of Asynchronous Compute Engines goes from 2 to 8, the number of rasterized triangles per clock doubles to 4, and it adds support for the new TrueAudio DSP and CrossFire XDMA.

IMG0045305.jpg

"The bottom line is that Tonga joins the Hawaii (Radeon R9 290X) and Bonaire (R7 260X) chips as the only members of AMD's GCN 1.1 series of graphics processors. Tonga looks to be a mid-sized GPU and is expected to supplant the venerable Tahiti chip used in everything from the original Radeon HD 7970 to the current Radeon R9 280."

Here is some more Tech News from around the web:

Tech Talk

Select GeForce GTX GPUs Now Include Borderlands: The Pre-Sequel

Subject: General Tech | August 13, 2014 - 09:26 AM |
Tagged: borderlands, nvidia, geforce

 

Santa Clara, CA — August 12, 2014 — Get ready to shoot ‘n’ loot your way through Pandora’s moon. Starting today, gamers who purchase select NVIDIA GeForce GTX TITAN, 780 Ti, 780, and 770 desktop GPUs will receive a free copy of Borderlands: The Pre-Sequel, the hotly anticipated new chapter to the multi-award winning Borderlands franchise from 2K and Gearbox Software.

Discover the story behind Borderlands 2’s villain, Handsome Jack, and his rise to power. Taking place between the original Borderlands and Borderlands 2, Borderlands: The Pre-Sequel offers players a whole lotta new gameplay in low gravity.

“If you have a high-end NVIDIA GPU, Borderlands: The Pre-Sequel will offer higher fidelity and higher performance hardware-driven special effects including awesome weapon impacts, moon-shatteringly cool cryo explosions and ice particles, and cloth and fluid simulation that blows me away every time I see it," said Randy Pitchford, CEO and president of Gearbox Software.

With NVIDIA PhysX technology, you will feel deep space like never before. Get high in low gravity and use new ice and laser weapons to experience destructible levels of mayhem. Check out the latest trailer, which just went live this morning: http://youtu.be/c9a4wr4I1hk

Borderlands: The Pre-Sequel will also stream to your NVIDIA SHIELD tablet or portable. For the first time ever, you can play Claptrap anywhere by using NVIDIA Gamestream technologies. You can even livestream and record every fist punch with GeForce Shadowplay.

Borderlands: The Pre-Sequel will be available on October 14, 2014 in North America and on October 17, 2014 internationally. Borderlands: The Pre-Sequel is not yet rated by the ESRB.

The GeForce GTX and Borderlands: The Pre-Sequel bundle is available starting today from leading e-tailers including Amazon, NCIX, Newegg, and Tiger Direct and system builders including Canada Computers, Digital Storm, Falcon Northwest, Maingear, Memory Express, Origin PC, V3 Gaming, and Velocity Micro. For a full list of participating partners, please visit: www.GeForce.com/GetBorderlands.

Source: NVIDIA

DOTA 2 May Be Running Source Engine 2. Now.

Subject: General Tech | August 12, 2014 - 06:00 PM |
Tagged: valve, source engine, Source 2, DOTA 2

While it may not seem like it in North America, we are in a busy week for videogame development. GDC Europe, which stands for Game Developers Conference Europe, is just wrapping up to make room for Gamescom, which will take up the rest of the week. Valve will be there and people are reading tea leaves to find out why. SteamOS seems likely, but what about their next generation gaming engine, Source 2? Maybe it already happened?

valve-dota2-workshop.jpg

Valve is the most secretive company with values of openness that I know. They are pretty good at preventing leaks from escaping their walls. Recently, Dota 2 was updated to receive new features and development tools for user-generated maps and gametypes. The tools currently require 64-bit Windows and a DirectX 11-compatible GPU.

Those don't sound like Source requirements...

And the editor doesn't look like Valve's old tools.

Video Credit: "Valve News Network".

Leaks also point to things like "tf_imported", "left4dead2_source2", and "left4dead2_imported". This is interesting. Valve is pushing Dota 2, their most popular free-to-play game, into Source 2. Also, because it is listed as "tf" rather than "tf2" (just as "dota" is not registered as "dota2") while "left4dead2" keeps its number, this might mean that the free-to-play Team Fortress 2 could be in a perpetual-development mode, like Dota 2. Eventually, it could be pushed to the new engine and given more content.

As for Left4Dead2? I am wondering if it is intended to be a product, rather than an internal (or external) Source 2 tech demo.

Was this what brought Valve to Gamescom, or will we be surprised by other announcements (or nothing at all)?

Source: Polygon

G-SYNC is sweet but far from free

Subject: Displays | August 12, 2014 - 12:36 PM |
Tagged: asus, g-sync, geforce, gsync, nvidia, pg278q, Republic of Gamers, ROG, swift, video

Ryan was not the only one to test the ASUS ROG Swift PG278Q G-Sync monitor; Overclockers Club also received a model to test out.  Their impressions of the 27" 2560 x 1440 TN panel were very similar: once they saw this monitor in action, going back to their 30-inch 60Hz IPS monitor was not as enjoyable as it once was.  The only bad thing they could say about the display was the MSRP; $800 is steep for any monitor and makes it rather difficult to even consider getting two or more of them for a multiple-display system.

2.jpg

”When you get down to it, the facts are that even with a TN panel being used for the high refresh rate, the ASUS ROG Swift PG278Q G-Sync monitor delivers great picture quality and truly impressive gaming. I could go on all day long about how smooth each of the games played while testing this monitor, but ultimately not be able to show you without having you sit at the desk with me. No stuttering, no tearing, no lag; it's like getting that new car and having all the sales hype end up being right on the money. When I flip back and forth between my 60Hz monitor and the PC278Q, its like a night and day experience.”

Here are some more Display articles from around the web:

Displays

Let's hope DDR4 is less expensive than DDR3 3100MHz

Subject: Memory | August 12, 2014 - 11:09 AM |
Tagged: XPG V3, DDR3 3100, adata

Currently available for a mere $870, the 8GB DDR3-3100 dual channel kit from ADATA, with timings of 12-14-14-36, has to be among the most expensive consumer RAM available on the market.  We can only hope that DDR4 does not arrive at a similar speed and price point, but instead with slower-clocked DIMMs at a more reasonable price and with improvements to performance.  Legit Reviews' testing showed that these DIMMs offer almost no benefit in real usage over DDR3-1600 with tighter timings, though you can get higher scores on synthetic benchmarks.  If benchmarking better than the competition and swappable heatspreaders in different colours are attractive to you, then you could pick up these DIMMs; otherwise you really won't be getting value for your money.
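The timings arithmetic helps explain why the real-world gap is so small: absolute first-word latency is CAS cycles times the clock period, and since DDR transfers twice per clock, the period in nanoseconds is 2000 divided by the data rate in MT/s. A quick sketch (the DDR3-1600 CL8 kit is an assumed example of a "tighter timings" comparison, not one from the review):

```python
# First-word CAS latency in nanoseconds: cycles * clock period.
# DDR transfers twice per clock, so clock period (ns) = 2000 / data_rate (MT/s).
def cas_latency_ns(cl, data_rate_mts):
    return cl * 2000.0 / data_rate_mts

print(f"DDR3-3100 CL12: {cas_latency_ns(12, 3100):.2f} ns")
print(f"DDR3-1600 CL8:  {cas_latency_ns(8, 1600):.2f} ns")
```

The expensive kit's CAS latency is actually a touch lower in wall-clock terms (about 7.7 ns versus 10 ns) and its bandwidth much higher, but typical desktop workloads are rarely limited by either, which is consistent with the benchmark results.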

adata-xpg-v3-3100mhz-memory.jpg

"Gone are the days of being on the cutting edge of memory with DDR3 running at 2133MHz! These days running 2133MHz memory is pretty much considered the norm for a high end gaming rig. If you’re looking to be on the bleeding edge of memory speeds you’re going to be limited to only one or two kits. Today we have one of the fastest kits available on the market to put through the paces, the ADATA XPG V3 DDR3 3100MHz 8GB memory kit. Read on to see if this big dollar kit is worth nearly a thousand dollars."

Here are some more Memory articles from around the web:

Memory



Intel is disabling TSX in Haswell due to software failures

Subject: General Tech | August 12, 2014 - 10:07 AM |
Tagged: Intel, haswell, tsx, errata

Transactional Synchronization Extensions, aka TSX, are a backwards-compatible set of instructions which first appeared in some Haswell chips as a way to improve concurrency in multi-threaded code with as little work for the programmer as possible.  It was intended to improve the scaling of multi-threaded apps running on multi-core processors and has not yet been widely adopted.  Adoption has now run into another hurdle: in some cases the use of TSX can cause critical software failures, and as a result Intel will be disabling the instruction set via new BIOS/UEFI updates which will be pushed out soon.  If your software uses the new instruction set and you wish it to continue to do so, you should avoid updating your motherboard BIOS/UEFI and ask your users to do the same.  You can read more about this bug/errata and other famous problems over at The Tech Report.
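TSX itself is hardware (the HLE prefixes and RTM instructions on x86), but the programming pattern it accelerates - run a critical section optimistically, and fall back to a conventional lock if the transaction keeps aborting - can be sketched in plain Python. This is a conceptual, single-threaded illustration only; the names are invented, and unlike the hardware transaction, nothing here is actually atomic:

```python
import threading

_fallback_lock = threading.Lock()

def transactional_update(account, delta, attempts=3):
    """Try the update optimistically; fall back to the lock if it keeps failing.
    Conceptual only: real TSX makes the whole region atomic in hardware."""
    for _ in range(attempts):
        snapshot = account["version"]
        new_balance = account["balance"] + delta  # speculative work
        if account["version"] == snapshot:        # no conflict seen: "commit"
            account["balance"] = new_balance
            account["version"] += 1
            return "transactional"
    with _fallback_lock:                          # the non-TSX fallback path
        account["balance"] += delta
        account["version"] += 1
        return "locked"

acct = {"balance": 100, "version": 0}
path = transactional_update(acct, 25)
print(acct["balance"], path)
```

The appeal for programmers is that, in the common uncontended case, threads never serialize on the lock at all; the fallback path exists precisely so that code keeps working on CPUs where TSX is absent or, as here, disabled.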

intel2.jpg

"The TSX instructions built into Intel's Haswell CPU cores haven't become widely used by everyday software just yet, but they promise to make certain types of multithreaded applications run much faster than they can today. Some of the savviest software developers are likely building TSX-enabled software right about now."

Here is some more Tech News from around the web:

Tech Talk

NVIDIA Reveals 64-bit Denver CPU Core Details, Headed to New Tegra K1 Powered Devices Later This Year

Subject: Processors | August 11, 2014 - 10:06 PM |
Tagged: tegra k1, project denver, nvidia, Denver, ARMv8, arm, Android, 64-bit

During GTC 2014 NVIDIA launched the Tegra K1, a new mobile SoC that contains a powerful Kepler-based GPU. Initial processors (and the resultant design wins such as the Acer Chromebook 13 and Xiaomi Mi Pad) utilized four ARM Cortex-A15 cores for the CPU side of things, but later this year NVIDIA is deploying a variant of the Tegra K1 SoC that switches out the four A15 cores for two custom (NVIDIA developed) Denver CPU cores.

Today at the Hot Chips conference, NVIDIA revealed most of the juicy details on those new custom cores announced in January which will be used in devices later this year.

The custom 64-bit Denver CPU cores use a 7-way superscalar design and run a custom instruction set. Denver is a wide but in-order architecture that allows up to seven operations per clock cycle. NVIDIA is using a custom ISA and on-the-fly binary translation to convert ARMv8 instructions to microcode before execution. A software layer and a 128MB cache enhance the Dynamic Code Optimization technology by allowing the processor to examine and optimize the ARM code, convert it to the custom instruction set, and cache the converted microcode of frequently used applications (the optimizer can be bypassed for infrequently executed code). Using the wider execution engine and Dynamic Code Optimization (which is transparent to ARM developers and does not require updated applications), NVIDIA touts the dual Denver core Tegra K1 as being at least as powerful as the quad- and octo-core-packing competition.
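The translate-and-cache loop NVIDIA describes can be sketched roughly as follows (the threshold, names, and string results are invented for illustration; the real DCO operates on ARM instruction blocks in a 128MB system-memory cache, not Python strings):

```python
# Conceptual sketch of a dynamic-translation cache, loosely analogous to
# Denver's Dynamic Code Optimization. All specifics here are invented.
HOT_THRESHOLD = 2  # runs before a block is considered worth translating

class TranslationCache:
    def __init__(self):
        self.counts = {}  # how many times each code block has executed
        self.cache = {}   # blocks already translated to the native ISA

    def execute(self, block):
        if block in self.cache:
            return f"optimized:{block}"          # fast path: reuse translation
        self.counts[block] = self.counts.get(block, 0) + 1
        if self.counts[block] >= HOT_THRESHOLD:  # hot code: translate + cache
            self.cache[block] = True
            return f"optimized:{block}"
        return f"interpreted:{block}"            # cold code: decode directly

tc = TranslationCache()
print(tc.execute("loop_body"))  # cold on first sight
print(tc.execute("loop_body"))  # now hot: translated and cached
print(tc.execute("loop_body"))  # served from the translation cache
```

The payoff mirrors NVIDIA's claim: the expensive optimization work is done once per hot code path, and every later execution of that path runs the wide, pre-optimized form.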

Further, NVIDIA has claimed that at peak throughput (and in specific situations where application code and DCO can take full advantage of the 7-way execution engine) the Denver-based mobile SoC handily outpaces Intel’s Bay Trail, Apple’s A7 Cyclone, and Qualcomm’s Krait 400 CPU cores. In the results of a synthetic benchmark test provided to The Tech Report, the Denver cores were even challenging Intel’s Haswell-based Celeron 2955U processor. Keeping in mind that these are NVIDIA-provided numbers and likely the best results one can expect, Denver is still quite a bit more capable than existing cores. (Note that the Haswell chips would likely pull much farther ahead when presented with applications that cannot be easily executed in-order with limited instruction parallelism.)

NVIDIA Denver CPU Core 64bit ARMv8 Tegra K1.png

NVIDIA is ratcheting up mobile CPU performance with its Denver cores, but it is also aiming for an efficient chip and has implemented several power saving tweaks. Beyond the decision to go with an in-order execution engine (with DCO hopefully mostly making up for that), the beefy Denver cores reportedly feature low latency power state transitions (e.g. between active and idle states), power gating, dynamic voltage, and dynamic clock scaling. The company claims that “Denver's performance will rival some mainstream PC-class CPUs at significantly reduced power consumption.” In real terms this should mean that the two Denver cores in place of the quad core A15 design in the Tegra K1 should not result in significantly lower battery life. The two K1 variants are said to be pin compatible such that OEMs and developers can easily bring upgraded models to market with the faster Denver cores.

NVIDIA Denver CPU cores in Tegra K1.png

For those curious, in the Tegra K1 the two Denver cores (clocked at up to 2.5GHz) share a 16-way L2 cache, and each has 128KB instruction and 64KB data L1 caches to itself. The 128MB Dynamic Code Optimization cache is held in system memory.

Denver is the first (custom) 64-bit ARM processor for Android (with Apple’s A7 being the first 64-bit smartphone chip), and NVIDIA is working on supporting the next generation Android OS known as Android L.

The dual Denver core Tegra K1 is coming later this year and I am excited to see how it performs. The current K1 chip already has a powerful fully CUDA compliant Kepler-based GPU which has enabled awesome projects such as computer vision and even prototype self-driving cars. With the new Kepler GPU and Denver CPU pairing, I’m looking forward to seeing how NVIDIA’s latest chip is put to work and the kinds of devices it enables.

Are you excited for the new Tegra K1 SoC with NVIDIA’s first fully custom cores?

Source: NVIDIA

Aftermath of the Fragging Frogs VLAN #7

Subject: General Tech | August 11, 2014 - 03:52 PM |
Tagged: VLAN party, pcper, kick ass, fragging frogs

Several rowboats worth of snacks, a couple of canoe-fulls of assorted beverages, and boatloads of fun were had this weekend in the highly successful 7th Fragging Frogs VLAN; if you missed it there will be another chance some day, but you really missed an epic event.  There were over 120 Teamspeak connections in a variety of channels and an estimated peak of 78 active participants.  Thanks to AMD there is a new game for the Frogs as well, as Plants vs. Zombies Garden Warfare was a hit both with the players and with those watching iamApropos' live stream.  We also gained some ARMA 2 fans; the game will not only appear again next VLAN but is in danger of becoming a frequent activity for some members.

FraggingFrogs.jpg

Once again there was quite a bit of valuable hardware and software given away, the list includes:

AMD

  • AMD Fan Kit (headset, 16 GB USB drive, mouse)
  • AMD Gaming Series RAM - 8 GB of 2133 MHz
  • MSI Military Class 4 A88XM-E35 FM2+ motherboard *and* A10-7850K APU
  • AMD FX-8350 Processor
  • XFX R9 290 Double D graphics card
  • Several Plants vs Zombies: Garden Warfare Origin codes

AMD Red Team+

  • Murdered Soul Suspect game codes
  • Sniper Elite 3 game codes  

Please stop by this thread to offer your thanks and support for all the hard work put into these events by Lenny, iamApropos, Spazster, Brandito, Cannonaire, AMD and the Frogs in general.

SilverStone's Fanless Nightjar 520W PSU keeps picking up awards

Subject: Cases and Cooling | August 11, 2014 - 02:23 PM |
Tagged: SST-NJ520, SilverStone Tech, Nightjar, Fanless Power Supply, 80 Plus Platinum

Just over a month ago Lee took a look at SilverStone's fanless Nightjar PSU, giving it a Gold Award.  It has now arrived in [H]ard|OCP's torture chamber, so you can see how well it works with a different system.  At the lowest load level of 25% of max the efficiency was a mere 89.79%, but at 50%, 75% and 100% loads it stayed well above 90%, and the temperature difference was only 9C between the lowest load and the highest - impressive for a fanless PSU.  In the end not only was this the best fanless 500W PSU [H] has tested, it was the best 500W PSU they've seen overall, justifying the high asking price.
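Those efficiency figures translate directly into how much heat a fanless unit must shed by convection alone: wall draw is the DC load divided by efficiency, and everything not delivered to the PC is dissipated inside the PSU. A quick sketch using the 25%-load figure above:

```python
# Wall draw = DC load / efficiency; the difference becomes heat in the PSU.
def wall_draw_w(dc_load_w, efficiency):
    return dc_load_w / efficiency

def waste_heat_w(dc_load_w, efficiency):
    return wall_draw_w(dc_load_w, efficiency) - dc_load_w

low_load = 520 * 0.25                  # 130 W DC, the 25%-of-max load point
heat = waste_heat_w(low_load, 0.8979)  # 89.79% efficient at this load
print(f"~{heat:.1f} W dissipated as heat, with no fan to move it")
```

Roughly 15 W of heat at light load, and proportionally more at full load, all removed passively, is what makes the small temperature spread across the load range impressive.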

1406084192F3XTun5qzm_2_9_l.jpg

"What is more quiet than a computer power supply with a fan? You guessed it, a PSU with no fan. This unit has small footprint builds in mind featuring excellent Platinum efficiency, flexible flat cables, and of course no sound profile to speak of. Does the 520 watt rated SilverStone Nightjar hold up when we put it in the incubator?"

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: [H]ard|OCP

Kaveri on Linux

Subject: Processors | August 11, 2014 - 12:40 PM |
Tagged: A10-7800, A6-7400K, linux, amd, ubuntu 14.04, Kaveri

Linux support for AMD's GPUs has not been progressing at the pace many users would like, though it is improving over time; the same cannot be said of their APUs.  Phoronix just tested the A10-7800 and A6-7400K on Ubuntu 14.04 with kernel 3.13 and the latest Catalyst 14.6 Beta.  This preview just covers the raw performance; you can expect to see more published in the near future covering new features such as the configurable TDP which exists on these chips.  The tests show that the new 7800 can keep pace with the previous 7850K, and while the A6-7400K is certainly slower, it will be able to handle a Linux machine with relatively light duties.  You can see the numbers here.

image.php_.jpg

"At the end of July AMD launched new Kaveri APU models: the A10-7800, A8-7600, and A6-7400K. AMD graciously sent over review samples on their A10-7800 and A6-7400K Kaveri APUs, which we've been benchmarking and have some of the initial Linux performance results to share today."

Here are some more Processor articles from around the web:

Processors

Source: Phoronix