AMD announces new OpenCL programming tools

Subject: General Tech, Shows and Expos | June 15, 2011 - 09:14 PM |
Tagged: opencl, amd, AFDS

If you develop applications that require more performance than a CPU alone can provide, you are probably having a gleeful week. Today Microsoft announced its competitor to OpenCL, and we have a large write-up about that portion of the keynote address. Current OpenCL developers are not left out, however: AMD has announced new tools designed to make your life easier too.

OpenCL_Logo.png

General Purpose GPU utilities: Because BINK won't satisfy this crowd.

(Logo trademark Apple Inc.)

AMD’s spectrum of enhanced tools includes:

  • gDEBugger: An OpenCL and OpenGL debugger, profiler, and memory analyzer, released as a plugin for Visual Studio.
  • Parallel Path Analyzer (PPA): A tool designed to profile data transfers and kernel execution across your system.
  • Global Memory for Accelerators (GMAC) API: Lets developers use multiple devices without having to maintain duplicate data buffers on both the CPU and the GPU.
  • Task Manager API: A framework to manage scheduling kernels across devices. 
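For context on what GMAC removes, this is the kind of explicit double-buffer bookkeeping a plain OpenCL host program does today. A hedged sketch only: error handling is omitted, and `context`, `queue`, `N`, and `err` are assumed to exist already.

```cpp
// Without GMAC: the same data lives in two places, and the programmer
// shuttles it between them by hand.
std::vector<float> host_data(N, 1.0f);            // CPU-side copy

cl_mem gpu_buf = clCreateBuffer(context, CL_MEM_READ_WRITE,
                                N * sizeof(float), nullptr, &err);
clEnqueueWriteBuffer(queue, gpu_buf, CL_TRUE, 0,  // CPU -> GPU copy
                     N * sizeof(float), host_data.data(),
                     0, nullptr, nullptr);
// ... run kernels against gpu_buf ...
clEnqueueReadBuffer(queue, gpu_buf, CL_TRUE, 0,   // GPU -> CPU copy back
                    N * sizeof(float), host_data.data(),
                    0, nullptr, nullptr);
clReleaseMemObject(gpu_buf);
```

A unified address space means those explicit copies, and the second buffer, stop being the developer's problem.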

These tools and utilities should make software development easier and encourage more developers to take a risk on the new technology. The GPU has already proven itself worthy of increasingly important tasks, and it is only a matter of time before it is ubiquitous enough to be considered a default component as essential as the CPU itself. As an ironic aside, that should spur the adoption of PC gaming, given how many people would then have sufficient hardware.

Source: AMD

AFDS11: Microsoft Announces C++ AMP, Competitor to OpenCL

Subject: Editorial, General Tech, Shows and Expos | June 15, 2011 - 05:58 PM |
Tagged: programming, microsoft, fusion, c++, amp, AFDS

During this morning's keynote at the AMD Fusion Developer Summit, Microsoft's Herb Sutter went on stage to discuss the problems and solutions involved in programming for multi-processing systems, and heterogeneous computing systems in particular.  While the problems are definitely something we have discussed before at PC Perspective, the new solution that was showcased was significant.

014.jpg

C++ AMP (Accelerated Massive Parallelism) was announced as a new extension to Visual Studio and the C++ programming language to help developers take advantage of the highly parallel and heterogeneous computing environments of today and the future.  The new programming model uses C++ syntax and will be available in the next version of Visual Studio, with "bits of it coming later this year."  Sorry, no hard release date was given when we pressed for one.

Perhaps just as significant is the fact that Microsoft announced the C++ AMP standard would be an open specification, and that other compilers will be allowed to integrate support for it.  Unlike C#, then, C++ AMP has a chance to become a new dominant standard in the programming world as the need for parallel computing expands.  While OpenCL was previously the only option that promised developers easy utilization of ALL the computing power in a device, C++ AMP gives them another option with the full weight of Microsoft behind it.
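To give a flavor of the programming model, here is the canonical vector-add shape of C++ AMP code: regular C++ with a restricted lambda handed to the runtime, which decides where it runs. Illustrative only; it requires Visual Studio's `<amp.h>` and will not build with other toolchains today.

```cpp
#include <amp.h>
#include <vector>
using namespace concurrency;

// Element-wise vector add on whatever accelerator the runtime picks.
void vector_add(const std::vector<float>& a,
                const std::vector<float>& b,
                std::vector<float>& sum)
{
    array_view<const float, 1> av((int)a.size(), a);
    array_view<const float, 1> bv((int)b.size(), b);
    array_view<float, 1> sv((int)sum.size(), sum);
    sv.discard_data();                  // no stale results to copy in

    parallel_for_each(sv.extent, [=](index<1> i) restrict(amp) {
        sv[i] = av[i] + bv[i];
    });
    sv.synchronize();                   // copy results back to the host
}
```

Note what is absent: no kernel strings, no explicit buffer copies, no device enumeration. That is the pitch against OpenCL in a nutshell.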

016.jpg

To demonstrate the capability of C++ AMP, Microsoft showed a rigid body simulation program that ran on multiple computers and devices from a single executable file and was able to scale in performance from 3 GFLOPS on the x86 cores of Llano, to 650 GFLOPS on the combined APU power, to 830 GFLOPS with a pair of discrete Radeon HD 5800 GPUs.  The same executable was run on an AMD E-series APU powered tablet at 16 GFLOPS with 16,000 particles.  This is the promise of heterogeneous programming languages and the gateway necessary for consumers and businesses to truly take advantage of the processors that AMD (and other companies) are building today. 

If you want programs other than video transcoding apps to really push the promise of heterogeneous computing, then the announcement of C++ AMP is very, very big news. 

Source: PCPer

AFDS11: Upcoming Trinity APU will use VLIW4 / Cayman Architecture

Subject: Graphics Cards, Processors, Shows and Expos | June 14, 2011 - 08:06 PM |
Tagged: vliw, trinity, llano, fusion, evergreen, cayman, amd, AFDS

Well, that was an interesting twist...  During a talk on the next generation of GPU technology at the AMD Fusion Developer Summit, one of the engineers was asked about Trinity, the next APU to be released in 2012 (and shown running today for the very first time).  It was revealed that Trinity in fact uses a VLIW4 architecture rather than the VLIW5 design found in the just-released Llano A-series APU.

asdfcaymanstuff.png

A shader unit from the VLIW4-based Cayman architecture

That means that Trinity APUs will ship with Cayman-based GPU technology (6900 series) rather than Evergreen (5000 series).  While that doesn't tell us much in terms of performance, simply because there are so many variables including shader counts and clocks, it does put to rest the rumor that Trinity would keep basically the same class of GPU technology as Llano. 

asdf0a8a.jpg

Trinity notebook shown for the first time today at AFDS.  Inside is an APU with Cayman-class graphics.

AMD is definitely pushing the capabilities of APUs forward and if they can stay on schedule with Trinity, Intel might find the GPU portion of its Ivy Bridge architecture well behind again.

Source: PCPer

AFDS11: ARM Talks Dark Silicon and Computing Bias at Fusion Summit

Subject: Editorial, Processors, Shows and Expos | June 14, 2011 - 05:09 PM |
Tagged: nvidia, Intel, heterogeneous, fusion, arm, AFDS

Before the AMD Fusion Developer Summit started this week in Bellevue, WA, the most controversial speaker on the agenda was Jem Davies, the VP of Technology at ARM.  Why would AMD and ARM get together on a stage with dozens of media and hundreds of developers in attendance?  There is no partnership between them in terms of hardware or software, so would there be some kind of major announcement made about the two companies' future together?

10.jpg

In that regard, the keynote was a bit of a letdown, and if you thought there was going to be a merger between the two, or a new AMD APU announced with an ARM processor in it, you left disappointed.  Instead we got a bit of background on ARM, a look at how the race of processing architectures has slowly dwindled to just x86 and ARM, and a few jibes at the competition NOT named AMD.

As is usually the case, Davies described the state of processor technology with an emphasis on power efficiency and the importance of designing with that future in mind.  One of the interesting points was shown in regard to the "bitter reality" of per-core performance and the projected DECREASE we will see from 2012 onward due to leakage concerns as we progress to 10nm and even 7nm technologies.

12.jpg

The idea of dark silicon "refers to the huge swaths of silicon transistors on future chips that will be underused because there is not enough power to utilize all the transistors at the same time," according to this article over at physorg.com.  As process technology gets smaller, the areas of dark silicon increase, until the portion of the die that can be utilized at any one time might be as low as 10% by 2020.  Because of this, the need to design chips with many task-specific heterogeneous portions is crucial, and both AMD and ARM are on that track.

13.jpg

Those companies not on that path today, NVIDIA specifically and Intel as well, were addressed on the below slide when discussing GPU computing.  Davies pointed out that if a company has a financial interest in the immediate success of only the CPU or only the GPU, then benchmarks will be built and shown in a way that makes THAT portion appear the most important.  We have seen this from both NVIDIA and Intel in the past couple of years, while AMD has consistently stated it is going to use the best processor for the job.  

16.jpg

Amdahl's Law is used in parallel computing to predict the theoretical maximum speedup from using multiple processors.  Davies reiterated what we have been told for some time: if only 50% of your application can actually BE parallelized, then no matter how many processing cores you throw at it, it can never run more than twice as fast.  The heterogeneous computing products of today and the future can address both parallel and serial computing tasks with improvements in performance and efficiency, and should result in better computing in the long run.
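As a worked example of the math behind that claim (our own sketch, not anything Davies showed; the `amdahl_speedup` helper name is ours):

```cpp
// Amdahl's Law: overall speedup when a fraction p of the work is
// parallelizable and runs on n processors. The serial fraction (1 - p)
// takes the same time no matter how many cores you add.
double amdahl_speedup(double p, double n) {
    return 1.0 / ((1.0 - p) + p / n);
}
```

With p = 0.5 the ceiling is plain to see: four cores give 1 / (0.5 + 0.125) = 1.6x, and even as n goes to infinity the speedup only approaches 1 / 0.5 = 2x.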

So while we didn't get the major announcement from ARM and AMD that we might have been expecting, the fact that ARM would come up and share a stage with AMD reiterates the message of the Fusion Developer Summit quite clearly: a combined and balanced approach to processing might not be the sexiest, but it is very much the correct one for consumers.

Source: PCPer

AFDS11: AMD Demonstrates Trinity Powered Notebook

Subject: Processors, Mobile, Shows and Expos | June 14, 2011 - 12:08 PM |
Tagged: trinity, fusion, APU, AFDS

On stage during the opening keynote at the AMD Fusion Developer Summit 2011, Rick Bergman showed off a notebook that was being powered not by the recently released AMD Llano A-series APUs, but rather the Trinity core due in 2012.

03.jpg

Trinity is next year's APU, combining Bulldozer-based x86 CPU cores with an updated DX11 GPU architecture built on the current 32nm process.  Not much else is known about the chip yet, but hopefully we'll get more details this week at the show.

Source: PCPer

RAGE on, PC: PCGamer interviews John Carmack

Subject: General Tech, Shows and Expos | June 11, 2011 - 02:28 PM |
Tagged: john carmack, id, E3

John Carmack has been one of the biggest faces in videogame engine development since Wolfenstein 3D, Doom, and Quake. He was at E3 to promote id Software's nearest upcoming release, RAGE. While he was there, PCGamer managed to corner him for a 22-minute interview ranging from RAGE; to the current and future state of PC gaming; to the perceptive effect of input latency and how framerate affects it.

Look at how stable the framerate is!

Some points of interest from the interview include:
  • Texture resolution and memory limitations on consoles
  • Higher-end PCs offering approximately 10-fold higher performance than the consoles
  • Sandy Bridge is finally barely good enough for integrated graphics to be viable GPUs for games
  • DirectX and OpenGL APIs hold the PC back; looking forward to new movements to access the GPU more directly
  • His interest focuses on the toolset to let the artists do more with less effort
  • PC Gaming is still viable but a minority
  • Input latency is longer than people expect, sometimes up to 100ms and beyond
  • The exciting yet not necessarily crucial nature of newer rendering technologies

John Carmack always gives interesting interviews thanks to his very down-to-earth and blunt tone. If you have a free half hour and want to hear one of the best game programmers in the world talk about his trade, this is definitely an interview for you.

Source: PCGamer

Razer announces better mouse, better trap in question

Subject: General Tech, Shows and Expos | June 8, 2011 - 07:48 PM |
Tagged: razer, E3

You may have noticed a slew of gaming-related news flooding from various cracks in the internet this week. E3, the Electronic Entertainment Expo, is currently in progress in Los Angeles, and much news has spawned from it. PC gamers are not left out of the expo, however, as companies like Razer announce their latest wares and technology. While a standard mouse is sufficient for most users, some desire extra sensitivity and extra buttons, and those are precisely the customers companies like Razer target. Today, Razer announced that two of their upcoming mice will have two independent sensors, one optical and one laser, for enhanced tracking.

26-mamba-4g-2.jpg

If they announce a five sensor Razer, The Onion won. (Image by Razer)

Razer listed a series of benefits to adding a second sensor to their next generation Mamba and Imperator mice:

  • One sensor can calibrate the other to the surface you are using.
  • The user will be able to determine how far away from the surface the mouse will stop tracking.
  • Lower latency when tracking the surface you are operating on.
  • Higher tracking precision.

While you may well appreciate those extra features on your mouse, the largest factor in your gameplay will not be your hardware. The largest benefit I received switching from a three-button Microsoft mouse to a gaming mouse was the extra thumb buttons, which I bound to an AutoHotkey script for single-button scrolling up and down large documents. (Available here if that's something you desire.) If these features speak to you, however, check out Razer’s website.

Source: Razer

AMD, IBM, Nintendo: Over ten years of The Wii U nit.

Subject: General Tech, Shows and Expos | June 8, 2011 - 03:06 AM |
Tagged: Nintendo, E3, amd

Nintendo’s hardware manufacturers have been pretty stable for the last two generations of consoles. Following the NEC and SGI pairing of the Nintendo 64, Nintendo roped in the talents of IBM and AMD to create the hardware for the GameCube. With the transition to the Wii, AMD and IBM remained as the hardware producers for Nintendo’s console and with the announcement of the Wii U (the successor to the Wii) that will still remain true.

25-Nintendo.jpg

HOOOOOOOO Wii! (Image by Nintendo)

AMD published a press release stating that the Wii U will contain AMD Radeon HD graphics to power Nintendo’s first entry into the high-definition club. AMD touted its experience with multiple-display support in the press release, which would suit the LCD screen embedded in the Wii U controller. IBM also released a statement confirming that it is shipping multi-core 45nm parts for Nintendo’s next-generation console, but did not state any further details such as core count or clock speed.

Nintendo is rarely vocal about the specifications of its consoles, and this generation is no different. For its entire press conference, Nintendo did not even show the console itself, opting to focus on the controller and software. Beyond the controller, the hardware looks to be comparable to Microsoft and Sony’s offerings, judging from the limited info and screenshots we have seen. More info should surface as we approach the Wii U’s launch in a little over a year.

Source: marketwire

ASUS does thin and light right: UX21 and X101 Photos

Subject: Mobile, Shows and Expos | June 5, 2011 - 12:19 AM |
Tagged: x101, ux01, notebook, laptop, computex, asus

ASUS had a lot of new and innovative products on display at Computex, but maybe none as interesting as these two notebooks.  The UX21 was the flagship product for Intel's new "Ultrabook" category and while we have already posted about it earlier, I thought these new photos would be worth sharing.

asus07.jpg

The UX21 is an ultra-thin 1.7cm at its thickest point and weighs only 1.1 kg fully loaded.  It will include the ASUS "Instant On" technology, resuming the system in just 5 seconds, and is claimed to be the first notebook with a SATA 6G SSD.  

asus05.jpg

Sporting a new ULV Sandy Bridge Core i7 processor, this system won't skimp on performance either if it lives up to its claims.  

More photos and information after the break!!

Source: ASUS

Thermaltake Level 10 GT White, Frio GT and BigWater coolers and USB Power Strip

Subject: General Tech, Cases and Cooling, Shows and Expos | June 4, 2011 - 11:28 PM |
Tagged: computex, thermaltake, frio, level 10, power strip

Thermaltake had its standard booth array of cases, coolers, keyboards, mice, headphones, etc., but also had some new items to show us when we stopped by.  The first was a new "Snow Edition" of the Level 10 GT chassis we reviewed back in April.

ttake01.jpg

The case remains mostly unchanged, with USB 3.0 ports up front, 5 "EasySwap" HDD bays, and room for some very long graphics cards.  The white color is not paint but injection-molded plastic, so you won't have to worry about paint scratching off. 

ttake02.jpg

Next up is the Frio GT CPU cooler - yes, the above image is showing you a freaking CPU COOLER.  It supports up to 300 watts of cooling and does so with an enormous number of heatpipes, fins, and airflow.  This cooler will be available in Q4 and should cost under $100.

ttake03.jpg

Under the two big collections of fins you can see the heatpipes that move heat away from the CPU cores.  Obviously you are going to need to check your case and motherboard dimensions before picking up a cooler like this, as I imagine quite a few configurations will be incompatible. 

ttake04.jpg

Thermaltake is also moving into self-contained water cooling with the internally designed and built BigWater A80.  Thermaltake claims this device will get better results than the competition thanks to some interesting airflow modifications.  Expect it to be very price-competitive and available in Q3.

ttake05.jpg

A big surprise at the booth was a new USB-controlled power strip called the "Wireless USB Control Series".  Besides offering some convenient USB outlets directly on the power strip, this surge protector also has a USB-powered remote control that turns the "Energy Saver" ports on and off at the push of a button.

ttake06.jpg

The remote sits in a little stand on top of your desk so you can power off your display, printer, or other devices all at once without reaching behind anything.  For those of you who want to go green, this will let you do so for a modest $30-40 later this year.

Computex 2011 Coverage brought to you by MSI Computer and Antec

Source: Thermaltake