Subject: General Tech | June 17, 2011 - 06:24 PM | Jeremy Hellstrom
Tagged: TSMC, southern islands, northern islands, llano, global foundries, arm, amd, 40nm, 32nm, 28nm
Back in April there was a kerfuffle in the news about a deal penned between AMD, Global Foundries and TSMC. It is not worth repeating completely, as you can follow the story at the previous link; suffice it to say that it did not indicate problems with the relationship between AMD and Global Foundries.
The previous post was specifically about 40nm and 32nm process chips; today, however, we hear from DigiTimes that TSMC has scored a deal with AMD for the 28nm Southern Islands GPUs, of which we have seen much recently, as well as the 28nm Krishna and Wichita APUs. The 40nm Northern Islands GPUs will also be produced by TSMC. That leaves a lot of production capacity free at Global Foundries to work on ARM processors.
"AMD reportedly has completed the tape-out of its next-generation GPU, codenamed Southern Islands, on Taiwan Semiconductor Manufacturing Company's (TSMC) 28nm process with High-k Metal Gate (HKMG) technology, according to a Chinese-language Commercial Times report. The chip is expected to enter mass production at the end of 2011.
TSMC will also be AMD's major foundry partner for the 28nm Krishna and Wichita accelerated processing units (APUs), with volume production set to begin in the first half of 2012, the report said.
TSMC reportedly contract manufactures the Ontario, Zacate and Desna APUs for AMD as well as the Northern Islands family of GPUs. All of these use the foundry's 40nm process technology.
TSMC was quoted as saying in previous reports that it had begun equipment move-in for the phase one facility of a new 12-inch fab (Fab 15) with volume production of 28nm technology products slated for the fourth quarter of 2011. The foundry previously said it would begin moving equipment into the facility in June, with volume production expected to kick off in the first quarter of 2012."
Here is some more Tech News from around the web:
- ARM acquires Obsidian Software @ The Inquirer
- Mozilla pushes out final Firefox 5 test build @ The Register
- Sega Hacked @ XSReviews
- Tablets of 2011: What to Look For - June Update @ TechSpot
- A few thoughts on Ultrabooks @ The Tech Report
- Disabling Windows Pagefile & Hibernation to Reclaim SSD Space @ Techgage
- Overclockers Benchmarking Party II: Where the Bell Tolls!
- Post Computex 2011 - Part 2 @ Bjorn3D
Subject: General Tech, Graphics Cards, Mobile | June 17, 2011 - 08:35 AM | Scott Michaud
Tagged: webgl, microsoft
WebGL: Heaven or Hell?
(Image from MrDoob WebGL demo; contains Lucy model from Stanford 3D repository)
WebGL is an API very similar to OpenGL ES 2.0, the API used for OpenGL features in embedded systems, particularly smartphones. The goal of WebGL is to provide a lightweight, CSS-obeying 3D and shader system for websites that require advanced 3D graphics, or even general purpose calculations performed on the shader units of the client's GPU. Mozilla and Google currently support it in their public browsers, with Opera and Apple shipping support in the near future. Microsoft has stated that allowing third-party websites that level of access to the hardware is dangerous, as security vulnerabilities that formerly needed to be exploited locally can now be exploited from the web browser. This is an area of expertise Microsoft knows all too well: their past attempts at active(x)ly adding scripting functionality to the web browser evolved into a decade-long game of whack-a-mole for security holes.
But skeptics of Microsoft's position could easily point out that the company has singled out the one standard based on OpenGL, a competitor to its still-cherished DirectX. Regardless of Microsoft's motives, the statement seems to put to rest the question of whether Microsoft will be working towards implementing WebGL in any release of Internet Explorer currently in development.
Do you think Microsoft is warning its competitors about its past ActiveX woes, or is this more politically motivated? Comment below (registration not required).
Subject: General Tech, Storage | June 16, 2011 - 07:02 PM | Scott Michaud
Tagged: ssd, Intel, enterprise
Intel is currently in the process of releasing its 2011 lineup of solid state drives. A lot of news and products came out regarding the consumer 300-series and enthusiast 500-series lines; however, things have been pretty quiet regarding the enterprise 700-series products. That changed recently when specifications surfaced via AnandTech's coverage of the German hardware website ComputerBase.de.
And how does it compare to OCZ?
Intel will be releasing two enterprise SSDs: the SATA 3Gbps-based 710 SSD, codenamed Lyndonville, and the PCI Express 2.0-based 720 SSD, codenamed Ramsdale. The SATA-based 710 will feature 25nm MLC-HET flash at capacities of 100, 200, and 300 GB, with read and write speeds of 270/210 MB/s, 35,000/3,300 read and write IOPS at 4KB, and a 64MB cache. The PCIe-based 720 will feature 34nm SLC flash at capacities of 200 and 400 GB. The 720 will be substantially faster than the 710, with read and write speeds of 2200/1800 MB/s, 180,000/56,000 read and write IOPS at 4KB, and a 512MB cache. On the security front, the 710 will use 128-bit AES encryption while the 720 will use 256-bit AES.
While there has been no hint toward pricing, Intel is still expected to make a second-quarter release date for its SATA-based 710 SSD. If you are looking for a PCI Express SSD you will need to be a bit more patient, as the 720 is still expected in the fourth quarter. It will be interesting to see how the fight between Intel and OCZ for dominance in the PCIe-based SSD space plays out in 2012.
Subject: General Tech | June 16, 2011 - 04:57 PM | Jeremy Hellstrom
Tagged: amd, Intel, nvidia
In some sort of bizarre voyeuristic hardware love/hate triangle, AMD, Intel and NVIDIA are all semi-intertwined and being observed by Microsoft. Speaking with The Inquirer, AMD's VP of product and platform marketing, Leslie Sobon, stated that there was no chance Intel would attempt to purchase NVIDIA as AMD did with ATI. AMD's purchase was less about the rights to the Radeon series than about taking possession of the intellectual property ATI had built up over a decade of creating GPUs; that IP led directly to the APUs AMD has recently released, which will likely become its main product. Intel already has a working architecture that combines GPU and CPU and doesn't need to purchase another company's IP to develop that type of product.
There is another reason for purchasing NVIDIA though, one which has very little to do with its discrete graphics card IP and everything to do with Tegra and Fermi, two specialized products for which Intel so far has no answer. A vastly improved and shrunken Atom might be able to push Tegra off of mobile platforms, and perhaps specialized Sandy Bridge CPUs could accelerate computation the way the Fermi products do, but so far there are no solid leads, only speculation.
If you learn more from your failures than your successes then Intel knows a lot about graphics.
"CHIP DESIGNER AMD believes that it is on a divergent path from Intel thanks to its accelerated processor unit (APU) and that Intel buying Nvidia "would never happen"."
Here is some more Tech News from around the web:
- Find Out if Your Passwords Were Leaked by LulzSec Right Here @ Gizmodo
- Adobe patches critical bugs in Flash and Reader @ The Register
- Umi, we hardly knew ye: contemplating the fate of the videophone in 2011 @ Ars Technica
- 'A SHARK attacked my ROBOT', gasps ex-Sun exec @ The Register
- We’ve got a real bone to pick with this mouse @ Hack a Day
- Fun Quotes from the AFDS Media Roundtable @ SemiAccurate
Subject: General Tech, Shows and Expos | June 16, 2011 - 01:14 AM | Scott Michaud
Tagged: opencl, amd, AFDS
If you are a developer of applications which require more performance than a CPU alone can provide, then you are probably having a gleeful week. Today Microsoft announced its competitor to OpenCL, and we have a large write-up about that aspect of the keynote address. If you are currently an OpenCL developer you are not left out, however, as AMD has announced new tools designed to make your life easier too.
General Purpose GPU utilities: Because BINK won't satisfy this crowd.
(Logo trademark Apple Inc.)
AMD’s spectrum of enhanced tools includes:
- gDEBugger: An OpenCL and OpenGL debugger, profiler, and memory analyzer released as a plugin for Visual Studio.
- Parallel Path Analyzer (PPA): A tool designed to profile data transfers and kernel execution across your system.
- Global Memory for Accelerators (GMAC) API: Lets developers use multiple devices without needing to manage multiple data buffers in both the CPU and the GPU.
- Task Manager API: A framework to manage scheduling kernels across devices.
These tools and utilities should make software development easier and allow more developers to take a risk on the new technology. The GPU has already proven itself worthy of more and more important tasks, and it is only a matter of time before it is ubiquitous enough to be a default component as important as the CPU itself. As an ironic aside, that should spur the adoption of PC gaming, given how many people would then have sufficient hardware.
Subject: Editorial, General Tech, Shows and Expos | June 15, 2011 - 09:58 PM | Ryan Shrout
Tagged: programming, microsoft, fusion, c++, amp, AFDS
During this morning's keynote at the AMD Fusion Developer Summit, Microsoft's Herb Sutter went on stage to discuss the problems and solutions involved around programming and developing for multi-processing systems and heterogeneous computing systems in particular. While the problems are definitely something we have discussed before at PC Perspective, the new solution that was showcased was significant.
C++ AMP (Accelerated Massive Parallelism) was announced as a new extension to Visual Studio and the C++ programming language to help developers take advantage of the highly parallel and heterogeneous computing environments of today and the future. The new programming model uses C++ syntax and will be available in the next version of Visual Studio, with "bits of it coming later this year." Sorry, no hard release date was given when Sutter was pressed.
Perhaps just as significant is the fact that Microsoft announced the C++ AMP standard would be an open specification, and that other compilers will be allowed to integrate support for it. Unlike C#, then, C++ AMP has a chance to become a dominant standard in the programming world as the need for parallel computing expands. While OpenCL was previously the only option that promised developers easy utilization of ALL the computing power in a device, C++ AMP gives them another option, with the full weight of Microsoft behind it.
To demonstrate the capability of C++ AMP, Microsoft showed a rigid body simulation program that ran on multiple computers and devices from a single executable file and was able to scale in performance from 3 GFLOPS on the x86 cores of Llano, to 650 GFLOPS on the combined APU, to 830 GFLOPS with a pair of discrete Radeon HD 5800 GPUs. The same executable file ran on an AMD E-series APU powered tablet at 16 GFLOPS with 16,000 particles. This is the promise of heterogeneous programming languages and the gateway necessary for consumers and businesses to truly take advantage of the processors that AMD (and other companies) are building today.
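To give a sense of what the programming model looks like, here is a minimal, hypothetical vector-add sketch using the C++ AMP syntax as announced. This is not the keynote demo; it assumes Visual C++ with C++ AMP support (the `<amp.h>` header) and a DirectX 11 capable GPU, and it will not build with other toolchains:

```cpp
#include <amp.h>
#include <vector>
#include <iostream>

int main() {
    const int n = 1024;
    std::vector<int> a(n), b(n), c(n);
    for (int i = 0; i < n; ++i) { a[i] = i; b[i] = 2 * i; }

    // array_view wraps existing host memory; C++ AMP copies data to the
    // accelerator on demand and back when the view is synchronized.
    concurrency::array_view<const int, 1> av(n, a);
    concurrency::array_view<const int, 1> bv(n, b);
    concurrency::array_view<int, 1>       cv(n, c);
    cv.discard_data();  // c's initial contents never need to reach the GPU

    // restrict(amp) marks the lambda as compilable for the accelerator;
    // one logical thread runs per element of the 1-D extent.
    concurrency::parallel_for_each(cv.extent,
        [=](concurrency::index<1> idx) restrict(amp) {
            cv[idx] = av[idx] + bv[idx];
        });

    cv.synchronize();  // copy results back into the host vector c
    std::cout << c[10] << std::endl;  // 10 + 20 = 30
    return 0;
}
```

The notable design point is that everything here is ordinary C++ plus one keyword-like restriction specifier, which is exactly the pitch Sutter made against the separate-kernel-language approach of OpenCL.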
If you want programs other than video transcoding apps to really push the promise of heterogeneous computing, then the announcement of C++ AMP is very, very big news.
If you happened to open up the store page in the Steam client or glance at the website, you may have noticed that Steam has made a moderately big announcement. Valve's digital download service now supports Free-to-Play games: games that are free to download and play at the basic level, while aesthetic and other upgrades can be purchased via so-called "microtransactions". F2P games on Steam will still be free to download and will not require a credit card to do so.
Steam seems excited about the new F2P games.
At launch, the service features five Free-to-Play games: Champions Online: Free For All, Spiral Knights, Global Agenda: Free Agent, Forsaken World, and Alliance of Valiant Arms. According to the F2P Steam FAQ, in-game content purchases will be handled through your Steam Wallet. Further, any Steam account that does not have at least one purchased (non Free-to-Play) game or a funded Steam Wallet will be considered a "Limited User" and will be restricted in the community features it can access. Specifically, limited users can create community groups, be added as friends, and chat with other users; however, they are not able to send out friend invitations or start chat sessions (a non-limited user must initiate the chat).
In adding the new genre to its repertoire, Steam greatly increases its digital games library and adds more options for PC gamers. One game that I have not played in some time, and would love to see make its way onto the new Free-to-Play Steam selection, is an FPS called Crossfire. That game was a good example of Free-to-Play done right, as even accounts that did not spend a dime were able to stay competitive. Is there a Free-to-Play game you would like to see Steam feature, and do you think F2P will add value to the service? Let us know in the comments.
Subject: General Tech | June 15, 2011 - 04:46 PM | Jeremy Hellstrom
Tagged: gaming, duke nukem, 10 commandments
We need a new joke: the poster boy of vapourware has actually arrived, and no one remembers the Phantom console. You can catch up on all of the reviews of Duke Nukem Forever below the fold, but make sure you don't say anything mean about the game or the PR firm will get you. There are also a lot of previews from E3 to drool over, with many new games offering teases of their unreleased products.
Before you take a look at the games, The Tech Report has recently crafted 10 commandments that all PC games should follow. Read through them and see which of the new games look to be following the reasonable requirements that they have listed.
It's beside the Any key, right?
"Picture this for a second: you just unpacked the latest PlayBox 720-X blockbuster game, Gran Gears of Duty Fantasy XVIII. It's a game so juicy and dreamy that it'll send you flying into all the colors of the rainbow, twitching and jerking with pleasure-induced spasms just from looking at the loading screen. Let's assume for the sake of argument that said game is a first-person shooter, like, oh, about 135% of recent releases. You insert the Megaray disc, go about the installation process, and merrily start to play.
All of a sudden, you notice the left stick is used for switching weapons. The right stick moves the character, and shooting is only accomplished by pressing it. The camera is moved with the directional buttons, and the triangle, square, A, and B buttons are used for your character's smartass quips. You enter the menu to change the controls, but you can only navigate them using the motion sensors. After five minutes of furniture-dusting motions, you finally enter the options menu and find out there are barely any options, and none that matter. Frustrated, you throw the TenAxis controller at your 4D TV screen and take the shiny disc out of the console to find out whether it will blend."
Here is some more Tech News from around the web:
- Duke Nukem Forever Performance Test @ TechSpot
- Wot I Think: Duke Nukem Forever @ Rock, Paper, SHOTGUN
- Duke Nukem Forever Game Review (PC) @ HardwareHeaven
- Duke Nukem Forever: barely playable, not funny, rampantly offensive @ Ars Technica
- Duke Nukem Forever Review @ Techgage
- Skyrim's dragon battles top off impressive demo @ Ars Technica
- Asura's Wrath is effortlessly weird and weirdly awesome @ Ars Technica
- Gaming Friday – Bulletstorm @ ThinkComputers
- Serious Sam 3 adds no modern gameplay, thank heavens @ Ars Technica
- Star Wars: The Old Republic's lead writer on good Sith, evil Jedi @ Ars Technica
- Section 8: Prejudice Review (PC) @ HardwareHeaven
- DiRT 3 Gameplay Performance @ The Register
- Assassin's Creed Revelations: The Escape From Constantinople Q&A @ HEXUS
- CANVAS Teaser Is Sinister, Awesome @ Rock, Paper, SHOTGUN
- Wot I Think: Red Faction Armageddon @ Rock, Paper, SHOTGUN
- Steam Now Offering Free-To-Play Games @ Slashdot
- A character, not a voice: tuning the narration in Bastion @ Ars Technica
- Silent Hill: Downpour - Xbox 360, PS3 @ HEXUS
- Wii? Maybe U... But I'm Not Sold Yet @ Techgage
- Super Mario 3DS hands-on: the Tanooki suit is back! @ Ars Technica
- Dead or Alive Dimensions 3DS @ Tweaktown
- Duke Nukem Forever Xbox 360 @ Tweaktown
- Icebreaker Hockey for iOS: hockey stripped down to $1 perfection @ Ars Technica
Subject: General Tech | June 15, 2011 - 04:16 PM | Jeremy Hellstrom
Tagged: servers, calxeda, arm
ARM has assembled its own Super Best Friends in a team led by Calxeda and composed of Autonomic Resources, Canonical, Caringo, Couchbase, Datastax, Eucalyptus Systems, Gluster, Momentum SI, Opscode, and Pervasive. This positions Ubuntu as the ARM OS of choice for the server room, and since the team includes companies developing applications for running cloud services, Microsoft is not the only one that should be paying attention; services like Amazon's EC2 could face new competition as well.
Calxeda's current reference machines pack 120 server nodes with 480 cores into a 2U chassis, a density which even a 1W Atom is going to find hard to match, and 1W Atoms are still a ways away. Calxeda plans to get the machines out to clients for testing by the end of the year; Intel's timetable is nowhere near that tight. Read more about the low-power battle for dominance at The Register.
"With Intel's top brass bad-mouthing ARM-based servers, upstart server chip maker Calxeda can't let Intel do all the talking. It has to put together an ecosystem of hardware and software partners who believe there's a place for a low-power, 32-bit ARM-based server platform in the data center."
Here is some more Tech News from around the web:
- ARM Fellow takes the mic at AMD event @ The Tech Report
- AMD demos Trinity laptop @ SemiAccurate
- Epson WorkForce 60 Review @ TechReviewSource
- Chameleon WiGig gains momentum @ The Register
- Having fun with Microsoft Windows @ t-break
- Microsoft squeaks on Google Nortel sale @ The Register
- A tour of Zotac's Dongguan factory @ The Tech Report
Subject: General Tech | June 14, 2011 - 09:29 PM | Jeremy Hellstrom
Tagged: audio, razer, ferox, 2.1
The Razer Ferox speakers are designed to be portable: a pair of satellites measuring 70x70x64mm (not even 3") that come in a handy carrying case. They sport batteries that should last about 11 hours and are recharged over a USB connection, but they still require a 3.5mm jack to carry the audio, something that did not impress t-break in the least. The sound quality was good for this type of speaker, which equates to barely noticeable bass and decent mids and highs. If you usually use headphones and simply need a way to share your audio, as opposed to needing new speakers, then check out the Ferox; otherwise Razer has better choices, as do Corsair and other manufacturers.
"Razer is no stranger to high quality audio equipment, what with the number of high-end stereo and surround headsets over the past years. Their breakthrough hit, the Razer Mako 2.1 THX speakers were one of the best desktop audio speakers at the time, and are still hard pressed to beat till this day. And now with the new Ferox speakers, Razer has entered the world of mobile speakers with a big bang."
Here is some more Tech News from around the web:
- Radiopaq Duo Headphones Review @ Tech-Reviews
- Thermaltake Shock Headset @ Bjorn3D
- Apacer Audio Steno AU825 MP4 Player Review @ Real World Labs
- SteelSeries 7H Headset for iPod, iPhone and iPad Review @ HardwareHeaven
- Speedlink Xbox 360 Headset Adapter @ XSReviews