TSMC gets AMD's 28nm APU business

Subject: General Tech | June 17, 2011 - 02:24 PM |
Tagged: TSMC, southern islands, northern islands, llano, global foundries, arm, amd, 40nm, 32nm, 28nm

Back in April there was a kerfuffle in the news about a deal penned between AMD, Global Foundries and TSMC.  It is not worth repeating in full since you can follow the story at the previous link; suffice it to say that it did not indicate problems with the relationship between AMD and Global Foundries. 

The previous post was specifically about 40nm and 32nm process chips; today, however, we hear from DigiTimes that TSMC has scored a deal with AMD for the 28nm Southern Islands GPUs we have seen so much of recently, as well as the 28nm Krishna and Wichita APUs.  TSMC already produces the 40nm Northern Islands GPUs.  That leaves a lot of production capacity free at Global Foundries to work on ARM processors.  

DT_AMD_APU.jpg

"AMD reportedly has completed the tape-out of its next-generation GPU, codenamed Southern Islands, on Taiwan Semiconductor Manufacturing Company's (TSMC) 28nm process with High-k Metal Gate (HKMG) technology, according to a Chinese-language Commercial Times report. The chip is set to expected to enter mass produciton at the end of 2011.

TSMC will also be AMD's major foundry partner for the 28nm Krishna and Wichita accelerated processing units (APUs), with volume production set to begin in the first half of 2012, the report said.

TSMC reportedly contract manufactures the Ontario, Zacate and Desna APUs for AMD as well as the Northern Islands family of GPUs. All of these use the foundry's 40nm process technology.

TSMC was quoted as saying in previous reports that it had begun equipment move-in for the phase one facility of a new 12-inch fab (Fab 15) with volume production of 28nm technology products slated for the fourth quarter of 2011. The foundry previously said it would begin moving equipment into the facility in June, with volume production expected to kick off in the first quarter of 2012."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes

Microsoft reiterates stance on 'harmful' WebGL

Subject: General Tech, Graphics Cards, Mobile | June 17, 2011 - 04:35 AM |
Tagged: webgl, microsoft

Microsoft has made substantial efforts lately to increase their support of open standards, even to the point of giving them first-class treatment ahead of their home-grown formats. Internet Explorer 9 shows the best support for web standards such as HTML5, CSS, and JavaScript that the browser line has ever had. One feature set, however, has been outright omitted from Internet Explorer: WebGL. Microsoft has now made a more official statement on the subject, calling it harmful from a security standpoint.

WebGL-Angel.png

WebGL: Heaven or Hell?

(Image from MrDoob WebGL demo; contains Lucy model from Stanford 3D repository)

WebGL is an API very similar to OpenGL ES 2.0, the API used for OpenGL features on embedded systems, particularly smartphones. The goal of WebGL is to provide a lightweight, CSS-aware 3D and shader system for websites that require advanced 3D graphics or even general-purpose calculations performed on the shader units of the client’s GPU. Mozilla and Google already support it in their shipping browsers, with Opera and Apple expected to follow in the near future. Microsoft has stated that allowing third-party websites that level of access to the hardware is dangerous, as security vulnerabilities that formerly needed to be exploited locally could now be exploited from the web browser. This is an area Microsoft knows all too well: their past attempts at active(x)ly adding scripting functionality to the web browser evolved into a decade-long game of whack-a-mole for security holes.

But skeptics of Microsoft’s position could easily point out that the company has singled out the one standard based on OpenGL, a competitor to its still-cherished DirectX. Regardless of Microsoft’s motives, the statement seems to put to rest the question of whether WebGL will appear in any release of Internet Explorer currently in development.

Do you think Microsoft is warning its competitors about its past ActiveX woes, or is this more politically motivated? Comment below (registration not required).

Source: Microsoft

Intel Enterprise SSDs Specifications

Subject: General Tech, Storage | June 16, 2011 - 03:02 PM |
Tagged: ssd, Intel, enterprise

Intel is currently in the process of releasing their 2011 lineup of solid state drives. Plenty of news and products have appeared around their consumer 300-series and enthusiast 500-series lines, but things have been pretty quiet regarding their enterprise 700-series products. That changed recently when specifications surfaced, reported by Anandtech via the German hardware website ComputerBase.de.

11-intel.png

And how does it compare to OCZ?

Intel will be releasing two enterprise SSDs: the SATA 3 Gbps based 710, codenamed Lyndonville, and the PCI Express 2.0 based 720, codenamed Ramsdale. The SATA-based 710 will feature 25nm MLC-HET flash at capacities of 100, 200, and 300 GB, with read and write speeds of 270/210 MB/s, 35,000/3,300 read and write IOPS at 4KB, and a 64MB cache. The PCIe-based 720 will feature 34nm SLC flash at capacities of 200 and 400 GB and will be substantially faster than the 710, with read and write speeds of 2,200/1,800 MB/s, 180,000/56,000 read and write IOPS at 4KB, and a 512MB cache. On the security front, the 710 will offer 128-bit AES encryption while the 720 will offer 256-bit AES.

While there has been no hint at pricing for these drives, Intel is still expected to hit a second-quarter release for the SATA-based 710. If you are looking for a PCI Express SSD you will need to be a bit more patient, as the 720 is not expected until the fourth quarter. It will be interesting to see how the Intel vs. OCZ fight for dominance in the PCIe-based SSD space plays out in 2012.

Source: Anandtech

Microsoft is probably laughing as AMD speculates on the unlikelihood of Intel buying NVIDIA

Subject: General Tech | June 16, 2011 - 12:57 PM |
Tagged: amd, Intel, nvidia

In some sort of bizarre, voyeuristic hardware love/hate triangle, AMD, Intel and NVIDIA are all semi-intertwined and being observed by Microsoft. Speaking with The Inquirer, AMD's VP of product and platform marketing, Leslie Sobon, stated that there was no chance Intel would attempt to purchase NVIDIA the way AMD did with ATI.  AMD's purchase was less about the rights to the Radeon series than about taking possession of the intellectual property ATI had built up over a decade of creating GPUs, which led directly to the APUs that AMD has recently released and which will likely become its main product.  Intel already has a working architecture that combines GPU and CPU and doesn't need to purchase another company's IP to develop that type of product. 

There is another reason to purchase NVIDIA though, one that has very little to do with its discrete graphics card IP and everything to do with Tegra and Fermi, two specialized products that Intel so far doesn't have an answer for.  A vastly improved and shrunken Atom might be able to push Tegra off of mobile platforms, and perhaps specialized Sandy Bridge CPUs could accelerate computation the way the Fermi products do, but so far there is nothing solid, only speculation.

If you learn more from your failures than your successes then Intel knows a lot about graphics.

microsoft.jpg

"CHIP DESIGNER AMD believes that it is on a divergent path from Intel thanks to its accelerated processor unit (APU) and that Intel buying Nvidia "would never happen"."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

AMD announces new OpenCL programming tools

Subject: General Tech, Shows and Expos | June 15, 2011 - 09:14 PM |
Tagged: opencl, amd, AFDS

If you develop applications that require more performance than a CPU alone can provide, then you are probably having a gleeful week. Today Microsoft announced their competitor to OpenCL, and we have a large write-up about that aspect of their keynote address. If you are currently an OpenCL developer you are not left out, however, as AMD has announced new tools designed to make your life easier too.

OpenCL_Logo.png

General Purpose GPU utilities: Because BINK won't satisfy this crowd.

(Logo trademark Apple Inc.)

AMD’s spectrum of enhanced tools includes:

  • gDEBugger: An OpenCL and OpenGL debugger, profiler, and memory analyzer released as a plugin for Visual Studio.
  • Parallel Path Analyzer (PPA): A tool designed to profile data transfers and kernel execution across your system.
  • Global Memory for Accelerators (GMAC) API: Lets developers use multiple devices without needing to manage separate data buffers on both the CPU and the GPU; the sketch after this list shows the bookkeeping this replaces.
  • Task Manager API: A framework to manage scheduling kernels across devices. 
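For context, here is a minimal sketch of the explicit buffer juggling a plain OpenCL host program performs today using the standard C API (context, queue, and kernel setup are assumed to exist elsewhere, and error handling is omitted); this is the host-to-device bookkeeping that a unified-memory layer like GMAC aims to hide:

```cpp
// Conventional OpenCL host code: the application owns one copy of the data,
// the device owns another, and the programmer shuttles bytes between them.
#include <CL/cl.h>

void run_kernel(cl_context ctx, cl_command_queue queue, cl_kernel kernel,
                float *host_data, size_t n)
{
    // 1. Create a separate device-side buffer mirroring the host array.
    cl_mem dev_buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE,
                                    n * sizeof(float), NULL, NULL);

    // 2. Copy host -> device before the kernel may touch the data.
    clEnqueueWriteBuffer(queue, dev_buf, CL_TRUE, 0, n * sizeof(float),
                         host_data, 0, NULL, NULL);

    // 3. Bind the device buffer to the kernel and launch it.
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &dev_buf);
    size_t global_size = n;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global_size, NULL,
                           0, NULL, NULL);

    // 4. Copy device -> host again so the application can see the results.
    clEnqueueReadBuffer(queue, dev_buf, CL_TRUE, 0, n * sizeof(float),
                        host_data, 0, NULL, NULL);

    clReleaseMemObject(dev_buf);
}
```

As the GMAC bullet above describes, the goal is for a single allocation to serve both sides so that transfers like these become the runtime's problem rather than the developer's.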

These tools and utilities should make developing software easier and allow more developers to take a risk on the new technology. The GPU has already proven itself worthy of more and more important tasks, and it is only a matter of time before it is ubiquitous enough to be considered a default component as important as the CPU itself. As an ironic aside, that should spur the adoption of PC gaming, given how many people would then have sufficient hardware.

Source: AMD

AFDS11: Microsoft Announces C++ AMP, Competitor to OpenCL

Subject: Editorial, General Tech, Shows and Expos | June 15, 2011 - 05:58 PM |
Tagged: programming, microsoft, fusion, c++, amp, AFDS

During this morning's keynote at the AMD Fusion Developer Summit, Microsoft's Herb Sutter went on stage to discuss the problems and solutions involved in programming and developing for multi-processing systems, and heterogeneous computing systems in particular.  While the problems are definitely something we have discussed before at PC Perspective, the new solution that was showcased was significant.

014.jpg

C++ AMP (accelerated massive parallelism) was announced as a new extension to Visual Studio and the C++ programming language to help developers take advantage of the highly parallel and heterogeneous computing environments of today and the future.  The new programming model uses C++ syntax and will be available in the next version of Visual Studio, with "bits of it coming later this year."  Sorry, no hard release date was given when we probed for one.

Perhaps just as significant is the fact that Microsoft announced the C++ AMP standard would be an open specification, and that other compilers will be allowed to integrate support for it.  Unlike C#, then, C++ AMP has a chance to become a dominant standard in the programming world as the need for parallel computing expands.  While OpenCL was previously the only option that promised developers easy utilization of ALL the computing power in a device, C++ AMP gives them another option with the full weight of Microsoft behind it.
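To give a flavor of the model, here is a minimal, hypothetical sketch based on the C++ AMP syntax Microsoft has shown publicly (exact headers, namespaces, and details may change before release). It adds two vectors on whichever accelerator the runtime selects:

```cpp
#include <amp.h>      // C++ AMP runtime, as shown in Microsoft's examples
#include <vector>
using namespace concurrency;

void vector_add(const std::vector<float>& a, const std::vector<float>& b,
                std::vector<float>& c)
{
    // array_view wraps existing host memory and handles copies to and from
    // the accelerator implicitly.
    array_view<const float, 1> av(static_cast<int>(a.size()), a);
    array_view<const float, 1> bv(static_cast<int>(b.size()), b);
    array_view<float, 1>       cv(static_cast<int>(c.size()), c);
    cv.discard_data();  // no need to copy c's stale contents to the device

    // The restrict(amp) lambda is plain C++ that the compiler can also
    // target at the GPU's data-parallel hardware.
    parallel_for_each(cv.extent, [=](index<1> i) restrict(amp) {
        cv[i] = av[i] + bv[i];
    });

    cv.synchronize();   // copy the results back into the host vector
}
```

The appeal over OpenCL, at least on paper, is that there is no separate kernel language or explicit buffer management: it is ordinary C++ with one restriction keyword.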

016.jpg

To demonstrate the capability of C++ AMP, Microsoft showed a rigid body simulation that ran on multiple computers and devices from a single executable file, scaling in performance from 3 GFLOPS on the x86 cores of Llano, to 650 GFLOPS using the combined power of the APU, to 830 GFLOPS with a pair of discrete Radeon HD 5800 GPUs.  The same executable was run on an AMD E-series APU powered tablet at 16 GFLOPS with 16,000 particles.  This is the promise of heterogeneous programming languages and the gateway consumers and businesses need to truly take advantage of the processors that AMD (and other companies) are building today. 

If you want programs other than video transcoding apps to really push the promise of heterogeneous computing, then the announcement of C++ AMP is very, very big news. 

Source: PCPer

Steam Now Offering Free To Play Games On Its Digital Download Service

Subject: General Tech | June 15, 2011 - 04:17 PM |
Tagged: steam, PC, gaming, F2P

If you happened to open up the store page in the Steam client or glance at their website, you may have noticed that Steam has made a moderately big announcement. Valve's digital download service now supports Free-to-Play games: games that are free to download and play at a basic level, with aesthetic and other upgrades available for purchase via so-called "microtransactions". F2P games on Steam will still be free to download and will not require a credit card.

 

Steam seems excited about the new F2P games.

At launch, the service is featuring five Free-to-Play games: Champions Online: Free For All, Spiral Knights, Global Agenda: Free Agent, Forsaken World, and Alliance of Valiant Arms.  According to the F2P Steam FAQ, in-game purchases will be made through your Steam Wallet.  Further, any Steam account that does not have at least one purchased (non Free-to-Play) game or a funded Steam Wallet will be considered a "Limited User" and will be restricted in the community features it can access.  Specifically, limited users can create community groups, be added as friends, and chat with other users; however, they are not able to send out friend invitations or start chat sessions (a non-limited user must initiate the chat).

In adding the new genre to its repertoire, Steam will greatly increase its digital games library and add more options for PC gamers.  One game that I have not played in some time and would love to see make its way onto the new Free-to-Play Steam selection is an FPS called Crossfire.  That game was a good example of Free-to-Play done right, as even accounts that did not spend a dime were able to stay competitive.  Is there a Free-to-Play game that you would like to see Steam feature, and do you think F2P will add value to the service?  Let us know in the comments.

Source: Steam

Gaming from E3, starring The Duke

Subject: General Tech | June 15, 2011 - 12:46 PM |
Tagged: gaming, duke nukem, 10 commandments

We need a new joke: the poster boy of vapourware has actually arrived, and no one remembers the Phantom console.  You can catch up on all of the reviews of Duke Nukem Forever below the fold, but make sure you don't say anything mean about the game or the PR firm will get you.  There are also a lot of previews from E3 to drool over, with many developers offering teases of their unreleased games.

Before you take a look at the games, note that The Tech Report has recently crafted 10 commandments that all PC games should follow.  Read through them and see which of the new games appear to follow the reasonable requirements they have listed. 

TR_example.jpg

It's beside the Any key, right?

"Picture this for a second: you just unpacked the latest PlayBox 720-X blockbuster game, Gran Gears of Duty Fantasy XVIII. It's a game so juicy and dreamy that it'll send you flying into all the colors of the rainbow, twitching and jerking with pleasure-induced spasms just from looking at the loading screen. Let's assume for the sake of argument that said game is a first-person shooter, like, oh, about 135% of recent releases. You insert the Megaray disc, go about the installation process, and merrily start to play.

All of a sudden, you notice the left stick is used for switching weapons. The right stick moves the character, and shooting is only accomplished by pressing it. The camera is moved with the directional buttons, and the triangle, square, A, and B buttons are used for your character's smartass quips. You enter the menu to change the controls, but you can only navigate them using the motion sensors. After five minutes of furniture-dusting motions, you finally enter the options menu and find out there are barely any options, and none that matter. Frustrated, you throw the TenAxis controller at your 4D TV screen and take the shiny disc out of the console to find out whether it will blend."

Here is some more Tech News from around the web:

Gaming

A look at what ARM could be doing in your server room

Subject: General Tech | June 15, 2011 - 12:16 PM |
Tagged: servers, calxeda, arm

ARM has assembled their own Super Best Friends in a team led by Calxeda and composed of Autonomic Resources, Canonical, Caringo, Couchbase, Datastax, Eucalyptus Systems, Gluster, Momentum SI, Opscode, and Pervasive.  This positions Ubuntu as the ARM OS of choice for the server room, and since the group includes companies developing software for running cloud services, Microsoft is not the only one that should be paying attention; services like Amazon's EC2 could face new competition as well. 

Calxeda's current reference machines pack 120 server nodes with 480 cores into a 2U chassis, a density that even a 1W Atom is going to find hard to match, and the 1W Atoms are still a ways away.  Calxeda plans to get the machines out to clients for testing by the end of the year; Intel's timetable is nowhere near that tight.  Read more about the low-power battle for dominance at The Register.

1405_thumb.jpg

"With Intel's top brass bad-mouthing ARM-based servers, upstart server chip maker Calxeda can't let Intel do all the talking. It has to put together an ecosystem of hardware and software partners who believe there's a place for a low-power, 32-bit ARM-based server platform in the data center."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Razer thinks small with their new Ferox 2.1 speakers

Subject: General Tech | June 14, 2011 - 05:29 PM |
Tagged: audio, razer, ferox, 2.1

The Razer Ferox speakers are designed to be portable: a pair of satellites measuring 70x70x64mm (not even 3") that come in a handy carrying case.  They sport batteries that should last about 11 hours and recharge over a USB connection, but they still require a 3.5mm jack to carry the audio, something that did not impress t-break in the least.  The sound quality was good for this type of speaker, which equates to barely noticeable bass and decent mids and highs.  If you usually use headphones and simply need a way to share your audio, as opposed to needing new speakers, then check out the Ferox; otherwise Razer has better choices, as do Corsair and other manufacturers.

t-b_razer_ferox.jpg

"Razer is no stranger to high quality audio equipment, what with the number of high-end stereo and surround headsets over the past years. Their breakthrough hit, the Razer Mako 2.1 THX speakers were one of the best desktop audio speakers at the time, and are still hard pressed to beat till this day. And now with the new Ferox speakers, Razer has entered the world of mobile speakers with a big bang."

Here is some more Tech News from around the web:

Audio Corner

Source: t-break

HP announces 11 models using AMD Vision Technology

Subject: General Tech, Systems | June 14, 2011 - 03:24 PM |
Tagged: llano, hp

8-amd.png

Level up! Llano life increased by 11 HP.

So, AMD is having a little shindig right now, as you might be aware from recent news posts, and news is just leaking from the rafters. HP recently contacted us to announce that they have expanded both their consumer and business product lines to include 11 new models using “AMD’s latest Vision Technology”. What this means is that we can expect a large array of products from HP utilizing the latest generation of AMD CPUs and GPUs from the new Llano-based AMD A-Series line. Expect a helping of Llano on your HP in the near future.

Source: HP

Getting TCP out of the strange loop it is stuck in

Subject: General Tech | June 14, 2011 - 12:02 PM |
Tagged: http, tcp, spdy, Internet

Google has been working on SPDY, a new protocol intended to speed up HTTP without forcing changes to existing websites or protocols.  This application-layer protocol sits between HTTP and TCP, replacing neither; instead it translates between the application layer and the transport layer to optimize certain parts of the transaction.  Specifically, it aims to multiplex multiple requests over a single TCP connection, something browsers currently approximate with the workaround of opening parallel connections, and to let servers push data to clients more effectively.  Google is also working to reduce latency by shrinking the headers that are transported, which will become even more important in the near future, not only as a way to speed up SSL connections but to help offset the larger headers of IPv6. 
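As a rough illustration of why header compression matters, consider how well a typical, highly repetitive set of HTTP request headers shrinks under plain zlib. This toy sketch is not SPDY itself (the real protocol's compression context, dictionary, and framing differ); it just shows the kind of savings on the table:

```cpp
#include <zlib.h>
#include <cstdio>
#include <cstring>

int main()
{
    // A plausible set of request headers; most of this text repeats verbatim
    // on every single request a page makes.
    const char* headers =
        "GET /index.html HTTP/1.1\r\n"
        "Host: www.example.com\r\n"
        "User-Agent: Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.30\r\n"
        "Accept: text/html,application/xhtml+xml,application/xml;q=0.9\r\n"
        "Accept-Encoding: gzip,deflate\r\n"
        "Accept-Language: en-US,en;q=0.8\r\n\r\n";

    uLong  src_len = static_cast<uLong>(std::strlen(headers));
    uLongf dst_len = compressBound(src_len);
    Bytef  compressed[1024];

    // One shot through zlib's convenience API; keeping a running compression
    // context across requests helps even more.
    if (compress(compressed, &dst_len,
                 reinterpret_cast<const Bytef*>(headers), src_len) == Z_OK) {
        std::printf("headers: %lu bytes raw, %lu bytes compressed\n",
                    static_cast<unsigned long>(src_len),
                    static_cast<unsigned long>(dst_len));
    }
    return 0;
}
```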

Up until now SPDY has only been available in Chrome, and even then only for certain Google sites that use the new protocol.  Now Strangeloop is offering an online service as well as hardware that will let you implement SPDY without changing your website or host.  The Register covers the long overdue change here.

speedy-gonzales.jpg

"Strangeloop – a Vancouver-based outfit offering an online service for accelerating website load times – has embraced Google's SPDY project, a new application-layer protocol designed to significantly improve the speed of good ol' HTTP."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Intel announces Haswell's new instruction set for 2013

Subject: General Tech, Processors | June 14, 2011 - 02:47 AM |
Tagged: Intel, haswell

Intel’s new processor lines come in two flavors: process shrinks and new architectures. Each revision arrives approximately a year after the prior one, alternating between new architectures (tock) and process shrinks (tick). Sandy Bridge was the most recent new architecture; it will be followed by Ivy Bridge, a process shrink of Sandy Bridge, which in turn will be succeeded by Intel’s newest architecture: Haswell.

11-intel.png

I can Haswell?

The instructions Intel is adding for its upcoming Haswell architecture are useful for a whole range of applications, from image and video processing to face detection, database manipulation, hash generation, and arithmetic in general. As you can see, the additions in this revision are quite wide in scope. Keep in mind that the introduction of a new instruction set does not mean programs will take advantage of it right away; it usually takes some time before software is optimized for a new architecture. When programs do start optimizing for it, though, it looks as though Haswell’s new offerings will collapse otherwise complicated tasks into a single instruction.
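For a sense of what collapsing work into single instructions can look like, here is a hypothetical sketch using 256-bit integer SIMD intrinsics of the sort Intel has documented for Haswell (AVX2); no shipping compiler or CPU supported these at the time of writing, so treat the details as illustrative rather than something you can build today:

```cpp
#include <immintrin.h>   // Intel SIMD intrinsics header
#include <cstddef>
#include <cstdint>

// Add two arrays of 32-bit integers, eight elements per instruction.
void add_arrays(const int32_t* a, const int32_t* b, int32_t* out, size_t n)
{
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256i va = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(a + i));
        __m256i vb = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(b + i));
        // One 256-bit add replaces eight scalar additions.
        _mm256_storeu_si256(reinterpret_cast<__m256i*>(out + i),
                            _mm256_add_epi32(va, vb));
    }
    for (; i < n; ++i)   // scalar tail for any leftover elements
        out[i] = a[i] + b[i];
}
```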

What task would you like to see a speedup on? Comment below.

Source: Intel Blog

Sing the praises of this SteelSeries board in the key of mechanical

Subject: General Tech | June 13, 2011 - 01:52 PM |
Tagged: mechanical keyboard, input, steelseries

Mechanical keyboards seem to be a hot topic, with roundups appearing to cover all of the new boards coming out.  Hardware Heaven chose to focus on one particular product, the SteelSeries 6Gv2 mechanical gaming keyboard, which thankfully didn't take 'gaming' to mean sticking extra buttons all along the side.  Cherry MX Black switches are very common among these new mechanical keyboards, though n-key rollover, being able to hit any number of keys and have them all register properly, is not something you find on all USB keyboards.  The 6Gv2 can handle multiple simultaneous keys for you circle strafers, and replacing the Windows key on the left-hand side with a 'media key' that is disabled in games is a very nice touch.  Check out the full review at Hardware Heaven, since there are some negative aspects to the design of this board.

HH_steelseries_6gv2.jpg

"For quite some time the gaming keyboard market has concentrated on products which add macro buttons, re-assignments, profiles, USB and audio pass-through and weighted key actions to enhance the gaming experience. In addition to this we see branded products such as the Razer StarCraft 2 gear and SteelSeries Medal of Honor products however few manufacturers have looked to release high quality mechanical keyboards for the gaming masses.

There have been a few though and these have clearly made an impact with gamers as we are regularly seeing manufacturers launch their own mechanical gaming models. One manufacturer which has historically offered mechanical keyboards for gamers is SteelSeries and they are now back with a new model, the 6Gv2 which we have connected to our system today."

Here is some more Tech News from around the web:

Tech Talk

Don't you love it when Patch Tuesday hits double digits

Subject: General Tech | June 13, 2011 - 11:47 AM |
Tagged: microsoft, patch tuesday, security, windows, internet explorer, silverlight

Tomorrow will see the arrival of 9 critical security patches and 7 recommended ones, covering Windows, IE, Silverlight and Office.  The critical patches all resolve remote code execution vulnerabilities, while the recommended ones range from the same type to privilege escalation and denial of service vulnerabilities.  WinXP through Win7 as well as the server OSes will all be affected, so be warned that your Tuesday and Wednesday might not be very fun.  Follow the link from The Register to see Microsoft's pre-release document for yourself.

Adobe, obviously not wanting to seem lazy, is also pushing out a patch for both Reader and Acrobat.

band-aid.jpg

"Microsoft is preparing a bumper Patch Tuesday for next week, with 16 security bulletins that collectively address 34 vulnerabilities.

Nine of the bulletins earn the dread rating of critical, while the other seven grapple with flaws rated as important. All supported versions of Windows will need patching on 14 June along with various server-side software packages and applications, including the .NET framework and SQL Server. Internet Explorer, which is affected by two bulletins, will also need some fiddling under the bonnet."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Mechanical Keyboards: Are they for you? Which one?

Subject: General Tech | June 13, 2011 - 04:15 AM |
Tagged: mechanical keyboard, cherry

There is a large amount of choice when it comes to PC components and input devices are no exception to that assertion. You are probably well aware of the multitude of choices when it comes to non-standard mice in terms of number of buttons and resolution of the optical and/or laser sensor. Keyboards have their own higher performance counterparts as well: not just in terms of how many web and media function buttons can be crammed on them, but also how the keys themselves register a press. Recently Tom’s Hardware reviewed a series of mechanical keyboards based on their switches and gave a lot of background information about what advantages and disadvantages each switch has.

logo_cherry.gif

Are you a mechanical keyboard virgin? Feeling the MX Blues?

(Logo from the Cherry Corporation)

My first couple of keyboards were the old IBM Model M buckling spring keyboards. When I eventually got a later computer I moved on to cheap keyboards and immediately missed my original mechanical ones. Years and a little shopping around later, I settled on the Logitech G15v1 as my first attempt at a higher-end gaming keyboard. It was with the G15v1 that I experienced the serious limitations of some, particularly non-mechanical, keyboards: I am a left-handed gamer. The G15v1 was optimized for right-handed gamers, as a lot of arrow-key combinations with shift or control were not registered by the keyboard; Logitech assumed, when they designed it, that everyone’s mouse would be on the right of the keyboard and thus the farther-away WASD keys would be used for movement. Consider playing as a Scout in Team Fortress 2 but not being able to jump sideways, and only being able to crouch-walk in a straight line. While each keyboard jams on a different set of key combinations, it was events like those that led me to go overkill and purchase a mechanical keyboard with NKRO, attached via PS/2 port.

Do you have any keyboard stories? Comment below. Otherwise, check out Tom’s Hardware’s guide and review to mechanical keyboards.

The Next Generation is 3D HD SMARTBoards

Subject: General Tech, Displays | June 12, 2011 - 05:56 PM |
Tagged: SMART, 3d

SMART has been making interactive whiteboards for quite some time now. An interactive whiteboard is essentially a giant writing tablet, similar to a Wacom, that doubles as a projector screen; it is usually wall mounted but can also be mounted on a cart. SMART Boards attach to PCs over USB, and can take your video and audio output as well if you purchase one with an attached projector and speakers rather than supplying your own. Recently SMART announced and released their fifth-generation product line, complete with a projector supporting HDMI input and active 3D technology.

35-planetsLesson.jpg

IT’S LIKE I CAN TOUCH YOU!

(Image by SMART Technologies)

While I can see this being useful for companies that show 3D content during company, investor, and vendor meetings, it seems a little unlikely that active 3D will appear in the classroom. It is hard to imagine outfitting twenty to forty students with their own active shutter 3D glasses on top of the investment in the 3D interactive whiteboard itself. Also, while the HDMI input may be there to support the projector's 3D functionality, it seems odd to include HDMI yet barely exceed 720p resolution (1280x800) on your highest-end projector.

If an interactive whiteboard interests you but you were holding out until you could pop things out at your audience, the new SMART Boards have been available since May 25th in North America and May 30th internationally. Prices range between $3,000 and $4,000 US, computer not included. If you already have a digital whiteboard but want the 3D projector upgrade, that will cost just north of $2,000.
Source: SMART

Demand For IT Workers Remains High In US Despite Economy

Subject: General Tech | June 12, 2011 - 01:12 PM |
Tagged: US, technology, networking, IT

The US has seen a rather rapid rise in unemployment in the last few years as companies cut back on staff and computing costs. According to Computer World, Tom Silver was quoted as saying “several years ago companies cut back pretty far, particularly in infrastructure and technology development.” Silver further believes that the tech unemployment rate is half the national unemployment rate because companies need to replace aging hardware and software and deal with increased security threats. In a recent biannual hiring survey of roughly 900 hiring managers and headhunters conducted by Dice.com, 65% of respondents plan on bringing even more new workers into their businesses in the second half of 2011 than in the first half.

Workers with mobile operating system, hardware, and ecosystem expertise, along with Java development skills, are the most desirable technology workers, according to Computer World, although anyone with an IT background and recent programming skills has a fairly good chance of landing a job in a market that is demanding now-rare talent. Employers are starting to feel more confident in the economy and are thus more willing to invest in new workers. In an era where Internet security is more important than ever, skilled enterprise IT workers are becoming a valuable asset to employers, who are increasingly fighting over rare talent and enticing new workers with higher salaries.

Even though businesses remain cautious in their hiring, it is definitely a good sign for people with tech backgrounds looking for work as the market ever so slowly starts to bounce back. For further information on the study, Computer World has the full scoop here.

Are you in or studying to enter into the IT profession? Do you feel confident in the US employers' valuation of their IT workers?

Dell Survey Suggests CEOs Believe Cloud Computing Is A Fad

Subject: General Tech | June 12, 2011 - 10:36 AM |
Tagged: networking, dell, cloud computing

A recent survey, conducted during the first two days of the Cloud Expo by Marketing Solutions and sponsored by Dell, suggests that IT professionals believe their less technical CEOs consider cloud computing a "fad" that will soon pass. IT departments, on the other hand, see the opportunities and potential of the technology. This gap between the two groups, according to Dell, lies in "the tendency of some enthusiasts to overhype the cloud and its capacity for radical change." Especially with a complex and still-evolving technology like cloud computing, CEOs are less likely to see the potential benefits and more likely to see the obstacles and the cost of adoption.

The study surveyed 223 respondents from various industries (excluding technology providers), and found that IT professionals' own attitudes toward "the cloud" and what they felt their respective CEOs' attitudes were differed considerably. The pie graphs in figure 1 of the study illustrate the gap between the two groups mentioned earlier. Where 47% of those in IT see cloud computing as a natural evolution of the trend towards remote networks and virtualization, only 26% believed their CEOs agreed. Also, while 37% of IT professionals stated that cloud computing is a new way to think about their function in IT, "37 percent deemed their business leaders most likely to describe the cloud as having “immense potential,” contrasted with only 22 percent of the IT pros who said that was their own top descriptor."

Further, the survey examined what both IT professionals and CEOs believed to be obstacles to adopting cloud computing. On the IT professionals' side, 57% named data security as the biggest issue, 32% cited industry compliance and governance, and 27% pointed to disaster recovery options, contrasted with 51%, 30%, and 22% of CEOs respectively. This comparison can be seen in figure 2 of the study.

While the survey clearly indicates that enterprises' IT departments are the most comfortable with the idea of adopting cloud computing, other areas of the business that could greatly benefit from it are far more opposed. As seen in figure 3 of the study, while 66% of IT departments are willing to advocate for cloud computing, only 13% of Research and Development, 13% of Strategy and Business Development, and a mere 5% of Supply Chain Management departments feel that they would move to cloud computing and benefit from it.

Dell stated that IT may be able to help many more functions and departments by advocating for and implementing cloud computing strategies in information-gathering and data-analysis departments. In doing so, IT could benefit the entire company and further educate CEOs on cloud computing's usefulness, closing the gap between IT professionals' and CEOs' beliefs.

You can read more about the Dell study here. How do you feel about cloud computing?

Source: Dell

Windows 8 UI: Wait, Windows 7?

Subject: General Tech | June 12, 2011 - 04:08 AM |
Tagged: windows 8, ImmersiveUI

Microsoft announced and demonstrated their Windows 8 interface a couple of weeks ago, and since then it has drawn some love and some hate from various groups. The new paradigm of tiles that display live information from their programs, particularly presented in that fashion, seems better suited to a tablet than to a traditional desktop interface. Regardless, there would likely be some use for such an interface on the desktop, and you do not need Windows 8 to unofficially have it.

“Start”: must be Windows.

ImmersiveUI developer Sergio James Bruccoleri has released a video showing his pre-beta interface for Windows 7. In his demonstration he showed various websites and programs being launched from tiles that display a little bit of live feedback, such as his Facebook name and his Xbox Live gamertag with avatar. Bruccoleri has stated that a public beta is forthcoming with “effects and some cool stuffs.”

Would you find yourself adding this to your Windows desktop? If so, on what device?

Source: WinRumors