Linux Mint Rising In Popularity And Surpassing Ubuntu For Top Spot

Subject: General Tech | November 26, 2011 - 04:53 AM |
Tagged: Unity, ubuntu, mint, linux, katya

Linux Mint Is On The Rise

Ubuntu has long been the popular choice when it comes to Linux distributions. At one time the open source operating system was even picked by Dell, a large computer OEM, for the company’s netbooks and select desktop computers. As far as free alternative operating systems go, Ubuntu was the top choice of many Linux users. Lately, however, the distro seems to be declining in popularity. According to ZDNet, Pingdom has gathered Linux market share data from the past few years and found that the once dominant Ubuntu has given up a great deal of ground to competing distributions. In particular, Linux Mint has risen to the 11% usage level that Ubuntu held at its prime, while Ubuntu itself sits at just 4% in 2011.

linux-mint-11.png

Linux Mint 11's desktop.

Interestingly, Linux Mint started at 0% adoption in 2005, the same year Ubuntu stood at 11%. Mint grew to 4% by 2007, crept up to 5% in 2010, and from there climbed rapidly to its current 11% market usage as of November 23, 2011 (based on DistroWatch ranking data).

Linux Mint 11 is a respectable and speedy distribution, and it is comparatively media friendly and easy to use out of the box for newcomers. These qualities have likely contributed to the operating system’s place on the Top 5 Linux Distribution list.

24-11-2011-14-39-03.png

Wait- What Happened To Ubuntu?

Ubuntu gained fame due to its friendliness to newcomers, casual users, and enthusiasts/power users alike. Adrian Kingsley-Hughes over at ZDNet notes that the operating system’s popularity is wavering. Linux fans have cited Ubuntu’s recent interface overhaul, dubbed Unity, as a possible source of the decline in popularity. Kingsley-Hughes, however, believes that in its latest iterations Ubuntu has spread itself too thin by attempting to appeal to too many people at once.

Install_8.png

The Ubuntu 11.10 installation. One of several installer slides showing everything that is packed into Ubuntu.

On that point I think he is correct. Ubuntu has been attempting to become the Windows of the Linux space. This goal in and of itself is a noble one; however, it also goes against the grain of the “ideal Linux OS” (meaning the OS that users want to use). Linux itself is, by comparison, a niche operating system, and under that general umbrella sit numerous distributions that are even more niche: highly specialized products and user experiences.

I have to concur with Mr. Kingsley-Hughes on this one. Even with my own lackluster (or “meh,” in less technical terms ;) ) opinion of Ubuntu’s Unity, the interface is not bad enough, or hard enough to get rid of, to cause such a drop in usage by itself. The inherent purpose of a Linux distro is to be a highly specialized, customizable user experience that is easily tailored to a specific user’s wants and needs. Ubuntu is falling out of favor with many Linux fans because it is trying too hard to appeal to everyone, taking a “jack of all trades, master of none” approach instead of being the perfect distribution for each of the individual niches that make Linux so appealing in the first place. Many design and under-the-hood changes have been made to Ubuntu to accommodate this mainstream goal, and as a result many configurations are no longer as easily achieved. More programs are now included by default, and more of them run in the background to maintain the something-for-everyone system, which is not what many Linux fans want out of their distributions. They want a distro that does only what they want, with as small a resource footprint as possible, while still being productive.

What are your thoughts? Is there a reason for Ubuntu’s decline or is the distro’s time in the spotlight simply over (for now at least)? Have you moved on from Ubuntu? You can read more about the Linux usage data here.

Source: ZDNet

Intel Processors Power The Majority of Top 500 Supercomputers, Looking to Expand With MIC Solutions

Subject: General Tech, Processors | November 25, 2011 - 08:45 PM |
Tagged: xeon, SC11, mic, many integrated core, knights corner, Intel

This year saw the 40th anniversary of (the availability of) the world’s first microprocessor, the Intel 4004, and Intel is as strong as ever. On the supercomputing and HPC (High Performance Computing) front, Intel processors power the majority of the Top 500 supercomputers, and at this year’s supercomputing conference (SC11) the company talked about its current and future high performance silicon. Intel talked mainly about its new Xeon E5 family of processors and Knights Corner, the new Many Integrated Core (MIC) successor to Larrabee.

220px-Intel_xeon_e7.jpg

The Intel Xeon E5 is available now.

The new Xeon chips are launching now and should be widely available within the first half of 2012. Several (lucky) supercomputing centers have already gotten their hands on the new chips, which now power 10 systems on the Top 500 list; those 20,000 Xeon E5 CPUs deliver a combined 3.4 petaflops, or roughly 170 gigaflops per chip.

Intel is expecting a respectable 70% performance increase on HPC workloads versus the previous generation Xeon 5600 CPUs. Further, Intel stated that the new E5 silicon is capable of as much as a 2x increase in raw FLOPS, according to Linpack benchmarks.

Intel is reporting that demand for the initial production run chips is “approximately 20 times greater than previous generation processors.” Rajeeb Hazra, the General Manager of Technical Computing in Intel’s Datacenter and Connected Systems Group, stated that “customer acceptance of the Intel Xeon E5 processor has exceeded our expectations and is driving the fastest debut on the TOP 500 list of any processor in Intel’s history.” The company further pointed to several supercomputers that are set to go online soon powered by the new E5 CPUs, including the 10 petaflops Stampede system at the Texas Advanced Computing Center and the 1 petaflops Pleiades expansion for NASA.

Intel Xeon Top 500.png

While Intel processors power the majority of the world’s fastest supercomputers, graphics hardware and GPGPU software have started to make their way into quite a few systems as powerful companion processors that can greatly outperform a similar number of traditional CPUs (assuming the software can take advantage of the GPU hardware, of course). In response, Intel has been working on its own MIC (Many Integrated Core) solution for a few years now. Starting with Larrabee, then Knights Ferry, and now Knights Corner, Intel has been building silicon that uses numerous small processing cores speaking the x86 instruction set to power highly parallel applications. Examples Intel gives of useful applications for its Many Integrated Core hardware include weather modeling, tomography, and protein folding.

Intel Many Integrated Core.png

Knights Corner is the company’s latest iteration of MIC hardware and the first to be made commercially available. It is capable of delivering more than 1 teraflops of double precision floating point performance. Hazra stated that “having this performance now in a single chip based on Intel MIC architecture is a milestone that will once again be etched into HPC history,” much like Intel’s first teraflops supercomputer, which utilized 9,680 Pentium Pro CPUs in 1997.

What’s most interesting about Knights Corner is its ability to run existing applications without porting them to alternative programming languages like NVIDIA’s CUDA or AMD’s Stream. That is not to say that the hardware itself is uninteresting, however. Knights Corner will be produced using Intel’s Tri-Gate transistors on a 22nm manufacturing process and will feature “more than 50 cores.” Unlike current GPGPU solutions, the Knights Corner hardware is fully accessible and can be programmed as if the card were its own HPC node running a Linux based operating system.
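To make that porting contrast concrete, below is a minimal sketch (my own illustration, not Intel code) of an ordinary C + OpenMP kernel. On a GPU, a loop like this would have to be rewritten in CUDA or Stream; because the MIC cores speak x86, Intel’s pitch is that code like this can simply be recompiled for the card.

    /* Illustrative only: plain C + OpenMP of the sort Intel says can run
     * on MIC hardware without a GPGPU port. All names here are hypothetical. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <omp.h>

    /* y = a*x + y over a large array, spread across all available cores. */
    static void saxpy(long n, float a, const float *x, float *y)
    {
        #pragma omp parallel for
        for (long i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        long n = 1L << 24;
        float *x = malloc(n * sizeof *x);
        float *y = malloc(n * sizeof *y);
        if (!x || !y)
            return 1;

        for (long i = 0; i < n; i++) {
            x[i] = 1.0f;
            y[i] = 2.0f;
        }

        saxpy(n, 3.0f, x, y); /* every y[i] becomes 5.0 */

        printf("y[0] = %.1f (up to %d threads)\n", y[0], omp_get_max_threads());
        free(x);
        free(y);
        return 0;
    }

Built today with something like gcc -std=c99 -fopenmp saxpy.c, the same source would, in principle, only need Intel’s compilers to target the MIC card.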

More information on the Knights Corner architecture can be found here. I think it will be interesting to see how well Knights Corner is adopted for high performance workloads versus graphics cards from NVIDIA and AMD, especially now that the industry has already begun adopting GPGPU solutions using programming technologies like CUDA, and graphics cards are becoming more general purpose (or at least less specialized) in hardware design. Is Intel too late for the (supercomputing market adoption) party, or just in time? What do you think?

Source: Intel

TSMC finds its 28nm dance card a little overbooked

Subject: General Tech | November 25, 2011 - 12:21 PM |
Tagged: TSMC, 28nm, amd, krishna, wichita

The 28nm process is causing a lot of problems for tech companies, especially AMD, which has cancelled Krishna and Wichita, the follow-ups to Llano and Ontario. Not only has AMD cancelled the chips, it has switched from GLOBALFOUNDRIES to TSMC to have the replacement chips designed and fabbed. This is most likely because of the low yields coming out of GLOBALFOUNDRIES with Llano, AMD's most successful recent design. The low volumes hurt AMD's market share, since many companies would not base a product line on a chip that might not be around in volume. As well, a deal expiring in January had AMD paying only for good dies, instead of the more usual practice of paying for the entire wafer and dealing with the bad dies as they come.

That move might not be as successful as AMD hopes when you look at this article from DigiTimes. As it turns out, TSMC is concerned about its ability to meet customer demand for 28nm chips. It is not just AMD that is turning to TSMC for 28nm: Altera, NVIDIA, Qualcomm and Xilinx are already customers, and Broadcom, LSI Logic and STMicroelectronics may join that crowd. With so many customers utilizing the same process, even small problems on TSMC's lines could lead to big drops in available chips. Let us hope the days of the 40nm problems at TSMC never come back.

DT_TSMC_f15.jpg

"Taiwan Semiconductor Manufacturing Company (TSMC) continues to see orders heat up for advanced 28nm technology, despite a general slowdown in the semiconductor industry, according to industry sources. Order visibility has stretched to about six months, said the sources.

TSMC is expected to see 28nm processes account for more than 2% of company revenues in the fourth quarter of 2011. The proportion will expand further to over 10% in 2012, as more available capacity coupled with rising customer demand boost the output, the sources indicated."

Here is some more Tech News from around the web:

Tech Talk

 

Source: DigiTimes

The game isn't even out but you can already grab Diablo III branded gear

Subject: General Tech | November 24, 2011 - 11:49 AM |
Tagged: steelseries, input, gaming mouse, diablo iii

When SteelSeries went about creating the Diablo III themed mouse, they decided not to just slap a paint job on an existing mouse; they actually added a bit to the design. A soft coating over the top of the mouse should help cushion your fingers so you can get a few extra thousand clicks out of them during your long Diablo III sessions. A 5700 DPI sensor will see you through not only Diablo but any other game you might want, and since the software supports macros you can string together a variety of commands for your avatar. Also included in Hardware Heaven's review is the Diablo III branded Siberia V2 headset; are they just a common drop or are we looking at gold items? Read on to find out.

steelseries-diablo3-mouse_hand.jpg

"Continuing their great working relationship with Blizzard SteelSeries are back with a new set of gaming peripherals based on the upcoming Diablo 3. We have the Diablo 3 Headset and Mouse on our test bench today and have been taking them through a few dungeon runs to find out how they perform."

Here is some more Tech News from around the web:

Tech Talk

 

Bring that laptop back to life

Subject: General Tech | November 24, 2011 - 11:22 AM |
Tagged: DIY, hack

Laptops have a harder life than desktops, not just because they get knocked around while you are on the move, but because the plugs see a lot more action as you unplug your peripherals and power to put the machine in its case, then plug them all back in when you get where you are going. As a result broken USB ports are common, but they can be worked around, as can bent network pins; what about the power jack, though? Quite a few people have taken a laptop apart to clean the insides or to upgrade the RAM or other hardware, but have you ever done any soldering inside the case or replaced plastic mounting points? Hack a Day will take you through a simple fix for a broken power jack on a Toshiba Satellite that will bring your laptop back from the dead. This particular model is fixable because the power jack is not directly attached to the circuit board, a design that might be more brittle than direct attachment but does mean you can make these kinds of repairs.

broken-laptop-500x375.jpg

This might take a little more ingenuity.

"It seems that there’s a whole range of Toshiba Satellite laptop computers that suffer from a power jack design that is prone to breaking. We see some good and some bad in this. The jack is not mounted to the circuit board, so if it gets jammed into the body like the one above it doesn’t hose the electronics. But what has happened here is the plastic brackets inside the case responsible for keeping the jack in place have failed. You won’t be able to plug in the power adapter unless you figure out a way to fix it."

Here is some more Tech News from around the web:

Tech Talk

 

Source: Hack a Day

Batman: Arkham City DX11 Stuttering Issue

Subject: General Tech, Graphics Cards | November 23, 2011 - 03:50 PM |
Tagged: dx11, batman

We have been waiting for Batman: Arkham City on the PC for quite some time, and after weeks of delays the game was finally released this week to quite a bit of fanfare. NVIDIA has been touting the game as the poster child for several technology features like DX11, 3D Vision, PhysX, etc. It appears the developers have had some issues with the release, though: DX11 features are causing significant stuttering even on high end hardware.

batmanac2.png

Batman doesn't like it when his games are late...and broken.

I put together a quick video comparing the gameplay experience with and without DX11 enabled; you can see it below.  The system specifications for our test bed for this video were:

  • Intel Core i7-965
  • Intel X58 motherboard
  • 6GB DDR3-1600 memory
  • GeForce GTX 580 1.5GB graphics card
  • Driver version: 285.79
  • Windows 7 SP1 x64

The DX11 settings causing the issues are tessellation, ambient occlusion and a new type of soft shadow rendering. When these features are enabled, the game experiences noticeable, repeatable and quite annoying stutters both in actual gameplay and during the integrated benchmark.

batmanac.png

In our video below you can clearly see the phenomenon in action.

On the official Batman: Arkham City forums, the publisher gave the following statement, confirming the broken DX11 implementation.

PC DirectX 11 Issues Please Read
We have received reports of performance issues from players of Batman: Arkham City on PC. After researching the matter, we found that running the game with DX 11 is causing the performance issues. We’re working on a title update to address this matter and expect to make it available in the near future.

In the meantime, a workaround for this issue is to run the game with DX 9 instead of DX 11. Instructions on how to turn off DX 11 are listed below.

We sincerely apologize for any inconvenience with your gameplay experience and thank you all for your patience as we work to resolve this issue.

While we love to see new technologies implemented in games that improve our gameplay experience, we HATE it when they delay games or cause issues like this at release. Here's hoping that the developer, publisher, and driver teams at AMD and NVIDIA can fix this quickly.

The best console port I've played yet ... at least when Skyrim feels like letting me play

Subject: General Tech | November 23, 2011 - 01:08 PM |
Tagged: gaming, elder scrolls V, skyrim, consolitis

Starting Skyrim for the first time was an interesting experience. Obviously you once again start as a prisoner, but perhaps one with some serious brain damage, as reality seems to move in starts and jerks as if your eyes had a stuttering problem. Eventually the stuttering cleared up, providing a weekend's worth of gaming, but by Tuesday it had returned. It became clear that it was time to embark on every PC gamer's favourite pastime: troubleshooting the game you just bought in the hopes of some day playing it.

Some troubleshooting revealed a serious case of consolitis: the game was not Large Address Aware and limited itself to a maximum serving of 2GB, the adoption of 64-bit versions of Windows being very limited by the end of 2011, apparently. Even more damning was what happened when Intel's SpeedStep technology was enabled in the BIOS: the CPU would dip to about 60% of its maximum frequency while you played, and the process would use under 10% of a core, maybe two if you were lucky. GPU usage sometimes sat at or above 90%, but for the most part it varied widely.
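For the curious, "Large Address Aware" is nothing exotic: it is a single bit (0x0020) in the Characteristics field of an EXE's PE header, and without it a 32-bit process is capped at 2GB of address space even on 64-bit Windows. The sketch below is my own illustration in standard C, not anything from the Nexus patch; it only reads the flag from an executable and does not modify the file.

    /* Report whether a Windows executable has the
     * IMAGE_FILE_LARGE_ADDRESS_AWARE flag (0x0020) set in its PE header.
     * Read-only: inspects the flag, never changes the file. */
    #include <stdio.h>
    #include <stdint.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s game.exe\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }

        /* Offset 0x3C of the DOS header holds the file offset of the PE header. */
        uint8_t b[4];
        fseek(f, 0x3C, SEEK_SET);
        fread(b, 1, 4, f);
        uint32_t pe_off = b[0] | (b[1] << 8) | ((uint32_t)b[2] << 16) | ((uint32_t)b[3] << 24);

        /* Check the "PE\0\0" signature, then read the 2-byte Characteristics
         * field, which sits 18 bytes into the COFF header (pe_off + 22). */
        fseek(f, pe_off, SEEK_SET);
        fread(b, 1, 4, f);
        if (b[0] != 'P' || b[1] != 'E' || b[2] != 0 || b[3] != 0) {
            fprintf(stderr, "%s is not a PE file\n", argv[1]);
            fclose(f);
            return 1;
        }
        fseek(f, pe_off + 22, SEEK_SET);
        fread(b, 1, 2, f);
        fclose(f);

        uint16_t characteristics = (uint16_t)(b[0] | (b[1] << 8));
        printf("%s: Large Address Aware is %s\n", argv[1],
               (characteristics & 0x0020) ? "set" : "NOT set");
        return 0;
    }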

A little research showed that Sandy Bridge owners, and those with the previous generation of chips overclocked above 4GHz, were not having many problems, proving that the brute force method of overcoming consolitis can work. Those who haven't upgraded yet and are waiting for the new year to do so must either wait it out or find a more elegant solution. To the intarwebs!

INI file tweaks are always popular and GameFront has a few; the most notable are bMouseAcceleration=0 and iPresentInterval=0, which disable mouse acceleration and V-Sync respectively (see the snippet below for where they go). Over at Skyrim Nexus there is also a modified TESV.exe that makes the game LAA and, more importantly, does not replace the main executable in your Skyrim folder, so you won't need to worry about running a modified executable. Adding the string +fullproc to the end of the target path in your shortcut to the executable should also help Skyrim utilize a bit more of your processor. In the end though, more tweaking is needed for some PC gamers to fully appreciate the latest Elder Scrolls game, and more time needs to be spent researching general tweaks as well as Bethesda specific ones.
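For reference, here is where those two settings commonly live. Both files are in Documents\My Games\Skyrim; the section names below are the placement usually cited by the community rather than something verified against GameFront's guide, so back up your INIs first.

    ; SkyrimPrefs.ini
    [Controls]
    bMouseAcceleration=0

    ; Skyrim.ini
    [Display]
    iPresentInterval=0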

If you are experiencing no issues with Skyrim but would like to tweak it to look better, [H]ard|OCP offers a guide to a variety of tweaks and you can grab game mods from GameFront.

GameRage2.gif

Now if only BF3 multiplayer would stop locking up with a loud noise that sounds suspiciously like a raspberry.

Here is some more Tech News from around the web:

Gaming

 

Less is more; Microsoft questions the need for people to have a clue when installing an OS

Subject: General Tech | November 23, 2011 - 11:48 AM |
Tagged: microsoft, windows 8, pebkac

It seems that Microsoft sees no problem in letting the non-technical upgrade their operating system without having to answer even the simple questions that were present in the Windows 7 upgrade. The company has apparently decided there is an untapped market of people who are desperate to go out, purchase a copy of the newest Windows, and upgrade their own machine by themselves. For anyone who has had a discussion with a friend or family member who expressed utter shock when told that Windows costs money and doesn't just come free on new computers, this new market seems unlikely.

Whether a good idea or not, The Register reports Win 8 will install in two different ways. The first is a streamlined upgrade, which you start from your current version of Windows via an EXE file instead of having to deal with one of those pesky bootable USB or DVD drives. Microsoft hopes to reduce the time for even the most extreme upgrade path to under 60 minutes, with hardly any user interaction required. While this is good for the theoretical market of upgraders scared of reading and understanding messages from their operating system, it probably scares most techs, who realize they are going to have to support installations in which the user has no idea what went wrong or where.

This also seems to underline the concern many IT professionals feel when looking at Win 8. Microsoft seems to be ignoring the corporate customers who want the ability to customize Windows installations for their company. With products like SCCM you can make images that install essentially unattended over your corporate network, but not without serious work done by people who know exactly what they are installing on. You don't release a generic build of Windows onto a network where you already know what models of clients are out there, let alone a build intended to work on any machine plugged into the network whatsoever. There are too many hardware permutations to expect that what works on a desktop is going to work on a custom Alienware laptop. It is too early to count Win 8 out yet, though, as there is a second type of installation which does involve booting from removable media and includes esoteric functions like disk formatting, modifiable installation scripts and other scary technical terms that might result in you having to read text and click your mouse.

clickyclicky.png

"Redmond said it wants to make the upgrade path easier, since the Windows 7 introduction saw some users complaining that the process was too complicated. To ease the introduction of Windows 8, Microsoft will now offer two options for those looking to make the leap to the new OS: a streamlined and an advanced setup. The new format will dramatically decrease upgrade times, Microsoft promised."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

AMD Facebook Giveaway

Subject: General Tech | November 22, 2011 - 02:45 PM |
Tagged: contest, amd

AMD1.jpg

This year AMD is celebrating a gamer’s holiday by offering a gift a day through January 2, including a custom Eyefinity setup valued at over $3,000. A few of the stocking stuffers include:

  • A Custom Eyefinity System valued at over $3,000
  • A FX 8120/990FX/8GB/1TB Super Combo valued at $749
  • 3 Dell UltraSharp 23-inch Monitors with LED
  • 6 AMD FX 8150s
  • 9 Radeon HD 6850s
  • 9 Radeon FleX HD 6870s

Here is the link to AMD’s Gaming Giveaway Facebook tab:

https://www.facebook.com/AMD?sk=app_158415244247092#!/AMD?sk=app_222384104493588

AMD2.jpg

Source: AMD

More memristor magic

Subject: General Tech | November 22, 2011 - 01:31 PM |
Tagged: memristor, hp

One of the new technologies we have been keeping an eye on is the memristor, a circuit element whose resistance can be altered and retained, letting it act as a storage medium. The development of this technology has been headed by HP, and they have some new results to announce, which you can catch at Nanotechweb. We have already seen one recent game changer, with SSDs based on non-volatile flash memory bringing never before seen data access speeds to the desktop. Memristors could be the next step, bringing storage access speeds measured in picoseconds rather than nanoseconds and usable lifetimes of a trillion cycles, as opposed to flash, which is at best measured in hundreds of thousands. They may also make volatile memory obsolete, as the speeds are faster than current DRAM and SRAM, or perhaps see memory and storage united into one unit.

NTW_memristor.jpg

"Memristors are promising candidates for future high-density nonvolatile memories given their demonstrated desirable properties such as endurance on the order of 1 trillion cycles, electroforming-free operation, compatibility with complementary metal-oxide-semiconductor (CMOS) processes, and the ability to be integrated in high-density cross-bar arrays. Other envisioned applications include digital logic, synaptic and hybrid circuits. For many of these applications evaluating the high-speed dynamical properties of memristors, including the switching speed, is paramount."

Here is some more Tech News from around the web:

Tech Talk

 

Source: Nanotechweb