Google Announces the Chromebook, Students To Receive $20 Per Month Subscription

Subject: General Tech, Mobile | May 12, 2011 - 01:52 PM |
Tagged: mobile, laptop, Chromebook

First there was the laptop. Then the notebook. The netbook was the most recent addition to mobile devices with hardware keyboards. That is, until today. Google has officially launched a new cloud-OS-based mobile device dubbed the Chromebook.

Samsung_Chromebook.jpg

A netbook with an operating system that amounts to little more than a web browser, the device, Google claims, will not only match the functionality of a "normal" netbook but surpass it, thanks to file storage residing in the cloud, automatic updates to the OS, virtually unlimited applications, and an eight-second boot time.

Google further states that the device is capable of all the promised feats while remaining secure. Security is accomplished through several independent strategies. The OS separates system settings from user settings, and each Chromebook allows only one "owner" per device. The owner can allow other users to log in to the device as well, either with their Google account or as a guest. Guest Mode does not sync or cache data, and all system settings, including network configuration, are kept out of the session. Each process is sandboxed in an effort to reduce the likelihood of cross-process attacks. Further, the browser and plugin processes are not given direct access to kernel interfaces. Toolchain hardening seeks to limit exploit reliability and success. The file system carries several restrictions, including a read-only root partition, a tmpfs-based /tmp, and user home directories from which executables cannot be run.
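
That file-system policy is easy to audit from userspace on any Linux box. Below is a minimal Python sketch of such a check, assuming the mount points named above; Chrome OS's real enforcement lives in its mount configuration, not in a script like this:

```python
import os

def mount_flags(path):
    """Return the mount flags of the filesystem containing `path`."""
    return os.statvfs(path).f_flag

# The restrictions described above, expressed as (path, required flag)
# pairs. Paths are illustrative; Chrome OS's actual layout may differ.
checks = [
    ("/",     os.ST_RDONLY),  # root partition mounted read-only
    ("/tmp",  os.ST_NOEXEC),  # tmpfs /tmp: nothing in it may execute
    ("/home", os.ST_NOEXEC),  # user home dirs: no executable files
]

for path, flag in checks:
    print(path, "ok" if mount_flags(path) & flag else "VIOLATION")
```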

Further, Chromebooks utilize a secure automatic update system and Verified Boot, which seek to prevent attackers from tampering with the underlying code. All updates are downloaded over SSL and must pass various integrity checks. The version number is not allowed to regress, meaning that only updates with a version number higher than the one already installed on the system may be applied. On the next boot-up, the installed update undergoes a further integrity check in the form of what Google calls "Verified Boot."
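
The two rules described here (payload integrity plus a strictly increasing version number) are simple to express. Here is a hedged Python sketch of such an anti-rollback check; the function and hash choice are illustrative, not Google's actual updater:

```python
import hashlib

def verify_update(payload: bytes, expected_sha256: str,
                  new_version: tuple, installed_version: tuple) -> bool:
    """Accept an update only if its hash matches and its version
    is strictly newer than what is already installed."""
    # Integrity check: the payload must hash to the advertised digest
    # (delivered alongside the update over the SSL channel).
    if hashlib.sha256(payload).hexdigest() != expected_sha256:
        return False
    # Anti-rollback check: version numbers may never regress.
    return new_version > installed_version

# Example: an attacker replaying an old, vulnerable build is rejected
# even though its hash is perfectly valid.
old_build = b"chromeos-update-r12"
digest = hashlib.sha256(old_build).hexdigest()
print(verify_update(old_build, digest, (12, 0), (13, 2)))  # False
```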

According to Google, Verified Boot "provides a means of getting cryptographic assurances that the Linux kernel, non-volatile system memory, and the partition table are untampered with when the system starts up." The process depends on a "chain of trust" created using custom read-only firmware rather than a TPM (Trusted Platform Module). The read-only firmware checks the integrity of the writable firmware; if it passes, the writable firmware is then used to check the integrity of the next component in the boot process, and so on. While Verified Boot does not protect against dedicated attackers, it does provide a safe recovery path when re-installing, and it can detect changes made by a successful run-time attack as well as file or writable-firmware changes made by an attacker with a bootable USB drive.
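
A toy model makes the chain-of-trust idea concrete: each already-verified stage checks a digest of the next before handing control over. This Python sketch merely stands in for what the real firmware does with signatures anchored in read-only flash:

```python
import hashlib

def verified_boot(stages, trusted_digests):
    """Walk the boot chain in order: each already-verified stage checks
    the next one's digest before jumping to it. The first entry in
    `trusted_digests` is the anchor held in read-only firmware."""
    for name, image in stages:
        if hashlib.sha256(image).hexdigest() != trusted_digests[name]:
            return f"halt: {name} failed verification, enter recovery"
    return "boot: all stages verified"

stages = [
    ("rw_firmware", b"writable firmware blob"),
    ("kernel",      b"linux kernel image"),
]
trusted = {name: hashlib.sha256(img).hexdigest() for name, img in stages}
print(verified_boot(stages, trusted))              # boot: all stages verified

stages[1] = ("kernel", b"tampered kernel image")   # simulate an attack
print(verified_boot(stages, trusted))              # halt: kernel failed ...
```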

In future iterations of the OS, Google is pursuing driver sandboxing as well as a secure method for auto-login. Google also states that it is interested in biometric security, provided its authentication software can be kept secure on low-cost hardware. Also on the agenda is a "single sign-on" system that would allow users to log in to third-party sites using credentials generated by their Google account.

Acer_ChromeBook.jpg

Hardware running Chrome OS is not new, however. Google's CR-48 notebook has been in the wild for months, giving thousands of users the chance to try out the new operating system and its accompanying hardware. Both Acer (11.6", $349) and Samsung (12.1", $429 Wi-Fi only) have stepped up to the plate and are offering Chromebooks at launch. What is new is the way users can purchase the hardware. While consumers will still be able to buy a Chromebook from retailers, Google has announced a subscription option for school and business users: students receive a Chromebook for $20 a month, while business users pay $28 a month. To get the subscription price, schools and businesses must enter into a three-year contract. The subscription includes the "hardware, operating system, updates and cloud-based management" along with online, email, and telephone support directly from Google, as well as regular hardware refreshes.

It is apparent that Google sees its largest market for Chromebooks as large businesses and schools, which can manage a fleet of Chromebooks for their users at a much lower cost than maintaining hundreds of traditional computers. While large IT departments are likely to see the cost benefits, it remains to be seen how consumers will react to this subscription-based model. Subscriptions have become more prevalent, with the majority of the US using cell phones on monthly contracts. On the other hand, users (students especially) are used to buying a computer outright. Will the lure of low-cost subscription Chromebooks be enough to change how consumers think about purchasing computers? Will students accept remotely administered computers in exchange for a low-cost subscription?
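
As a quick sanity check on the economics, here is the arithmetic over the mandatory three-year term, using the launch prices quoted above (keeping in mind that the subscription also bundles support, management, and hardware refreshes that a retail purchase does not):

```python
# Total cost of the three-year Chromebook subscription vs. buying outright.
months = 36
print(f"Student:  ${months * 20}")   # $720 over the term
print(f"Business: ${months * 28}")   # $1008 over the term
print("Retail:   $349 (Acer) / $429 (Samsung, Wi-Fi only)")
```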

Source: Google

Are you nuts to switch to the Narwhal? Check out Ubuntu 11.04

Subject: General Tech | May 12, 2011 - 11:54 AM |
Tagged: Unity, Ubuntu 11.04, ubuntu, OS, natty narwhal, linux, gnome

Natty Narwhal, officially called Ubuntu 11.04, has arrived on the scene and it brings some changes to the way you will look at Linux.  It is the first desktop version to dump the GNOME shell in favour of the Unity interface previously used on netbooks and other lower-powered machines.  The design itself is fairly minimalistic, as you would expect given its origins, but not to the point where you won't recognize the familiar dock-style interface common to OS X and Windows 7.  Ars Technica takes you through a thorough look at the newest Linux and the pluses and minuses of the new GUI.

 

"Ubuntu 11.04, codenamed Natty Narwhal, rose from the depths last week. The update brings a number of significant new features to the Linux-based operating system. It includes a much-improved refresh of the Unity shell and a number of other significant improvements throughout the application stack.

This is the first version of Ubuntu to ship with Unity on the desktop. Due to the far-reaching nature of the changes that accompany the transition to a new desktop shell, this review will focus almost entirely on Unity and how it impacts the Ubuntu user experience. We will also look at how Unity compares with GNOME 3.0 and the classic GNOME experience."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica

Sandy B's Little Sister: New Celeron Details

Subject: General Tech, Processors | May 12, 2011 - 12:34 AM |
Tagged: sandy bridge, celeron

Intel has made a splash with their Sandy Bridge parts; despite sitting in the mid-range, they keep up with the higher end of the prior generation in many applications. We have heard rumors of new Atom-level parts from Intel deviating from the on-chip GPU structure that Sandy Bridge promotes. What about the next level? What about Celeron?

11-intel.png

I'm guessing less than an i7.

Details were posted to CPU-World about Intel’s upcoming Sandy Bridge-based Celeron processors. There are three variants listed, each with Intel’s on-chip GPU. The G440 is a single-core part clocked at 1.6 GHz with a 650 MHz GPU, while the G530 and G540 are dual-core parts clocked at 2.4 GHz and 2.5 GHz respectively, both with an 850 MHz GPU. The dual-core parts have 2 MB of L3 cache; the article is inconsistent on whether the single-core part has 1 or 2 MB, though we will assume 1 MB based on its wording. While base GPU clocks differ between the single-core and dual-core parts, all three GPUs will Turbo Boost to a maximum of 1 GHz as need arises.

Functionally, the chips will contain only the bare minimum of Sandy Bridge core features, such as 64-bit and virtualization support. There are currently no further details on launch date or pricing, but if you are waiting to upgrade your lower-end devices, rest assured that Sandy B is there for you; at some point, at least.
 

Source: CPU-World

Alenka: The SQL, starring CUDA!

Subject: General Tech, Graphics Cards, Storage | May 11, 2011 - 07:58 PM |
Tagged: SQL, developer, CUDA

Programmers are beginning to understand and grow ever more comfortable with the uses of GPUs in their applications. Late last week we explored the KGPU project, which is designed to let the Linux kernel offload massively parallel work to the GPU, relieving the CPU while directly increasing performance. KGPU showed that, for an encrypted file system, you can see several-fold increases in read and write bandwidth on an SSD. Perhaps this little GPU thing can be useful for more? The Alenka project thinks so: it is currently working on a CUDA-based, SQL-like language for data processing.

10-nv_logo.png

CUDA woulda shoulda... and did.

SQL databases are among the most common ways to store and manipulate larger sets of data. If you have a blog, it is almost certainly storing its information in a SQL database. If you play an MMO, your data is almost certainly stored and accessed on a SQL server. As your data size expands and your number of concurrent accesses increases, you can see why using a GPU could keep your application running much more smoothly.

Alenka, in its current release, supports large data sets exceeding both GPU and system RAM by streaming chunks: load a chunk, process it, and move on. Its supported primitive types are doubles, longs, and varchars, and it is open source under the Apache License 2.0. Developers interested in using or assisting with the project can check out its SourceForge page. We should continue to see more and more GPU-based applications appear in the near future as problems such as these are finally lifted from the CPU and given to someone more suitable to bear them.
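
The streaming-chunks trick is what lets a data set larger than memory flow through the system. The Python sketch below mimics that pattern on the CPU with a SQL-WHERE-style predicate; in Alenka itself each chunk would be copied to device memory and filtered by a CUDA kernel (the file and column names here are hypothetical):

```python
import csv

def filter_large_csv(path, predicate, chunk_rows=100_000):
    """Stream a CSV far larger than RAM in fixed-size chunks,
    yielding rows that satisfy a WHERE-style predicate."""
    with open(path, newline="") as f:
        chunk = []
        for row in csv.DictReader(f):
            chunk.append(row)
            if len(chunk) >= chunk_rows:
                # On a GPU this chunk would be shipped to device memory
                # and the predicate evaluated in parallel.
                yield from (r for r in chunk if predicate(r))
                chunk = []
        yield from (r for r in chunk if predicate(r))

# Roughly: SELECT * FROM trades WHERE price > 100
# for row in filter_large_csv("trades.csv", lambda r: float(r["price"]) > 100):
#     print(row)
```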

Gigabyte Launches World's First Z68 Motherboards With Support for mSATA Intel SLC SSDs and Smart Response Tech

Subject: General Tech, Motherboards | May 11, 2011 - 05:23 PM |
Tagged: z68, srt, motherboard, gigabyte

Popular enthusiast motherboard maker Gigabyte has today announced four additional motherboards for its already expansive launch lineup of Z68 chipset based boards.

4277.jpg

In addition to the features discussed in the previous announcement, including Lucid Virtu technology, the four new models feature an mSATA connection for onboard Intel SLC SSDs such as the new 20GB Intel 311. The 20GB drive can be used in conjunction with Intel Smart Response Technology (SRT) to boost system performance.
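
Conceptually, SRT is a block cache: a small, fast SSD sits in front of a large, slow disk and keeps the hottest blocks. The toy Python model below illustrates the idea with an LRU policy; Intel's actual caching algorithm and write handling are not public, so treat this purely as a sketch:

```python
from collections import OrderedDict

class BlockCache:
    """Toy SSD-cache-in-front-of-HDD model: recently read blocks are
    served from the fast tier, evicting the least recently used."""
    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = OrderedDict()            # block id -> cached data
        self.hits = self.misses = 0

    def read(self, block_id, backing_read):
        if block_id in self.blocks:
            self.hits += 1
            self.blocks.move_to_end(block_id)  # mark most recently used
            return self.blocks[block_id]
        self.misses += 1
        data = backing_read(block_id)          # slow path: hit the HDD
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)    # evict LRU block
        return data

# cache = BlockCache(capacity_blocks=5_000)   # ~20GB at 4MB blocks
# data = cache.read(42, backing_read=read_from_hdd)  # read_from_hdd: yours
```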

While Intel's SRT is also supported on Gigabyte's other Z68 motherboards, these four models differ in the implementation: they allow consumers to attach the small solid state drive directly onto the motherboard, freeing a standard SATA port from hosting the SRT SSD so it can serve another hard drive or optical drive.

news10.jpg

Gigabyte has measured as much as a 471% improvement in PCMark Vantage scores when using a 20GB Intel 311 SLC SSD alongside a SATA 2 hard drive versus the hard drive alone. PC Perspective also examined Intel's Smart Response Technology and found that in trace-based testing, the SLC SSD greatly improved performance once the data had been cached to the SSD. As for improvements in boot performance, PC Perspective found that:

"Boot times were just 3 seconds shy of those achieved with the OS cached on the SSD entirely. Of significant note here is that the SSD 310 was able to edge out (0.5 secs) faster boot times than the SSD 320 *and* the SSD 510, which we tossed in for an additional point of comparison."

Intel's SRT can definitely improve performance in the right situations, and Gigabyte is offering even more options to implement it in its newly announced models: the Z68XP-UD3, Z68XP-D3, Z68AP-D3, and Z68P-DS3. The new models are due to be released in June 2011.

Source: Gigabyte

Why does he get to play Deus Ex 3 already!?!

Subject: General Tech | May 11, 2011 - 03:57 PM |
Tagged: revolution, human, gaming, deus ex, Adam Jensen, 3

As further proof that no fairness exists in the universe, a lucky member of Rock, Paper, SHOTGUN has played the first 10 hours of Deus Ex: Human Revolution; not just once, but twice.  If you played the first incarnation then you are probably terrified that it will follow the awful path of the sequel; unless you've taken the Highlander fan stance and pretend that it never happened.  If you've only played the second ...

Anyways, head over, as the news is not all bad, though the rampant cut scenes made the (p)reviewer want to do something that would affect his ability to reproduce rather than watch another one play out.

dx3.jpg

"Because I am the luckiest man alive, I spent this weekend playing the first ten hours of Deus Ex: Human Revolution, which is starting to look like it’ll be the biggest release of 2011. When I finished those ten hours, I went back and played them again, and have finally managed to compress my thoughts into a handy list of thoughts that’ll occur to you, too, as you play. Five reasons to be hugely excited Deus Ex 3 and five reasons to be knuckle-chewingly nervous await you below."

Here is some more Tech News from around the web:

Gaming

News from the honeycomb hideout, Google's I/O

Subject: General Tech | May 11, 2011 - 01:42 PM |
Tagged: youtube, sandwich, music, ice cream, honeycomb, google, cloud, Android

The fourth Google I/O took place over the past two days and AnandTech was there to bear witness to the keynote speech and other presentations.  As you might well expect, Android was the most talked-about topic; the new Honeycomb update was discussed in great detail, and with good reason.  The update allows Android-powered devices to use USB peripherals in the same way as a PC, driving mice, keyboards and even Xbox controllers, which is a big change from only being usable as a USB device, and it offers even more for those interested in the Open Accessory Library.

Others will be more interested in Google's Music Beta, which will let you upload your music collection to the web and includes the ability to make playlists and albums as well as gathering artist metadata.  You can think of it as akin to Amazon's Cloud service, though hopefully more reliable; but as Google does not seem to have obtained the record companies' permission, it may not be.

fry2.jpg

"Google’s I/O 2011 keynote may have suffered from a few choice leaks, namely the new Music service and Ice Cream Sandwich announcement, but Google still managed to include some surprises. Android 3.1, the update to Honeycomb, was announced along with a slew of development platforms, including one committed to bringing better introduction of accessories to Android devices of all types, and a home integration platform based on Android."

Here is some more Tech News from around the web:

Tech Talk

Source: AnandTech

Samaritan Demo Showcases New Unreal Engine 3 Graphical Effects

Subject: General Tech, Graphics Cards | May 11, 2011 - 12:36 AM |
Tagged: UE3, graphics engine, gaming

Since 2006's Gears of War, Epic Games' Unreal Engine 3 has provided both console and PC gamers hours of gameplay packed with graphical prowess. The now five-year-old graphics engine has enjoyed constant evolution to remain viable. At the 2011 Game Developers Conference, Epic Games unveiled its Samaritan demo, proving to the world that Unreal Engine 3 could deliver not only graphics that fully utilize current-gen hardware, but also a huge leap in graphical prowess that will require next-gen hardware to utilize all of its features.

Using a three-way SLI GTX 580 powered gaming system, Epic Games showcased some of the engine's newest features.  The product of eight months of development, the engine contains a slew of lighting, reflection, and shadow improvements as well as realistic hair and cloth physics.

Bokeh depth of field has been a popular artistic choice in Hollywood films for many years.  Seen as out-of-focus but identifiable colored shapes in the background, bokeh shapes serve to enhance a scene and influence viewers' moods.  Epic was able to improve upon earlier methods of rendering bokeh, though they admit that real-time rendering of bokeh as seen in Hollywood films will require next-gen hardware.  Currently, the bokeh effects will be best used in cutscenes, where developers can control and pre-render the shapes to the best storytelling effect.
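
For the curious, the distinctive look comes from blurring with a disc-shaped kernel whose radius tracks each pixel's distance from the focal plane (the "circle of confusion"), rather than a soft Gaussian. Here is a naive CPU-side Python/NumPy sketch of that idea for a grayscale image; real engines like UE3 do this on the GPU, typically by scattering textured sprites, so this is only an illustration:

```python
import numpy as np

def bokeh_blur(image, depth, focal_depth, max_radius=8):
    """Naive depth-of-field: average each pixel over a disc whose radius
    grows with its distance from the focal plane. Out-of-focus highlights
    smear into disc shapes -- the bokeh look."""
    h, w = image.shape
    out = np.empty_like(image, dtype=float)
    for y in range(h):
        for x in range(w):
            # circle-of-confusion radius, proportional to defocus
            r = int(min(max_radius, abs(depth[y, x] - focal_depth) * max_radius))
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            yy, xx = np.ogrid[y0:y1, x0:x1]
            disc = (yy - y) ** 2 + (xx - x) ** 2 <= r * r  # disc kernel
            out[y, x] = image[y0:y1, x0:x1][disc].mean()
    return out

# img, depth = np.random.rand(64, 64), np.random.rand(64, 64)
# out = bokeh_blur(img, depth, focal_depth=0.5)
```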

SamaritanPLRIE.png

Epic has also greatly enhanced the way light and reflections are handled.  Collectively called Image Based Reflections, Epic has implemented Point Light and Billboard Reflections.  These are then coupled with both static and dynamic Reflection Shadows to achieve a look resembling the real world.  While today's graphics horsepower does not allow Epic to mirror exactly the way light works in the real world, they are able to achieve a very close representation.  For example, the road in the Samaritan demo is rendered far more uniform than a real road would be, because the hardware required to calculate reflections in real time on a realistically irregular surface simply is not available today.

Read on for more details...

Source: GeForce

Discrete Graphics Card Shipments See Slight Increase Versus Previous Quarter

Subject: General Tech, Graphics Cards | May 10, 2011 - 08:52 PM |
Tagged: nvidia, jpr, gpu, amd

The last quarter of 2010 saw shipments totalling 18.84 million units. In Q1 2011, shipments rose slightly, by about 1%, to 19.03 million add-in cards. According to JPR (Jon Peddie Research), while Q1 2011 behaved seasonally much like past years, it did not fare as well overall, as shipments did not exceed those of Q1 2010. Where AMD increased units shipped by 5.7% versus the previous quarter (Q4 2010), NVIDIA saw a 2% decrease.

JPR notes that while the increase in units shipped versus Q4 2010 was rather slight, it remains a positive sign, since Q4 2010 itself behaved irregularly with respect to the usual seasonal cycle.

The shipment figures also reflect changes in market share for the two largest discrete graphics card makers. Versus last quarter, NVIDIA lost 2.7% of its market share while AMD gained 4.4%. On a year-to-year basis, JPR states that AMD has gained 16.8% market share while rival NVIDIA has lost 8.4%.

 

         Q1 2011        Q4 2010        Change     Previous Year   Change
         Market Share   Market Share   Qtr-Qtr    Market Share    Yr-Yr
AMD      40.46%         38.77%         +4.37%     34.65%          +16.79%
NVIDIA   59.12%         60.77%         -2.71%     64.50%          -8.35%
Others   0.42%          0.47%          -9.99%     0.85%           -50.62%

JPR's reported market shares over time.
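
Note that the "change" columns are relative changes in share, not percentage-point differences; the quick Python check below reproduces them from the published shares:

```python
# Qtr-to-qtr "change" = relative change in share, not a point difference.
q1_2011 = {"AMD": 40.46, "NVIDIA": 59.12, "Others": 0.42}
q4_2010 = {"AMD": 38.77, "NVIDIA": 60.77, "Others": 0.47}
for vendor, share in q1_2011.items():
    print(f"{vendor}: {(share / q4_2010[vendor] - 1) * 100:+.2f}%")
# AMD +4.36%, NVIDIA -2.72%, Others -10.64%; JPR's unrounded data
# yields the +4.37 / -2.71 / -9.99 figures in the table above.
```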

Jon Peddie Research notes that of the 19.03 million discrete graphics cards shipped, NVIDIA was the clear market leader, thanks in part to sales of CUDA and GPU-compute cards used in scientific and data research.  The add-in board market is composed of three main segments that together amount to the 19.03 million boards shipped.  At the high end sit the enthusiast gamer (approximately 9 million cards sold per year) and GPU-compute markets, which see lower sales volumes but higher prices per card.  The majority of graphics card shipments come from the mainstream market, which balances price and volume.  Finally, there is the workstation segment, which is smaller than even the enthusiast gaming market but traditionally sees higher average asking prices for the hardware shipped.

JPR estimates that the add-in board market will fall 4.5% to $19.8 billion USD despite the positive increase in the number of cards shipped, due to "a gradual decline in the ASP."

As the chart illustrates, NVIDIA remains the market juggernaut, shipping 11.25 million cards; however, AMD has made a lot of headway in the past year.  With both the AMD Radeon HD 6950 and the NVIDIA GTX 560 Ti proving to be cards of choice for many gamers worldwide, competition is healthy, and enthusiasts have only to benefit from the market's positive momentum.

Source:

Corsair Updates Its Builder PSU Series With Three New Models

Subject: General Tech | May 10, 2011 - 04:59 PM |
Tagged: PSU, corsair

Corsair recently announced three new power supply models to update its popular CX Builder series. A favorite among many enthusiasts, the new CX V2 models offer the same 80 Plus certification as well as "European Commission Energy-Related Product (ErP) directive compliance for guaranteed efficiency and low standby power consumption." Further, the new V2 models carry an extra year of warranty over their predecessors, for a total of three years.

Ruben Mookerjee, VP and General Manager for Components at Corsair, states that Corsair’s Builder PSU series offers enthusiasts quality and reliability at attractive prices: “Low cost of operation and trouble-free performance are highly desired features at every price point and today we are proud to offer even better efficiency and a longer warranty with the new Builder Series CX V2 PSUs.”

corsaircxv2.png

The new CX V2 models (the CX600 V2, CX500 V2, and CX430 V2) will be available worldwide for purchase in May. Offering 600, 500, and 430 watts respectively, the models carry the following specifications:

CX600 V2 (600 Watts)
  Connections: 1x ATX, 1x EPS, 2x PCI-E, 4x Molex, 6x SATA, 1x Floppy
  Max current: 25A @ +3.3V and +5V, 40A @ +12V, 0.8A @ -12V, 3A @ +5Vsb
  Max wattage: 600 Watts @ 30°C ambient
  MSRP: $74 USD

CX500 V2 (500 Watts)
  Connections: 1x ATX, 1x EPS, 2x PCI-E, 4x Molex, 5x SATA, 1x Floppy
  Max current: 25A @ +3.3V, 20A @ +5V, 34A @ +12V, 0.8A @ -12V, 3A @ +5Vsb
  Max wattage: 500 Watts @ 30°C ambient
  MSRP: $64 USD

CX430 V2 (430 Watts)
  Connections: 1x ATX, 1x EPS, 1x PCI-E, 3x Molex, 4x SATA, 1x Floppy
  Max current: 20A @ +3.3V and +5V, 28A @ +12V, 0.8A @ -12V, 3A @ +5Vsb
  Max wattage: 430 Watts @ 30°C ambient
  MSRP: $49 USD

 

Will a Corsair PSU be part of your next build?

Source: Corsair

Your company lost $7 million last year? Can we buy it for $8.5 billion? Microsoft buys Skype.

Subject: General Tech | May 10, 2011 - 12:51 PM |
Tagged: ballmer, microsoft, boomtown, skype, purchase, billion

The rumour mill really dropped the ball on this one; just a few hours ago, everyone was muttering that Facebook would one day buy Skype.  Then, within hours, the new rumour that Microsoft was going to buy Skype for $7 billion became a reality at an $8.5 billion price tag.

Skype lost $7 million last year, though that number seems rather small compared to its overall balance sheet to date, which puts the company $686 million in the hole.  As All Things Digital is quick to point out, that is slightly less than what Microsoft's Online Services Division lost last quarter, proving all things are relative, even at very large sums.

On the plus side, Microsoft gets its hands on Skype's 763 million registered users, about twice as many as there are MSN users and significantly more than there are Xbox Live users.  Toss in the TechNet people and you still have nowhere near the user pool that Skype brings.  That huge increase in the number of people Microsoft can reach may give it the ability to recoup the money it spent.  Consider that 8 million users pay actual money for their Skype accounts, which Wired considers at least a hint of Microsoft's strategy.
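
Putting those user counts against the purchase price gives a rough sense of what Microsoft paid per head (simple division of the article's figures, nothing more):

```python
price = 8_500_000_000     # purchase price in USD
registered = 763_000_000  # registered Skype users
paying = 8_000_000        # users who actually pay Skype money
print(f"per registered user: ${price / registered:,.2f}")  # ~$11.14
print(f"per paying user:     ${price / paying:,.0f}")       # ~$1,062
```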

Most PC users who already run Windows, such as those at Ars Technica, are scratching their heads over the purchase, while Linux users at Slashdot are very concerned about continued support for the Skype Linux client.

ballmer.jpg

"The Wall Street Journal reported earlier tonight that Microsoft–in what would be its most aggressive acquisition in the digital space–was zeroing in on buying Skype for $8.5 billion all in with an assumption of the Luxembourg-based company’s debt.

Sources told BoomTown tonight that the deal for the online telephony and video communications giant is actually done and will be announced early tomorrow morning."

Here is some more Tech News from around the web:

Tech Talk

The Unity Linux GUI Controversy and Linux Mint's Decision to Stick With Gnome 2

Subject: Editorial, General Tech | May 9, 2011 - 09:06 PM |
Tagged: OS, linux, GUI

With the release of Ubuntu 11.04 comes a new desktop environment called Unity. Unity promises to revamp the Linux desktop GUI to be more user friendly and intuitive, and it brings a multitude of noticeable changes compared to the classic GNOME environment. A new Windows 7-like taskbar stretches along the left side of the screen, where small icons of running and pinned applications reside; this application dock replaces the traditional GNOME taskbar that ran along the bottom of the screen. Also present is a new Ubuntu button that acts as an application launcher where installed programs can be sorted and searched. Further, there are improvements to the workspace switcher and changes to window management, with new hover-to-reveal scroll bars and each application’s (context-sensitive) menus relocated to the top of the screen. These and other minor changes in the latest Ubuntu release have caused a flood of controversy among reviewers and users alike.

Pictured:  Unity GUI (Insert:  Ubuntu Classic GUI)

On the positive side, a number of new and long-time Ubuntu users have embraced the new GUI for its features and design. Many people migrating from Windows 7 or Mac OS will become accustomed to the interface quickly, as it works in much the same manner. Further, users of convertible tablet PCs have an easier time navigating to applications and windows thanks to the larger icons; touch and digitizer controls on the Dell Latitude XT, for example, worked well out of the box without a need to mess with drivers.

In contrast, as a newly developed desktop environment, Unity is (at the time of writing) less customizable than the traditional GNOME GUI. Because of this restriction, many self-proclaimed power users have called Unity a step backwards on the very aspect that makes Linux a desirable OS: the ability to customize the operating system to their liking.

Read on for more...

IEEE seeks to increase Ethernet bandwidth, but to what?

Subject: General Tech | May 9, 2011 - 08:31 PM |
Tagged: IEEE, Ethernet

The IEEE is a professional association known for creating technology standards, producing publications, and hosting activities for educational and professional development. If you are browsing this website on a high-speed connection, you are almost certainly using IEEE 802.3 or IEEE 802.11, more commonly known as Ethernet and Wi-Fi, respectively. The IEEE constantly evolves its standards: speeds get faster, 802.11n let you escape the crowded 2.4 GHz band, and other changes arrive as needs progress over time.

9-hopen.png

Change for the future.

The IEEE recently appointed John D’Ambrosia to chair a group that will determine how much bandwidth Ethernet will be required to deliver in the future. This committee could eventually produce a standard for Terabit network connections, should demand deem it necessary.

The committee is being very cautious this time around about how much speed the next standard should target. Work on the prior standard, 802.3ba, began with discussions in 2005 that determined 100 Gbps was the necessary advancement; it was later discovered that many vendors needed no more than 40 Gbps and would delay adoption for several years (the final standard ended up specifying both 40 and 100 Gbps rates). Regardless of whether the committee settles on Terabit or 400 Gigabit, the standard will take years to develop, with Terabit taking even longer. Its findings about demand will be published early next year.
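
The committee's core question is a compound-growth one: pick a demand growth rate and see when each candidate speed becomes the sensible successor to 100 Gbps. A back-of-the-envelope Python sketch (the 40% annual growth figure is a purely hypothetical input, not from the IEEE):

```python
from math import log

growth = 1.40  # hypothetical: aggregate demand grows 40% per year
for target_gbps in (400, 1000):
    years = log(target_gbps / 100) / log(growth)
    print(f"{target_gbps} Gbps needed ~{years:.1f} years after 100 Gbps saturates")
```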

Source: IT World

Beauty is in the ears of the beholder; a second opinion on Corsair's SP2500 2.1 speakers

Subject: General Tech | May 9, 2011 - 06:05 PM |
Tagged: tweeter, subwoofer, speakers, sp2500, satellite, corsair, bass, audio, 2.1

Not too long ago, Josh reviewed Corsair's 2.1 speaker set, which goes by the name SP2500, and Ryan later gave a set away to a reader.  Josh was very happy with the way they performed, but seeing as judging audio quality can be quite subjective, and as most of our readers did not get a free set of speakers, perhaps a second opinion is a good idea.  To that end, you can read about the same set of speakers as reviewed over at The Tech Report.

TRs_moneyshot.jpg

"Corsair has supplemented its line of audio products with a premium 2.1 speaker set. Is it worth the $250 asking price?"

Here is some more Tech News from around the web:

Audio Corner

AMD's new and improved minimalist BIOS replacement, Coreboot

Subject: General Tech | May 9, 2011 - 11:51 AM |
Tagged: amd, coreboot, uefi, bios, embedded, llano, opteron, s3

A lot of attention is being paid to UEFI, the new graphical BIOS replacement that not only lets you use 2TB+ drives as boot devices but also gives you mouse control in the graphical setup interface (and even in the little games some vendors integrate alongside your settings).  It offers quite a few advantages over the old BIOS but adds complexity as well.  AMD has gone a different route with its Opteron series: coreboot (aka LinuxBIOS), a different way of initializing a computer.  coreboot performs a very minimal hardware initialization and then hands off to what is called a payload, which provides the familiar abilities of a BIOS without being integrated into the hardware initialization in any way.  This is far more useful for server and embedded applications than for the latest ROG board, which is why embedded Llano will be receiving support and why Opteron already has it.  Follow the links from The Inquirer for more.

Coreboot_menuconfig1.png

"CHIP DESIGNER AMD has announced that its upcoming Llano accelerated processing unit (APU) will support Coreboot.

AMD has been pushing development of the BIOS replacement initiative Coreboot for many years but has focused on getting support for its embedded and server processors. Now the company has come out and said that all of its future processors will support Coreboot, from Llano onwards."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

NVIDIA acquires baseband and RF technology maker Icera

Subject: General Tech, Chipsets, Mobile | May 9, 2011 - 10:56 AM |
Tagged:

NVIDIA is really making moves towards providing the mobile industry with the computing power to enable better and faster phones.  They took a big hit losing the DMI/QPI license from Intel; though the $1.5 billion court settlement took some of the sting out of that loss, the battle essentially spelled the end of NVIDIA's motherboard chipset line.  Only being able to make motherboard chipsets for its main GPU competitor, AMD, might be amusing in an ironic sense, but it is not an economically sound position.

icera.png

Tegra marked a change in NVIDIA's target market: suddenly NVIDIA offered a mobile chip with very impressive computing power that did not draw a huge amount of power.  With the acquisition of Icera, they now have a team designing the chips a phone most needs: RF and baseband transmission.  Perhaps they now have a big enough foot in the door of the mobile market that they won't be going anywhere soon.

"Icera’s baseband and RF technologies span 2G, 3G and 4G networks. Joining them with our Tegra mobile super chip will result in a powerful combination. Tegra has secured a number of design wins across the mobile device spectrum, and we have extensive relationships across the industry, from device manufacturers to carriers. In short, we can scale Icera’s great innovation. For additional context on Icera’s industry-leading technology, check out this report from Strategy Analytics.

Our OEM partners will reap the benefits of faster time-to-market, better integration and enhanced performance. The deal will also open up a new market to NVIDIA. The $15 billion global market for baseband processors is one of the fastest-growing areas in the industry.

Looking ahead, Icera’s programmable baseband processor architecture will allow NVIDIA and its OEM customers to innovate and adapt signaling algorithms in the rapidly evolving mobile telecommunications market — network responsiveness is critical to delivering on the promise of untethered wireless visual computing. Icera’s highly efficient architecture makes it possible to cleanly integrate their baseband processor into system and software platforms rapidly and, ultimately, into the super chip itself, if that’s the best product approach."

Source: NVIDIA

AMD commits support to coreboot for the foreseeable future

Subject: General Tech, Motherboards | May 7, 2011 - 10:51 PM |
Tagged: coreboot, amd

When you boot your computer, you probably see a splash screen from whatever motherboard manufacturer or system builder you purchased from. Under that splash screen, your computer is busily preparing itself to accept your operating system of choice using a lot of proprietary code. coreboot, formerly LinuxBIOS, is a Free Software project, first released over a decade ago, designed to replace that proprietary BIOS with its own lightweight code. The project claims boot times to a Linux console of just 3 seconds.

8-amd.png

AMD Embedded: In this article.

On Thursday, AMD announced on its blog that it has committed to supporting coreboot for all future products, starting with the Llano APU. The company says that support will continue for the foreseeable future, covering both new features and new products.

We do not expect our readers to replace their BIOSes with coreboot, except perhaps for a small segment of hardcore enthusiasts with a decent understanding of C. That said, coreboot's motivation does not currently lie in the consumer market: the embedded market is the focus, and AMD’s pledge of continued support should mean that cash registers, kiosks, and set-top boxes will have a little more AMD inside driving them.

Source: AMD Blog

KGPU lets the Linux kernel harness your GPU's power

Subject: General Tech, Graphics Cards | May 6, 2011 - 05:25 PM |
Tagged: linux, kgpu, gpgpu

PC Per has long discussed using the GPU as a massively parallel augment to the CPU, allowing the latter to focus on the branching logic (“if/then/else”) and other work it is good at that GPUs are not. AMD and Intel are both attempting to bundle the benefits of a GPU onto their CPUs with their respective technologies. Currently, most GPU-compute applications outside the scientific community are gaming and multimedia; however, as stronger GPUs become ubiquitous, we are seeing more and more functions relegated to the GPU.

7-TuxGpu.png

So happy together!

KGPU is an attempt to bring the horsepower of the GPU to the fingertips of the Linux kernel. While the kernel itself remains a CPU function, KGPU allows the kernel to offload parallel work to the GPU for large speedups, keeping the CPU free for other tasks. The current version shows several-fold speedups in the maximum read and write bandwidth of eCryptfs, an encrypted filesystem, by letting the GPU handle the AES cipher.
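
How much an offload like this helps overall is governed by Amdahl's law: only the parallelizable fraction of the work gets faster. A one-liner makes the limit clear (the 90% / 20x figures below are hypothetical, not KGPU's published results):

```python
def offload_speedup(parallel_fraction, gpu_speedup):
    """Amdahl's law: overall speedup when only the parallel
    fraction of the work is accelerated by the GPU."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / gpu_speedup)

# If 90% of eCryptfs's time were AES work and the GPU ran it 20x faster,
# the whole read/write path would speed up only ~6.9x.
print(f"{offload_speedup(0.90, 20):.1f}x")
```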

We should continue to see speedups as tasks that are perfect for the GPU are finally allowed to be with their true love. Furthermore, as the number of tasks relegated to the GPU increases, we should see more and stronger GPUs embedded in PCs, which should ease the fears of PC game developers worried about how many PCs can run their applications. I am sure that is great news to many of our frequent readers.

Source: KGPU Project

Good news from TSMC, their new 12 inch Fab is ahead of the game

Subject: General Tech | May 6, 2011 - 12:35 PM |
Tagged: fab, TSMC, 12, inch

With 10 million 8-inch-equivalent wafers produced in 2010 and an expected 20 million by 2015, it is a good thing that TSMC is not only past its major production issues but also ahead of schedule with the setup of Fab 15, which will produce 28nm chips on 12-inch wafers.  Moving from 8 to 12 inches should also mean a lower cost per chip, though whether the savings will be absorbed by the costs of the new fab or passed straight on to the consumer is a question that cannot be answered until next summer, when TSMC expects production capacity to reach full speed.  DigiTimes has the scoop here.

tsmc-28nm-wafer5.jpg

"Taiwan Semiconductor Manufacturing Company (TSMC) has begun equipment move-in for the phase 1 facility of a new 12-inch fab (Fab 15) with volume production of 28nm technology products slated for the fourth quarter of 2011, according to the foundry.

TSMC previously said it would begin moving equipment into the facility in June, and expected volume production to kick off in the first quarter of 2012.

Pilot runs at the phase 1 facility of Fab 15 are expected to start in the third quarter of 2011, following by volume production in the fourth quarter, said Jason Chen, senior VP of worldwide sales and marketing for TSMC, at a company event held on May 5. With new capacity coming online, TSMC will see its combined 12-inch capacity top 300,000 units a month."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes

Anonymous Denies Responsibility For Sony PSN Attack

Subject: General Tech | May 6, 2011 - 09:20 AM |
Tagged: sony, Internet, Data Breach, Anonymous

As Sony analyzed forensic data from the recent PSN/SOE attack, it discovered a text file named "Anonymous" containing the phrase "We are legion," according to Network World. As a result, Sony went so far as to accuse the hacker group of being the party responsible for hacking the PlayStation Network (and stealing customers' information) in a letter to the U.S. Congress.

Anonymous responded to Sony's implications today. Network World reports that Anonymous has stated it was not involved in the attack and that "others performed the attack with the intent of making Anonymous look bad." Based on a press release by the group, its prior victims had motive to irreparably defame it in the public eye.  Anonymous stated that it has never been involved in credit card theft.  Further, the group claims to be an "ironically transparent movement," and had it truly been behind the attack, it would have claimed responsibility for its actions.

The press release goes on to state that "no one who is actually associated with our movement would do something that would prompt a massive law enforcement response."  The group further claims that the world's standard fare of Internet thieves would have a vested interest in making Sony and law enforcement agencies believe it was Anonymous, to throw police off their trail.

The hacker group names former victims such as Palantir, HBGary, and the U.S. Chamber of Commerce as organizations that would like to discredit Anonymous.  "Anonymous will continue its work in support of transparency and individual liberty; our adversaries will continue their work in support of secrecy and control," the group states in its press release "we are anonymous."

As Anonymous, Sony, and spectators the world over debate, the affected public continues to wait for the true identities of the hackers who stole 77 million Sony customers' private information to come to light.