Subject: General Tech | May 11, 2011 - 01:42 PM | Jeremy Hellstrom
Tagged: youtube, sandwich, music, ice cream, honeycomb, google, cloud, Android
The fourth Google I/O took place over the past two days, and AnandTech was there to bear witness to the keynote speech and other presentations. As you might well expect, Android was the most talked-about topic; the new Honeycomb update was discussed in great detail and with good reason. The update allows Android-powered devices to use USB peripherals in the same way a PC does, powering mice, keyboards, and even Xbox controllers, which is a big change from the devices only being able to act as USB peripherals themselves, and it offers even more for those interested in the Open Accessory Library.
Others will be more interested in Google's Music Beta, which will let you upload your music collection to the web and includes the ability to make playlists and albums as well as gather artist metadata. You can think of it as Amazon's Cloud service, though hopefully more reliable; then again, as Google does not seem to have secured the record companies' permission, it may not be.
"Google’s I/O 2011 keynote may have suffered from a few choice leaks, namely the new Music service and Ice Cream Sandwich announcement, but Google still managed to include some surprises. Android 3.1, the update to Honeycomb, was announced along with a slew of development platforms, including one committed to bringing better introduction of accessories to Android devices of all types, and a home integration platform based on Android."
Here is some more Tech News from around the web:
- First-tier motherboard makers to ship nearly 400,000 Z68-based motherboards in May @ DigiTimes
- Twitpic Owns your Pictures @ XSReviews
- Linux Kernel Benchmarks Of 2.6.24 Through 2.6.39 @ Linux.com
- A Look At Nouveau Driver Power Usage @ Phoronix
- Finding, Not Searching: How to Use Search Engines Correctly @ TechwareLabs
- Nikon Coolpix P500 Review @ TechReviewSource
- Tech-Reviews Prize Giveaway
- We're giving away a P67 mobo, a GTX 560, and a dozen games @ The Tech Report
Subject: General Tech, Graphics Cards | May 11, 2011 - 12:36 AM | Tim Verry
Tagged: UE3, graphics engine, gaming
Since 2006's Gears of War, Epic Games' Unreal Engine 3 has provided both console and PC gamers hours of gameplay packed with graphical prowess. The now five-year-old graphics engine has enjoyed constant evolution to remain viable. At the 2011 Game Developers Conference, Epic Games unveiled its Samaritan demo, proving to the world that Unreal Engine 3 can not only deliver graphics capable of fully utilizing current-gen hardware, but also a huge leap in graphical prowess that will require next-gen hardware to utilize all of its features.
Using a three-way SLI GTX 580 powered gaming system, Epic Games was able to showcase some of the engine's newest features. Taking eight months of development, the engine contains a slew of lighting, reflection, and shadow improvements as well as realistic hair and cloth physics.
Bokeh Depth of Field has been a popular artistic choice in Hollywood Films for many years. Seen as out of focus but identifiable colored shapes in the background, bokeh objects serve to enhance a scene and influence viewers' moods. Epic was able to improve upon earlier methods of rendering bokeh objects, though they admit that real time rendering of bokeh objects as seen in Hollywood films will necessitate next gen hardware. Currently, the bokeh effects will be best used in cutscenes where developers can control and pre-render the objects to the best storytelling effect.
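The bokeh effect described above comes down to how large an out-of-focus point of light appears on the sensor. This is not Epic's actual implementation, just a minimal sketch of the classic thin-lens circle-of-confusion model that bokeh rendering approximates; the function name and parameters are illustrative:

```python
def coc_diameter_mm(f_mm, n_stop, focus_mm, subject_mm):
    """Circle-of-confusion diameter (mm) from the thin-lens model.

    f_mm       -- lens focal length in mm
    n_stop     -- f-number (aperture ratio)
    focus_mm   -- distance the lens is focused at, in mm
    subject_mm -- distance to the point being rendered, in mm
    """
    aperture = f_mm / n_stop  # physical aperture diameter in mm
    # Points away from the focus plane blur in proportion to their
    # distance from it, scaled by the lens geometry.
    return aperture * (f_mm / (focus_mm - f_mm)) * abs(subject_mm - focus_mm) / subject_mm

# A fast 50mm f/1.8 lens focused at 2m renders a background point at 10m
# as a sizeable blur disc, while a point on the focus plane stays sharp.
print(coc_diameter_mm(50, 1.8, 2000, 10000))
print(coc_diameter_mm(50, 1.8, 2000, 2000))
```

In a renderer, each blurred point is then splatted as a textured shape (a disc, hexagon, or other aperture silhouette) of that diameter, which is what makes the pre-rendered cutscene approach tractable on current hardware.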
Epic has also greatly enhanced the ways that light and reflections are handled. Collectively called Image Based Reflections, Epic has implemented Point Light and Billboard Reflections. These are then coupled with both static and dynamic Reflection Shadows to achieve a look resembling the real world. While the graphics horsepower is not available today to allow Epic to mirror exactly the way light works in the real world, they are able to achieve a very close representation. For example, they are not yet able to render a road as detailed as one in real life: the road shown in the Samaritan demo was much more uniform than a real road surface, because the hardware required to calculate reflections in real time on a surface as irregular as a real road simply is not available today.
Read on for more details...
Subject: General Tech, Graphics Cards | May 10, 2011 - 08:52 PM | Tim Verry
Tagged: nvidia, jpr, gpu, amd
The last quarter of 2010 saw shipments totalling 18.84 million units. In Q1 2011, shipments rose slightly, by about 1%, to 19.03 million add-in cards. According to JPR (Jon Peddie Research), while Q1 2011 behaved seasonally much like past years, it did not fare as well overall, as shipments did not exceed those of Q1 2010. Where AMD increased units shipped by 5.7% versus the previous quarter (Q4 2010), NVIDIA saw a 2% decrease.
JPR notes that while the increase in units shipped versus Q4 2010 was rather slight, it remains a positive change, since Q4 2010 itself behaved irregularly relative to the usual seasonal cycle.
The increased units shipped further reflect changes in market share for the two largest discrete graphics card makers. Versus last quarter, NVIDIA lost 2.7% of the market while AMD gained 4.4%. On a year-over-year basis, JPR states that AMD gained 16.6% of market share while rival NVIDIA lost 8.4%.
JPR's reported market shares over time.
Jon Peddie Research notes that of the 19.03 million discrete graphics cards shipped, NVIDIA was the clear market leader, thanks in part to sales of CUDA and GPU-compute cards used in scientific and data research. The add-in board market is composed of three main segments that together amount to the 19.03 million boards shipped. At the high end sit the enthusiast-gamer market (approximately 9 million cards sold per year) and the GPU-compute market, both of which see lower sales volume but a higher price per card. The majority of graphics card shipments come from the mainstream market, which balances price and volume. Finally, there is the workstation segment, which is smaller than even the enthusiast gaming market but traditionally sees higher average asking prices for the hardware that is shipped.
JPR estimates that the add-in market will fall 4.5% to $19.8 billion USD despite positive increases in the number of cards shipped due to "a gradual decline in the ASP."
As the chart illustrates, NVIDIA remains the market juggernaut, shipping 11.25 million cards; however, AMD has made a lot of headway in the past year. With both the AMD Radeon HD 6950 and the NVIDIA GTX 560 Ti proving to be the cards of choice for many gamers worldwide, competition is healthy and enthusiasts stand only to benefit from the market's gains.
Subject: General Tech | May 10, 2011 - 04:59 PM | Tim Verry
Tagged: PSU, corsair
Corsair recently announced three new power supply models to update its popular CX Builder series. A favorite among many enthusiasts, the new CX V2 models offer the same 80 Plus certification as well as European Commission "Energy-Related Product (ErP) directive compliance for guaranteed efficiency and low standby power consumption." Further, the new V2 models carry an extra year of warranty over their predecessors, for a total of three years.
Ruben Mookerjee, VP and General Manager for Components at Corsair, states that Corsair’s Builder PSU series offers enthusiasts quality and reliability at attractive prices. “Low cost of operation and trouble-free performance are highly desired features at every price point and today we are proud to offer even better efficiency and a longer warranty with the new Builder Series CX V2 PSUs.”
The new CX V2 models will be available worldwide for purchase in May, including the CX600 V2, CX500 V2, and the CX430 V2. Offering 600, 500, and 430 watts respectively, the models offer the following specifications:
| | CX600 V2 | CX500 V2 | CX430 V2 |
| --- | --- | --- | --- |
| Wattage | 600 Watts | 500 Watts | 430 Watts |
| Connectors | 1xATX, 1xEPS, 2xPCI-E, 4xMolex, 6xSATA, 1xFloppy | 1xATX, 1xEPS, 2xPCI-E, 4xMolex, 5xSATA, 1xFloppy | 1xATX, 1xEPS, 1xPCI-E, 3xMolex, 4xSATA, 1xFloppy |
| Max Current | 25A @ +3.3V and +5V, 40A @ +12V, 0.8A @ -12V, 3A @ +5Vsb | 25A @ +3.3V, 20A @ +5V, 34A @ +12V, 0.8A @ -12V, 3A @ +5Vsb | 20A @ +3.3V and +5V, 28A @ +12V, 0.8A @ -12V, 3A @ +5Vsb |
| Max Wattage | 600 Watts @ 30°C Ambient | 500 Watts @ 30°C Ambient | 430 Watts @ 30°C Ambient |
| MSRP | $74 USD | $64 USD | $49 USD |
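As a quick sanity check on the spec table, the capacity of the all-important +12V rail follows directly from the listed amperages. A minimal sketch (model names taken from the table above, function name illustrative):

```python
# +12V rail current limits (amps) from Corsair's CX V2 spec table
RAIL_12V_AMPS = {"CX600 V2": 40, "CX500 V2": 34, "CX430 V2": 28}

def rail_watts(amps, volts=12.0):
    """Maximum power (watts) a rail can deliver at the given current."""
    return amps * volts

for model, amps in RAIL_12V_AMPS.items():
    # e.g. the CX600 V2 can deliver 40A * 12V = 480W on +12V alone
    print(f"{model}: {rail_watts(amps):.0f} W available on +12V")
```

Since modern CPUs and graphics cards draw almost entirely from +12V, this per-rail figure matters more for a gaming build than the headline wattage.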
Will a Corsair PSU be part of your next build?
Subject: General Tech | May 10, 2011 - 12:51 PM | Jeremy Hellstrom
Tagged: ballmer, microsoft, boomtown, skype, purchase, billion
The rumour mill really dropped the ball on this one: just a few hours ago it was Facebook that everyone was muttering would one day buy Skype. Turns out that in just a few hours the new rumour that Microsoft was going to buy Skype for $7 billion became a reality, at an $8.5 billion price tag.
Skype lost $7 million last year, though that number seems rather small compared to its overall balance sheet to date, which puts the company $686 million in the hole. As All Things Digital is quick to point out, that is slightly less than what Microsoft's Online Services Division lost last quarter, proving all things are relative, even at very large dollar amounts.
On the plus side, Microsoft gets its hands on Skype's 763 million registered users, about twice as many as there are MSN users and significantly more than there are Xbox Live users. Toss in the TechNet people and you still have nowhere near the user pool that Skype brings. That huge increase in the number of people Microsoft can reach may give it the ability to recoup the purchase price. Consider that 8 million users pay actual money for their Skype accounts, which Wired considers at least a hint of Microsoft's strategy.
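Some quick back-of-the-envelope arithmetic on the figures above puts the deal in perspective (a sketch only; variable names are illustrative):

```python
price = 8.5e9                 # reported purchase price, USD
registered = 763e6            # Skype's registered users
paying = 8e6                  # users who actually pay for Skype

per_registered = price / registered  # roughly $11 per registered user
per_paying = price / paying          # over $1,000 per paying user
paying_share = paying / registered   # only about 1% of users pay

print(f"${per_registered:.2f} per registered user")
print(f"${per_paying:.2f} per paying user")
print(f"{paying_share:.1%} of users pay")
```

That roughly 1% conversion rate is why the registered-user figure, rather than Skype's revenue, looks like the asset Microsoft is paying for.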
Most PC users who already use Windows, such as those at Ars Technica, are scratching their heads over the purchase while Linux users at Slashdot are very concerned about continuing support for the Skype Linux Client.
"The Wall Street Journal reported earlier tonight that Microsoft–in what would be its most aggressive acquisition in the digital space–was zeroing in on buying Skype for $8.5 billion all in with an assumption of the Luxembourg-based company’s debt.
Sources told BoomTown tonight that the deal for the online telephony and video communications giant is actually done and will be announced early tomorrow morning."
Here is some more Tech News from around the web:
- Live from Google I/O 2011's opening keynote! @ Engadget
- Hard drive industry to continue facing shortage in May @ DigiTimes
- 15 Web Browsers Tested and Benchmarked @ Legit Reviews
- Complete Zeus Trojan source code gets leaked @ The Inquirer
- Intel's Cedarview Atom to sport PowerVR graphics @ VR-Zone
- Ninjalane Podcast - Aftermarket Heatsinks Sandy Bridge Memory Summer Vacation
- Start Summer with a Big Bang in association with MSI @ OC3D
With the release of Ubuntu 11.04, a new desktop environment called Unity was released. Unity promised to revamp the Linux operating system’s desktop GUI to be more user friendly and intuitive. There are a multitude of noticeable changes that Unity brings to Ubuntu’s GUI compared to the classic Gnome environment. A new Windows 7-like task bar stretches along the left side of the screen, where small icons of running and pinned applications reside. This new application dock replaces the traditional Gnome task bar that ran along the bottom of the screen. Also present is a new Ubuntu button that acts as an application launcher where installed programs can be sorted and searched for. Further, there are improvements to the workspace switcher and changes in window management, with new hover-to-reveal scroll bars and each application’s (context-sensitive) file menus being relocated to the top of the screen. These and other minor changes in the latest Ubuntu release have caused a flood of controversy among reviewers and users alike.
Pictured: Unity GUI (Insert: Ubuntu Classic GUI)
On the positive side of the issue, a number of new and long-time Ubuntu users have embraced the new GUI for its features and design. Many people migrating from Windows 7 or Mac OS will become accustomed to the interface quickly, as it works in much the same manner. Further, users of convertible tablet PCs have an easier time navigating to applications and windows thanks to the larger icons. Touch and digitizer controls on the Dell Latitude XT, for example, worked well out of the box without a need to muck with drivers.
In contrast, as a newly developed desktop environment, Unity is less customizable from a user standpoint than the traditional Gnome GUI. Because of this (at the time of writing) restriction, many self-proclaimed power users have called Unity a step backwards in the very aspect that makes Linux a desirable OS: the ability to customize the operating system to their liking.
Read on for more...
Subject: General Tech | May 9, 2011 - 08:31 PM | Scott Michaud
Tagged: IEEE, Ethernet
IEEE is a professional association known for creating technology standards, producing publications, and hosting activities for both educational and professional development. If you are browsing this website on a high-speed connection, you are almost certainly using IEEE 802.3 or IEEE 802.11, more commonly known as Ethernet and Wi-Fi, respectively. IEEE constantly evolves its standards: speeds get faster, 802.11n let Wi-Fi leave the crowded 2.4 GHz band, and other changes arrive as needs progress over time.
Change for the future.
IEEE recently appointed John D’Ambrosia to chair a group that will determine how much bandwidth will be demanded of Ethernet in the future. This committee could eventually produce a standard for Terabit network connections, should demand deem it necessary.
The committee is being very cautious this time around about how much speed the next standard should target. The prior standard, 802.3ba, was discussed starting in 2005, when 100 Gbps was determined to be the necessary advancement; it was later discovered that many vendors did not require more than 40 Gbps and would delay adoption for several years. Regardless of whether they settle on Terabit or 400 Gigabit, the standard will take years to develop, with Terabit taking even longer. The group's findings about demand will be published early next year.
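To see what is at stake between the speed tiers being debated, a minimal sketch of the transfer time for a fixed workload at each candidate rate (figures and function name are illustrative, not from the committee):

```python
def transfer_seconds(num_bytes, bits_per_second):
    """Ideal time to move num_bytes over a link, ignoring protocol overhead."""
    return num_bytes * 8 / bits_per_second

ten_terabytes = 10e12  # a plausible bulk-replication job between data centers

for name, rate in [("40 GbE", 40e9), ("100 GbE", 100e9),
                   ("400 GbE", 400e9), ("Terabit", 1e12)]:
    # 40 GbE needs over half an hour; Terabit finishes in under two minutes
    print(f"{name}: {transfer_seconds(ten_terabytes, rate):.0f} s")
```

The 10x gap between 100 Gbps and Terabit is exactly why the committee wants demand data before committing: overshooting, as arguably happened with 100 Gbps, leaves vendors paying for capacity they cannot yet use.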
Subject: General Tech | May 9, 2011 - 06:05 PM | Jeremy Hellstrom
Tagged: tweeter, subwoofer, speakers, sp2500, satellite, corsair, bass, audio, 2.1
Not too long ago Josh reviewed Corsair's 2.1 speaker set which goes by the name of SP2500 and then later Ryan gave it away to a reader. He was very happy with the way that they performed, but seeing as how judging audio quality can be quite subjective and as most of our readers did not get a set of speakers perhaps a second opinion is a good idea. To that end you can read about the same set of speakers as they were reviewed over at The Tech Report.
"Corsair has supplemented its line of audio products with a premium 2.1 speaker set. Is it worth the $250 asking price?"
Here is some more Tech News from around the web:
- Antec Soundscience Rockus 3D 2.1 Review @ Neoseeker
- Corsair SP2200 2.1 Gaming Speakers Review @ Techgage
- Corsair SP2500 High-power PC Speaker System @ Metku.net
- Thermaltake Tt eSPORTS Shock Spin Gaming Headset Review @ Hi Tech Legion
- Plantronics Gamecom X40 Xbox headset @ XSReviews
- Logitech Ultimate Ears 100 Noise Isolating Earphones @ Tweaktown
- Plantronics Gamecom X95 Xbox @ XSReviews
- Plantronics M100 Bluetooth Headset Review @ eTeknix
- BlueAnt T1 Bluetooth Headset @ Maximum CPU
Subject: General Tech | May 9, 2011 - 11:51 AM | Jeremy Hellstrom
Tagged: amd, coreboot, uefi, bios, embedded, llano, opteron, s3
A lot of attention is being paid to UEFI, the new graphical BIOS replacement that not only lets you utilize 2TB+ drives as boot devices but even gives you mouse control over the games some vendors integrate alongside your settings. It offers quite a few advantages over the old BIOS but adds complexity as well. AMD has gone a different route for its Opteron series with Coreboot (aka LinuxBIOS), a different way of initializing a computer. It does a very minimal hardware initialization and then hands off to what is called a payload, which provides the familiar abilities of a BIOS but is not integrated directly into the hardware initialization in any way. This is far more useful for server and embedded applications than for the latest ROG board, which is why embedded Llano will be receiving support and why Opteron already has it. Follow the links from The Inquirer for more.
"CHIP DESIGNER AMD has announced that its upcoming Llano accelerated processing unit (APU) will support Coreboot.
AMD has been pushing development the BIOS replacement initiative Coreboot for many years but has focused on getting support for its embedded and server processors. Now the company has come out and said that all of its future processors will support Coreboot, from Llano onwards."
Here is some more Tech News from around the web:
- WebGL in Chrome and Firefox is a serious security risk @ The Inquirer
- Worried about data caps? Here's how to check your usage @ Ars Technica
- Boffins develop method of driving computers insane @ The Register
- AMD's FM1 desktop test board pictured @ VR-Zone
- New Gigabyte board spotted at eTeknix HQ @ eTeknix
- Gigabyte GA-Z68MX-UD2H-B3 pictures @ VR-Zone
- Essential Windows 7 Tweaks: Part 3 @ Computing on Demand
- Blackberry App World is finally here @ t-break
- Roccat Apuri Review @ t-break
Subject: General Tech, Chipsets, Mobile | May 9, 2011 - 10:56 AM | Jeremy Hellstrom
NVIDIA is really making moves toward providing the mobile industry with the computing power to bring on better and faster phones. The company took a big hit losing the DMI/QPI license from Intel, and though the $1.5 billion court settlement took some of the sting from that loss, the battle essentially spelled the end of NVIDIA's motherboard chipset line. Only being able to make motherboard chipsets for its main GPU competitor, AMD, might be amusing in an ironic sense, but it is not an economically sound business.
Tegra marked a change in NVIDIA's target market: suddenly the company offered a mobile chip that provided very impressive computing power without drawing a huge amount of power. With the acquisition of Icera, it now has a team designing the chips a phone most needs: RF and baseband transmission. Perhaps NVIDIA now has a big enough foot in the door of the mobile market that it won't be going anywhere soon.
Icera’s baseband and RF technologies span 2G, 3G and 4G networks. Joining them with our Tegra mobile super chip will result in a powerful combination. Tegra has secured a number of design wins across the mobile device spectrum, and we have extensive relationships across the industry, from device manufacturers to carriers. In short, we can scale Icera’s great innovation. For additional context on Icera’s industry-leading technology, check out this report from Strategy Analytics.
Our OEM partners will reap the benefits of faster time-to-market, better integration and enhanced performance. The deal will also open up a new market to NVIDIA. The $15 billion global market for baseband processors is one of the fastest-growing areas in the industry.
Looking ahead, Icera’s programmable baseband processor architecture will allow NVIDIA and its OEM customers to innovate and adapt signaling algorithms in the rapidly evolving mobile telecommunications market — network responsiveness is critical to delivering on the promise of untethered wireless visual computing. Icera’s highly efficient architecture makes it possible to cleanly integrate their baseband processor into system and software platforms rapidly and, ultimately, into the super chip itself, if that’s the best product approach.