Subject: General Tech | June 6, 2011 - 11:52 AM | Jeremy Hellstrom
Tagged: nvidia, microsoft, lawyers
It turns out that while NVIDIA did not quite sell its soul to get its GPU into the first Xbox, it did give up its right to go out unchaperoned. As part of the deal, Microsoft can block any large purchase of NVIDIA shares by another company. If a company tries to purchase 30% or more of NVIDIA's shares, Microsoft had, and still has, the right to put the kibosh on the deal. A decade ago, when the deal was first inked, the agreement would have made a lot of sense to Microsoft; they were going to depend on NVIDIA's GPU and did not want another company to buy a majority share in NVIDIA and get a grip on Microsoft's new gaming console. This deal makes NVIDIA rather unattractive to many companies, as the time and money necessary to set up a large deal could be utterly wasted if Microsoft decides it doesn't like the look of NVIDIA's new bedmate. The Inquirer has more here, and is currently awaiting a response to the article from Microsoft.
"AN UNLIKELY BETROTHAL between Microsoft and Nvidia has been uncovered that gives Microsoft the right of first and last refusal to buy Nvidia.
Microsoft entered into an agreement with Nvidia back in 2000 when the chip design outfit was brought in to work on the GPU of what would then become Microsoft's Xbox. That in itself isn't particularly surprising, but Information Week dug up a 10K filing with the Securities and Exchanges Commission (SEC) in which Nvidia reported that Microsoft had first and last rights of refusal should a third party make an offer to buy 30 per cent or more of Nvidia's shares."
Here is some more Tech News from around the web:
- In-flight Internet: the view from 35,000 feet and three years @ Ars Technica
- Skype reverse-engineered and open sourced @ The Register
- Notorious rootkit gets self-propagation powers @ The Register
- UC San Diego Builds Phase-Change Solid-State Drive That's 2 to 7 Times Faster Than NAND @ ExtremeTech
- HTC Dev launched to support OpenSense development @ t-break
- Kodak PlaySport Zx3 Review @ TechReviewSource
- Installing a Server OS in Intel Media Series Motherboards Guide @ MissingRemote
- Post Computex 2011 - Part 1 @ Bjorn3D
- Mach Xtreme Displays New JMicron SATA 3 SSD @ The SSD Review
- Fan speed control? BitFenix has an app for that @ The Tech Report
- Sapphire shows A75 mobo, passive Radeons @ The Tech Report
- Computex 2011: The ROG Releases @ AnandTech
- Liveblog: Microsoft's E3 press conference on June 6 @ Ars Technica
- Liveblog: WWDC 2011 keynote on June 6 @ Ars Technica
NVIDIA recently unveiled a new four-core CPU for mobile devices at Mobile World Congress that promises to power 2560x1600, 300 DPI displays as well as enable realistic dynamic lighting and physics in mobile games, features that until recently were only possible in the realm of gaming laptops and desktops.
The quad-core ARM CPU has been paired with a new 12-core GeForce graphics processing unit. The CPU alone is able to outperform the older Tegra 2 chip by close to 2x. With the additional GPU cores, however, NVIDIA has even more performance and the ability to implement great-looking games for mobile tablets and so-called “super phones.”
At a resolution of 1280x800 (according to Engadget), the new Kal-El graphics demo shows off a new game featuring a glowing ball that acts as a truly dynamic light source, in addition to realistic cloth physics. Using all four processing cores of the CPU allowed NVIDIA to implement cloth that reacts to the changing gravity of the game in a dynamic, and very realistic-looking, manner. The mobile chip saw approximately 80% usage across all cores during the game demo. When NVIDIA disabled two of the CPU cores, the game became nearly unplayable: with the two remaining cores maxed out, the demo’s frame rate dropped below 15 frames per second.
The new “Tegra Super Chip” will certainly allow mobile game developers to design immersive, realistic-looking worlds, as well as let consumers watch 1080p HD video with ease. The only apparent drawback is that battery technology advances much more slowly than transistor technology, so it will be interesting to see how the new NVIDIA chip performs in that regard.
Subject: General Tech | May 26, 2011 - 03:30 PM | Jeremy Hellstrom
Tagged: nvidia, flying pigs, duke
SANTA CLARA, Calif. — May XX, 2011 — NVIDIA is pulling out all the stops for the highly-anticipated launch of Duke Nukem Forever on June 11, 2011. Today, we’re announcing that we’re giving one extremely lucky gamer the chance to win an all-expenses VIP trip for two to the official Gearbox launch event in Dallas, Texas. (Contest open to U.S. and Canada (excluding Quebec) residents only.)
Sure to be one of the coolest – and maybe the rowdiest – parties of the summer (it’s a Duke party, after all!), the Gearbox Community Day/Duke Nukem Launch Event will blow into the Palladium Ballroom in Dallas, Texas. If you’re deemed worthy enough to hang with the King and his court, here’s some of the cool stuff you’re in for:
- Test your mettle playing Duke Nukem Forever alongside the Gearbox crew
- Rub elbows with the Gearbox and NVIDIA development teams
- Party at an exclusive NVIDIA reception before the launch event (Monday, June 13)
- Hear from the people behind some of Gearbox’s biggest games: Duke Nukem Forever, Borderlands, Aliens: Colonial Marines, Brothers in Arms, and more
- Get exclusive sneak-peeks at new, never-before-seen Gearbox materials
- Join the partying and mayhem
Do you think you’ve got what it takes to party with Duke? If so, what are you waiting for? Christmas? Go to http://www.geforce.com/Community/Rewards right now to enter!
Oh yeah, one thing...you need to be 21 years old to party with Duke. That’s just how he rolls. Sorry kids!
By the way, be sure your PC is cranked up and ready for Duke. To help, NVIDIA and EVGA have teamed up to bring you Duke’s “Fully Loaded Package,” a special edition Duke Nukem Forever bundle that includes a full PC copy of Duke Nukem Forever, an EVGA GeForce® GTX 560 graphics card, a limited edition DNF art book and mouse pad, and a custom Duke Nukem “Radioactive” belt buckle.
That should complete your arsenal, giving you everything you’ll need to wipe out aliens that have been stealing Earth’s women and drinking Duke’s beer. Get Duke’s Fully Loaded Package now.
But wait, there’s one more thing….
As NVIDIA’s special guest, you’ll be one of the first gamers on the planet to see the unveiling of the brand new, official Duke Nukem Forever mod. This bad boy packs more NVIDIA GeForce GTX power than a Devastator, features a mind-blowing Duke-themed chassis, and will show Duke in all his glory in breathtaking 3D with NVIDIA 3D Vision technology. You haven’t seen a mod like this before…anywhere!
Good luck! And, we hope to see you in Texas!
Subject: Editorial, General Tech | May 26, 2011 - 02:04 PM | Ken Addison
Tagged: R6970, podcast, nvidia, Intel, firepro, amd, 990x, 990fx
PC Perspective Podcast #156 - 5/26/2011
This week we talk about the AMD FirePro V7900 and V5900, MSI R6970 Lightning, Intel Core i7-990X, viewer questions, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Jeremy Hellstrom, Josh Walrath and Allyn Malventano
This Podcast is brought to you by MSI
- 0:00:45 Introduction
- 1-888-38-PCPER or firstname.lastname@example.org
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 0:01:45 AMD FirePro V7900 and V5900 Professional Graphics Review
- 0:05:25 MSI R6970 Lightning Review: A Supercharged AMD Radeon HD 6970
- 0:16:35 This Podcast is brought to you by MSI, and their all-new Sandy Bridge motherboards!
- 0:17:20 Intel Core i7-990X Gulftown Processor and DX58SO2 Motherboard Review
- 0:22:30 NVIDIA GTX 560 Review Coming soon
- 0:24:36 Cray Announces AMD Bulldozer CPU and NVIDIA Tesla GPU Supercomputer Capable of 50 Petaflops
- 0:27:38 Sneak Peek at the MSI 990FXA-GD65
- 0:30:13 ASUS Sabertooth Motherboard Supporting 990FX Chipset Pictured
- 0:32:33 Email from Jeff about Bulldozer leaks
- 0:36:00 Revisiting quad-gpus and the Law of Diminishing Returns
- 0:39:58 Leaking Llano and Bulldozer prices
- 0:43:57 Corsair Hydro H80 and H100 Water Coolers On The Horizon
- 0:46:30 Email from Bernie about desktop standby mode
- 0:51:05 Email from Jesse about overclocking temps
- 0:53:40 Hardware / Software Pick of the Week
- Ryan: Lucid Virtu - it's working
- Jeremy: Remember Action Quake? How 'bout a little Action Half Life 2, still less buggy than your average commercial release and better supported.
- Josh: DiRT 3: Love it.
- 0:59:30 THIS JUST IN: Asus ROG Matrix GeForce GTX 580 Graphics Card Details Leaked
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 1:02:00 Closing
Introduction and Design
Viewed from a bird’s-eye perspective, gaming laptops seem to be a homogeneous bunch. Although there are rare exceptions, like the Alienware M11x, most are 15.6” or 17” models with quad-core processors and discrete mobile graphics, most frequently the Nvidia GTX 460M. The two gaming laptops we’ve most recently reviewed, the ASUS G53 and MSI GT680R, most certainly fit this mold.
Upon closer inspection, however, the market for gaming laptops begins to expand and multiply into a wide array of options. While the big players like ASUS, Toshiba and MSI are happy to offer their pre-configured models with roughly similar hardware, customized rigs are as numerous as stars in the sky. Everyone has heard of Alienware, of course, but you may not have heard of companies like Origin, Falcon Northwest, AVADirect, AFactor Gaming, Malibal, Digital Storm and Maingear, just to name a few (or if you have, you may have only heard of their desktops).
Maingear’s eX-L15 is a stereotypical example of a custom gaming laptop. It’s big and it’s bulky, but its appearance is not much different from your average laptop. Inside, however, there is a buffet of high-end hardware.
X Fastest, a Chinese-language technology website, today posted images of the Asus ROG Matrix version of the NVIDIA GTX 580 graphics card. The three-slot (no, that is not a typo) graphics card is claimed to have a 16-phase VRM design, a GPU clock of 816 MHz, a shader clock of 1632 MHz, and a memory clock of 4008 MHz (effective). Further, the card contains 1.5GB of GDDR5 memory on a 384-bit bus. All this power is delivered via two 8-pin (4x2) PCI-E connectors. The following table compares the claimed speeds of the Asus card to NVIDIA's reference design.
| | Asus ROG Matrix GTX 580 | NVIDIA Reference Design |
| --- | --- | --- |
| GPU Clock | 816 MHz | 772 MHz |
| Shader Clock | 1632 MHz | 1544 MHz |
| Memory Clock | 4008 MHz (effective) | 4008 MHz (effective) |
| Memory Amount and Bus | 1.5 GB GDDR5, 384-bit bus | 1.5 GB GDDR5, 384-bit bus |
| PCI-E Connections | Two 8-pin (4x2) connectors | One 6-pin, one 8-pin connector |
The card features several overclocker friendly features, including nodes to directly measure voltage, hardware buttons to increase/decrease voltage to the card, and a “safe mode” button that promises to restore the card to factory settings located next to the uppermost DVI output.
Subject: Graphics Cards | May 24, 2011 - 03:44 PM | Jeremy Hellstrom
Tagged: quad sli, quad crossfire, sli, crossfire, nvidia, amd
With SLI and CrossFire we all hoped to see linear scaling, so that a quad-GPU setup would perform somewhere in the neighbourhood of 4x better than a single GPU. That has proven not to be the case: not only is the scaling nowhere near that level, it has been discovered that in some cases going beyond two GPUs can actually reduce performance.
As the hardware and drivers evolve, it is worth revisiting the scaling performance of both AMD and NVIDIA which is why [H]ard|OCP grabbed two GeForce GTX 590s and two AMD Radeon HD 6990s, both dual GPU cards. In three of the five games tested they ran into at least one issue, a strike right off the bat. Read on to see how they rate the value of the two manufacturers based on the performance they saw once they'd resolved the problems.
"How does NVIDIA's GeForce GTX 590 SLI Quad-GPU compare to AMD's Radeon HD 6990 CrossFireX Quad-GPU? We will find out if these "if-money-didn't-matter dream video card setups" will deliver the gameplay experience we all expect."
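The sub-linear scaling at issue here can be illustrated with a simple Amdahl's-law-style model (a hypothetical sketch for intuition, not [H]ard|OCP's methodology, and the function name and the 90% figure are our own assumptions): if some fraction of each frame's work, such as driver overhead and inter-GPU synchronization, cannot be split across GPUs, each extra GPU buys less than the last.

```python
def multi_gpu_speedup(p: float, n: int) -> float:
    """Amdahl's-law speedup for n GPUs, where a fraction p of the
    per-frame work scales across GPUs and (1 - p) stays serial
    (driver overhead, inter-GPU synchronization, etc.)."""
    return 1.0 / ((1.0 - p) + p / n)

# Assuming 90% of the frame parallelizes, the fourth GPU adds little:
for n in (1, 2, 3, 4):
    print(n, round(multi_gpu_speedup(0.9, n), 2))
# -> 1 1.0, 2 1.82, 3 2.5, 4 3.08
```

Under this toy model a fourth GPU takes you from 2.5x to only about 3.1x, and it does not even capture the cases where extra GPUs actively hurt performance.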
Here are some more Graphics Card articles from around the web:
- Asus ENGTX460 GTX 460 Voltage Tweak Review @ Tweaknews
- MSI GeForce GTX 580 Lightning @ OCAU
- Gigabyte GTX 560 (GV-N56GOC-1GI) @ Pro-Clockers
- ASUS GeForce GTX 560 1GB DirectCU II TOP @ TweakTown
- MSI N560GTX Ti Hawk Video Card @ Benchmark Reviews
- Zotac Geforce GTX 550 TI @ Rbmods
- MSI GeForce GTX 560 Twin Frozr II Review @ Techgage
- NVIDIA Chips Comparison Table @ Hardware Secrets
- Desktop Graphics Card Comparison Guide @ Tech ARP
- AMD FirePro V7900 @ Phoronix
- AMD FirePro V5900 @ Phoronix
- ASUS Radeon HD 6870 Video Card Review @ Legit Reviews
- MSI R6950 Twin Frozr III Power Edition OC @ Benchmark Reviews
- Sapphire Radeon HD 6670 1GB Graphics Card Review @ eTeknix
- Sapphire Radeon HD 6770 1GB Vapor-X @ TweakTown
- VTX3D Radeon HD 6790 1GB @ OCAU
- HIS Radeon HD 6790 IceQ X Turbo 1 GB @ techPowerUp
- PowerColor PCS+ AX6950 Vortex II @ Benchmark Reviews
Subject: Graphics Cards | May 21, 2011 - 03:04 AM | Tim Verry
Tagged: nvidia, kfa2, GTX 560, graphics
Not to be left out of the slew of NVIDIA GeForce GTX 560 releases, KFA2 has announced two new NVIDIA graphics cards in its current lineup. Both are based on the GeForce GTX 560 GPU; however, one card is overclocked and fitted with an aftermarket heatsink and fan combo (the other is a standard single, centered, shrouded fan design). Labeled the KFA2 GeForce GTX 560 1GB 256bit and the KFA2 GeForce GTX 560 EX OC 1GB 256bit, the DirectX 11 cards offer the following specifications:
| | GeForce GTX 560 1GB 256bit | GeForce GTX 560 EX OC 1GB 256bit |
| --- | --- | --- |
| GPU Clock | 810 MHz | 905 MHz |
| Shader Clock | 1620 MHz | 1810 MHz |
| Memory Clock | 2004 MHz | 2004 MHz |
| Memory | 1 GB GDDR5 on 256-bit bus | 1 GB GDDR5 on 256-bit bus |
| Memory Bandwidth | 128.3 GB/s | 128.3 GB/s |
| Texture Fill Rate | 45.3 Billion/s | 50.6 Billion/s |
The two new cards seem to be positioned (specification-wise) between purely reference cards and their competitors' highest-clocked GTX 560 cards. The street price will ultimately determine whether they are worth picking up versus other brands with higher clocks, or with reference clocks but aftermarket cooling. KFA2 states that the cards will be available online and in retail stores throughout Europe, and are backed by a two-year warranty.
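The 128.3 GB/s bandwidth figure follows directly from the memory clock and bus width: GDDR5 spec sheets of this era quote a base memory clock whose effective data rate is doubled, so 2004 MHz becomes 4008 MT/s, carried over a 256-bit (32-byte) bus. A quick sanity check (a sketch; the function name and the 2x data-rate convention are our reading of how these spec sheets are quoted):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Approximate peak memory bandwidth in GB/s from a spec-sheet clock.

    Spec sheets here quote the base memory clock; the effective data
    rate is 2x that (MT/s). Multiplying by the bus width in bytes gives
    bytes transferred per microsecond, i.e. MB/s, hence the /1000.
    """
    effective_mts = mem_clock_mhz * 2          # 2004 MHz -> 4008 MT/s
    bus_bytes = bus_width_bits // 8            # 256-bit -> 32 bytes
    return effective_mts * bus_bytes / 1000.0  # MB/s -> GB/s

print(gddr5_bandwidth_gbs(2004, 256))  # ~128.3, matching KFA2's figure
```

The same arithmetic on a 384-bit bus at the same clock yields roughly 192.4 GB/s, which is the reference GTX 580 ballpark.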
Subject: General Tech | May 19, 2011 - 11:35 AM | Jeremy Hellstrom
Tagged: nvidia, quarter, income
We have been discussing the changes to the graphics market on the front page and on the Podcast, and as expected NVIDIA's income has shrunk. Last year NVIDIA was generating $800 million but has since seen revenue drop by over $100 million; for perspective, SemiAccurate pegs their professional graphics division at about $200 million. If NVIDIA is going to keep its R&D team working on chips several generations ahead of the current products on the market, which it needs to do in order to remain competitive, it had better hope that its foray into the mobile chip market is lucrative enough to pay the bills.
"Nvidia (NASDAQ:NVDA) published their results last Thursday topping analyst estimates and six days later the stock was down 10%. What happened?
The numbers were pretty good. Revenue was up and Tegra™ finally started to get traction, more than 3 times up but there are some red lights. First their revenues are down YoY. Second, their GPU business is down YoY and last, but not least, their professional business revenue is more or less flat for the last quarter."
Here is some more Tech News from around the web:
- Whoops, Intel SNB Is Borked At The Last Minute In Linux 2.6.39 @ Phoronix
- Intel’s 2011 Investor Meeting - Intel’s Architecture Group: 14nm Airmont Atom In 2014 @ AnandTech
- Google rolls out fix for Android security threat @ The Register
- Cisco refuses to deny it will sell off Linksys @ The Register
- Red Hat releases Enterprise Linux 6.1 @ The Inquirer
- Open-Source AMD Fusion Driver Stabilizes @ Phoronix
- Clash of the Sumo Titan bean bag chair @ The Tech Report
- Win a Blackberry Torch 9800 [Red] @ t-break
Subject: Graphics Cards | May 19, 2011 - 04:33 AM | Tim Verry
Tagged: nvidia, GTX 560, graphics
Coinciding with the NDA lift on the NVIDIA GeForce GTX 560, Gigabyte announced its enthusiast-class Overclock Edition graphics card based on the new GTX 560 GPU.
The new Overclock Edition replaces the reference design's cooler with Gigabyte's own WindForce 2X variant, which they claim reduces the noise of the card under full load to 31 dB. Further, the heatsink uses direct heat pipe technology, which means that the heat pipes carrying heat away from the GPU and into the fins physically contact the GPU itself. Both fans produce 30.5 CFM of airflow to quickly dissipate the heat of the overclocked GTX 560 GPU. Gigabyte was able to clock the card at an 830 MHz GPU clock and a 4008 MHz (effective) memory clock from the factory, and claims to improve overclocking capability by 10% to 30% thanks to its "Ultra Durable" copper PCB technology and power-switching enhancements.
The full specifications of the GeForce GTX 560 Overclock Edition are as follows:
| Spec | Value |
| --- | --- |
| Core Clock | 830 MHz |
| Shader Clock | 1660 MHz |
| Memory Amount | 1 GB |
| Memory Bus | 256-bit |
| Card Bus | PCI-E 2.0 |
| Process Technology | 40 nm |
| Card Dimensions | 43 mm (H) x 238 mm (L) x 130 mm (W) |
| Power Requirements | Minimum 500 Watt PSU required |
| Outputs | 1x HDMI and DisplayPort via adapter(s); 1x mini HDMI; 1x VGA (via adapter) |
Gigabyte is a popular motherboard manufacturer for enthusiasts and it seems that they are striving to gain that same level of consumer brand loyalty with their graphics cards. Do you have a Gigabyte graphics card in your rig?