Subject: Editorial | March 9, 2012 - 08:45 AM | Josh Walrath
Tagged: TSMC, tahiti, process node, nvidia, kepler, amd, 28 nm
Charlie over at Semiaccurate is reporting that TSMC has closed down their entire 28 nm line. Shut down. Not running wafers. This obviously cannot be good.
Apparently TSMC stopped the entire line about three weeks ago and has not restarted it. This type of thing does not happen very often, and when it does, things are really out of whack. Going back, we have heard mixed reports about TSMC’s 28 nm process. NVIDIA was quoted as saying that yields still were not very good, but were at least better than what they experienced with their first 40 nm parts (the GTX 400 series). Now, part of NVIDIA’s problem was that the design was as much of an issue as the 40 nm process itself. AMD at the time was churning out HD 5000 series parts at a pretty good rate, and they said their yields were within expectations.
AMD is one of the first customers out of the gate with a large volume of 28 nm parts. The HD 7900 series has been out since the second week of January, the HD 7700 series since mid-February, and the recently announced HD 7800 series will reach market in about two weeks. Charlie has done some more digging and has found out that AMD has enough product, in terms of finished boards and packaged chips, to ride out the shutdown at TSMC. Things will get tight at the end, but apparently the wafers in the middle of being processed have not been thrown out or destroyed. So once production starts again, AMD and the other customers will not have to wait 16 to 20 weeks before getting finished product.
NVIDIA will likely not fare nearly as well. The bulk of the stoppage occurred during the real “meat and potatoes” manufacturing cycle for the company. NVIDIA expects to launch the first round of Kepler based products this month, but if production has been stopped for the past three weeks then we can bet that there are a lot of NVIDIA wafers just sitting in the middle of production. Charlie also claims that the NVIDIA launch will not be a hard one, and NVIDIA expects retail products to be available several weeks after the introduction.
The potential reasons for this could be legion. Was there some kind of toxic spill that resulted in a massive cleanup that required the entire line to be shut down? Was there some kind of contamination that was present while installing the line, but was not discovered until well after production started? Or was something glossed over during installation that ballooned into a bigger problem that just needed to be rectified (a stitch in time saves nine)?
Subject: Graphics Cards | March 8, 2012 - 03:59 PM | Josh Walrath
Tagged: nvidia, kepler, gtx 680, GDC
It seems that there have been a few leaks about NVIDIA's first Kepler based product. TechPowerUp and ExtremeTech are both reporting on leaks that apparently came from CeBIT and some of NVIDIA's partners. We now have a much better idea of what the GTX 680 is all about.
Epic's Mark Rein is showing off his own GTX 680 which successfully ran their Samaritan Demo. It is wrapped for his protection. (Image courtesy of Extreme Tech)
The chip that powers the GTX 680 is the GK104, and it is oddly enough the more "midrange/enthusiast" offering. It has a total of 1536 CUDA cores, runs at 703 MHz core and 1406 MHz hot clock, has a 256 bit memory bus pumping out 196 GB/sec, and has a new and interesting feature that is quite a bit like the Turbo Core functionality we see from both AMD and Intel in their CPUs. Apparently when a scene gets very complex, the chip is able to overclock itself up to 900 MHz core/1800 MHz hot clock. It will stay there either for as long as the scene needs it or until the chip approaches its upper TDP limit.
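The reported behavior can be sketched as a simple control loop: push the clock up while the scene demands it, and fall back when the power draw nears the TDP ceiling. Here is a minimal, purely illustrative Python model; only the 703 MHz and 900 MHz clocks come from the leak, while the TDP value, step size, and power model are invented assumptions for the sketch.

```python
# Toy model of a TDP-limited boost clock, using the leaked GTX 680 figures.
# Only the 703 MHz base and 900 MHz boost clocks come from the report; the
# TDP value, step size, and polling behavior are invented for illustration.
BASE_MHZ = 703
BOOST_MHZ = 900
TDP_WATTS = 195  # hypothetical upper TDP limit

def next_clock(current_mhz, power_draw_watts, scene_is_demanding):
    """Step the core clock up under load, and back off near the TDP limit."""
    if power_draw_watts >= TDP_WATTS:
        return BASE_MHZ                          # at the power ceiling: back off
    if scene_is_demanding:
        return min(current_mhz + 13, BOOST_MHZ)  # ramp toward the boost clock
    return BASE_MHZ                              # light scenes run at base clock

clock = BASE_MHZ
for _ in range(20):  # a sustained complex scene with power headroom
    clock = next_clock(clock, power_draw_watts=150, scene_is_demanding=True)
print(clock)  # settles at the 900 MHz boost ceiling
```

The key point the rumor makes is that, unlike a fixed overclock, the boost state is conditional: it lasts only as long as the workload demands it and the chip stays under its power budget.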
These reports paint the GTX 680 as being about 10% faster than the HD 7970 in certain applications, but slower in others. I figure that when reviews are finally released, the two cards will have traded blows over the title of fastest graphics card. Let's call it a draw.
The GTX 680 should be unveiled in the next week or so, but initial reviews will not surface until later in the month. Retail availability will likely lag until then, and with the issues TSMC has had with their 28 nm process (the line has been stopped since the middle of February) we have no idea how much product NVIDIA and its partners have. Things could be scarce for some time after the introduction.
Subject: Graphics Cards | March 8, 2012 - 12:41 PM | Tim Verry
Tagged: nvidia, kepler, graphics card, gpu, GK104, gaming, 28nm
GDC 2012 is upon us, and in addition to the Samaritan demo and some gaming goodness, we spotted a leaked image over at Legit Reviews that is allegedly a photo of a production NVIDIA GTX 670 Ti graphics card.
The cooler looks to cover the whole PCB and be of the blower design, funneling hot air out of the front of the card and out of the case. The connectors include two DVI, one HDMI, and one DisplayPort. Rumors suggest that the latest NVIDIA cards will be capable of multi-display (>2) output from a single card, much like AMD cards have been doing for some time.
Not a whole lot is known with certainty about the upcoming GK104 "Kepler" GPUs, but we have reported on a few leaks, including that the cards will have 2 GB of GDDR5 RAM on a 256 bit memory bus and that they may be coming out in May. With Epic using a working Kepler GPU in their Samaritan demo, that launch date does not sound too far-fetched either. On the performance front, the rumors conflict: some state that the cards will blow AMD out of the water, while other people swear the cards will not be as powerful as the rumors suggest. I suppose we'll find out soon though!
Are you still waiting for NVIDIA's Kepler GPUs or have you jumped on the latest Radeon series?
Subject: General Tech | March 7, 2012 - 09:38 PM | Tim Verry
Tagged: unreal, udk, samaritan, nvidia, fxaa
Last year we saw Epic unveil their Samaritan demo, which showed off next generation gaming graphics using three NVIDIA GTX 580 graphics cards in SLI. Epic Games showed off realistic hair and cloth physics along with improved lighting, shadows, anti-aliasing, and more bokeh effects than gamers could shake a controller at. I have to say it was pretty impressive stuff a year ago, and it still is today. What makes this round special is that hardware has advanced enough that Samaritan-level graphics can be achieved in real time with a single graphics card, a big leap from last year's required three SLI'd NVIDIA GTX 580s!
The Samaritan demo was shown at this year's GDC 2012 (Game Developers Conference) running on a single NVIDIA "Kepler" graphics card in real time, which is pretty exciting. Epic did not state any further details on the upcoming NVIDIA graphics card; however, the knowledge that a single GPU was able to pull off what it took three Fermi cards to do certainly holds promise.
According to GeForce.com, however, it was not merely the NVIDIA Kepler GPU that made running the Samaritan demo on a single GPU possible. The article states that it was the inclusion of NVIDIA's anti-aliasing method known as FXAA, or Fast Approximate Anti-Aliasing, that enabled it. Unlike the popular MSAA option employed by many of today's games, FXAA uses much less memory, allowing a single graphics card to avoid being bogged down by memory thrashing. They further state that MSAA is not ideal for the Samaritan demo because the demo uses deferred shading to provide the "complex, realistic lighting effects that would be otherwise impossible using forward rendering," the method employed by many game engines. The downside is that the arguably better lighting in the Samaritan demo requires four times as much memory under MSAA: the GPU RAM needs to hold four samples per pixel, and the workload is magnified four times in areas of the game where there are multiple intersecting pieces of geometry.
FXAA vs MSAA
They go on to state that without AA turned on, the lighting in the Samaritan demo uses approximately 120 MB of GPU RAM, while with 4x MSAA turned on it uses about 500 MB. That's 500 MB of memory dedicated just to lighting, memory that could otherwise hold more of the level and its physics data, and it forces the GPU to swap more data than it should have to. FXAA, on the other hand, is a shader-based AA method that does not require additional memory, making it "much more performance friendly for deferred renderers such as Samaritan."
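Those two figures line up with the four-samples-per-pixel explanation. A quick back-of-the-envelope check (a sketch of the arithmetic, not GeForce.com's actual accounting):

```python
# Sanity check on the quoted figures: with 4x MSAA, the deferred-shading
# buffers hold four samples per pixel, so lighting memory roughly quadruples.
no_aa_mb = 120        # lighting memory without AA (figure from the article)
msaa_samples = 4      # 4x MSAA stores four samples per pixel

msaa_mb = no_aa_mb * msaa_samples
print(msaa_mb)        # 480 MB, in line with the ~500 MB the article quotes

# FXAA is a post-process shader pass, so it adds no per-sample storage:
fxaa_extra_mb = 0
print(no_aa_mb + fxaa_extra_mb)  # lighting stays at ~120 MB with FXAA
```

The small gap between 480 MB and the quoted ~500 MB would come from per-sample overhead beyond the lighting buffers themselves, which the article does not break down.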
Without anti-aliasing, the game world would look much more jagged and less realistic. AA seeks to smooth out those jagged edges, and FXAA enabled Epic to run their Samaritan demo on a single next generation NVIDIA graphics card. Pretty impressive if you ask me, and I'm excited to see game developers roll some of the Samaritan graphical effects into their games. Knowing that Epic Games' engine can run on a single graphics card means that future is that much closer. More information is available here, and if you have not already seen it, the Samaritan demo is shown in the video below.
Subject: Editorial, General Tech | February 29, 2012 - 03:51 PM | Ken Addison
Tagged: vengeance, tegra, podcast, nvidia, MWC, Intel, corsair, asus, amd
PC Perspective Podcast #191 - 02/29/2012
Join us this week as we talk about our ASUS AMD GPU Roundup, IMFT Flash, and tons of news from MWC!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
This Podcast is brought to you by
- 0:00:32 Introduction
- 1-888-38-PCPER or email@example.com
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 0:01:38 Contest Reminder - http://pcper.com/contest
- 0:03:35 SilverStone TJ08-E Micro-ATX Tower Enclosure Review
- 0:05:00 Asus DirectCU and DirectCU II for AMD: 6850, 6870, 6950, and 6970 Boards Under the Microscope
- 0:16:15 AMD Updates the FX Line: Some Thoughts on Future Moves for AMD
- 0:22:55 This Podcast is brought to you by
, and their all new Sandy Bridge Motherboards!
- 0:23:45 Intel Ivy Bridge delay is confirmed essentially
- 0:27:30 GPU sales look a little down in the month
- 0:31:13 Could this new research lead to light speed RAM?
- 0:33:45 Just Delivered: Corsair Vengeance K90, K60, M90, M60 Keyboards and Mice
- 0:36:30 Intel / Micron Flash Technology Venture Expands, Micron Assumes Two Plants
- 0:40:30 Qualcomm Shipped Most Smartphone and Tablet GPUs in 2011
- 0:42:20 MWC 12: Samsung to compete with Tegra on quad-core CPU
- 0:45:20 MWC 12: Huawei enters the mobile SoC world with quad-core K3V2
- 0:47:00 Nokia World's Largest Windows Phone OS Smartphone Vendor in Q4 2011
- 0:49:00 MWC 12: Intel branded, Atom-powered smartphone to be sold by Orange
- 0:51:30 MWC 12: ASUS Unveils Infinity Tablets, Dockable Smartphone
- 0:55:00 MWC 12: Lots of industry support for NVIDIA DirectTouch
- 0:56:55 MWC 12: NVIDIA Roadmap Outlines Tegra's Future Including 4G and Die Shrink
- Contest Reminder - http://pcper.com/contest
- 1:00:00 Hardware / Software Pick of the Week
- Ryan: A phone that shows pictures of hotels
- Jeremy: Audio steganography
- 1-888-38-PCPER or firstname.lastname@example.org
- http://twitter.com/ryanshrout and http://twitter.com/pcper
Subject: Mobile | February 28, 2012 - 12:43 PM | Matt Smith
Tagged: tegra 3, tablet, nvidia, MWC, mobile world congress, gaming
Nvidia has always been happy to show off the gaming capabilities of Tegra 2 and Tegra 3. No surprise there – the company’s roots are in PC graphics, which goes hand-in-hand with PC games. When we reviewed the Transformer Prime there were several beautiful pre-installed games, such as Shadowgun, that showed off the hardware. Now Nvidia has come up with not one but five new titles to brag about at Mobile World Congress – including a new Sonic the Hedgehog game.
Let’s have a look at them.
Dark Kingdom THD
This is a role-playing game built in the vein of games like Diablo and Torchlight. The visuals are impressive and a game like this should control well on a tablet, so this could prove to be an excellent title. I do wish the graphics offered more variety, however – they look a bit dark and dreary to me.
Eden to GREEEEN THD
Hey, are you tired of tower defense games yet? I hope not, because that’s what Eden to GREEEEN is, albeit one that has somewhat less restrictive mechanics than most titles in the genre. Of the games that Nvidia showed at Mobile World Congress this is easily the least attractive, but that doesn’t mean it won’t be fun to play.
Golden Arrow THD
This is a hack-and-slash title that apparently includes a lot of lens flare and explosions. I’m not sure that sounds appealing, but if you’re into that sort of thing, go for it. There’s no video available for this title right now, so we don’t have a lot to go on.
Hamilton’s Great Adventure THD
This game, which is already available on other platforms (you can even pick it up on Steam), is a puzzle game that asks players to navigate hazardous terrain on their quest to collect treasure and do other adventure-y things. The versions of this game that have come out for other platforms have received a fair amount of praise, so Nvidia is wise to be bragging about adding this to its stable.
Sonic The Hedgehog 4: Episode II
Nvidia has been bragging about Tegra 3’s console-quality graphics, so why not poach one of console gaming’s most popular franchises? Sonic The Hedgehog 4: Episode II looks to be a straight port of the version that will be available on game consoles, which means platforming at light speed with excellent 3D graphics. I sincerely hope that the controls are solid, because if so, this could be one of the best games on any tablet.
Availability for the titles shown by Nvidia is vague, but can be summed up as “spring.” The Sonic game, for example, will be coming out for Xbox 360 on May 16, so the tablet version will probably release sometime close to that date.
Subject: Mobile | February 27, 2012 - 10:53 AM | Tim Verry
Tagged: wayne, tegra 3, tegra, tablets, nvidia, MWC 12, mobile, grey
This year is a big one for smaller silicon manufacturing processes with Intel's 22nm Ivy Bridge, and NVIDIA and AMD moving to 28nm GPU processes. According to a report on VR-Zone, NVIDIA is already planning their next move for Tegra, including a die shrink to 28nm.
They managed to get their hands on a road map (shown below) for NVIDIA's Tegra SoC (system on a chip) lineup that extends into 2013. With Tegra 3, NVIDIA began by sampling the chip to ASUS for the Transformer Prime. After that success, other partners and devices are starting to pick up the mobile chip, and VR-Zone expects the situation to be the same for future Tegra iterations.
The company allegedly taped out the Tegra 4 (T40) SoC at the end of December 2011 and is starting to sample it to OEM partners, looking for someone to do a Tegra 3-like launch with one device/platform debuting first and others following in later months.
The Tegra 4 chip is code-named "Wayne" and will be composed of multiple ARM Cortex-A15 CPU cores and a new GPU on a 28nm process. The company also plans to show off a dual-core 28nm SoC that uses two Cortex-A15 CPU cores, a revamped GPU, and an Icera 4G LTE radio at Mobile World Congress 2013 next year.
Further, the roadmap details a new Tegra 3 chip that is intended to be used in Windows on ARM powered notebooks and tablets. While the new Tegra 3 (T35) SoC will not be a die shrink, it will have higher clock speeds due to fewer restrictions placed on the maximum TDP (thermal design power) allowed for the devices. VR-Zone estimates that the T35 chips will run somewhere between 1.6 GHz and 1.7 GHz.
Currently there are some incompatibilities between Tegra and 4G LTE radios, which has caused some LTE devices to go with Qualcomm SoCs instead, so it is good to see NVIDIA working on improving compatibility and then integrating the basebands into their future SoCs to rectify the issue.
As far as this year is concerned, we should see the updated Tegra 3 chip and a new version of Tegra 2 that integrates an Icera 4G LTE baseband. The Wayne and Grey chips will likely not be released until 2013 at the earliest. The expanded Tegra portfolio should help NVIDIA gain some market share, though exactly how much remains to be seen.
Subject: Mobile | February 26, 2012 - 01:35 PM | Tim Verry
Tagged: tegra 3, smartphone, quad core, nvidia, MWC 12, htc, Android
Earlier rumors suggested that the LG Optimus 4X HD would be the first quad core Tegra 3 powered smartphone; however, HTC and NVIDIA made an announcement today that shows that LG is not the only company showing off a Tegra 3 smartphone at Mobile World Congress 2012!
NVIDIA announced in a press release today that their Tegra 3 mobile SoC would be powering the new HTC One X smartphone which is to be shown off at MWC 2012. According to HTC, the phone features 1 GB of RAM and 16 GB of on-board storage to power the Android 4.0 Ice Cream Sandwich (ICS) mobile OS on a 4.7" display with 1280x720 resolution. It further includes an 8 MP (megapixel) rear camera capable of 1080p video with stereo sound, a VGA resolution front camera for video conferencing and claimed 720p video, 802.11 b/g/n Wi-Fi, support for LTE 4G networks, and a 1,800 mAh battery.
Kouji Kodera, Chief Product Officer at HTC, stated that "the HTC One X with Tegra 3 provides an experience that consumers will absolutely love." The HTC One X is the first smartphone that HTC and NVIDIA have worked together on.
The mobile market is advancing rather quickly, and Mobile World Congress 2012 is only just getting started! Stay tuned to PC Perspective for more MWC 2012 announcements!
For the full press release, hit the read more link:
Subject: Mobile | February 25, 2012 - 06:31 AM | Tim Verry
Tagged: ti, qualcomm, nvidia, mobile gpu, jpr, apple
The researchers over at Jon Peddie Research pushed out their results yesterday for shipments of mobile GPUs in SoC (system on a chip) platforms, and they found some interesting results. The article covers the number of shipments by the major players in the mobile device GPU space and uses those numbers to estimate the market share each company holds based on an average of the four quarterly shipment figures. Further, they found that from Q1 2011 to Q4 2011, the number of mobile device GPUs shipped by all manufacturers grew at an 18% CAGR (compound annual growth rate). That's a fairly impressive growth rate, and it shows the smartphone and tablet hardware market is continuing to grow steadily.
In terms of market share, at the end of 2011 Qualcomm was leading the pack with 31.4%, and the only other manufacturer to come close to that number was Apple with 22.7%. The little Adreno GPU by Qualcomm was obviously a popular choice last year!
To make things even more interesting, they note that although Qualcomm has the highest shipment numbers, it was Samsung who enjoyed the highest CAGR, at 39% (bringing them up from 9.2% market share in Q1 to 14.9% in Q4). Apple followed behind Samsung with a 26% CAGR. Finally, Qualcomm had the lowest percentage growth rate but maintained the highest number of shipments.
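For readers unfamiliar with the metric, CAGR is just the smoothed per-period growth rate implied by a start and end value. A quick sketch of the standard formula, illustrated with made-up shipment counts rather than JPR's actual data:

```python
def cagr(start_value, end_value, periods):
    """Compound annual growth rate: the constant per-period rate that
    takes start_value to end_value over the given number of periods."""
    return (end_value / start_value) ** (1.0 / periods) - 1.0

# Hypothetical shipment counts (millions of units), not JPR's figures:
print(round(cagr(100.0, 139.0, 1) * 100, 1))  # 39.0 -- one year at 39% growth
print(round(cagr(100.0, 196.0, 2) * 100, 1))  # 40.0 -- 96% total over 2 years
```

The compounding is why a 26% CAGR (Apple) and a 39% CAGR (Samsung) diverge quickly even from similar starting points.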
The table below shows off the relative market share for the major SoC mobile device manufacturers, as provided by Jon Peddie Research.
They further state that the mobile GPU war is really heating up, especially among Samsung, Apple, and Qualcomm, and I tend to agree. This corner of the technology market is booming, with mobile GPU SoCs released continuously and getting more powerful each iteration. It has a lot of competition and is growing rapidly, much like desktop computers did 10 to 20 years ago, when personal computers first became affordable and powerful enough to take over the world (market share wise, at least).
Another interesting point in the market share results is NVIDIA's shipments. With all the marketing behind the Tegra SoC and its popularity in high end smartphones and tablets, I was under the impression that the company held a lot more market share than it does; when I first saw the JPR chart, I did a double take to be sure I had read it correctly! It will be interesting to see how they do this year and whether they will start to see increased growth.
It will also be interesting to see if Samsung can catch up to Qualcomm and whether or not Qualcomm will still be the heavyweight champion at the end of 2012. Nvidia is still just breaking into this market, but they have a very powerful GPU, so we will be watching just how much they manage to grow this year. What are your thoughts on these numbers? How do you think things will unfold this year? Let us know in the comments below!
Subject: General Tech, Mobile, Shows and Expos | February 24, 2012 - 03:18 PM | Scott Michaud
Tagged: nvidia, DirectTouch, MWC 12
As a part of their Tegra 3 product, NVIDIA embedded the ability to offload some of the touchscreen processing onto the CPU. The offloading allows for increased power efficiency, by reducing the number of powered components, as well as increased touch responsiveness. Atmel, Cypress, and Synaptics are three leading touch-controller companies who join N-Trig, Raydium, and Focaltech in supporting the DirectTouch architecture.
Touchy subject, I know -- but...
Advancements in touch technology are definitely welcome, especially when the words power efficiency or responsiveness are involved. Both NVIDIA and Intel have been looking for ways to reduce the amount of electronics behind your phone or tablet; the less hardware required to do the most work, the better off we are. It is great to see NVIDIA taking the lead in innovation where it is needed most.
While I do not mean to rain upon NVIDIA’s bright blue skies, I must make a note. Despite the precision brought by the high sample rate, there does appear to be quite a bit of latency between where the demonstrator's finger is and where the touch is reported. I would be curious to see where that latency occurs.
Of course, this issue probably has nothing to do with NVIDIA. Videogames, particularly on consoles, are known to have input latencies approaching 100 ms because the input device does not influence the frames being rendered often enough. The latency could come from the touch device itself, from the software, from the operating system, or from something else entirely.
We do not know where the latency occurs, but I expect that whoever crushes it will have a throne awaiting them somewhere in Silicon Valley.