Subject: Mobile | May 10, 2013 - 03:28 PM | Ryan Shrout
Tagged: tegra 4, tegra, shield, project shield, nvidia
After the initial announcement at CES in January, NVIDIA has been trying hard to keep excitement and interest in Project Shield going. The upcoming Tegra 4-powered, Android-based mobile gaming machine will launch sometime in the summer; either Computex or E3 would be perfect timing.
NVIDIA passed us a photo of the mold for the casing of Project Shield and though you don't really get any awesome new information out of it, I thought I would share.
The photo you see below shows the production mold that's used to craft the ergonomic casing that houses Project SHIELD's high-powered components: Tegra 4, 5-inch 720p HD retinal touchscreen, Stereo Bass Reflex Speakers, WiFi, accelerometer, gyro, a massive battery, and more.
To create the casing, we inject a polycarbonate material into the RHCM (Rapid Heat Cycle Molding) tool at 10,800 PSI and 300 degrees Celsius. We use a polycarbonate mixture comprised of 90% Sabic 500ECR-739 PC and 10% glass. This material and injection molding process ensures a sturdy yet lightweight casing that will deliver hours of gaming with no fatigue.
In case you are behind on what Project Shield is, you should check out the hands-on video we made during our time with the device last January.
What do you think...are you excited about the launch of this device? Do any of its features really make you want to buy it once available?
Subject: General Tech | May 9, 2013 - 07:50 PM | Tim Verry
Tagged: tegra 4, nvidia, grid, financial results
NVIDIA has released the results of its first fiscal quarter of 2014. Overall, NVIDIA had a positive first quarter with total revenue of $954.7 million and a net income of $77.9 million. During Q1 2014 the company announced its Grid VCA for enterprise customers and Tegra 4 and Tegra 4i for the mobile market. NVIDIA’s shareholders saw an Earnings Per Share (EPS) of 13 cents, which is up 30% versus the same quarter last year. Interestingly, NVIDIA has announced that it will be returning $1 billion to shareholders through increased dividends and buying back shares.
Q1 2014 is an interesting quarter: it is up year over year, but down significantly versus the previous quarter (Q4’13). NVIDIA’s Q1’14 revenue of $954.7 million is up 3.2% YOY from $924.9 million in Q1’13, but down 13.7% from $1.1 billion in the previous quarter. The dip is likely attributable to the fact that Q1’14 is the quarter after the holiday rush at the end of Q4. Considering revenue is still up versus last year, the sequential dip shouldn’t be taken as a bad sign. Net income follows a similar pattern, down 55.2% versus last quarter’s $174 million, but up 29% YOY (Q1’13 net income was $60.9 million).
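As a quick sanity check on those percentages, the year-over-year and sequential changes can be recomputed from the figures quoted above (the exact Q4’13 revenue is an assumption here; the text only gives "$1.1 billion"):

```python
def pct_change(new, old):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Figures quoted in the paragraph above (millions of USD)
q1_14_revenue = 954.7
q1_13_revenue = 924.9
q4_13_revenue = 1106.9  # assumed exact value behind the "$1.1 billion" quoted above

print(f"Revenue YoY: {pct_change(q1_14_revenue, q1_13_revenue):+.1f}%")  # +3.2%
print(f"Revenue QoQ: {pct_change(q1_14_revenue, q4_13_revenue):+.1f}%")  # a ~13.7% sequential decline
```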
The financial results indicate that NVIDIA is continuing to grow and remain profitable. According to NVIDIA, the company expects operating expenses and revenue to increase in Q2’14 to approximately $448 million and $975 million, respectively. Further, NVIDIA expects growth to continue throughout 2014 as it launches new Tegra 4(i) SoCs and expands its server and business offerings with its GRID technologies.
You can find NVIDIA's full financial report on the company's website.
Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM | Ryan Shrout
Tagged: video, nvidia, live, frame rating, fcat
Update: Did you miss the live stream? Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!!
I know, based solely on the amount of traffic and forum discussion, that our readers have really adopted and accepted our Frame Rating graphics testing methodology. Based on direct capture of the GPU output via an external system and a high-end capture card, our new setup has helped users see GPU performance in a more "real-world" light than previous benchmarks would allow.
I also know that there are lots of questions about the process, the technology and the results we have shown. In order to try and address these questions and to facilitate new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.
Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as just talk about the science of animation and input.
The primary part of this live stream will be about education - not about bashing one particular product line or talking up another. And part of that education is your ability to interact with us live, ask questions and give feedback. During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience. The easiest way to get your question addressed, though, is to leave a comment or inquiry here in this post below. It doesn't require registration, and it allows us to think about the questions beforehand, giving them a better chance of being answered during the stream.
Frame Rating and FCAT Live Stream
11am PT / 2pm ET - May 9th
So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!
Zotac has announced a new GTX TITAN graphics card that will fall under the company’s AMP! Edition branding. This new Titan graphics card will feature factory overclocks on both the GPU and GDDR5 memory. However, due to NVIDIA’s restrictions, the Zotac GeForce GTX TITAN AMP! Edition does not feature a custom cooler or PCB.
The Zotac TITAN AMP! Edition card features a single GK110 GPU with 2,688 CUDA cores clocked at 902MHz base and 954MHz boost. That is a healthy bump over the reference TITAN’s 836MHz base and 876MHz boost clock speeds. Further, while Zotac’s take on the TITAN keeps the reference specification’s 6GB of GDDR5 memory, it is impressively overclocked to 6,608 MHz (especially since Zotac has not changed the cooler). Many reference cards may be able to match the GPU clocks, though; for example, Ryan managed to get his card up to a 992MHz boost in his review of the NVIDIA GTX TITAN.
The card has two DL-DVI, one HDMI, and one DisplayPort video outputs. The cooler, PCB, and PCI-E power specifications are the same as the reference design; you can find more details on the heatsink in the TITAN review. Not allowing vendors to use custom coolers is disappointing and possibly limits the factory GPU overclocks they are able/willing to offer and support, but within that restriction the Zotac AMP! Edition looks to be a decent card, so long as the (not yet announced) price premium over the $999 NVIDIA reference card is minimal.
Our 4K Testing Methods
You may have recently seen a story and video on PC Perspective about a new TV that made its way into the office. Of particular interest is the fact that the SEIKI SE50UY04 50-in TV is a 4K television; it has a native resolution of 3840x2160. For those unfamiliar with the upcoming TV and display standards, 3840x2160 is exactly four times the resolution of current 1080p TVs and displays. Oh, and this TV only cost us $1300.
In that short preview we validated that both NVIDIA and AMD current-generation graphics cards can output to this TV at 3840x2160 over an HDMI cable. You might be surprised to find that HDMI 1.4 can support 4K resolutions, but it can do so only at 30 Hz (60 Hz 4K TVs most likely won't be available until 2014), half the refresh rate of most 60 Hz TVs and monitors. That doesn't mean we are limited to 30 FPS of performance though, far from it. As you'll see in our testing on the coming pages, we were able to push out much higher frame rates using some very high end graphics solutions.
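The 30 Hz ceiling falls out of HDMI 1.4's bandwidth budget. Here is a rough back-of-the-envelope check (assuming 24-bit color and ignoring blanking overhead, so the real-world requirement is somewhat higher; the 8.16 Gbps figure is HDMI 1.4's 10.2 Gbps TMDS throughput after 8b/10b encoding):

```python
def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbps (ignores blanking intervals)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_14_DATA_RATE = 8.16  # Gbps of usable video data on HDMI 1.4

for hz in (30, 60):
    rate = video_bitrate_gbps(3840, 2160, hz)
    verdict = "fits" if rate <= HDMI_14_DATA_RATE else "exceeds"
    print(f"3840x2160 @ {hz} Hz needs ~{rate:.2f} Gbps -> {verdict} HDMI 1.4")
```

At 30 Hz the roughly 6 Gbps payload fits comfortably; doubling the refresh to 60 Hz doubles the requirement and blows past the link, which is why 60 Hz 4K has to wait for newer display interfaces.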
I should point out that I am not a TV reviewer and I don't claim to be one, so I'll leave the technical merits of the display itself to others. Instead I will only report on my experiences with it while using Windows and playing games - it's pretty freaking awesome. The only downside I have found in my time with the TV as a gaming monitor thus far is the combination of the 30 Hz refresh rate and disabled Vsync. Because you are seeing fewer screen refreshes over the same amount of time than you would with a 60 Hz panel, all else being equal, you are getting twice as many "frames" of the game pushed to the monitor each refresh cycle. This means the horizontal tearing associated with disabling Vsync will likely be more apparent than it would otherwise be.
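To put a number on that: with Vsync off, the average count of distinct game frames scanned out per refresh is simply the frame rate divided by the refresh rate. A tiny illustration (the 90 FPS figure is a hypothetical render rate, not a measured result):

```python
def frames_per_refresh(fps, refresh_hz):
    """Average number of distinct game frames shown per refresh cycle."""
    return fps / refresh_hz

fps = 90  # hypothetical render rate with Vsync disabled
print(frames_per_refresh(fps, 60))  # 1.5 frames per refresh on a 60 Hz panel
print(frames_per_refresh(fps, 30))  # 3.0 frames per refresh at 30 Hz - twice the tear lines
```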
I would likely recommend enabling Vsync for a tear-free experience on this TV once you are happy with performance levels, but obviously for our testing we wanted to keep it off to gauge performance of these graphics cards.
Subject: General Tech | April 25, 2013 - 02:13 PM | Ken Addison
Tagged: video, Xe, seiki, raidr, podcast, nvidia, Never Settle, hd 7990, GA-Z77N-WiFi, frame rating, crossfire, amd, 4k
PC Perspective Podcast #248 - 04/25/2013
Join us this week as we discuss AMD HD 7990, CrossFire Frame Rating improvements, 4K TVs and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:16:34
0:01:20 Update on Indiegogo: You guys rock!
Week in Review:
0:06:30 AMD Radeon HD 7990 6GB Review
News items of interest:
1-888-38-PCPER or email@example.com
Subject: General Tech | April 24, 2013 - 01:38 PM | Jeremy Hellstrom
Tagged: Steve Scott, nvidia, HPC, tesla, logan, tegra
The Register had a chance to sit down with Steve Scott, once CTO of Cray and now CTO of NVIDIA's Tesla projects, to discuss the future of their add-in cards as well as that of x86 in the server room. They discussed Tegra and why it is not receiving the same amount of attention at NVIDIA as Tesla is, as well as some of the fundamental differences between the chips, both currently and going forward. NVIDIA plans to unite GPU and CPU onto both families of chips, likely with a custom interface as opposed to placing them on the same die, though the two will continue to be designed for very different functions. A lot of the article focuses on Tegra, its memory bandwidth and, most importantly, its networking capabilities, as it seems NVIDIA is focused on the server room and on providing hundreds or thousands of interconnected Tegra processors to compete directly with x86 offerings. Read on for the full interview.
"Jen-Hsun Huang, co-founder and CEO of Nvidia has been perfectly honest about the fact that the graphics chip maker didn't intend to get into the supercomputing business. Rather, it was founded by a bunch of gamers who wanted better graphics cards to play 3D games. Fast forward two decades, though, and the Nvidia Tesla GPU coprocessor and the CUDA programming environment have taken the supercomputer world by storm."
Here is some more Tech News from around the web:
- AMD pins future growth to embedded marketplace @ The Register
- AMD announces new embedded G-series SoC @ DigiTimes
- TSMC captures almost 50 percent of foundry market thanks to 28nm demand @ The Inquirer
- $45 BeagleBone Black Keeps Eyes on the Pi's @ Linux.com
- BlackBerry OS 10.1 leaks its secret goo over all the web @ The Register
- Samsung MV900F Wi-Fi 16.3MP Digital Camera Review @ ModSynergy
- i’m Watch: A Smartwatch Review @ TechwareLabs
A very early look at the future of Catalyst
Today is a very interesting day for AMD. It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to the CrossFire technology that will improve animation performance across the board. Both stories are incredibly interesting and as it turns out both feed off of each other in a very important way: the HD 7990 depends on CrossFire and CrossFire depends on this driver.
If you have already read our review (or any review using the FCAT / frame capture system) of the Radeon HD 7990, you likely came away somewhat unimpressed. The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have been an incredibly exciting release for us and would likely have produced the single fastest graphics card on the planet. That didn't happen though, and our results clearly show why: AMD CrossFire technology has some serious issues with animation smoothness, runt frames and giving users what they are promised.
Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February. Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what implementing frame metering technology would mean for gamers. We followed that story up with several more that showed the current state of the GPU market using Frame Rating and painted CrossFire in a very negative light. Even though some outlets accused us of being biased or claimed that AMD wasn't doing anything incorrectly, we stuck by our results, and as it turns out, so does AMD.
Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered.
If you are just catching up on the story, you really need some background information. The best place to start is our article published in late March that goes into detail about how game engines work, how our completely new testing methods work and the problems with AMD CrossFire technology very specifically. From that piece:
It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand. As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, and in many cases in a nearly perfectly alternating pattern. Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case. Obviously the question then becomes, “In Battlefield 3, does it even make sense to use a CrossFire configuration?” My answer, based on the graph below, would be no.
An example of a runt frame in a CrossFire configuration
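For readers curious how a runt is actually flagged: the capture-based analysis measures how many scanlines each frame occupies on screen and counts frames below a small threshold as runts rather than as delivered performance. A minimal sketch of that classification follows; the 21-scanline cutoff matches the commonly cited FCAT default, and the frame heights are made-up illustration data:

```python
RUNT_THRESHOLD = 21  # scanlines; frames shorter than this are counted as runts

def classify_frames(scanline_heights, threshold=RUNT_THRESHOLD):
    """Split captured per-frame scanline counts into full frames and runts."""
    full = [h for h in scanline_heights if h >= threshold]
    runts = [h for h in scanline_heights if h < threshold]
    return full, runts

# Alternating tall/short pattern like the one described above (illustrative only)
heights = [540, 8, 532, 11, 545, 6, 538, 9]
full, runts = classify_frames(heights)
print(f"{len(runts)} of {len(heights)} frames are runts")  # 4 of 8
```

Discarding the runts before computing FPS is what separates an inflated raw frame count from the "observed" frame rate our Frame Rating articles report.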
NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enabled smoother, more consistent frame times and thus smoother animations on the screen. For GeForce cards, frame metering began as a software solution but was actually integrated as a hardware function on the Fermi design, taking some load off of the driver.
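Conceptually, frame metering holds each finished frame until roughly an average frame interval has passed since the previous one was shown, trading a tiny bit of latency for even pacing. Here is a toy sketch of the idea - not NVIDIA's actual driver or hardware logic; a real implementation estimates the average interval on the fly, while this sketch cheats by averaging over the whole sample:

```python
def meter_frames(render_times_ms):
    """Even out presentation times: hold each frame until the average
    interval has elapsed since the previous present (but never present
    a frame before it has actually finished rendering)."""
    avg = (render_times_ms[-1] - render_times_ms[0]) / (len(render_times_ms) - 1)
    presents = [render_times_ms[0]]
    for t in render_times_ms[1:]:
        presents.append(max(t, presents[-1] + avg))
    return presents

# Bursty completion times (ms): frames arriving in near-simultaneous pairs
raw = [0.0, 2.0, 33.0, 35.0, 66.0, 68.0]
metered = meter_frames(raw)
print([round(b - a, 1) for a, b in zip(metered, metered[1:])])
# [13.6, 19.4, 13.6, 19.4, 13.6] - far steadier than the raw 2/31 ms alternation
```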
New GeForce Game-Ready Drivers Just in Time for 'Dead Island: Riptide,' 'Star Trek', 'Neverwinter'; Boost Performance up to 20%
Subject: Graphics Cards | April 23, 2013 - 03:53 PM | Jeremy Hellstrom
Tagged: nvidia, graphics drivers, geforce, 320.00 beta
GeForce 320.00 beta drivers are now available for automatic download and installation using GeForce Experience, the easiest way to keep your drivers up to date.
With a single click in GeForce Experience, gamers can also optimize the image quality of top new games like Dead Island: Riptide and have it instantly tuned to take full advantage of their PC’s hardware.
Here are examples of the performance increases in GeForce 320.00 drivers (measured with GeForce GTX 660):
- Up to 20% in Dirt: Showdown
- Up to 18% in Tomb Raider
- Up to 8% in StarCraft II
- Up to 6% in other top games like Far Cry 3
For more details, refer to the release highlights on the driver download pages and read the GeForce driver article on GeForce.com.
Enjoy the new GeForce Game Ready drivers and let us know what you think.
Windows Vista/Windows 7 Fixed Issues
The Windows 7 Magnifier window flickers. 
Games default to stereoscopic 3D mode after installing the driver. 
[GeForce 330M][Notebook]: The display goes blank when rebooting the notebook after installing the driver. 
[Crysis 3]: There are black artifacts in the game. 
[Dirt 3]: When ambient occlusion is enabled, there is rendering corruption in the game while in split-screen mode. 
[3DTV Play][Mass Effect]: The NVIDIA Control Panel “override antialiasing” setting does not work when stereoscopic 3D is enabled. 
[Microsoft Flight Simulator]: Level D Simulations add-on aircraft gauges are not drawn correctly. 
[GeForce 500 series][Stereoscopic 3D][Two Worlds 2]: The application crashes when switching to windowed mode with stereoscopic 3D enabled. 
[GeForce 660 Ti][All Points Bulletin (APB) Reloaded]: The game crashes occasionally, followed by a black/grey/red screen. 
[GeForce GTX 680][Red Orchestra 2: Heroes of Stalingrad]: Red-screen crash occurs after exiting the game. 
[GeForce 6 series][Final Fantasy XI]: TDR crash occurs in the game when using the Smite of Rage ability. 
[SLI][Surround][GeForce GTX Titan][Tomb Raider]: There is corruption in the game and the system hangs when played at high resolution and Ultra or Ultimate settings. 
[3D Surround, SLI][GeForce 500 Series]: With Surround enabled, all displays may not be activated when selecting Activate All Displays from the NVIDIA Control Panel -> Set SLI Configuration page. 
[SLI][Starcraft II][3D Vision]: The game crashes when run with 3D Vision enabled. 
[SLI][GeForce GTX 680][Tomb Raider (2013)]: The game crashes and TDR occurs while running the game at Ultra settings. 
[SLI][Starcraft II][3D Vision]: The game crashes when played with 3D Vision and SLI enabled. 
[SLI][Call of Duty: Black Ops 2]: The player emblems are not drawn correctly.
Subject: Graphics Cards | April 16, 2013 - 10:24 PM | Ryan Shrout
Tagged: nvidia, metro last light, Metro
Late this evening we got word from NVIDIA about an update to its game bundle program for GeForce GTX 600 series cards. Replacing the previously running Free to Play bundle, which included $75 of in-game credit split across World of Tanks, Hawken and Planetside 2, NVIDIA is moving back to a AAA game with Metro: Last Light.
Metro: Last Light is the sequel to the surprise hit from 2010, Metro 2033, and I am personally really looking forward to the game and to seeing how it can stress PC hardware like the first one did.
This bundle is only good for GTX 660 cards and above with the GTX 650 Ti sticking with the Free to Play $75 credit offer.
NVIDIA today announced that gamers who purchase an NVIDIA GeForce GTX 660 or above will also receive a copy of the highly anticipated Metro: Last Light, published by Deep Silver and the sequel to the multi-award-winning Metro 2033. Metro: Last Light will be available May 14, 2013 in the US and May 17, 2013 across Europe.
The deal is already up and running on Newegg.com but with the release date of Metro: Last Light set at May 14th, you'll have just about a month to wait before you can get your hands on it.
How do you think this compares to AMD's currently running bundle with Bioshock Infinite and more? Did NVIDIA step up its game this time around?