Subject: General Tech, Mobile | May 14, 2013 - 09:06 AM | Ryan Shrout
Tagged: tegra 4, tegra, shield, project shield, nvidia
Solid information about the NVIDIA Shield (no longer called Project Shield) is finally becoming available with a blog post written up today on NVIDIA's website. The company will begin accepting pre-orders from users who have previously signed up for the Shield mailing list, while the rest of you will have to wait until May 20th to plop down your money.
The cost? $349. Newegg, Gamestop, Micro Center and Canada Computers will carry it.
If you want to sign up for the official June release of the Tegra 4-powered mobile Android gaming device, you'll have to head over to shield.nvidia.com.
NVIDIA does point out in the blog that the PC game streaming feature, which I truly believe is the one thing that makes Shield a compelling gaming device, will be launching as a beta feature.
And GeForce game streaming, launching as a beta feature, will give SHIELD the power to access your NVIDIA GeForce GTX GPU-powered computer from the comfort of your couch. We’re working on streaming your favorite PC games to SHIELD, including great titles from Steam.
High level features of the device, for those of you that are unaware, include:
- Tegra 4 – The world’s fastest mobile processor delivers rich graphics and unbeatable performance thanks to 72 GPU cores, four CPU cores and 2GB of RAM
- Console-grade controller – Precise control thanks to dual analog joysticks, a full-sized D-Pad, left and right analog triggers, full-sized bumpers and A/B/X/Y buttons
- Multi-touch display – 5-inch, 720p retinal multi-touch display for high-fidelity visuals
- Integrated speakers – Custom, bass reflex, tuned port audio system – we think this is SHIELD’s sleeper feature
- Wi-Fi – 802.11n 2X2 MIMO game-speed Wi-Fi for seamless game streaming
- Pure Android – Latest Android Jelly Bean operating system from Google, for access to Android games and apps
- There’s more – We put into SHIELD everything we would want in a premium mobile gaming device: 16 GB memory, GPS, Bluetooth 3.0, a mini-HDMI output, micro-USB 2.0, a microSD storage slot, a 3.5-mm stereo headphone jack. See the full spec sheet, here.
We took a look at the NVIDIA Shield device at CES this year and posted a video of our experiences, so check it out below.
NVIDIA has also posted a separate blog that talks about some of the upcoming Android games that will highlight the power of the Tegra 4 mobile processor including Broken Age and Costume Quest from Double Fine, Chuck's Challenge from Niffler and more.
I think many people at NVIDIA, as well as in the media, are very curious to see what the reaction to Shield will actually be upon its release. I am very excited to test it out in real-world, long-term usage models, but I definitely have doubts about the market's desire for another mobile gaming platform.
Leave me your thoughts in the comments below!
Subject: Graphics Cards | May 10, 2013 - 07:25 PM | Jeremy Hellstrom
Tagged: titan, radeon hd 7990, nvidia, amd
If you have been wondering how the two flagship GPUs fare in a battle royale of pure frame rates, you can satisfy your curiosity at [H]ard|OCP. They have tested both NVIDIA's TITAN and the finally released HD 7990 in one of their latest reviews. Both cards were forced to push out pixels at 5760x1200 and for the most part tied, which makes sense as they both cost $1000. The real winner was a pair of HD 7970s in CrossFire, which kept up with the big guns but cost $200 less to purchase.
If that isn't extreme enough for you, they also overclocked the TITAN in a separate review.
"We follow-up with a look at how the $999 GeForce GTX TITAN compares to the new $999 AMD Radeon HD 7990 video card. What makes this is unique is that the GeForce GTX TITAN is a single-GPU running three displays in NV Surround compared to the same priced dual-GPU CrossFire on a card Radeon HD 7990 in Eyefinity."
Here are some more Graphics Card articles from around the web:
- AMD's Radeon HD 7990 @ The Tech Report
- Diamond BV750 Low Profile 7750 @ Bjorn3D
- XFX Radeon HD 7790 Black Edition 1GB Video Card Review @ Madshrimps
- AMD Radeon HD 7790 2GB review: does another 1GB make a difference @ Hardware.info
- PowerColor HD 7790 Turbo Duo 1GB Video Card Review @ Hi Tech Legion
- AMD Radeon HD 7790 vs. Nvidia GeForce GTX 650 Ti BOOST @ X-bit Labs
- Sapphire HD7790 2GB OC @ Kitguru
- XFX R7790 Black Edition 1GB Review @ Neoseeker
- PowerColor Radeon HD 7790 TurboDuo OC 1GB @ eTeknix
- AMD Radeon HD 7990 6GB and HD 7970 GHz Edition Video Cards in CrossFireX @ Tweaktown
- AMD Radeon HD 7990 6GB Dual GPU Video Card Overclocked @ Tweaktown
- 15-Way Open vs. Closed Source NVIDIA/AMD Linux GPU Comparison @ Phoronix
- ASUS GTX650-E-2GD5 @ Hardware.info
- EVGA GeForce GTX TITAN 6GB SuperClocked Video Cards in SLI Overclocked @ Tweaktown
- Inno3D iChill GTX 650 Ti Boost 2 GB @ techPowerUp
- EVGA GeForce GTX 650 Ti Boost SuperClocked 2GB @ Tweaktown
- ZOTAC GTX Titan AMP! Edition @ Bjorn3D
- ASUS GTX 650 Ti Boost DCII @ Bjorn3D
- Asus GTX 670 DirectCU Mini 2GB @ eTeknix
Subject: General Tech | May 10, 2013 - 05:21 PM | Jeremy Hellstrom
Tagged: earnings, Q1 2013, nvidia, jen-hsun huang
NVIDIA seems to have completely ignored the economic downturn that has affected so many tech companies, posting gains in both revenue and profit for its first quarter of fiscal 2014. The entire PC market may have shrunk by 10%, but NVIDIA's profits were up 16.7% compared to 12 months ago, though when looking at GPU sales alone they did see about a 5% decline. Now that NVIDIA has branched out into mobility and HPC, however, their total sales are up by 3%. The Register postulates that part of the reason their sales did not decline as much as other manufacturers' is their focus on high end GPUs, which are largely immune to the erosion caused by sales of mobile devices such as tablets. Get the whole set of numbers here.
"In the first quarter of fiscal 2014 ended on April 28, Nvidia's overall sales rose by 3.2 per cent to $954.7m. Big Green was able pull $77.9m to the bottom line, up 16.7 per cent compared to the year-ago period – even while investing in a substantial bump-up in research and development costs – thanks to a shift to higher margin products in both the discrete graphics and Tesla GPU coprocessor lines."
Here is some more Tech News from around the web:
- Microsoft plasters IE8 hole abused in nuke lab PC meltdown @ The Register
- Tripping on microdoses of Dyad @ The Tech Report
- Power2U AC/USB wall outlet @ LanOC Reviews
- Plugging into the Puzzle @ Techgage
- Casio G-SHOCK GA-110-1AER Watch @ NikKTech
- 8 Free to Play Games That Are Too Good to Be True @ Techspot
- ModRight Xtreme Super Large Anti-Static Mod-Mat @ Modders-Inc
Subject: Mobile | May 10, 2013 - 03:28 PM | Ryan Shrout
Tagged: tegra 4, tegra, shield, project shield, nvidia
After the initial announcement at CES in January, NVIDIA has been trying hard to keep excitement and interest about Project Shield going. The upcoming Tegra 4-powered mobile Android-based gaming machine will be launched sometime in the summer; both Computex and E3 would make perfect timing.
NVIDIA passed us a photo of the mold for the casing of Project Shield and though you don't really get any awesome new information out of it, I thought I would share.
The photo you see below shows the production mold that's used to craft the ergonomic casing that houses Project SHIELD's high-powered components: Tegra 4, 5-inch 720p HD retinal touchscreen, Stereo Bass Reflex Speakers, WiFi, accelerometer, gyro, a massive battery, and more.
To create the casing, we inject a polycarbonate material into the RHCM (Rapid Heat Cycle Molding) tool at 10,800 PSI and 300 degrees Celsius. We use a polycarbonate mixture comprised of 90% Sabic 500ECR-739 PC and 10% glass. This material and injection molding process ensures a sturdy yet lightweight casing that will deliver hours of gaming with no fatigue.
In case you are behind on what Project Shield is, you should check out the hands-on video we made during our time with the device last January.
What do you think...are you excited about the launch of this device? Do any of its features really make you want to buy it once available?
Subject: General Tech | May 9, 2013 - 07:50 PM | Tim Verry
Tagged: tegra 4, nvidia, grid, financial results
NVIDIA has released the results of its first fiscal quarter of 2014. Overall, NVIDIA had a positive first quarter with total revenue of $954.7 million and a net income of $77.9 million. During Q1 2014 the company announced its Grid VCA for enterprise customers and Tegra 4 and Tegra 4i for the mobile market. NVIDIA’s shareholders saw an Earnings Per Share (EPS) of 13 cents, which is up 30% versus the same quarter last year. Interestingly, NVIDIA has announced that it will be returning $1 billion to shareholders through increased dividends and buying back shares.
Q1 2014 is an interesting quarter, as it is up year over year but down significantly versus the previous quarter (Q4’13). NVIDIA’s Q1’14 revenue of $954.7 million is up 3.2% from $924.9 million in Q1’13, but down 13.7% from $1.1 billion in the previous quarter. The dip is likely attributable to the fact that Q1’14 is the quarter after the holiday rush at the end of Q4. Considering revenue is still up versus last year, the dip versus last quarter shouldn’t be taken as a bad sign. Net income follows a similar pattern: down about 55% versus last quarter’s $174 million, but up roughly 28% YOY (Q1’13 net income was $60.9 million).
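The year-over-year and quarter-over-quarter swings above are simple percent changes over the figures in the text; a quick sketch (the precise Q4'13 revenue of $1,106.9 million is NVIDIA's reported number, assumed here since the text rounds it to $1.1 billion):

```python
def pct_change(new, old):
    """Percent change from old to new, in percent."""
    return (new - old) / old * 100

# Figures in millions of dollars, from the quarter summaries above.
rev_yoy = pct_change(954.7, 924.9)    # revenue vs. Q1'13, roughly +3.2%
rev_qoq = pct_change(954.7, 1106.9)   # revenue vs. Q4'13, roughly -13.7%
inc_yoy = pct_change(77.9, 60.9)      # net income vs. Q1'13, roughly +28%
inc_qoq = pct_change(77.9, 174.0)     # net income vs. Q4'13, roughly -55%

print(f"Revenue: {rev_yoy:+.1f}% YoY, {rev_qoq:+.1f}% QoQ")
print(f"Net income: {inc_yoy:+.1f}% YoY, {inc_qoq:+.1f}% QoQ")
```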
The financial results seem to indicate that NVIDIA is continuing to grow and remain profitable. According to NVIDIA, the company expects operating expenses and revenue to increase in Q2’14 to $448 million and approximately $975 million, respectively. Further, NVIDIA expects growth to continue throughout 2014 as it launches new Tegra 4(i) SoCs and expands its server/business offerings with its GRID technologies.
You can find NVIDIA's full financial report on the company's website.
Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM | Ryan Shrout
Tagged: video, nvidia, live, frame rating, fcat
Update: Did you miss the live stream? Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!
I know, based solely on the amount of traffic and forum discussion, that our readers have really adopted and accepted our Frame Rating graphics testing methodology. Based on direct capture of GPU output via an external system and a high end capture card, our new methods have helped users see GPU performance in a more "real-world" light than previous benchmarks would allow.
I also know that there are lots of questions about the process, the technology and the results we have shown. In order to try and address these questions and to facilitate new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.
Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as just talk about the science of animation and input.
The primary part of this live stream will be about education - not about bashing one particular product line or talking up another. And part of that education is your ability to interact with us live, ask questions and give feedback. During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience. The easiest way to get your question addressed, though, will be to leave a comment or inquiry here in this post below. It doesn't require registration, and it will allow us to think about the questions beforehand, giving them a better chance of being answered during the stream.
Frame Rating and FCAT Live Stream
11am PT / 2pm ET - May 9th
So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!
Zotac has announced a new GTX TITAN graphics card that will fall under the company’s AMP! Edition branding. This new Titan graphics card will feature factory overclocks on both the GPU and GDDR5 memory. However, due to NVIDIA’s restrictions, the Zotac GeForce GTX TITAN AMP! Edition does not feature a custom cooler or PCB.
The Zotac TITAN AMP! Edition card features a single GK110 GPU with 2,688 CUDA cores clocked at 902 MHz base and 954 MHz boost. That is a healthy bump over the reference TITAN’s 836 MHz base and 876 MHz boost clock speeds. Further, while Zotac’s take on the TITAN keeps the reference specification’s 6GB of GDDR5 memory, it is impressively overclocked to 6,608 MHz (especially since Zotac has not changed the cooler). Many reference cards may be able to match the GPU clocks, though; for example, Ryan managed to get his card up to a 992 MHz boost in his review of the NVIDIA GTX TITAN.
The card has two DL-DVI, one HDMI, and one DisplayPort video outputs. The cooler, PCB, and PCI-E power specifications are the same as the reference design; you can find more details on the heatsink in the TITAN review. Not allowing vendors to use custom coolers is disappointing and possibly limits the factory GPU overclocks they are able/willing to offer and support, but within that restriction the Zotac AMP! Edition looks to be a decent card, so long as the (not yet announced) price premium over the $999 NVIDIA reference card is minimal.
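To put Zotac's factory overclocks in perspective, here is a quick back-of-the-envelope comparison against the reference TITAN clocks quoted above (the 6,008 MHz effective reference memory clock is the stock TITAN spec, assumed here since the text doesn't list it):

```python
def oc_gain(oc_mhz, ref_mhz):
    """Factory overclock expressed as a percent gain over reference."""
    return (oc_mhz - ref_mhz) / ref_mhz * 100

base  = oc_gain(902, 836)     # base clock gain, roughly +7.9%
boost = oc_gain(954, 876)     # boost clock gain, roughly +8.9%
mem   = oc_gain(6608, 6008)   # memory clock gain, roughly +10%

print(f"Base: +{base:.1f}%, Boost: +{boost:.1f}%, Memory: +{mem:.1f}%")
```

The roughly 10% memory overclock is the standout figure given the unchanged reference cooler.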
Our 4K Testing Methods
You may have recently seen a story and video on PC Perspective about a new TV that made its way into the office. Of particular interest is the fact that the SEIKI SE50UY04 50-in TV is a 4K television; it has a native resolution of 3840x2160. For those unfamiliar with the upcoming TV and display standards, 3840x2160 is exactly four times the resolution of current 1080p TVs and displays. Oh, and this TV only cost us $1300.
In that short preview we validated that both NVIDIA and AMD current generation graphics cards support output to this TV at 3840x2160 using an HDMI cable. You might be surprised to find that HDMI 1.4 can support 4K resolutions, but it can do so only at 30 Hz, half the refresh rate of most TVs and monitors (60 Hz 4K TVs most likely won't be available until 2014). That doesn't mean we are limited to 30 FPS of performance though, far from it. As you'll see in our testing on the coming pages, we were able to push out much higher frame rates using some very high end graphics solutions.
I should point out that I am not a TV reviewer and I don't claim to be one, so I'll leave the technical merits of the monitor itself to others. Instead I will only report on my experiences with it while using Windows and playing games - it's pretty freaking awesome. The only downside I have found in my time with the TV as a gaming monitor thus far is the combination of the 30 Hz refresh rate and disabled Vsync. Because you are seeing fewer screen refreshes over the same amount of time than you would with a 60 Hz panel, all else being equal, you are getting twice as many "frames" of the game pushed to the monitor each refresh cycle. This means the horizontal tearing associated with disabling Vsync will likely be more apparent than it would be otherwise.
I would likely recommend enabling Vsync for a tear-free experience on this TV once you are happy with performance levels, but obviously for our testing we wanted to keep it off to gauge performance of these graphics cards.
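The tearing argument above can be sketched with simple arithmetic: with Vsync off, every rendered-frame boundary that lands inside a single screen refresh shows up as a tear line, so halving the refresh rate roughly doubles the frames per refresh. A minimal illustration (the 120 FPS figure is purely illustrative, not a measured result):

```python
def tears_per_refresh(fps, refresh_hz):
    """Average number of frame boundaries (visible tear lines)
    landing inside one screen refresh, with Vsync disabled."""
    frames_per_refresh = fps / refresh_hz
    return max(frames_per_refresh - 1, 0)

# At the same 120 FPS render rate:
print(tears_per_refresh(120, 60))  # about 1 tear per refresh on a 60 Hz panel
print(tears_per_refresh(120, 30))  # about 3 tears per refresh at 30 Hz
```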
Subject: General Tech | April 25, 2013 - 02:13 PM | Ken Addison
Tagged: video, Xe, seiki, raidr, podcast, nvidia, Never Settle, hd 7990, GA-Z77N-WiFi, frame rating, crossfire, amd, 4k
PC Perspective Podcast #248 - 04/25/2013
Join us this week as we discuss AMD HD 7990, CrossFire Frame Rating improvements, 4K TVs and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:16:34
0:01:20 Update on Indiegogo: You guys rock!
Week in Review:
0:06:30 AMD Radeon HD 7990 6GB Review
News items of interest:
1-888-38-PCPER or email@example.com
Subject: General Tech | April 24, 2013 - 01:38 PM | Jeremy Hellstrom
Tagged: Steve Scott, nvidia, HPC, tesla, logan, tegra
The Register had a chance to sit down with Steve Scott, once CTO of Cray and now CTO of NVIDIA's Tesla projects, to discuss the future of their add-in cards as well as that of x86 in the server room. They discussed Tegra and why it is not receiving the same amount of attention at NVIDIA as Tesla is, as well as some of the fundamental differences in the chips both currently and going forward. NVIDIA plans to unite GPU and CPU onto both families of chips, likely with a custom interface as opposed to placing them on the same die, though both will continue to be designed for very different functions. A lot of the article focuses on Tegra, its memory bandwidth and, most importantly, its networking capabilities, as it seems NVIDIA is focused on the server room and providing hundreds or thousands of interconnected Tegra processors to compete directly with x86 offerings. Read on for the full interview.
"Jen-Hsun Huang, co-founder and CEO of Nvidia has been perfectly honest about the fact that the graphics chip maker didn't intend to get into the supercomputing business. Rather, it was founded by a bunch of gamers who wanted better graphics cards to play 3D games. Fast forward two decades, though, and the Nvidia Tesla GPU coprocessor and the CUDA programming environment have taken the supercomputer world by storm."
Here is some more Tech News from around the web:
- AMD pins future growth to embedded marketplace @ The Register
- AMD announces new embedded G-series SoC @ DigiTimes
- TSMC captures almost 50 percent of foundry market thanks to 28nm demand @ The Inquirer
- $45 BeagleBone Black Keeps Eyes on the Pi's @ Linux.com
- BlackBerry OS 10.1 leaks its secret goo over all the web @ The Register
- Samsung MV900F Wi-Fi 16.3MP Digital Camera Review @ ModSynergy
- i’m Watch: A Smartwatch Review @ TechwareLabs