Subject: Mobile | September 18, 2013 - 12:04 PM | Ryan Shrout
Tagged: tegra note, tegra 4, tegra, tablet, pny, nvidia, evga
Over the past couple of months there have been several leaks about a potential NVIDIA-branded tablet based on the Tegra 4 SoC. Most speculated that NVIDIA had decided to enter the hardware market directly with a "Tegra Tab" in a similar vein to the release of NVIDIA SHIELD. As it turns out, though, NVIDIA has created a platform that other companies can rebrand and resell as an Android tablet.
According to NVIDIA, the Tegra Note platform will enable partners to bring 7-in tablets to market packed with the feature set NVIDIA has been promising since the launch of the Tegra 4 SoC. Those include stylus support, high quality audio, HDR camera capabilities and 100% native Android operating systems.
Maybe more interesting are the partners NVIDIA is teaming with for this launch. While companies like ASUS have already done the development work to prepare tablets of various sizes based on Tegra chips in the past, NVIDIA is going to introduce a couple of its graphics card partners to the mobility ecosystem: EVGA and PNY in North America.
We have questions about the ability of either of these companies to truly support a tablet in today's market, but the truth is likely that NVIDIA is handling most, if not all, of the logistics on this project. What is not in question is the potential for high value: these tablets will start with a suggested retail price of $199.
We already know most of the technical details about the Tegra 4 SoC including the 4+1 Cortex A15 CPU cores and the 72-core GPU. NVIDIA claims they will get 10 hours of video playback with this platform but I would like to get data on the weight and battery size before calling that a win. The display resolution is a bit lower than other competing high-end options in the market today but the sub-$200 price point does mean there had to be some corners cut.
UPDATE: I asked NVIDIA for more information on the size, weight and battery capacity and got a quick answer. The battery capacity is 4100 mAh and the entire device weighs 320g. Compared to the Google Nexus 7, the current strongest 7-in tablet in my opinion, that is a 4% larger battery (vs 3950 mAh) and 10% heavier device (vs 290g). The Tegra Note reference is also a bit thicker at 9.6mm compared to the 8.65mm of the Nexus 7.
There are more details on the official NVIDIA blog post making the announcement this morning including direct OTA Android updates so check that out if you think you might be interested in one of these tablets in the coming months!
Summary of Events
In January of 2013 I revealed a new testing methodology for graphics cards that I dubbed Frame Rating. At the time I was only able to talk about the process, using capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months started to release data and information using this technology. I followed up the story in January with a collection of videos that displayed some of the capture video and what kind of performance issues and anomalies we were able to easily find.
My first full test results were published in February to quite a stir, and then in late March I finally released Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing, which dramatically changed the way graphics cards and gaming performance are discussed and evaluated.
Our testing proved that AMD CrossFire was not improving gaming experiences in the same way that NVIDIA SLI was. Also, we showed that other testing tools like FRAPS were inadequate in showcasing this problem. If you are at all unfamiliar with this testing process or the results it showed, please check out the Frame Rating Dissected story above.
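To make the idea concrete: the heart of a capture-based analysis is just arithmetic on how long each frame actually persists in the recorded video output. The sketch below is purely illustrative and is not PC Perspective's actual tooling; the timestamps and the 2 ms "runt" threshold are invented for the example.

```python
def analyze_frame_times(timestamps_ms, runt_threshold_ms=2.0):
    """Derive per-frame display times from capture timestamps and
    flag 'runt' frames that occupy almost no screen time."""
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    runts = [t for t in frame_times if t < runt_threshold_ms]
    avg = sum(frame_times) / len(frame_times)
    observed_fps = 1000.0 / avg  # what FRAPS-style counting would report
    # FPS after discarding runts -- closer to what the player perceives
    effective = [t for t in frame_times if t >= runt_threshold_ms]
    perceived_fps = len(effective) / (sum(frame_times) / 1000.0)
    return observed_fps, perceived_fps, len(runts)

# Alternating full frames (~16 ms) and runts (~1 ms), mimicking the
# broken multi-GPU pattern described in the article
stamps, t = [0.0], 0.0
for i in range(20):
    t += 16.0 if i % 2 == 0 else 1.0
    stamps.append(t)
print(analyze_frame_times(stamps))
```

The point of the toy numbers: simple frame counting reports a high FPS, but once the runt frames are discarded the perceived frame rate is roughly half that, which is exactly the kind of discrepancy FRAPS could not expose.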
At the time, we tested 5760x1080 resolution using AMD Eyefinity and NVIDIA Surround but found there were too many issues and problems with our scripts and the results they were presenting to give reasonably assured performance metrics. Running AMD + Eyefinity was obviously causing some problems, but I wasn’t quite able to pinpoint what they were or how severe they might have been. Instead I posted graphs like this:
We were able to show NVIDIA GTX 680 performance and scaling in SLI at 5760x1080 but we only were giving results for the Radeon HD 7970 GHz Edition in a single GPU configuration.
Since those stories were released, AMD has been very active. At first they were hesitant to believe our results, calling into question our processes and the ability of gamers to really see the frame rate issues we were describing. However, after months of work and pressure from quite a few press outlets, AMD released a 13.8 beta driver that offers a Frame Pacing option in the 3D controls, which evenly spaces out frames in multi-GPU configurations to produce a smoother gaming experience.
The results were great! The new AMD driver produced very consistent frame times and put CrossFire on a similar playing field to NVIDIA’s SLI technology. There were limitations, though: the driver only fixed DX10/11 games and only addressed resolutions of 2560x1440 and below.
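Conceptually, frame pacing just delays presents so they land on an even cadence instead of arriving in uneven bursts. Here is a minimal, hypothetical simulation of that idea (not AMD's actual driver logic); the completion times mimic AFR frames finishing in near-simultaneous pairs:

```python
def pace_frames(render_done_ms, target_interval_ms):
    """Delay each frame so presents land on an even cadence,
    trading a little latency for smoothness (illustrative only)."""
    presents = []
    next_slot = render_done_ms[0]
    for done in render_done_ms:
        present = max(done, next_slot)  # never present before the frame is ready
        presents.append(present)
        next_slot = present + target_interval_ms
    return presents

# Uneven AFR-style completion times: pairs of frames arrive almost together
done = [0.0, 2.0, 33.0, 35.0, 66.0, 68.0]
paced = pace_frames(done, target_interval_ms=16.5)
print([round(b - a, 1) for a, b in zip(paced, paced[1:])])
```

Without pacing, the frame-to-frame intervals alternate between roughly 2 ms and 31 ms; after pacing, every interval is the 16.5 ms target, which is the "smoother gaming experience" the driver option delivers.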
But the story doesn't end there. CrossFire and Eyefinity are still very important in a lot of gamers' minds, and with the constant price drops in 1920x1080 panels, more and more gamers are taking (or thinking of taking) the plunge into the world of Eyefinity and Surround. As it turns out, though, there are some more problems and complications with Eyefinity and high-resolution gaming (multi-head 4K) cropping up that deserve discussion.
Subject: General Tech | August 30, 2013 - 04:07 PM | Jeremy Hellstrom
Tagged: nvidia, batman arkham origins, free
If you weren't a fan of NVIDIA's last offer of in-game currency for pay-to-win free-to-play online games, then how about Batman: Arkham Origins for free? If you pick up a 600 or 700 series GPU before the end of the year, you will be picking up a copy for free. The TITAN and GTX 690 are not named specifically, nor is the rumoured GTX 790, but it is unlikely you would be singled out. NVIDIA will also be showing off a sneak peek of the game at PAX Prime in September.
SANTA CLARA, Calif.—Aug. 30, 2013—NVIDIA today announced it is working with Warner Bros. Interactive Entertainment and WB Games Montréal to make Batman™: Arkham Origins, the next installment in the blockbuster Batman: Arkham videogame franchise, a technically advanced and intensely realistic chapter in the award-winning saga for PC players.
Gamers who purchase a qualifying GPU from a participating partner will receive a free PC edition of Batman: Arkham Origins, which will be released worldwide on Oct. 25, 2013.
Developed by WB Games Montréal, Batman: Arkham Origins features an expanded Gotham City and introduces an original prequel storyline set several years before the events of Batman: Arkham Asylum and Batman: Arkham City. Taking place before the rise of Gotham City’s most dangerous criminals, the game showcases a young Batman as he faces a defining moment of his early career and sets his path to becoming the Dark Knight.
Batman has immense power, strength and speed—the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham’s dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA™ antialiasing, soft shadows and various NVIDIA PhysX® engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.
“The Batman: Arkham games are visually stunning and it’s great that we are able to continue building upon the amazing graphics with Batman: Arkham Origins,” said Samantha Ryan, Senior Vice President, Production and Development, Warner Bros. Interactive Entertainment. “With NVIDIA’s continued support, we are able to deliver an incredibly immersive gameplay experience."
NVIDIA will be unveiling a sneak peek of Batman: Arkham Origins at PAX Prime in Seattle, during the NVIDIA stage presentation at the Paramount Theater on Monday, Sept. 2 at 10 a.m. PT. Entry is free.
Additionally, any PAX attendees that purchase a qualified bundle from the special kiosk at the NVIDIA booth on the show floor will receive for free a limited edition Batman lithograph — one of only 1,000 being produced.
For a full list of participating bundle partners, visit: www.geforce.com/freebatman. This offer is good only until Jan. 31, 2014.
Subject: General Tech | August 30, 2013 - 03:54 PM | Jeremy Hellstrom
Tagged: shield, nvidia, nifty, microsoft, grid vca, byod
Remember NVIDIA's Shield, that game streaming device Ryan was playing with at QuakeCon, which doesn't seem to fit the role of just a gaming device since it can harness the power of other nearby NVIDIA GPUs? The Register is proposing a rather interesting usage scenario for the Shield using the GRID VCA technology, which is the basis of communications with NVIDIA's servers and virtualized GPUs and which also happens to work well with many of the virtualization programs currently in use.
When they saw Windows games being played on a Shield at VMworld, they realized there would be nothing impossible about providing Office 365 as a service if you were running Server 2012 with RemoteFX installed. With HDMI out you can have the monitor of your choice, and the Bluetooth capability means you can support a keyboard and mouse; suddenly you have the coolest thin client on the block. In fact, you might even be able to sit near a server with several Tesla cards installed and run CAD programs, if someone could figure out how to stream a CAD program to the Shield.
Or you could just game at work.
"Some grumble that the Bring Your Own Device (BYOD) concept deserves to be called Spend Your Own Money in recognition of the cost of providing a computer hitting workers' hip pockets instead of employers'.
Such grumbles may be less sustainable now that NVIDIA's $US299 SHIELD portable gaming console can run Windows applications."
Here is some more Tech News from around the web:
- Intel to announce new Haswell processors in September @ DigiTimes
- Samsung has begun bashing out DDR4 20nm memory modules @ The Inquirer
- Intel Plans 'Overclocking' Capability On SSDs @ Slashdot
- Apple set to make 63 million iWatches in 2014 priced at $199 @ The Inquirer
- Do not adjust your eyes: This Kobo ten-incher has a 2560 x 1600 resolution @ The Register
- Enter to win an MSI GeForce GTX 760 graphics card and Corsair HX1050 PSU @ The Tech Report
A New TriFrozr Cooler
Graphics cards are by far the most interesting topic we cover at PC Perspective. Between the battles of NVIDIA and AMD as well as the competition between board partners like EVGA, ASUS, MSI and Galaxy, there is very rarely a moment in time when we don't have a different GPU product of some kind on an active test bed. Both NVIDIA and AMD release reference cards (for the most part) with each and every new product launch and it then takes some time for board partners to really put their own stamp on the designs. Other than the figurative stamp that is the sticker on the fan.
One of the companies that has recently become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls into the Lightning brand. As far back as the MSI GTX 260 Lightning and as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, custom power design and a good amount of over-engineering to produce cards with few rivals.
Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May. Based on the same GK110 GPU as the GTX Titan card, but with two fewer SMX units, the GTX 780 is easily the second fastest single-GPU card on the market. MSI is hoping to make enthusiasts even more excited about the card with the Lightning design, which brings a brand new TriFrozr cooler, an impressive power design and overclocking capabilities that basic users and LN2 junkies alike can take advantage of. Just what DO you get for $750 these days?
Subject: General Tech | August 24, 2013 - 01:05 PM | Tim Verry
Tagged: txaa, PhysX, pc gaming, nvidia, infinity ward, call of duty, Activision
Activision recently announced a technical partnership with NVIDIA at GamesCom. The two companies are "working hand in hand" on the development of the PC version of Call of Duty: Ghosts to implement the kinds of graphical features and technologies that PC gamers expect of a new triple-A title.
According to an NVIDIA GeForce blog post, NVIDIA developers are working on-site at Infinity Ward. NVIDIA is helping Infinity Ward enhance the Sub-D tessellation, displacement mapping, and HDR lighting. Additionally, the NVIDIA engineers are working to integrate support for the company's TXAA (temporal anti-aliasing) and PhysX technologies. The Infinity Ward game developers are also taking advantage of the APEX Turbulence PhysX toolkit to enable realistic, physics-based smoke clouds that react with the environment and player actions.
Activision and Infinity Ward are also enabling the use of dedicated multiplayer servers for Call of Duty: Ghosts. In addition, Call of Duty Elite will be available for the PC version of the game including a smartphone app that allows stat tracking and profile management from a mobile device.
The Geforce blog claims that the PC version is intended to be the definitive CoD: Ghosts version, which is always nice to see. More graphical effects and features are being worked on, but IW and NVIDIA are keeping them under wraps for now.
The PC is in a really good place right now between console cycles, with developers finally starting to realize the power of the PC and what it is able to offer in terms of graphical performance and control options. PC-first development is something that I have wanted to see for a long time (develop for the PC and port to consoles rather than the other way around). Now that PC versions are once again getting due credit and development attention (and resources), and with the upcoming consoles being based on x86 hardware, these types of technical partnerships where the PC version is positioned as the best version are hopefully the start of a trend that will see a new surge in PC gaming!
Subject: General Tech | August 24, 2013 - 11:53 AM | Tim Verry
Tagged: ubisoft, txaa, pc gaming, nvidia, kepler
NVIDIA announced on Wednesday that it had formed an alliance with Ubisoft to collaborate on Ubisoft's upcoming PC game titles (coming this fall). The alliance involves the NVIDIA Developer Technology Team "working closely" with the Ubisoft development studios on several new PC titles. The NVIDIA-enhanced PC games covered by this new alliance include Tom Clancy's Splinter Cell: Blacklist, Assassin's Creed IV Black Flag, and Watch Dogs.
NVIDIA Senior VP of Content and Technology Tony Tamasi stated in a press release that "Ubisoft understands that PC gamers demand a truly elite experience -- the best resolutions, the smoothest frame rates and the latest gaming breakthroughs." NVIDIA has reportedly worked with the Ubisoft game developers throughout the entire development process to incorporate the company's graphics technologies.
Tom Clancy's Splinter Cell: Blacklist is the first game to come out of the alliance. It features PC gaming graphics technologies such as DirectX 11 effects, parallax mapping, ambient occlusion, tessellation, HBAO+ (horizon-based ambient occlusion), and NVIDIA's own TXAA and Surround support. The latest Splinter Cell game also comes bundled with NVIDIA graphics cards.
NVIDIA did not go into details on what sort of extra PC-centric graphics features the other Ubisoft games will have, but it should be similar to those in Splinter Cell: Blacklist. Curiously, the press release makes no mention of NVIDIA's The Way It's Meant To Be Played program, though it seems that this alliance may even go a step further than that in terms of development team interaction and shared resources.
Subject: Graphics Cards | August 20, 2013 - 12:24 PM | Jeremy Hellstrom
Tagged: nvidia, graphics drivers, geforce 326.80
The new GeForce 326.80 beta driver is now available to download. An essential update for gamers sneaking into Tom Clancy’s Splinter Cell Blacklist, today’s driver ensures maximum performance and system compatibility in the brand new stealth title, which is jam-packed with PC-exclusive features and technology, including NVIDIA HBAO+ Ambient Occlusion, NVIDIA TXAA Temporal Anti-Aliasing, out-of-the-box NVIDIA SLI support, and much much more. For a full rundown, head on back to GeForce.com tomorrow when we’ll detail all of Blacklist’s impressive tech.
New in GeForce R326 Drivers
Performance Boost
- Increases performance by up to 19% for GeForce 400/500/600/700 series GPUs in several PC games vs. GeForce 320.49 WHQL-certified drivers. Results will vary depending on your GPU and system configuration.
Here is an example of measured gains:
GeForce GTX 770:
- Up to 15% in Dirt: Showdown
- Up to 6% in Tomb Raider
GeForce GTX 770 SLI:
- Up to 19% in Dirt: Showdown
- Up to 11% in F1 2012
SLI Profiles
- Added SLI profile for Splinter Cell: Blacklist
- Added SLI profile for Batman: Arkham Origins
SHIELD
- Enables GeForce to SHIELD streaming. Learn more here.
4K Displays
- Adds support for additional tiled 4K displays and extends support for tiled 4K features.
Subject: General Tech, Graphics Cards | August 19, 2013 - 03:04 PM | Scott Michaud
Tagged: jpr, Matrox, s3, amd, nvidia
Well, according to Jon Peddie Research (JPR), not too good if you are Matrox or S3. The total market for add-in boards decreased 5.4% from last quarter; 14.0 million boards were shipped across the entire industry. Neither company accounted for even half a thousandth of that figure, leaving each with a maximum of 7,000 units shipped in the best case. This industry is, basically, a two horse race.
Two horses, unless you count the Intel Xeon Phi. While technically not a graphics processor despite its hardware design, 48,000 of these coprocessors have already been sold for the Tianhe-2 supercomputer, roughly seven-fold more than an entire quarter's maximum for Matrox. Unfortunately, JPR does not report on Intel add-in cards despite their overlap with the GPU add-in market. These numbers could get even more interesting as the years progress.
As for the two big players, AMD and NVIDIA, both hold very dominant positions. Almost in defiance of the 750,000-unit industry decline, AMD experienced a total increase of 0.8% quarter-over-quarter, and their market share gained 2.3% as a result of this growth. NVIDIA experienced a total decrease of 8.9%.
In all, AMD has been doing better than the industry average. They are fighting the slight decline in the graphics industry while simultaneously helping GPUs hold off against larger declines in PC systems.
Subject: General Tech, Graphics Cards, Processors | August 16, 2013 - 04:00 PM | Scott Michaud
Tagged: nvidia, Intel, APU, amd
Despite a slight decline in PC sales compared to last quarter, graphics processors are on the rise. Jon Peddie Research attributes the heightened interest in graphics, with a decline in systems, to a trend towards multiple GPUs in a system. Crossfire and SLI, according to the report, are not driving this drift but they are relevant. More importantly, consumers are adding discrete graphics to systems with integrated solutions.
AMD has experienced an increase in shipments of 47% for laptop APUs. Desktop heterogeneous processors declined but, in all, shipments increased 11%. Intel, likewise, saw an increase albeit just 6%. NVIDIA declined 8%. AMD now enjoys a 5.8% lead in total market share over NVIDIA.
Many PCs have access to multiple graphics processors simultaneously. With an increase of available GPUs, software developers might take the plunge into fully supporting heterogeneous architectures. You could imagine a game which offloads physics or AI pathfinding to secondary graphics. Sure, the increased heat would slightly limit the turbo-performance of the CPU, but the increased parallel performance should overtake that decreased serial performance for a sensible developer.
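As a purely hypothetical sketch of that idea, with Python thread pools standing in for the command queues of two GPUs (the function names and workloads are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch: a frame loop that ships physics work to a
# secondary device queue while the primary renders. The "devices"
# here are just single-worker thread pools standing in for GPU
# command queues -- not a real graphics API.
def render(frame):          # primary GPU: draw the frame
    return f"frame {frame} drawn"

def step_physics(frame):    # secondary GPU: advance the simulation
    return f"frame {frame} physics"

primary = ThreadPoolExecutor(max_workers=1)
secondary = ThreadPoolExecutor(max_workers=1)

results = []
for frame in range(3):
    physics = secondary.submit(step_physics, frame)  # offloaded work
    drawn = primary.submit(render, frame)            # runs concurrently
    results.append((drawn.result(), physics.result()))
print(results)
```

The design point is the overlap: the physics step is submitted first and runs while the render call proceeds, which is exactly the benefit a developer would hope for from a second, otherwise-idle GPU.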
JPR claims an average of nearly 1.4 GPUs available per system.
The increase in laptop heterogeneous processor shipments is a major win for AMD. Still, I wonder how much Never Settle played into users dropping discrete graphics into machines which would otherwise have only integrated (chipset or processor) graphics. The discrete graphics market has declined, and yet somehow AMD got a boost from double-attach or replaced graphics.
The report only discusses consumer x86 tablets, desktops, laptops, and some hybrid between the previous three categories. Other processor architectures or x86 servers are not covered.