Battle of the budget boards; MSI's 650 Ti TF BE

Subject: Graphics Cards | September 19, 2013 - 02:55 PM |
Tagged: nvidia, msi, 650ti boost, Twin Frozr

To give you the full name, the MSI N650 Titanium Twin Frozr 2GD5/OC Boost Edition is $170 after MIR, whereas you can pick up the HD 7850 that [H]ard|OCP chose to compare it against for a mere $130 after rebate.  That price difference means the NVIDIA card has to perform quite a bit better than the AMD card to win from a price-to-performance perspective.  From the numbers in the review you can clearly see that the 650 Ti is the better performing card, especially with the respectable overclock that [H] managed, which arguably makes it the best card under $200; on the other hand, if your budget is tight, the performance gap is not as big as the price gap, which might make that HD 7850 the better choice.
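
If you want to put numbers on that trade-off, a quick frames-per-dollar calculation makes the point.  The sketch below uses the post-rebate prices quoted above but purely hypothetical average frame rates, so treat it as an illustration of the math rather than actual benchmark data.

```python
# Rough price-to-performance comparison. Prices are the post-rebate figures
# mentioned above; the FPS values are hypothetical placeholders, NOT review data.
cards = {
    "MSI GTX 650 Ti BOOST TF OC": (55.0, 170.0),   # (hypothetical avg FPS, USD)
    "Radeon HD 7850":             (48.0, 130.0),
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price:.3f} FPS per dollar")
```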

By the way, that NVIDIA card has a Boost clock, which means it might steal some of your megahertz away when it gets too hot; that is apparently a horrible experience, and if you somehow disable that feature and cook your GPU ... obviously that is not your fault.

H_frozr.jpg

"Today we evaluate MSI's high-end GeForce GTX 650 Ti BOOST line with the flagship overclocked Gaming Edition MSI N650Ti TF 2GD5/OC BE. With falling prices on AMD Radeon video cards we will compare it to the AMD Radeon HD 7850 to see which will emerge as the victor in the sub-$200 price price range."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Podcast #269 - Frame Rating on Eyefinity, News from IDF, and rumors about new AMD GPUs

Subject: General Tech | September 19, 2013 - 11:26 AM |
Tagged: video, surround, podcast, nvidia, Intel, idf, haswell, frame rating, eyefinity, baytrail, amd, 4250U

PC Perspective Podcast #269 - 09/19/2013

Join us this week as we discuss Frame Rating on Eyefinity, News from IDF, and rumors about new AMD GPUs

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Jeremy Hellstrom, Josh Walrath, Allyn Malventano, and Morry Teitelman

 
Program length: 1:35:35
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
    1. Jeremy: Um, sure why not - ADATA DashDrive Durable
    2. Allyn: Connected Data Transporter 2.0 (yes it exists)
  4. 1-888-38-PCPER or podcast@pcper.com
  5. Closing/outro

 

NVIDIA Tegra Note Tablet Platform Launches - $199 from EVGA, PNY

Subject: Mobile | September 18, 2013 - 09:04 AM |
Tagged: tegra note, tegra 4, tegra, tablet, pny, nvidia, evga

Over the past couple of months there have been several leaks about a potential NVIDIA-branded tablet based on the Tegra 4 SoC.  Most speculated that NVIDIA had decided to enter the hardware market directly with a "Tegra Tab," in a similar vein to the release of NVIDIA SHIELD.  As it turns out, though, NVIDIA has created a platform that other companies can rebrand and resell as their own Android tablets.

TegraNote_front_c.jpg

According to NVIDIA, the Tegra Note platform will enable partners to bring 7-in tablets to market packed with the feature set NVIDIA has been promising since the launch of the Tegra 4 SoC.  Those include stylus support, high quality audio, HDR camera capabilities and 100% native Android operating systems.

Maybe more interesting are the partners NVIDIA is teaming up with for this launch.  While companies like ASUS have already done the development work to prepare Tegra-based tablets of various sizes in the past, NVIDIA is introducing a couple of its graphics card partners to the mobility ecosystem: EVGA and PNY in North America.

TegraNote_back.jpg

We have questions about the ability of either of these companies to truly support a tablet in today's market, but the truth is likely that NVIDIA is handling most if not all of the logistics on this project.  What is not in question is the potential for high value: these tablets will start at a suggested retail price of $199.

tegraplatform.png

We already know most of the technical details about the Tegra 4 SoC including the 4+1 Cortex A15 CPU cores and the 72-core GPU.  NVIDIA claims they will get 10 hours of video playback with this platform but I would like to get data on the weight and battery size before calling that a win.  The display resolution is a bit lower than other competing high-end options in the market today but the sub-$200 price point does mean there had to be some corners cut.

UPDATE: I asked NVIDIA for more information on the size, weight and battery capacity and got a quick answer.  The battery capacity is 4100 mAh and the entire device weighs 320g.  Compared to the Google Nexus 7, the current strongest 7-in tablet in my opinion, that is a 4% larger battery (vs 3950 mAh) and 10% heavier device (vs 290g).  The Tegra Note reference is also a bit thicker at 9.6mm compared to the 8.65mm of the Nexus 7.
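
For anyone who wants to check the arithmetic behind those percentages, here is the two-line version using the figures quoted above (rounded in the text to 4% and 10%):

```python
# Percentage differences from the quoted specs: Tegra Note vs. Google Nexus 7.
note_mah, nexus7_mah = 4100, 3950   # battery capacity in mAh
note_g, nexus7_g = 320, 290         # device weight in grams

print(f"Battery: {(note_mah / nexus7_mah - 1) * 100:.1f}% larger")   # ~3.8%
print(f"Weight:  {(note_g / nexus7_g - 1) * 100:.1f}% heavier")      # ~10.3%
```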

There are more details on the official NVIDIA blog post making the announcement this morning including direct OTA Android updates so check that out if you think you might be interested in one of these tablets in the coming months!

Source: NVIDIA
Author:
Manufacturer: Various

Summary of Events

In January of 2013 I revealed a new testing methodology for graphics cards that I dubbed Frame Rating.  At the time I was only able to talk about the process, which uses capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months I started to release data and information gathered with this technology.  I followed up later in January with a collection of videos that showed some of the captured footage and the kinds of performance issues and anomalies we were able to easily find. 

My first full test results were published in February to quite a bit of stir, and then finally in late March I released Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing, which permanently changed the way graphics cards and gaming performance are discussed and evaluated. 

Our testing proved that AMD CrossFire was not improving gaming experiences in the same way that NVIDIA SLI was.  Also, we showed that other testing tools like FRAPS were inadequate in showcasing this problem.  If you are at all unfamiliar with this testing process or the results it showed, please check out the Frame Rating Dissected story above.
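
For readers who never dug into those articles, the core idea is that a colored overlay bar is added to each rendered frame before it leaves the GPU, the DVI output is captured, and analysis software measures how many scanlines of each captured refresh a given color occupies to work out how long each frame was actually on screen.  The sketch below is a deliberately simplified, hypothetical illustration of that idea (invented colors and scanline counts, not our actual extraction scripts), but it shows how a "runt" frame falls out of the data.

```python
# Simplified illustration of overlay-based capture analysis. Each rendered frame
# is tagged with a color bar; the captured 60 Hz output is reduced to a list of
# overlay colors per scanline. The data below is invented for illustration only.

REFRESH_INTERVAL_MS = 1000.0 / 60.0   # one captured refresh at 60 Hz = ~16.67 ms

captured_refreshes = [
    ["lime"] * 1060 + ["aqua"] * 20,   # "aqua" gets only 20 scanlines here...
    ["aqua"] * 10 + ["red"] * 1070,    # ...and 10 more here: a classic runt frame
    ["red"] * 1080,                    # "red" is held well past a full refresh
]

def frame_times_ms(refreshes):
    """Count contiguous scanlines per overlay color (across refresh boundaries)
    and convert them into milliseconds of on-screen time."""
    times, current, scanlines = [], None, 0
    lines_per_refresh = len(refreshes[0])
    for refresh in refreshes:
        for color in refresh:
            if color != current:
                if current is not None:
                    times.append((current, scanlines / lines_per_refresh * REFRESH_INTERVAL_MS))
                current, scanlines = color, 0
            scanlines += 1
    times.append((current, scanlines / lines_per_refresh * REFRESH_INTERVAL_MS))
    return times

for color, ms in frame_times_ms(captured_refreshes):
    print(f"{color:>4}: {ms:5.2f} ms on screen")
```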

At the time, we tested the 5760x1080 resolution using AMD Eyefinity and NVIDIA Surround, but found too many issues with our scripts and the results they were presenting to give reasonably assured performance metrics.  Running AMD with Eyefinity was obviously causing some problems, but I wasn't quite able to pinpoint what they were or how severe they might be.  Instead I posted graphs like this:

01.png

We were able to show NVIDIA GTX 680 performance and scaling in SLI at 5760x1080, but we were only giving results for the Radeon HD 7970 GHz Edition in a single GPU configuration.

 

Since those stories were released, AMD has been very active.  At first they were hesitant to believe our results and called into question both our processes and the ability of gamers to really see the frame rate issues we were describing.  However, after months of work and pressure from quite a few press outlets, AMD released a 13.8 beta driver with a Frame Pacing option in the 3D controls that evenly spaces out frames in multi-GPU configurations, producing a smoother gaming experience.

02.png

The results were great!  The new AMD driver produced very consistent frame times and put CrossFire on a similar playing field to NVIDIA's SLI technology.  There were limitations, though: the driver only fixed DX10/11 games and only addressed resolutions of 2560x1440 and below.
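
To make the "evenly space out frames" idea concrete, here is a tiny, hypothetical sketch of what pacing does conceptually in an alternate-frame-rendering setup: the raw presents arrive in tight pairs, and some of them are delayed so the gaps even out.  This is just an illustration of the principle with invented timestamps, not AMD's actual driver algorithm.

```python
# Hypothetical illustration of frame pacing in an AFR multi-GPU setup.
# Timestamps (ms) are invented: frames tend to arrive in tight pairs.
raw_presents_ms = [0.0, 3.0, 33.0, 36.0, 66.0, 69.0, 99.0]

def pace(presents, window=3):
    """Delay presents so each gap is at least the average of the last few gaps
    (a simple stand-in for real pacing logic, not the actual driver behavior)."""
    paced, gaps = [presents[0]], []
    for t in presents[1:]:
        target = sum(gaps[-window:]) / len(gaps[-window:]) if gaps else 0.0
        paced_t = max(t, paced[-1] + target)   # never present earlier than the frame was ready
        gaps.append(paced_t - paced[-1])
        paced.append(paced_t)
    return paced

paced = pace(raw_presents_ms)
print("raw gaps  :", [round(b - a, 1) for a, b in zip(raw_presents_ms, raw_presents_ms[1:])])
print("paced gaps:", [round(b - a, 1) for a, b in zip(paced, paced[1:])])
```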

But the story doesn't end there.  CrossFire and Eyefinity are still very important to a lot of gamers, and with the constant price drops on 1920x1080 panels, more and more of them are taking (or thinking of taking) the plunge into the world of Eyefinity and Surround.  As it turns out, though, there are more problems and complications with Eyefinity and high-resolution gaming (multi-head 4K) cropping up, and they deserve discussion.

Continue reading our investigation into AMD Eyefinity and NVIDIA Surround with multi-GPU solutions!!

Holy Free Batgame NVIDIA!

Subject: General Tech | August 30, 2013 - 01:07 PM |
Tagged: nvidia, batman arkham origins, free

If you weren't a fan of NVIDIA's last offer of in-game currency for pay-to-win free-to-play online games, then how about Batman: Arkham Origins for free?  If you pick up a 600 or 700 series GPU before the end of the year, you will be picking up a copy for free.  The TITAN and GTX 690 are not named specifically, nor is the rumoured GTX 790, but it is unlikely you would be singled out.  NVIDIA will also be showing off a sneak peek of the game at PAX Prime in September.

batman-arkham-origins-global-landing-page.jpg

SANTA CLARA, Calif.—Aug. 30, 2013—NVIDIA today announced it is working with Warner Bros. Interactive Entertainment and WB Games Montréal to make Batman™: Arkham Origins, the next installment in the blockbuster Batman: Arkham videogame franchise, a technically advanced and intensely realistic chapter in the award-winning saga for PC players.
 
Gamers who purchase a qualifying GPU from a participating partner will receive a free PC edition of Batman: Arkham Origins, which will be released worldwide on Oct. 25, 2013.
 
Developed by WB Games Montréal, Batman: Arkham Origins features an expanded Gotham City and introduces an original prequel storyline set several years before the events of Batman: Arkham Asylum and Batman: Arkham City. Taking place before the rise of Gotham City’s most dangerous criminals, the game showcases a young Batman as he faces a defining moment of his early career and sets his path to becoming the Dark Knight.
 
Batman has immense power, strength and speed—the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham’s dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA™ antialiasing, soft shadows and various NVIDIA PhysX® engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.
 
“The Batman: Arkham games are visually stunning and it’s great that we are able to continue building upon the amazing graphics with Batman: Arkham Origins,” said Samantha Ryan, Senior Vice President, Production and Development, Warner Bros. Interactive Entertainment. “With NVIDIA’s continued support, we are able to deliver an incredibly immersive gameplay experience."
 
NVIDIA will be unveiling a sneak peek of Batman: Arkham Origins at PAX Prime in Seattle, during the NVIDIA stage presentation at the Paramount Theater on Monday, Sept. 2 at 10 a.m. PT. Entry is free.
 
Additionally, any PAX attendees that purchase a qualified bundle from the special kiosk at the NVIDIA booth on the show floor will receive for free a limited edition Batman lithograph — one of only 1,000 being produced.
 
For a full list of participating bundle partners, visit: www.geforce.com/freebatman. This offer is good only until Jan. 31, 2014.

Source: NVIDIA

Could you actually do 'work' on a Shield?

Subject: General Tech | August 30, 2013 - 12:54 PM |
Tagged: shield, nvidia, nifty, microsoft, grid vca, byod

Remember NVIDIA's Shield, that game streaming device Ryan was playing with at QuakeCon, which doesn't seem to fit the role of just a gaming device since it can harness the power of other nearby NVIDIA GPUs?  The Register is proposing a rather interesting usage scenario for the Shield: using the GRID VCA technology that underpins its communication with NVIDIA's servers and virtualized GPUs, which also happens to work well with many of the virtualization programs currently in use.

When they saw Windows games being played on a Shield at VMworld, they realized there would be nothing impossible about providing Office 365 as a service if you were running Server 2012 with RemoteFX installed.  With HDMI out you can have the monitor of your choice, and the Bluetooth support means you can add a keyboard and mouse; suddenly you have the coolest thin client on the block.  You might even be able to sit near a server with several Tesla cards installed and run CAD programs, if someone could figure out how to stream a CAD program to the Shield.  

Or you could just game at work.

IMG_9808.JPG

"Some grumble that the Bring Your Own Device (BYOD) concept deserves to be called Spend Your Own Money in recognition of the cost of providing a computer hitting workers' hip pockets instead of employers'.

Such grumbles may be less sustainable now that NVIDIA's $US299 SHIELD portable gaming console can run Windows applications."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register
Author:
Manufacturer: MSI

A New TriFrozr Cooler

Graphics cards are by far the most interesting topic we cover at PC Perspective.  Between the battles of NVIDIA and AMD as well as the competition between board partners like EVGA, ASUS, MSI and Galaxy, there is very rarely a moment in time when we don't have a different GPU product of some kind on an active test bed.  Both NVIDIA and AMD release reference cards (for the most part) with each and every new product launch and it then takes some time for board partners to really put their own stamp on the designs.  Other than the figurative stamp that is the sticker on the fan.

IMG_9886.JPG

One of the companies that has recently become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls under the Lightning brand.  As far back as the MSI GTX 260 Lightning and as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, custom power design and a good amount of over-engineering to produce cards with few rivals.

Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May.  Based on the same GK110 GPU as the GTX Titan, with two fewer SMX units, the GTX 780 is easily the second fastest single-GPU card on the market.  MSI is hoping to make enthusiasts even more excited about the card with the Lightning design, which brings a brand new TriFrozr cooler, an impressive power design and overclocking capabilities that both basic users and LN2 junkies can take advantage of.  Just what DO you get for $750 these days?

Continue reading our review of the MSI GeForce GTX 780 Lightning graphics card!!

Activision Working With NVIDIA to Enhance the PC Version of Call of Duty: Ghosts

Subject: General Tech | August 24, 2013 - 10:05 AM |
Tagged: txaa, PhysX, pc gaming, nvidia, infinity ward, call of duty, Activision

Activision recently announced a technical partnership with NVIDIA at GamesCom. The two companies are "working hand in hand" on the development of the PC version of Call of Duty: Ghosts to implement the kinds of graphical features and technologies that PC gamers expect of a new triple-A title.

According to an NVIDIA GeForce blog post, NVIDIA developers are working on-site at Infinity Ward.  NVIDIA is helping Infinity Ward enhance its Sub-D tessellation, displacement mapping, and HDR lighting.  Additionally, NVIDIA engineers are working to integrate support for the company's TXAA (temporal anti-aliasing) and PhysX technologies.  The Infinity Ward developers are also taking advantage of the APEX Turbulence PhysX toolkit to enable realistic, physics-based smoke clouds that react to the environment and player actions.

Activision Infinity Ward Call Of Duty Ghosts.jpg

Activision and Infinity Ward are also enabling the use of dedicated multiplayer servers for Call of Duty: Ghosts. In addition, Call of Duty Elite will be available for the PC version of the game including a smartphone app that allows stat tracking and profile management from a mobile device.

The GeForce blog claims that the PC version is intended to be the definitive version of CoD: Ghosts, which is always nice to see.  More graphical effects and features are being worked on, but IW and NVIDIA are keeping them under wraps for now.

The PC is in a really good place right now.  Between console cycles, developers are finally starting to realize the power of the PC and what it can offer in terms of graphical performance and control options.  PC-first development is something I have wanted to see for a long time (develop for the PC and port to consoles rather than the other way around).  Now that PC versions are once again getting due credit, development attention, and resources, and with the upcoming consoles based on x86 hardware, these kinds of technical partnerships that position the PC version as the best version are hopefully the start of a trend that will bring a new surge in PC gaming!

Source: NVIDIA

NVIDIA Working Closely With Ubisoft To Enhance Fall PC Games

Subject: General Tech | August 24, 2013 - 08:53 AM |
Tagged: ubisoft, txaa, pc gaming, nvidia, kepler

NVIDIA announced on Wednesday that it has formed an alliance with Ubisoft to collaborate on Ubisoft's upcoming PC game titles (coming this fall).  The alliance involves the NVIDIA Developer Technology Team "working closely" with Ubisoft's development studios on several new PC titles.  The NVIDIA-enhanced PC games covered by this new alliance include Tom Clancy's Splinter Cell: Blacklist, Assassin's Creed IV Black Flag, and Watch Dogs.

NVIDIA Senior VP of Content and Technology Tony Tamasi stated in a press release that "Ubisoft understands that PC gamers demand a truly elite experience -- the best resolutions, the smoothest frame rates and the latest gaming breakthroughs." NVIDIA has reportedly worked with the Ubisoft game developers throughout the entire development process to incorporate the company's graphics technologies.

Tom Clancy's Splinter Cell Blacklist.jpg

Tom Clancy's Splinter Cell: Blacklist is the first game to come out of the alliance. It features PC gaming graphics technologies such as DirectX 11 effects, parallax mapping, ambient occlusion, tessellation, HBAO+ (horizon-based ambient occlusion), and NVIDIA's own TXAA and Surround support. The latest Splinter Cell game also comes bundled with NVIDIA graphics cards.

NVIDIA did not go into details on what sort of extra PC-centric graphics features the other Ubisoft games will have, but it should be similar to those in Splinter Cell: Blacklist. Curiously, the press release makes no mention of NVIDIA's The Way It's Meant To Be Played program, though it seems that this alliance may even go a step further than that in terms of development team interaction and shared resources.

Source: NVIDIA

GeForce 326.80 (Beta) Drivers Now Available

Subject: Graphics Cards | August 20, 2013 - 09:24 AM |
Tagged: nvidia, graphics drivers, geforce 326.80

SCB-Wallpaper-Goggles-1024x768-FINAL.jpg

The new GeForce 326.80 beta driver is now available to download. An essential update for gamers sneaking into Tom Clancy’s Splinter Cell Blacklist, today’s driver ensures maximum performance and system compatibility in the brand new stealth title, which is jam-packed with PC-exclusive features and technology, including NVIDIA HBAO+ Ambient Occlusion, NVIDIA TXAA Temporal Anti-Aliasing, out-of-the-box NVIDIA SLI support, and much much more. For a full rundown, head on back to GeForce.com tomorrow when we’ll detail all of Blacklist’s impressive tech.

New in GeForce R326 Drivers: Performance Boost

  • Increases performance by up to 19% for GeForce 400/500/600/700 series GPUs in several PC games vs. GeForce 320.49 WHQL-certified drivers. Results will vary depending on your GPU and system configuration.
  • Here is an example of measured gains:
    • GeForce GTX 770:
      • Up to 15% in Dirt: Showdown
      • Up to 6% in Tomb Raider
    • GeForce GTX 770 SLI:
      • Up to 19% in Dirt: Showdown
      • Up to 11% in F1 2012
  • 4K Displays:
    • Adds support for additional tiled 4K displays
    • Extends support for tiled 4K features

Source: NVIDIA