Subject: Editorial
Manufacturer: NVIDIA

It wouldn’t be February if we didn’t hear the Q4 FY14 earnings from NVIDIA!  NVIDIA does have a slightly odd way of expressing their quarters, but in the end it is all semantics.  They are not in fact living in the future, but I bet their product managers wish they could peer into the actual Q4 2014.  No, the whole FY14 thing relates back to when they made their IPO and how they started reporting.  To us mere mortals, Q4 FY14 actually represents Q4 2013.  Clear as mud?  Lord love the Securities and Exchange Commission and their rules.


The past quarter was a pretty good one for NVIDIA.  They came away with $1.144 billion in gross revenue and had a GAAP net income of $147 million.  This beat the Street's estimate by a pretty large margin, and in response NVIDIA's stock rose in after-hours trading.  This has certainly been a trying year for NVIDIA and the PC market in general, but they seem to have come out on top.
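
For the spreadsheet-inclined, here is a quick back-of-the-envelope check of what those headline figures imply; the inputs are simply the revenue and net income quoted above, not anything from NVIDIA's own guidance.

```python
# Back-of-the-envelope margin check using the figures quoted above.
revenue = 1.144e9      # Q4 FY14 gross revenue, USD
net_income = 147e6     # Q4 FY14 GAAP net income, USD

net_margin = net_income / revenue
print(f"GAAP net margin: {net_margin:.1%}")  # roughly 12.8%
```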

NVIDIA beat estimates primarily on the strength of the PC graphics division.  Many were focusing on the apparent decline of the PC market and assumed that NVIDIA would be dragged down by lower shipments.  On the contrary, it seems as though the gaming market and add-in sales on the PC helped to solidify NVIDIA’s quarter.  We can look at a number of factors that likely contributed to this uptick for NVIDIA.

Click here to read the rest of NVIDIA's Q4 FY2014 results!

Podcast #286 - AMD Mantle, Battlefield 4 Performance, Chromeboxes and more!

Subject: General Tech | February 6, 2014 - 12:14 PM
Tagged: podcast, video, amd, Mantle, r9 290, 290x, battlefield 4, Chromebox, Chromebook, t440s, nvidia, Intel

PC Perspective Podcast #286 - 02/06/2014

Join us this week as we discuss the release of AMD Mantle, Battlefield 4 Performance, Chromeboxes and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

 
Program length: 1:03:08
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
    1. Jeremy: The Cyberith Virtualizer would be nice to go with that Oculus Rift you should buy me
  4. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!

 

Video Perspective: Free to Play Games on the A10-7850K vs. Intel Core i3 + GeForce GT 630

Subject: Graphics Cards, Processors | January 31, 2014 - 01:36 PM |
Tagged: 7850k, A10-7850K, amd, APU, gt 630, Intel, nvidia, video

As a follow-up to our first video posted earlier in the week, which looked at the A10-7850K and the GT 630 from NVIDIA in five standard games, this time we compare the A10-7850K APU against the same combination of Intel and NVIDIA hardware in five of 2013's top free-to-play games.

UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing.  Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface.  You can see a comparison of the three current GT 630 options on NVIDIA's website here.

If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.

Video Perspective: 2013 Games on the A10-7850K vs. Intel Core i3 + GeForce GT 630

Subject: Graphics Cards, Processors | January 29, 2014 - 12:44 PM |
Tagged: video, nvidia, Intel, gt 630, APU, amd, A10-7850K, 7850k

The most interesting aspect of the new Kaveri-based APUs from AMD, in particular the A10-7850K part, is how it improves mainstream gaming performance.  AMD has always stated that these APUs challenge the need for low-cost discrete graphics, and when we got the new APU in the office we ran a couple of quick tests to see how much validity there is to that claim.

In this short video we compare the A10-7850K APU against a combination of the Intel Core i3-4330 and GeForce GT 630 discrete graphics card in five of 2013's top PC releases.  I think you'll find the results pretty interesting.

UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing.  Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface.  You can see a comparison of the three current GT 630 options on NVIDIA's website here.

If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.

NVIDIA still holds the OpenGL crown on Linux; AMD is getting better though

Subject: General Tech | January 23, 2014 - 11:58 AM |
Tagged: opengl, linux, amd, nvidia

If you are a Linux user who prefers OpenGL graphics, there is still a huge benefit to choosing NVIDIA over AMD.  The tests Phoronix just completed show that the GTX 680, 770 and 780 all perform significantly faster than the R9 290, with even the older GTX 550 Ti and 650 GPUs outperforming AMD's best in some benchmarks.  That said, AMD is making important improvements to their open source drivers, which is where they lag furthest behind NVIDIA.  The new RadeonSI Gallium3D driver for the HD 7000 series shows significant performance improvements when paired with the new 3.13 kernel; while it still falls a bit behind the Catalyst driver, it is now much closer to the performance of the proprietary option.  For older cards the performance increase is nowhere near as impressive, but certain benchmarks do show the Gallium3D driver providing at least some improvement.  It is a pity the Source engine isn't behaving properly during benchmarks, which is why no tests were run on Valve's games, but that should be solved in the near future.
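
If you want to see where your own Linux box stands before digging into Phoronix's numbers, it helps to confirm which driver and OpenGL version you are actually running.  A minimal sketch along those lines, assuming the glxinfo utility (from the mesa-utils package on most distros) is installed:

```python
# Minimal sketch: report the active OpenGL vendor, renderer and version on Linux.
# Assumes the `glxinfo` utility (mesa-utils package) is installed.
import subprocess

def opengl_info():
    output = subprocess.run(["glxinfo"], capture_output=True, text=True,
                            check=True).stdout
    for line in output.splitlines():
        if line.startswith(("OpenGL vendor string",
                            "OpenGL renderer string",
                            "OpenGL version string")):
            print(line.strip())

if __name__ == "__main__":
    opengl_info()
```

On the NVIDIA binary driver the vendor string will name NVIDIA Corporation; on the open source stack the renderer line will mention Gallium.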


"In new tests conducted last week with the latest AMD and NVIDIA binary graphics drivers, the high-end AMD GPUs still really aren't proving much competition to NVIDIA's Kepler graphics cards. Here's a new 12 graphics card comparison on Ubuntu."

Here is some more Tech News from around the web:

Tech Talk

Source: Phoronix

Even More NVIDIA ShadowPlay Features with 1.8.2

Subject: General Tech, Graphics Cards | January 23, 2014 - 12:29 AM |
Tagged: ShadowPlay, nvidia, geforce experience

NVIDIA has been upgrading their GeForce Experience just about once per month, on average. Most of their attention has been focused on ShadowPlay, their video capture and streaming service for DirectX-based games. GeForce Experience 1.8.1 brought streaming to Twitch and the ability to overlay the user's webcam.

This time they add a little bit more control in how ShadowPlay records.


Until this version, users could choose between "Low", "Medium", and "High" quality settings. GeForce Experience 1.8.2 adds "Custom", which allows manual control over resolution, frame rate, and bit rate. NVIDIA wants to make it clear: frame rate controls the number of images per second and bit rate controls the file size per second. Reducing the frame rate without adjusting the bit rate will result in a file of the same size (just with better quality per frame).
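
To illustrate the distinction NVIDIA is drawing, here is a tiny sketch of the arithmetic; the bit rate and duration are made-up example values, not ShadowPlay presets.

```python
# File size is driven by bit rate and duration, not frame rate.
# The numbers below are made-up example values.
def file_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    return bitrate_mbps * duration_s / 8  # megabits -> megabytes

duration = 60    # one minute of recorded gameplay, in seconds
bitrate = 50     # recording bit rate in Mbps

for fps in (30, 60):
    print(f"{fps} fps @ {bitrate} Mbps for {duration}s "
          f"-> {file_size_mb(bitrate, duration):.0f} MB")
# Both lines report 375 MB; the lower frame rate just spends more bits per frame.
```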

Also with this update, NVIDIA allows users to set a push-to-talk key. I expect this will be mostly useful for Twitch streaming in a crowded dorm or household. Only transmitting your voice when you have something to say prevents someone else from accidentally transmitting theirs globally and instantaneously.

GeForce Experience 1.8.2 is available for download at the GeForce website. Users with a Fermi-based GPU will no longer be pushed GeForce Experience (because it really does not do anything for those graphics cards). The latest version can always be manually downloaded, however.

Rumor: NVIDIA GeForce GTX TITAN Black and GTX 790 Incoming

Subject: Graphics Cards | January 21, 2014 - 12:49 PM |
Tagged: rumor, nvidia, kepler, gtx titan black, gtx titan, gtx 790

How about some fresh graphics card rumors for your Tuesday afternoon?  The folks at VideoCardz.com have collected some information about two potential NVIDIA GeForce cards that are going to hit your pocketbook hard.  If you thought the mid-range GPU market was crowded, wait until you see the changes NVIDIA might have for you soon on the high end.

First up is the NVIDIA GeForce GTX TITAN Black Edition, a card that will actually have the same specifications as the GTX 780 Ti but with full-performance double-precision floating point and a move from 3GB to 6GB of memory.  The all-black version of the GeForce GTX 700-series cooler is particularly awesome looking.  


Image from VideoCardz.com

The new TITAN would sport the same GPU as GTX 780 Ti, only TITAN BLACK would have higher double precision computing performance, thus more FP64 CUDA cores. The GTX TITAN Black Edition is also said to feature 6GB memory buffer. This is twice as much as GTX 780 Ti, and it pretty much confirms we won’t be seeing any 6GB Ti’s.

The rest is pretty much well known, TITAN BLACK has 2880 CUDA cores, 240 TMUs and 48 ROPs.

VideoCardz.com says this will come in at $999.  If true, this is a pure HPC play as the GTX 780 Ti would still offer the same gaming performance for enthusiasts.  

Secondly, there looks to be an upcoming dual-GPU graphics card using a pair of GK110 GPUs that will be called the GeForce GTX 790.  The specifications that VideoCardz.com says it has indicate that each GPU will have 2496 enabled CUDA cores and a smaller 320-bit memory interface, with 5GB designated for each GPU.  Cutting back on the memory interface, shader counts and even clock speeds would allow NVIDIA to manage power consumption at the targeted 300 watt level.  
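
The 320-bit/5GB pairing is what you would expect from the way GDDR5 is attached: ten 32-bit channels with 512MB behind each one.  Here is a rough sketch of that arithmetic; the 6 Gbps effective memory speed is my assumption for illustration, since the rumored table does not pin down a memory clock.

```python
# Rough arithmetic behind the rumored GTX 790 memory configuration (per GPU).
bus_width_bits = 320
channels = bus_width_bits // 32           # ten 32-bit GDDR5 channels
memory_per_channel_mb = 512               # assuming 512 MB behind each channel

capacity_gb = channels * memory_per_channel_mb / 1024
print(f"Capacity per GPU: {capacity_gb:.0f} GB")        # 5 GB

effective_speed_gbps = 6.0                # assumed 6 Gbps effective GDDR5 speed
bandwidth_gbs = (bus_width_bits / 8) * effective_speed_gbps
print(f"Bandwidth per GPU: {bandwidth_gbs:.0f} GB/s")   # 240 GB/s at 6 Gbps
```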


Image from VideoCardz.com

Head over to VideoCardz.com for more information about these rumors but if all goes as they expect, you'll hear about these products quite a bit more in February and March.

What do you think?  Are these new $1000 graphics cards something you are looking forward to?  

Source: VideoCardz

GeForce GTX 750 Ti May Be Maxwell... and February?

Subject: General Tech, Graphics Cards | January 20, 2014 - 01:19 AM |
Tagged: maxwell, nvidia

Well, this is somewhat unexpected (and possibly wrong). Maxwell, NVIDIA's new architecture to replace Kepler, is said to appear in February in the form of a GeForce GTX 750 Ti. The rumors, which sound iffy to me, claim that this core will be produced at TSMC on a 28nm fabrication technology and later transition to their 20nm lines.

As if the 700-series family tree was not diverse enough.


2013 may have been much closer than expected.

Swedish site SweClockers has been contacted by "sources" who claim that NVIDIA has already alerted partners to prepare a graphics card launch. Very little information is given beyond that. They do not even have access to a suggested GM1## architecture code. They just claim that partners should expect a new video card on the 18th of February (what type of launch that will be is also unclear).

This also raises questions about why the mid-range card will come before the high-end. If the 28nm rumor is true, it could just be that NVIDIA did not want to wait around until TSMC could fabricate their high-end part if they already had an architecture version that could be produced now. It could be as simple as that.

The GeForce GTX 750 Ti is rumored to arrive in February to replace the GTX 650 Ti Boost.

Source: SweClockers

NVIDIA G-Sync DIY Kit For ASUS VG248QE Monitor Now Available for $199

Subject: Displays | January 17, 2014 - 03:35 PM |
Tagged: vg248qe, nvidia, gaming, g-sync, DIY, asus

NVIDIA's new G-Sync variable refresh rate technology is slowly being rolled out to consumers in the form of new monitors and DIY upgrade kits that can be used to add G-Sync functionality to existing displays. The first G-Sync capable monitor to support the DIY upgrade kit path is the ASUS VG248QE, a 24" 1080p 144Hz TN panel. The monitor itself costs around $270, and you can now purchase a G-Sync DIY upgrade kit from NVIDIA for $199.
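
If you are wondering why variable refresh is worth a $199 board swap in the first place, here is a very rough sketch of the idea using made-up frame times; it illustrates generic refresh-on-frame-ready behavior, not NVIDIA's actual implementation.

```python
# Very rough illustration of fixed-refresh vsync versus G-Sync-style
# variable refresh.  Frame times are made up; this is not NVIDIA's algorithm.
REFRESH_MS = 1000 / 144                    # fixed 144 Hz refresh interval

frame_times_ms = [8.0, 12.5, 19.0, 7.2]    # hypothetical GPU render times

t = 0.0
for i, ft in enumerate(frame_times_ms, start=1):
    ready = t + ft
    # vsync: the finished frame waits for the next scheduled refresh
    vsync_shown = (ready // REFRESH_MS + 1) * REFRESH_MS
    # variable refresh: the panel refreshes as soon as the frame is ready
    vrr_shown = ready
    print(f"frame {i}: ready {ready:5.1f} ms | "
          f"vsync shows {vsync_shown:5.1f} ms | G-Sync shows {vrr_shown:5.1f} ms")
    t = ready
```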

The upgrade kit comes with a replacement controller board, power supply, HDMI cable, plastic spudger, IO shields, and installation instructions. Users will need to take apart the VG248QE monitor, remove the old PCBs and install the G-Sync board in their place. According to NVIDIA, the entire process takes about 30 minutes, though if this is your first time digging into monitor internals it will likely take closer to an hour.

The NVIDIA G-Sync DIY kit below the ASUS VG248QE monitor.

For help with installation, NVIDIA has posted a video of the installation process on YouTube. If you find text and photos easier, you can follow the installation guides written up for PC Perspective by Allyn Malventano and reader Levi Kendall. Both DIY kit reviews stated that the process, while a bit involved, was possible for most gamers to perform with a bit of guidance.

You can order the DIY upgrade kit yourself from this NVIDIA page.

Alternatively, ASUS is also releasing an updated version of the VG248QE monitor with the G-Sync board pre-installed in the first half of this year. This updated G-Sync monitor will have an MSRP of $399.

With the G-Sync kit at $199, will you be going the DIY path or waiting for a new monitor with the technology pre-installed?

Read more about NVIDIA's G-Sync display technology at PC Perspective including first impressions, installation, and more!

Source: NVIDIA

NVIDIA's take on AMD's under-documented free sync

Subject: General Tech | January 8, 2014 - 10:20 AM |
Tagged: tom petersen, nvidia, g-sync, free sync, CES 2014, amd

AMD's free sync has been getting a lot of well-deserved attention at this year's CES; Ryan had a chance to see it in action, so check out his look at AMD's under-reported and under-utilized feature if you haven't already.  AMD missed an opportunity with this technology, one which NVIDIA picked up on with G-Sync.  NVIDIA has now responded to The Tech Report's comments from yesterday: Tom Petersen stated that while free sync may be an alternative on laptops, desktop displays are a different beast, as they use different connections and there is generally a scaler chip sitting between the GPU and the panel.  Read his full comments here.


"AMD demoed its "free sync" alternative to G-Sync on laptops. Desktop displays are different, Nvidia says, and they may not support the variable refresh rate tech behind AMD's solution—at least not yet."

Here is some more Tech News from around the web:

Tech Talk