Was leading with a low-end Maxwell smart?

Subject: Graphics Cards | February 19, 2014 - 04:43 PM |
Tagged: geforce, gm107, gpu, graphics, gtx 750 ti, maxwell, nvidia, video

We finally saw Maxwell yesterday, with a new SM design called the SMM, each of which consists of four blocks of 32 dedicated, non-shared CUDA cores.  In theory that should allow NVIDIA to pack more SMMs onto a chip than they could with the previous SMX units.  The new design was released on a $150 card, which means we don't really get to see what it is capable of yet.  At that price it competes with AMD's R7 260X and R7 265, at least if you can find them at their MSRPs and not at inflated cryptocurrency levels.  Legit Reviews compared the performance of two overclocked GTX 750 Ti cards to those two cards, as well as to the previous-generation GTX 650 Ti Boost, across a wide selection of games to see how it stacks up performance-wise, which you can read here.

That is of course after you read Ryan's full review.


"NVIDIA today announced the new GeForce GTX 750 Ti and GTX 750 video cards, which are very interesting to use as they are the first cards based on NVIDIA's new Maxwell graphics architecture. NVIDIA has been developing Maxwell for a number of years and have decided to launch entry-level discrete graphics cards with the new technology first in the $119 to $149 price range. NVIDIA heavily focused on performance per watt with Maxwell and it clearly shows as the GeForce GTX 750 Ti 2GB video card measures just 5.7-inches in length with a tiny heatsink and doesn't require any internal power connectors!"

Here are some more Graphics Card articles from around the web:

Graphics Cards

Author:
Manufacturer: Various

An Upgrade Project

When NVIDIA started talking to us about the new GeForce GTX 750 Ti graphics card, one of the key points they emphasized was the potential for this first-generation Maxwell GPU to be used to upgrade smaller form factor or OEM PCs. Without the need for an external power connector, the GTX 750 Ti provides a clear performance delta over integrated graphics at minimal cost and minimal power consumption, or so the story went.

Eager to put this theory to the test, we decided to put together a project looking at the upgrade potential of off-the-shelf OEM computers purchased locally.  A quick trip down the road to Best Buy revealed a PC sales section dominated by laptops and all-in-ones, but with quite a few "tower" style desktop computers available as well.  We purchased three different machines, each at a different price point and with a different primary processor configuration.

The lucky winners included a Gateway DX4885, an ASUS M11BB, and a Lenovo H520.


Continue reading An Upgrade Story: Can the GTX 750 Ti Convert OEM PCs to Gaming PCs?

NVIDIA Releases GeForce TITAN Black

Subject: General Tech, Graphics Cards | February 18, 2014 - 09:03 AM |
Tagged: nvidia, gtx titan black, geforce titan, geforce

NVIDIA has just announced the GeForce GTX Titan Black. Based on the full high-performance Kepler (GK110) chip, it is mostly expected to be a lower-cost development platform for GPU computing applications. All 2,880 single precision (FP32) CUDA cores and 960 double precision (FP64) CUDA cores are unlocked, yielding 5.1 TeraFLOPS of single precision and 1.3 TeraFLOPS of double precision performance. The chip contains 1536 KB of L2 cache and will be paired with 6GB of video memory on the board.
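As a sanity check, the quoted throughput numbers fall straight out of the core count and clock speed. A minimal sketch, assuming the card's published 889 MHz base clock and one fused multiply-add (two FLOPs) per core per cycle:

```python
# Back-of-the-envelope check of the quoted peak-throughput figures.
# Assumptions: 889 MHz base clock (NVIDIA's published spec) and one
# fused multiply-add (2 FLOPs) retired per CUDA core per cycle.

def peak_tflops(cores, clock_ghz, flops_per_cycle=2):
    return cores * clock_ghz * flops_per_cycle / 1000

fp32 = peak_tflops(2880, 0.889)   # ~5.12 TFLOPS, matching the 5.1 figure
fp64 = peak_tflops(960, 0.889)    # ~1.71 TFLOPS at the full base clock

print(f"FP32: {fp32:.2f} TFLOPS, FP64: {fp64:.2f} TFLOPS")
```

At the full 889 MHz, the 960 FP64 cores would peak around 1.7 TFLOPS, so the quoted 1.3 TFLOPS figure likely reflects the reduced clock the card runs at in full-speed double precision mode, as on the original Titan.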


The original GeForce GTX Titan launched last year, almost to the day. It too was based on the GK110 design and featured full double precision performance, but with one SMX disabled. Of course, no component at the time contained a fully enabled GK110 processor. The first product with all 15 SMX units active was not realized until the Quadro K6000, announced in July but only available in the fall. It was followed by the GeForce GTX 780 Ti (with a fraction of its FP64 performance) in November, and the fully powered Tesla K40 less than two weeks after that.


For gaming applications, this card is expected to have comparable performance to the GTX 780 Ti... unless you can find a use for the extra 3GB of memory. Games see little benefit from the extra 64-bit floating point (FP64) performance because the majority of their calculations are done at 32-bit precision.
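The gap between the two precisions is easy to see by round-tripping a value through a 32-bit float. This is a small standard-library illustration of why the precision difference rarely matters for games, not anything tied to the GPU itself:

```python
import struct

def to_fp32(x):
    # Round-trip a Python float (64-bit IEEE 754) through a 32-bit float.
    return struct.unpack('f', struct.pack('f', x))[0]

pi64 = 3.141592653589793
pi32 = to_fp32(pi64)
print(pi32)               # roughly 3.1415927; only ~7 significant digits survive
print(abs(pi64 - pi32))   # rounding error on the order of 1e-7
```

An error of one part in ten million is invisible in a rendered pixel, which is why games lean on FP32; scientific and financial compute workloads are where the Titan Black's unlocked FP64 cores earn their keep.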

The NVIDIA GeForce GTX Titan Black is available today at a price of $999.

Source: NVIDIA
Author:
Manufacturer: NVIDIA

What we know about Maxwell

I'm going to go out on a limb and guess that many of you reading this review would not normally have been as interested in the launch of the GeForce GTX 750 Ti if a specific word hadn't been mentioned in the title: Maxwell.  It's true, the launch of the GTX 750 Ti, a mainstream graphics card that will sit at the $149 price point, marks the first public release of NVIDIA's new GPU architecture, code-named Maxwell.  It is a unique move for the company to start at this particular point with a new design, but as you'll see in the changes to the architecture, as well as its limitations, it all makes a certain bit of sense.

For those of you that don't really care about the underlying magic that makes the GTX 750 Ti possible, you can skip this page and jump right to the details of the new card itself.  There I will detail the product specifications, performance comparisons and expectations, etc.

If you are interested in learning what makes Maxwell tick, keep reading below.

The NVIDIA Maxwell Architecture

When NVIDIA first approached us about the GTX 750 Ti, they were very light on details about the GPU powering it.  Even though they confirmed it was built on Maxwell, the company hadn't yet decided whether it would do a full architecture deep dive with the press.  In the end they went somewhere in between the full detail we are used to getting with a new GPU design and that original, passive stance.  It looks like we'll have to wait for the enthusiast-class GPU release to really get the full story, but I think the details we have now paint the picture quite clearly.

During the course of designing the Kepler architecture, and then implementing it in the Tegra line in the form of the Tegra K1, NVIDIA's engineering team developed a better sense of how to improve the performance and efficiency of the basic compute design.  Kepler was a huge leap forward compared to Fermi, and Maxwell promises to be equally revolutionary.  NVIDIA wanted to address GPU power consumption while also finding ways to extract more performance from the architecture at the same power levels.

The logical organization of the GPU remains similar to Kepler's: a Graphics Processing Cluster (GPC) houses Streaming Multiprocessors (SMs), each built from a large number of CUDA cores (stream processors).


GM107 Block Diagram

Readers familiar with the look of Kepler GPUs will instantly see changes in the organization of the various blocks of Maxwell.  There are more divisions, more groupings and fewer CUDA cores "per block" than before.  As it turns out, this reorganization is part of how NVIDIA improved performance and power efficiency with the new GPU.
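The reorganization works out to simple arithmetic. A quick sketch of the per-SMM core counts, using the SMM count from NVIDIA's published GM107 specification:

```python
# Core-count arithmetic for the SMM reorganization described above.
# Kepler's SMX packed 192 shared CUDA cores; Maxwell's SMM instead splits
# its cores into four independently scheduled blocks of 32.

cores_per_smm = 4 * 32          # 128 CUDA cores per SMM
smm_count_gm107 = 5             # SMMs in GM107 per NVIDIA's published spec
total_cores = smm_count_gm107 * cores_per_smm

print(total_cores)              # 640, the GTX 750 Ti's advertised core count
```

Each 32-core block having its own dedicated scheduling resources, rather than sharing them across 192 cores as in an SMX, is where NVIDIA claims much of the efficiency gain comes from.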

Continue reading our review of the NVIDIA GeForce GTX 750 Ti and Maxwell Architecture!!

Podcast #287 - AMD R7 265, Coin Mining's effect on GPU Prices, NVIDIA Earnings and more!

Subject: General Tech | February 14, 2014 - 02:11 PM |
Tagged: video, r9 270x, r7 265, r7 260x, podcast, nvidia, fusion-io, arm, amd, A17

PC Perspective Podcast #287 - 02/14/2014

Join us this week as we discuss the release of the AMD R7 265, Coin Mining's effect on GPU Prices, NVIDIA Earnings and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

 
Program length: 1:09:27
 
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
  4. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!

 

Author:
Subject: Editorial
Manufacturer: NVIDIA

It wouldn’t be February if we didn’t hear the Q4 FY14 earnings from NVIDIA!  NVIDIA does have a slightly odd way of expressing their quarters, but in the end it is all semantics.  They are not in fact living in the future, but I bet their product managers wish they could peer into the actual Q4 2014.  No, the whole FY14 thing relates back to when they made their IPO and how they started reporting.  To us mere mortals, Q4 FY14 actually represents Q4 2013.  Clear as mud?  Lord love the Securities and Exchange Commission and their rules.


The past quarter was a pretty good one for NVIDIA.  They came away with $1.144 billion in gross revenue and had a GAAP net income of $147 million.  This beat the Street’s estimate by a pretty large margin, and in response NVIDIA’s stock has gone up in after-hours trading.  This has certainly been a trying year for NVIDIA and the PC market in general, but they seem to have come out on top.

NVIDIA beat estimates primarily on the strength of the PC graphics division.  Many were focusing on the apparent decline of the PC market and assumed that NVIDIA would be dragged down by lower shipments.  On the contrary, it seems as though the gaming market and add-in sales on the PC helped to solidify NVIDIA’s quarter.  We can look at a number of factors that likely contributed to this uptick for NVIDIA.

Click here to read the rest of NVIDIA's Q4 FY2014 results!

Podcast #286 - AMD Mantle, Battlefield 4 Performance, Chromeboxes and more!

Subject: General Tech | February 6, 2014 - 03:14 PM |
Tagged: podcast, video, amd, Mantle, r9 290, 290x, battlefield 4, Chromebox, Chromebook, t440s, nvidia, Intel

PC Perspective Podcast #286 - 02/06/2014

Join us this week as we discuss the release of AMD Mantle, Battlefield 4 Performance, Chromeboxes and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

 
Program length: 1:03:08
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
    1. Jeremy: The Cyberith Virtualizer would be nice to go with that Oculus Rift you should buy me
  4. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!

 

Video Perspective: Free to Play Games on the A10-7850K vs. Intel Core i3 + GeForce GT 630

Subject: Graphics Cards, Processors | January 31, 2014 - 04:36 PM |
Tagged: 7850k, A10-7850K, amd, APU, gt 630, Intel, nvidia, video

As a follow up to our first video posted earlier in the week that looked at the A10-7850K and the GT 630 from NVIDIA in five standard games, this time we compare the A10-7850K APU against the same combination of the Intel and NVIDIA hardware in five of 2013's top free to play games.

UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing.  Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface.  You can see a comparison of the three current GT 630 options on NVIDIA's website here.

If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.

Video Perspective: 2013 Games on the A10-7850K vs. Intel Core i3 + GeForce GT 630

Subject: Graphics Cards, Processors | January 29, 2014 - 03:44 PM |
Tagged: video, nvidia, Intel, gt 630, APU, amd, A10-7850K, 7850k

The most interesting aspect of the new Kaveri-based APUs from AMD, in particular the A10-7850K part, is how they improve mainstream gaming performance.  AMD has long claimed that these APUs eliminate the need for low-cost discrete graphics, and when we got the new APU in the office we ran a couple of quick tests to see how much validity there is to that claim.

In this short video we compare the A10-7850K APU against a combination of the Intel Core i3-4330 and GeForce GT 630 discrete graphics card in five of 2013's top PC releases.  I think you'll find the results pretty interesting.

UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing.  Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface.  You can see a comparison of the three current GT 630 options on NVIDIA's website here.

If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.

NVIDIA still holds the OpenGL crown on Linux; AMD is getting better though

Subject: General Tech | January 23, 2014 - 02:58 PM |
Tagged: opengl, linux, amd, nvidia

If you are a Linux user who prefers OpenGL graphics, there is still a huge benefit to choosing NVIDIA over AMD.  The tests Phoronix just completed show that the GTX 680, 770 and 780 all perform significantly faster than the R9 290, with even the older GTX 550 Ti and 650 GPUs outperforming AMD's best in some benchmarks.  That said, AMD is making important improvements to its open source drivers, which is where it lags furthest behind NVIDIA.  The new RadeonSI Gallium3D driver for the HD 7000 series shows significant performance improvements when paired with the new 3.13 kernel; though still falling a bit behind the Catalyst driver, it is now much closer to the proprietary driver's performance.  For older cards the gains are nowhere near as impressive, but certain benchmarks do show this Gallium3D driver providing at least some improvement.  It's a pity the Source engine isn't behaving properly during benchmarks, which is why no tests were run on Valve's games, but that should be solved in the near future.


"In new tests conducted last week with the latest AMD and NVIDIA binary graphics drivers, the high-end AMD GPUs still really aren't proving much competition to NVIDIA's Kepler graphics cards. Here's a new 12 graphics card comparison on Ubuntu."

Here is some more Tech News from around the web:

Tech Talk

Source: Phoronix