New Intel Graphics Drivers Further Spread Quick Sync Video

Subject: General Tech, Graphics Cards, Processors | February 25, 2014 - 01:33 PM |
Tagged: Ivy Bridge, Intel, iGPU, haswell

Recently, Intel released the 15.33.14.3412 (15.33.14.64.3412 for 64-bit) drivers for their Ivy Bridge and Haswell integrated graphics. The download was apparently published on January 29th, while its patch notes are dated February 22nd. It features expanded support for Intel Quick Sync Video Technology, allowing certain Pentium- and Celeron-class processors to access the feature, as well as an alleged performance increase in OpenGL-based games. Probably the most famous OpenGL title of our time is Minecraft, although I do not know if that specific game will see improvements (and if so, how much).

Intel-logo.svg_.png

The new driver enables Quick Sync Video for the following processors:

  • Pentium 3558U
  • Pentium 3561Y
  • Pentium G3220 (unsuffixed/T/TE)
  • Pentium G3420 (unsuffixed/T)
  • Pentium G3430
  • Celeron 2957U
  • Celeron 2961Y
  • Celeron 2981U
  • Celeron G1820 (unsuffixed/T/TE)
  • Celeron G1830

Besides adding support for these processors and improving OpenGL performance, the driver naturally fixes several bugs in each of its supported OSes. You can download the appropriate drivers from the Intel Download Center.

Source: Intel
Manufacturer: ORIGIN PC

Mobile Gaming Powerhouse

Every once in a while, a vendor sends us a preconfigured gaming PC or notebook.  We don't usually focus too much on these systems because so many of our readers are quite clearly DIY builders.  Gaming notebooks are another beast, though: building a custom one is a tough task that involves a horrible number of headaches.  So, for users looking for a ton of gaming performance in a mobile package, going with a machine like the ORIGIN PC EON17-SLX is often the best option.

IMG_9494.JPG

As the name implies, the EON17-SLX is a 17-in notebook with some really impressive specifications, including a Haswell processor and a pair of GeForce GTX 780M GPUs in SLI.

  ORIGIN PC EON17-SLX
  Processor: Core i7-4930MX (Haswell)
  Cores / Threads: 4 / 8
  Graphics: 2 x NVIDIA GeForce GTX 780M 4GB
  System Memory: 16GB Corsair Vengeance DDR3-1600
  Storage: 2 x 120GB mSATA SSD (RAID-0), 1 x Western Digital Black 750GB HDD
  Wireless: Intel 7260 802.11ac
  Screen: 17-in 1920x1080 LED Matte
  Optical: 6x Blu-ray reader / DVD writer
  Extras: Thunderbolt
  Operating System: Windows 8.1
  Price: ~$4500

Intel's Core i7-4930MX processor is actually a quad-core Haswell-based CPU, not an Ivy Bridge-E part as you might guess from the part number.  The GeForce GTX 780M GPUs each include 4GB of frame buffer (!!) and have very similar specifications to the desktop GTX 770.  Even though they run at lower clock speeds, a pair of these GPUs will provide a ludicrous amount of gaming performance.

As you would expect for a notebook with this much compute performance, it isn't a thin-and-light. My scale reads 9.5 pounds for the laptop alone and over 12 pounds with the power adapter included.  The profile images below show not only many of the included features but also the size and form factor.

Continue reading our review of the ORIGIN PC EON17-SLX Gaming Notebook!!

NVIDIA Coin Mining Performance Increases with Maxwell and GTX 750 Ti

Subject: General Tech, Graphics Cards | February 20, 2014 - 05:45 PM |
Tagged: nvidia, mining, maxwell, litecoin, gtx 750 ti, geforce, dogecoin, coin, bitcoin, altcoin

As we have talked about on several different occasions, altcoin mining (anything that is NOT Bitcoin specifically) is a force on the current GPU market whether we like it or not. Traditionally, miners have bought only AMD-based GPUs, due to their performance advantage over the NVIDIA competition. However, with continued development of the cudaMiner application over the past few months, NVIDIA cards have been gaining performance in Scrypt mining.

The biggest performance change we've seen yet has come with a new version of cudaMiner released yesterday. This new version (2014-02-18) brings initial support for the Maxwell architecture, which was just released yesterday in the GTX 750 and 750 Ti. With support for Maxwell, mining starts to become a more compelling option with this new NVIDIA GPU.

With the new version of cudaMiner on the reference version of the GTX 750 Ti, we were able to achieve a hash rate of 263 KH/s, impressive when you compare it to the previous-generation, Kepler-based GTX 650 Ti, which tops out at about 150 KH/s.

IMG_9552.JPG

As you may know from our full GTX 750 Ti review, the GM107 overclocks very well. We were able to push our sample to the highest configurable offset of +135 MHz, with an additional 500 MHz added to the memory frequency and a 31 mV bump to the voltage offset. All of this combined for a ~1200 MHz clock speed while mining and an additional 40 KH/s or so of performance, bringing us to just under 300 KH/s with the 750 Ti.

perf.png

As we compare the 750 Ti to AMD GPUs and previous-generation NVIDIA GPUs, we start to see how impressively this card stacks up considering its $150 MSRP. For less than half the price of the GTX 770, and roughly the same price as an R7 260X, you can achieve the same performance.

power.png

When we look at power consumption based on the TDP of each card, this comparison only becomes more impressive. At 60W, no card comes close to the performance of the 750 Ti when mining. This means you will spend less to run a 750 Ti than an R7 260X or GTX 770 for roughly the same hash rate.

perfdollar.png

Taking a look at the performance per dollar ratings of these graphics cards, we see the two top performers are the AMD R7 260X and our overclocked GTX 750 Ti.

perfpower.png

However, when looking at the performance per watt differences across the field, the GTX 750 Ti looks even more impressive. While most miners may think they don't care about power draw, it can help your bottom line: being able to buy a smaller, less expensive power supply moves up the payoff date for the hardware.  This also bodes well for future Maxwell-based graphics cards that we will likely see released later in 2014.
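The arithmetic behind these charts is straightforward. As a rough sketch using the figures quoted above (263 KH/s stock and ~300 KH/s overclocked at $150 and 60W for the 750 Ti; the GTX 650 Ti's price and TDP here are approximate values we've assumed, not numbers from the charts):

```python
# Hash-rate efficiency math using the figures quoted in the article.
# The GTX 650 Ti price/TDP are approximate assumptions for comparison.
cards = {
    # name: (hash rate in KH/s, street price in USD, TDP in watts)
    "GTX 750 Ti (stock)": (263, 150, 60),
    "GTX 750 Ti (OC)":    (300, 150, 60),
    "GTX 650 Ti":         (150, 130, 110),  # assumed price/TDP
}

for name, (khs, price, watts) in cards.items():
    print(f"{name}: {khs / price:.2f} KH/s per dollar, "
          f"{khs / watts:.2f} KH/s per watt")
```

By this math the stock 750 Ti lands at roughly 1.75 KH/s per dollar and 4.4 KH/s per watt, which is why it dominates the performance-per-watt chart in particular.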

Continue reading our look at Coin Mining performance with the GTX 750 Ti and Maxwell!!

Was leading with a low end Maxwell smart?

Subject: Graphics Cards | February 19, 2014 - 04:43 PM |
Tagged: geforce, gm107, gpu, graphics, gtx 750 ti, maxwell, nvidia, video

We finally saw Maxwell yesterday, with a new design for the SMs, called SMM, each of which consists of four blocks of 32 dedicated, non-shared CUDA cores.  In theory that should allow NVIDIA to pack more SMMs onto the card than they could with the previous SMX units.  This new design was released on a $150 card, which means we don't really get to see what it is capable of yet.  At that price it competes with AMD's R7 260X and R7 265, at least if you can find them at their MSRPs and not at inflated cryptocurrency levels.  Legit Reviews compared the performance of two overclocked GTX 750 Ti cards to those two AMD cards, as well as to the previous-generation GTX 650 Ti Boost, on a wide selection of games to see how it stacks up performance-wise, which you can read here.

That is of course after you read Ryan's full review.

nvidia-geforce-gtx750ti-645x399.jpg

"NVIDIA today announced the new GeForce GTX 750 Ti and GTX 750 video cards, which are very interesting to use as they are the first cards based on NVIDIA's new Maxwell graphics architecture. NVIDIA has been developing Maxwell for a number of years and have decided to launch entry-level discrete graphics cards with the new technology first in the $119 to $149 price range. NVIDIA heavily focused on performance per watt with Maxwell and it clearly shows as the GeForce GTX 750 Ti 2GB video card measures just 5.7-inches in length with a tiny heatsink and doesn't require any internal power connectors!"


Manufacturer: Various

An Upgrade Project

When NVIDIA started talking to us about the new GeForce GTX 750 Ti graphics card, one of the key points they emphasized was the potential use for this first-generation Maxwell GPU to be used in the upgrade process of smaller form factor or OEM PCs. Without the need for an external power connector, the GTX 750 Ti provided a clear performance delta from integrated graphics with minimal cost and minimal power consumption, so the story went.

Eager to put this theory to the test, we decided to put together a project looking at the upgrade potential of off-the-shelf OEM computers purchased locally.  A quick trip down the road to Best Buy revealed a PC sales section dominated by laptops and all-in-ones, but with quite a few "tower" style desktop computers available as well.  We purchased three different machines, each at a different price point and with a different primary processor configuration.

The lucky winners included a Gateway DX4885, an ASUS M11BB, and a Lenovo H520.

IMG_9724.JPG

Continue reading An Upgrade Story: Can the GTX 750 Ti Convert OEM PCs to Gaming PCs?

AMD Gaming Evolved App with Redeemable Prizes

Subject: General Tech, Graphics Cards | February 19, 2014 - 12:01 AM |
Tagged: raptr, gaming evolved, amd

The AMD Gaming Evolved App updates your drivers, optimizes your game settings, streams your gameplay to Twitch, accesses some social media platforms, and now gives out prizes. Points are awarded for playing games using the app, optimizing game settings, and so forth. These can be exchanged for rewards ranging from free games to Sapphire R9-series graphics cards.

amd-raptr.jpg

This program has been in beta for a little while now, without the ability to redeem points. The system has been restructured to encourage using the entire app, lowering the accumulation rate for playing games and adding other goals. Beta participants do not lose their points; rather, totals are rescaled in line with the new system.

The Gaming Evolved prize program has launched today.

Press release after the teaser.

Source: raptr

NVIDIA Releases GeForce TITAN Black

Subject: General Tech, Graphics Cards | February 18, 2014 - 09:03 AM |
Tagged: nvidia, gtx titan black, geforce titan, geforce

NVIDIA has just announced the GeForce GTX Titan Black. Based on the full high-performance Kepler (GK110) chip, it is mostly expected to be a lower-cost development platform for GPU processing applications. All 2,880 single precision (FP32) CUDA cores and 960 double precision (FP64) CUDA cores are unlocked, yielding 5.1 TeraFLOPS of single precision and 1.3 TeraFLOPS of double precision performance. The chip contains 1536 KB of L2 cache and is paired with 6GB of video memory on the board.
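As a back-of-the-envelope sanity check on those figures: peak FLOPS for a GPU is roughly core count times 2 operations per clock (a fused multiply-add counts as two) times clock speed. Working backwards from the quoted 5.1 TFLOPS gives an implied clock; this is our own sketch, not an official spec:

```python
# Back-of-the-envelope: peak FLOPS = cores * 2 ops/clock (FMA) * clock (Hz).
fp32_cores = 2880
fp32_tflops = 5.1

# Implied clock speed from the quoted single-precision figure.
implied_clock_mhz = fp32_tflops * 1e12 / (fp32_cores * 2) / 1e6
print(f"Implied clock: {implied_clock_mhz:.0f} MHz")  # roughly 885 MHz
```

That implied ~885 MHz sits comfortably in GK110's typical boost range, so the quoted TFLOPS number checks out.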

nvidia-titan-black-2.jpg

The original GeForce GTX Titan launched last year, almost to the day. Also based on the GK110 design, it featured full double precision performance with only one SMX disabled. Of course, no product at the time contained a fully-enabled GK110 processor. The first product with all 15 SMX units active was the Quadro K6000, announced in July but only available in the fall. It was followed by the GeForce GTX 780 Ti (with a fraction of its FP64 performance) in November, and the fully powered Tesla K40 less than two weeks after that.

nvidia-titan-black-3.jpg

For gaming applications, this card is expected to have comparable performance to the GTX 780 Ti... unless you can find a use for the extra 3GB of memory. Games see little benefit from the extra 64-bit floating point performance because the majority of their calculations are done at 32-bit precision.

The NVIDIA GeForce GTX Titan Black is available today at a price of $999.

Source: NVIDIA
Manufacturer: NVIDIA

What we know about Maxwell

I'm going to go out on a limb and guess that many of you reading this review would not have normally been as interested in the launch of the GeForce GTX 750 Ti if a specific word hadn't been mentioned in the title: Maxwell.  It's true: the launch of the GTX 750 Ti, a mainstream graphics card that will sit at the $149 price point, marks the first public release of the new NVIDIA GPU architecture code-named Maxwell.  It is a unique move for the company to start at this particular point with a new design, but as you'll see in the changes to the architecture, as well as its limitations, it all makes a certain amount of sense.

For those of you that don't really care about the underlying magic that makes the GTX 750 Ti possible, you can skip this page and jump right to the details of the new card itself.  There I will detail the product specifications, performance comparison and expectations, etc.

If you are interested in learning what makes Maxwell tick, keep reading below.

The NVIDIA Maxwell Architecture

When NVIDIA first approached us about the GTX 750 Ti, they were very light on details about the GPU powering it.  Even though it was confirmed to be built on Maxwell, the company hadn't yet decided whether to do a full architecture deep dive with the press.  In the end they went somewhere between the full detail we are used to getting with a new GPU design and their original, passive stance.  It looks like we'll have to wait for the enthusiast-class GPU release to really get the full story, but I think the details we have now paint the picture quite clearly.

During the course of designing the Kepler architecture, and then implementing it in the Tegra line in the form of the Tegra K1, NVIDIA's engineering team developed a better sense of how to improve the performance and efficiency of the basic compute design.  Kepler was a huge leap forward compared to the likes of Fermi, and Maxwell promises to be equally revolutionary.  NVIDIA wanted to address GPU power consumption as well as find ways to extract more performance from the architecture at the same power levels.

The logic of the GPU design remains similar to Kepler's.  A Graphics Processing Cluster (GPC) houses Streaming Multiprocessors (SMs) built from a large number of CUDA cores (stream processors).

block.jpg

GM107 Block Diagram

Readers familiar with the look of Kepler GPUs will instantly see changes in the organization of the various blocks of Maxwell.  There are more divisions, more groupings and fewer CUDA cores "per block" than before.  As it turns out, this reorganization was part of the ability for NVIDIA to improve performance and power efficiency with the new GPU.  

Continue reading our review of the NVIDIA GeForce GTX 750 Ti and Maxwell Architecture!!

AMD Radeon R9 290X Hits $900 on Newegg. Thanks *coin

Subject: General Tech, Graphics Cards | February 14, 2014 - 06:02 PM |
Tagged: supply shortage, shortage, R9 290X, podcast, litecoin, dogecoin, bitcoin

UPDATE (Feb 14th, 11pm ET): As a commenter has pointed out below, suddenly, as if by magic, Newegg has lowered prices on the currently in-stock R9 290X cards by $200.  That means you can currently find them for $699, only $150 over the expected MSRP.  Does that change anything about what we said below or in the video?  Not really.  It only lowers the severity.

I am curious to know if this was done by Newegg voluntarily due to pressure from news stories such as these, lack of sales at $899 or with some nudging from AMD...

If you have been keeping up with our podcasts and reviews, you will know that AMD cards are great compute devices for their MSRP. That is exactly what cryptocurrency assigns a value to: run enough hashing tasks and you are rewarded with newly created tokens (or fees from validated transactions). Some people apparently consider GPUs more valuable for that purpose than their MSRP, so retailers raise prices and people still buy them.

amd-shortage-900.png

Currently, the cheapest R9 290X is being sold for $900, a 64% increase over AMD's intended $549 MSRP. AMD is not even the one receiving this extra money!
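For reference, the markup math is trivial to verify from the quoted prices:

```python
# Markup of the cheapest Newegg listing over AMD's intended MSRP.
msrp = 549
street = 900
markup_pct = (street - msrp) / msrp * 100
print(f"Markup over MSRP: {markup_pct:.0f}%")  # 64%
```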

This shortage also affects other products, such as Corsair's 1200W power supply. Thankfully, only certain components are necessary for mining (mostly GPUs and a lot of power), so at least we are not seeing the shortage spread to RAM, CPUs, APUs, and so forth. We noted a mining kit on Newegg that was powered by a Sempron processor, a line of cheap, low-performance CPUs that has not been updated since 2009.

We have historically kept up with GPU shortages, running semi-regular availability checks during the GeForce GTX 680 and 690 launch windows. The former was out of stock for over two months after its launch, and those cards also sometimes strayed slightly from their MSRPs.

Be sure to check out the clip (above) for a nice, 15-minute discussion.

Pitcairn rides again in the R7 265

Subject: Graphics Cards | February 13, 2014 - 02:31 PM |
Tagged: radeon, r7 265, pitcairn, Mantle, gpu, amd

Sometime in late February or March you will be able to purchase the R7 265 for around $150, a decent price for an entry-level GPU that will benefit those currently dependent on the GPU portion of an APU.  This leads to the question of its performance, and whether this Pitcairn refresh will really benefit a gamer on a tight budget.  Hardware Canucks tested it against the two NVIDIA cards closest in price: the GTX 650 Ti Boost, which is almost impossible to find, and the GTX 660 2GB, which is $40 more than the R7 265's MSRP.  The GTX 660 is faster overall, but when you look at the price-to-performance ratio the R7 265 is the more attractive offering.  Of course, with NVIDIA's Maxwell release just around the corner, this could change drastically.

If you already caught Ryan's review, you might have missed the short video he just added on the last page.

slides04.jpg

Crowded house

"AMD's R7 265 is meant to reside in the space between the R7 260X and R9 270, though performance is closer to its R9 sibling. Could this make it a perfect budget friendly graphics card?"
