Subject: General Tech, Graphics Cards | February 25, 2014 - 11:46 AM | Scott Michaud
Tagged: amd, Mantle, TrueAudio, Thief 4, thief
AMD released their Catalyst 14.2 Beta V1.3 graphics drivers today, coinciding with the launch of Thief. The game, developed by Eidos Montreal and published by Square Enix, is another entry in AMD's "Gaming Evolved" program and its "Never Settle" promotion. Soon, it will also support Mantle and TrueAudio.
Being Thief's launch driver, it provides optimizations for both single-GPU and Crossfire configurations in that title. It also provides fixes for other titles, especially Battlefield 4, which can now run Mantle with up to four GPUs. Battlefield 3 and 4 also support Frame Pacing on very high resolution (greater than 2560x1600) monitors in dual-card Crossfire. The driver also fixes a couple of bugs when using Crossfire with DirectX 9 games, missing textures in Minecraft, and corruption in X-Plane.
The Catalyst 14.2 Beta V1.3 driver is available now on AMD's website.
Subject: General Tech, Graphics Cards, Processors | February 25, 2014 - 10:33 AM | Scott Michaud
Tagged: Ivy Bridge, Intel, iGPU, haswell
Recently, Intel released updated drivers (build 3412, in separate 32-bit and 64-bit packages) for their Ivy Bridge and Haswell integrated graphics. The download was apparently published on January 29th, while its patch notes are dated February 22nd. It features expanded support for Intel Quick Sync Video Technology, allowing certain Pentium and Celeron-class processors to access the feature, as well as a claimed performance increase in OpenGL-based games. Probably the most famous OpenGL title of our time is Minecraft, although I do not know whether that specific game will see improvements (and if so, how much).
The new driver enables Quick Sync Video for the following processors:
- Pentium 3558U
- Pentium 3561Y
- Pentium G3220 (unsuffixed/T/TE)
- Pentium G3420 (unsuffixed/T)
- Pentium G3430
- Celeron 2957U
- Celeron 2961Y
- Celeron 2981U
- Celeron G1820 (unsuffixed/T/TE)
- Celeron G1830
Besides enabling the feature on these processors and the OpenGL performance improvements, the driver also fixes several bugs in each of its supported OSes. You can download the appropriate drivers from the Intel Download Center.
Mobile Gaming Powerhouse
Every once in a while, a vendor sends us a preconfigured gaming PC or notebook. We don't usually focus too much on these systems because so many of our readers are quite clearly DIY builders. Gaming notebooks are another beast, though: building a custom gaming notebook is a tough task without a horrible amount of headaches. So, for users looking for a ton of gaming performance in a mobile package, going with a machine like the ORIGIN PC EON17-SLX is the best option.
As the name implies, the EON17-SLX is a 17-in notebook that packs some really impressive specifications, including a Haswell processor and SLI GeForce GTX 780M GPUs.
|ORIGIN PC EON17-SLX|
|Processor||Core i7-4930MX (Haswell)|
|Cores / Threads||4 / 8|
|Graphics||2 x NVIDIA GeForce GTX 780M 4GB|
|System Memory||16GB Corsair Vengeance DDR3-1600|
|Storage||2 x 120GB mSATA SSD (RAID-0), 1 x Western Digital Black 750GB HDD|
|Wireless||Intel 7260 802.11ac|
|Screen||17-in 1920x1080 LED Matte|
|Optical||6x Blu-ray reader / DVD writer|
|Operating System||Windows 8.1|
Intel's Core i7-4930MX processor is actually a quad-core Haswell-based CPU, not an Ivy Bridge-E part like you might guess based on the part number. The GeForce GTX 780M GPUs each include 4GB of frame buffer (!!) and have very similar specifications to the desktop GTX 770 parts. Even though they run at lower clock speeds, a pair of these GPUs will provide a ludicrous amount of gaming performance.
As you would expect for a notebook with this much compute performance, it isn't a thin-and-light. It tips my scale at 9.5 pounds for the laptop alone and over 12 pounds with the power adapter included. The profile images below show not only many of the included features but also the size and form factor.
Subject: General Tech, Graphics Cards | February 20, 2014 - 02:45 PM | Ken Addison
Tagged: nvidia, mining, maxwell, litecoin, gtx 750 ti, geforce, dogecoin, coin, bitcoin, altcoin
As we have talked about on several different occasions, altcoin mining (anything that is NOT Bitcoin specifically) is a force on the current GPU market whether we like it or not. Traditionally, miners have bought only AMD-based GPUs, due to their performance advantage over the NVIDIA competition. However, with continued development of the cudaMiner application over the past few months, NVIDIA cards have been gaining performance in Scrypt mining.
The biggest performance change we've seen yet has come with a new version of cudaMiner released yesterday. This new version (2014-02-18) brings initial support for the Maxwell architecture, which was just released yesterday in the GTX 750 and 750 Ti. With support for Maxwell, mining starts to become a more compelling option with this new NVIDIA GPU.
With the new version of cudaMiner on the reference version of the GTX 750 Ti, we were able to achieve a hash rate of 263 KH/s, which is impressive when you compare it to the previous-generation, Kepler-based GTX 650 Ti, which tops out at about 150 KH/s.
As you may know from our full GTX 750 Ti review, the GM107 overclocks very well. We were able to push our sample to the highest configurable offset of +135 MHz, with an additional 500 MHz added to the memory frequency and a 31 mV bump to the voltage offset. All of this combined for a ~1200 MHz clock speed while mining and an additional 40 KH/s or so of performance, bringing us to just under 300 KH/s with the 750 Ti.
As we compare the performance of the 750 Ti to AMD GPUs and previous-generation NVIDIA GPUs, we start to see how impressively this card stacks up considering its $150 MSRP. For less than half the price of the GTX 770, and roughly the same price as an R7 260X, you can achieve the same hash rate.
When we look at power consumption based on the TDP of each card, this comparison only becomes more impressive. At 60W, there is no card that comes close to the performance of the 750 Ti when mining. This means you will spend less to run a 750 Ti than a R7 260X or GTX 770 for roughly the same hash rate.
Taking a look at the performance per dollar ratings of these graphics cards, we see the two top performers are the AMD R7 260X and our overclocked GTX 750 Ti.
However, when looking at the performance per watt differences across the field, the GTX 750 Ti looks even more impressive. While most miners may think they don't care about power draw, it can help your bottom line. Being able to buy a smaller, less expensive power supply moves up the payoff date for the hardware. This also bodes well for future Maxwell-based graphics cards that we will likely see released later in 2014.
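The value comparisons above boil down to two simple ratios. Here is a quick sketch: the 750 Ti numbers come from this article, while the R7 260X and GTX 770 hash rates and prices are my own rough assumptions based on the "roughly the same hash rate" claim (TDPs are the official board figures).

```python
# Hash rate (KH/s), approximate price (USD), and TDP (W) per card.
# 750 Ti figures are from this article; the other two rows are
# assumptions based on the "roughly the same hash rate" comparison.
cards = {
    "GTX 750 Ti": {"khs": 263, "price": 150, "tdp": 60},
    "R7 260X":    {"khs": 260, "price": 150, "tdp": 115},
    "GTX 770":    {"khs": 260, "price": 330, "tdp": 230},
}

for name, c in cards.items():
    print(f"{name}: {c['khs'] / c['price']:.2f} KH/s per $, "
          f"{c['khs'] / c['tdp']:.2f} KH/s per W")
```

At these figures the 260X and 750 Ti tie on performance per dollar, but the 750 Ti's 60W TDP puts it far ahead on performance per watt.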
Subject: Graphics Cards | February 19, 2014 - 01:43 PM | Jeremy Hellstrom
Tagged: geforce, gm107, gpu, graphics, gtx 750 ti, maxwell, nvidia, video
We finally saw Maxwell yesterday, with a new design for the SMs, called SMM, each of which consists of four blocks of 32 dedicated, non-shared CUDA cores. In theory that should allow NVIDIA to pack more SMMs onto the chip than they could with the previous SMX units. This new design debuted on a $150 card, which means we don't really get to see what it is capable of yet. At that price it competes with AMD's R7 260X and R7 265, at least if you can find them at their MSRPs rather than at inflated cryptocurrency-driven prices. Legit Reviews compared two overclocked GTX 750 Ti cards against those two cards, as well as the previous-generation GTX 650 Ti Boost, across a wide selection of games to see how the newcomer stacks up; you can read the results here.
That is of course after you read Ryan's full review.
"NVIDIA today announced the new GeForce GTX 750 Ti and GTX 750 video cards, which are very interesting to use as they are the first cards based on NVIDIA's new Maxwell graphics architecture. NVIDIA has been developing Maxwell for a number of years and have decided to launch entry-level discrete graphics cards with the new technology first in the $119 to $149 price range. NVIDIA heavily focused on performance per watt with Maxwell and it clearly shows as the GeForce GTX 750 Ti 2GB video card measures just 5.7-inches in length with a tiny heatsink and doesn't require any internal power connectors!"
Here are some more Graphics Card articles from around the web:
- MSI GTX 750 Ti Gaming Video Card Review @HiTech Legion
- NVIDIA GeForce GTX 750 Ti @ Benchmark Reviews
- ASUS GTX 750 OC 1 GB @ techPowerUp
- MSI GTX 750 Ti Gaming 2 GB @ techPowerUp
- NVIDIA GeForce GTX 750Ti the Arrival of Maxwell @HiTech Legion
- Palit GTX 750 Ti StormX Dual 2 GB @ techPowerUp
- The GTX 750 Ti Review; Maxwell Arrives @ Hardware Canucks
- Nvidia GeForce GTX 750 Ti vs. AMD Radeon R7 265 @ Legion Hardware
- MSI GTX750Ti OC Twin Frozr @ Kitguru
- NVIDIA GeForce GTX 750 Ti 2 GB @ techPowerUp
- NVIDIA GeForce GTX 750 Ti "Maxwell" On Linux @ Phoronix
- A quick look at Mantle on AMD's Kaveri APU @ The Tech Report
- Sapphire Radeon R9 Tri-X OC video card @ Hardwareoverclock
- AMD Radeon R9 290: Still Not Good For Linux Users @ Phoronix
- AMD Radeon R7 265 2GB Video Card Review @ Legit Reviews
- Sapphire Radeon R7 260X OC 2GB Graphics Card Review @ Techgage
- XFX Double Dissipation R9 280X @ [H]ard|OCP
An Upgrade Project
When NVIDIA started talking to us about the new GeForce GTX 750 Ti graphics card, one of the key points they emphasized was the potential for this first-generation Maxwell GPU to be used to upgrade smaller form factor or OEM PCs. Without the need for an external power connector, the GTX 750 Ti provided a clear performance delta over integrated graphics with minimal cost and minimal power consumption, so the story went.
Eager to put this theory to the test, we decided to put together a project looking at the upgrade potential of off-the-shelf OEM computers purchased locally. A quick trip down the road to Best Buy revealed a PC sales section dominated by laptops and all-in-ones, but with quite a few "tower" style desktop computers available as well. We purchased three different machines, each at a different price point and with a different primary processor configuration.
The lucky winners included a Gateway DX4885, an ASUS M11BB, and a Lenovo H520.
Subject: General Tech, Graphics Cards | February 18, 2014 - 09:01 PM | Scott Michaud
Tagged: raptr, gaming evolved, amd
The AMD Gaming Evolved App updates your drivers, optimizes your game settings, streams your gameplay to Twitch, accesses some social media platforms, and now gives prizes. Points are given for playing games using the app, optimizing game settings, and so forth. These can be exchanged for rewards ranging from free games to Sapphire R9-series graphics cards.
This program has been in beta for a little while now, without the ability to redeem points. The system has been restructured to encourage using the entire app, lowering the accumulation rate for playing games and adding other goals. Beta participants do not lose all of their points; rather, their balances are rescaled more in line with the new system.
Subject: General Tech, Graphics Cards | February 18, 2014 - 06:03 AM | Scott Michaud
Tagged: nvidia, gtx titan black, geforce titan, geforce
NVIDIA has just announced the GeForce GTX Titan Black. Based on the full high-performance Kepler (GK110) chip, it is mostly expected to be a lower cost development platform for GPU computing applications. All 2,880 single precision (FP32) CUDA cores and 960 double precision (FP64) CUDA cores are unlocked, yielding 5.1 TeraFLOPS of single precision and 1.7 TeraFLOPS of double precision floating point performance. The chip contains 1,536 KB of L2 cache and is paired with 6GB of video memory on the board.
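The single precision figure can be sanity-checked from the core count and clock. A quick sketch, assuming an 889 MHz base clock (NVIDIA's published number for the card, not something stated in this article):

```python
# Peak FP32 throughput = CUDA cores x 2 FLOPs per clock (one fused
# multiply-add counts as two operations) x clock frequency.
fp32_cores = 2880
base_clock_hz = 889e6  # assumed Titan Black base clock

fp32_tflops = fp32_cores * 2 * base_clock_hz / 1e12
print(f"{fp32_tflops:.2f} TFLOPS")  # ~5.12, matching the quoted 5.1
```

GK110's FP64 units run at one third of the FP32 rate, which is where the double precision figure comes from.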
The original GeForce GTX Titan launched last year, almost to the day. Also based on the GK110 design, it featured full double precision performance with only one SMX disabled. Of course, no component at the time contained a fully enabled GK110 processor. The first product with all 15 SMX units active was not realized until the Quadro K6000, announced in July but only available in the fall. It was followed by the GeForce GTX 780 Ti (with a fraction of its FP64 performance) in November, and the fully powered Tesla K40 less than two weeks after that.
For gaming applications, this card is expected to have comparable performance to the GTX 780 Ti... unless you can find a use for the extra 3GB of memory. Games do not benefit much from the extra 64-bit floating point performance because the majority of their calculations are done at 32-bit precision.
The NVIDIA GeForce GTX Titan Black is available today at a price of $999.
What we know about Maxwell
I'm going to go out on a limb and guess that many of you reading this review would not have normally been as interested in the launch of the GeForce GTX 750 Ti if a specific word hadn't been mentioned in the title: Maxwell. It's true, the launch of GTX 750 Ti, a mainstream graphics card that will sit in the $149 price point, marks the first public release of the new NVIDIA GPU architecture code named Maxwell. It is a unique move for the company to start at this particular point with a new design, but as you'll see in the changes to the architecture as well as the limitations, it all makes a certain bit of sense.
For those of you that don't really care about the underlying magic that makes the GTX 750 Ti possible, you can skip this page and jump right to the details of the new card itself. There I will detail the product specifications, performance comparison and expectations, etc.
If you are interested in learning what makes Maxwell tick, keep reading below.
The NVIDIA Maxwell Architecture
When NVIDIA first approached us about the GTX 750 Ti, they were very light on details about the GPU powering it. Even though it was confirmed to be built on Maxwell, the company hadn't yet decided whether to do a full architecture deep dive with the press. In the end, they went somewhere between the full detail we are used to getting with a new GPU design and their original, passive stance. It looks like we'll have to wait for the enthusiast-class GPU release to get the full story, but I think the details we have now paint the picture quite clearly.
During the course of designing the Kepler architecture, and then implementing it in the Tegra line in the form of the Tegra K1, NVIDIA's engineering team developed a better sense of how to improve the performance and efficiency of the basic compute design. Kepler was a huge leap forward compared to Fermi, and Maxwell promises to be equally revolutionary. NVIDIA wanted to address GPU power consumption while also finding ways to extract more performance from the architecture at the same power levels.
The logic of the GPU design remains similar to Kepler. There is a Graphics Processing Cluster (GPC) that houses Streaming Multiprocessors (SMs) built from a large number of CUDA cores (stream processors).
GM107 Block Diagram
Readers familiar with the look of Kepler GPUs will instantly see changes in the organization of the various blocks of Maxwell. There are more divisions, more groupings, and fewer CUDA cores "per block" than before. As it turns out, this reorganization is part of what enabled NVIDIA to improve performance and power efficiency with the new GPU.
Subject: General Tech, Graphics Cards | February 14, 2014 - 03:02 PM | Scott Michaud
Tagged: supply shortage, shortage, R9 290X, podcast, litecoin, dogecoin, bitcoin
UPDATE (Feb 14th, 11pm ET): As a commenter has pointed out below, suddenly, as if by magic, Newegg has lowered prices on the currently in stock R9 290X cards by $200. That means you can currently find them for $699 - only $150 over the expected MSRP. Does that change anything about what we said above or in the video? Not really. It only lowers the severity.
I am curious to know if this was done by Newegg voluntarily due to pressure from news stories such as these, lack of sales at $899 or with some nudging from AMD...
If you have been keeping up with our podcasts and reviews, you will know that AMD cards are great compute devices for their MSRP. Cryptocurrency assigns a value to that compute: run enough hashing work and you are rewarded with newly created tokens (or fees from validated transactions). Some people seem to think that GPUs are more valuable for that purpose than their MSRP, so retailers raise prices and people still buy them.
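As a toy illustration of that "hashing work", here is a minimal proof-of-work loop. SHA-256 stands in for scrypt (which Litecoin-family altcoins actually use, because it is deliberately memory-hard), and the header bytes and difficulty are made up for the example.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Increment a nonce until the block hash falls below the target.

    A higher difficulty_bits value means a smaller target, so more
    hashes are needed on average before one succeeds.
    """
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# With 12 difficulty bits, roughly 4096 hashes are needed on average.
nonce = mine(b"example-block-header", difficulty_bits=12)
print(f"found nonce {nonce}")
```

A GPU miner does exactly this, except it evaluates many thousands of nonces in parallel, which is why raw hash rate scales with the card's compute throughput.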
Currently, the cheapest R9 290X is being sold for $900. This is a 64% increase over AMD's intended $549 MSRP. They are not even the ones receiving this money!
This shortage also affects other products such as Corsair's 1200W power supply. Thankfully, only certain components are necessary for mining (mostly GPUs and a lot of power) so at least we are not seeing the shortage spread to RAM, CPUs, APUs, and so forth. We noted a mining kit on Newegg which was powered by a Sempron processor. This line of cheap and low-performance CPUs has not been updated since 2009.
We have tracked GPU shortages before. We did semi-regular availability checks during the GeForce GTX 680 and 690 launch windows; the former was out of stock for over two months after its launch. Those cards also sometimes strayed slightly from their MSRPs.
Be sure to check out the clip (above) for a nice, 15-minute discussion.