Inno3D Introduces a Single Slot GTX 1050 Ti Graphics Card

Subject: Graphics Cards | May 13, 2017 - 11:46 PM |
Tagged: SFF, pascal, nvidia, Inno3D, GP107

Hong Kong-based Inno3D recently introduced a single slot graphics card using NVIDIA’s mid-range GTX 1050 Ti GPU. The aptly named Inno3D GeForce GTX 1050 Ti (1-Slot Edition) combines the reference-clocked Pascal GPU, 4GB of GDDR5 memory, and a shrouded single fan cooler clad in red and black.

Around back, the card offers three display outputs: HDMI 2.0, DisplayPort 1.4, and DVI-D. The single slot cooler is a bit of an odd design, with a thin axial fan (rather than a centrifugal blower) sitting over a fake plastic fin array. Note that these fins do not actually cool anything; in fact, the PCB of the card does not even extend out to where the fan is. Presumably the fins are there primarily for aesthetics and secondarily to channel a bit of the air the fan pulls down. Air is pulled in, pushed over the actual GPU heatsink (under the shroud), and out the vent holes next to the display connectors, but much of it ends up recirculating through the case rather than being fully exhausted the way traditional dual slot (and some single slot) blower designs manage. I am curious how the choice of fan and vents will affect cooling performance.

Overclocking is going to be limited on this card; it comes out of the box at NVIDIA reference speeds of 1290 MHz base and 1392 MHz boost for the GPU’s 768 cores, with the 4GB of GDDR5 memory at 7 GT/s. The card measures 211 mm (~8.3”) long and should fit in just about any case. Since it pulls all of its power from the PCI Express slot, it might be a good option for getting a bit of gaming out of the slim tower retail PCs that OEMs like to sell these days.

Inno3D is not yet talking availability or pricing, but looking at their existing lineup I would expect an MSRP around $150.

Source: Tech Report

NVIDIA Announces Q1 2018 Results

Subject: Editorial | May 10, 2017 - 09:45 PM |
Tagged: nvidia, earnings, revenues, Q1 2018, Q1, v100, data center, automotive, gpu, gtx 1080 ti

NVIDIA had a monster Q1. The quarter before, the company posted the highest revenue numbers in its history. Q1 is typically a slightly more difficult time and usually the second weakest quarter of the year: the holiday rush is over and the market slows down. For NVIDIA, this was not exactly the case. While NVIDIA made $2.173 billion in Q4 2017, they came remarkably close to that with revenues of $1.937 billion. While a drop of roughly $236 million is significant, it is not an unexpected one. In fact, it shows NVIDIA coming in slightly stronger than expectations.

The past year has shown tremendous growth for NVIDIA.  Their GPUs remain strong, with the highest performing parts in the upper midrange and high end markets.  AMD simply has not been able to compete with NVIDIA at the top end, much less overtake them with higher performing parts.  GPUs still make up the largest portion of NVIDIA's income, but the company continues to invest in new areas, and those investments are starting to pay off.

Automotive is still in the growth stages for the company, but they have successfully taken the Tegra division and moved it away from the cellphone and tablet markets.  NVIDIA continues to support their Shield products, but the main focus looks to be the automotive industry with these high performing, low power parts that sport advanced graphics capabilities.  Professional graphics continues to be a stronghold for NVIDIA.  While it dropped quite a bit from the previous quarter, it is a high margin area that helps bolster revenues.

The biggest mover over this past year has been the data center.  Last year NVIDIA focused on delivering entire solutions to the market as well as their individual GPUs.  In the span of two years they have gone from essentially no income in this area to a $400 million quarter.  That is simply tremendous growth in an area that is still relatively untapped when it comes to GPU compute.

NVIDIA continues to be very aggressive in their product design and introductions.  They have simply owned the $300+ range of graphics cards with the GTX 1070, GTX 1080, and the recently introduced GTX 1080 Ti, to say nothing of the even higher end Titan Xp that is priced well above most enthusiasts’ budgets.  Today they announced the V100 chip, the first glimpse we have of a high end part running on TSMC’s new 12nm FinFET process.  It features 16 GB of HBM2 memory and a whopping 21 billion transistors in total.

Next quarter looks to be even better than this one, which is a shock because Q2 has traditionally been the slowest quarter of the year.  NVIDIA expects around $1.95 billion in revenue, an actual increase from Q1.  NVIDIA is also rewarding shareholders, not only with a quarterly dividend but also by actively buying back shares (which tends to keep share prices healthy).  Early last year NVIDIA had a share price of around $30; today it is trending well above $100.

If NVIDIA keeps this up while continuing to expand in automotive and the data center, it is a fairly safe bet that they will easily top $8 billion in revenue for the year.  Q3 and Q4 will be stronger if they continue to advance in those areas while retaining marketshare in the GPU market.  With rumors hinting that AMD will not have a product that can top the GTX 1080 Ti, it is a safe bet that NVIDIA can easily adjust their prices across the board to stay competitive with whatever AMD throws at them.

It is interesting to look back to when AMD was shopping around for a graphics firm and wonder what could have happened.  Hector Ruiz was in charge of AMD and tried to negotiate a deal with NVIDIA.  Rumor has it that Huang would not agree to it unless he was CEO.  Hector laughed and talked to ATI, which was more than happy to sell (and cover up some real weaknesses in the company).  We all know what happened to Hector and how his policies and actions started the spiral that AMD is only now recovering from.  What would it have been like if Jensen had actually become CEO of that merged company?

Source: NVIDIA

NVIDIA Announces Tesla V100 with Volta GPU at GTC 2017

Subject: Graphics Cards | May 10, 2017 - 01:32 PM |
Tagged: v100, tesla, nvidia, gv100, gtc 2017

During the opening keynote to NVIDIA’s GPU Technology Conference, CEO Jen-Hsun Huang formally unveiled the latest GPU architecture and the first product based on it. The Tesla V100 accelerator is based on the Volta GPU architecture and features some amazingly impressive specifications. Let’s take a look.

                 | Tesla V100      | GTX 1080 Ti | Titan X (Pascal) | GTX 1080    | GTX 980 Ti  | TITAN X     | GTX 980     | R9 Fury X      | R9 Fury
GPU              | GV100           | GP102       | GP102            | GP104       | GM200       | GM200       | GM204       | Fiji XT        | Fiji Pro
GPU Cores        | 5120            | 3584        | 3584             | 2560        | 2816        | 3072        | 2048        | 4096           | 3584
Base Clock       | -               | 1480 MHz    | 1417 MHz         | 1607 MHz    | 1000 MHz    | 1000 MHz    | 1126 MHz    | 1050 MHz       | 1000 MHz
Boost Clock      | 1455 MHz        | 1582 MHz    | 1480 MHz         | 1733 MHz    | 1076 MHz    | 1089 MHz    | 1216 MHz    | -              | -
Texture Units    | 320             | 224         | 224              | 160         | 176         | 192         | 128         | 256            | 224
ROP Units        | 128 (?)         | 88          | 96               | 64          | 96          | 96          | 64          | 64             | 64
Memory           | 16GB            | 11GB        | 12GB             | 8GB         | 6GB         | 12GB        | 4GB         | 4GB            | 4GB
Memory Clock     | 878 MHz (?)     | 11000 MHz   | 10000 MHz        | 10000 MHz   | 7000 MHz    | 7000 MHz    | 7000 MHz    | 500 MHz        | 500 MHz
Memory Interface | 4096-bit (HBM2) | 352-bit     | 384-bit G5X      | 256-bit G5X | 384-bit     | 384-bit     | 256-bit     | 4096-bit (HBM) | 4096-bit (HBM)
Memory Bandwidth | 900 GB/s        | 484 GB/s    | 480 GB/s         | 320 GB/s    | 336 GB/s    | 336 GB/s    | 224 GB/s    | 512 GB/s       | 512 GB/s
TDP              | 300 watts       | 250 watts   | 250 watts        | 180 watts   | 250 watts   | 250 watts   | 165 watts   | 275 watts      | 275 watts
Peak Compute     | 15 TFLOPS       | 10.6 TFLOPS | 10.1 TFLOPS      | 8.2 TFLOPS  | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS    | 7.20 TFLOPS
Transistor Count | 21.1B           | 12.0B       | 12.0B            | 7.2B        | 8.0B        | 8.0B        | 5.2B        | 8.9B           | 8.9B
Process Tech     | 12nm            | 16nm        | 16nm             | 16nm        | 28nm        | 28nm        | 28nm        | 28nm           | 28nm
MSRP (current)   | lol             | $699        | $1,200           | $599        | $649        | $999        | $499        | $649           | $549

While we are low on details today, it appears that the fundamental compute units of Volta are similar to those of Pascal. The GV100 has 80 SMs grouped into 40 TPCs, for 5120 total CUDA cores, roughly 42% more than the GP100 GPU used on the Tesla P100 and 42% more than the GP102 GPU used on the GeForce GTX 1080 Ti. The structure of the GPU remains the same as GP100, with the CUDA cores organized as 64 single precision (FP32) and 32 double precision (FP64) units per SM.
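
Those figures fit together neatly; here is a quick sanity check using only the numbers quoted above:

```python
# GV100 shader layout as described: 80 SMs, each with 64 FP32 and 32 FP64 cores.
sms = 80
fp32_per_sm = 64
fp64_per_sm = 32

print(sms * fp32_per_sm)  # 5120 FP32 CUDA cores, the announced total
print(sms * fp64_per_sm)  # 2560 FP64 units, i.e. half rate, as on GP100
```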

Interestingly, NVIDIA has already told us the clock speed of this new product as well: 1455 MHz Boost, which is more than 100 MHz lower than the GeForce GTX 1080 Ti and 25 MHz lower than the Tesla P100.

Volta also adds support for a brand new type of compute unit known as the Tensor Core. With 640 of these on the GPU die, NVIDIA is directly targeting the neural network and deep learning fields. If this is your first time hearing the term, a tensor is essentially a multi-dimensional array of values, and the name has become ubiquitous in machine learning thanks in part to Google’s open-source TensorFlow library. Google has already invested in a tensor-specific processor of its own, and now NVIDIA throws its hat in the ring.

Adding Tensor Cores to Volta allows the GPU to do mass matrix processing for deep learning, on the order of a 12x improvement over Pascal’s capabilities using CUDA cores alone.
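
As a rough illustration of what each Tensor Core is built around, here is the equivalent math done with NumPy on the CPU: a small matrix multiply with half-precision (FP16) inputs accumulated into single-precision (FP32) results. This is only a sketch of the arithmetic, not NVIDIA’s API; the 4x4 matrix size and the per-core throughput in the closing comment come from NVIDIA’s own description of the unit.

```python
import numpy as np

# D = A x B + C with FP16 inputs and FP32 accumulation: the mixed-precision
# fused multiply-accumulate that a Tensor Core performs on 4x4 matrices.
A = np.random.rand(4, 4).astype(np.float16)
B = np.random.rand(4, 4).astype(np.float16)
C = np.random.rand(4, 4).astype(np.float32)

D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.dtype, D.shape)  # float32 (4, 4)

# Each Tensor Core performs 4x4x4 = 64 multiply-adds per clock, so 640 of them
# at ~1.455 GHz works out to roughly 120 TFLOPS of tensor math, which is where
# the ~12x figure over Pascal's CUDA-core-only throughput comes from.
```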

For users interested in standard usage models, including gaming, the GV100 GPU offers a 1.5x improvement in FP32 compute, up to 15 TFLOPS of theoretical performance, along with 7.5 TFLOPS of FP64. Other relevant specifications include 320 texture units, a 4096-bit HBM2 memory interface, and 16GB of memory on-module. NVIDIA claims a memory bandwidth of 900 GB/s, which works out to roughly 878 MHz per stack.
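
Those headline numbers line up with the table above. A back-of-the-envelope calculation, assuming the usual convention of one fused multiply-add (2 FLOPs) per CUDA core per clock:

```python
# Peak FP32 compute at the 1455 MHz boost clock.
cuda_cores = 5120
boost_ghz = 1.455
print(round(cuda_cores * 2 * boost_ghz / 1000, 1))  # ~14.9 TFLOPS ("up to 15")

# Memory bandwidth: 4096-bit HBM2 interface, double data rate at 878 MHz.
bus_bytes = 4096 // 8
mem_mhz = 878
print(round(bus_bytes * mem_mhz * 2 / 1000))  # ~899 GB/s, i.e. the 900 GB/s claim
```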

Maybe more impressive is the transistor count: 21.1 billion! NVIDIA claims that this is about the largest chip you can physically make with today’s technology. Considering it is being built on TSMC's 12nm FinFET process and has an 815 mm² die size, I see no reason to doubt them.

Shipping is scheduled for Q3 for the Tesla V100 – at least, that is when NVIDIA promises DGX-1 systems using the chip will reach developers.

I know many of you are interested in the gaming implications and timelines – sorry, I don’t have an answer for you yet. I will say that the bump from 10.6 TFLOPS to 15 TFLOPS is an impressive boost! But if the server variant of Volta isn’t due until Q3 of this year, I find it hard to think NVIDIA would bring the consumer version out faster than that. And whether or not NVIDIA offers gamers the chip with non-HBM2 memory is still a question mark for me and could directly impact performance and timing.

More soon!!

Source: NVIDIA

NVIDIA Releases VRWorks Audio 1.0

Subject: Graphics Cards | May 10, 2017 - 07:02 AM |
Tagged: vrworks, nvidia, audio

GPUs are good at processing large bundles of related tasks, saving die area by running the same instruction across several chunks of data at once. This is commonly used for graphics, where screens have two to eight million pixels (1080p to 4K), 3D models have thousands to millions of vertices, and so forth. Each instruction is probably executed hundreds, thousands, or millions of times, so this parallelism makes very efficient use of the silicon that stores and transforms the data.

Audio is another area with a lot of parallelism. A second of audio contains tens of thousands of sound pressure samples, and, better still, higher frequency sounds model reasonably well as rays, which can be traced. NVIDIA decided to repurpose their OptiX ray tracing technology to calculate these rays. Beyond the sort of architectural scene you often see in global illumination demos, they also integrated it into an Unreal Tournament test map.
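
To make the ray idea concrete, here is a toy sketch (my own illustration, not NVIDIA’s implementation) of what one traced audio path contributes once it has been found: a delay derived from the path length and a gain from spreading loss plus per-bounce absorption. A GPU path tracer like OptiX would evaluate enormous numbers of such paths in parallel and sum them into an impulse response that gets convolved with the dry audio.

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 C

def path_contribution(path_length_m, bounces=0, absorption=0.9):
    """Delay (seconds) and amplitude gain for one traced source-to-listener path."""
    delay = path_length_m / SPEED_OF_SOUND
    # 1/r spherical spreading, times a flat absorption factor per surface bounce.
    gain = (absorption ** bounces) / max(path_length_m, 1.0)
    return delay, gain

print(path_contribution(5.0))              # direct path: ~15 ms, relatively loud
print(path_contribution(12.0, bounces=1))  # one reflection: ~35 ms, quieter
```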

And now it’s been released, both as a standalone SDK and as an Unreal Engine 4.15 plug-in. I don’t know what its license specifically entails, because the source code requires logging into NVIDIA’s developer portal, but it looks like the plug-ins will be available to all users of supported engines.

Source: NVIDIA

GeForce Experience 3.6 Has Vulkan and OpenGL

Subject: Graphics Cards | May 10, 2017 - 03:53 AM |
Tagged: vulkan, ShadowPlay, opengl, nvidia, geforce experience

The latest version of GeForce Experience, 3.6, adds video capture (including screenshots and live streaming) support for OpenGL and Vulkan games. The catalog of titles supported by ShadowPlay (which I’m pretty sure NVIDIA wants to call Share now, despite referring to it by its old name in the blog post) now includes No Man’s Sky, DOOM, and Microsoft’s beloved OpenGL title: Minecraft.

The rest of the update focuses on tweaking a few interface elements, including its streaming panel, its video and screenshot upload panel, and its gallery. Access to the alternative graphics APIs was the clear headline-maker, however, opening the door to several large gaming groups, and potentially even more going forward.

GeForce Experience 3.6 is available now.

Source: NVIDIA

GTC 17: NVIDIA Demos (Professional) Multi-User VR

Subject: Graphics Cards | May 9, 2017 - 07:01 AM |
Tagged: VR, quadro, nvidia, gp102

Four Quadro P6000s installed in a single server, which looks like a 4U rack-mounted box, are shown running four HTC Vive Business Edition VR systems through virtual machines. It isn’t designed to be a shipping product, just a demo for NVIDIA’s GPU Technology Conference that was developed by their engineers, but that should get the attention of this trade show’s attendees, who are mostly enterprise-focused.

For context, this system has roughly equivalent GPU horsepower to four Titan Xps, albeit with twice the RAM and slightly different clocks; there’s plenty of power per headset to harness. Still, running this level of high-performance application on a virtual machine could be useful in a variety of business applications, from architectural visualization to, as NVIDIA notes, amusement parks.

Given that it’s just a proof-of-concept demo, you’ll need to build it yourself to get one. They didn’t mention using any special software, though.

Source: NVIDIA

Prey The Fourth Be With You??? GeForce 382.05 Drivers

Subject: Graphics Cards | May 4, 2017 - 10:24 PM |
Tagged: nvidia, graphics drivers

If you’re cringing while reading that headline, then rest assured I felt just as dirty writing it.

NVIDIA has released another graphics driver, 382.05, to align with a few new game releases: Prey, Battlezone, and Gears of War 4’s latest update. The first title is a first-person action-adventure game by Arkane Studios that releases tomorrow. Interestingly, the game runs on CryEngine rather than the studio's internally developed Void engine, as seen in Dishonored 2; Unreal Engine, which they used for the original Dishonored; or id Tech, which Bethesda’s parent company, ZeniMax, owns through id Software and which has a bit of name association with the franchise that this Prey reboots.

Cross-eyed yet? Good. Let’s move on.

Fans of Gears of War 4, specifically those who have multiple NVIDIA graphics cards, might be interested in SLI support for this DirectX 12-based title. As we’ve mentioned in the past, the process of load-balancing multiple GPUs changed going from DirectX 11 to DirectX 12: the API now expects the game engine to manage multiple adapters explicitly rather than leaving it to the driver. According to The Coalition, SLI support was technically available on 381.89 (and 17.4.4 for AMD CrossFire), but NVIDIA is advertising it with 382.05. I’m not sure whether it’s a timing-based push, or if they optimized the experience since 381.89, but you should probably update regardless.

The driver also adds or updates SLI profiles for Sniper: Ghost Warrior 3 and Warhammer 40,000: Dawn of War III. A bunch of bugs have been fixed too, such as “In a multi-display configuration, the extended displays are unable to enter sleep mode,” along with a couple of black screen and blue screen issues.

You can get them from GeForce Experience or NVIDIA’s website.

Source: NVIDIA

NVIDIA Releases GeForce 381.89 Drivers

Subject: Graphics Cards | April 28, 2017 - 09:02 PM |
Tagged: nvidia, graphics drivers

The latest Game Ready drivers from NVIDIA, 381.89, launched a couple of days before yesterday’s release of Warhammer 40,000: Dawn of War III. These drivers include optimizations for that game, as well as for Heroes of the Storm 2.0, Batman: Arkham VR, Rick and Morty: Virtual Rick-ality, and Wilson’s Heart.

Beyond game-specific optimizations, there isn’t a whole lot of difference between these and the previous drivers, 381.65. According to the release notes, idle voltage has been reduced in some circumstances, a crash in Sniper Elite 3 has been resolved, and two blue screen issues have been fixed. That said, there are occasionally undocumented changes that crop up.

You can pick up these drivers from NVIDIA’s website, or through GeForce Experience.

Source: NVIDIA

Acer Announces Predator X27 Gaming Monitor: 4K and HDR at 144 Hz

Subject: Displays | April 28, 2017 - 07:25 PM |
Tagged: acer, Predator, Predator X27, monitor, display, hdr, 4k, UHD, 144 Hz, g-sync, nvidia

Acer announced a number of products at their next@acer press event in New York yesterday, but this new monitor might take the cake: a 4K HDR display with a 144 Hz refresh rate. The Predator X27 combines just about every conceivable feature for a gaming monitor into one product, but don't expect this 27-inch monitor to be released at a budget price (pricing has not been announced).

"Acer’s Predator X27 portrays astonishingly vibrant visuals without motion blur thanks to a high 4K (3840x2160) resolution at a 144 Hz refresh rate, a fast 4 ms response time and a 1,000 nit peak brightness. Featuring Acer HDR Ultra technology, it offers the best possible contrast quality of the high dynamic range with advanced LED local dimming in 384 individually-controlled zones that shine light only when and where it is required. It not only delivers a broader, more deeply saturated color gamut, but a luminance range several times greater than that of traditional dynamic range monitors. By dimming the backlight behind parts of the screen displaying black, blacks appear deeper and darker on those parts of the panel, a significant bonus for people who play games with darker scenes."

Acer has posted a video about the Predator X27, embedded below:

Acer also announced a new curved gaming monitor, the Predator Z271UV, which offers an 1800R curve on its 27-inch panel, but for HDR you'll need to stick with the X27. Quantum dot technology is incorporated into both displays for wide color, and both feature NVIDIA G-SYNC variable refresh rate tech with ULMB (ultra low motion blur) along with Tobii eye tracking.

"Acer’s Predator Z271UV provides WQHD (2560x1440) resolution on a curved 1800R panel that puts every corner of the screen at the same distance from the gamer’s eyes – this creates more immersive gameplay with a wider field of view and increased perceived area of peripheral vision. It features a ZeroFrame edge-to-edge design perfect for use in multi-monitor setups, and provides spectacular color breadth covering 125% of the sRGB color space. It’s extremely fast with up to a 1 ms (3 ms native) response time that nearly eliminates motion blur and supports overclocking up to 165 Hz."

We await pricing and availability information for both monitors.

Source: Acer
Subject: Mobile
Manufacturer: Dell

Overview

The Dell Inspiron 15 7000 Gaming series has been part of the increasingly interesting sub-$1000 gaming notebook market since its introduction in 2015. We took a look at last year’s offering and were very impressed with the performance it had to offer, but slightly disappointed in the build quality.

Dell is back this year with an all-new industrial design for the Inspiron 15 7000 Gaming, along with updated graphics in the form of the GeForce GTX 1050 Ti.  Can an $850 gaming notebook possibly live up to expectations? Let’s take a closer look.

After three generations of the Dell Inspiron 15 Gaming product, it’s evident that Dell takes this market segment seriously. Alienware seems to have lost a bit of the hearts and minds of gamers in the high-end segment, but Dell has carved out a nice corner of the gaming market.

Dell Inspiron 15 7567 Gaming (configuration as reviewed)

Processor: Intel Core i5-7300HQ (Kaby Lake)
Graphics: NVIDIA GeForce GTX 1050 Ti (4GB)
Memory: 8GB DDR4-2400 (one DIMM)
Screen: 15.6-in 1920x1080
Storage: 256GB SanDisk X400 SATA M.2, plus an available 2.5" drive bay
Camera: 720p / dual digital array microphone
Wireless: Intel 3165 802.11ac + BT 4.2 (dual band, 1x1)
Connections: Ethernet, HDMI 2.0, 3x USB 3.0, SD, audio combo jack
Battery: 74 Wh
Dimensions: 384.9mm x 274.73mm x 25.44mm (15.15" x 10.82" x 1")
Weight: 5.76 lbs (2620 g)
OS: Windows 10 Home
Price: $849 - Dell.com

Let's just get this out of the way: for the $850 price tag of the model that Dell sent us for review, this is an amazing collection of hardware. Traditionally, laptops under $1000 have an obvious compromise, but it's difficult to find one here. Dedicated graphics, flash storage, a 1080p screen, and a large battery are all features that I look for in notebooks. Needless to say, my expectations for the Inspiron 15 Gaming are quite high.

Click here to continue reading our review of the Dell Inspiron 15 7000 Gaming.