Intel confirms first graphics chips will land in 2020

Subject: Graphics Cards | June 12, 2018 - 02:21 PM |
Tagged: Intel, graphics, gpu, raja koduri

This article first appeared on MarketWatch.

Intel CEO Brian Krzanich disclosed during an analyst event last week that the company will have its first discrete graphics chips available in 2020. This will mark the beginning of the chip giant's journey toward a portfolio of high-performance graphics products for various markets including gaming, data center, and AI.

Some previous rumors posited that Intel might make its graphics reveal at CES 2019 this coming January, but Intel never committed to that timeline. It would have been drastically overaggressive and in no way consistent with the development process of a new silicon design.

Back in November 2017, Intel brought Raja Koduri on board to lead the graphics and compute initiatives inside the company. Koduri was previously in charge of the graphics division at AMD, helping to develop and grow the Radeon brand, and his departure to Intel was expected to have a significant impact on the industry.

A typical development cycle for a complex graphics architecture and chip is three years, so even with this engineering talent, hitting the 2020 window is aggressive.

Intel did not go into detail about what performance level or target market this first discrete GPU solution might address, but Intel EVP of the Data Center Group Navin Shenoy confirmed that the company’s strategy will include solutions for data center segments (think AI, machine learning) along with client (think gaming, professional development).

This is part of Intel's wider AI and machine learning strategy, which includes these discrete graphics products alongside other options like the Xeon processor family, FPGAs from its acquisition of Altera, and custom AI chips like the Nervana-based NNP.

While the leader in the space, NVIDIA, maintains its position with graphics chips, it is modifying and augmenting these processors with additional features and systems to accelerate AI even more. It will be interesting to see how Intel plans to catch up in design and deployment.

Though few doubt Intel's capability in chip design, building a new GPU architecture from the ground up is not a small task. Intel needs to provide a level of performance and efficiency in the same ballpark as NVIDIA and AMD, within 20% or so. Doing that on the first attempt, while also building and fostering the necessary software ecosystem and tools around the new hardware, is a tough ask of any company, Silicon Valley juggernaut or not. Until the first options arrive in 2020 and we can gauge them, NVIDIA and AMD hold the leadership positions.

Both AMD and NVIDIA will be watching Intel with great interest as its GPU development accelerates. AMD's Forest Norrod, SVP of its data center group, recently stated in an interview that he didn't expect Koduri to "have any impact at Intel for at least another three years." If Intel can deliver on its 2020 target for the first in a series of graphics releases, it might put pressure on the two existing graphics giants sooner than most expected.

Source: MarketWatch

Computex 2018: PowerColor Makes the RX Vega 56 Nano Edition Official

Subject: Graphics Cards | June 8, 2018 - 08:22 AM |
Tagged: Vega Nano, SFF, RX Vega 56, powercolor, mini ITX, computex 2018, computex, amd

PowerColor showed off its new small form factor RX Vega 56-based graphics card at Computex 2018, finally making the card official and providing more information following last month's rumors and official teaser. The PowerColor RX Vega 56 Nano Edition is the spiritual successor to AMD's Fiji XT-based R9 Nano from 2015 and features an RX Vega 56 GPU with 8GB of HBM2 memory on a short dual slot graphics card measuring 170mm x 95mm x 38mm. In fact, PowerColor's RX Vega 56 Nano Edition has a PCB that is only 5mm longer than AMD's previous Nano card (according to TechPowerUp), and including the cooler it is less than 2 cm longer.

PowerColor's new SFF graphics card is a dual slot design cooled by a single 80mm fan and a dense aluminum heatsink covered by a black plastic shroud. The card is powered by one 8-pin and one 6-pin power connector and offers three DisplayPort 1.4 outputs and one HDMI 2.0b output.

The RX Vega 56 GPU features 56 CUs (compute units) with 3,584 shader processors and 224 texture units. PowerColor has kept the GPU at reference clockspeeds of 1,156 MHz base and up to 1,471 MHz boost. The 8GB of HBM2 memory is stock clocked at 800 MHz and connects to the GPU via a 2048-bit bus.
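
For those who like to sanity-check spec sheets, the headline numbers above fall out of simple arithmetic. Here is a minimal sketch; the 64-shaders-per-CU figure is a standard property of AMD's GCN compute units, and everything else comes from the specs above:

```python
# Back-of-the-envelope math for the RX Vega 56 spec sheet figures.

compute_units = 56
shaders_per_cu = 64                     # each GCN compute unit holds 64 shader processors
print(compute_units * shaders_per_cu)   # 3584 shader processors

# HBM2 bandwidth: bus width in bytes times the effective transfer rate.
# An 800 MHz memory clock at double data rate moves 1.6 GT/s per pin.
bus_width_bits = 2048
memory_clock_ghz = 0.8
transfers_per_clock = 2                 # HBM2 is double data rate
print((bus_width_bits / 8) * memory_clock_ghz * transfers_per_clock)  # 409.6 GB/s
```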

The PowerColor RX Vega 56 Nano Edition will reportedly be available shortly with a $449 MSRP. The new small form factor Nano Edition offers an interesting proposition for gamers wanting to build Mini ITX systems. So long as PowerColor can get the card out at close to MSRP and performance holds up without too many thermal limitations, I think there is a definite niche market for it. (Note that the R9 Nano debuted at a $650 MSRP!)

Source: PowerColor

Computex 2018: AMD Shows off 7nm Vega Graphics

Subject: Graphics Cards | June 5, 2018 - 11:58 PM |
Tagged: Vega, machine learning, instinct, HBM2, gpu, computex 2018, computex, amd, 7nm

AMD showed off its first 7nm GPU in the form of the expected Radeon Instinct Vega graphics product, featuring a 7nm Vega GPU with 32GB of HBM2 memory. The new GPU uses the Vega architecture along with the open source ecosystem AMD has built to enable both graphics and GPGPU workloads. AMD demonstrated the 7nm Vega GPU running ray tracing in a cool demo that showed realistic reflections and shadows rendered on a per-pixel basis in a model. Granted, we are still a long way from seeing that kind of detail in real-time gaming, but it is still cool to see glimpses of that ray traced future.

According to AMD, the 32GB of HBM2 memory will greatly benefit creators and enterprise clients who need to work with large datasets and quickly make changes and updates to models before doing a final render. The larger memory buffer will also help in HPC applications, where more big data can be kept close to the GPU for processing over the wide HBM2 memory bus. Further, HBM2 has physical size and energy efficiency benefits that will pique the interest of datacenters focused on maximizing TCO.

Dr. Lisa Su came on stage toward the end of the 7nm Vega demonstration to show off the GPU in person, and you can see that it is rather tiny for the compute power it provides! It is shorter than the two stacks of HBM2 dies on either side, for example.

Of course, AMD did not disclose all the nitty-gritty specifications that enthusiasts want to know about the new machine learning graphics card. We will have to wait a bit longer for that information, unfortunately!

As for other 7nm offerings? As Ryan talked about during CES in January, 2018 will primarily be the year of the machine learning-focused 7nm Radeon Instinct Vega GPU, with consumer-focused GPUs on the smaller process node likely coming in 2019. Whether those 2019 7nm GPUs will be a refreshed Vega or the new Navi is still up for debate, though AMD's graphics roadmap certainly doesn't rule out Navi as a possibility. In any case, AMD did state during the livestream that it intends to release a new GPU every year, alternating between new architectures and new process nodes.

What are your thoughts on AMD's graphics roadmap and its first 7nm Vega GPU?

Source: AMD

AMD continues to make inroads on NVIDIA's marketshare

Subject: Graphics Cards | May 29, 2018 - 03:30 PM |
Tagged: amd, nvidia, marketshare, jon peddie

Jon Peddie Research has just released its latest look at the discrete GPU market, which has been doing significantly better than the PC market overall, for reasons we are all quite familiar with. While sales of full systems declined 24.5%, GPU sales increased 6.4% over the past quarter and an impressive 66.4% compared to this time last year.

With just two suppliers in the market now, any gain by one results in a loss for the other, and it has been AMD's turn to succeed. The gain of 1.2% this quarter is not as impressive as AMD's total gains over the past 12 months, which saw 7.4% of the sales once going to NVIDIA shift to AMD. Vega may not be the most powerful architecture on the planet, but it is selling, along with previous generations of GPUs.

The next quarter may level out, not just due to decreases in the purchase of new mining equipment but also due to historical trends, as stock is accumulated to prepare for sales in the fourth quarter. There is also the fact that it has been a while since either AMD or NVIDIA released new kit, and the majority of those planning an upgrade this cycle have already done so.

Once new kit arrives and products from the previous generation receive discounts, there should be another spike in sales. The mystery is what the next generation will bring from these two competitors.

Make your GTX 1080 Ti even cooler with WATERCOOL

Subject: Graphics Cards | May 28, 2018 - 01:56 PM |
Tagged: gtx 1080 ti, watercooler, Heatkiller IV, WATERCOOL

WATERCOOL's Heatkiller IV for the GTX 1080 Ti is up for review at [H]ard|OCP. As you can see below, it certainly adds a cool look to your card, but that is only half the story. WATERCOOL added features that should improve cooling, such as a flowplate, as well as changes to the internals that should improve flow rate. The result is a noticeable improvement over a Founders Edition, with both lower temperatures and higher clocks. Check out the full review to see if it convinces you to switch your cooling methods.

"Watercool and its Heatkiller series of custom water components are well known for being some of the best in the world when it comes to performance and design. We give its Heatkiller IV water block for the NVIDIA GTX 1080 Ti a good once over, and come away very impressed. Quality and performance all in one package."

Source: [H]ard|OCP

SPECgpc Releases SPECviewperf 13 Workstation Benchmark

Subject: Graphics Cards | May 23, 2018 - 09:01 PM |
Tagged: vega frontier edition, titan xp, specviewperf 13, specgpc

SPECgpc, the graphics performance characterization group within SPEC (the organization behind industry-standard benchmarks such as SPECint), released an updated version of SPECviewperf today. The new SPECviewperf 13 is an update to the industry-staple benchmark for measuring graphics performance in workstation and professional applications.

Drawing on a wide array of applications such as SolidWorks, Maya, Creo, 3ds Max, and more, SPECviewperf provides insight into the performance of mission-critical but often difficult-to-benchmark scenarios.

Changes for this new version of SPECviewperf include:

  • Support for 4K resolution displays.
  • New reporting methods, including JSON output that enables more robust and flexible result parsing (see the parsing sketch after this list).
  • A new user interface that will be standardized across all SPEC/GWPG benchmarks.
  • New workloads and scoring that reflect the range of activities found in real-world applications.
  • Various bug fixes and performance improvements.
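
The JSON output is the change most likely to matter for anyone automating benchmark runs across machines. As a rough illustration of the kind of result parsing it enables, here is a minimal sketch; the file name and field names are hypothetical placeholders rather than SPEC's documented schema, so adjust them to match the benchmark's actual output:

```python
import json

# Hypothetical sketch of parsing a SPECviewperf 13 JSON result file.
# "resultsummary.json", "viewset", and "composite" are assumed names
# used for illustration, not SPEC's documented schema.
with open("resultsummary.json") as f:
    results = json.load(f)

# Print one composite score per viewset, highest first.
scores = {entry["viewset"]: float(entry["composite"]) for entry in results}
for viewset, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{viewset:12} {score:8.2f}")
```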

Given that the changes include new datasets for the energy, medical, Creo, and Maya viewsets, as well as tweaks to the others, we decided to grab some quick results from two high-end prosumer-level GPUs: the NVIDIA Titan Xp and the AMD Radeon Vega Frontier Edition.

The full testbed configuration is listed below:

Test System Setup

CPU: Intel Core i9-7960X
Motherboard: ASUS PRIME X299 Deluxe
Memory: 32GB Corsair Vengeance DDR4-3200 (operating at 2400 MHz)
Storage: Intel Optane SSD DC P4800X 750GB
Sound Card: On-board
Graphics Cards: NVIDIA GeForce TITAN Xp 12GB; AMD Radeon Vega Frontier Edition (Liquid) 16GB
Graphics Drivers: NVIDIA 397.64; AMD Radeon Pro 18.Q2.1
Power Supply: Corsair RM1000x
Operating System: Windows 10 Pro x64 RS4

While the Titan Xp handily wins most of the tests in SPECviewperf 13, there are some notable exceptions, including the newly updated energy workload, where the Vega Frontier Edition manages a 13% lead. Additionally, SolidWorks, a very widely used CAD application, sees a 23% performance advantage for AMD.

SPECviewperf is a benchmark we rely on to evaluate professional application performance, and we are glad to see it getting these improvements.

For anyone curious about the performance of their system, SPECviewperf 13 is free to download and use for non-profit entities that do not sell computer hardware, software, or related services.

Source: SPECgpc

NVIDIA Launches GTX 1050 3GB for Budget Gamers

Subject: Graphics Cards | May 23, 2018 - 06:21 PM |
Tagged: pascal, nvidia, GP107, GDDR5, budget

NVIDIA recently and quietly launched a new budget graphics card that neatly slots between the GTX 1050 and the GTX 1050 Ti. The new GTX 1050 3GB, as the name suggests, features 3GB of GDDR5 memory. The card is closer to the GTX 1050 Ti than the name would suggest, however, as it uses the same 768 CUDA cores rather than the 640 of the GTX 1050 2GB. The memory subsystem is where the card differs from the GTX 1050 Ti: NVIDIA has cut one memory controller along with the corresponding ROPs and cache, meaning the new GTX 1050 3GB has a narrower memory bus and less memory bandwidth than both the GTX 1050 2GB and the GTX 1050 Ti 4GB.

Specifically, the GTX 1050 3GB has a 96-bit memory bus that, when paired with 7 Gbps GDDR5, results in a maximum memory bandwidth of 84 GB/s, versus the previously released cards' 128-bit memory buses and 112 GB/s of bandwidth.

Clockspeeds on the new GTX 1050 3GB are a good bit higher than on the other cards, though, with a base clock of 1392 MHz (the boost clock of the 1050 Ti) and a boost clock of 1518 MHz. Thanks to the clockspeed bumps, the theoretical GPU performance of 2.33 TFLOPS is actually higher than that of the GTX 1050 Ti (2.14 TFLOPS) and the existing GTX 1050 2GB (1.86 TFLOPS), though the reduced memory bus (and the loss of a small amount of ROPs and cache) will hold the card back from surpassing the Ti variant in most workloads. NVIDIA needs to maintain product segmentation somehow!
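
Both of those figures are easy to reproduce from the specs if you are curious; here is a quick sketch of the standard formulas:

```python
# Rough arithmetic behind the GTX 1050 3GB's bandwidth and TFLOPS figures.

# GDDR5 bandwidth: the quoted 7 Gbps is the effective per-pin data rate,
# so bandwidth is simply the bus width in bytes times that rate.
bus_width_bits = 96
data_rate_gbps = 7
print((bus_width_bits / 8) * data_rate_gbps)   # 84.0 GB/s (a 128-bit bus gives 112 GB/s)

# Peak FP32 throughput: each CUDA core can retire one fused multiply-add
# (2 FLOPs) per clock, conventionally evaluated at the boost clock.
cuda_cores = 768
boost_clock_ghz = 1.518
print(round(2 * cuda_cores * boost_clock_ghz / 1000, 2))   # 2.33 TFLOPS
```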

                    NVIDIA GTX 1050 2GB   NVIDIA GTX 1050 3GB   NVIDIA GTX 1050 Ti 4GB   AMD RX 560 4GB
GPU                 GP107                 GP107                 GP107                    Polaris 11
GPU Cores           640                   768                   768                      896 or 1024
Texture Units       40                    48                    48                       64
ROPs                32                    ?                     32                       16
Base Clock (MHz)    1354                  1392                  1290                     1175
Boost Clock (MHz)   1455                  1518                  1392                     1275
TFLOPS              1.86                  2.33                  2.14                     up to 2.6
Memory              2GB GDDR5             3GB GDDR5             4GB GDDR5                2GB or 4GB GDDR5
Memory Data Rate    7 Gbps                7 Gbps                7 Gbps                   7 Gbps
Memory Bus          128-bit               96-bit                128-bit                  128-bit
Memory Bandwidth    112 GB/s              84 GB/s               112 GB/s                 112 GB/s
TDP                 75W                   75W                   75W                      60W to 80W
Pricing             ~$150                 ~$160 (estimated)     ~$200                    ~$160

The chart above compares the specifications of the GTX 1050 3GB with the GTX 1050 and GTX 1050 Ti on the NVIDIA side, and with the AMD RX 560, which appears to be its direct competitor based on pricing. The new 3GB GTX 1050 should compete well with AMD's Polaris 11-based GPU as well as NVIDIA's own cards in the budget gaming space. With luck, the downside of a reduced memory bus will at least dissuade cryptocurrency miners from adopting this card as an entry-level miner for Ethereum and other alt coins, giving gamers a chance to buy something a bit better than the GTX 1050 and RX 550 at close to MSRP while the miners fight over the Ti and higher variants with more memory and compute units.

NVIDIA did not release formal pricing or release date information, but the cards are expected to launch in June with prices around $160 to $180, depending on retailer and extras like fancier coolers and factory overclocks.

What are your thoughts on the GTX 1050 3GB? Is it the bastion of hope budget gamers have been waiting for? Looking around online, it seems pricing for these budget cards has somewhat returned to sane levels, and hopefully alternatives like this one, aimed at gamers, will help further stabilize the market for those of us DIYers who want to game more than mine. I do wish NVIDIA had changed the name a bit to better differentiate the card (the GTX 1050G, perhaps), but so long as the 640 CUDA core GTX 1050 never gets 3GB of GDDR5, at least gamers will be able to tell the cards apart by the amount of memory listed on the box or website.

Source: NVIDIA

AMD Releases Updated Raven Ridge Desktop APU Graphics Drivers

Subject: Graphics Cards, Processors | May 18, 2018 - 04:33 PM |
Tagged: Vega, ryzen, raven ridge, Radeon Software Adrenalin Edition, r5 2400g, r3 2200g, amd

Today, AMD released the first driver update for the desktop Raven Ridge APUs since their launch in February of this year.

The new Q2 2018 drivers are based on AMD's current Radeon Software Adrenalin Edition release and bring features such as ReLive and the Radeon overlay to the Vega-powered desktop platform.

We haven't yet had much time to look for potential performance enhancements from this driver, but we did do a quick 3DMark run on our Ryzen 5 2400G with memory running at DDR4-3200.

Here, we see healthy gains of around 5% in 3DMark Fire Strike for the new driver. While I wouldn't expect big gains in older titles, newer titles that have come out since the initial Raven Ridge driver release in February will see the biggest improvements.

We are still eager to see the mobile iterations of AMD's Raven Ridge processors get updated drivers, as notebooks such as the HP Envy X360 have not been updated since they launched in November of last year.

It's good to see progress from AMD on this front, but it must work harder to unify the graphics drivers of its APU products with the mainstream graphics driver releases if it wants those products to be taken seriously as gaming options.

Source: AMD

NVIDIA GeForce GTX Cards Finally Return to Stock at MSRP

Subject: Graphics Cards | May 9, 2018 - 12:23 PM |
Tagged: video card, pricing, msrp, mining, GTX 1080, gtx 1070, gtx 1060, gtx, graphics, gpu, gaming, crypto

The wait for in-stock NVIDIA graphics cards without inflated price tags seems to be over. Yes, in the wake of months of crypto-fueled disappointment for gamers, the much anticipated, long-awaited return of graphics cards at (gasp) MSRP is at hand. NVIDIA has now listed most of its GTX lineup as in stock (with a limit of two per customer) at normal MSRPs, with the only exception being the GTX 1080 Ti (still out of stock). The lead time from NVIDIA is one week, but that may be worth it for those interested in the lower prices and 'Founders Edition' coolers.

Many other GTX 10 Series options can be found online at near-MSRP pricing, though as before, many of the aftermarket designs command a premium, with factory overclocks and proprietary cooler designs to help justify the added cost. Even Amazon, previously home to some of the most outrageous price gouging from third-party sellers, now has cards at list pricing, which seems to solidify a return to GPU normalcy.

The GTX 1080 inches closer to standard pricing once again on Amazon

Some of the current offers include:

MSI Gaming GeForce GTX 1080 ARMOR 8G - $549.99 @ Amazon.com

EVGA GeForce GTX 1070 SC GAMING ACX 3.0 - $469.99 @ Amazon.com

EVGA GeForce GTX 1060 SC GAMING 6GB - $299.99 @ Newegg.com

GTX 1070 cards continue to carry the highest premium outside of NVIDIA's store, with the lowest current pricing on Newegg or Amazon at $469.99. Still, the overall return to near-MSRP pricing around the web is good news for gamers who have been forced to play second (or third) fiddle to cryptomining "entrepreneurs" for several months now, a disturbing era in which pre-built gaming systems from Alienware and others actually presented a better value than DIY builds.

Source: NVIDIA

Who's a pretty cult member?

Subject: Graphics Cards | May 7, 2018 - 02:48 PM |
Tagged: Dunia 2, far cry 5, 4k

Armed with a GTX 1080 Ti Founders Edition and a 4K display, you venture forth into the advanced graphics settings of Ubisoft's latest Far Cry game. Inside you face a multitude of challenges, from volumetric fog through various species of anti-aliasing, until finally facing the beast known as overall quality. [H]ard|OCP completed this quest and you can benefit from their experience, although no matter how long they searched, they could not locate any sign of NVIDIA's GameWorks, which appeared in the previous Far Cry. There were signs of rapid packed math enhancements, much to the rejoicing of those with no vested interest in GameWorks' existence.

"We will be comparing Far Cry 5's Overall Quality, Shadows, Volumetric Fog, and Anti-Aliasing image quality. In addition, we will find out if we can improve IQ in the game by adding Anisotropic Filtering and forcing AA from the control panel. We’ve have a video showing you many IQ issues we found in Far Cry 5, and those are plentiful."

Source: [H]ard|OCP