NVIDIA Acquires Mellanox: Beyond the Numbers

Subject: Editorial | March 12, 2019 - 10:14 PM |
Tagged: nvswitch, nvlink, nvidia, Mellanox, Intel, Infiniband, Ethernet, communications, chiplets, amd

In a bit of a surprise this past weekend, NVIDIA announced that it is purchasing the networking company Mellanox for approximately $6.9 billion US. NVIDIA and Intel were engaged in a bidding war for the Israel-based company. At first glance we do not see the synergies that could come from such an acquisition, but digging deeper it makes much more sense. This is still a risky move for NVIDIA, as its previous acquisitions (Ageia, Icera, etc.) have not been very favorable for the company.

633889_NVLogo_3D_H_DarkType.jpg

Mellanox’s portfolio centers around datacenter connectivity solutions such as high-speed Ethernet and InfiniBand products. They are already a successful company with products shipping out the door. If there is a supercomputer somewhere, chances are it is running Mellanox technology for its high-speed interconnects. This is where things get interesting for NVIDIA.

While NVIDIA focuses on GPUs, they are spreading into the datacenter at a tremendous rate. Their NVLink implementation allows high-speed connectivity between GPUs, and recently they showed off their NVSwitch, which features 18 ports. We do not know how long it took to design the NVSwitch and get it running at a high level, but NVIDIA is aiming for implementations that will exceed that technology. NVIDIA had the choice to continue with in-house designs or to purchase a company already well versed in such work, with access to advanced networking technology.

Intel was also in the running for Mellanox, but that particular transaction might not have been approved by anti-trust authorities around the world. If Intel had made a winning bid for Mellanox, it would have essentially consolidated the market for these high-end networking products. In the end NVIDIA offered $6.9B US for the company and the bid was accepted. Because NVIDIA has no real networking solutions on the market, the deal will likely be approved without issue. Unlike previous purchases such as Icera, Mellanox is actively shipping product and will add to NVIDIA's bottom line.

mellanox-logo-square-blue.jpg

NVIDIA was able to purchase Mellanox in an all-cash transaction, dipping into its cash reserves instead of offering Mellanox shareholders equity in NVIDIA. This $6.9B is above what AMD paid for ATI back in 2006 ($5.4B). There may be some similarities here: Mellanox could be overvalued at this price compared to what it actually brings to the table, and we may see write-downs over the next several years, much as AMD recorded for the ATI purchase.

The purchase will bring NVIDIA instant expertise with high-performance standards like InfiniBand. It will also help to have design teams versed in high-speed, large-node networking apply their knowledge to the GPU field and create solutions better suited to the technology. NVIDIA will also continue to sell current Mellanox products.

Another purchase in the past that looks somewhat similar to this is AMD’s acquisition of SeaMicro. That company was selling products based on their Freedom Fabric technology to create ultra-dense servers utilizing dozens of CPUs. This line of products was discontinued by AMD after poor sales, but they expanded upon Freedom Fabric and created the Infinity Fabric that powers their latest Zen CPUs.

I can see a very similar situation occurring at NVIDIA. AMD is using their Infinity Fabric to connect multiple chiplets on a substrate, as well as utilizing that fabric off of the substrate. AMD has also integrated that fabric into their latest Vega GPUs. This philosophy looks to pay significant dividends for AMD once they introduce their 7nm CPUs in the form of Zen 2 and EPYC 2. AMD is not relying on large, monolithic dies for either their consumer or enterprise parts, thereby improving yields and binning on these parts compared to what Intel does with current Xeon parts.

mellanox-quantum-connectx-6-chips-652x381.jpg

When looking at the Mellanox purchase from this view, it makes a lot of sense for NVIDIA. With process node advances moving at a much slower pace, the demand for higher performance solutions is only increasing. To meet this demand NVIDIA will be required to make efficient, multi-chip solutions that may require more performance and features than NVLink can cover. Mellanox could potentially provide the expertise and experience to help NVIDIA achieve such scale.

Source: NVIDIA

NVIDIA Explains How Higher Frame Rates Can Give You an Edge in Battle Royale Games

Subject: Graphics Cards | March 7, 2019 - 11:07 AM |
Tagged: nvidia, gaming, fps, battle royale

NVIDIA has published an article about GPU performance and its impact on gaming, specifically the ultra-popular battle royale variety. The emphasis is on latency, and reducing this when gaming with a combination of high FPS numbers and a 144 Hz (and higher) refresh display. Many of these concepts may seem obvious (competitive gaming on CRTs and/or lower resolutions for max performance comes to mind), but there are plenty of slides to look over - with many more over at NVIDIA's post.

"For many years, esports pros have tuned their hardware for ultra-high frame rates -- 144 or even 240 FPS -- and they pair their hardware with high refresh rate monitors. In fact, ProSettings.net and ProSettings.com report that 99% of Battle Royale Pros (Fortnite, PUBG and Apex Legends) are using 144 Hz monitors or above, and 30% are using 240 Hz monitors. This is because when you run a game, an intricate process occurs in your PC from the time you press the keyboard or move the mouse, to the time you see an updated frame on the screen. They refer to this time period as ‘latency’, and the lower your latency the better your response times will be."

battle-royale-latency-comparison.png

While a GTX 750 Ti to RTX 2080 comparison defies explanation, latency obviously drops with performance in this example

"Working with pros through NVIDIA’s research team and Esports Studio, we have seen the benefits of high FPS and refresh rates play out in directed aiming and perception tests. In blind A/B tests, pros in our labs have been able to consistently discern and see benefits from even a handful of milliseconds of reduced latency.

But what do higher frame rates and lower latency mean for your competitiveness in Battle Royale? A few things:

  • Higher FPS means that you see the next frame more quickly and can respond to it
  • Higher FPS on a high Hz monitor makes the image appear smoother, and moving targets easier to aim at. You are also less likely to see microstutters or “skip pixels” from one frame to the next as you pan your crosshair across the screen
  • Higher FPS combined with G-SYNC technologies like Ultra Low Motion Blur (ULMB) makes objects and text sharper and easier to comprehend in fast moving scenes

This is why for Battle Royale games, which rely heavily on reaction times and your ability to spot an enemy quickly, you want to play at 144 FPS or more."
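The frame-rate side of the latency argument is simple arithmetic: frame time is the inverse of FPS. A quick back-of-the-envelope sketch (the function name is ours, and frame time is only one component of the total input-to-display latency NVIDIA describes, alongside game logic and display refresh):

```python
# Frame time is the interval between successive frames: at higher FPS each
# new frame arrives sooner, so your input spends less time waiting to appear.
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 144, 240):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.2f} ms per frame")
# 60 FPS -> 16.67 ms, 144 FPS -> 6.94 ms, 240 FPS -> 4.17 ms
```

Going from 60 to 240 FPS trims roughly 12.5 ms from the rendering portion of the pipeline alone, which is in the range the blind A/B tests above suggest pros can perceive.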

One of the more interesting aspects of this article relates to K/D ratios, with NVIDIA claiming an edge in this area based on GPU performance and monitor refresh rate:

battle-royale-fortnite-pubg-increase-in-kd-gpu.png

"We were curious to understand how hardware and frame rates affect overall competitiveness in Battle Royale games for everyday gamers - while better hardware can’t replace practice and training, it should assist gamers in getting closer to their maximum potential."

battle-royale-fortnite-pubg-increase-in-kd-monitor.png

"One of the common metrics of player performance in Battle Royales is kill-to-death (K/D) ratio -- how many times you killed another player divided by how many times another player killed you. Using anonymized GeForce Experience Highlights data on K/D events for PUBG and Fortnite, we found some interesting insights on player performance and wanted to share this information with the community."

battle-royale-fortnite-pubg-increase-in-kd-hours.png

For more on this topic, and many more charts, check out the article over at NVIDIA.com.

Source: NVIDIA
Manufacturer: EVGA

The EVGA RTX 2060 XC Ultra

While NVIDIA’s new GTX 1660 Ti has stolen much of the spotlight from the RTX 2060 launched at CES, this more powerful Turing card is still an important part of the current video card landscape, though with its $349 starting price it does not fit into the “midrange” designation we have been used to.

EVGA_RTX2060_XC_Ultra_7.jpg

Beyond the price argument, as we saw with our initial review of the RTX 2060 Founders Edition and subsequent look at 1440p gaming and overclocking results, the RTX 2060 far exceeds midrange performance, which made sense given the price tag but created some confusion based on the "2060" naming as this suggested a 20-series replacement to the GTX 1060.

The subsequent GTX 1660 Ti launch gave those outspoken about the RTX 2060's price and performance relative to the venerable GTX 1060 a more suitable replacement. That leaves the RTX 2060 as an interesting mid-premium option that can match late-2017’s GTX 1070 Ti for $100 less, but it still isn't a serious option for RTX features without DLSS to boost performance - image quality concerns in the early days of this tech notwithstanding.

EVGA_RTX2060_XC_Ultra_1.jpg

One area certainly worth exploring further with the RTX 2060 is overclocking, as it seemed possible that a healthy OC had the potential to meet RTX 2070 performance, though our early efforts were conducted using NVIDIA’s Founders Edition version, which just one month in now seems about as common as a pre-cyclone cover version of the original Sim City for IBM compatibles (you know, the pre-Godzilla litigation original?). LGR-inspired references aside, let's look at the card EVGA sent us for review.

Continue reading our review of the EVGA GeForce RTX 2060 XC Ultra graphics card

How about some good news for a change, like a GPU price correction?

Subject: General Tech | March 6, 2019 - 12:34 PM |
Tagged: amd, nvidia, cryptocurrency

The cryptocurrency fad has been driving us insane for the past few years as we saw unprecedented demand for GPUs cause prices to jump far above MSRP and possibly contribute to the launch prices of the current generation of GPUs.  Now that the miners have moved on to other things, or to ASICs designed specifically for mining, NVIDIA and AMD saw a large drop in sales volume.

DigiTimes have heard from card vendors that an immense amount of inventory is stuck in warehouses now that demand has dried up.  According to their sources, the price cuts we've seen on the GTX 1060 and 1070 as well as the RX 580 may start to spread to other cards in an attempt to clear space for new inventory.  You shouldn't expect huge drops over a short time, but you should definitely keep an eye out for bargains over the coming months.

pmEa6Rb.jpg

"With the dissipation of the cryptocurrency mining fad, graphics card players have begun cutting product prices in a bid to clear out excess inventory at the expense of profitability, according to industry sources."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes

NVIDIA Announces GeForce 419.35 WHQL Driver, RTX Triple Threat Bundle

Subject: Graphics Cards | March 5, 2019 - 09:48 AM |
Tagged: Tom Clancy’s The Division II, RTX Triple Threat Bundle, rtx, ray tracing, nvidia, geforce, gaming, devil may cry 5, bundle, Apex Legends, 419.35 WHQL

GeForce Game Ready Driver 419.35 WHQL

NVIDIA has released their latest Game Ready driver today, 419.35 WHQL, for "the optimal gaming experience for Apex Legends, Devil May Cry 5, and Tom Clancy’s The Division II". The update also adds three monitors to the G-SYNC compatible list, with the BenQ XL2540-B/ZOWIE XL LCD, Acer XF250Q, and Acer ED273 A joining the ranks.

apex-screenshot-world-overview.jpg

Apex Legends World Overview (image credit: EA)

"Our newest Game Ready Driver introduces optimizations and updates for Apex Legends, Devil May Cry 5, and Tom Clancy’s The Division II, giving you the best possible experience from the second you start playing.

In addition, we continue to optimize and improve already-released games, such as Metro Exodus, Anthem, and Battlefield V, which are included in our new GeForce RTX Triple Threat Bundle."

RTX Triple Threat Bundle

NVIDIA's latest game bundle offers desktop and laptop RTX 2060 and 2070 buyers a choice of Anthem, Battlefield V, or Metro Exodus. Buyers of the high-end RTX 2080 and 2080 Ti graphics cards (including laptops) get all three of these games.

geforce-rtx-triple-bundle.jpg

"For a limited time, purchase a qualifying GeForce RTX 2080 Ti or 2080 graphics card, gaming desktop, or gaming laptop and get Battlefield V, Anthem, and Metro Exodus (an incredible $180 value!). Pick up a qualifying GeForce RTX 2070 or 2060 graphics card, gaming desktop, or gaming laptop and get your choice of these incredible titles."

The free games offer begins today, with codes redeemable "beginning March 5, 2019 until May 2, 2019 or while supplies last." You can download the latest NVIDIA driver here.

Source: NVIDIA

GeForce Driver Updates Contain Security Fixes

Subject: Graphics Cards | February 28, 2019 - 11:25 PM |
Tagged: nvidia, graphics drivers, security

Normally, when we discuss graphics drivers, there are a subset of users that like to stay on old versions. Some have older hardware and they believe that they will get limited benefits going forward. Others encounter a bug with a certain version and will refuse to update until it is patched.

In this case – you probably want to update regardless.

nvidia-2015-bandaid.png

NVIDIA has found eight security vulnerabilities in their drivers, which have been corrected in their latest versions. One of them also affects Linux... more on that later.

On Windows, there are five supported branches:

  • Users of R418 for GeForce, Quadro, and NVS should install 419.17.
  • Users of R418 for Tesla should install 418.96.
  • Users of R400 for Quadro and NVS should install 412.29.
  • Users of R400 for Tesla should install 412.29.
  • Users of R390 for Quadro and NVS should install 392.37.

Basically, you should install 419.17 unless you are using professional hardware.

One issue is being likened to Meltdown and Spectre, although it is not quite the same. In those cases, the exploit took advantage of hardware optimizations to leak system memory. In the case of CVE-2018-6260, however, the attack uses NVIDIA’s performance counters to potentially leak graphics memory. The difference is that GPU performance counters are a developer tool, used by applications like NVIDIA Nsight, to provide diagnostics. Further, beyond targeting a developer tool that can be disabled, this attack also requires local access to the device.

Linux users are also vulnerable to this attack (but not the other seven):

  • Users of R418 for GeForce, Quadro, and NVS should install 418.43.
  • Users of R418 for Tesla should install 418.39.
  • Users of R400 for GeForce, Quadro, NVS, and Tesla should install 410.104.
  • Users of R390 for GeForce, Quadro, NVS, and Tesla should install 390.116.
  • Users of R384 for Tesla should install 384.183.

Whether on Windows or Linux, after installing the update, a hidden option will allow you to restrict GPU performance counters to users with admin credentials. I don’t know why it’s set to the insecure variant by default… but the setting can be toggled in the NVIDIA Control Panel. On Windows: under Desktop, check Enable Developer Settings, then go to Developer > Manage GPU Performance Counters and select "Restrict access to the GPU performance counters to admin users only". See the driver release notes (especially the "Driver Security" section) for more info.

nvidia-2019-disableperfcounters.png

The main thing to fix is the other seven, however. That just requires the driver update. You should have received a notification from GeForce Experience if you use it; otherwise, check out NVIDIA’s website.

Source: NVIDIA

GPU prices just too damn high? NVIDIA might have something for you

Subject: General Tech | February 26, 2019 - 02:04 PM |
Tagged: rumour, nvidia, gtx 1660, gtx 1650

You could successfully argue that neither AMD nor NVIDIA has offered a lower-end GPU for casual gaming and content consumption in the last generation.  Rumours abound that NVIDIA will offer not one but two cards priced around the $200 mark to fill that niche: the GTX 1660 and GTX 1650.  We have little information about them, though you can safely assume they will perform at a lower level than the GTX 1660 Ti.

The launch dates for these cards, assuming they exist, are pegged for March 15th and April 30th, according to DigiTimes.  According to one of our favourite leakers, TUM_APISAK, the GTX 1650 will sport 4GB of RAM and have a core clock of 1,485MHz.  The GTX 1660 remains a mystery.

tdh.PNG

"The sources said that Nvidia is slated to launch GTX 1660 on March 15 and GTX1650 on April 30, which will bear minimum price tags of US$229 and US$179, respectively."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes

Gigabyte's Aero 15, more thoughts on tracing rays while on the go

Subject: Mobile | February 25, 2019 - 03:45 PM |
Tagged: gigabyte, Aero 15 X9, RTX 2070 Max-Q, Core i7-8750H, Intel, nvidia, gaming laptop, 144hz

Gigabyte's Aero 15 X9 has been upgraded to an RTX 2070 Max-Q to power the 144Hz 1080p screen, though if you have the money there is a model with an RTX 2080 Max-Q and a 4K display.  Techspot reviewed the former, tested its performance against previous models, and determined whether the laptop has enough power to provide a decent experience with DLSS or DXR enabled.  They also discovered the laptop gets rather warm under load, and that thanks to the thin bezel the camera has been moved to the bottom, providing a handy way to trim your nose hairs.

gaming.PNG

"Today we are reviewing the Gigabyte Aero 15 X9, the first Nvidia RTX laptop we tested and used for our RTX 2070 Max-Q feature earlier this month. It's a cool gaming laptop, pretty similar to the Aero 15X v8 we looked at last year, but with a few upgrades that we'll walk you through here."

Here are some more Mobile articles from around the web:

More Mobile Articles
Source: TechSpot

Every NVIDIA GeForce GTX 1660 Ti on Amazon (So Far)

Subject: Graphics Cards | February 23, 2019 - 03:58 PM |
Tagged: zotac, video card, turing, nvidia, msi, gtx 1660 ti, graphics, gpu, gigabyte, geforce, gaming, evga, asus, amazon

NVIDIA partners launched their new GeForce GTX 1660 Ti graphics cards yesterday, and we checked out a pair of them in our review, finding these new TU116-based cards offer excellent performance (and overclocking headroom) for the price. Looking over Amazon listings today, here is everything available so far, separated by board partner. We've added the Boost Clock speeds for your reference to show how these cards are clocked compared to the reference design (1770 MHz), and purchases made through any of these Amazon affiliate links help us out with a small commission.

1660_Ti_Cards_Stack.jpg

In any case, this list at least demonstrates the current retail picture of NVIDIA's new mainstream Turing GPU on Amazon, so without further preamble here are all currently available cards in alphabetical order by brand:

ASUS

ASUS Phoenix GeForce GTX 1660 Ti OC

ASUS Dual GeForce GTX 1660 Ti OC

ASUS Strix Gaming GTX 1660 Ti OC

EVGA

EVGA GeForce GTX 1660 Ti XC Black Gaming

EVGA GeForce GTX 1660 Ti XC Gaming

GIGABYTE

GIGABYTE GeForce GTX 1660 Ti OC 6G

GIGABYTE GeForce GTX 1660 Ti Windforce OC 6G

MSI

MSI GTX 1660 Ti VENTUS XS 6G OC

MSI GTX 1660 Ti ARMOR 6G OC

MSI GTX 1660 Ti GAMING X 6G

ZOTAC

ZOTAC Gaming GeForce GTX 1660 Ti

Already we are seeing many cards offering factory overclocks, ranging from a small 30 MHz bump at $279.99 from GIGABYTE (GTX 1660 Ti OC 6G, 1800 MHz Boost Clock) to 100 MHz+ from the MSI GTX 1660 Ti GAMING X 6G (1875 MHz Boost Clock) we reviewed at $309.99.
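Those factory overclocks work out to fairly modest percentage gains over the 1770 MHz reference Boost Clock; a quick calculation (using the two cards and clocks named above):

```python
REFERENCE_BOOST_MHZ = 1770  # NVIDIA reference Boost Clock for the GTX 1660 Ti

cards = {
    "GIGABYTE GTX 1660 Ti OC 6G": 1800,
    "MSI GTX 1660 Ti GAMING X 6G": 1875,
}

for name, boost in cards.items():
    delta = boost - REFERENCE_BOOST_MHZ
    pct = 100.0 * delta / REFERENCE_BOOST_MHZ
    print(f"{name}: +{delta} MHz ({pct:.1f}% over reference)")
# GIGABYTE GTX 1660 Ti OC 6G: +30 MHz (1.7% over reference)
# MSI GTX 1660 Ti GAMING X 6G: +105 MHz (5.9% over reference)
```

Even the largest factory bump here is under 6%, which is why out-of-the-box performance differences between partner cards tend to be small relative to their price gaps.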

We will update the list as additional cards become available on Amazon.

Source: Amazon.com

Forget the GTX 1660 Ti, let's start speculating about the other GTX 1660

Subject: Graphics Cards | February 22, 2019 - 01:54 PM |
Tagged: video card, turing, tu116, rtx, ray tracing, nvidia, msi, gtx 1660 ti, gtx, graphics, gpu, geforce, gaming, asus, DLSS, palit

Today is the day that the GTX 1660 Ti moves from rumour to fact as the NDA is finally over and we can share our results! Sebastian's testing compared the overclocked and slightly above base price MSI GTX 1660 Ti GAMING X against the interestingly shaped EVGA GTX 1660 Ti XC Black.  Performance-wise, the rumours were fairly accurate: the card offers comparable performance to the GTX 1070 Ti, and at a ~$280 price point it is certainly less expensive but still shows evidence of the upward trend in GPU prices.

If you are interested in other models, take a peek at The Guru of 3D, who reviewed not one or two, but four different 1660 Tis.  From the tiny Palit StormX model pictured below through MSI's dual-fan VENTUS XS and Gaming X to the full-sized ASUS ROG STRIX with three fans, you have a fair number of charts to go through!

img_7747.jpg

"We have four new reviews to present today. NVIDIA is launching the 279 USD GeForce GTX 1660 Ti. We've talked about it a lot, it is the more affordable offering, Turing GPU based, yet stripped from RT and tensor functionality."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: Guru of 3D