Ripping threads with Coreprio and DLM

Subject: Processors | March 20, 2019 - 04:34 PM |
Tagged: amd, coreprio, threadripper 2, 2990wx, dynamic local mode

Owning a Threadripper is not boring; the new architecture offers a variety of interesting challenges to keep your attention.  One of these quirks is the lack of direct memory access for two of the dies, which can cause performance issues and was at least partially addressed by the introduction of Dynamic Local Mode in Ryzen Master.  On Windows boxes, enabling that feature ensures your hardest-working threads run on cores with direct memory access; on Linux systems the problem simply doesn't exist.  Another choice is Coreprio, developed by Bitsum, which accomplishes the same task without the extras included in Ryzen Master. 
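For a rough sense of what that kind of prioritization means in practice, here is a minimal, hypothetical sketch using Python and psutil: it samples CPU usage and pins the busiest processes to the cores on the memory-attached dies. The CPU numbering is an assumption made for illustration, and the real tools (DLM and Coreprio) migrate individual threads through Windows scheduling APIs rather than pinning whole processes as shown here.

```python
# Hypothetical, simplified sketch of DLM-style prioritization (not Coreprio's
# actual implementation): sample CPU usage, then pin the busiest processes to
# the cores on the dies that have direct memory access. The CPU ranges below
# are an assumed 2990WX layout; real tools work per-thread, not per-process.
import time
import psutil

MEMORY_ATTACHED_CPUS = list(range(0, 16)) + list(range(32, 48))  # assumed layout

def prioritize_busiest(top_n=4, sample_s=1.0):
    procs = list(psutil.process_iter(attrs=["name"]))
    for p in procs:
        try:
            p.cpu_percent(interval=None)      # prime the per-process counter
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    time.sleep(sample_s)                      # measure usage over the interval
    usage = []
    for p in procs:
        try:
            usage.append((p.cpu_percent(interval=None), p))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    usage.sort(key=lambda item: item[0], reverse=True)
    for pct, p in usage[:top_n]:
        try:
            p.cpu_affinity(MEMORY_ATTACHED_CPUS)   # keep it on memory-attached dies
            print(f"pinned {p.info['name']} (pid {p.pid}, {pct:.0f}% CPU)")
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue

if __name__ == "__main__":
    prioritize_busiest()
```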

Techgage ran a series of benchmarks comparing performance between the default configuration, DLM, and Coreprio.

AMD-Ryzen-Threadripper-2990WX-CCX-Diagram.png

"Performance regression issues in Windows on AMD’s top-end Ryzen Threadripper CPUs haven’t gone unnoticed by those who own them, and six months after launch, the issues remain. Fortunately, there’s a new tool making the rounds that can help smooth out those regressions. We’re taking an initial look."

Source: Techgage

The Atari VCS is delayed again, but you might not be as mad about it as you think

Subject: General Tech | March 19, 2019 - 01:19 PM |
Tagged: atari, delay, amd, Vega, ryzen

You may remember the announcement of the re-launch of the Atari Video Computer System back in the summer of 2017, though by now you may have decided that it is going the way of the ZX Spectrum Vega+.  If you do still hold out hope, Atari is once again testing your patience by announcing another delay, this time to the end of 2019.  There is a reason, however, which you may or may not find acceptable: they will be upgrading the AMD Ryzen chip at the heart of the system, with a newer generation of Vega graphics offering more modern performance.  Atari also suggests, in a quote over at The Inquirer, that this will make for a much quieter and cooler system.

atari-vcs-ataribox-linux-pc-video-game-console-gaming.jpg

"The Atari VCS launched on Indiegogo and was originally set to arrive in spring 2018, but the company has announced that it will now arrive at the butt-end of 2019 (and that projection is just for the US and Canada)."

Source: The Inquirer
Manufacturer: PC Perspective

AMD and NVIDIA GPUs Tested

Tom Clancy’s The Division 2 launched over the weekend, and we've been testing it out over the past couple of days with a collection of currently available graphics cards. Of interest to AMD fans, this game joins the ranks of those well optimized for Radeon graphics, and with a new driver (Radeon Software Adrenalin 2019 Edition 19.3.2) released over the weekend, it was a good time to run some benchmarks and see how AMD and NVIDIA hardware stacks up.

d2-key-art-1920x600.jpg

The Division 2 offers DirectX 11 and 12 support, and uses Ubisoft's Snowdrop engine to provide some impressive visuals, particularly at the highest detail settings. We found the "ultra" preset to be quite attainable with very playable frame rates from most midrange-and-above hardware even at 2560x1440, though bear in mind that this game uses quite a bit of video memory. We hit a performance ceiling at 4GB with the "ultra" preset even at 1080p, so we opted for 6GB+ graphics cards for our final testing. And while most of our testing was done at 1440p we did test a selection of cards at 1080p and 4K, just to provide a look at how the GPUs on test scaled when facing different workloads.
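The exact capture and analysis pipeline is not described here, but as a generic illustration of how a frame-time log turns into the numbers typically quoted in GPU reviews, here is a small Python sketch computing average FPS and a 99th-percentile frame time from made-up sample data.

```python
# Generic illustration only (not the capture pipeline used for this preview):
# turn a list of per-frame render times in milliseconds into average FPS and
# a 99th-percentile frame time.
def summarize(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)        # frames per second
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(0.99 * n))]          # 99th-percentile frame time
    return avg_fps, p99

sample = [16.7, 16.9, 17.1, 16.5, 33.4, 16.8]         # made-up capture data
fps, p99 = summarize(sample)
print(f"average: {fps:.1f} FPS, 99th percentile frame time: {p99:.1f} ms")
```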

Tom Clancy's The Division 2

d2-screen1-1260x709.jpg

Washington D.C. is on the brink of collapse. Lawlessness and instability threaten our society, and rumors of a coup in the capitol are only amplifying the chaos. All active Division agents are desperately needed to save the city before it's too late.

d2-screen4-1260x709.jpg

Developed by Ubisoft Massive and the same teams that brought you Tom Clancy’s The Division, Tom Clancy’s The Division 2 is an online open world, action shooter RPG experience set in a collapsing and fractured Washington, D.C. This rich new setting combines a wide variety of beautiful, iconic, and realistic environments where the player will experience the series’ trademark for authenticity in world building, rich RPG systems, and fast-paced action like never before.

d2-screen3-1260x709.jpg

Play solo or co-op with a team of up to four players to complete a wide range of activities, from the main campaign and adversarial PvP matches to the Dark Zone – where anything can happen.

Continue reading our preview of GPU performance with The Division 2

Remember Conservative Morphological Anti-Aliasing?

Subject: Graphics Cards | March 18, 2019 - 03:13 PM |
Tagged: fxaa, SMAA, Anti-aliasing, MLAA, taa, amd, nvidia

Apart from the new DLSS available on NVIDIA's RTX cards, it has been a very long time since we looked at anti-aliasing implementations and the effect your choice has on performance and visual quality.  You are likely familiar with the four most common implementations, from AMD's MLAA and NVIDIA's FXAA, which are not used much in current-generation games, through to TAA/TXAA and SMAA, but when was the last time you refreshed your memory on what they actually do and how they compare?

Not only did Overclockers Club look into those, they also discuss some of the other attempted implementations as well as the sampling techniques that lie behind these technologies.  Check out their deep dive here.

anti.PNG

"One setting present in many if not all modern PC games that can dramatically impact performance and quality is anti-aliasing and, to be honest, I never really understood how it works. Sure we have the general idea that super-sampling is in effect running at a higher resolution and then downscaling, but then what is multi-sampling? How do post-processing methods work, like the very common FXAA and often favored SMAA?"

Crytek's Neon Noir is a Platform Agnostic Real-Time Ray Tracing Demo

Subject: General Tech | March 18, 2019 - 09:03 AM |
Tagged: vulkan, RX Vega 56, rtx, ray tracing, radeon, nvidia, Neon Noir, dx12, demo, crytek, CRYENGINE, amd

Crytek has released video of a new demo called Neon Noir, showcasing real-time ray tracing with a new version of CRYENGINE Total Illumination, slated for release in 2019. The big story here is that this is platform agnostic, meaning both AMD and NVIDIA (including non-RTX) graphics cards can produce the real-time lighting effects. The video was rendered in real time using an AMD Radeon RX Vega 56 (!) at 4K30, with Crytek's choice in GPU seeming to assuage fears of any meaningful performance penalty with this feature enabled (video embedded below):

“Neon Noir follows the journey of a police drone investigating a crime scene. As the drone descends into the streets of a futuristic city, illuminated by neon lights, we see its reflection accurately displayed in the windows it passes by, or scattered across the shards of a broken mirror while it emits a red and blue lighting routine that will bounce off the different surfaces utilizing CRYENGINE's advanced Total Illumination feature. Demonstrating further how ray tracing can deliver a lifelike environment, neon lights are reflected in the puddles below them, street lights flicker on wet surfaces, and windows reflect the scene opposite them accurately.”

Crytek is calling the new ray tracing features “experimental” at this time, but the implications of ray tracing tech beyond proprietary hardware and even graphics APIs (it works with both DirectX 12 and Vulkan) are obviously a very big deal.
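As a reminder that ray tracing is an algorithm rather than a hardware feature, here is a minimal CPU-side sketch in Python that intersects one ray with a mirrored sphere and computes the reflection direction a bounce ray would follow. This is purely illustrative and is not Crytek's Total Illumination technique, which uses its own approach inside CRYENGINE.

```python
# Toy illustration that ray tracing needs no dedicated hardware: intersect a
# single ray with a mirrored sphere and compute the direction a reflection
# ray would bounce along. Not related to Crytek's actual implementation.
import math

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-4 else None

def reflect(direction, normal):
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return [d - 2.0 * d_dot_n * n for d, n in zip(direction, normal)]

origin, direction = [0.0, 0.0, -3.0], [0.0, 0.0, 1.0]   # camera ray
center, radius = [0.0, 0.0, 0.0], 1.0                    # mirrored sphere
t = hit_sphere(origin, direction, center, radius)
hit = [o + t * d for o, d in zip(origin, direction)]
normal = [(h - c) / radius for h, c in zip(hit, center)]
print("hit point:", hit, "reflection direction:", reflect(direction, normal))
```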

crytek_demo.png

“Neon Noir was developed on a bespoke version of CRYENGINE 5.5., and the experimental ray tracing feature based on CRYENGINE’s Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.”

You can read the full announcement from Crytek here.

Source: Crytek

AMD States Its CPUs Are Not Susceptible to SPOILER

Subject: Processors | March 18, 2019 - 08:38 AM |
Tagged: spoiler, speculation, spectre, rowhammer, meltdown, amd

AMD has issued a support article stating that its CPUs are not susceptible to the recently disclosed SPOILER vulnerability. Support Article PA-240 confirms initial beliefs that AMD processors were immune from this specific issue due to the different ways that AMD and Intel processors store and access data:

We are aware of the report of a new security exploit called SPOILER which can gain access to partial address information during load operations. We believe that our products are not susceptible to this issue because of our unique processor architecture. The SPOILER exploit can gain access to partial address information above address bit 11 during load operations. We believe that our products are not susceptible to this issue because AMD processors do not use partial address matches above address bit 11 when resolving load conflicts.

amd-epyc.jpg

SPOILER, one of the latest in the line of speculative execution vulnerabilities that have called into question years of processor architecture design, describes a process that can expose the mappings between virtual and physical memory. That's not a critical issue in and of itself, but it allows other attacks such as Rowhammer to be executed much more quickly and easily.
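The aliasing at the heart of this can be sketched roughly as follows (the width of the partial comparison is an assumed value for illustration): virtual and physical addresses share the low 12 page-offset bits, and if the hardware's speculative conflict check compares only a narrow slice of bits above that, two unrelated buffers can collide in that check. Measuring the resulting delays is what lets an attacker infer physical-address bits.

```python
# Conceptual sketch of the partial-address aliasing SPOILER relies on (this is
# not exploit code). Virtual and physical addresses share the low 12 bits of
# page offset; the width of the partial comparison above bit 11 is an assumed
# value here. Two different buffers that match in that narrow slice look like
# a conflict to a partial-only load/store check and cause measurable stalls.
PAGE_BITS = 12
PARTIAL_BITS = 8            # assumed width of the comparison, bits 12..19

def partial_tag(addr):
    return (addr >> PAGE_BITS) & ((1 << PARTIAL_BITS) - 1)

a = 0x0123_4000             # two addresses on different physical pages...
b = 0x0AB3_4000             # ...that agree in the page offset and bits 12..19
print(hex(partial_tag(a)), hex(partial_tag(b)), partial_tag(a) == partial_tag(b))
# -> 0x34 0x34 True: distinct addresses, yet identical to a partial-only check
```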

The research paper that initially disclosed SPOILER earlier this month states that Intel CPUs dating as far back as the first generation Core-series processors are affected. Intel, however, has stated that the vulnerabilities described in the paper can be avoided. The company provided a statement to PC Perspective following our initial SPOILER reporting:

Intel received notice of this research, and we expect that software can be protected against such issues by employing side channel safe software development practices. This includes avoiding control flows that are dependent on the data of interest. We likewise expect that DRAM modules mitigated against Rowhammer style attacks remain protected. Protecting our customers and their data continues to be a critical priority for us and we appreciate the efforts of the security community for their ongoing research.

Source: AMD

AMD Releases Radeon Software Adrenalin 2019 Edition 19.3.2 with Windows 7 DX12 Support

Subject: Graphics Cards | March 14, 2019 - 08:38 PM |
Tagged: Windows 7, The Division 2, radeon, graphics, gpu, gaming, dx12, driver, DirectX 12, amd, Adrenalin, 19.3.2

AMD has released Radeon 19.3.2 drivers, adding support for Tom Clancy’s The Division 2 and offering a performance boost with Civilization VI: Gathering Storm. This update also adds a number of new Vulkan extensions. But wait, there's more: "DirectX 12 on Windows 7 for supported game titles." The DX12-ening is upon us.

adrenalin.PNG

Here are AMD's release notes for 19.3.2:

Radeon Software Adrenalin 2019 Edition 19.3.2 Highlights

Support For

  • Tom Clancy’s The Division® 2
  • Sid Meier’s Civilization® VI: Gathering Storm
    • Up to 4% average performance gains on AMD Radeon VII with Radeon™ Software Adrenalin 2019 Edition 19.3.2 vs 19.2.3. RS-288
  • DirectX® 12 on Windows® 7 for supported game titles
    • AMD is thrilled to help expand DirectX® 12 adoption across a broader range of Windows operating systems with Radeon Software Adrenalin 2019 Edition 18.12.2 and onward, which enables consumers to experience exceptional levels of detail and performance in their games.

Fixed Issues

  • Radeon ReLive for VR may sometimes fail to install during Radeon Software installation.
  • Fan curve may fail to switch to manual mode after the manual toggle is switched when fan curve is still set to default behavior.
  • Changes made in Radeon WattMan settings via Radeon Overlay may sometimes not save or take effect once Radeon Overlay is closed.

Known Issues

  • Rainbow Six Siege™ may experience intermittent corruption or flickering on some game textures during gameplay.
  • DOTA™2 VR may experience stutter on some HMD devices when using the Vulkan® API.
  • Mouse cursors may disappear or move out of the boundary of the top of a display on AMD Ryzen Mobile Processors with Radeon Vega Graphics.
  • Performance metrics overlay and Radeon WattMan gauges may experience inaccurate fluctuating readings on AMD Radeon VII.

More release notes after the break.

Source: AMD

Report: AMD's Upcoming Navi GPUs Launching in August

Subject: Graphics Cards | March 13, 2019 - 02:41 PM |
Tagged: report, rumor, wccftech, amd, navi, gpu, graphics, video card, 7nm, radeon

Could Navi be coming a bit sooner than we expected? I'll quote directly from the sourced report by Usman Pirzada over at WCCFtech:

"I have been told that AMD’s Navi GPU is at least one whole month behind AMD’s 7nm Ryzen launch, so if the company launches the 3000 series desktop processors at Computex like they are planning to, you should not expect the Navi GPU to land before early August. The most likely candidates for launch during this window are Gamescom and Siggraph. I would personally lean towards Gamescom simply because it is a gaming product and is the more likely candidate but anything can happen with AMD!

Some rumors previously had suggested an October launch, but as of now, AMD is telling its partners to expect the launch exactly a month after the Ryzen 7nm launch."

AMD GPU Roadmap.png

Pay particular attention to the second paragraph of the quote above: if this report is coming from board partners, we will probably start seeing leaked box art and all the fixings from VideoCardz as August nears - provided July is indeed the release month for the Ryzen 3000 series CPUs (and come on, how could they pass up a 7/7 launch for the 7nm CPUs?).

Source: Wccftech

AMD's new 66AF:F1 GPU, a model number inspired by the USB-IF?

Subject: General Tech | March 13, 2019 - 12:53 PM |
Tagged: navi, amd, leak, AMD 66AF:F1

It's not the leak that many were hoping for, but it is still a peek at what Navi may offer when released.  The Inquirer spotted a mysterious AMD 66AF:F1 GPU attached to some CompuBench compute results that could be a Navi GPU.  If that is indeed what it is, then the new GPU will offer mid-range performance rather than competing against NVIDIA's current family of high-end cards.  It is rather telling that the Bitcoin mining benchmark was not run; as we have seen in recent financial reports, that bubble has pretty much burst.

498eff757b336f811a4b437aeb7bd523_XL.jpg

"In leaked benchmarks on CompuBench, the compute performance of the mysterious GPU suggested that it lags a little behind the Vega 56 and is more on par with the Radeon RX 580."

Source: The Inquirer

NVIDIA Acquires Mellanox: Beyond the Numbers

Subject: Editorial | March 12, 2019 - 10:14 PM |
Tagged: nvswitch, nvlink, nvidia, Mellanox, Intel, Infiniband, Ethernet, communications, chiplets, amd

In a bit of a surprise this past weekend, NVIDIA announced that it is purchasing the networking company Mellanox for approximately $6.9 billion US. NVIDIA and Intel were engaged in a bidding war for the Israel-based company. At first glance we do not see the synergies that could potentially come from such an acquisition, but digging deeper it makes much more sense. This is still a risky move for NVIDIA, as their previous acquisitions (Ageia, Icera, etc.) have not been very favorable for the company.

633889_NVLogo_3D_H_DarkType.jpg

Mellanox’s portfolio centers on datacenter connectivity solutions such as high-speed Ethernet and InfiniBand products. They are already a successful company with products shipping out the door. If there is a supercomputer somewhere, chances are it is running Mellanox technology for its high-speed interconnects. This is where things get interesting for NVIDIA.

While NVIDIA focuses on GPUs, they are spreading into the datacenter at a tremendous rate. Their NVLink implementation allows high-speed connectivity between GPUs, and they recently showed off NVSwitch, which features 18 ports. We do not know how long it took to design NVSwitch and get it running at a high level, but NVIDIA is aiming for implementations that will exceed that technology. NVIDIA had the choice of continuing with in-house designs or purchasing a company already well versed in such work, with access to advanced networking technology.

Intel was also in play for Mellanox, but that particular transaction might not have been approved by anti-trust authorities around the world. If Intel had made a winning bid for Mellanox, it would have essentially consolidated the market for these high-end networking products. In the end NVIDIA offered $6.9B US for the company and the offer was accepted. Because NVIDIA has no real networking solutions on the market, the deal will likely be approved without issue. Unlike previous purchases such as Icera, Mellanox is actively shipping product and will add to NVIDIA's bottom line.

mellanox-logo-square-blue.jpg

NVIDIA was able to purchase Mellanox in an all-cash transaction, simply dipping into their cash reserves instead of offering Mellanox shareholders equivalent shares in NVIDIA. The $6.9B price is above what AMD paid for ATI back in 2006 ($5.4B). There may be some similarities here: the price for Mellanox could be inflated relative to what the company actually brings to the table, and we may see write-downs over the next several years, much as AMD took after the ATI purchase.

The purchase will bring NVIDIA instant expertise with high-performance standards like InfiniBand. It will also help to have design teams versed in high-speed, large-node networking apply their knowledge to the GPU field and create solutions better suited to the technology. NVIDIA will also continue to sell current Mellanox products.

Another purchase in the past that looks somewhat similar to this is AMD’s acquisition of SeaMicro. That company was selling products based on their Freedom Fabric technology to create ultra-dense servers utilizing dozens of CPUs. This line of products was discontinued by AMD after poor sales, but they expanded upon Freedom Fabric and created the Infinity Fabric that powers their latest Zen CPUs.

I can see a very similar situation occurring at NVIDIA. AMD is using their Infinity Fabric to connect multiple chiplets on a substrate, as well as utilizing that fabric off the substrate, and they have also integrated it into their latest Vega GPUs. This philosophy looks to pay significant dividends for AMD once they introduce their 7nm CPUs in the form of Zen 2 and EPYC 2. By not relying on large, monolithic dies for their consumer and enterprise parts, AMD improves yields and binning on those parts compared to what Intel does with current Xeon parts.
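To illustrate the yield argument with deliberately made-up numbers (a simple Poisson defect model, not actual AMD or Intel figures), splitting a large die into chiplets dramatically raises the fraction of usable silicon:

```python
# Back-of-the-envelope yield illustration with made-up numbers; this is a
# simple Poisson defect model, not actual AMD or Intel data. The chance a die
# has zero defects is exp(-area * defect_density), so cutting the same silicon
# into four chiplets means a single defect scraps a quarter of the area
# instead of the whole die.
import math

DEFECT_DENSITY = 0.2                     # assumed defects per cm^2
monolithic_cm2 = 7.0                     # assumed ~700 mm^2 server-class die
chiplet_cm2 = monolithic_cm2 / 4.0

yield_monolithic = math.exp(-monolithic_cm2 * DEFECT_DENSITY)
yield_chiplet = math.exp(-chiplet_cm2 * DEFECT_DENSITY)

print(f"monolithic: {yield_monolithic:.1%} of dies (and silicon) usable")
print(f"chiplets:   {yield_chiplet:.1%} of chiplets usable, and a defect only "
      "costs one chiplet rather than an entire die")
```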

mellanox-quantum-connectx-6-chips-652x381.jpg

When looking at the Mellanox purchase from this angle, it makes a lot of sense for NVIDIA. With process node advances moving at a much slower pace, the demand for higher-performance solutions is only increasing. To meet this demand, NVIDIA will need to build efficient, multi-chip solutions that may require more performance and features than NVLink alone can cover. Mellanox could potentially provide the expertise and experience to help NVIDIA achieve that scale.

Source: NVIDIA