Subject: General Tech | March 19, 2019 - 01:19 PM | Jeremy Hellstrom
Tagged: atari, delay, amd, Vega, ryzen
You may remember the announcement of the re-launch of the Atari Video Computer System back in the summer of 2017, though by now you may have decided that it is going the way of the ZX Spectrum Vega+. If you do still hold out hope, Atari is once again testing your patience by announcing another delay, to the end of 2019. There is a reason, however, which you may or may not find acceptable: they will be upgrading the AMD Ryzen chip at the heart of the system to one with a newer generation of Vega graphics offering more modern performance. Atari also suggests this will make for much quieter and cooler operation, in a quote over at The Inquirer.
"The Atari VCS launched on Indiegogo and was originally set to arrive in spring 2018, but the company has announced that it will now arrive at the butt-end of 2019 (and that projection is just for the US and Canada)."
Here is some more Tech News from around the web:
- NVIDIA GTC 2019: RTX Servers, Omniverse Collaboration, CUDA-X AI, And More @ Techgage
- Corporations, not consumers, drive demand for HP’s new VR headset @ Ars Technica
- MacBook users have taken to giving oral relief to frustrated keyboards @ The Inquirer
- Firefox 66 Arrives With Autoplaying Blocked by Default, Smoother Scrolling, and Better Search @ Slashdot
- NVIDIA Jetson Nano: A Feature-Packed Arm Developer Kit For $99 USD @ Phoronix
- This headline is proudly brought to you by wired keyboards: Wireless Fujitsu model hacked @ The Register
- Apple finally updates the iMac with significantly more powerful CPU and GPU options @ Ars Technica
- TSMC seeing chip orders for Android devices ramp up @ DigiTimes
- QNAP QSW-1208-8C-US 12-Port Unmanaged 10GbE Switch @ Modders-Inc
- ASUS RT-AX88U Dual band AX6000 router @ Guru of 3D
AMD and NVIDIA GPUs Tested
Tom Clancy’s The Division 2 launched over the weekend and we've been testing it out over the past couple of days with a collection of currently-available graphics cards. Of interest to AMD fans, this game joins the ranks of those well optimized for Radeon graphics, and with a new driver (Radeon Software Adrenalin 2019 Edition 19.3.2) also just released, it was a good time to run some benchmarks and see how AMD and NVIDIA hardware stacks up.
The Division 2 offers DirectX 11 and 12 support, and uses Ubisoft's Snowdrop engine to provide some impressive visuals, particularly at the highest detail settings. We found the "ultra" preset to be quite attainable with very playable frame rates from most midrange-and-above hardware even at 2560x1440, though bear in mind that this game uses quite a bit of video memory. We hit a performance ceiling at 4GB with the "ultra" preset even at 1080p, so we opted for 6GB+ graphics cards for our final testing. And while most of our testing was done at 1440p we did test a selection of cards at 1080p and 4K, just to provide a look at how the GPUs on test scaled when facing different workloads.
Tom Clancy's The Division 2
Washington D.C. is on the brink of collapse. Lawlessness and instability threaten our society, and rumors of a coup in the capitol are only amplifying the chaos. All active Division agents are desperately needed to save the city before it's too late.
Developed by Ubisoft Massive and the same teams that brought you Tom Clancy’s The Division, Tom Clancy’s The Division 2 is an online open world, action shooter RPG experience set in a collapsing and fractured Washington, D.C. This rich new setting combines a wide variety of beautiful, iconic, and realistic environments where the player will experience the series’ trademark for authenticity in world building, rich RPG systems, and fast-paced action like never before.
Play solo or co-op with a team of up to four players to complete a wide range of activities, from the main campaign and adversarial PvP matches to the Dark Zone – where anything can happen.
Subject: Graphics Cards | March 18, 2019 - 03:13 PM | Jeremy Hellstrom
Tagged: fxaa, SMAA, Anti-aliasing, MLAA, taa, amd, nvidia
Apart from the new DLSS available on NVIDIA's RTX cards, it has been a very long time since we looked at anti-aliasing implementations and the effect your choice has on performance and visual quality. You are likely familiar with the most common implementations, from AMD's MLAA and NVIDIA's FXAA, which are rarely used in new-generation games, to TAA/TXAA and SMAA, but when was the last time you refreshed your memory on what they actually do and how they compare?
Not only does Overclockers Club look into those, they also discuss some of the other attempted implementations, as well as the sampling types that lie behind these technologies. Check out their deep dive here.
"One setting present in many if not all modern PC games that can dramatically impact performance and quality is anti-aliasing and, to be honest, I never really understood how it works. Sure we have the general idea that super-sampling is in effect running at a higher resolution and then downscaling, but then what is multi-sampling? How do post-processing methods work, like the very common FXAA and often favored SMAA?"
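The super-sampling idea mentioned in the quote can be made concrete with a toy example. The sketch below is plain Python written for illustration (it is not from the Overclockers Club article): render at 2x resolution, then average each 2x2 block of subsamples down to one output pixel, which turns a hard edge into a smooth gradient.

```python
def downsample_2x(hires):
    """Box-filter a 2x supersampled image (list of rows of floats) down to
    half resolution: each output pixel is the average of the 2x2 block of
    subsamples that covers it."""
    h, w = len(hires) // 2, len(hires[0]) // 2
    return [[(hires[2 * y][2 * x] + hires[2 * y][2 * x + 1] +
              hires[2 * y + 1][2 * x] + hires[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(w)]
            for y in range(h)]

# A hard vertical edge rendered at 2x resolution: columns 0-2 black, 3-7 white.
hires = [[0.0] * 3 + [1.0] * 5 for _ in range(4)]
lores = downsample_2x(hires)
# The output pixel straddling the boundary comes out grey (0.5) rather than
# snapping to black or white - that blend is what smooths the "jaggies".
```

Multi-sampling and the post-process methods (FXAA, SMAA, TAA) are cheaper refinements of this same averaging idea, which is what the linked article walks through in detail.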
Here are some more Graphics Card articles from around the web:
- MSI GTX 1660 Ti Gaming X – Turing Without The RTX @ Bjorn3d
- MSI GeForce GTX 1660 Gaming X 6 GB @ TechPowerUp
- The GTX 1660 41 game OC Shootout vs. the RX 590 @ BabelTechReviews
- MSI Nvidia GeForce GTX 1660 Gaming X Review – an Even Lower Cost Turing Option? @ Bjorn3d
Subject: General Tech | March 18, 2019 - 09:03 AM | Sebastian Peak
Tagged: vulkan, RX Vega 56, rtx, ray tracing, radeon, nvidia, Neon Noir, dx12, demo, crytek, CRYENGINE, amd
Crytek has released video of a new demo called Neon Noir, showcasing real-time ray tracing with a new version of CRYENGINE Total Illumination slated for release in 2019. The big story here is that this is platform agnostic, meaning both AMD and NVIDIA (including non-RTX) graphics cards can produce the real-time lighting effects. The video was rendered in real time on an AMD Radeon RX Vega 56 (!) at 4K30, with Crytek's choice of GPU seeming to assuage fears of any meaningful performance penalty with this feature enabled (video embedded below):
“Neon Noir follows the journey of a police drone investigating a crime scene. As the drone descends into the streets of a futuristic city, illuminated by neon lights, we see its reflection accurately displayed in the windows it passes by, or scattered across the shards of a broken mirror while it emits a red and blue lighting routine that will bounce off the different surfaces utilizing CRYENGINE's advanced Total Illumination feature. Demonstrating further how ray tracing can deliver a lifelike environment, neon lights are reflected in the puddles below them, street lights flicker on wet surfaces, and windows reflect the scene opposite them accurately.”
Crytek is calling the new ray tracing features “experimental” at this time, but the implications of ray tracing tech that works beyond proprietary hardware and even across graphics APIs (both DirectX 12 and Vulkan) are obviously a very big deal.
“Neon Noir was developed on a bespoke version of CRYENGINE 5.5., and the experimental ray tracing feature based on CRYENGINE’s Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.”
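For context on what the engine is computing in those mirror and puddle shots: at its core, a ray-traced reflection simply reflects the view ray about the surface normal, r = d - 2(d·n)n. A minimal Python sketch of that math, for illustration only (this is not CRYENGINE code):

```python
def reflect(d, n):
    """Reflect direction vector d about unit surface normal n (3-tuples),
    using r = d - 2*(d . n)*n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A ray heading down at 45 degrees onto a flat puddle (normal pointing up)
# bounces back up at 45 degrees, keeping its horizontal direction - which is
# why the neon signs above appear mirrored in the water below.
r = reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0))
```

The expensive part in a real engine is not this reflection formula but tracing each reflected ray against the scene, which is where Crytek's hardware-agnostic Total Illumination work comes in.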
You can read the full announcement from Crytek here.
Subject: Processors | March 18, 2019 - 08:38 AM | Jim Tanous
Tagged: spoiler, speculation, spectre, rowhammer, meltdown, amd
AMD has issued a support article stating that its CPUs are not susceptible to the recently disclosed SPOILER vulnerability. Support Article PA-240 confirms initial beliefs that AMD processors were immune from this specific issue due to the different ways that AMD and Intel processors store and access data:
We are aware of the report of a new security exploit called SPOILER which can gain access to partial address information during load operations. We believe that our products are not susceptible to this issue because of our unique processor architecture. The SPOILER exploit can gain access to partial address information above address bit 11 during load operations. We believe that our products are not susceptible to this issue because AMD processors do not use partial address matches above address bit 11 when resolving load conflicts.
SPOILER, one of the latest in the line of speculative execution vulnerabilities that have called into question years of processor architecture design, describes a process that can expose the mappings between virtual and physical memory. That's not a full exploit in and of itself, but it allows other attacks such as Rowhammer to be executed much more quickly and easily.
The research paper that initially disclosed SPOILER earlier this month states that Intel CPUs dating as far back as the first generation Core-series processors are affected. Intel, however, has stated that the vulnerabilities described in the paper can be avoided. The company provided a statement to PC Perspective following our initial SPOILER reporting:
Intel received notice of this research, and we expect that software can be protected against such issues by employing side channel safe software development practices. This includes avoiding control flows that are dependent on the data of interest. We likewise expect that DRAM modules mitigated against Rowhammer style attacks remain protected. Protecting our customers and their data continues to be a critical priority for us and we appreciate the efforts of the security community for their ongoing research.
Subject: Graphics Cards | March 14, 2019 - 08:38 PM | Sebastian Peak
Tagged: Windows 7, The Division 2, radeon, graphics, gpu, gaming, dx12, driver, DirectX 12, amd, Adrenalin, 19.3.2
AMD has released Radeon 19.3.2 drivers, adding support for Tom Clancy’s The Division 2 and offering a performance boost with Civilization VI: Gathering Storm. This update also adds a number of new Vulkan extensions. But wait, there's more: "DirectX 12 on Windows 7 for supported game titles." The DX12-ening is upon us.
Here are AMD's release notes for 19.3.2:
Radeon Software Adrenalin 2019 Edition 19.3.2 Highlights
- Tom Clancy’s The Division® 2
- Sid Meier’s Civilization® VI: Gathering Storm
- Up to 4% average performance gains on AMD Radeon VII with Radeon™ Software Adrenalin 2019 Edition 19.3.2 vs 19.2.3. RS-288
- DirectX® 12 on Windows® 7 for supported game titles
- AMD is thrilled to help expand DirectX® 12 adoption across a broader range of Windows operating systems with Radeon Software Adrenalin 2019 Edition 18.12.2 and onward, which enables consumers to experience exceptional levels of detail and performance in their games.
Known Issues
- Radeon ReLive for VR may sometimes fail to install during Radeon Software installation.
- Fan curve may fail to switch to manual mode after the manual toggle is switched when fan curve is still set to default behavior.
- Changes made in Radeon WattMan settings via Radeon Overlay may sometimes not save or take effect once Radeon Overlay is closed.
- Rainbow Six Siege™ may experience intermittent corruption or flickering on some game textures during gameplay.
- DOTA™2 VR may experience stutter on some HMD devices when using the Vulkan® API.
- Mouse cursors may disappear or move out of the boundary of the top of a display on AMD Ryzen Mobile Processors with Radeon Vega Graphics.
- Performance metrics overlay and Radeon WattMan gauges may experience inaccurate fluctuating readings on AMD Radeon VII.
Subject: Graphics Cards | March 13, 2019 - 02:41 PM | Sebastian Peak
Tagged: report, rumor, wccftech, amd, navi, gpu, graphics, video card, 7nm, radeon
Could Navi be coming a bit sooner than we expected? I'll quote directly from the sourced report by Usman Pirzada over at WCCFtech:
"I have been told that AMD’s Navi GPU is at least one whole month behind AMD’s 7nm Ryzen launch, so if the company launches the 3000 series desktop processors at Computex like they are planning to, you should not expect the Navi GPU to land before early August. The most likely candidates for launch during this window are Gamescom and Siggraph. I would personally lean towards Gamescom simply because it is a gaming product and is the more likely candidate but anything can happen with AMD!
Some rumors previously had suggested an October launch, but as of now, AMD is telling its partners to expect the launch exactly a month after the Ryzen 7nm launch."
Paying particular attention to the second paragraph of the quote above: if this report is coming from board partners, we will probably start seeing leaked box art and all the fixings from VideoCardz as August nears - if indeed July is the release month for the Ryzen 3000 series CPUs (and come on, how could they pass on a 7/7 launch for the 7nm CPUs?).
Subject: General Tech | March 13, 2019 - 12:53 PM | Jeremy Hellstrom
Tagged: navi, amd, leak, AMD 66AF:F1
It's not the leak that many were hoping for, but it is still a peek at what Navi may offer when released. The Inquirer spotted a mysterious AMD 66AF:F1 GPU attached to some CompuBench compute results that could be a Navi GPU. If that is indeed what it is, then the new GPU will offer mid-range performance, as opposed to competing against NVIDIA's current family of high-end cards. It is rather telling that the Bitcoin Mining benchmark was not run; as recent financial reports have shown, that bubble has pretty much burst.
"In leaked benchmarks on CompuBench, the compute performance of the mysterious GPU suggested that it lags a little behind the Vega 56 and is more on par with the Radeon RX 580."
Here is some more Tech News from around the web:
- Windows 7 users will start getting upgrade nag screens starting next month @ The Inquirer
- NAND flash prices to drop by up to 10% in 2Q19 @ DigiTimes
- Microsoft plugs two Windows zero-day flaws under active attack @ The Inquirer
- Let there be Lightbits: Hardware-software tag team touts block-level storage at SSD latencies @ The Register
- HP Recalls More Laptops For 'Fire and Burn Hazards' @ Slashdot
- Take Note: Schneider's teeny-tiny Galaxy VS li-ion UPS set to explode onto data centre scene @ The Register
- Denuvo Performance Cost & FPS Loss Tested @ TechPowerUp
- You May Have Forgotten Foursquare, But It Didn't Forget You @ Slashdot
- No, Your 3D Printer Doesn’t Have a Fingerprint @ Hackaday
Subject: Editorial | March 12, 2019 - 10:14 PM | Josh Walrath
Tagged: nvswitch, nvlink, nvidia, Mellanox, Intel, Infiniband, Ethernet, communications, chiplets, amd
In a bit of a surprise this past weekend, NVIDIA announced that it is purchasing the networking company Mellanox for approximately $6.9 billion US. NVIDIA and Intel were engaged in a bidding war for the Israel-based company. At first glance we do not see the synergies that could potentially come from such an acquisition, but digging deeper it makes much more sense. This is still a risky move for NVIDIA, as its previous history of acquisitions has not been very favorable for the company (Ageia, Icera, etc.).
Mellanox's portfolio centers around datacenter connectivity solutions such as high speed Ethernet and InfiniBand products. They are already a successful company with products shipping out the door. If there is a supercomputer somewhere, chances are it is running Mellanox technology for its high speed interconnects. This is where things get interesting for NVIDIA.
While NVIDIA focuses on GPUs, they are spreading into the datacenter at a pretty tremendous rate. Their NVLink implementation allows high speed connectivity between GPUs, and recently they showed off their NVSwitch, which features 18 ports. We do not know how long it took to design the NVSwitch and get it running at a high level, but NVIDIA is aiming for implementations that will exceed that technology. NVIDIA had the choice to continue in-house designs or to purchase a company already well versed in such work, with access to advanced networking technology.
Intel was also in the running for Mellanox, but that particular transaction might not have been approved by anti-trust authorities around the world. If Intel had made a winning bid for Mellanox, it would have essentially consolidated the market for these high end networking products. In the end NVIDIA offered $6.9B US for the company and it was accepted. Because NVIDIA has no real networking solutions on the market, the deal will likely be approved without issue. Unlike previous purchases such as Icera, Mellanox is actively shipping product and will add to NVIDIA's bottom line.
NVIDIA was able to purchase Mellanox in an all-cash transaction, dipping into its cash reserves instead of offering Mellanox shareholders equity in NVIDIA. This $6.9B is above what AMD paid for ATI back in 2006 ($5.4B). There may be some similarities here, in that the price for Mellanox could be overvalued compared to what the company actually brings to the table, and we may see write-downs over the next several years, much as AMD recorded after the ATI purchase.
The purchase will bring NVIDIA instant expertise with high performance standards like InfiniBand. It will also help to have design teams versed in high speed, large-node networking apply their knowledge to the GPU field and create solutions better suited to the technology. NVIDIA will also continue to sell current Mellanox products.
Another purchase in the past that looks somewhat similar to this is AMD's acquisition of SeaMicro. That company was selling products based on its Freedom Fabric technology to create ultra-dense servers utilizing dozens of CPUs. AMD discontinued this line of products after poor sales, but expanded upon Freedom Fabric to create the Infinity Fabric that powers its latest Zen CPUs.
I can see a very similar situation occurring at NVIDIA. AMD is using its Infinity Fabric to connect multiple chiplets on a substrate, as well as utilizing that fabric off the substrate, and has also integrated it into its latest Vega GPUs. This philosophy looks to pay significant dividends for AMD once it introduces its 7nm CPUs in the form of Zen 2 and EPYC 2. AMD is not relying on large, monolithic dies for its consumer and enterprise parts, thereby improving yields and bins on these parts compared to what Intel does with current Xeon parts.
When looking at the Mellanox purchase from this view, it makes a lot of sense for NVIDIA. With process node advances moving at a much slower pace, the demand for higher performance solutions is only increasing. To meet this demand NVIDIA will be required to make efficient, multi-chip solutions that may require more performance and features than what can be covered by NVLINK. Mellanox could potentially provide the expertise and experience to help NVIDIA achieve such scale.
Subject: General Tech | March 7, 2019 - 12:58 PM | Jeremy Hellstrom
Tagged: Zen 2, threadripper the third, ryzen pro mobile, Ryzen 3000, epyc 2, amd
Near the beginning of AMD's Investor Relations slide deck sits a glimpse at what we will see from the company, minus a date for Navi. Some time before summer break we will see the release of the second generation of Ryzen Pro Mobile chips (picture not to scale), which will bring improved graphics to your office devices.
Just after the mobile chips, and in time to give you a reason not to go outside, the third generation of Ryzen desktop chips will hit the market, just as AMD promised. We already have a good idea about what those chips will be called, as well as their specifications; Tim covered it at length here if you have yet to memorize all the models.
We will also see Thirdripper, or as The Tech Report prefers, Threadripper the Third. Though we lack any information on dates or models, you should expect to see it before the end of the year, with even more cores running at higher frequencies.
AMD will also be releasing a Zen 2 based EPYC family, for chiplet fans everywhere! The company suggests it will be twice as fast as the previous generation overall, and up to four times as fast in certain floating point operations; as is tradition, it will be compatible with the current sockets on EPYC motherboards, so you can do a quick and easy drop-in upgrade.
"Can you really call something a leak if the company released it on purpose? AMD's just released a slide deck for its investors, and buried in those slides are a few tiny nuggets of interesting information. Let's take a quick peek into the red team's path ahead."
Here is some more Tech News from around the web:
- Blizzard has handed Diablo 1’s keys to GOG, and you can buy it right now @ Ars Technica
- Microsoft open-sources Windows Calculator 'cos maths keeps changing, apparently @ The Inquirer
- 5G is 'ready' once you redefine 'ready'... and then redefine 'reality' @ The Register
- The digital game store wars: Who are the players? @ The Tech Report
- Did you know?! Ghidra, the NSA's open-sourced decompiler toolkit, is ancient Norse for 'No backdoors, we swear!' @ The Register
- Servers key to reversing downward trend in DRAM prices @ DigiTimes