
Going out on a Prime note, their last Z390 review

Subject: Motherboards | March 21, 2019 - 03:00 PM |
Tagged: PRIME Z390-A, asus

Here it is, the last motherboard review you will see from [H]ard|OCP as the sun sets on another review site; Kyle has chosen to join the dark side.  The ASUS PRIME Z390-A has a rather unique-looking silkscreen pattern, but RGB lovers will find the placement of the glowy bits unfortunate.  At $183 currently on Amazon it sits in the middle of the Z390 family as far as pricing goes, and that does mean it lacks some of the fancier features offered by flagship-class motherboards.  If stability, decent overclocking, and a bargain price matter more to you than wireless connectivity and a well-developed case of RGB-itis, then this last review from [H] is well worth a look.

op01.jpg

"Sometimes less is more. We typically work with a lot of motherboards that fall into the more is more category and some that have a high price that can’t easily be justified by most people. The ASUS PRIME Z390-A seems to lean towards less is more. On paper the Z390-A seems to have a lot going for it and is light on fluff. How does it stack up? "

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP

Nice OS you have there; shame if something happened to it

Subject: General Tech | March 21, 2019 - 01:02 PM |
Tagged: Windows 7, windows 10, microsoft, EoL

It does have to be said that running a 10-year-old Microsoft OS might not be the wisest decision, though it is better than running one that is 24 years old.  However, as we learned in 2017, many businesses are not even close to adopting Windows 10 on the majority of their systems.  There are numerous reasons for that delay, from licensing through security to privacy, not to mention an interface different enough to upset many executive-level users. 

That hasn't stopped Microsoft from once again displaying splash screens on Windows 7 machines, as KB4493132 rolls out to those with automatic updates enabled.  Thankfully it does not attempt to fool you into updating by changing the way the close-window button works, but then again, the update is no longer free.  The Inquirer, as you might expect, is as enthused about this as most users.

buggers.PNG

"HERE WE GO AGAIN. Two years on from Updategate, Microsoft is back to posting nag screens on its outgoing operating system."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Inquirer

PC Perspective Podcast #537 - Division 2 GPU Testing, Google Stadia, and the End of [H]ardOCP

Subject: General Tech | March 21, 2019 - 12:05 PM |
Tagged: The Division 2, stadia, razer, ray tracing, Oculus, Intel, hardocp, game streaming, DirectX 12, Basilisk

PC Perspective Podcast #537 - 3/20/2019

Join us this week as we review the Razer Basilisk Essential gaming mouse, preview GPU performance in Tom Clancy's The Division 2, dig into Google Stadia and real-time ray tracing on Pascal, say farewell to [H]ardOCP as Kyle Bennett heads to Intel, and more!

Subscribe to the PC Perspective Podcast

Check out previous podcast episodes: http://pcper.com/podcast

Today's sponsor is KiwiCo. Change the way your kids learn and play. Get your first KiwiCo Crate free: https://www.kiwico.com/pcper

Show Topics
00:00:22 - Intro
00:02:10 - Review: Razer Basilisk Essential Gaming Mouse
00:05:49 - Review: The Division 2 Performance Preview
00:18:34 - News: Real-Time Ray Tracing for Pascal
00:29:19 - News: Google Stadia
00:46:19 - News: GameWorks RTX & Unreal/Unity DXR Support
00:48:54 - News: Crytek Neon Noir Ray Tracing Demo
00:50:48 - News: Variable Rate Shading for DirectX 12
00:54:06 - News: NVIDIA T4 GPUs for the Server Room
01:00:10 - News: NVIDIA Creator Ready Drivers
01:04:29 - News: Oculus Rift S
01:08:39 - News: AMD Immune to SPOILER
01:18:13 - News: Windows 7 DX12 Support in Adrenalin Drivers
01:22:32 - News: Kyle Bennett Joins Intel
01:34:28 - Picks of the Week
01:44:58 - Outro

Picks of the Week
Jim: Cheap Threadripper
Jeremy: RogueTech Mod for BattleTech
Josh: Intel 660p SSD
Sebastian: OCAT 1.4

Today's Podcast Hosts
Sebastian Peak
Josh Walrath
Jeremy Hellstrom
Jim Tanous

Ripping threads with Coreprio and DLM

Subject: Processors | March 20, 2019 - 04:34 PM |
Tagged: amd, coreprio, threadripper 2, 2990wx, dynamic local mode

Owning a Threadripper is not boring; the new architecture offers a variety of interesting challenges to keep your attention.  One of those is the lack of direct memory access for two of the dies, which can cause some performance issues and was at least partially addressed by the introduction of Dynamic Local Mode in Ryzen Master.  On Windows boxes, enabling that feature ensures your hardest-working cores have direct memory access; on Linux systems the problem simply doesn't exist.  Another option is Coreprio, developed by Bitsum, which accomplishes the same task but without the extras included in Ryzen Master. 
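For readers curious what these tools are doing under the hood, here is a hedged Win32 sketch of the general idea - steering a busy thread onto a NUMA node that actually has memory attached - though neither Ryzen Master's DLM nor Coreprio necessarily works exactly this way, and both also reprioritize threads dynamically:

    #include <windows.h>

    // Illustrative only: find a NUMA node reporting local memory (on a 2990WX,
    // two of the four nodes have none) and pin the given thread to its cores.
    bool PinThreadToMemoryNode(HANDLE thread)
    {
        ULONG highestNode = 0;
        if (!GetNumaHighestNodeNumber(&highestNode))
            return false;

        for (USHORT node = 0; node <= (USHORT)highestNode; ++node)
        {
            ULONGLONG availableBytes = 0;
            if (GetNumaAvailableMemoryNodeEx(node, &availableBytes) && availableBytes > 0)
            {
                GROUP_AFFINITY affinity = {};
                if (GetNumaNodeProcessorMaskEx(node, &affinity))
                    return SetThreadGroupAffinity(thread, &affinity, nullptr) != FALSE;
            }
        }
        return false;
    }

    // Example: PinThreadToMemoryNode(GetCurrentThread());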

Techgage ran a series of benchmarks comparing the differences in performance between the default setting, DLM and Coreprio.

AMD-Ryzen-Threadripper-2990WX-CCX-Diagram.png

"Performance regression issues in Windows on AMD’s top-end Ryzen Threadripper CPUs haven’t gone unnoticed by those who own them, and six months after launch, the issues remain. Fortunately, there’s a new tool making the rounds that can help smooth out those regressions. We’re taking an initial look."

Here are some more Processor articles from around the web:

Processors

Source: Techgage

NVIDIA Introduces Creator Ready Driver Program (419.67)

Subject: Graphics Cards | March 20, 2019 - 04:03 PM |
Tagged: nvidia, graphics drivers

At the ongoing Game Developers Conference, GDC 2019, NVIDIA has announced a “Creator Ready Driver” program. This branch targets users who sit between the GeForce and Quadro lines and use GeForce cards for professional applications – such as game developers.

nvidia-2019-creatorreadylogo.png

Both Game Ready Drivers and the Creator Ready Drivers will support all games and creative applications. The difference is that Creator Ready drivers will be released according to the release schedule of popular creative tools, and they will receive more strict testing with creative applications.

The first release, Creator Ready Drivers 419.67, lists a 13% performance increase with the Blender Cycles renderer, and an 8% to 9% increase for CINEMA 4D, Premiere Pro CC, and Photoshop CC, all relative to the 415-branch GeForce drivers.

You can choose between either branch using the GeForce Experience interface.

nvidia-2019-creatorreadygfe.png

Image Credit: NVIDIA

As for supported products? The Creator Ready Drivers are available for Pascal-, Volta-, and Turing-based GeForce and TITAN GPUs. “All modern Quadro” GPUs are also supported on the Creator Ready branch.

Personally, my brain immediately thinks “a more-steady driver schedule and some creative application performance enhancements at the expense of a little performance near the launch of major video games”. I could be wrong, but that’s what I think when I read this news.

Source: NVIDIA

This is your father's BattleTech!

Subject: General Tech | March 20, 2019 - 02:15 PM |
Tagged: roguetech, mod, gaming, battletech

Have you already collected every 'mech available in the game, bashed your way through the Flashpoint expansion, and found yourself looking for something new in your BattleTech gaming time?  How does gunning down a CattleMaster, Uriel or Phoenix Hawk with an LBX Autocannon sound?  RogueTech is a huge mod for Harebrained Schemes' BattleTech which adds over 1000 'mechs, just about every weapon added to the tabletop game over the years, a fair number of new crunchies, and even those annoying Power Armour nitwits.  

The changes go deeper than that, with total overhauls to movement, gunnery and how damage is inflicted, by yourself as well as others.  Expect your first missions to go poorly as you stumble your way through the complete revamp of the game, as you can read about over at Rock, Paper, SHOTGUN; installation instructions included.

Roguetech-Bessie.jpg

"It brings the combat a little more in line with the full, expanded tabletop rule-set (which no sane person ever used, thanks to requiring dozens of skill checks and countless dice rolled a turn), but through the magic of computers, we can experience all the thrills of full simulation-level ‘Mech combat without putting you to sleep or frying your brain."

Here is some more Tech News from around the web:

Tech Talk

 

Oculus shows off their new headgear at GDC

Subject: General Tech | March 20, 2019 - 01:20 PM |
Tagged: Oculus Rift S, Oculus, vr headset, gdc 2019

The brand new Oculus Rift S is being shown off at GDC and Ars Technica had a chance to play with it.  The new implementation is rather impressive: a single wire connects you to your PC, and there are now no external cameras; instead they have been relocated to the headset itself.  From the description in the review it seems they have done so very successfully, with location tracking improving rather than degrading as a result of that change.  Your eyeballs also get an upgraded experience, with each eye getting a 1280x1440 display, though Oculus has not as of yet moved to AMOLED screens.

Check it out here.

oculus_gdc19-7-1440x960.jpg

"This is out-of-the-box room scale," Oculus co-founder Nate Mitchell said as he gestured to the Oculus Rift S, the new PC-focused VR headset launching "this spring" for $399. This headset will effectively replace the standard Oculus Rift headset, which has had production all but halted to make way for the new model."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica

Microsoft Announces Variable Rate Shading for DirectX 12

Subject: Graphics Cards | March 19, 2019 - 08:29 PM |
Tagged: microsoft, DirectX 12, turing, nvidia

To align with the Game Developers Conference, GDC 2019, Microsoft has announced Variable Rate Shading for DirectX 12. This feature increases performance by allowing the GPU to lower its shading resolution for specific parts of the scene (without the developer doing explicit, render texture-based tricks).

An NVIDIA speech from SIGGRAPH 2018 (last August)

The feature is divided into three parts:

  • Lowering the resolution of specific draw calls (tier 1)
  • Lowering the resolution within a draw call by using an image mask (tier 2)
  • Lowering the resolution within a draw call per primitive (ex: triangle) (tier 2)

The last two points are tagged as “tier 2” because they can reduce the workload within a single draw call, which is an item of work that is sent to the GPU. A typical draw call for a 3D engine is a set of triangles (vertices and indices) paired with a material (a shader program, textures, and properties). While it is sometimes useful to lower the resolution for particularly complex draw calls that take up a lot of screen space but whose output is also relatively low detail, such as water, there are real benefits to being more granular.
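As a rough illustration of what tier 1 looks like from the application side, here is a minimal D3D12 sketch, based on the shipping D3D12 headers rather than anything Microsoft published alongside this announcement (the draw calls themselves are omitted):

    #include <d3d12.h>

    // Illustrative only: shade a low-detail draw (e.g. water) at one invocation
    // per 2x2 pixel block, then return to full-rate shading for everything else.
    void DrawWaterAtCoarseRate(ID3D12GraphicsCommandList5* cmdList)
    {
        // PASSTHROUGH combiners: just use the base rate set here, ignoring any
        // per-primitive rate or screen-space image.
        const D3D12_SHADING_RATE_COMBINER combiners[2] =
            { D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
              D3D12_SHADING_RATE_COMBINER_PASSTHROUGH };

        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
        // ... issue the water draw call here ...

        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
        // ... issue the remaining, full-rate draw calls ...
    }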

The second part, an image mask, allows detail to be reduced for certain areas of the screen. This can be useful in several situations:

  • The edges of a VR field of view
  • Anywhere that will be brutalized by a blur or distortion effect
  • Objects behind some translucent overlays
  • Even negating a tier 1-optimized section to re-add quality where needed

The latter example was the one that Microsoft focused on with their blog. Unfortunately, I am struggling to figure out what specifically is going on, because the changes that I see (ex: the coral reef, fish, and dirt) don’t line up with their red/blue visualizer. The claim is that they use an edge detection algorithm to force high-frequency shading where there would be high-frequency detail.
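Tier 2's screen-space mask is exposed through the same API; a hedged sketch (again based on the public D3D12 headers, with the building of the one-byte-per-tile rate image left to the application) might look like this:

    #include <d3d12.h>

    // Illustrative only: check for tier 2 support, then attach an app-built
    // shading-rate image whose texels select the rate for each screen tile.
    bool TryEnableShadingRateImage(ID3D12Device* device,
                                   ID3D12GraphicsCommandList5* cmdList,
                                   ID3D12Resource* shadingRateImage)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                               &options6, sizeof(options6))) ||
            options6.VariableShadingRateTier < D3D12_VARIABLE_SHADING_RATE_TIER_2)
        {
            return false;  // fall back to per-draw rates, or no VRS at all
        }

        // Second combiner lets the image override the per-draw/per-primitive rate.
        const D3D12_SHADING_RATE_COMBINER combiners[2] =
            { D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
              D3D12_SHADING_RATE_COMBINER_OVERRIDE };
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
        cmdList->RSSetShadingRateImage(shadingRateImage);
        return true;
    }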

microsoft-2019-dx12-vrs-1.png

Right side reduces shading by 75% for terrain and water

microsoft-2019-dx12-vrs-2.png

Right side reclaims some lost fidelity based on edge detection algorithm

microsoft-2019-dx12-vrs-3.png

Visualization of where shading complexity is spent.

(Red is per-pixel. Blue is 1 shade per 2x2 block.)

Images Source: Firaxis via Microsoft

Microsoft claims that this feature will only be available for DirectX 12. That said, NVIDIA, when Turing launched, claimed that Variable Rate Shading would be available for DirectX 11, DirectX 12, Vulkan, and OpenGL. I’m not sure what is different about Microsoft’s implementation that lets them keep it separate from NVIDIA’s extension.

Microsoft will have good tools support, however. They claim that their PIX for Windows performance analysis tool will support this feature on Day 1.

Source: Microsoft

NVIDIA is ready to storm the server room

Subject: General Tech, Graphics Cards, Networking, Shows and Expos | March 19, 2019 - 06:16 PM |
Tagged: nvidia, t4, amazon, microsoft, NGC, Mellanox, CUDA-X, GTC, jen-hsun huang, DRIVE Constellation, ai

As part of their long list of announcements yesterday, NVIDIA revealed they are partnering with Cisco, Dell EMC, Fujitsu, Hewlett Packard Enterprise, Inspur, Lenovo and Sugon to provide servers powered by T4 Tensor Core GPUs, optimized to run their CUDA-X AI accelerators. 

t4.PNG

Those T4 GPUs have been on the market for a while, but this marks the first major success for NVIDIA in the server room, with models available for purchase from those aforementioned companies.  Those who prefer other people's servers can also benefit from these new products, with Amazon and Microsoft offering cloud-based solutions.  Setting yourself up to run NVIDIA's NGC software may save a lot of money down the road; the cards sip a mere 70W of power, which is rather more attractive than the consumption of a gaggle of Tesla V100s.  One might be guilty of suspecting this offers an explanation for their recent acquisition of Mellanox.

purty.PNG

NGC software offers more than just a platform to run optimizations on; it also offers a range of templates to start from, covering classification, object detection, sentiment analysis, and most other basic starting points for training a machine.  Customers will also be able to upload their own models to share internally or, if in the mood, externally with other users and companies.  It supports existing products such as TensorFlow and PyTorch but also offers access to CUDA-X AI, which as the name suggests takes advantage of the base design of the T4 GPU, reducing the time spent waiting for results and letting users advance designs quickly. 

memorex.PNG

If you are curious exactly what particular implementations of everyone's favourite buzzword might look like, NVIDIA's DRIVE Constellation is an example after JoshTekk's own heart: literally a way to create open, scalable simulations for large fleets of self-driving cars to train them ... for good, one hopes.  Currently the Toyota Research Institute-Advanced Development utilizes these products in the development of their next self-driving fleet, and NVIDIA obviously hopes others will follow suit. 

replaced.PNG

There is not much to see from the perspective of a gamer in the short term, but considering NVIDIA's work at shifting the horsepower from the silicon you own to their own cloud, this will certainly impact the future of gaming from both a hardware and gameplay perspective.  GPUs as a Service may not be the future many of us want, but this suggests it could be possible, not to mention the dirty tricks enemy AIs will be able to pull with this processing power behind them.

One might even dream that escort missions could become less of a traumatic experience!

Source: NVIDIA

Cooler Master Launches MasterBox NR400 and NR600 Cases

Subject: Cases and Cooling | March 19, 2019 - 03:35 PM |
Tagged: tempered glass, mesh, MasterBox NR600, MasterBox NR400, enclosure, cooler master, chassis, case, airflow

Cooler Master has launched a pair of new cases today with the MasterBox NR400 and the MasterBox NR600. These budget-friendly cases offer a minimalist approach with clean lines and no flashy lighting effects, and should offer good airflow with their full mesh front panels.

Cooler_Master_NR_0.jpg

The Cooler Master MasterBox NR600 is an ATX mid-tower

The MasterBox NR400 is the smaller of the two cases, supporting mini ITX and micro ATX motherboards, with the larger MasterBox NR600 offering support for standard ATX motherboards.

Cooler_Master_NR_1.jpg

Cooler Master provides this list of features for these two new NR series cases:

  • Minimalistic Mesh Design - Elegant design elements are applied to mesh for optimal thermal performance.
  • Optimal Thermal Performance – The full mesh front panel and ventilated top panel provide a high potential for thermal performance.
  • Flush Tempered Glass Side Panel – The tempered glass side panel, fastened by thumbscrews on the rear panel, keeps the surface completely flush.
  • Headset Jack – The single 4 pole headset jack features both audio and microphone capabilities simultaneously so separate jacks are not needed.
  • Graphics Card Support Up to 346mm (NR400) / 410mm (NR600) – Generous clearance space is provided to support the latest graphics cards.
  • Cable Management – High quality, longer length rubber grommets paired with generous clearance behind the motherboard offers ample room for cable management.

Cooler_Master_NR_2.jpg

"Thermal performance is at the core of the NR Series, with the combination of two pre-installed fans for intake and exhaust and the fine mesh front panel. Generous support for radiator mounting and unrestricted ventilation on the top and front panel ensure that even the most demanding components are efficiently cooled while the two USB 3.1 gen 1 ports at the top of the case provide ample connectivity for everyday use."

Cooler_Master_NR.jpg

The Cooler Master MasterBox NR400 is a smaller mATX design

The MasterBox NR400 has an MSRP of $59.99, with the NR600 at $69.99. Both cases are available for pre-sale today, though no listings have yet found their way to Amazon or Newegg at time of writing.

Google Announces Stadia at GDC 2019

Subject: General Tech | March 19, 2019 - 03:28 PM |
Tagged: google, stadia

Google Stadia, a new gaming service, was announced at the 2019 Game Developers Conference. Much like OnLive and PlayStation Now, it has users connect to instances of video games running in datacenters. Because it is a conference for game developers, Google also mentioned a few added features, such as transcoding directly to YouTube and allowing audience members to wait in line and jump into a streamer’s game session.

Requirements and price were not mentioned, but it will arrive sometime in 2019.

Google also mentioned that applicable games would run on Linux and the Vulkan API. Given their Android initiative, and their desire to not pay Microsoft a license fee for VMs, this makes a lot of sense. It also forces major engine developers to target and optimize Linux-focused APIs, which will be good in the long run, especially if Google starts adding GPUs from Intel and NVIDIA in the coming years. I'm not sure how much it will push Linux ports, however, because that's not really why publishers avoid the platform.

In terms of hardware, Google claims that an instance will have about 10.7 teraflops of GPU performance on an AMD platform. In terms of compute, this puts it roughly on par with a GTX 1080 Ti, although AMD tends to have reduced fillrate, etc., which keeps its parts behind NVIDIA parts with equivalent compute performance. (Those equivalent AMD parts also tend to be significantly cheaper, and thus comparing flop to flop isn’t fair in most other circumstances.)
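For context, the 10.7 TFLOPS figure follows the usual FP32 back-of-the-envelope math (2 FLOPs per shader per clock), and happens to line up with a 56-CU Vega-class part, though the exact GPU is not spelled out here:

    10.7 TFLOPS ≈ 2 FLOPs/clock × 3,584 shaders (56 CUs × 64) × ~1.49 GHz
    GTX 1080 Ti ≈ 2 FLOPs/clock × 3,584 shaders × ~1.58 GHz ≈ 11.3 TFLOPS

That second line is where the 1080 Ti comparison above comes from.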

google-2019-stadia-controller.png

As long-time readers know, I am also very cautious about streaming services because they are about the epitome of DRM. While the content is available on other, local platforms? Not really a big deal. If a game is developed for Google’s service, and requires Google’s API, then games that leave the service, either by Google’s license ending or Google simply not wanting to host it anymore, will be just about impossible to preserve. Games are not movies or sound files that can be transcoded into another format.

Linux and Vulkan do provide a bit more confidence, but you never know what will happen when a company, with no legal (ex: the GPL) obligation to remain open, gets large enough.

It’s something that I’m still worried about, though.

NZXT Announces the HUE 2 RGB Ambient Lighting Kit V2

Subject: General Tech | March 19, 2019 - 02:34 PM |
Tagged: RGB, nzxt, lighting, HUE 2 Ambient RGB Kit V2, HUE 2, ambient

NZXT has announced version 2 of the HUE 2 Ambient RGB Lighting Kit, an update which the company says "brings product improvements that make installation easier and help ensure that the included LED strips will be securely attached to your monitor". The lighting kit uses NZXT's CAM software to select between lighting modes - which include an ambient mode that changes the lighting to complement the images on your monitor.

HUE 2 Ambient RGB Lighting Kit.jpg

"The HUE 2 ecosystem delivers the broadest line of RGB lighting accessories for PC Builders with over 10 different accessories such as the HUE 2 RGB lighting kit, underglow kit, AER RGB 2 fans, and cable combs."

NZXT provides this list of features for the new HUE 2 kit:

HUE 2 Ambient RGB Kit V2 New Features:

  • Stronger Adhesive: Upgraded the 3M LED strip adhesive backing tape to be thicker and stickier. It is more compatible with the different surfaces and textures of monitors on the market.
  • L-Shape Corner Connector: For easier setup, we replaced the 150mm corner connectors to an L-shape corner connector for the top left and bottom right of your monitor.
  • Alcohol Wipes: As your monitor may be dusty or dirty, we added alcohol-based cleaning wipes so you can clean the back of the monitor before adhering the LED strips.

HUE 2 RGB Ambient Lighting Kit Features

  • Available in two configurations, one for 21” to 25”/34”-35” UltraWide monitors and one for 26” to 32” monitors, the HUE 2 RGB Ambient Lighting Kit adds gorgeous lighting to your gaming PC and provides a more immersive in-game experience* by projecting the colors from the edges of your display to your walls.
  • HUE 2 RGB Ambient Lighting controller
  • Faster microprocessor enables incredibly smooth lighting effects and virtually instantaneous response changes to colors on the screen in ambient mode
  • Multiple LED strips in various lengths, along with easy-to-follow configuration and installation instructions for various monitor sizes and aspect ratios
  • AC power adapter
  • All necessary cables
  • Compatible with all HUE 2 accessories and CAM lighting modes

* Ambient lighting modes are available only with the HUE 2 Ambient Lighting controller and require the use of CAM as a registered user, including acceptance of the then-current NZXT Terms of Service.

HUE-2-Ambient-RGB-Lighting-Kit.jpg

The HUE 2 lighting kits are available now in the USA with pricing as follows:

  • HUE 2 Ambient RGB Lighting Kit V2 (21”-25”, 34”-35” UltraWide) - $99.99
  • HUE 2 Ambient RGB Lighting Kit V2 (26”-32”) - $99.99
Source: NZXT

Thar be ARGBs on the Water bearin the flag o' Thermaltake

Subject: Cases and Cooling | March 19, 2019 - 01:57 PM |
Tagged: thermaltake, Water 3.0 360 ARGB Sync, watercooler, 360mm radiator, RGB

Thermaltake's Water 3.0 360 ARGB Sync is a watercooler for those who live and breathe RGBs, as it is compatible with all the software suites found on motherboards so you can create your own lightshow.  It is compatible with all current sockets from both AMD and Intel; the only compatibility issue you might have is fitting the 360mm radiator into your case.  In Kitguru's testing it even performed well while cooling, but really it's all about the RGBs.

kitguru_thermaltake_water_3.0_360_ARGB_Sync_colour_4.jpg

"There’s a lot to unpack from the name of the Thermaltake Water 3.0 360 ARGB Sync. The ‘360’ obviously refers to its 360mm radiator, and we know ARGB = Addressable RGB lighting. The ‘Sync’ aspect references software compatibility for the Water 3.0 360 with current motherboards ..."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: Kitguru

The Atari VCS is delayed again, but you might not be as mad about it as you think

Subject: General Tech | March 19, 2019 - 01:19 PM |
Tagged: atari, delay, amd, Vega, ryzen

You may remember the announcement of the re-launch of the Atari Video Console System back in the summer of 2017, though by now you may have decided that it is going the way of the ZX Spectrum Vega+.  If you do still hold out hope, Atari is once again testing your patience by announcing another delay, to the end of 2019.  There is a reason, however, which you may or may not find acceptable: they will be upgrading the AMD Ryzen chip at the heart of the system, with the new generation of Vega graphics offering modern performance.  Atari is also suggesting this will offer much quieter and cooler performance in a quote over at The Inquirer.

atari-vcs-ataribox-linux-pc-video-game-console-gaming.jpg

"The Atari VCS launched on Indiegogo and was originally set to arrive in spring 2018, but the company has announced that it will now arrive at the butt-end of 2019 (and that projection is just for the US and Canada)."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

The Khronos Group Releases OpenXR 0.90 (Provisional)

Subject: General Tech | March 19, 2019 - 01:20 AM |
Tagged: Khronos, openxr

At the Games Developers Conference in San Francisco, the Khronos Group has published the first, provisional release of the augmented reality and virtual reality API: OpenXR 0.90. The goal is to allow the general public to implement it into their software so they can provide feedback (via the official forums) before it launches into 1.0.

khronos-2019-openxr-logo.png

The last time we mentioned OpenXR was at SIGGRAPH back in August. That event had a demo area on the show floor with the Epic Showdown VR demo, which evolved into Robo Recall. On that end, Epic plans to continue supporting OpenXR, and they pride themselves as the engine that powers the original demo. Unity has also responded positively to OpenXR, but their quoted statement doesn’t make any specific promises. Microsoft and Collabora are each, separately, providing implementations of the new API. Oculus plans to support the API when it reaches 1.0 status. HTC also wants to support OpenXR with the Vive, although Valve is nowhere to be seen on the list of quotes. Google is also absent from the list.

You can check out the press release on the Khronos Group website.

GTC 19: NVIDIA Announces GameWorks RTX; Unreal Engine and Unity Gain DXR Support

Subject: General Tech | March 18, 2019 - 10:00 PM |
Tagged: gameworks, unreal engine, Unity, rtx, ray tracing, nvidia, GTC 19, GTC, dxr, developers

Today at GTC NVIDIA announced GameWorks RTX and the implementation of real-time ray tracing in the upcoming Unreal Engine 4.22 and the latest version of Unity, currently in 2019.03.

NVIDIA Announces GameWorks RTX

While Pascal and non-RTX Turing support for real-time ray tracing is something of a bombshell from NVIDIA, the creation of GameWorks tools for such effects is not surprising.

“NVIDIA GameWorks RTX is a comprehensive set of tools that help developers implement real time ray-traced effects in games. GameWorks RTX is available to the developer community in open source form under the GameWorks license and includes plugins for Unreal Engine 4.22 and Unity’s 2019.03 preview release.”

NVIDIA lists these components of GameWorks RTX:

GW_RTX.PNG

  • RTX Denoiser SDK – a library that enables fast, real-time ray tracing by providing denoising techniques to lower the required ray count and samples per pixel. It includes algorithms for ray traced area light shadows, glossy reflections, ambient occlusion and diffuse global illumination.
  • Nsight for RT – a standalone developer tool that saves developers time by helping to debug and profile graphics applications built with DXR and other supported APIs.

Unreal Engine and Unity Gaining Real-Time Ray Tracing Support

DXR_GAME_ENGINES.png

And while not specific to NVIDIA hardware, news of more game engines offering integrated DXR support was also announced during the keynote:

“Unreal Engine 4.22 is available in preview now, with final release details expected in Epic’s GDC keynote on Wednesday. Starting on April 4, Unity will offer optimized, production-focused, realtime ray tracing support with a custom experimental build available on GitHub to all users with full preview access in the 2019.03 Unity release. Real-time ray tracing support from other first-party AAA game engines includes DICE/EA’s Frostbite Engine, Remedy Entertainment’s Northlight Engine and engines from Crystal Dynamics, Kingsoft, Netease and others.”

RTX may have been off to a slow start, but this will apparently be the year of real-time ray tracing after all - especially with the upcoming NVIDIA driver update adding support to the GTX 10-series and new GTX 16-series GPUs.

Source: NVIDIA

NVIDIA to Add Real-Time Ray Tracing Support to Pascal GPUs via April Driver Update

Subject: Graphics Cards | March 18, 2019 - 09:41 PM |
Tagged: unreal engine, Unity, turing, rtx, ray tracing, pascal, nvidia, geforce, GTC 19, GTC, gaming, developers

Today at GTC NVIDIA announced a few things of particular interest to gamers, including GameWorks RTX and the implementation of real-time ray tracing in upcoming versions of both Unreal Engine and Unity (we already posted the news that CRYENGINE will be supporting real-time ray tracing as well). But there is something else... NVIDIA is bringing ray tracing support to GeForce GTX graphics cards.

DXR_GPUs.png

This surprising turn means that real-time ray tracing support won’t be limited to RTX cards after all, as the install base of NVIDIA ray-tracing GPUs “grows to tens of millions” with a simple driver update next month, adding the feature both to previous-gen Pascal and to the new Turing GTX GPUs.

How is this possible? It’s all about the programmable shaders:

“NVIDIA GeForce GTX GPUs powered by Pascal and Turing architectures will be able to take advantage of ray tracing-supported games via a driver expected in April. The new driver will enable tens of millions of GPUs for games that support real-time ray tracing, accelerating the growth of the technology and giving game developers a massive installed base.

With this driver, GeForce GTX GPUs will execute ray traced effects on shader cores. Game performance will vary based on the ray-traced effects and on the number of rays cast in the game, along with GPU model and game resolution. Games that support the Microsoft DXR and Vulkan APIs are all supported.

However, GeForce RTX GPUs, which have dedicated ray tracing cores built directly into the GPU, deliver the ultimate ray tracing experience. They provide up to 2-3x faster ray tracing performance with a more visually immersive gaming environment than GPUs without dedicated ray tracing cores.”
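From a game's point of view nothing changes: dedicated RT cores or not, the application simply asks DirectX whether raytracing is exposed. A minimal sketch of that standard DXR capability check (nothing here is specific to this driver release):

    #include <d3d12.h>

    // Illustrative only: after the April driver, Pascal and GTX Turing boards
    // are expected to report raytracing support here, with the work executed
    // on shader cores rather than dedicated RT cores.
    bool SupportsDxr(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &options5, sizeof(options5))))
        {
            return false;
        }
        return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }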

A very important caveat is the “2-3x faster ray tracing performance” for GeForce RTX graphics cards mentioned in the last paragraph, so expectations will need to be tempered: RT effects will be less efficient running on shader cores (Pascal and GTX Turing) than they are with dedicated RT cores, as demonstrated by these charts:

BFV_CHART.png

METRO_EXODUS_CHART.png

SOTTR_CHART.png

It's going to be a busy April.

Source: NVIDIA
Manufacturer: PC Perspective

AMD and NVIDIA GPUs Tested

Tom Clancy’s The Division 2 launched over the weekend and we've been testing it out over the past couple of days with a collection of currently-available graphics cards. Of interest to AMD fans, this game joins the ranks of those well optimized for Radeon graphics, and with a new driver (Radeon Software Adrenalin 2019 Edition 19.3.2) released over the weekend it was a good time to run some benchmarks and see how AMD and NVIDIA hardware stacks up.

d2-key-art-1920x600.jpg

The Division 2 offers DirectX 11 and 12 support, and uses Ubisoft's Snowdrop engine to provide some impressive visuals, particularly at the highest detail settings. We found the "ultra" preset to be quite attainable with very playable frame rates from most midrange-and-above hardware even at 2560x1440, though bear in mind that this game uses quite a bit of video memory. We hit a performance ceiling at 4GB with the "ultra" preset even at 1080p, so we opted for 6GB+ graphics cards for our final testing. And while most of our testing was done at 1440p we did test a selection of cards at 1080p and 4K, just to provide a look at how the GPUs on test scaled when facing different workloads.

Tom Clancy's The Division 2

d2-screen1-1260x709.jpg

Washington D.C. is on the brink of collapse. Lawlessness and instability threaten our society, and rumors of a coup in the capitol are only amplifying the chaos. All active Division agents are desperately needed to save the city before it's too late.

d2-screen4-1260x709.jpg

Developed by Ubisoft Massive and the same teams that brought you Tom Clancy’s The Division, Tom Clancy’s The Division 2 is an online open world, action shooter RPG experience set in a collapsing and fractured Washington, D.C. This rich new setting combines a wide variety of beautiful, iconic, and realistic environments where the player will experience the series’ trademark for authenticity in world building, rich RPG systems, and fast-paced action like never before.

d2-screen3-1260x709.jpg

Play solo or co-op with a team of up to four players to complete a wide range of activities, from the main campaign and adversarial PvP matches to the Dark Zone – where anything can happen.

Continue reading our preview of GPU performance with The Division 2

Remember Conservative Morphological Anti-Aliasing?

Subject: Graphics Cards | March 18, 2019 - 03:13 PM |
Tagged: fxaa, SMAA, Anti-aliasing, MLAA, taa, amd, nvidia

Apart from the new DLSS available on NVIDIA's RTX cards, it has been a very long time since we looked at anti-aliasing implementations and the effect your choice has on performance and visual quality.  You are likely familiar with the four most common implementations, ranging from AMD's MLAA and NVIDIA's FXAA, which are rarely used in current-generation games, all the way to TAA/TXAA and SMAA, but when was the last time you refreshed your memory on what they actually do and how they compare?

Not only did Overclockers Club look into those, they also discuss some of the other attempted implementations as well as the sampling types that lie behind these technologies.  Check out their deep dive here.

anti.PNG

"One setting present in many if not all modern PC games that can dramatically impact performance and quality is anti-aliasing and, to be honest, I never really understood how it works. Sure we have the general idea that super-sampling is in effect running at a higher resolution and then downscaling, but then what is multi-sampling? How do post-processing methods work, like the very common FXAA and often favored SMAA?"

Here are some more Graphics Card articles from around the web:

Graphics Cards

How else would you get a Comet in your lake, if not from the Sky

Subject: General Tech | March 18, 2019 - 01:27 PM |
Tagged: Comet Lake-S, Skylake, Intel, 14nm, Comet Lake-U, Comet Lake-H, rumour, leak

Intel's new Comet Lake families of chips will be an update to the existing Skylake architecture and will share the same 14nm process node, according to what The Inquirer has discovered from leaked documents.  On desktop parts they refer to 10+2 and 8+2 SKUs, from which we can infer the presence of GT2 graphics, with a 5GHz part likely topping that line.  Notebook chips are expected to top out at six cores, as are the ultra-low power models.  In theory we should see these arrive some time this year, around the same time as the release of Zen 2, though we lack hard dates on either release at this time.

splash.PNG

"According to reports, Comet Lake-S will be based on the Skylake microarchitecture and will be created using Intel's now-ageing 14nm manufacturing process."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer