This wee GTX 1660 from Zotac has at least two fans

Subject: Graphics Cards | April 11, 2019 - 01:21 PM |
Tagged: zotac, nvidia, GTX 1660 Twin Fan 6GB, gtx 1660

Zotac's GeForce GTX 1660 Twin Fan 6GB is designed for SFF systems, measuring 173.4x111.15x35.3mm (6.83x4.38x1.39") though still requiring two slots, which in this day and age of the triple-wide card is considered svelte.  The GPU Boost clock remains at the reference 1,785 MHz, and while the memory runs at 8 GHz, it is worth noting it is GDDR5, not the GDDR6 found in Zotac's GTX 1660 Ti.  With a bit of overclocking, either manually or with OC Scanner, The Guru of 3D was able to hit ~2,050 MHz Boost and 9.7 GHz memory for about a 10% increase in performance.
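
As a quick sanity check on what that memory overclock buys you, peak GDDR5 bandwidth is simply the effective data rate multiplied by the bus width. The sketch below assumes the GTX 1660's 192-bit memory bus; the clocks are the reference and overclocked figures above.

```python
# Back-of-envelope GDDR5 bandwidth: effective data rate x bus width / 8.
def bandwidth_gb_s(effective_clock_ghz: float, bus_width_bits: int = 192) -> float:
    """Peak memory bandwidth in GB/s for a given effective memory clock."""
    return effective_clock_ghz * bus_width_bits / 8

stock = bandwidth_gb_s(8.0)   # 192.0 GB/s at the reference 8 GHz effective
oc = bandwidth_gb_s(9.7)      # 232.8 GB/s at the 9.7 GHz overclock
print(f"stock: {stock:.1f} GB/s, OC: {oc:.1f} GB/s (+{(oc / stock - 1) * 100:.0f}%)")
```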

While not a huge step up from a GTX 1060, the ~$220 price tag makes it worth considering if you are running an older-generation mid-range card.

img_8086.jpg

"In this article we test and review the new Zotac GeForce GTX 1660 Twin Fan, it is priced spot on 219 USD, the MSRP of the new non-Ti model, meet the small and compact GeForce GTX 1660 Twin Fan edition. Based on a dual fan and a small form factor it performs as expected."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: Guru of 3D

NVIDIA Releases Ray Tracing Driver for GeForce GTX Cards, New Ray-Tracing Demos

Subject: Graphics Cards | April 11, 2019 - 09:02 AM |
Tagged: turing, rtx, ray tracing, pascal, nvidia, gtx, graphics, gpu, geforce, dxr, demo

NVIDIA has released the Game Ready Driver 425.31 WHQL which enables ray tracing for GeForce GTX graphics cards - a capability previously reserved for the company's RTX series of graphics cards. This change "enables millions more gamers with GeForce GTX GPUs to experience ray tracing for the first time ever", as the list of DXR-capable graphics cards from NVIDIA has grown considerably as of today.
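
If you want to confirm you are on a DXR-capable driver before trying any of this, a minimal sketch along these lines will do; it assumes the nvidia-ml-py (pynvml) Python bindings are installed.

```python
# Minimal driver-version check for GTX ray tracing (assumes nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
version = pynvml.nvmlSystemGetDriverVersion()  # e.g. "425.31"
if isinstance(version, bytes):                 # older bindings return bytes
    version = version.decode()

major, minor = (int(part) for part in version.split(".")[:2])
if (major, minor) >= (425, 31):
    print(f"Driver {version}: DXR is enabled on supported GTX cards")
else:
    print(f"Driver {version}: update to 425.31 or newer for GTX ray tracing")
pynvml.nvmlShutdown()
```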

DXR_GPUs.png

The list of NVIDIA GPUs that are DXR-capable now includes (in addition to the RTX series):

  • GeForce GTX 1660 Ti
  • GeForce GTX 1660
  • NVIDIA TITAN Xp (2017)
  • NVIDIA TITAN X (2016)
  • GeForce GTX 1080 Ti
  • GeForce GTX 1080
  • GeForce GTX 1070 Ti
  • GeForce GTX 1070
  • GeForce GTX 1060 6GB
  • Laptops with equivalent Pascal and Turing-architecture GPUs

NVIDIA previously warned of a performance deficit when comparing even high-end Pascal GPUs such as the GTX 1080 Ti to the Turing-based RTX 20-series GPUs when this driver update was discussed during GTC. Their position is that for the best experience dedicated ray tracing cores will be required, and will make a measurable impact - with or without DLSS (a feature that requires the Tensor cores of the RTX series of GPUs).

"With dedicated RT cores, GeForce RTX GPUs provide up to 2-3x faster performance in ray-traced games, enabling more effects, higher ray counts, and higher resolutions for the best experience. With this new driver however, GeForce GTX 1060 6GB and higher GPUs can execute ray-tracing instructions on traditional shader cores, giving gamers a taste, albeit at lower RT quality settings and resolutions, of how ray tracing will dramatically change the way games are experienced."

In addition to the driver release which enables the visual goodies associated with real-time ray tracing, NVIDIA has also released a trio of tech demos on GeForce.com which you can freely download to check out ray tracing firsthand on GTX and RTX graphics cards. Not only will these demos give you a taste of what to expect from games that incorporate DXR features, but like any good demo they will help you get a sense of how your system might handle these effects.

The demos released include, via NVIDIA:

atomic-heart-rtx-screenshot.jpg

Atomic Heart RTX tech demo - A beautifully detailed demo from Mundfish featuring ray-traced reflections and shadows, as well as NVIDIA DLSS technology.

JusticeRTXDemo.jpg

Justice tech demo - The Justice demo hails from China and features ray-traced reflections, shadows, and NVIDIA DLSS technology. It is the first time real-time ray tracing has been used for caustics.

GEFORCE-RTX-Star-Wars.jpg

Reflections tech demo - The Reflections tech demo was created by Epic Games in collaboration with ILMxLAB and NVIDIA. Reflections offers a sneak peek at gaming’s cinematic future with a stunning, witty demo that showcases ray-traced reflections, ray-traced area light shadows, ray-traced ambient occlusion for characters and NVIDIA DLSS technology.

The download page for the tech demos can be found here.

And now to editorialize briefly: one aspect of the RTX launch that did not exactly work to NVIDIA's advantage was (obviously) the lack of software to take advantage of its hardware ray tracing capabilities and DLSS, with just a few high-profile titles to date offering support. By adding the previous generation of GPUs to the mix users now have a choice, and the new demos are a big part of the story, too. Looking back to the early days of dedicated 3D accelerators, the tech demo has been an integral part of the GPU experience, showcasing new features and providing enthusiasts with a taste of what a hardware upgrade can provide. The more demos there are showcasing the effects possible with NVIDIA's ray tracing hardware, the more Pascal GPU owners will be able to check out these features on their own systems without making a purchase of any kind, and if they find the effects compelling it just might drive sales of the RTX 20-series in the endless quest for better performance. It really should have been this way from the start, but at least it has been corrected now, to the benefit of the consumer.

Source: NVIDIA

It'd be a shame if someone spilled the beans about your GTX 1650 ... a real shame

Subject: General Tech | April 9, 2019 - 02:41 PM |
Tagged: leak, rumour, nvidia, zotac, gtx 1650, turing, TU117

Videocardz got their hands on pictures of a Zotac card, ostensibly the as-yet-unannounced GTX 1650, and were nice enough to share them with the group.  We've already seen a leak from our favourite Twitter user suggesting the card will have 4GB of memory and a 1,485 MHz core clock, but this is our first look at what a card might actually look like.  Most interesting is what you do not see in the pictures: there is no PCIe power connector, so this TU117-based card will eat 75W at most, assuming nothing funky happens as it did with another card.
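
The reasoning behind that 75W ceiling is simple enough to write down: a PCIe x16 slot supplies up to 75W, and any extra power has to arrive via 6-pin (75W) or 8-pin (150W) connectors. A rough sketch:

```python
# Rule-of-thumb board power ceiling from the visible power connectors.
# PCIe x16 slot: up to 75 W; each 6-pin adds 75 W; each 8-pin adds 150 W.
def max_board_power_w(six_pin: int = 0, eight_pin: int = 0) -> int:
    return 75 + six_pin * 75 + eight_pin * 150

print(max_board_power_w())             # 75 W: the leaked GTX 1650, no connectors
print(max_board_power_w(eight_pin=1))  # 225 W: a card with a single 8-pin
```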

In theory, we should know the truth before the end of the month so stay tuned.

ZOTAC-GeForce-GTX-1650-4-1000x750.jpg

"Here is the world’s first look at GeForce GTX 1650 from ZOTAC. The card is a single-fan Mini-ITX form factor design. It is equipped with three display connectors: HDMI, DisplayPort, Dual Link DVI-D."

Here is some more Tech News from around the web:

Tech Talk

Source: Videocardz

Math with Maxwell and Turing

Subject: Graphics Cards | April 4, 2019 - 02:23 PM |
Tagged: nvidia, maxwell, rtx, gtx 960, gtx 1660, RTX 2060, turing

TechSpot decided to test NVIDIA's claim that the GTX 1660 is 113% faster than the old GTX 960, a card still sitting in the top five most-used GPUs on the Steam charts.  They tested Apex Legends, The Division 2, Shadow of the Tomb Raider, and many other games at 1080p, the resolution these cards were designed for, and then compiled the results.  Wolfenstein II: The New Colossus showed the most improvement at 177%, while Middle-earth: Shadow of War sat at the bottom with a mere 54% jump, and the overall average hit 117%.
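
For anyone parsing those percentages: "X% faster" corresponds to a frame-rate ratio of (1 + X/100). The fps pairs below are hypothetical, chosen only to reproduce the reported deltas.

```python
# "X% faster" means new/old = 1 + X/100.
def percent_faster(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1) * 100

# Hypothetical fps pairs matching the reported deltas:
print(percent_faster(138.5, 50))  # ~177%, the Wolfenstein II result
print(percent_faster(77.0, 50))   # ~54%, the Shadow of War result
print(percent_faster(108.5, 50))  # ~117%, the overall average
```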

Not a bad upgrade choice, though as they point out the RX 580 8GB is a strong contender for GTX 960 users as well.

Image_02S.jpg

"When we recently tested the new GeForce GTX 1660 we noted that Nvidia was making a bold claim in the review guide saying that the 1660 was a whopping 113% faster than the GTX 960, making it a perfect upgrade option for owners of the old mid-range Maxwell GPU."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: TechSpot

Anthem Patch 1.0.4 Adds NVIDIA DLSS and Highlights Support

Subject: Graphics Cards | March 26, 2019 - 10:48 AM |
Tagged: update, rtx, ray tracing, patch 1.0.4, patch, nvidia, geforce, ea, DLSS, BioWare, anthem

Patch 1.0.4 for Anthem was released by BioWare/EA today which addresses a number of issues and adds features including two specific to GeForce graphics card users, namely DLSS and NVIDIA Highlights support.

anthem-geforce-rtx-screenshot.jpg

The DLSS (deep learning super sampling) feature is exclusive to RTX 20-series GPUs, providing performance gains of up to 40% in Anthem according to NVIDIA, who provides this video of DLSS off vs. on in the game (embedded below):

NVIDIA offers this chart showing performance gains with the range of RTX cards with DLSS enabled:

anthem-nvidia-dlss-4k-performance.png
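
For a sense of what "up to 40%" means in practice, here is a trivial worked example; the 45 fps baseline is hypothetical, not an NVIDIA figure.

```python
# What a 40% uplift does to a hypothetical baseline frame rate.
def with_dlss(base_fps: float, gain_pct: float = 40.0) -> float:
    return base_fps * (1 + gain_pct / 100)

print(f"{with_dlss(45):.0f} fps")  # a 45 fps scene becomes ~63 fps
```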

NVIDIA Highlights support is available to users of GeForce Experience, and this feature allows the automatic capture of gameplay clips and screenshots in certain situations. BioWare/EA lists the supported scenarios for the feature:

  • Visiting and viewing overlooks
  • Defeating certain large creatures
  • Performing multi-kills
  • Defeating legendary creatures
  • Discovering the Tombs of the Legionnaires
  • Performing combos
  • When the player is downed by enemies
  • Defeating bosses

anthem-nvidia-highlights.png

The full Patch 1.0.4 release notes can be viewed here, and more information from NVIDIA about the new GeForce features for Anthem is available here.

Source: NVIDIA

NVIDIA Releases GeForce Game Ready 419.67 Drivers

Subject: Graphics Cards | March 25, 2019 - 02:14 PM |
Tagged: nvidia, graphics drivers, geforce

Five days after they released the Creator Ready 419.67 drivers, NVIDIA has published their Game Ready equivalent, which is also version 419.67. It can be downloaded from GeForce Experience, or you can get it from their website (alongside the release notes).

nvidia-geforce.png

In terms of features: this is another entry in the R418 branch. Often, but not always, major new features are merged in at the start, and subsequent releases add game-specific optimizations, bug fixes, and so forth. This is one of those drivers. It adds two new G-SYNC compatible monitors, the ASUS VG278QR and the ASUS VG258, as well as “Game Ready” tweaks for Battlefield V: Firestorm, Anthem, Shadow of the Tomb Raider, and Sekiro: Shadows Die Twice.

Also, if you have multiple G-SYNC compatible displays, NVIDIA now supports using them in Surround on Turing GPUs.

There are also several bug fixes, although most of them are relatively specific. They do have fixes for DaVinci Resolve and multiple Adobe applications, which I believe were first introduced in the Creator Ready driver. If you’re concerned about “missing out” on fixes, it looks like they’re intending to keep Game Ready up-to-date with Creator Ready, just with less QA on professional applications and timed to game releases (versus professional software releases).

Source: NVIDIA

NVIDIA Introduces Creator Ready Driver Program (419.67)

Subject: Graphics Cards | March 20, 2019 - 04:03 PM |
Tagged: nvidia, graphics drivers

At the ongoing Game Developers Conference, GDC 2019, NVIDIA has announced a “Creator Ready Driver” program. This branch targets users who sit between the GeForce and Quadro lines – those who use GeForce cards for professional applications, such as game developers.

nvidia-2019-creatorreadylogo.png

Both Game Ready Drivers and the Creator Ready Drivers will support all games and creative applications. The difference is that Creator Ready drivers will be released according to the release schedule of popular creative tools, and they will receive more strict testing with creative applications.

The first release, Creator Ready Drivers 419.67, lists a 13% performance increase with the Blender Cycles renderer, and an 8% to 9% increase for CINEMA 4D, Premiere Pro CC, and Photoshop CC, all relative to the 415-branch GeForce drivers.

You can choose between either branch using the GeForce Experience interface.

nvidia-2019-creatorreadygfe.png

Image Credit: NVIDIA

As for supported products? The Creator Ready Drivers are available for Pascal-, Volta-, and Turing-based GeForce and TITAN GPUs. “All modern Quadro” GPUs are also supported on the Creator Ready branch.

Personally, my brain immediately thinks “a more-steady driver schedule and some creative application performance enhancements at the expense of a little performance near the launch of major video games”. I could be wrong, but that’s what I think when I read this news.

Source: NVIDIA

Microsoft Announces Variable Rate Shading for DirectX 12

Subject: Graphics Cards | March 19, 2019 - 08:29 PM |
Tagged: microsoft, DirectX 12, turing, nvidia

To align with the Game Developers Conference, GDC 2019, Microsoft has announced Variable Rate Shading for DirectX 12. This feature increases performance by allowing the GPU to lower its shading resolution for specific parts of the scene (without the developer doing explicit, render texture-based tricks).

An NVIDIA speech from SIGGRAPH 2018 (last August)

The feature is divided into three parts:

  • Lowering the resolution of specific draw calls (tier 1)
  • Lowering the resolution within a draw call by using an image mask (tier 2)
  • Lowering the resolution within a draw call per primitive (ex: triangle) (tier 2)

The last two points are tagged as “tier 2” because they can reduce the workload within a single draw call, which is an item of work that is sent to the GPU. A typical draw call for a 3D engine is a set of triangles (vertices and indices) paired with a material (a shader program, textures, and properties). While it is sometimes useful to lower the resolution for particularly complex draw calls that take up a lot of screen space but whose output is also relatively low detail, such as water, there are real benefits to being more granular.

The second part, an image mask, allows detail to be reduced for certain areas of the screen. This can be useful in several situations:

  • The edges of a VR field of view
  • Anywhere that will be brutalized by a blur or distortion effect
  • Objects behind some translucent overlays
  • Even negating a tier 1-optimized section to re-add quality where needed

The last example was the one that Microsoft focused on in their blog. Unfortunately, I am struggling to figure out what specifically is going on, because the changes that I see (ex: the coral reef, fish, and dirt) don't line up with their red/blue visualizer. The claim is that they use an edge detection algorithm to force high-frequency shading where there would be high-frequency detail.
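
To make the tier 2 image-mask idea concrete, here is a toy sketch of the approach Microsoft describes: run an edge detector over the frame and keep full shading rate only in tiles with high-frequency detail. It assumes numpy, uses a simplified 8x8 tile and two-level rate encoding, and is nothing like production code.

```python
# Toy tier 2 VRS mask: full rate where edges exist, coarse rate elsewhere.
import numpy as np

def shading_rate_mask(luma: np.ndarray, tile: int = 8, threshold: float = 0.1):
    """Return one rate per tile: 0 = full (1x1), 1 = coarse (one shade per 2x2)."""
    gy, gx = np.gradient(luma)           # simple gradient-magnitude edge detector
    edges = np.hypot(gx, gy)
    h, w = luma.shape
    rates = np.ones((h // tile, w // tile), dtype=np.uint8)  # default: coarse
    for ty in range(h // tile):
        for tx in range(w // tile):
            block = edges[ty * tile:(ty + 1) * tile, tx * tile:(tx + 1) * tile]
            if block.max() > threshold:  # high-frequency detail: keep full rate
                rates[ty, tx] = 0
    return rates

# Hypothetical 1080p luminance buffer; a real renderer reads this from the GPU.
mask = shading_rate_mask(np.random.rand(1080, 1920).astype(np.float32))
print(mask.shape)  # (135, 240) tiles
```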

microsoft-2019-dx12-vrs-1.png

Right side reduces shading by 75% for terrain and water

microsoft-2019-dx12-vrs-2.png

Right side reclaims some lost fidelity based on edge detection algorithm

microsoft-2019-dx12-vrs-3.png

Visualization of where shading complexity is spent.

(Red is per-pixel. Blue is 1 shade per 2x2 block.)

Images Source: Firaxis via Microsoft

Microsoft claims that this feature will only be available for DirectX 12. That said, when Turing launched, NVIDIA claimed that Variable Rate Shading would be available for DirectX 11, DirectX 12, Vulkan, and OpenGL. I'm not sure what is different about Microsoft's implementation that separates it from NVIDIA's extension.

Microsoft will have good tools support, however. They claim that their PIX for Windows performance analysis tool will support this feature on Day 1.

Source: Microsoft

NVIDIA is ready to storm the server room

Subject: General Tech, Graphics Cards, Networking, Shows and Expos | March 19, 2019 - 06:16 PM |
Tagged: nvidia, t4, amazon, microsoft, NGC, Mellanox, CUDA-X, GTC, jen-hsun huang, DRIVE Constellation, ai

As part of their long list of announcements yesterday, NVIDIA revealed they are partnering with Cisco, Dell EMC, Fujitsu, Hewlett Packard Enterprise, Inspur, Lenovo and Sugon to provide servers powered by T4 Tensor Core GPUs, optimized to run their CUDA-X AI accelerators. 

t4.PNG

Those T4 GPUs have been on the market for a while, but this marks the first major success for NVIDIA in the server room, with models available for purchase from the aforementioned companies.  Those who prefer other people's servers can also benefit from these new products, with Amazon and Microsoft offering cloud-based solutions.  Setting yourself up to run NVIDIA's NGC software may save a lot of money down the road: the cards sip a mere 70W of power, which is rather more attractive than the consumption of a gaggle of Tesla V100s.  One might be forgiven for suspecting this offers an explanation for their recent acquisition of Mellanox.
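
The power argument is easy to quantify. A rough comparison, assuming the T4's 70W against the Tesla V100's 250W (PCIe) rating and a hypothetical eight-card inference node:

```python
# Rough power savings: eight T4s (70 W each) vs. eight Tesla V100s (250 W each).
t4_w, v100_w, cards = 70, 250, 8
saved_kw = cards * (v100_w - t4_w) / 1000
hours_per_year = 24 * 365
print(f"{saved_kw:.2f} kW saved, ~{saved_kw * hours_per_year:,.0f} kWh per year")
```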

purty.PNG

NGC software offers more than just a platform to run optimizations on; it also offers a range of templates to start from, covering classification and object detection through sentiment analysis and most other basic starting points for training a machine.  Customers will also be able to upload their own models to share internally or, if in the mood, externally with other users and companies.  It supports existing products such as TensorFlow and PyTorch but also offers access to CUDA-X AI, which, as the name suggests, takes advantage of the base design of the T4 GPU to reduce the amount of time waiting for results, letting users advance designs quickly.

memorex.PNG

If you are curious what particular implementations of everyone's favourite buzzword might look like, NVIDIA's DRIVE Constellation is an example after JoshTekk's own heart: literally a way to create open, scalable simulations for large fleets of self-driving cars to train them ... for good, one hopes.  Currently the Toyota Research Institute-Advanced Development utilizes these products in the development of their next self-driving fleet, and NVIDIA obviously hopes others will follow suit.

replaced.PNG

There is not much to see from the perspective of a gamer in the short term, but considering NVIDIA's work at shifting the horsepower from the silicon you own to their own cloud, this will certainly impact the future of gaming from both a hardware and gameplay perspective.  GPUs as a Service may not be the future many of us want, but this suggests it could be possible, not to mention the dirty tricks enemy AIs will be able to pull with this processing power behind them.

One might even dream that escort missions could become less of a traumatic experience!

Source: NVIDIA

GTC 19: NVIDIA Announces GameWorks RTX; Unreal Engine and Unity Gain DXR Support

Subject: General Tech | March 18, 2019 - 10:00 PM |
Tagged: gameworks, unreal engine, Unity, rtx, ray tracing, nvidia, GTC 19, GTC, dxr, developers

Today at GTC NVIDIA announced GameWorks RTX and the implementation of real-time ray tracing in the upcoming Unreal Engine 4.22 and the latest version of Unity, currently in 2019.03.

NVIDIA Announces GameWorks RTX

While Pascal and non-RTX Turing support for real-time ray tracing is something of a bombshell from NVIDIA, the creation of GameWorks tools for such effects is not surprising.

“NVIDIA GameWorks RTX is a comprehensive set of tools that help developers implement real time ray-traced effects in games. GameWorks RTX is available to the developer community in open source form under the GameWorks license and includes plugins for Unreal Engine 4.22 and Unity’s 2019.03 preview release.”

NVIDIA lists these components of GameWorks RTX:

GW_RTX.PNG

  • RTX Denoiser SDK – a library that enables fast, real-time ray tracing by providing denoising techniques to lower the required ray count and samples per pixel. It includes algorithms for ray traced area light shadows, glossy reflections, ambient occlusion and diffuse global illumination.
  • Nsight for RT – a standalone developer tool that saves developers time by helping to debug and profile graphics applications built with DXR and other supported APIs.
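
To illustrate the idea behind the denoiser bullet above, here is a toy spatial denoiser: blur a noisy, low-sample ambient occlusion buffer so fewer rays per pixel still yield a usable image. It assumes numpy; the real RTX Denoiser SDK is vastly more sophisticated.

```python
# Toy spatial denoise: a separable box filter over a noisy 1-spp AO buffer.
import numpy as np

def box_denoise(noisy: np.ndarray, radius: int = 2) -> np.ndarray:
    """Average each pixel with its neighbours along both axes."""
    out = noisy.astype(np.float32)
    for axis in (0, 1):
        acc = np.zeros_like(out)
        for offset in range(-radius, radius + 1):
            acc += np.roll(out, offset, axis=axis)
        out = acc / (2 * radius + 1)
    return out

# Hypothetical AO buffer: ground truth 0.5 plus heavy per-pixel noise.
noisy_ao = 0.5 + 0.25 * np.random.randn(256, 256)
print(f"std before: {noisy_ao.std():.3f}, after: {box_denoise(noisy_ao).std():.3f}")
```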

Unreal Engine and Unity Gaining Real-Time Ray Tracing Support

DXR_GAME_ENGINES.png

And while not specific to NVIDIA hardware, news of more game engines offering integrated DXR support was also announced during the keynote:

“Unreal Engine 4.22 is available in preview now, with final release details expected in Epic’s GDC keynote on Wednesday. Starting on April 4, Unity will offer optimized, production-focused, realtime ray tracing support with a custom experimental build available on GitHub to all users with full preview access in the 2019.03 Unity release. Real-time ray tracing support from other first-party AAA game engines includes DICE/EA’s Frostbite Engine, Remedy Entertainment’s Northlight Engine and engines from Crystal Dynamics, Kingsoft, Netease and others.”

RTX may have been off to a slow start, but this will apparently be the year of real-time ray tracing after all - especially with the upcoming NVIDIA driver update adding support to the GTX 10-series and new GTX 16-series GPUs.

Source: NVIDIA