Q2VKPT Makes Quake 2 the First Entirely Raytraced Game

Subject: General Tech | January 18, 2019 - 06:09 PM |
Tagged: vulkan, rtx, raytracing, Quake II, quake, Q2VKPT, Q2PRO, path tracing, open source, nvidia, john carmack, github, fps

Wait - was the first fully raytraced game released back in 1997? Not exactly, but Q2VKPT is built on one that was. That name is not a typo (it stands for Quake 2 Vulkan Path Tracing), and it is indeed a game - or, more correctly, a proof-of-concept. But not just any game; we're talking about Quake 2. Technically this is a combination of Q2PRO, "an enhanced Quake 2 client and server for Windows and Linux", and VKPT, or Vulkan Path Tracing.

q2vkpt_screenshot_1.jpg

The end result is a fully raytraced experience that, if nothing else, gives the computer hardware media more to run on NVIDIA's GeForce RTX graphics cards right now than the endless BFV demos. Who would have guessed we'd be benchmarking Quake 2 again in 2019?

"Q2VKPT is the first playable game that is entirely raytraced and efficiently simulates fully dynamic lighting in real-time, with the same modern techniques as used in the movie industry (see Disney's practical guide to path tracing). The recent release of GPUs with raytracing capabilities has opened up entirely new possibilities for the future of game graphics, yet making good use of raytracing is non-trivial. While some games have started to explore improvements in shadow and reflection rendering, Q2VKPT is the first project to implement an efficient unified solution for all types of light transport: direct, scattered, and reflected light (see media). This kind of unification has led to a dramatic increase in both flexibility and productivity in the movie industry. The chance to have the same development in games promises a similar increase in visual fidelity and realism for game graphics in the coming years.

This project is meant to serve as a proof-of-concept for computer graphics research and the game industry alike, and to give enthusiasts a glimpse into the potential future of game graphics. Besides the use of hardware-accelerated raytracing, Q2VKPT mainly gains its efficiency from an adaptive image filtering technique that intelligently tracks changes in the scene illumination to re-use as much information as possible from previous computations."
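
For those curious what that "adaptive image filtering" means in practice, the core idea is temporal reuse: blend this frame's noisy path-traced samples with results carried over from previous frames, and fall back to the fresh samples wherever the history can no longer be trusted. Here is a minimal sketch of that idea in Python - purely illustrative, not Q2VKPT's actual filter (which also reprojects the history and adds variance-guided spatial filtering); the array shapes and the validity mask are our own assumptions:

```python
import numpy as np

def temporal_accumulate(noisy, history, history_valid, alpha=0.1):
    """Blend this frame's noisy path-traced radiance with accumulated history.

    noisy         : (H, W, 3) radiance estimated from this frame's paths
    history       : (H, W, 3) accumulated radiance carried over from earlier frames
    history_valid : (H, W) bool mask, False where the history no longer matches
                    the scene (disocclusion, moved light, etc.) - an assumption
                    standing in for Q2VKPT's more elaborate change tracking
    alpha         : weight of the new sample; smaller means smoother but laggier
    """
    blended = alpha * noisy + (1.0 - alpha) * history
    # Where the history cannot be trusted, fall back to the fresh (noisy) estimate.
    return np.where(history_valid[..., None], blended, noisy)

# Toy usage: a 4x4 image with one pixel whose history was invalidated.
history = np.full((4, 4, 3), 0.5)            # converged-looking history
noisy = np.random.rand(4, 4, 3)              # this frame's 1-sample estimates
valid = np.ones((4, 4), dtype=bool)
valid[0, 0] = False                          # e.g. a newly revealed surface
frame = temporal_accumulate(noisy, history, valid)
```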

The project can be downloaded from GitHub, and the developers neatly listed the needed files for download (the .pak files from either the Quake 2 demo or the full version can be used):

  • GitHub Repository
  • Windows Binary on GitHub
  • Quake II Starter ("Quake II Starter is a free, standalone Quake II installer for Windows that uses the freely available 3.14 demo, 3.20 point release and the multiplayer-focused Q2PRO client to create a functional setup that's capable of playing online.")

There is also a full Q&A from the developers, which answers some obvious questions, including whether a game as "ancient" as Quake 2 shouldn't "run at 6000 FPS by now" at this point:

"While it is true that Quake II is a relatively old game with rather low geometric complexity, the limiting factor of path tracing is not primarily raytracing or geometric complexity. In fact, the current prototype could trace many more rays without a notable change in frame rate. The computational cost of the techniques used in the Q2VKPT prototype mainly depend on the number of (indirect) light scattering computations and the number of light sources. Quake II was already designed with many light sources when it was first released, in that sense it is still quite a modern game. Also, the number of light scattering events does not depend on scene complexity. It is therefore thinkable that the techniques we use could well scale up to more recent games."
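
To make that scaling argument concrete, the per-frame ray budget of a path tracer is roughly resolution x paths per pixel x scattering events, with a shadow ray per light sample at each event - and triangle count appears nowhere in that product, since acceleration structures keep per-ray cost growing only slowly with scene size. A back-of-the-envelope sketch with purely illustrative numbers (these are our assumptions, not measurements from Q2VKPT):

```python
# Back-of-the-envelope ray budget for a 1-sample-per-pixel path tracer.
# All numbers below are illustrative assumptions, not Q2VKPT measurements.
width, height   = 1920, 1080
paths_per_pixel = 1      # real-time budgets are typically 1 spp plus filtering
scatter_events  = 2      # indirect bounces per path
rays_per_event  = 2      # one continuation ray plus one shadow ray to a light

rays_per_frame = width * height * paths_per_pixel * scatter_events * rays_per_event
print(f"{rays_per_frame / 1e6:.1f} M rays per frame")   # ~8.3 M rays

# Note what is absent from the formula: triangle count. With an acceleration
# structure (BVH), each ray's traversal cost grows roughly logarithmically with
# scene size, which is why scattering and light counts dominate the frame cost.
```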

And on the subject of path tracing vs. ray tracing:

"Path tracing is an elegant algorithm that can simulate many of the complex ways that light travels and scatters in virtual scenes. Its physically-based simulation of light allows highly realistic rendering. Path tracing uses Raytracing in order to determine the visibility in-between scattering events. However, Raytracing is merely a primitive operation that can be used for many things. Therefore, Raytracing alone does not automatically produce realistic images. Light transport algorithms like Path tracing can be used for that. However, while elegant and very powerful, naive path tracing is very costly and takes a long time to produce stable images. This project uses a smart adaptive filter that re-uses as much information as possible across many frames and pixels in order to produce robust and stable images."

This project is the result of work by one Christoph Schied, and was "a spare-time project to validate the results of computer graphics research in an actual game". Whatever your opinion of Q2VKPT, as we look back at Quake 2 and its impressive original lighting effects, it's pretty clear that John Carmack was far ahead of his time (and it could be said that it's taken this long for hardware to catch up).

Source: Q2VKPT

Custom cooling creates a great card, MSI's Gaming Z RTX 2060

Subject: Graphics Cards | January 18, 2019 - 03:56 PM |
Tagged: RTX 2060 Gaming Z, RTX 2060, nvidia, msi

At first glance, the MSI RTX 2060 Gaming Z is very similar to the Founders Edition, with only its 1830MHz boost clock justifying an MSRP that is $35 higher.  Once TechPowerUp got into testing, they found that the custom cooler helps the card maintain peak speed far more effectively than the FE.  It also let them hit an impressive manual overclock of a 2055MHz boost clock and 1990MHz on the memory.  Check out the results, as well as a complete teardown of the card, in the review.

card1.jpg

"MSI's GeForce RTX 2060 Gaming Z is the best RTX 2060 custom-design we've reviewed so far. It comes with idle-fan-stop and a large triple-slot cooler that runs cooler than the Founders Edition. Noise levels are excellent, too, it's the quietest RTX 2060 card to date."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: TechPowerUp

Generations of GeForce GPUs in Ubuntu

Subject: Graphics Cards | January 16, 2019 - 04:33 PM |
Tagged: linux, geforce, nvidia, ubuntu 18.04, gtx 760, gtx 960, RTX 2060, gtx 1060

If you are running an Ubuntu system with an older GPU and are curious about upgrading but unsure if it is worth it, Phoronix has a great review for you.  Whether you are gaming with OpenGL and Vulkan, or curious about the changes in OpenCL/CUDA compute performance, they have you covered.  They even delve into the power efficiency numbers so you can spec out the operating costs of a large deployment, if you happen to have the budget to consider buying RTX 2060s in bulk.

cards.PNG

"In this article is a side-by-side performance comparison of the GeForce RTX 2060 up against the GTX 1060 Pascal, GTX 960 Maxwell, and GTX 760 Kepler graphics cards."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: Phoronix

GeForce Driver 417.71 Now Available, Enables G-SYNC Compatibility With FreeSync Monitors

Subject: Graphics Cards | January 15, 2019 - 03:25 AM |
Tagged: variable refresh rate, nvidia, graphics driver, gpu, geforce, g-sync compatibility, g-sync, freesync

One of NVIDIA's biggest and most surprising CES announcements was the introduction of support for "G-SYNC Compatible Monitors," allowing the company's G-SYNC-capable Pascal and Turing-based graphics cards to work with FreeSync and other non-G-SYNC variable refresh rate displays. NVIDIA is initially certifying 12 FreeSync monitors but will allow users of any VRR display to manually enable G-SYNC and determine for themselves if the quality of the experience is acceptable.

gsync-compatible-freesync.jpg

Those eager to try the feature can now do so via NVIDIA's latest driver, version 417.71, which is rolling out worldwide right now. As of the date of this article's publication, users in the United States who visit NVIDIA's driver download page are still seeing the previous driver (417.35), but direct download links are already up and running.

The current list of FreeSync monitors that are certified by NVIDIA:

Users with a certified G-SYNC compatible monitor will have G-SYNC automatically enabled via the NVIDIA Control Panel when the driver is updated and the display is connected, the same process as connecting an official G-SYNC display. Those with a variable refresh rate display that is not certified must manually open the NVIDIA Control Panel and enable G-SYNC.

NVIDIA notes, however, that enabling the feature on displays that don't meet the company's performance capabilities may lead to a range of issues, from blurring and stuttering to flickering and blanking. The good news is that the type and severity of the issues will vary by display, so users can determine for themselves if the potential problems are acceptable.

Update: Users over at the NVIDIA subreddit have created a public Google Sheet to track their reports and experiences with various FreeSync monitors. Check it out to see how others are faring with your preferred monitor.

Update 2: Our friends over at Wccftech have published a short video demonstrating how to enable G-SYNC on non-G-SYNC VRR monitors.

Source: NVIDIA

Rounding up CES

Subject: General Tech | January 10, 2019 - 12:37 PM |
Tagged: nvidia, Intel, ces 2019, amd

The Tech Report just posted a nice assortment of updates covering their travels at CES, including their take on AMD's new GPU and processors.  They also took a look at Intel's offerings, and not just the fresh splash of seemingly bottomless Coffee, this time without the picture drawn on the top.  Far more interesting was the lineup of 10nm chips announced: Lakefield for low-power applications, Ice Lake mainstream chips, and Snow Ridge, an SoC designed for network applications.  Of course, it wouldn't be an Intel briefing without Optane, and so the H10 series was announced, which sports both QLC 3D NAND and 3D XPoint on an M.2 2280 gumstick.  It has a controller for both types of memory, which means the bulk of the heavy lifting will be done onboard and not pushed onto your CPU.


image001.png

"That title probably rests on the shoulders of four upcoming Intel products based on the company's beleaguered 10-nm fabrication process: the Lakefield low-power client processors, the Snow Ridge network SoC, and Ice Lake chips for every market segment."

Here is some more Tech News from around the web:

Tech Talk

CES 2019: New Lenovo "Legion" Gaming Laptops Announced. 15-inch Y740, 17-inch Y740, 15-inch Y540

Subject: Systems, Shows and Expos | January 7, 2019 - 08:00 PM |
Tagged: nvidia, Lenovo, Legion, Intel, geforce, gaming laptop, ces 2019, CES

Three new laptops have been added to Lenovo’s portfolio under their “Legion” gaming brand. All three of them will contain “Unannounced NVIDIA GeForce GPUs”.

lenovo-2019-ces-legion740.jpg

The Lenovo Legion Y740 comes in two sizes: 15-inch and 17-inch. Based on the slide deck, both models have the choice between the Intel Core i5-8300H and the Intel Core i7-8750H. The Core i5-8300H is a quad-core CPU with HyperThreading (eight threads) that can turbo up to 4 GHz. The Core i7-8750H is a six-core CPU with HyperThreading (twelve threads) that can turbo up to 4.1 GHz. This can be paired with 8, 16, or 32GB of RAM at 2666MHz, or “8GB + 8GB 3200MHz Corsair Overclocked Memory”.

As for storage, both models can have up to a 512GB PCIe SSD, a 512GB SATA SSD, or 2TB of spinning metal. The 17-inch model can also have an Intel Optane drive added to it, although they don't list a specific size. Both models also have one USB-C connector with support for Thunderbolt, DisplayPort, and USB 3.1. Alongside the USB-C are HDMI, LAN, three standard USB 3.1 Gen 2 ports, and a mini-DisplayPort connector. They also have an RGB keyboard which, from the picture, appears to be tenkeyless. Both have Dolby sound, but only the 17-inch model also has a subwoofer. They do not list an audio jack, although I see a hole on the left side that could be either audio or a power plug. I think I also see power on the back, so I assume that it is audio on the side. Mobile phones are one thing, but a laptop had better have a headphone jack.

The built-in displays are 1080p, which is a good resolution for a laptop, and support 144 Hz G-Sync at 300 nits. There is also an upsell to a 500-nit panel that has been certified for Dolby HDR400. They don't say whether the upsell also supports 144Hz G-Sync, but I would assume that it does. Check before you buy, though.

Both sizes will be available in February 2019. The 15-inch starts at $1749.99 USD and the 17-inch starts at $1979.99 USD.

lenovo-2019-ces-legion540.jpg.png

The third model is the Lenovo Legion Y540. This one will be available a little bit later – May 2019. Interestingly, the CPU is listed as “Intel Core processors”. As such, I would assume that this laptop will use a new, unannounced processor alongside the unannounced GeForce GPU. Lenovo does mention that the laptop can be paired with up to 32GB of RAM at 2666MHz.

lenovo-2019-ces-legion540closed.jpg.png

The battery is listed as "52.5Wh & 57Wh (Configuration dependent)". Since an extra 4.5Wh seems like a tough upsell, I am guessing that the battery you receive will be tied to the chosen display, but Lenovo doesn't say, so I don't know. It looks like there will be a choice between three displays: a 60Hz 1080p IPS panel at 250 nits with "45%" color, a 60Hz 1080p IPS panel at 300 nits with "72%" color, and a 144Hz IPS panel at 300 nits with "72%" color. I put each of the color space percentages in quotation marks because they don't list which color space. Since one of them is an HDR panel, I'm going to assume that they don't mean sRGB... because that would be awful. I am hoping that they are referring to the DCI-P3 color space. They could mean NTSC 1976, although that would be a bit low for an HDR panel.

The laptop has a USB-C port but, unlike the Y740, it can only be used for USB 3.1. There are also three standard USB 3.1 ports, one HDMI port, one mini-DisplayPort, an Ethernet jack, and a 3.5mm audio jack, so you can still attach external monitors to it without the USB-C. The keyboard is backlit, but not RGB – just white.

As mentioned, the Lenovo Legion Y540 will be available in May 2019. It will start at $929.99 USD.

Source: Lenovo

CES 2019: New Lenovo "Legion" Displays: Y44w & Y27gq

Subject: Displays, Shows and Expos | January 7, 2019 - 08:00 PM |
Tagged: nvidia, Lenovo, g-sync, freesync 2, display, ces 2019, CES, amd

Lenovo has added two monitors to their Legion line of gaming devices.

lenovo-2019-ces-legion-bigmonitor.jpg

The Lenovo Legion Y44w is a 43.4” gaming display. Most of that size is horizontal, however, because it has a 32:10 aspect ratio. If you have ever used a 1920x1200 monitor - the PC equivalent of 1080p from the Windows Vista era, when manufacturers felt 16:9 was too wide and settled on 16:10 instead - then imagine two of them side-by-side in a single monitor. In fact, the Y44w supports two separate video inputs if you wish to split the monitor down the middle into two side-by-side 1920x1200 displays. It can also operate as a single 3840x1200 display, of course. This resolution is a little over half the pixels of a 4K panel, so it should be easier for second-tier GPUs to feed.
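
A quick sanity check on that "a little over half" figure, using nothing but the resolutions above (the script is just illustrative arithmetic):

```python
# Pixel-count comparison for the Legion Y44w's 3840x1200 panel.
y44w  = 3840 * 1200      # 4,608,000 pixels
uhd4k = 3840 * 2160      # 8,294,400 pixels ("4K" UHD)
wuxga = 1920 * 1200      # one half of the Y44w in split-input mode

print(y44w / uhd4k)      # ~0.556 -> a little over half the pixels of a 4K panel
print(y44w == 2 * wuxga) # True   -> exactly two 1920x1200 displays side by side
```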

Beyond the resolution, the color gamut is listed as “99% sRGB, BT.709, DCI-P3” and it is certified as VESA HDR400. If the slide deck is correct and it can do 99% DCI-P3 at HDR400, then it should have an amazing picture. It can also do 144 Hz with FreeSync 2, so you do not need to compromise refresh rate to get those beautiful colors. There is also an optional Harman Kardon speaker that can be attached to the display.

The Lenovo Legion Y44w will be available in April 2019 for $1199.99 USD.

lenovo-2019-ces-legion-littlemonitor.jpg

Lenovo also announced the Legion Y27gq gaming monitor. This one is a standard 16:9, 1440p, TN panel that can be driven up to 240 Hz. It supports G-Sync, but not HDR. Despite not supporting HDR, it still covers 90% of DCI-P3, which is quite wide for a TN panel. Lenovo is listing it as an “eSport gaming monitor”… so you can probably guess that high refresh rate and G-Sync are the focus.

If you gotta go fast, then the Lenovo Legion Y27gq is available in April 2019 for $999.99 USD.

Source: Lenovo

Is it midrange or not? Meet the RTX 2060

Subject: Graphics Cards | January 7, 2019 - 04:34 PM |
Tagged: video card, turing, tu106, RTX 2060, rtx, nvidia, graphics card, gpu, gddr6, gaming

After months of rumours and guesses as to what the RTX 2060 would actually offer, we finally know.  It is built on the same TU106 the RTX 2070 uses and sports somewhat similar core clocks, though the drop in Tensor Cores, ROPs, and texture units reduces it to a mere 5 GigaRays of ray tracing throughput.  The memory is rather different, with 6GB of GDDR6 connected via a 192-bit bus offering 336.1 GB/s of bandwidth.  As you saw in Sebastian's testing, the overall performance is better than you would expect from a mid-range card, but at the cost of a higher price.
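
That bandwidth figure falls straight out of the memory configuration - GDDR6 at an effective 14 Gbps per pin across a 192-bit bus - and the same formula covers the rest of the RTX line-up. A quick check (our own arithmetic; the extra 0.1 GB/s in the quoted spec is presumably a rounding quirk of the effective data rate):

```python
# GDDR6 bandwidth = effective data rate per pin x bus width / 8 bits per byte.
effective_rate_gbps = 14      # "14000 MHz" effective, i.e. 14 Gbps per pin
bus_width_bits      = 192     # RTX 2060

print(effective_rate_gbps * bus_width_bits / 8)   # 336.0 GB/s

# The same formula gives the RTX 2070/2080 their 448 GB/s (14 x 256 / 8).
```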

If we missed out on your favourite game, check the Guru of 3D's suite of benchmarks or one of the others below. 

RTX2060_Box.jpg

"NVIDIA today announced the GeForce RTX 2060, the graphics card will be unleashed next week the 15th at a sales price of 349 USD / 359 EUR. Today, however, we can already bring you a full review of what is a pretty feisty little graphics card really."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: Guru of 3D
Manufacturer: NVIDIA

Formidable Mid-Range

We have to go all the way back to 2015 for NVIDIA's previous graphics card announcement at CES, with the GeForce GTX 960 revealed during the show four years ago. With today's announcement we have the latest “mid-range” offering in the tradition of the GeForce x60 (or x060) cards, the RTX 2060. This launch comes as no surprise to those of us following the PC industry, as various rumors and leaks preceded the announcement by weeks and even months, but such is the reality of the modern supply chain process (sadly, few things are ever really a surprise anymore).

RTX2060_Box.jpg

But there is still plenty of new information available with the official launch of this new GPU, not the least of which is the opportunity to look at independent benchmark results to find out what to expect with this new GPU relative to the market. To this end we had the opportunity to get our hands on the card before the official launch, testing the RTX 2060 in several games as well as a couple of synthetic benchmarks. The story is just beginning, and as time permits a "part two" of the RTX 2060 review will be offered to supplement this initial look, addressing omissions and adding further analysis of the data collected thus far.

Before getting into the design and our initial performance impressions of the card, let's look into the specifications of this new RTX 2060 and see how it relates to the rest of the RTX family from NVIDIA. We are taking a high-level look at specs here, so for a deep dive into the RTX series you can check out our previous exploration of the Turing Architecture here.

"Based on a modified version of the Turing TU106 GPU used in the GeForce RTX 2070, the GeForce RTX 2060 brings the GeForce RTX architecture, including DLSS and ray-tracing, to the midrange GPU segment. It delivers excellent gaming performance on all modern games with the graphics settings cranked up. Priced at $349, the GeForce RTX 2060 is designed for 1080p gamers, and delivers an excellent gaming experience at 1440p."

RTX2060_Thumbnail.jpg

                  | RTX 2080 Ti              | RTX 2080                 | RTX 2070                 | RTX 2060       | GTX 1080        | GTX 1070
GPU               | TU102                    | TU104                    | TU106                    | TU106          | GP104           | GP104
GPU Cores         | 4352                     | 2944                     | 2304                     | 1920           | 2560            | 1920
Base Clock        | 1350 MHz                 | 1515 MHz                 | 1410 MHz                 | 1365 MHz       | 1607 MHz        | 1506 MHz
Boost Clock       | 1545 MHz / 1635 MHz (FE) | 1710 MHz / 1800 MHz (FE) | 1620 MHz / 1710 MHz (FE) | 1680 MHz       | 1733 MHz        | 1683 MHz
Texture Units     | 272                      | 184                      | 144                      | 120            | 160             | 120
ROP Units         | 88                       | 64                       | 64                       | 48             | 64              | 64
Tensor Cores      | 544                      | 368                      | 288                      | 240            | --              | --
Ray Tracing Speed | 10 Giga Rays             | 8 Giga Rays              | 6 Giga Rays              | 5 Giga Rays    | --              | --
Memory            | 11GB                     | 8GB                      | 8GB                      | 6GB            | 8GB             | 8GB
Memory Clock      | 14000 MHz                | 14000 MHz                | 14000 MHz                | 14000 MHz      | 10000 MHz       | 8000 MHz
Memory Interface  | 352-bit GDDR6            | 256-bit GDDR6            | 256-bit GDDR6            | 192-bit GDDR6  | 256-bit GDDR5X  | 256-bit GDDR5
Memory Bandwidth  | 616 GB/s                 | 448 GB/s                 | 448 GB/s                 | 336.1 GB/s     | 320 GB/s        | 256 GB/s
TDP               | 250 W / 260 W (FE)       | 215 W / 225 W (FE)       | 175 W / 185 W (FE)       | 160 W          | 180 W           | 150 W
MSRP (current)    | $1200 (FE) / $1000       | $800 (FE) / $700         | $599 (FE) / $499         | $349           | $549            | $379

Continue reading our initial review of the NVIDIA GeForce RTX 2060!

NVIDIA adapts to the market and frees their displays

Subject: General Tech | January 7, 2019 - 01:47 PM |
Tagged: nvidia, g-sync, freesync, benq, asus, AOC, amd, adaptive sync, acer

G-SYNC is showing some signs of defeat, as today NVIDIA announced that several Adaptive Sync monitors have been tested and rated as G-SYNC compatible.  Adaptive Sync is the official VESA technology present in AMD's FreeSync monitors, and it offers a definite financial advantage over NVIDIA's G-SYNC, as the module required for G-SYNC can add hundreds of dollars to the price of a display.

So far only a dozen monitors out of around 400 tested have been rated as G-SYNC compatible, so don't expect to be mixing and matching your monitors quite yet, but it does imply that in some cases the extra controller is not required for variable refresh rates with either NVIDIA's or AMD's GPUs.  The results give AMD bragging rights for implementing adaptive sync in the most attractive way, but the change could hurt AMD's GPU sales, as users can now opt for a GeForce card paired with a FreeSync display.

Even if your display is not among those models, you can try enabling adaptive sync over DisplayPort and see if it works, though your results may vary. Ars Technica lists the models here.

gsync-overview-nobg.png

"Besides being unexpected good news for gamers who already own one of these FreeSync monitors, this is also great news for gamers that want to add VRR to their Nvidia graphics card setup without breaking the bank."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica