Subject: Displays | January 21, 2019 - 05:37 PM | Sebastian Peak
Tagged: vrr, variable refresh rate, rtings, nvidia, monitor, g-sync compatible, g-sync, freesync, display, amd
The staff of Rtings has embarked upon their own in-house testing of G-SYNC compatibility with FreeSync monitors (introduced with GeForce driver 417.71), and has released a video to introduce this new project:
While their choice of NVIDIA's Pendulum demo might be up for debate (since, let's face it, any time anything from NVIDIA is used to test, well, anything, there will always be a conspiracy theory), they have made some noteworthy observations about their experience vs. an AMD RX 580 with the same monitors. Still, as they point out in the article, "This test is by no means exhaustive, and your results may vary depending on the specific games you are playing, and your specific graphics card."
"We test FreeSync on a custom built PC, with an NVIDIA GTX 1060 6GB. Each monitor is connected via DisplayPort, as NVIDIA's FreeSync implementation does not currently work over HDMI. We use NVIDIA's Pendulum G-SYNC demo to test for tearing, stuttering, screen blanking, and other artifacts. We start at the monitor's standard refresh rate, and gradually decrease the sliders until we could see any issues. From there, we gradually increase the sliders until we start seeing tearing or other issues. The results of both of these tests give us the effective variable refresh rate range. We repeat the test at least twice to confirm our findings.
We use the results of this test to subjectively assign a result, based on how well the monitor supports NVIDIA's FreeSync implementation. The possible results are:
- Yes, NVIDIA Certified: This is reserved for monitors that are certified by NVIDIA as being compatible with NVIDIA FreeSync.
- Yes, Native: This is used to differentiate between monitors that support NVIDIA G-SYNC, instead of NVIDIA FreeSync.
- Yes: These monitors are confirmed by us to support FreeSync with no major issues, but are not certified by NVIDIA.
- Partial: These monitors at least partially support FreeSync, but we experienced some issues during testing. See the review for details of these issues.
- No: These monitors either do not support FreeSync at all, or are unusable with FreeSync enabled."
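Rtings' five categories boil down to a simple decision ladder. As a hypothetical restatement of the rules quoted above (this is not Rtings' actual tooling, just a sketch of the logic):

```python
def vrr_rating(nvidia_certified, native_gsync, worked, issues_seen):
    """Map test observations to the five Rtings result categories.

    Arguments are booleans describing what was observed during testing:
    native G-SYNC support is checked first, then NVIDIA certification,
    then whether VRR worked at all and whether issues were seen.
    """
    if native_gsync:
        return "Yes, Native"            # a true G-SYNC module monitor
    if nvidia_certified:
        return "Yes, NVIDIA Certified"  # on NVIDIA's certified list
    if worked and not issues_seen:
        return "Yes"                    # works fine, just uncertified
    if worked:
        return "Partial"                # works, but with visible issues
    return "No"                         # unusable or unsupported
```

A certified monitor that also showed minor issues would still land in "Yes, NVIDIA Certified" under this reading, since certification is checked before issue severity.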
There are currently 25 test results available to help out with your variable refresh-rate monitor selections for use on NVIDIA hardware.
Subject: General Tech | January 18, 2019 - 06:09 PM | Sebastian Peak
Tagged: vulkan, rtx, raytracing, Quake II, quake, Q2VKPT, Q2PRO, path tracing, open source, nvidia, john carmack, github, fps
Wait - the first fully raytraced game was released in 1997? Not exactly, but Q2VKPT is a fully raytraced version of one. That name is not a typo (it stands for Quake 2 Vulkan Path Tracing), and it's actually a game - or, more correctly, a proof-of-concept. But not just any game; we're talking about Quake 2. Technically this is a combination of Q2PRO, "an enhanced Quake 2 client and server for Windows and Linux", and VKPT, or Vulkan Path Tracing.
The end result is a fully raytraced experience that, if nothing else, gives the computer hardware media more to run on NVIDIA's GeForce RTX graphics cards right now than the endless BFV demos. Who would have guessed we'd be benchmarking Quake 2 again in 2019?
"Q2VKPT is the first playable game that is entirely raytraced and efficiently simulates fully dynamic lighting in real-time, with the same modern techniques as used in the movie industry (see Disney's practical guide to path tracing). The recent release of GPUs with raytracing capabilities has opened up entirely new possibilities for the future of game graphics, yet making good use of raytracing is non-trivial. While some games have started to explore improvements in shadow and reflection rendering, Q2VKPT is the first project to implement an efficient unified solution for all types of light transport: direct, scattered, and reflected light (see media). This kind of unification has led to a dramatic increase in both flexibility and productivity in the movie industry. The chance to have the same development in games promises a similar increase in visual fidelity and realism for game graphics in the coming years.
This project is meant to serve as a proof-of-concept for computer graphics research and the game industry alike, and to give enthusiasts a glimpse into the potential future of game graphics. Besides the use of hardware-accelerated raytracing, Q2VKPT mainly gains its efficiency from an adaptive image filtering technique that intelligently tracks changes in the scene illumination to re-use as much information as possible from previous computations."
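The "adaptive image filtering technique" described above is, at its heart, temporal accumulation: each pixel keeps a running history and blends in the new noisy frame, dropping the history when the illumination changes. A minimal sketch of that idea (hypothetical and greatly simplified; the actual Q2VKPT filter is far more sophisticated):

```python
def temporal_accumulate(history, current, alpha):
    """Exponentially-weighted blend of the new noisy sample into the
    per-pixel history; a smaller alpha reuses more of the past frames,
    giving a smoother but slower-to-react image."""
    return (1.0 - alpha) * history + alpha * current

def adaptive_alpha(scene_changed):
    """Crude stand-in for the 'adaptive' part: when a change in scene
    illumination is detected for this pixel, discard the history
    entirely instead of blending stale data in."""
    return 1.0 if scene_changed else 0.1
```

With alpha = 0.1, each new frame contributes only 10% of the pixel value, so path-tracing noise is averaged out over roughly the last ten frames of history.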
The project can be downloaded from Github, and the developers neatly listed the needed files for download (the .pak files from either the Quake 2 demo or the full version can be used):
- Github Repository
- Windows Binary on Github
- Quake II Starter ("Quake II Starter is a free, standalone Quake II installer for Windows that uses the freely available 3.14 demo, 3.20 point release and the multiplayer-focused Q2PRO client to create a functional setup that's capable of playing online.")
There was also a full Q&A from the developers, answering some obvious questions, including the observation that Quake 2 is "ancient" at this point and shouldn't it "run at 6000 FPS by now":
"While it is true that Quake II is a relatively old game with rather low geometric complexity, the limiting factor of path tracing is not primarily raytracing or geometric complexity. In fact, the current prototype could trace many more rays without a notable change in frame rate. The computational cost of the techniques used in the Q2VKPT prototype mainly depend on the number of (indirect) light scattering computations and the number of light sources. Quake II was already designed with many light sources when it was first released, in that sense it is still quite a modern game. Also, the number of light scattering events does not depend on scene complexity. It is therefore thinkable that the techniques we use could well scale up to more recent games."
And on the subject of path tracing vs. ray tracing:
"Path tracing is an elegant algorithm that can simulate many of the complex ways that light travels and scatters in virtual scenes. Its physically-based simulation of light allows highly realistic rendering. Path tracing uses Raytracing in order to determine the visibility in-between scattering events. However, Raytracing is merely a primitive operation that can be used for many things. Therefore, Raytracing alone does not automatically produce realistic images. Light transport algorithms like Path tracing can be used for that. However, while elegant and very powerful, naive path tracing is very costly and takes a long time to produce stable images. This project uses a smart adaptive filter that re-uses as much information as possible across many frames and pixels in order to produce robust and stable images."
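To illustrate the distinction: ray tracing answers "what does this ray hit?", while path tracing uses that query repeatedly to follow light as it scatters, averaging many random paths per pixel. A toy Monte Carlo sketch (a made-up one-light "scene" for illustration, not renderer code):

```python
import random

def trace_ray(origin, direction):
    """Toy 'ray cast' primitive: returns what the ray hits. In a real
    renderer this is the (hardware-accelerated) visibility query."""
    # contrived scene: rays angled 'up' reach the light, the rest hit a grey wall
    return "light" if direction > 0.5 else "wall"

def path_trace_sample():
    """One light path: follow random bounces until the path finds a light
    or is terminated after a bounded number of scattering events."""
    throughput = 1.0
    direction = random.random()
    for _ in range(4):                   # bound the path length
        hit = trace_ray(0.0, direction)
        if hit == "light":
            return throughput * 1.0      # the light emits radiance 1.0
        throughput *= 0.5                # the grey wall absorbs half the energy
        direction = random.random()      # scatter in a new random direction
    return 0.0                           # path ended without reaching a light

def render_pixel(n_samples):
    """Average many noisy path samples; more samples = more stable image."""
    return sum(path_trace_sample() for _ in range(n_samples)) / n_samples
```

Running `render_pixel` with a handful of samples gives a wildly noisy value, while thousands of samples converge toward the true average; that cost of convergence is exactly what Q2VKPT's frame-to-frame filtering works around.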
This project is the result of work by one Christoph Schied, and was "a spare-time project to validate the results of computer graphics research in an actual game". Whatever your opinion of Q2VKPT, as we look back at Quake 2 and its impressive original lighting effects it's pretty clear that John Carmack was far ahead of his time (and it could be said that it's taken this long for hardware to catch up).
Subject: Graphics Cards | January 18, 2019 - 03:56 PM | Jeremy Hellstrom
Tagged: RTX 2060 Gaming Z, RTX 2060, nvidia, msi
At first glance, the MSI RTX 2060 Gaming Z is very similar to the Founders Edition, with only the boost clock of 1830MHz justifying its $35-higher MSRP. Once TechPowerUp got into testing, they found that the custom cooler helps the card maintain peak speed far more effectively than the FE. This also let them hit an impressive manual overclock of 2055MHz on the boost clock and 1990MHz on the memory. Check out the results, as well as a complete teardown of the card, in the review.
"MSI's GeForce RTX 2060 Gaming Z is the best RTX 2060 custom-design we've reviewed so far. It comes with idle-fan-stop and a large triple-slot cooler that runs cooler than the Founders Edition. Noise levels are excellent, too, it's the quietest RTX 2060 card to date."
Here are some more Graphics Card articles from around the web:
- MSI RTX 2060 Gaming Z 6G @ Kitguru
- Nvidia GeForce RTX 2060 Founders Edition – Nvidia Unleashes RTX on The $350 Market @ Bjorn3d
- Using FreeSync with GeForce GPUs: How Well Does It Work? @ TechSpot
- Adrenalin Software Edition 19.1.1 Driver Performance Analysis @ BabelTechReviews
- Radeon RX 570 vs. RX 580 vs. GeForce GTX 1060 3GB vs. GTX 1060 6GB @ TechSpot
Subject: Graphics Cards | January 16, 2019 - 04:33 PM | Jeremy Hellstrom
Tagged: linux, geforce, nvidia, ubuntu 18.04, gtx 760, gtx 960, RTX 2060, gtx 1060
If you are running an Ubuntu system with an older GPU and are curious about upgrading but unsure if it is worth it, Phoronix has a great review for you. Whether you are gaming with OpenGL and Vulkan or curious about the changes in OpenCL/CUDA compute performance, they have you covered. They even delve into the power efficiency numbers so you can spec out the operating costs of a large deployment, if you happen to have the budget to consider buying RTX 2060s in bulk.
"In this article is a side-by-side performance comparison of the GeForce RTX 2060 up against the GTX 1060 Pascal, GTX 960 Maxwell, and GTX 760 Kepler graphics cards."
Here are some more Graphics Card articles from around the web:
- MSI GeForce RTX 2060 Gaming Z (6G) @ Guru of 3D
- Palit GeForce RTX 2060 Gaming Pro OC 6 GB @ TechPowerUp
- EVGA GeForce RTX 2060 XC Ultra 6 GB @ TechPowerUp
Subject: Graphics Cards | January 15, 2019 - 03:25 AM | Jim Tanous
Tagged: variable refresh rate, nvidia, graphics driver, gpu, geforce, g-sync compatibility, g-sync, freesync
One of NVIDIA's biggest and most surprising CES announcements was the introduction of support for "G-SYNC Compatible Monitors," allowing the company's G-SYNC-capable Pascal and Turing-based graphics cards to work with FreeSync and other non-G-SYNC variable refresh rate displays. NVIDIA is initially certifying 12 FreeSync monitors but will allow users of any VRR display to manually enable G-SYNC and determine for themselves if the quality of the experience is acceptable.
Those eager to try the feature can now do so via NVIDIA's latest driver, version 417.71, which is rolling out worldwide right now. As of the date of this article's publication, users in the United States who visit NVIDIA's driver download page are still seeing the previous driver (417.35), but direct download links are already up and running.
The current list of FreeSync monitors that are certified by NVIDIA:
- Acer XFA240
- Acer XG270HU
- Acer XV273K
- Acer XZ321Q
- AOC Agon AG241QG4
- AOC G2590FX
- ASUS MG278Q
- ASUS XG248
- ASUS VG258Q
- ASUS XG258
- ASUS VG278Q
- BenQ XL2740
Users with a certified G-SYNC compatible monitor will have G-SYNC automatically enabled via the NVIDIA Control Panel when the driver is updated and the display is connected, the same process as connecting an official G-SYNC display. Those with a variable refresh rate display that is not certified must manually open the NVIDIA Control Panel and enable G-SYNC.
NVIDIA notes, however, that enabling the feature on displays that don't meet the company's performance capabilities may lead to a range of issues, from blurring and stuttering to flickering and blanking. The good news is that the type and severity of the issues will vary by display, so users can determine for themselves if the potential problems are acceptable.
Update: Users over at the NVIDIA subreddit have created a public Google Sheet to track their reports and experiences with various FreeSync monitors. Check it out to see how others are faring with your preferred monitor.
Subject: General Tech | January 10, 2019 - 12:37 PM | Jeremy Hellstrom
Tagged: nvidia, Intel, ces 2019, amd
The Tech Report just posted a nice assortment of updates covering their travels at CES, including their take on AMD's new GPU and processors. They also took a look at Intel's offerings, and not just the fresh splash of seemingly bottomless Coffee, this time without the picture drawn on the top. Far more interesting was the lineup of 10nm chips announced: Lakefield for low-power applications, Ice Lake mainstream chips, and Snow Ridge, an SoC designed for network applications. Of course, it wouldn't be an Intel briefing without Optane, and the H10 series was announced, sporting both QLC 3D NAND and 3D XPoint on an M.2 2280 gumstick. It has a controller for each type of memory, which means the bulk of the heavy lifting will be done onboard and not pushed onto your CPU.
"That title probably rests on the shoulders of four upcoming Intel products based on the company's beleaguered 10-nm fabrication process: the Lakefield low-power client processors, the Snow Ridge network SoC, and Ice Lake chips for every market segment."
Here is some more Tech News from around the web:
- Just updated Windows 7? Can't access network shares? It isn't just you @ The Register
- AMD Keynote for CES 2019 – Radeon VII “Vega on 7nm”, Mobile Graphics and an Interesting Zen 2 Benchmark @ Bjorn3d
- Steamer closets, flying cars, robot boxers, smart-mock-cock ban hypocrisy – yes, it's the worst of CES this year @ The Register
- A sampling of networking gear from CES: TP-Link goes Wi-Fi 6, D-Link goes 5G @ Ars Technica
- Lexar reveals the first 1TB SD card you can actually buy @ The Register
- Steam bug sees acclaimed indie games flagged as 'fake' @ The Inquirer
- Don't Expect A New Nvidia Shield Tablet Anytime Soon @ Slashdot
Subject: Systems, Shows and Expos | January 7, 2019 - 08:00 PM | Scott Michaud
Tagged: nvidia, Lenovo, Legion, Intel, geforce, gaming laptop, ces 2019, CES
Three new laptops have been added to Lenovo’s portfolio under their “Legion” gaming brand. All three of them will contain “Unannounced NVIDIA GeForce GPUs”.
The Lenovo Legion Y740 comes in two sizes: 15-inch and 17-inch. Based on the slide deck, both models have the choice between the Intel Core i5-8300H and the Intel Core i7-8750H. The Core i5-8300H is a quad-core CPU with HyperThreading (eight threads) that can turbo up to 4 GHz. The Core i7-8750H is a six-core CPU with HyperThreading (twelve threads) that can turbo up to 4.1 GHz. This can be paired with 8, 16, or 32GB of RAM at 2666MHz, or “8GB + 8GB 3200MHz Corsair Overclocked Memory”.
As for storage, both models can have up to 512GB of PCIe SSD, 512GB of SATA SSD, or 2TB of spinning metal. The 17-inch model can also have an Intel Optane drive added to it, although they don't list a specific size. Both models also have one USB-C connector with support for Thunderbolt, DisplayPort, and USB 3.1. Alongside the USB-C are HDMI, LAN, three standard USB 3.1 Gen 2 ports, and a mini-DisplayPort connector. They also have an RGB keyboard which, from the picture, appears to be tenkeyless. Both have Dolby sound, but only the 17-inch model also has a subwoofer. They do not list an audio jack, although I see a hole on the left side that could be either audio or a power plug. I think I also see power on the back, so I assume that it is audio on the side. Mobile phones are one thing, but a laptop had better have a headphone jack.
The built-in displays are 1080p, which is a good resolution for a laptop panel, and support 144 Hz G-Sync at 300 nits. There is also an upsell to a 500-nit panel that has been certified for Dolby HDR400. They don't say whether the upsell panel also supports 144Hz G-Sync, but I would assume that it does. Check before you buy, though.
Both sizes will be available in February 2019. The 15-inch starts at $1749.99 USD and the 17-inch starts at $1979.99 USD.
The third model is the Lenovo Legion Y540. This one will be available a little bit later – May 2019. Interestingly, the CPU is listed as “Intel Core processors”. As such, I would assume that this laptop will use a new, unannounced processor alongside the unannounced GeForce GPU. Lenovo does mention that the laptop can be paired with up to 32GB of RAM at 2666MHz.
The battery is listed as "52.5Wh & 57Wh (Configuration dependent)". Since an extra 4.5Wh seems like a tough upsell, I am guessing the battery you receive will be tied to the chosen display, but Lenovo doesn't say, so I don't know. It looks like there will be a choice between three displays: a 60Hz 1080p IPS panel at 250 nits with "45%" color, a 60Hz 1080p IPS panel at 300 nits with "72%" color, and a 144Hz IPS panel at 300 nits with "72%" color. I put each of the color space percentages in quotation marks because they don't list which color space. Since one of them is an HDR panel, I'm going to assume that they don't mean sRGB... because that would be awful. I am hoping that they are referring to the DCI-P3 color space. They could mean NTSC, although that would be a bit low for an HDR panel.
The laptop has a USB-C port but, unlike the Y740, it can only be used for USB 3.1. There are also three standard USB 3.1 ports, one HDMI port, one mini-DisplayPort, an Ethernet jack, and a 3.5mm audio jack, so you can still attach external monitors to it without using the USB-C. The keyboard is backlit, but not RGB - just white.
As mentioned, the Lenovo Legion Y540 will be available in May 2019. It will start at $929.99 USD.
Subject: Displays, Shows and Expos | January 7, 2019 - 08:00 PM | Scott Michaud
Tagged: nvidia, Lenovo, g-sync, freesync 2, display, ces 2019, CES, amd
Lenovo has added two monitors to their Legion line of gaming devices.
The Lenovo Legion Y44w is a 43.4” gaming display. Most of that size is horizontal, however, because it has a 32:10 aspect ratio. If you have ever used a 1920x1200 monitor (the PC equivalent of 1080p from the Windows Vista era, when PC manufacturers believed 16:9 was too wide and settled on 16:10), then imagine two of them side-by-side in a single monitor. In fact, the Y44w supports two separate video inputs if you wish to split the monitor down the middle into two side-by-side 1920x1200 displays. It can also operate as a single 3840x1200 display, of course. This resolution is a little over half of a 4K panel, so it should be easier for second-tier GPUs to feed.
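That "a little over half of a 4K panel" claim checks out arithmetically, since the two resolutions share the same width:

```python
def megapixels(width, height):
    """Total pixel count of a display mode, in megapixels."""
    return width * height / 1e6

y44w = megapixels(3840, 1200)       # the Y44w's full native resolution
uhd = megapixels(3840, 2160)        # 4K UHD for comparison
ratio = y44w / uhd                  # fraction of 4K's pixels a GPU must feed

print(y44w, uhd, round(ratio, 3))   # 4.608 vs 8.2944 megapixels, ratio 0.556
```

At roughly 56% of the pixels of 4K UHD, per-frame shading cost should land closer to 1440p than to 4K.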
Beyond the resolution, the color gamut is listed as “99% sRGB, BT.709, DCI-P3” and it is certified as VESA HDR400. If the slide deck is correct and it can do 99% DCI-P3 at HDR400, then it should have an amazing picture. It can also do 144 Hz with FreeSync 2, so you do not need to compromise refresh rate to get those beautiful colors. They also offer an optional Harman Kardon speaker that can be attached to the display.
The Lenovo Legion Y44w will be available in April 2019 for $1199.99 USD.
Lenovo also announced the Legion Y27gq gaming monitor. This one is a standard 16:9, 1440p, TN panel that can be driven up to 240 Hz. It supports G-Sync, but not HDR. Despite not supporting HDR, it still covers 90% of DCI-P3, which is quite wide for a TN panel. Lenovo is listing it as an “eSport gaming monitor”… so you can probably guess that high refresh rate and G-Sync are the focus.
If you gotta go fast, then the Lenovo Legion Y27gq is available in April 2019 for $999.99 USD.
Subject: Graphics Cards | January 7, 2019 - 04:34 PM | Jeremy Hellstrom
Tagged: video card, turing, tu106, RTX 2060, rtx, nvidia, graphics card, gpu, gddr6, gaming
After months of rumours and guesses as to what the RTX 2060 would actually offer, we finally know. It is built on the same TU106 the RTX 2070 uses and sports somewhat similar core clocks, though the reduced core, ROP, and texture unit counts drop it to a mere 5 GigaRays. The memory is rather different, with 6GB of GDDR6 connected via a 192-bit bus offering 336.1 GB/s of bandwidth. As you saw in Sebastian's testing, the overall performance is better than you would expect from a mid-range card, but at the cost of a higher price.
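That bandwidth figure follows directly from the memory specs: GDDR6 running at a 14 Gbps effective data rate per pin, across a 192-bit bus. A quick sanity check:

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (pins) times the per-pin
    effective data rate, divided by 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gbs(192, 14))  # RTX 2060: 336.0 GB/s
print(peak_bandwidth_gbs(256, 14))  # RTX 2070/2080: 448.0 GB/s
```

The same formula reproduces the rest of the lineup, e.g. the GTX 1080's 256-bit GDDR5X at 10 Gbps gives 320 GB/s.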
If we missed out on your favourite game, check the Guru of 3D's suite of benchmarks or one of the others below.
"NVIDIA today announced the GeForce RTX 2060, the graphics card will be unleashed next week the 15th at a sales price of 349 USD / 359 EUR. Today, however, we can already bring you a full review of what is a pretty feisty little graphics card really."
Here are some more Graphics Card articles from around the web:
- NVIDIA GeForce RTX 2060 FE Review @ Legit Reviews
- RTX 2060 Review with 39 games @ BabelTechReviews
- NVIDIA Geforce RTX 2060 Founders Edition Review @ OCC
- Nvidia RTX 2060 Founders Edition 6GB @ Kitguru
- Battlefield V NVIDIA Ray Tracing RTX 2080 @ [H]ard|OCP
- The GPU Compute Performance From The NVIDIA GeForce GTX 680 To TITAN RTX @ Phoronix
- The HD 7970 vs. the GTX 680 – revisited after 7 years @ BabelTechReviews
We have to go all the way back to 2015 for NVIDIA's previous graphics card announcement at CES, with the GeForce GTX 960 revealed during the show four years ago. And coming on the heels of this announcement today we have the latest “mid-range” offering in the tradition of the GeForce x60 (or x060) cards, the RTX 2060. This launch comes as no surprise to those of us following the PC industry, as various rumors and leaks preceded the announcement by weeks and even months, but such is the reality of the modern supply chain process (sadly, few things are ever really a surprise anymore).
But there is still plenty of new information available with the official launch of this new GPU, not the least of which is the opportunity to look at independent benchmark results to find out what to expect with this new GPU relative to the market. To this end we had the opportunity to get our hands on the card before the official launch, testing the RTX 2060 in several games as well as a couple of synthetic benchmarks. The story is just beginning, and as time permits a "part two" of the RTX 2060 review will be offered to supplement this initial look, addressing omissions and adding further analysis of the data collected thus far.
Before getting into the design and our initial performance impressions of the card, let's look into the specifications of this new RTX 2060, and see how it relates to the rest of the RTX family from NVIDIA. We are taking a high level look at specs here, so for a deep dive into the RTX series you can check out our previous exploration of the Turing Architecture here.
"Based on a modified version of the Turing TU106 GPU used in the GeForce RTX 2070, the GeForce RTX 2060 brings the GeForce RTX architecture, including DLSS and ray-tracing, to the midrange GPU segment. It delivers excellent gaming performance on all modern games with the graphics settings cranked up. Priced at $349, the GeForce RTX 2060 is designed for 1080p gamers, and delivers an excellent gaming experience at 1440p."
| | RTX 2080 Ti | RTX 2080 | RTX 2070 | RTX 2060 | GTX 1080 | GTX 1070 |
| --- | --- | --- | --- | --- | --- | --- |
| Base Clock | 1350 MHz | 1515 MHz | 1410 MHz | 1365 MHz | 1607 MHz | 1506 MHz |
| Boost Clock | 1545 MHz / 1635 MHz (FE) | 1710 MHz / 1800 MHz (FE) | 1620 MHz / 1710 MHz (FE) | 1680 MHz | 1733 MHz | 1683 MHz |
| Ray Tracing Speed | 10 Giga Rays | 8 Giga Rays | 6 Giga Rays | 5 Giga Rays | -- | -- |
| Memory Clock | 14000 MHz | 14000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 8000 MHz |
| Memory Interface | 352-bit GDDR6 | 256-bit GDDR6 | 256-bit GDDR6 | 192-bit GDDR6 | 256-bit GDDR5X | 256-bit GDDR5 |
| Memory Bandwidth | 616 GB/s | 448 GB/s | 448 GB/s | 336.1 GB/s | 320 GB/s | 256 GB/s |
| TDP | 250 W / 260 W (FE) | 215 W / 225 W (FE) | 175 W / 185 W (FE) | 160 W | 180 W | 150 W |
| MSRP (current) | $1200 (FE) / $999 | $799 (FE) / $699 | $599 (FE) / $499 | $349 | $549 | $379 |