Subject: Graphics Cards | February 12, 2019 - 02:53 PM | Scott Michaud
Tagged: pc gaming, battlefield V, ea, dice, nvidia, DLSS, dxr
The Battlefield V Tides of War Chapter 2: Lightning Strikes Update #3 patch, beyond sounding like a Final Fantasy title, has quite a few major improvements. The headlining feature is improved RTX support, which we will discuss shortly, but fans of the game may appreciate the other bullet points, too.
But first, because we are a computer hardware site, the RTX stuff. DLSS, which was recently added to 3DMark to impressive image-quality results, has been added to Battlefield V. This setting uses machine learning to produce a best guess at antialiasing, rather than calculating it with a direct algorithm (such as TAA or FXAA). Now that MSAA is somewhat uncommon, because it is incompatible with certain rendering processes (deferred rendering in particular), we’re stuck with either antialiasing via post-process or super-sampling. Super-sampling is expensive, so it’s usually either FXAA, which tries to find edges and soften them, or TAA, which gives neighboring frames different sub-pixel positions and blends them. Both cases have issues. TAA is considered the “higher-end” option, although it gets ugly when objects move, especially when the motion is fast yet smooth enough to draw the eye. Because DLSS is basically a shortcut to something that looks like super-sampling, it should avoid many of these issues.
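The "different sub-pixel positions per frame" idea behind temporal antialiasing can be sketched in a few lines. This is a minimal illustration only: the Halton jitter pattern and the 0.1 blend factor are common textbook choices, not anything DICE has confirmed for Battlefield V.

```python
def halton(index, base):
    """Low-discrepancy sequence value in [0, 1) for a 1-based sample index."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def taa_jitter(frame, period=8):
    """Sub-pixel offset in [-0.5, 0.5) applied to the projection each frame."""
    i = (frame % period) + 1  # skip index 0, which would always be (0, 0)
    return (halton(i, 2) - 0.5, halton(i, 3) - 0.5)

def taa_resolve(history, current, alpha=0.1):
    """Blend the current frame's color into the accumulated history color.

    Stale history is what smears moving objects; production TAA adds
    motion-vector reprojection and neighborhood clamping to limit it.
    """
    return tuple(h * (1.0 - alpha) + c * alpha for h, c in zip(history, current))
```

The smearing on fast-moving objects mentioned above is exactly the stale-history failure mode: when the blend pulls in history from where an object used to be, edges ghost and blur.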
DXR raytracing performance was also improved.
Okay, now the tech enthusiasts can stop reading – it’s time for the fans.
Vaultable object detection is said to see a major improvement with this release. DICE acknowledges that Battlefield V movement wasn’t as smooth as it should be. There were a lot of waist-high barriers that players could get stuck behind, which the vaulting system should propel them over. It should be much easier to move around the map after this update, which is good for people like me who like to sneak around and flank.
DICE has also discussed several netcode changes, such as adding more damage updates per packet and fixing cases where damage that should have been ignored wasn’t, where healing that should have occurred was ignored, and so forth. Basically, all of the netcode improvements relate to health or damage in some way, which is a good area to focus on.
Also, the Rush game mode, introduced in the Battlefield Bad Company sub-franchise, will return on March 7th "for a limited time"... whatever they mean by that.
The update should be available now.
Subject: Graphics Cards | February 11, 2019 - 03:30 PM | Scott Michaud
Tagged: nvidia, rtx, vulkan
Microsoft got quite a bit of mindshare with the announcement of DirectX Raytracing (DXR) at last year’s GDC 2018. NVIDIA’s RTX technology was somewhat synonymous with DirectX 12 for a while, although NVIDIA was not exactly hiding their equivalent extension for Vulkan. It’s not that you must use DirectX 12 – it’s that you cannot use DirectX 11.
Image Credit: iOrange (via GitHub)
And now there’s a tutorial on GitHub by the user Sergii Kudlai (iOrange), complete with source code licensed under MIT. iOrange is a programmer at Digital Extremes, which is best known for their 2013 hit, Warframe, although they also collaborated with Epic Games on the early Unreal Tournament titles (UT2004 and earlier). They also worked on Epic Pinball.
The article is very casually worded and covers everything up to rendering a single triangle.
If you’re interested in a little more depth, NVIDIA is also releasing Ray Tracing Gems for free on their website, although you need to be registered with their developer portal.
Ray Tracing Gems is available here. Currently only the first two chapters are up, but the rest will arrive every few days until approximately February 25th.
Subject: General Tech | February 11, 2019 - 12:46 PM | Jeremy Hellstrom
Tagged: leak, nvidia, gtx 1660 ti
Today we have seen a lot of action surrounding the soon-to-be-released GTX 1660 Ti, which at one point many considered a fantasy created by strange minds rather than an upcoming product. Any doubt has been removed by the leak of details and pictures of packaging, spotted by WCCFTech and others.
Thanks to the packaging we know the card will have 6 GB of GDDR6 VRAM, DirectX 12 support, Ansel support, and Turing shaders, though there is no mention of ray tracing. The back of the card features DVI-D, HDMI, DisplayPort, and the VirtualLink connector which was missing from some custom RTX-series cards. Check out the link for more models from third-party vendors.
"Featuring the same Turing GPU architecture, the new GeForce GTX graphics cards will exclude Ray Tracing but feature faster shading performance through the enhanced GPU design while utilizing the 12nm process node."
Here is some more Tech News from around the web:
- AMD's 7nm Navi GPUs reportedly delayed until October @ The Inquirer
- Apple supply chain: TSMC to remain sole iPhone chip supplier @ DigiTimes
- Apple sued because two-factor authentication is inconvenient @ The Inquirer
- LibreOffice 6.2 is here: Running up a Tab at the NotebookBar? You can turn it all off if you want @ The Register
- New Part Day: Mapping With RealSense Cameras For $200 @ Hackaday
- Leaky child-tracking smartwatch maker hits back at bad PR @ The Register
- Amazon launches its own cheesy teleshopping channel @ The Inquirer
- 10Gtek X550-T1 10G Ethernet Converged Network Adapter Review @ NikKTech
Subject: Graphics Cards | February 5, 2019 - 11:42 PM | Scott Michaud
Tagged: rtx, nvidia, Futuremark, DLSS, 3dmark
If you have an RTX-based graphics card, then you can now enable Deep Learning Super Sampling (DLSS) on 3DMark’s Port Royal benchmark. NVIDIA has also published a video of the benchmark running at 1440p alongside Temporal Anti-Aliasing (TAA).
Two things stand out about the video: Quality and Performance.
On the quality side: holy crap it looks good. One of the major issues with TAA is that it makes everything that’s moving somewhat blurry and/or otherwise messed up. For DLSS? It’s very clear and sharp, even in motion. It is very impressive. It also seems to behave well when there are big gaps in rendered light intensity, which, in my experience, can be a problem for antialiasing.
On the performance side, DLSS was shown to be significantly faster than TAA – seemingly larger than the gap between TAA and no anti-aliasing at all. The gap is because DLSS renders at a lower resolution automatically, and this behavior is published on NVIDIA’s website. (Ctrl+F for “to reduce the game’s internal rendering resolution”.)
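The performance gap follows directly from pixel counts. NVIDIA has not published the exact internal resolution DLSS uses, so the 2/3 per-axis scale below is a purely hypothetical number chosen to illustrate the math:

```python
def internal_resolution(width, height, axis_scale):
    """Internal render-target size for a given per-axis scale factor."""
    return round(width * axis_scale), round(height * axis_scale)

def shading_work_ratio(axis_scale):
    """Fraction of native-resolution pixels that are actually shaded."""
    return axis_scale ** 2

# Hypothetical example: 2560x1440 output with a 2/3 per-axis internal scale
print(internal_resolution(2560, 1440, 2 / 3))  # (1707, 960)
print(shading_work_ratio(2 / 3))  # ~0.444: over half the shading cost skipped
```

Since shading cost scales with the square of the per-axis factor, even a modest drop in internal resolution can beat TAA handily, which matches the gap seen in the video.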
Update on Feb 6th @ 12:36pm EST:
Apparently there's another mode, called DLSS 2X, that renders at native resolution. It won't have the performance boost over TAA, but it should have slightly higher rendering quality. I'm guessing it will be especially noticeable in the following situation.
End of Update.
While NVIDIA claims that it shouldn’t cause noticeable image degradation, I believe I can see an example (in the video and their official screenshots) where the reduced resolution causes artifacts. If you look at the smoothly curving surfaces on the ring under the ship (as the camera zooms in just after 59s) you might be able to see a slight horizontal jaggedness or almost Moiré-like effect. While I’m not 100% sure that it’s caused by the forced dip in resolution, it doesn’t seem to appear in the TAA version. If this is an artifact of the lowered resolution, I’m curious whether NVIDIA will allow us to run at native resolution and still perform DLSS, or if the algorithm simply doesn’t operate that way.
NVIDIA's Side-by-Side Sample with TAA
NVIDIA's Side-by-Side Sample with DLSS
DLSS with artifacts pointed out
Image Credit: NVIDIA and Futuremark. Source.
That said, the image quality of DLSS is significantly above TAA. It’s painful watching an object move smoothly on a deferred rendering setup and seeing TAA freak out just a little to where it’s noticeable… but not enough to justify going back to a forward-rendering system with MSAA.
Subject: Graphics Cards | February 4, 2019 - 02:14 PM | Jeremy Hellstrom
Tagged: 418.81 WHQL, geforce, nvidia, driver
NVIDIA's newest WHQL driver better supports 3DMark Port Royal and prepares for the launch of RTX laptops from a wide variety of manufacturers, for those who love to game on the go.
In addition to improved benchmark runs, you will also get the following.
Added or updated the following SLI profiles:
Subject: Graphics Cards | February 1, 2019 - 05:29 PM | Jeremy Hellstrom
Tagged: RTX 2060, msi, RTX 2060 Gaming Z, nvidia
MSI's RTX 2060 GAMING Z 6GB will cost you a bit more than the reference edition; expect to see it eventually settle at $390. However, everything from the PCB to the cooler has been customized, and the Boost clock is an impressive 1830 MHz. [H]ard|OCP fired up MSI Afterburner and pushed that Boost to 1880 MHz, as well as raising the 6GB of VRAM from an effective 14 Gbps to 15.6 Gbps. If you are looking for a decent gaming experience at 1440p, this card will suit you better than a GTX 1070 Ti.
"We’ve got a fast factory overclocked MSI GeForce RTX 2060 GAMING Z video card to review today. We’ll take it through its paces in many games, and find out how it performs, including overclocking performance with the competition. Does the RTX 2060 deliver better performance at a lower price compared to the last generation?"
Here are some more Graphics Card articles from around the web:
- Is 6GB VRAM Enough for 1440p Gaming? Testing Usage with Nvidia's RTX 2060 @ Techspot
- ASUS GeForce RTX 2060 STRIX OC @ Guru of 3D
- Overclocking Showdown – the RX Vega 64 vs. the RTX 2070 @ BabelTechReviews
- MSI GeForce RTX 2080 Ti Lightning Z 11 GB @ TechPowerUp
Subject: General Tech | January 29, 2019 - 02:28 PM | Jeremy Hellstrom
Tagged: amd, nvidia, TSMC
In case you have yet to hear, TSMC's production line is suffering after ingesting some sub-par chemicals, which "caused wafers to have lower yield". It was originally reported that the 16nm and 14nm process nodes were affected, which are used for NVIDIA and MediaTek GPUs as well as AMD's Xbox One X and PS4 APUs.
The Inquirer followed up with TSMC who stated the initial reports were incorrect and that it is roughly 10,000 wafers on the 12nm and 16nm nodes at Fab 14B in southern Taiwan which received the bad batch, nodes used by Huawei, MediaTek, and NVIDIA but not AMD.
TSMC still expects to meet market demand, although the company announced that Q1 2019 revenue is expected to decline by 22% from last year. Hopefully this is not the start of another problematic year for TSMC, who had to deal with a WannaCry infection last summer.
AMD, with their focus on the 7nm node, might have a bit of an opportunity if this does cause any temporary shortages of NVIDIA GPUs on the market.
Exploring 2560x1440 Results
In part one of our review of the NVIDIA GeForce RTX 2060 graphics card we looked at gaming performance using only 1920x1080 and 3840x2160 results. While UHD is the current standard for consumer televisions (and an easy way to ensure GPU-bound performance), more than twice as many gamers play on a 2560x1440 display as on 3840x2160 (3.89% vs. 1.42%), according to Steam hardware survey results.
Adding these 1440p results was planned from the beginning, but time constraints made testing at three resolutions before getting on a plane for CES impossible (though in retrospect UHD should have been the one excluded from part one, and in the future I'll approach it that way). Regardless, we now have those 1440p results to share, having concluded testing using the same list of games and synthetic benchmarks we saw in the previous installment.
On to the benchmarks!
**PC Perspective GPU Test Platform**

| Component | Configuration |
|---|---|
| Processor | Intel Core i7-8700K |
| Motherboard | ASUS ROG STRIX Z370-H Gaming |
| Memory | Corsair Vengeance LED 16GB (8GBx2) DDR4-3000 |
| Storage | Samsung 850 EVO 1TB |
| Power Supply | CORSAIR RM1000x 1000W |
| Operating System | Windows 10 64-bit (Version 1803) |
| Drivers | NVIDIA: 417.54, 417.71 (OC Results) |
We will begin with Unigine Superposition, which was run with the high preset settings.
Here we see the RTX 2060 with slightly higher performance than the GTX 1070 Ti, right in the middle of GTX 1070 and GTX 1080 performance levels. As expected so far.
Subject: General Tech | January 22, 2019 - 01:30 PM | Jeremy Hellstrom
Tagged: amd, nvidia, leak, linux, 1660 ti, radeon vii
Once again we have an interesting leak from TUM_APISAK, this time about an upcoming NVIDIA product. The performance of the GTX 1660 Ti may or may not match the benchmark below, but if it does we may finally be seeing a new mid-range Turing GPU from NVIDIA. The GTX naming scheme is worth noting, as it implies this card will not feature the ray tracing or other enhancements brought by the RTX family, and the strange new numbering system implies we might see more models like it. That omission may help drive the price down, which would give people a chance to pick up something noticeably faster than a GTX 1060.
If you are more interested in verifiable news, The Inquirer also offers some this morning, with confirmation of Linux support for AMD's new GPUs right from the start. This is something we haven't really seen from AMD in the past, when enthusiasts worked in the dark to tweak existing open source drivers to power AMD cards. Over the past few years AMD has been more forthcoming with information that helps driver development and has been more successful at releasing drivers of their own. It is great news that the new Radeon VII family will be conversant in Linux from day one; we will keep an eye out for comparative performance once the cards launch.
"The leaked benchmarks come courtesy serial leaker APISAK, which posted a screenshot of the Ashes of Singularity benchmark showing a GPU called the GeForce GTX 1660 Ti."
Here is some more Tech News from around the web:
- Microsoft wants your ideas for better gaming in Windows 10 @ The Inquirer
- New Phobos Ransomware Exploits Weak Security To Hit Targets Around the World @ Slashdot
- 5G moving out of lab for official kickoff in 2019, says MediaTek chair @ DigiTimes
- New Part Day: Small, Cheap, and Good LIDAR Modules @ Hackaday
Subject: Displays | January 21, 2019 - 05:37 PM | Sebastian Peak
Tagged: vrr, variable refresh rate, rtings, nvidia, monitor, g-sync compatible, g-sync, freesync, display, amd
The staff of Rtings has embarked upon their own in-house testing of G-SYNC compatibility with FreeSync monitors (introduced with GeForce driver 417.71), and have released a video to introduce this new project:
While their choice of NVIDIA's Pendulum demo might be up for debate (since let's face it, any time NVIDIA anything is used to test, well, anything, there will always be a conspiracy theory) they have made some noteworthy observations about their experience versus an AMD RX 580 with the same monitors. Still, as they point out in the article, "This test is by no means exhaustive, and your results may vary depending on the specific games you are playing, and your specific graphics card."
"We test FreeSync on a custom built PC, with an NVIDIA GTX 1060 6GB. Each monitor is connected via DisplayPort, as NVIDIA's FreeSync implementation does not currently work over HDMI. We use NVIDIA's Pendulum G-SYNC demo to test for tearing, stuttering, screen blanking, and other artifacts. We start at the monitor's standard refresh rate, and gradually decrease the sliders until we could see any issues. From there, we gradually increase the sliders until we start seeing tearing or other issues. The results of both of these tests give us the effective variable refresh rate range. We repeat the test at least twice to confirm our findings.
We use the results of this test to subjectively assign a result, based on how well the monitor supports NVIDIA's FreeSync implementation. The possible results are:
- Yes, NVIDIA Certified: This is reserved for monitors that are certified by NVIDIA as being compatible with NVIDIA FreeSync.
- Yes, Native: This is used to differentiate between monitors that support NVIDIA G-SYNC, instead of NVIDIA FreeSync.
- Yes: These monitors are confirmed by us to support FreeSync with no major issues, but are not certified by NVIDIA.
- Partial: These monitors at least partially support FreeSync, but we experienced some issues during testing. See the review for details of these issues.
- No: These monitors either do not support FreeSync at all, or are unusable with FreeSync enabled."
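The slider sweep Rtings describes above can be sketched in a few lines. Here `has_issue` stands in for the human judgment call in their procedure, and the 48 Hz floor in the example panel is a hypothetical value, not a result from their testing:

```python
def find_vrr_range(max_refresh, has_issue, step=1):
    """Sweep the refresh slider down from the panel's maximum.

    `has_issue(hz)` returns True once tearing, stuttering, blanking,
    or other artifacts become visible at `hz`. Returns the effective
    (minimum, maximum) variable refresh rate range.
    """
    lowest_good = max_refresh
    hz = max_refresh
    while hz > 0 and not has_issue(hz):
        lowest_good = hz
        hz -= step
    return (lowest_good, max_refresh)

# Hypothetical 144 Hz panel whose VRR holds together down to 48 Hz
print(find_vrr_range(144, lambda hz: hz < 48))  # (48, 144)
```

Rtings also sweeps back upward to confirm where issues reappear and repeats the test at least twice; a single downward pass is kept here for brevity.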
There are currently 25 test results available to help out with your variable refresh-rate monitor selections for use on NVIDIA hardware.