Subject: Graphics Cards | January 18, 2019 - 03:56 PM | Jeremy Hellstrom
Tagged: RTX 2060 Gaming Z, RTX 2060, nvidia, msi
At first glance, the MSI RTX 2060 Gaming Z is very similar to the Founders Edition, with only the 1830 MHz boost clock justifying its MSRP, which is $35 higher. Once TechPowerUp got into testing, they found that the custom cooler helps the card maintain peak speed far more effectively than the FE. This also let them hit an impressive manual overclock of a 2055 MHz boost clock and 1990 MHz on the memory. Check out the results, as well as a complete teardown of the card, in the review.
"MSI's GeForce RTX 2060 Gaming Z is the best RTX 2060 custom-design we've reviewed so far. It comes with idle-fan-stop and a large triple-slot cooler that runs cooler than the Founders Edition. Noise levels are excellent, too, it's the quietest RTX 2060 card to date."
Here are some more Graphics Card articles from around the web:
- MSI RTX 2060 Gaming Z 6G @ Kitguru
- Nvidia GeForce RTX 2060 Founders Edition – Nvidia Unleashes RTX on The $350 Market @ Bjorn3d
- Using FreeSync with GeForce GPUs: How Well Does It Work? @ TechSpot
- Adrenalin Software Edition 19.1.1 Driver Performance Analysis @ BabelTechReviews
- Radeon RX 570 vs. RX 580 vs. GeForce GTX 1060 3GB vs. GTX 1060 6GB @ TechSpot
Subject: Graphics Cards | January 16, 2019 - 04:33 PM | Jeremy Hellstrom
Tagged: linux, geforce, nvidia, ubuntu 18.04, gtx 760, gtx 960, RTX 2060, gtx 1060
If you are running an Ubuntu system with an older GPU and are curious about upgrading but unsure if it is worth it, Phoronix has a great review for you. Whether you are gaming with OpenGL and Vulkan, or curious about the changes in OpenCL/CUDA compute performance, they have you covered. They even delve into the power efficiency numbers so you can spec out the operating costs of a large deployment, if you happen to have the budget to consider buying RTX 2060s in bulk.
"In this article is a side-by-side performance comparison of the GeForce RTX 2060 up against the GTX 1060 Pascal, GTX 960 Maxwell, and GTX 760 Kepler graphics cards."
Here are some more Graphics Card articles from around the web:
- MSI GeForce RTX 2060 Gaming Z (6G) @ Guru of 3D
- Palit GeForce RTX 2060 Gaming Pro OC 6 GB @ TechPowerUp
- EVGA GeForce RTX 2060 XC Ultra 6 GB @ TechPowerUp
Subject: Graphics Cards | January 15, 2019 - 03:25 AM | Jim Tanous
Tagged: variable refresh rate, nvidia, graphics driver, gpu, geforce, g-sync compatibility, g-sync, freesync
One of NVIDIA's biggest and most surprising CES announcements was the introduction of support for "G-SYNC Compatible Monitors," allowing the company's G-SYNC-capable Pascal and Turing-based graphics cards to work with FreeSync and other non-G-SYNC variable refresh rate displays. NVIDIA is initially certifying 12 FreeSync monitors but will allow users of any VRR display to manually enable G-SYNC and determine for themselves if the quality of the experience is acceptable.
Those eager to try the feature can now do so via NVIDIA's latest driver, version 417.71, which is rolling out worldwide right now. As of the date of this article's publication, users in the United States who visit NVIDIA's driver download page are still seeing the previous driver (417.35), but direct download links are already up and running.
The current list of FreeSync monitors that are certified by NVIDIA:
- Acer XFA240
- Acer XG270HU
- Acer XV273K
- Acer XZ321Q
- AOC Agon AG241QG4
- AOC G2590FX
- ASUS MG278Q
- ASUS XG248
- ASUS VG258Q
- ASUS XG258
- ASUS VG278Q
- BenQ XL2740
Users with a certified G-SYNC compatible monitor will have G-SYNC automatically enabled via the NVIDIA Control Panel when the driver is updated and the display is connected, the same process as connecting an official G-SYNC display. Those with a variable refresh rate display that is not certified must manually open the NVIDIA Control Panel and enable G-SYNC.
NVIDIA notes, however, that enabling the feature on displays that don't meet the company's certification requirements may lead to a range of issues, from blurring and stuttering to flickering and blanking. The type and severity of these issues will vary by display, so users can determine for themselves whether the potential problems are acceptable.
Update: Users over at the NVIDIA subreddit have created a public Google Sheet to track their reports and experiences with various FreeSync monitors. Check it out to see how others are faring with your preferred monitor.
Subject: General Tech | January 10, 2019 - 12:37 PM | Jeremy Hellstrom
Tagged: nvidia, Intel, ces 2019, amd
The Tech Report just posted a nice assortment of updates covering their travels at CES, including their take on AMD's new GPU and processors. They also took a look at Intel's offerings, and not just the fresh splash of seemingly bottomless Coffee, this time without the picture drawn on the top. Far more interesting was the lineup of 10nm chips announced: Lakefield for low-power applications, Ice Lake for mainstream chips, and Snow Ridge, an SoC designed for network applications. Of course, it wouldn't be an Intel briefing without Optane; the newly announced H10 series sports both QLC 3D NAND and 3D XPoint on an M.2 2280 gumstick. It has a controller for both types of memory, which means the bulk of the heavy lifting will be done onboard rather than pushed onto your CPU.
"That title probably rests on the shoulders of four upcoming Intel products based on the company's beleaguered 10-nm fabrication process: the Lakefield low-power client processors, the Snow Ridge network SoC, and Ice Lake chips for every market segment."
Here is some more Tech News from around the web:
- Just updated Windows 7? Can't access network shares? It isn't just you @ The Register
- AMD Keynote for CES 2019 – Radeon VII “Vega on 7nm”, Mobile Graphics and an Interesting Zen 2 Benchmark @ Bjorn3d
- Steamer closets, flying cars, robot boxers, smart-mock-cock ban hypocrisy – yes, it's the worst of CES this year @ The Register
- A sampling of networking gear from CES: TP-Link goes Wi-Fi 6, D-Link goes 5G @ Ars Technica
- Lexar reveals the first 1TB SD card you can actually buy @ The Register
- Steam bug sees acclaimed indie games flagged as 'fake' @ The Inquirer
- Don't Expect A New Nvidia Shield Tablet Anytime Soon @ Slashdot
Subject: Systems, Shows and Expos | January 7, 2019 - 08:00 PM | Scott Michaud
Tagged: nvidia, Lenovo, Legion, Intel, geforce, gaming laptop, ces 2019, CES
Three new laptops have been added to Lenovo’s portfolio under their “Legion” gaming brand. All three of them will contain “Unannounced NVIDIA GeForce GPUs”.
The Lenovo Legion Y740 comes in two sizes: 15-inch and 17-inch. Based on the slide deck, both models have the choice between the Intel Core i5-8300H and the Intel Core i7-8750H. The Core i5-8300H is a quad-core CPU with HyperThreading (eight threads) that can turbo up to 4 GHz. The Core i7-8750H is a six-core CPU with HyperThreading (twelve threads) that can turbo up to 4.1 GHz. This can be paired with 8, 16, or 32GB of RAM at 2666MHz, or “8GB + 8GB 3200MHz Corsair Overclocked Memory”.
As for storage, both models can have up to a 512GB PCIe SSD, a 512GB SATA SSD, or 2TB of spinning metal. The 17-inch model can also have an Intel Optane drive added to it, although they don't list a specific size. Both models have one USB-C connector with support for Thunderbolt, DisplayPort, and USB 3.1. Alongside the USB-C are HDMI, LAN, three standard USB 3.1 Gen 2 ports, and a mini-DisplayPort connector. They also have an RGB keyboard, which, from the picture, appears to be tenkeyless. Both have Dolby sound, but only the 17-inch model also has a subwoofer. They do not list an audio jack, although I see a hole on the left side that could be either audio or a power plug. I think I also see power on the back, so I assume that the one on the side is audio. Mobile phones are one thing, but a laptop had better have a headphone jack.
The built-in displays are 1080p, which is a good resolution for a laptop, and support 144 Hz G-Sync at 300 nits. There is also an upsell to a 500-nit panel that has been certified for Dolby HDR400. They don't say whether the upsell panel also supports 144 Hz G-Sync, but I would assume that it does. Check before you buy, though.
Both sizes will be available in February 2019. The 15-inch starts at $1749.99 USD and the 17-inch starts at $1979.99 USD.
The third model is the Lenovo Legion Y540. This one will be available a little bit later – May 2019. Interestingly, the CPU is listed as “Intel Core processors”. As such, I would assume that this laptop will use a new, unannounced processor alongside the unannounced GeForce GPU. Lenovo does mention that the laptop can be paired with up to 32GB of RAM at 2666MHz.
The battery is listed as "52.5Wh & 57Wh (Configuration dependent)". Since an extra 4.5Wh seems like a tough upsell, I am guessing that the battery you receive will be tied to the chosen display, but Lenovo doesn't say, so I don't know. It looks like there will be a choice between three displays: a 60Hz 1080p IPS panel at 250 nits with "45%" color, a 60Hz 1080p IPS panel at 300 nits with "72%" color, and a 144Hz IPS panel at 300 nits with "72%" color. I put each of the color space percentages in quotation marks because they don't list which color space. Since one of them is an HDR panel, I'm going to assume that they don't mean sRGB… because that would be awful. I am hoping that they are referring to the DCI-P3 color space. They could mean NTSC, although that would be a bit low for an HDR panel.
The laptop has a USB-C port but, unlike the Y740, it can only be used for USB 3.1. There are also three standard USB 3.1 ports, one HDMI port, one mini-DisplayPort, an Ethernet jack, and a 3.5mm audio jack, so you can still attach external monitors to it without the USB-C. The keyboard is backlit, but not RGB – just white.
As mentioned, the Lenovo Legion Y540 will be available in May 2019. It will start at $929.99 USD.
Subject: Displays, Shows and Expos | January 7, 2019 - 08:00 PM | Scott Michaud
Tagged: nvidia, Lenovo, g-sync, freesync 2, display, ces 2019, CES, amd
Lenovo has added two monitors to their Legion line of gaming devices.
The Lenovo Legion Y44w is a 43.4” gaming display. Most of that size is horizontal, however, because it has a 32:10 aspect ratio. If you have ever used a 1920x1200 monitor, the PC equivalent of 1080p from the Windows Vista era (when PC manufacturers believed 16:9 was too wide and settled on 16:10), then you should imagine two of them side-by-side in a single monitor. In fact, the Y44w supports two separate video inputs if you wish to split the monitor down the middle into two side-by-side 1920x1200 displays. It can also operate as a single 3840x1200 display, of course. This resolution is a little over half of a 4K panel, so it should be easier for second-tier GPUs to feed.
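The "a little over half of a 4K panel" claim checks out with some quick arithmetic (a back-of-the-envelope sketch, not a figure from Lenovo's materials):

```python
# Pixel count of the Y44w's native 3840x1200 surface versus a 4K UHD panel.
y44w_pixels = 3840 * 1200   # two 1920x1200 panels side-by-side
uhd_pixels  = 3840 * 2160   # 4K UHD

print(f"Y44w:   {y44w_pixels:,} px")              # 4,608,000
print(f"4K UHD: {uhd_pixels:,} px")               # 8,294,400
print(f"ratio:  {y44w_pixels / uhd_pixels:.1%}")  # ~55.6% of 4K
```

So a GPU driving the Y44w at native resolution pushes roughly 56% of the pixels a 4K display demands, which is why a second-tier card should cope.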
Beyond the resolution, the color gamut is listed as “99% sRGB, BT.709, DCI-P3” and it is certified as VESA HDR400. If the slide deck is correct and it can do 99% DCI-P3 at HDR400, then it should have an amazing picture. It can also do 144 Hz with FreeSync 2, so you do not need to compromise refresh rate to get those beautiful colors. There is also an optional Harman Kardon speaker that can be attached to the display.
The Lenovo Legion Y44w will be available in April 2019 for $1199.99 USD.
Lenovo also announced the Legion Y27gq gaming monitor. This one is a standard 16:9, 1440p, TN panel that can be driven up to 240 Hz. It supports G-Sync, but not HDR. Despite not supporting HDR, it still covers 90% of DCI-P3, which is quite wide for a TN panel. Lenovo is listing it as an “eSport gaming monitor”… so you can probably guess that high refresh rate and G-Sync are the focus.
If you gotta go fast, then the Lenovo Legion Y27gq is available in April 2019 for $999.99 USD.
Subject: Graphics Cards | January 7, 2019 - 04:34 PM | Jeremy Hellstrom
Tagged: video card, turing, tu106, RTX 2060, rtx, nvidia, graphics card, gpu, gddr6, gaming
After months of rumours and guesses as to what the RTX 2060 would actually offer, we finally know. It is built on the same TU106 the RTX 2070 uses and sports somewhat similar core clocks, though the reduction in Tensor cores, ROPs, and texture units drops it to a mere 5 Giga Rays. The memory is rather different, with 6GB of GDDR6 connected via a 192-bit bus offering 336.1 GB/s of bandwidth. As you saw in Sebastian's testing, the overall performance is better than you would expect from a mid-range card, but at the cost of a higher price.
If we missed out on your favourite game, check the Guru of 3D's suite of benchmarks or one of the others below.
"NVIDIA today announced the GeForce RTX 2060, the graphics card will be unleashed next week the 15th at a sales price of 349 USD / 359 EUR. Today, however, we can already bring you a full review of what is a pretty feisty little graphics card really."
Here are some more Graphics Card articles from around the web:
- NVIDIA GeForce RTX 2060 FE Review @ Legit Reviews
- RTX 2060 Review with 39 games @ BabelTechReviews
- NVIDIA Geforce RTX 2060 Founders Edition Review @ OCC
- Nvidia RTX 2060 Founders Edition 6GB @ Kitguru
- Battlefield V NVIDIA Ray Tracing RTX 2080 @ [H]ard|OCP
- The GPU Compute Performance From The NVIDIA GeForce GTX 680 To TITAN RTX @ Phoronix
- The HD 7970 vs. the GTX 680 – revisited after 7 years @ BabelTechReviews
We have to go all the way back to 2015 for NVIDIA's previous graphics card announcement at CES, with the GeForce GTX 960 revealed during the show four years ago. And coming on the heels of this announcement today we have the latest “mid-range” offering in the tradition of the GeForce x60 (or x060) cards, the RTX 2060. This launch comes as no surprise to those of us following the PC industry, as various rumors and leaks preceded the announcement by weeks and even months, but such is the reality of the modern supply chain process (sadly, few things are ever really a surprise anymore).
But there is still plenty of new information available with the official launch of this new GPU, not the least of which is the opportunity to look at independent benchmark results to find out what to expect with this new GPU relative to the market. To this end we had the opportunity to get our hands on the card before the official launch, testing the RTX 2060 in several games as well as a couple of synthetic benchmarks. The story is just beginning, and as time permits a "part two" of the RTX 2060 review will be offered to supplement this initial look, addressing omissions and adding further analysis of the data collected thus far.
Before getting into the design and our initial performance impressions of the card, let's look into the specifications of this new RTX 2060, and see how it relates to the rest of the RTX family from NVIDIA. We are taking a high level look at specs here, so for a deep dive into the RTX series you can check out our previous exploration of the Turing Architecture here.
"Based on a modified version of the Turing TU106 GPU used in the GeForce RTX 2070, the GeForce RTX 2060 brings the GeForce RTX architecture, including DLSS and ray-tracing, to the midrange GPU segment. It delivers excellent gaming performance on all modern games with the graphics settings cranked up. Priced at $349, the GeForce RTX 2060 is designed for 1080p gamers, and delivers an excellent gaming experience at 1440p."
| | RTX 2080 Ti | RTX 2080 | RTX 2070 | RTX 2060 | GTX 1080 | GTX 1070 |
| --- | --- | --- | --- | --- | --- | --- |
| Base Clock | 1350 MHz | 1515 MHz | 1410 MHz | 1365 MHz | 1607 MHz | 1506 MHz |
| Boost Clock | 1545 MHz / 1635 MHz (FE) | 1710 MHz / 1800 MHz (FE) | 1620 MHz / 1710 MHz (FE) | 1680 MHz | 1733 MHz | 1683 MHz |
| Ray Tracing Speed | 10 Giga Rays | 8 Giga Rays | 6 Giga Rays | 5 Giga Rays | -- | -- |
| Memory Clock | 14000 MHz | 14000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 8000 MHz |
| Memory Interface | 352-bit GDDR6 | 256-bit GDDR6 | 256-bit GDDR6 | 192-bit GDDR6 | 256-bit GDDR5X | 256-bit GDDR5 |
| Memory Bandwidth | 616 GB/s | 448 GB/s | 448 GB/s | 336.1 GB/s | 320 GB/s | 256 GB/s |
| TDP | 250 W / 260 W (FE) | 215 W / 225 W (FE) | 175 W / 185 W (FE) | 160 W | 180 W | 150 W |
| MSRP (current) | $1200 (FE) / $999 | $799 (FE) / $699 | $599 (FE) / $499 | $349 | $549 | $379 |
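The memory bandwidth figures follow directly from the effective memory data rate and the bus width; a quick sanity check (the formula is standard, and the per-card numbers are the ones from the spec table):

```python
# Peak memory bandwidth = effective data rate (Gbps per pin) * bus width (bits) / 8.
# E.g. the RTX 2060's 14 Gbps GDDR6 on a 192-bit bus: 14 * 192 / 8 = 336 GB/s,
# matching the table's listed 336.1 GB/s.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RTX 2080 Ti": (14, 352),  # GDDR6
    "RTX 2080":    (14, 256),  # GDDR6
    "RTX 2070":    (14, 256),  # GDDR6
    "RTX 2060":    (14, 192),  # GDDR6
    "GTX 1080":    (10, 256),  # GDDR5X
    "GTX 1070":    (8,  256),  # GDDR5
}

for name, (rate, width) in cards.items():
    print(f"{name}: {bandwidth_gb_s(rate, width):.0f} GB/s")
```

The narrower 192-bit bus is the RTX 2060's main memory compromise: at the same 14 Gbps data rate as the RTX 2070, it gives up a quarter of the bandwidth.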
Subject: General Tech | January 7, 2019 - 01:47 PM | Jeremy Hellstrom
Tagged: nvidia, g-sync, freesync, benq, asus, AOC, amd, adaptive sync, acer
G-SYNC is showing some signs of defeat, as today NVIDIA announced that several Adaptive Sync monitors have been tested and rated as G-SYNC compatible. Adaptive Sync is the official VESA technology present in AMD's FreeSync monitors, and it offers a definite financial advantage over NVIDIA's G-SYNC, as the module required for G-SYNC can add hundreds of dollars to a monitor's price.
So far only a dozen monitors out of around 400 tested have been rated as G-SYNC compatible, so don't expect to be mixing and matching your monitors quite yet, but it does imply that in some cases the extra controller is not required for variable refresh rates with either NVIDIA's or AMD's GPUs. The results of this test give AMD bragging rights for implementing adaptive sync in the more attractive way, but the change could hurt AMD's GPU sales, as users can now opt for a GeForce card paired with a FreeSync display.
Even if your display is not listed in those models, you can try enabling adaptive sync over DisplayPort and see if it works, though your results may vary. Ars Technica lists the models here.
"Besides being unexpected good news for gamers who already own one of these FreeSync monitors, this is also great news for gamers that want to add VRR to their Nvidia graphics card setup without breaking the bank."
Here is some more Tech News from around the web:
- Marriott: Good news. Hackers only took 383 million booking records ... and 5.3m unencrypted passport numbers @ The Register
- Asus ZenBook S13 brings the display notch to laptops @ The Inquirer
- New side-channel leak: Boffins bash operating system page caches until they spill secrets @ The Register
- Vinyl and Cassette Sales Continued To Grow Last Year @ Slashdot
- 2018 review and 2019 outlook: Sharp price falls to boost NAND flash penetration @ DigiTimes
- Controlling Non-Googley Devices With Google Assistant @ Hackaday
- Huawei's 7nm Kunpeng 920 is 'industry's fastest' ARM-based processor @ The Inquirer
- The Ultimate Guide to Buying a Used Graphics Card @ Techspot
- ThunderX3 UC5 HEX RGB Gaming Chair Review @ NikKTech
Subject: Graphics Cards | January 7, 2019 - 02:46 AM | Jim Tanous
Tagged: rtx mobile, RTX 2080, RTX 2070, RTX 2060, rtx, nvidia, max-q, gaming laptop, ces2019
NVIDIA just wrapped up its CES keynote, and in addition to the expected unveiling of the RTX 2060, the company announced new mobile GeForce RTX options. More than 40 upcoming laptops, including 17 sporting NVIDIA’s Max-Q design, will offer RTX 2080, RTX 2070, and RTX 2060 graphics options.
NVIDIA CEO Jensen Huang likened GeForce RTX-powered laptops to a gaming console platform, repeatedly drawing performance comparisons to traditional game consoles like the PlayStation 4.
Laptops are the fastest growing gaming platform — and just getting started. The world’s top OEMs are using Turing to bring next-generation console performance to thin, sleek laptops that gamers can take anywhere. Hundreds of millions of people worldwide — an entire generation — are growing up gaming. I can’t wait for them to experience this new wave of laptops.
New GeForce RTX laptops will continue to support features like WhisperMode, which paces frame rates for AC-connected laptops to reduce heat and therefore fan noise, NVIDIA Battery Boost, which uses GeForce Experience to optimize performance for longer battery life, and of course G-SYNC.
Beyond gaming, NVIDIA is touting the benefits of the RTX platform for content creators, such as real-time video encoding for live streamers, faster rendering for video editors, and accurate interactive lighting, reflections, and shadows for animators.
Laptops sporting GeForce RTX cards will be available starting January 29th from NVIDIA partners including Acer, Alienware, ASUS, Dell, Gigabyte, HP, Lenovo, MSI, Razer, and Samsung. Pricing, detailed configuration options, and exact availability will vary and is not yet available for all manufacturers.