Subject: Graphics Cards | January 15, 2019 - 03:25 AM | Jim Tanous
Tagged: variable refresh rate, nvidia, graphics driver, gpu, geforce, g-sync compatibility, g-sync, freesync
One of NVIDIA's biggest and most surprising CES announcements was the introduction of support for "G-SYNC Compatible Monitors," allowing the company's G-SYNC-capable Pascal and Turing-based graphics cards to work with FreeSync and other non-G-SYNC variable refresh rate displays. NVIDIA is initially certifying 12 FreeSync monitors but will allow users of any VRR display to manually enable G-SYNC and determine for themselves if the quality of the experience is acceptable.
Those eager to try the feature can now do so via NVIDIA's latest driver, version 417.71, which is rolling out worldwide right now. As of the date of this article's publication, users in the United States who visit NVIDIA's driver download page are still seeing the previous driver (417.35), but direct download links are already up and running.
The current list of FreeSync monitors that are certified by NVIDIA:
- Acer XFA240
- Acer XG270HU
- Acer XV273K
- Acer XZ321Q
- AOC Agon AG241QG4
- AOC G2590FX
- ASUS MG278Q
- ASUS XG248
- ASUS VG258Q
- ASUS XG258
- ASUS VG278Q
- BenQ XL2740
Users with a certified G-SYNC compatible monitor will have G-SYNC automatically enabled via the NVIDIA Control Panel when the driver is updated and the display is connected, the same process as connecting an official G-SYNC display. Those with a variable refresh rate display that is not certified must manually open the NVIDIA Control Panel and enable G-SYNC.
NVIDIA notes, however, that enabling the feature on displays that have not passed the company's certification testing may lead to a range of issues, from blurring and stuttering to flickering and blanking. The good news is that the type and severity of these issues vary by display, so users can judge for themselves whether any problems are acceptable.
Update: Users over at the NVIDIA subreddit have created a public Google Sheet to track their reports and experiences with various FreeSync monitors. Check it out to see how others are faring with your preferred monitor.
Subject: Graphics Cards | January 12, 2019 - 08:17 AM | Jim Tanous
Tagged: vega 64, Vega, RX VEGA 64, radeon vii, gpu, benchmarks, amd, 7nm
After announcing the Radeon VII this week at CES, AMD has quietly released its own internal benchmarks showing how the upcoming card potentially compares to the Radeon RX Vega 64, AMD's current flagship desktop GPU released in August 2017.
The internal benchmarks, compiled by AMD Performance Labs earlier this month, were released as a footnote in AMD's official Radeon VII press release and first noticed by HardOCP. AMD tested 25 games and 4 media creation applications, with the Radeon VII averaging around a 29 percent improvement in games and 36 percent improvement in professional apps.
AMD's test platform for its gaming Radeon VII benchmarks was an Intel Core i7-7700K with 16GB of DDR4 memory clocked at 3000MHz running Windows 10 with AMD Driver version 18.50. CPU frequencies and exact Windows 10 version were not disclosed. AMD states that all games were run at "4K max settings" with reported frame rate results based on the average of three separate runs each.
For games, the Radeon VII benchmarks show a wide performance delta compared to RX Vega 64, from as little as 7.5 percent in Hitman 2 to as much as 68.4 percent in Fallout 76. Below is a chart created by PC Perspective from AMD's data, showing the frame rate results from all 25 games.
In terms of media creation applications, AMD changed its testing platform to the Ryzen 7 2700X, also paired with 16GB of DDR4 at 3000MHz. Again, exact processor frequencies and other details were not disclosed. The results reveal between a 27% and 62% improvement:
It is important to reiterate that the data presented in the above charts is from AMD's own internal testing, and should therefore be viewed skeptically until third party Radeon VII benchmarks are available. However, these benchmarks do provide an interesting first look at potential Radeon VII performance compared to its predecessor.
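The uplift figures in charts like these are simple relative deltas between the two cards' frame rates. A minimal sketch of the arithmetic, using made-up frame-rate numbers (not AMD's actual per-game data):

```python
# Hypothetical frame rates (Vega 64, Radeon VII) used only to illustrate
# how per-game percent-uplift figures and their average are derived.
def percent_uplift(new_fps: float, old_fps: float) -> float:
    """Relative improvement of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

games = {
    "Game A": (43.0, 55.5),   # made-up values: ~29% uplift
    "Game B": (60.0, 64.5),   # made-up values: 7.5% uplift
}
uplifts = {name: percent_uplift(vii, vega) for name, (vega, vii) in games.items()}
average = sum(uplifts.values()) / len(uplifts)
```

The headline "around 29 percent" figure is just this per-game uplift averaged across all 25 titles.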
Radeon VII is scheduled to launch February 7, 2019 with an MSRP of $699. In addition to the reference design showcased at CES, AMD has confirmed that third party Radeon VII boards will be available from the company's GPU partners.
Subject: Graphics Cards | January 7, 2019 - 04:34 PM | Jeremy Hellstrom
Tagged: video card, turing, tu106, RTX 2060, rtx, nvidia, graphics card, gpu, gddr6, gaming
After months of rumours and guesses as to what the RTX 2060 would actually offer, we finally know. It is built on the same TU106 GPU the RTX 2070 uses and sports broadly similar core clocks, though the reduction in Tensor Cores, ROPs, and texture units limits it to a mere 5 Giga Rays. The memory is rather different: 6GB of GDDR6 on a 192-bit bus, offering 336.1 GB/s of bandwidth. As you saw in Sebastian's testing, overall performance is better than you would expect from a mid-range card, but at the cost of a higher price.
If we missed out on your favourite game, check the Guru of 3D's suite of benchmarks or one of the others below.
"NVIDIA today announced the GeForce RTX 2060, the graphics card will be unleashed next week the 15th at a sales price of 349 USD / 359 EUR. Today, however, we can already bring you a full review of what is a pretty feisty little graphics card really."
Here are some more Graphics Card articles from around the web:
- NVIDIA GeForce RTX 2060 FE Review @ Legit Reviews
- RTX 2060 Review with 39 games @ BabelTechReviews
- NVIDIA Geforce RTX 2060 Founders Edition Review @ OCC
- Nvidia RTX 2060 Founders Edition 6GB @ Kitguru
- Battlefield V NVIDIA Ray Tracing RTX 2080 @ [H]ard|OCP
- The GPU Compute Performance From The NVIDIA GeForce GTX 680 To TITAN RTX @ Phoronix
- The HD 7970 vs. the GTX 680 – revisited after 7 years @ BabelTechReviews
We have to go all the way back to 2015 for NVIDIA's previous graphics card announcement at CES, with the GeForce GTX 960 revealed during the show four years ago. And coming on the heels of this announcement today we have the latest “mid-range” offering in the tradition of the GeForce x60 (or x060) cards, the RTX 2060. This launch comes as no surprise to those of us following the PC industry, as various rumors and leaks preceded the announcement by weeks and even months, but such is the reality of the modern supply chain process (sadly, few things are ever really a surprise anymore).
But there is still plenty of new information available with the official launch of this new GPU, not the least of which is the opportunity to look at independent benchmark results to find out what to expect with this new GPU relative to the market. To this end we had the opportunity to get our hands on the card before the official launch, testing the RTX 2060 in several games as well as a couple of synthetic benchmarks. The story is just beginning, and as time permits a "part two" of the RTX 2060 review will be offered to supplement this initial look, addressing omissions and adding further analysis of the data collected thus far.
Before getting into the design and our initial performance impressions of the card, let's look into the specifications of this new RTX 2060, and see how it relates to the rest of the RTX family from NVIDIA. We are taking a high level look at specs here, so for a deep dive into the RTX series you can check out our previous exploration of the Turing Architecture here.
"Based on a modified version of the Turing TU106 GPU used in the GeForce RTX 2070, the GeForce RTX 2060 brings the GeForce RTX architecture, including DLSS and ray-tracing, to the midrange GPU segment. It delivers excellent gaming performance on all modern games with the graphics settings cranked up. Priced at $349, the GeForce RTX 2060 is designed for 1080p gamers, and delivers an excellent gaming experience at 1440p."
| | RTX 2080 Ti | RTX 2080 | RTX 2070 | RTX 2060 | GTX 1080 | GTX 1070 |
|---|---|---|---|---|---|---|
| Base Clock | 1350 MHz | 1515 MHz | 1410 MHz | 1365 MHz | 1607 MHz | 1506 MHz |
| Boost Clock | 1545 MHz / 1635 MHz (FE) | 1710 MHz / 1800 MHz (FE) | 1620 MHz / 1710 MHz (FE) | 1680 MHz | 1733 MHz | 1683 MHz |
| Ray Tracing Speed | 10 Giga Rays | 8 Giga Rays | 6 Giga Rays | 5 Giga Rays | -- | -- |
| Memory Clock | 14000 MHz | 14000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 8000 MHz |
| Memory Interface | 352-bit GDDR6 | 256-bit GDDR6 | 256-bit GDDR6 | 192-bit GDDR6 | 256-bit GDDR5X | 256-bit GDDR5 |
| Memory Bandwidth | 616 GB/s | 448 GB/s | 448 GB/s | 336.1 GB/s | 320 GB/s | 256 GB/s |
| TDP | 250 W / 260 W (FE) | 215 W / 225 W (FE) | 175 W / 185 W (FE) | 160 W | 180 W | 150 W |
| MSRP (current) | $1200 (FE) / $999 | $799 (FE) / $699 | $599 (FE) / $499 | $349 | $549 | $379 |
Subject: Graphics Cards | January 7, 2019 - 01:59 AM | Sebastian Peak
Tagged: video card, RTX 2060, rtx, ray tracing, nvidia, graphics, gpu, geforce, ces 2019, CES
On stage at an event tonight at CES 2019, NVIDIA CEO Jensen Huang made it official: the RTX 2060 exists and will be available this month. The card is priced at $349, and is based on the same Turing architecture as the rest of the RTX family.
The RTX 2060 was announced with 6GB of GDDR6 memory, and like its bigger siblings the RTX 2060 offers hardware ray tracing via its RT Cores (alongside 240 Tensor Cores for DLSS), with NVIDIA targeting 60 FPS performance with ray tracing enabled in Battlefield V:
"The RTX 2060 is 60 percent faster on current titles than the prior-generation GTX 1060, NVIDIA’s most popular GPU, and beats the gameplay of the GeForce GTX 1070 Ti. With Turing’s RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second."
That 60% increase comes from benchmarks the company ran using 2560x1440 resolution, and the RTX 2060 is targeting resolutions from the mainstream 1920x1080 up to 2560x1440, though with performance between a GTX 1070 and 1080 the RTX 2060 could very well support 3840x2160 gaming at medium-to-high settings as well.
The official launch of the RTX 2060 is January 15 from add-in partners, as well as a Founders Edition card from NVIDIA beginning on that date. NVIDIA is also launching a new bundle deal. Qualifying RTX 2060 purchasers, either as a standalone card or as part of a desktop including the RTX 2060, can choose to receive either Battlefield V or the upcoming Anthem for free.
Stay tuned for more details on the GeForce RTX 2060 soon.
Subject: Graphics Cards | December 17, 2018 - 03:24 PM | Sebastian Peak
Tagged: rumor, report, nvidia, leak, GTX 2060, graphics, gpu, geforce, gaming
We've been hearing rumors about a GeForce RTX 2060 since at least August, with screen captures of a reported mid-range Turing card (then assumed to be a "GTX" 2060) - seemingly with GTX 1080 levels of performance - surfacing at that time.
Then in November there was the reported Final Fantasy XV benchmark leak, showing performance a little below a GTX 1070 with the game running at 3840x2160 (high quality preset) - but this was possibly the mobile SKU, according to leaker APISAK on Twitter.
A week or so ago we saw an image of a Gigabyte card from VideoCardz.com which the site said was the RTX 2060:
Image via VideoCardz.com
"Our sources at Gigabyte have confirmed GeForce RTX 2060 graphics card launching soon. The card features TU106 GPU with 1920 CUDA cores and 6GB of GDDR6 memory. The model pictured below is factory-overclocked, but the exact clock remains unconfirmed." (Source: VideoCardz)
It seems fair to assume that a launch is imminent, with reports of a potential announcement the second week of January which may or may not coincide with CES 2019. As to final specs and pricing? Let the speculation commence!
Subject: Graphics Cards | December 13, 2018 - 09:01 AM | Jim Tanous
Tagged: Radeon Software Adrenalin Edition, radeon software, radeon, gpu, drivers, amd, Adrenalin Edition
AMD today released the latest major update to its Radeon software and driver suite. Building on the groundwork laid last year, AMD Radeon Software Adrenalin 2019 Edition brings a number of new features and performance improvements.
With this year’s software update, AMD continues to make significant gains in game performance compared to last year’s driver release, with an average gain of up to 15 percent across a range of popular titles. Examples include Assassin’s Creed Odyssey (11%), Battlefield V (39%), and Shadow of the Tomb Raider (15%).
Beyond performance, Adrenalin 2019 Edition introduces a number of new and improved features. Highlights include:
Game Streaming: Radeon gamers can now stream any game or application from their PCs to their mobile devices via the AMD Link app at up to 4K 60fps. The feature supports both on-screen controls as well as Bluetooth controllers. ReLive streaming is also expanding to VR, with users able to stream games and videos from their PCs to standalone VR headsets via new AMD VR store apps. This includes Steam VR titles, allowing users to play high-quality PC-based VR games on select standalone headsets. AMD claims that its streaming technology offers “up to 44% faster responsiveness” than other game streaming solutions.
ReLive Streaming and Sharing: Gamers more interested in streaming their games to other people will find several new features in AMD’s ReLive feature, including adjustable picture-in-picture instant replays from 5 to 30 seconds, automatic GIF creation, and a new scene editor with more stream overlay options and hotkey-based scene transition control.
Radeon Game Advisor: A new overlay available in-game that helps users designate their target experience (performance vs. quality) and then recommends game-specific settings to achieve that target. Since the tool is running live alongside the game, it can respond to changes as they occur and dynamically recommend updated settings and options.
Radeon Settings Advisor: A new tool in the Radeon Software interface that scans system configuration and settings and recommends changes (e.g., enabling or disabling Radeon Chill, changing the display refresh rate, enabling HDR) to achieve an optimal gaming experience.
WattMan One-Click Tuning Improvements: Radeon WattMan now supports automatic tuning of memory overclocking, GPU undervolting, expanded fan control options, and unlocked DPM states for RX Vega series cards.
Display Improvements: FreeSync 2 can now tone-map HDR content to look better on displays that don’t support the full color and contrast of the HDR spec, and AMD’s Virtual Super Resolution feature is now supported on ultra-wide displays.
Radeon Overlay: AMD’s Overlay feature which allows gamers to access certain Radeon features without leaving their game has been updated to display system performance metrics, WattMan configuration options, Radeon Enhanced Sync controls, and the aforementioned Game Advisor.
AMD Link: AMD’s mobile companion app now offers easier setup via QR code scanning, voice control of various Radeon and ReLive settings (e.g., start/stop streaming, save replay, take screenshot), WattMan controls, enhanced performance metrics, and the ability to initiate a Radeon Software update.
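The FreeSync 2 display improvement above centers on tone-mapping: compressing HDR luminance into the smaller range a given panel can actually reproduce. As a rough illustration of the general idea only (this is not AMD's actual FreeSync 2 algorithm, which runs in the driver and display pipeline), here is a minimal extended-Reinhard-style curve:

```python
# Illustrative sketch of HDR-to-display tone mapping: compress an unbounded
# HDR luminance value into [0, 1], with the chosen white point mapping to 1.0.
# NOT AMD's implementation; just the textbook extended Reinhard operator.
def reinhard_tonemap(luminance: float, white_point: float = 4.0) -> float:
    """Map HDR luminance into [0, 1]; values at/above white_point clamp to 1."""
    l = luminance
    mapped = (l * (1.0 + l / (white_point * white_point))) / (1.0 + l) if l > 0 else 0.0
    return min(1.0, mapped)
```

The practical benefit FreeSync 2 targets is doing this mapping with knowledge of the display's real color and contrast limits, rather than assuming a generic HDR target.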
Radeon Software Adrenalin 2019 Edition is available now from AMD’s support website for all supported AMD GPUs.
Subject: Graphics Cards | December 10, 2018 - 03:28 PM | Sebastian Peak
Tagged: Vega, trademark, rumor, report, radeon, graphics, gpu, amd, 7nm
The newly registered logo, with the familiar "V" joined by a couple of new stripes on the right side, could mean a couple of things: it might reference Vega II (2), or perhaps the VII suggests the Roman numeral 7, as in 7nm, instead? VideoCardz.com thinks the latter may be the case:
"AMD has registered a new trademark just 2 weeks ago. Despite many rumors floating around about Navi architecture and its possible early reveal or announcement in January, it seems that AMD is not yet done with Vega. The Radeon Vega logo, which features the distinctive V lettering, has now received 2 stripes, to indicate the 7nm die shrink."
Whatever the case may be, it's interesting to consider the possibility of a 7nm Vega GPU before we see Navi. We really don't know, though it does seem a bit presumptuous to expect a new product as early as CES, as Tech Radar speculates:
"We know full well that the next generation of AMD graphics will be built upon a 7nm architecture going by the roadmaps the company released at CES 2018. At the same time, it seems to all sync up with AMD's plans to announce new 7nm GPUs at CES 2019, so it almost seems certain that we’ll see Vega II graphics cards soon."
The prospect of new graphics cards is always tantalizing, but we'll need more than a logo before things really get interesting.
Our First Look
Over the years, the general trend for new GPU launches, especially those based on a new graphics architecture, has been to launch only with the "reference" graphics card designs developed by AMD or NVIDIA. While the idea of a "reference" design has changed over the years, with the introduction of NVIDIA's Founders Edition cards and different special edition designs at launch from AMD like we saw with Vega 56 and Vega 64, there generally aren't any custom partner designs available at launch.
However, with the launch of NVIDIA's Turing architecture in the form of the RTX 2080 and RTX 2080 Ti, we've been presented with an embarrassment of riches: plenty of custom cooler and custom PCB designs available from add-in board (AIB) manufacturers.
Today, we're taking a look at our first custom RTX 2080 design, the MSI RTX 2080 Gaming X Trio.
| MSI GeForce RTX 2080 Gaming X Trio | |
|---|---|
| Base Clock Speed | 1515 MHz |
| Boost Clock Speed | 1835 MHz |
| Memory Clock Speed | 7000 MHz GDDR6 |
| Outputs | DisplayPort x 3 (v1.4) / HDMI 2.0b x 1 / USB Type-C x 1 (VirtualLink) |
| Dimensions | 12.9-in x 5.5-in x 2.1-in (327 x 140 x 55.6 mm) |
| Weight | 3.42 lbs (1553 g) |
Introduced with the GTX 1080 Ti, the Gaming X Trio is, as you might expect, a triple-fan design that represents MSI's highest-performance graphics card offering.
Subject: Processors, Mobile | September 2, 2018 - 11:45 AM | Sebastian Peak
Tagged: SoC, octa-core, mobile, Mali-G76, Kirin, Huawei, HiSilicon, gpu, cpu, Cortex-A76, arm, 8-core
Huawei has introduced its subsidiary HiSilicon's newest mobile processor, the Kirin 980, which Huawei claims is the world's first commercial 7nm SoC, and which is the first SoC to use Arm Cortex-A76 CPU cores and Arm's Mali-G76 GPU.
Huawei is aiming squarely at Qualcomm with this announcement, claiming better performance than a Snapdragon 845 during the presentation. One of its primary differences from the current Snapdragon is the composition of the Kirin 980's eight CPU cores: the usual 'big.LITTLE' Arm CPU core configuration for an octa-core design gives way to a revised organization with three groups, as illustrated by AnandTech here:
Of the four Cortex-A76 cores, two are clocked at 2.60 GHz to maximize performance in demanding applications such as games (and, likely, benchmarks), while the other two serve as more efficient general-purpose performance cores at 1.92 GHz. The remaining four Cortex-A55 cores operate at 1.80 GHz and handle lower-performance tasks. A full breakdown of the CPU core configuration, as well as slides from the event, is available at AnandTech.
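The three-tier split can be pictured as routing work to the fastest cluster whose demand threshold a task meets. A toy sketch of that idea (assumed behavior for illustration only, not HiSilicon's actual scheduler, which lives in the kernel's energy-aware scheduling):

```python
# Toy model of the Kirin 980's 2 + 2 + 4 cluster layout: pick a cluster
# based on how demanding a task is (thresholds here are invented).
CLUSTERS = [
    {"name": "A76 @ 2.60 GHz", "cores": 2, "min_load": 0.8},  # burst/peak work
    {"name": "A76 @ 1.92 GHz", "cores": 2, "min_load": 0.4},  # sustained work
    {"name": "A55 @ 1.80 GHz", "cores": 4, "min_load": 0.0},  # background work
]

def pick_cluster(load: float) -> str:
    """Return the fastest cluster whose (invented) load threshold the task meets."""
    for cluster in CLUSTERS:
        if load >= cluster["min_load"]:
            return cluster["name"]
    return CLUSTERS[-1]["name"]
```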
Huawei claims that the improved CPU in the Kirin 980 is "75 percent more powerful and 58 percent more efficient compared to their previous generation" (the Kirin 970), which Huawei says translates to 37% better performance and 32% greater efficiency than Qualcomm’s Snapdragon 845.
The GPU also gets a much-needed lift this year from Arm's latest GPU, the Mali-G76, which features "new, wider execution engines with double the number of lanes" and "provides dramatic uplifts in both performance and efficiency for complex graphics and Machine Learning (ML) workloads", according to Arm.
Real-world testing with shipping handsets is needed to verify Huawei's performance claims, of course. In fact, the results shown by Huawei at the presentation carry this disclaimer, sourced from today’s press release:
"The specifications of Kirin 980 does not represent the specifications of the phone using this chip. All data and benchmark results are based on internal testing. Results may vary in different environments."
The upcoming Mate 20 from Huawei will be powered by this new Kirin 980 - and could very well provide results consistent with the full potential of the new chip - with the phone set for an official launch on October 16.
The full press release is available after the break.