Subject: Graphics Cards | January 7, 2019 - 02:46 AM | Jim Tanous
Tagged: rtx mobile, RTX 2080, RTX 2070, RTX 2060, rtx, nvidia, max-q, gaming laptop, ces2019
NVIDIA just wrapped up its CES keynote, and in addition to the expected unveiling of the RTX 2060, the company announced new mobile GeForce RTX options. More than 40 upcoming laptops, including 17 sporting NVIDIA’s Max-Q design, will offer RTX 2080, RTX 2070, and RTX 2060 graphics options.
NVIDIA CEO Jensen Huang likened GeForce RTX-powered laptops to a gaming console platform, repeatedly drawing performance comparisons to traditional game consoles like the PlayStation 4.
"Laptops are the fastest growing gaming platform — and just getting started. The world’s top OEMs are using Turing to bring next-generation console performance to thin, sleek laptops that gamers can take anywhere. Hundreds of millions of people worldwide — an entire generation — are growing up gaming. I can’t wait for them to experience this new wave of laptops."
New GeForce RTX laptops will continue to support features like WhisperMode, which paces frame rates on AC-connected laptops to reduce heat and therefore fan noise; NVIDIA Battery Boost, which uses GeForce Experience to optimize performance for longer battery life; and, of course, G-SYNC.
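At its core, frame pacing of the WhisperMode sort amounts to capping the frame rate so the GPU does less work per second. A minimal sketch of that idea (the `render` callback is hypothetical, and this is an illustration only; NVIDIA's actual implementation also tunes per-game graphics settings):

```python
import time

def frame_capped_loop(render, target_fps: float, frames: int) -> None:
    """Render `frames` frames, sleeping as needed so we never exceed target_fps."""
    frame_budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render()
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Idle instead of racing ahead: less GPU work per second,
            # less heat, slower (quieter) fans.
            time.sleep(frame_budget - elapsed)
```

Capping at, say, 40 FPS rather than letting the GPU run flat out trades peak frame rate for lower power draw and noise, which is exactly the plugged-in-but-quiet scenario WhisperMode targets.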
Beyond gaming, NVIDIA is touting the benefits of the RTX platform for content creators, such as real-time video encoding for live streamers, faster rendering for video editors, and accurate interactive lighting, reflections, and shadows for animators.
Laptops sporting GeForce RTX cards will be available starting January 29th from NVIDIA partners including Acer, Alienware, ASUS, Dell, Gigabyte, HP, Lenovo, MSI, Razer, and Samsung. Pricing, detailed configuration options, and exact availability will vary and are not yet available for all manufacturers.
Subject: Graphics Cards | January 7, 2019 - 01:59 AM | Sebastian Peak
Tagged: video card, RTX 2060, rtx, ray tracing, nvidia, graphics, gpu, geforce, ces 2019, CES
On stage at an event tonight at CES 2019, NVIDIA CEO Jensen Huang made it official: the RTX 2060 exists and will be available this month. The card is priced at $349 and is based on the same Turing architecture as the rest of the RTX family.
The RTX 2060 was announced with 6GB of GDDR6 memory and, like its bigger siblings, offers ray tracing support (along with 240 Tensor Cores onboard); NVIDIA targets 60 FPS performance with ray tracing enabled in Battlefield V:
"The RTX 2060 is 60 percent faster on current titles than the prior-generation GTX 1060, NVIDIA’s most popular GPU, and beats the gameplay of the GeForce GTX 1070 Ti. With Turing’s RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second."
That 60% figure comes from benchmarks the company ran at 2560x1440. The RTX 2060 targets resolutions from the mainstream 1920x1080 up to 2560x1440, though with performance between a GTX 1070 and GTX 1080 it could very well support 3840x2160 gaming at medium-to-high settings as well.
The RTX 2060 officially launches January 15, with cards from add-in partners as well as a Founders Edition from NVIDIA available beginning on that date. NVIDIA is also launching a new bundle deal: qualifying RTX 2060 purchasers, whether buying a standalone card or a desktop including the RTX 2060, can choose to receive either Battlefield V or the upcoming Anthem for free.
Stay tuned for more details on the GeForce RTX 2060 soon.
Subject: Displays | January 6, 2019 - 01:10 PM | Jim Tanous
Tagged: Omen, nvidia, hp, g-sync hdr, g-sync, ces2019, bfgd, 144hz
After first unveiling them at last year’s CES, NVIDIA’s Big Format Gaming Displays (BFGD) finally have an official price point. Engadget met up with NVIDIA partner HP at CES 2019 to preview the company’s Omen X Emperium BFGD.
The 65-inch 4K display sports G-SYNC HDR, a 144Hz refresh rate, an integrated sound bar, and a built-in NVIDIA SHIELD interface. The starting price? $4,999.
That price isn’t too surprising; rumors and leaks from NVIDIA’s BFGD partners had suggested the $5,000 range. And when you consider that the first true G-SYNC HDR displays hit the market at $2,000 for a paltry 27 inches, the BFGD’s price seems almost reasonable.
But with HP showing its hand early on here at CES, it’s likely that we can expect NVIDIA’s other BFGD partners to be priced in the same ballpark. We have yet to receive further details on any smaller BFGDs, but if you’re crazy enough to pay any price for giant, G-SYNC HDR gaming, you’ll be able to pick up the HP Omen X Emperium starting in February.
Subject: Graphics Cards | January 2, 2019 - 12:34 PM | Sebastian Peak
Tagged: pascal, overclocking, OC Scanner, nvidia, GTX 1080, gtx 1070, gtx 1060, geforce
GPU overclocking utility MSI Afterburner now supports automatic Pascal overclocking, bringing this feature to the GTX 10-series for the first time. NVIDIA had previously offered the OC Scanner only for the Turing-based RTX graphics cards (we compared OC Scanner vs. manual results using a previous version in our MSI GeForce RTX 2080 Gaming X Trio review), but a new version of the API is incorporated in Afterburner v4.6.0 beta 10.
"If you purchased a GeForce GTX 1050, 1060, 1070, 1080, Titan X, Tian Xp, Titan V (Volta) or AMD Radeon RX 5x0 and Vega graphics card we can recommend you to at least try out this latest release. We have written a GeForce GTX 1070 and 1080 overclocking guide right here. This is the new public final release of MSI AfterBurner. Over the past few weeks we have made a tremendous effort to get a lot of features enabled for this build."
The release notes are massive for this latest version, and you can view them in full after the break.
Subject: Graphics Cards | January 1, 2019 - 12:41 AM | Tim Verry
Tagged: turing, tu106, RTX 2060, nvidia, gaming
Videocardz recently released information on the NVIDIA RTX 2060 that sheds more light on the rumored card. Reportedly sourced from a copy of the official reviewer's guide, Videocardz claims that they are now able to confirm the specifications of the RTX 2060 including 1920 CUDA cores, 240 tensor cores, 30 ray tracing cores, and 6GB GDDR6 memory.
Graphics cards using the TU106-300 GPU will be available in stock and factory-overclocked designs with NVIDIA reference or AIB custom coolers. Display outputs include DVI, HDMI, and DisplayPort.
| | RTX 2060 | RTX 2070 | GTX 1070 Ti | RX Vega 64 | RX Vega 56 |
|---|---|---|---|---|---|
| GPU | TU106-300 | TU106-400 | GP104 | Vega 10 | Vega 10 |
| CUDA cores | 1920 | 2304 | 2432 | 4096 SPs | 3584 SPs |
| Memory | 6GB GDDR6 | 8GB GDDR6 | 8GB GDDR5 | 8GB HBM2 | 8GB HBM2 |
| SP Compute | 6.5 TF | 7.5 TF | 7.8 TF | 12.5 TF (13.7 AIO) | 10.5 TF |
| Base clock | 1365 MHz | 1410 MHz | 1607 MHz | 1200 MHz (1406 AIO) | 1156 MHz |
| Boost clock | 1680 MHz | 1710 MHz (FE) | 1683 MHz | 1546 MHz (1677 AIO) | 1471 MHz |
| Memory clock | 14000 MHz | 14000 MHz | 8000 MHz | 1890 MHz | 1600 MHz |
| Launch MSRP | $349 | $499 ($599 FE) | $449 | $499 | $399 |
| Pricing 1-1-19 | ? | $500+ | $405+ | $400+ ($500+ AIO) | $470+ (?) |
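The single-precision compute figures above follow directly from shader count and clock speed: FP32 TFLOPS ≈ CUDA cores × 2 FLOPs per cycle (one fused multiply-add) × clock. A quick sketch using values from the table (small discrepancies in published figures usually come down to whether base, reference boost, or FE boost clocks are used):

```python
def fp32_tflops(cuda_cores: int, clock_mhz: float) -> float:
    """Theoretical FP32 throughput: cores * 2 FLOPs/cycle (FMA) * clock."""
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

# RTX 2060 at its 1680 MHz boost clock: ~6.45 TFLOPS (table lists 6.5 TF)
rtx_2060 = fp32_tflops(1920, 1680)

# GTX 1070 Ti at its 1607 MHz base clock: ~7.82 TFLOPS (table lists 7.8 TF)
gtx_1070_ti = fp32_tflops(2432, 1607)

print(f"RTX 2060: {rtx_2060:.2f} TFLOPS, GTX 1070 Ti: {gtx_1070_ti:.2f} TFLOPS")
```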
Allegedly, the RTX 2060 will offer performance comparable to last generation's GTX 1070 Ti in 1080p and 1440p gaming scenarios. In a couple of games the card even gets close to the GTX 1080, but in most of the titles listed by Videocardz (from the alleged reviewer's guide) the new GPU comes in slightly faster or slightly slower than the 1070 Ti depending on the specific game. The RTX 2060 and its 30 RT cores can reportedly deliver a playable 65 FPS in Battlefield V with RTX enabled, improving to 88 FPS with DLSS turned on, versus 90 FPS with RTX off. Granted, that is Battlefield V at 1080p rather than the 1440p or 4K that the beefier RTX cards can push out.
When it comes to pricing, the RTX 2060 will have an MSRP of $349, with AIB and Founders Edition cards at the same level. RTX 2060 graphics cards are slated to launch on January 7th and will be available as soon as January 15th. If true, we will not have long to wait until the card is official and reviews are unveiled.
If you are curious about the rumored performance, check out the charts Videocardz uncovered.
Subject: Graphics Cards | December 21, 2018 - 02:00 PM | Jim Tanous
Tagged: physx 4.0, PhysX, open source, nvidia
As promised in the company's initial announcement earlier this month, NVIDIA has released the newly open-sourced PhysX 4.0 SDK via GitHub. Now, thanks to its 3-Clause BSD license, any game developer, hardware company, or coding enthusiast can grab the latest version of NVIDIA's real-time physics engine and tinker with, improve, or implement it in hopefully creative new ways.
The one limitation, of course, is that in its current form PhysX 4.0 (and version 3.4, which is now open source, too) still references lots of NVIDIA's closed source APIs, notably CUDA. But with the PhysX framework now available to fork, there's nothing to stop an eager company or programmer from creating and implementing their own alternatives to NVIDIA's proprietary tech.
In addition to going open source, PhysX 4.0 introduces a number of new features as outlined on NVIDIA's developer site:
- Temporal Gauss-Seidel Solver (TGS), which makes machinery, characters/ragdolls, and anything else that is jointed or articulated much more robust. TGS dynamically re-computes constraints with each iteration, based on bodies’ relative motion.
- The new reduced coordinate articulations feature makes the simulation of joints possible with no relative position error and realistic actuation.
- New automatic multi-broad phase.
- Increased scalability with new filtering rules for kinematics and statics.
- Actor-centric scene queries significantly improve performance for actors with many shapes.
- Build system now based on CMake.
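For context on the first bullet: TGS is a variant of the classic Gauss-Seidel iteration, which solves a coupled system by sweeping over the unknowns and updating each one using the freshest values of the others. A toy illustration of plain Gauss-Seidel on a linear system (not the PhysX implementation; TGS additionally re-integrates body positions between iterations, as the bullet describes):

```python
def gauss_seidel(A, b, iterations=50):
    """Iteratively solve A x = b by sweeping rows and reusing fresh values."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            # Each unknown is updated immediately, so later rows in the same
            # sweep already see the new value -- the hallmark of Gauss-Seidel.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

# Diagonally dominant system; converges to x = [1/11, 7/11]
print(gauss_seidel([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0]))
```

In a physics engine, the "rows" are contact and joint constraints rather than matrix equations, but the same sweep-and-reuse structure is what makes jointed assemblies converge stiffly instead of drifting apart.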
BSD 3-Clause licensed platforms:
- Apple iOS
- Apple MacOS
- Google Android ARM
- Microsoft Windows
Unchanged NVIDIA EULA platforms:
- Microsoft Xbox One
- Sony PlayStation 4
- Nintendo Switch
Subject: Graphics Cards | December 17, 2018 - 03:24 PM | Sebastian Peak
Tagged: rumor, report, nvidia, leak, GTX 2060, graphics, gpu, geforce, gaming
We've been hearing rumors about a GeForce RTX 2060 since at least August, when screen captures of a reported mid-range card (then assumed to be a "GTX" 2060) - seemingly with GTX 1080 levels of performance - surfaced.
Then in November there was the reported Final Fantasy XV benchmark leak, showing performance a little below a GTX 1070 with the game running at 3840x2160 (high quality preset) - though this was possibly the mobile SKU, according to leaker APISAK on Twitter.
A week or so ago we saw an image of a Gigabyte card from VideoCardz.com which the site said was the RTX 2060:
Image via VideoCardz.com
"Our sources at Gigabyte have confirmed GeForce RTX 2060 graphics card launching soon. The card features TU106 GPU with 1920 CUDA cores and 6GB of GDDR6 memory. The model pictured below is factory-overclocked, but the exact clock remains unconfirmed." (Source: VideoCardz)
It seems fair to assume that a launch is imminent, with reports of a potential announcement the second week of January which may or may not coincide with CES 2019. As to final specs and pricing? Let the speculation commence!
Subject: General Tech | December 13, 2018 - 01:15 PM | Jeremy Hellstrom
Tagged: nvidia, machine learning, jetson, AGX Xavier
NVIDIA claims their newly announced Jetson AGX Xavier SoC can provide up to 32 trillion operations per second for specific tasks, requiring a mere 10W to do so. The chips are designed for image processing and recognition along with all those other 'puter learnin' things you would expect, and chances are a device will have several of these chips working in tandem, which offers a lot of processing power. It is already being used for real-time monitoring of DNA sequencing and will be installed in car manufacturing lines in Japan.
The Inquirer points out that this performance comes at a cost, currently $1100 per unit as long as you are buying 1000 of them or more.
"Essentially a data wrangling server plonked onto a silicon package, Jetson AGX Xavier is designed to handle all the tech and processing that autonomous things need to go about their robot lives, such as image processing and computer vision and the inference of deep learning algorithms."
Here is some more Tech News from around the web:
- RISC-V Will Stop Hackers Dead From Getting Into Your Computer @ Hackaday
- Small American town rejects Comcast – while ISP reps take issue with your El Reg vultures @ The Register
- In a Test, 3D Model of a Head Was Able To Fool Facial Recognition System of Several Popular Android Smartphones @ Slashdot
- Russia's cutting edge robot turns out to be bloke in costume @ The Inquirer
- Asustek president Jerry Shen to leave in major business revamp @ DigiTimes
- Julius Lilienfeld and the First Transistor @ Hackaday
- Ticketmaster tells customer it's not at fault for site's Magecart malware pwnage @ The Register
- The Worst CPU & GPU Purchases of 2018 @ Techspot
- Hands-on: Switch’s NES controllers offer unmatched old-school authenticity @ Ars Technica
- Standing Desk Starter Guide: Some Dos and Don'ts @ Techspot
Subject: Graphics Cards, Mobile | December 12, 2018 - 10:04 PM | Tim Verry
Tagged: turing, rumor, RTX 2070, RTX 2060, nvidia
Rumors have appeared online that suggest NVIDIA may be launching mobile versions of its RTX 2070 and RTX 2060 GPUs based on its new Turing architecture. The new RTX 2070 and RTX 2060 with Max-Q designs were leaked by Twitter user TUM_APISAK who posted cropped screenshots of Geekbench 4.3.1 and 3DMark 11 Performance results.
Allegedly handling the graphics duties in a Lenovo 81HE, the GeForce RTX 2070 with Max-Q Design (8GB VRAM), paired with a Core i7-8750H Coffee Lake six-core CPU and 32 GB of system memory, managed a Geekbench 4.3.1 score of 223,753. The GPU supposedly has 36 Compute Units (CUs) and a core clock speed of 1,300 MHz. The desktop RTX 2070, which is already available, also has 36 CUs with 2,304 CUDA cores, 144 texture units, 64 ROPs, 288 Tensor cores, and 36 RT (ray tracing) cores. The desktop GPU has a 175W reference (non-FE) TDP and clocks of 1410 MHz base and 1620 MHz boost (1710 MHz for the Founders Edition). Assuming that 36 CU number is accurate, the mobile part (RTX 2070M) may well have the same core counts, just running at lower clocks, which would be nice to see but would require a beefy mobile cooling solution.
Less is known about the RTX 2060 with Max-Q Design: the leak was limited to two screenshots, allegedly from Final Fantasy XV's benchmark results page, comparing a desktop RTX 2060 with a Max-Q RTX 2060. Those screenshots do not reveal the number of CUs (or other figures like CUDA/Tensor/RT cores, TMUs, and ROPs), but the comparison does lend further credence to the rumors of the RTX 2060 using 6 GB of GDDR6 memory. Tom's Hardware does have a screenshot showing the RTX 2060 with 30 CUs, which suggests 1,920 CUDA cores, 240 Tensor cores, and 30 RT cores, with clocks up to 1.2 GHz (which meshes well with previous rumors about the desktop part).
| Graphics Card | Generic VGA | Generic VGA |
|---|---|---|
| Memory | 6144 MB | 6144 MB |
| Core clock | 960 MHz | 975 MHz |
| Memory clock | 1750 MHz | 1500 MHz |
| Driver name | NVIDIA GeForce RTX 2060 | NVIDIA GeForce RTX 2060 with Max-Q Design |
Also, the TU106 RTX 2060 with Max-Q Design reportedly has a 975 MHz core clock and a 1500 MHz (6 GHz effective) memory clock. Note that the desktop card's 960 MHz core clock and 1750 MHz (7 GHz effective) memory clock don't match previous RTX 2060 rumors, which suggested higher GPU clocks in particular (up to 1.2 GHz). To be fair, the software could simply be reporting incorrect numbers since the GPUs are not yet official. One final bit of leaked information is a note about 3DMark 11 performance, with the RTX 2060 Max-Q Design GPU hitting at least 19,000 in the benchmark's Performance preset, which allegedly puts it between the scores of the mobile GTX 1070 and the mobile GTX 1070 Max-Q. (A graphics score between nineteen and twenty thousand would put it a bit above a desktop GTX 1060 but well below the desktop GTX 1070.)
As usual, take these rumors and leaked screenshots with a healthy heaping of salt, but they are interesting nonetheless. Combined with the news about NVIDIA possibly announcing new mid-range GPUs at CES 2019, we may well see new laptops and other mobile graphics solutions shown off at CES and available within the first half of 2019 which would be quite the coup.
What are your thoughts on the rumored RTX 2060 for desktops and its mobile RTX 2060 and RTX 2070 Max-Q siblings?
Subject: General Tech | December 12, 2018 - 12:37 PM | Jeremy Hellstrom
Tagged: RTX 2060, nvidia, navi, amd
The majority of today's news covers Intel's wide range of announcements from their architecture day, from new Optane DIMMs seeking to bring latency close to that of DRAM, to Foveros chiplets, to hints of coming in off the Lake to spend some time in a Sunny Cove. There are more links below the fold offering additional coverage, as yesterday's announcements were very dense.
That might overshadow a rumour that dedicated discrete GPU lovers would be interested in: NVIDIA might be able to get the RTX 2060 to market before AMD can launch a Navi-based card. The Inquirer has seen rumours that NVIDIA might release the card in the first half of 2019, while the 7nm Navi isn't expected until the second half of the year. The early supply of mid-range NVIDIA GPUs might attract buyers who no longer want to wait; though depending on how Navi performs, they could come to regret that lack of patience.
"GRAPHICS CARDS IN 2019 are set to get a good bit more interesting, as a leak suggests that Nvidia's GeForce RTX 2060 could reach the market before AMD's next-gen Navi Radeon cards."
Here is some more Tech News from around the web:
- Intel 2018 Architecture Day @ [H]ard|OCP
- Intel talks about its architectural vision for the future @ The Tech Report
- Intel introduces Foveros: 3D die stacking for more than just memory @ Ars Technica
- Intel Architecture Day – Foveros, Sunny Cove and Gen11 Graphics Coming Soon @ Legit Reviews
- TSMC to expand 8-inch fab capacity for robust demand for automotive, IoT @ DigiTimes
- The internet is going to hell and its creators want your help fixing it @ The Register
- Synology MR2200ac Mesh Router Review: First WPA3-Certified Wi-Fi Router @ Modders-Inc
- LG's beer-making bot singlehandedly sucks all fun, boffinry from home brewing @ The Register
- Ever Wondered How Those Computer-Controlled Christmas Light Displays Work? @ Techspot