Subject: Graphics Cards | January 8, 2019 - 05:46 PM | Jeremy Hellstrom
Tagged: visiontek, thunderbolt 3, external gpu, ces 2019
VisionTek announced a new external GPU enclosure today at CES, costing $350 for the enclosure and 240W power (cinder) block. The dual Thunderbolt 3 controllers provide enough bandwidth for your GPU as well as a pair of USB 3.0 ports, Gigabit ethernet and a SATA III port.
At 8.5x6x2.75" it is small enough to be easily accommodated on a desk while still able to hold a variety of cards, up to about the size of an AMD Vega 56 8GB. It is compatible with Windows 10 and OS X and will let you string together up to six 4K displays @ 60fps. You can see the full PR below.
The VisionTek Thunderbolt 3 Mini eGFX Enclosure is ideal for creative professionals, IT/enterprise power users, professional users in healthcare, finance, scientific research labs, etc., and gaming enthusiasts seeking the ultimate GPU performance improvement to Thunderbolt 3 equipped laptops. Delivering up to 40Gbps of bandwidth, Thunderbolt 3 is the industry’s fastest interface that is rapidly becoming popular on new generation of laptops and mini PCs. VisionTek’s Mini eGFX combined with a graphics add-in card significantly boosts the GPU performance of a Thunderbolt 3 enabled laptop, via a plug and play connection to the enclosure.
Sleek, Portable, and Future Proof for the Most Demanding Applications
The VisionTek Thunderbolt 3 Mini eGFX Enclosure combines a sleek and portable design that easily fits discreetly on a desk, or hidden away, to handle all your graphics-intensive applications. VisionTek’s Mini eGFX Enclosure can be plugged into any Thunderbolt 3 enabled laptop or mini-PC to accelerate the most demanding 3D-intensive software programs. Best of all, the Mini eGFX enclosure can be upgraded to perfectly match the application’s performance requirements. Consumers have the option of selecting from many mini ITX cards or standard compatible graphics card models for the Mini eGFX to optimize GPU processing requirements for each user’s specific needs.
"With the launch of the Mini eGFX external enclosure, users can turbocharge their Thunderbolt 3 enabled laptops with cutting edge discrete GPU add-in cards on the fly," said Michael Innes, President, VisionTek Products, LLC. “VisionTek embraces technology innovations from Intel that enhance the way we utilize our GPU technology to increase the efficiency, performance, and resolution of 3D visual PC applications.” “The VisionTek Thunderbolt 3 Mini eGFX enclosure is one of the most compact, yet flexible eGFX enclosures available in the market today,” said Jason Ziller, General Manager, Client Connectivity Division at Intel. “With this solution, VisionTek can broadly address the many professional graphics verticals, enterprise and consumer gaming markets as it can be easily configured to fit the needs of the customer with many inter-changeable graphic cards available.”
Expansive Selection of Laptop Compatibility
Thunderbolt 3 enabled laptops and mini-PCs connected to the new VisionTek Mini eGFX dock enclosure drive the most demanding 3D graphics-intensive applications. VisionTek is proud to announce compatibility and availability with many new Thunderbolt 3 equipped laptops and mini-PCs in 2019. The speed, reliability, efficiency and compact size of VisionTek’s Mini eGFX Enclosure eliminates limitations of a laptop or mini-PC environment and opens possibilities for the perfect combination of portability and performance when required.
Select from a wide performance range of add-in graphics cards certified by VisionTek to improve 3D image rendering, 4K HD applications, video editing, multi-monitor displays, PC gaming, and more. The VisionTek Mini eGFX is designed to fit most mini ITX discrete graphics cards, as well as select reference card designs. Visit VisionTek’s product page for the most current list of recommended graphics cards. Benefits of the VisionTek Thunderbolt 3 Mini eGFX Enclosure for external GPUs include:
- Compact Design – Form & function collide to create one of the most compact and flexible enclosure designs in the industry to accommodate a variety of graphics cards (enclosure dimensions: 8.5” x 6” x 2.75”)
- 3D Graphics Performance – Whether you’re rendering complex 3D images or playing intense first-person shooters, the eGFX enclosure supports a wide range of mini ITX size graphics cards.
- Power – 240W of dedicated power is provided with the VisionTek Mini eGFX Enclosure.
- Multiple Displays – Supports up to six 4K displays @ 60fps from laptops & mini PC’s. Scalable to the needs of the user with the addition of a graphics card to fit the application’s needs.
- Maximum 3D Resolution Control – Set limits using the GPU’s proprietary firmware controls to customize resolution settings, enhance 3D performance, and assign multi-monitor layouts.
- Additional High-Speed USB 3.0, Ethernet Connection, and SATA III Port – The design uses a second Thunderbolt controller with PCIe-to-USB and PCIe-to-LAN controllers to provide two (2) additional USB 3.0 ports that are conveniently accessible on the front panel of the eGFX enclosure and one (1) RJ45 Gigabit Ethernet LAN connection located on the back side of the enclosure.
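For a rough sense of the bandwidth an eGPU actually gets over that Thunderbolt 3 link, here is a back-of-the-envelope sketch using nominal PCIe 3.0 figures (these numbers are assumed from the PCIe spec, not stated in the PR; Thunderbolt 3 tunnels up to four PCIe 3.0 lanes to the GPU):

```python
# Back-of-the-envelope eGPU link bandwidth (assumed nominal PCIe 3.0 figures).
# Thunderbolt 3 tunnels up to 4 lanes of PCIe 3.0 to an external GPU.

GT_PER_LANE = 8.0        # PCIe 3.0 raw rate: 8 GT/s per lane
ENCODING = 128 / 130     # 128b/130b line-encoding efficiency

def pcie3_gbps(lanes: int) -> float:
    """Usable PCIe 3.0 data rate in Gbit/s for a given lane count."""
    return lanes * GT_PER_LANE * ENCODING

egpu_link = pcie3_gbps(4)      # ~31.5 Gbit/s available to the GPU
desktop_slot = pcie3_gbps(16)  # ~126 Gbit/s for a full x16 slot

print(f"eGPU x4:  {egpu_link:.1f} Gbit/s")
print(f"x16 slot: {desktop_slot:.1f} Gbit/s")
```

Even a full 40 Gbps Thunderbolt 3 link carries roughly a quarter of the bandwidth of a desktop x16 slot, which is why a card in an eGPU enclosure typically trails the same card installed internally.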
Subject: Graphics Cards | January 7, 2019 - 04:34 PM | Jeremy Hellstrom
Tagged: video card, turing, tu106, RTX 2060, rtx, nvidia, graphics card, gpu, gddr6, gaming
After months of rumours and guesses as to what the RTX 2060 would actually offer, we finally know. It is built on the same TU106 GPU the RTX 2070 uses and sports broadly similar core clocks, though the cut-down complement of Tensor cores, ROPs, and texture units reduces its ray tracing throughput to a mere 5 GigaRays. The memory is rather different, with 6GB of GDDR6 connected via a 192-bit bus offering 336.1 GB/s of bandwidth. As you saw in Sebastian's testing, the overall performance is better than you would expect from a mid-range card, but at the cost of a higher price.
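As a quick sanity check, that bandwidth figure follows directly from the bus width and GDDR6's effective per-pin data rate (14 Gbps here is an assumed nominal figure, consistent with the 14000 MHz memory clock in the spec table below):

```python
# Sanity check: memory bandwidth = bus width (in bytes) x per-pin data rate.
bus_width_bits = 192      # RTX 2060 memory interface
data_rate_gbps = 14       # GDDR6 effective rate per pin (assumed nominal)

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(bandwidth_gb_s)     # 336.0 -> matches the quoted ~336 GB/s
```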
If we missed out on your favourite game, check the Guru of 3D's suite of benchmarks or one of the others below.
"NVIDIA today announced the GeForce RTX 2060, the graphics card will be unleashed next week the 15th at a sales price of 349 USD / 359 EUR. Today, however, we can already bring you a full review of what is a pretty feisty little graphics card really."
Here are some more Graphics Card articles from around the web:
- NVIDIA GeForce RTX 2060 FE Review @ Legit Reviews
- RTX 2060 Review with 39 games @ BabelTechReviews
- NVIDIA Geforce RTX 2060 Founders Edition Review @ OCC
- Nvidia RTX 2060 Founders Edition 6GB @ Kitguru
- Battlefield V NVIDIA Ray Tracing RTX 2080 @ [H]ard|OCP
- The GPU Compute Performance From The NVIDIA GeForce GTX 680 To TITAN RTX @ Phoronix
- The HD 7970 vs. the GTX 680 – revisited after 7 years @ BabelTechReviews
We have to go all the way back to 2015 for NVIDIA's previous graphics card announcement at CES, with the GeForce GTX 960 revealed during the show four years ago. And coming on the heels of this announcement today we have the latest “mid-range” offering in the tradition of the GeForce x60 (or x060) cards, the RTX 2060. This launch comes as no surprise to those of us following the PC industry, as various rumors and leaks preceded the announcement by weeks and even months, but such is the reality of the modern supply chain process (sadly, few things are ever really a surprise anymore).
But there is still plenty of new information available with the official launch of this new GPU, not the least of which is the opportunity to look at independent benchmark results to find out what to expect with this new GPU relative to the market. To this end we had the opportunity to get our hands on the card before the official launch, testing the RTX 2060 in several games as well as a couple of synthetic benchmarks. The story is just beginning, and as time permits a "part two" of the RTX 2060 review will be offered to supplement this initial look, addressing omissions and adding further analysis of the data collected thus far.
Before getting into the design and our initial performance impressions of the card, let's look into the specifications of this new RTX 2060, and see how it relates to the rest of the RTX family from NVIDIA. We are taking a high level look at specs here, so for a deep dive into the RTX series you can check out our previous exploration of the Turing Architecture here.
"Based on a modified version of the Turing TU106 GPU used in the GeForce RTX 2070, the GeForce RTX 2060 brings the GeForce RTX architecture, including DLSS and ray-tracing, to the midrange GPU segment. It delivers excellent gaming performance on all modern games with the graphics settings cranked up. Priced at $349, the GeForce RTX 2060 is designed for 1080p gamers, and delivers an excellent gaming experience at 1440p."
| | RTX 2080 Ti | RTX 2080 | RTX 2070 | RTX 2060 | GTX 1080 | GTX 1070 |
|---|---|---|---|---|---|---|
| Base Clock | 1350 MHz | 1515 MHz | 1410 MHz | 1365 MHz | 1607 MHz | 1506 MHz |
| Boost Clock | 1545 MHz / 1635 MHz (FE) | 1710 MHz / 1800 MHz (FE) | 1620 MHz / 1710 MHz (FE) | 1680 MHz | 1733 MHz | 1683 MHz |
| Ray Tracing Speed | 10 Giga Rays | 8 Giga Rays | 6 Giga Rays | 5 Giga Rays | -- | -- |
| Memory Clock | 14000 MHz | 14000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 8000 MHz |
| Memory Interface | 352-bit GDDR6 | 256-bit GDDR6 | 256-bit GDDR6 | 192-bit GDDR6 | 256-bit GDDR5X | 256-bit GDDR5 |
| Memory Bandwidth | 616 GB/s | 448 GB/s | 448 GB/s | 336.1 GB/s | 320 GB/s | 256 GB/s |
| TDP | 250 W / 260 W (FE) | 215 W / 225 W (FE) | 175 W / 185 W (FE) | 160 W | 180 W | 150 W |
| MSRP (current) | $1200 (FE) / $999 | $799 (FE) / $699 | $599 (FE) / $499 | $349 | $549 | $379 |
Subject: Graphics Cards | January 7, 2019 - 02:46 AM | Jim Tanous
Tagged: rtx mobile, RTX 2080, RTX 2070, RTX 2060, rtx, nvidia, max-q, gaming laptop, ces2019
NVIDIA just wrapped up its CES keynote, and in addition to the expected unveiling of the RTX 2060, the company announced new mobile GeForce RTX options. More than 40 upcoming laptops, including 17 sporting NVIDIA’s Max-Q design, will offer RTX 2080, RTX 2070, and RTX 2060 graphics options.
NVIDIA CEO Jensen Huang likened GeForce RTX-powered laptops to a gaming console platform, repeatedly drawing performance comparisons to traditional game consoles like the PlayStation 4.
Laptops are the fastest growing gaming platform — and just getting started. The world’s top OEMs are using Turing to bring next-generation console performance to thin, sleek laptops that gamers can take anywhere. Hundreds of millions of people worldwide — an entire generation — are growing up gaming. I can’t wait for them to experience this new wave of laptops.
New GeForce RTX laptops will continue to support features like WhisperMode, which paces frame rates for AC-connected laptops to reduce heat and therefore fan noise, NVIDIA Battery Boost, which uses GeForce Experience to optimize performance for longer battery life, and of course G-SYNC.
Beyond gaming, NVIDIA is touting the benefits of the RTX platform for content creators, such as real-time video encoding for live streamers, faster rendering for video editors, and accurate interactive lighting, reflections, and shadows for animators.
Laptops sporting GeForce RTX cards will be available starting January 29th from NVIDIA partners including Acer, Alienware, ASUS, Dell, Gigabyte, HP, Lenovo, MSI, Razer, and Samsung. Pricing, detailed configuration options, and exact availability will vary and is not yet available for all manufacturers.
Subject: Graphics Cards | January 7, 2019 - 01:59 AM | Sebastian Peak
Tagged: video card, RTX 2060, rtx, ray tracing, nvidia, graphics, gpu, geforce, ces 2019, CES
On stage at an event tonight at CES 2019, NVIDIA CEO Jensen Huang made it official: the RTX 2060 exists and will be available this month. The card is priced at $349, and is based on the same Turing architecture as the rest of the RTX family.
The RTX 2060 was announced with 6GB of GDDR6 memory, and like its bigger siblings the RTX 2060 offers hardware ray tracing support via its RT cores (with 240 Tensor cores onboard for DLSS), and NVIDIA targets 60 FPS performance with ray tracing enabled in Battlefield V:
"The RTX 2060 is 60 percent faster on current titles than the prior-generation GTX 1060, NVIDIA’s most popular GPU, and beats the gameplay of the GeForce GTX 1070 Ti. With Turing’s RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second."
That 60% increase comes from benchmarks the company ran using 2560x1440 resolution, and the RTX 2060 is targeting resolutions from the mainstream 1920x1080 up to 2560x1440, though with performance between a GTX 1070 and 1080 the RTX 2060 could very well support 3840x2160 gaming at medium-to-high settings as well.
The official launch of the RTX 2060 is January 15, with cards from add-in board partners as well as a Founders Edition card from NVIDIA available beginning on that date. NVIDIA is also launching a new bundle deal: qualifying RTX 2060 purchasers, whether buying a standalone card or a desktop that includes the RTX 2060, can choose to receive either Battlefield V or the upcoming Anthem for free.
Stay tuned for more details on the GeForce RTX 2060 soon.
Subject: Graphics Cards | January 2, 2019 - 12:34 PM | Sebastian Peak
Tagged: pascal, overclocking, OC Scanner, nvidia, GTX 1080, gtx 1070, gtx 1060, geforce
GPU overclocking utility MSI Afterburner now supports automatic Pascal overclocking, bringing this feature to the GTX 10-series for the first time. NVIDIA had previously offered the OC Scanner only for the Turing-based RTX graphics cards (we compared OC Scanner vs. manual results using a previous version in our MSI GeForce RTX 2080 Gaming X Trio review), but a new version of the API is incorporated in Afterburner v4.6.0 beta 10.
"If you purchased a GeForce GTX 1050, 1060, 1070, 1080, Titan X, Titan Xp, Titan V (Volta) or AMD Radeon RX 5x0 and Vega graphics card we can recommend you to at least try out this latest release. We have written a GeForce GTX 1070 and 1080 overclocking guide right here. This is the new public final release of MSI AfterBurner. Over the past few weeks we have made a tremendous effort to get a lot of features enabled for this build."
The release notes are massive for this latest version, and you can view them in full after the break.
Subject: Graphics Cards | January 1, 2019 - 12:41 AM | Tim Verry
Tagged: turing, tu106, RTX 2060, nvidia, gaming
Videocardz recently released information on the NVIDIA RTX 2060 that sheds more light on the rumored card. Reportedly sourced from a copy of the official reviewer's guide, Videocardz claims that they are now able to confirm the specifications of the RTX 2060 including 1920 CUDA cores, 240 tensor cores, 30 ray tracing cores, and 6GB GDDR6 memory.
Graphics cards using the TU106-300 GPU will be available in stock and factory-overclocked designs with NVIDIA reference or AIB custom coolers. Display outputs include DVI, HDMI, and DisplayPort.
| | RTX 2060 | RTX 2070 | GTX 1070 Ti | RX Vega 64 | RX Vega 56 |
|---|---|---|---|---|---|
| GPU | TU106-300 | TU106-400 | GP104 | Vega 10 | Vega 10 |
| CUDA cores | 1920 | 2304 | 2432 | 4096 SPs | 3584 SPs |
| Memory | 6GB GDDR6 | 8GB GDDR6 | 8GB GDDR5 | 8GB HBM2 | 8GB HBM2 |
| SP Compute | 6.5 TF | 7.5 TF | 7.8 TF | 12.5 TF (13.7 AIO) | 10.5 TF |
| Base clock | 1365 MHz | 1410 MHz | 1607 MHz | 1200 MHz (1406 AIO) | 1156 MHz |
| Boost clock | 1680 MHz | 1710 MHz (FE) | 1683 MHz | 1546 MHz (1677 AIO) | 1471 MHz |
| Memory clock | 14000 MHz | 14000 MHz | 8000 MHz | 1890 MHz | 1600 MHz |
| Launch MSRP | $349 | $499 ($599 FE) | $449 | $499 | $399 |
| Pricing 1-1-19 | ? | $500+ | $405+ | $400+ ($500+ AIO) | $470+ (?) |
Allegedly, the RTX 2060 will offer up performance that is comparable to last generation's GTX 1070 Ti in 1080p and 1440p gaming scenarios. In a couple of games the card even gets close to the GTX 1080, but in most of the titles listed by Videocardz (from the alleged reviewer's guide) the new GPU comes in slightly faster or slightly slower than the 1070 Ti depending on the specific game. The RTX 2060 and its 30 RT cores can reportedly pull off a playable 65 FPS in Battlefield V with RTX enabled, rising to 88 FPS with DLSS turned on, compared to 90 FPS with RTX off. Granted, that is Battlefield V at 1080p rather than the 1440p or 4K that the beefier RTX cards can push out.
When it comes to pricing, the RTX 2060 will have an MSRP of $349, with AIB and Founders Edition cards at the same level. The RTX 2060 is slated to be announced on January 7th and will be available as soon as January 15th. If true, we will not have long to wait until the card is official and reviews are unveiled.
If you are curious about the rumored performance, check out the charts Videocardz uncovered.
Subject: Graphics Cards | December 24, 2018 - 03:19 PM | Jeremy Hellstrom
Tagged: sea hawk, RTX 2080, overclocking, msi
[H]ard|OCP takes a look at MSI's Sea Hawk RTX 2080, which sports a GPU covered by an AiO watercooler as well as a blower fan to ensure the memory and VRM are actively cooled as well. The design of the cooler also slims the card so you don't need to worry about the spacing between your PCIe slots as with some other coolers. Without any work whatsoever, you can expect an average 1954MHz GPU clock, 2040MHz with a bit of a power boost or 2060MHz if you don't mind the noise produced by fans spinning at 100%. The VRMs did prove a little finicky as you can see in the full review.
"MSI sent over its new Sea Hawk RTX 2080 card for use in a build video. This is a fairly simple RTX card build that is purchased with a pre-installed All-In-One cooler. We wanted to see how well it overclocked and spent a night of gaming in order to do that and we have to say we were pleased with our results."
Here are some more Graphics Card articles from around the web:
- MSI GeForce RTX 2080 Duke 8G OC @ Modders-Inc
- ZOTAC GAMING GeForce RTX 2080 Ti AMP Extreme @ Guru of 3D
- ZOTAC RTX 2080 AMP Extreme Video Card Review @ Hardware Asylum
- ASUS GeForce RTX 2070 STRIX OC 8 GB @ TechPowerUp
- Zotac Gaming RTX 2070 OC Mini 8GB @ Kitguru
- KFA2 GeForce GTX 1060 6 GB GDDR5X @ TechPowerUp
- Initial Linux Benchmarks Of The NVIDIA TITAN RTX Graphics Card For Compute & Gaming @ Phoronix
- OCC NVIDIA RTX 2080 Overclocking Guide
- Battlefield V NVIDIA Ray Tracing RTX 2070 Performance @ [H]ard|OCP
- NVIDIA DLSS Test in Final Fantasy XV @ TechPowerUp
Subject: Graphics Cards | December 21, 2018 - 02:00 PM | Jim Tanous
Tagged: physx 4.0, PhysX, open source, nvidia
As promised in the company's initial announcement earlier this month, NVIDIA has released the newly open-sourced PhysX 4.0 SDK via GitHub. Now, thanks to its 3-Clause BSD license, any game developer, hardware company, or coding enthusiast can grab the latest version of NVIDIA's realtime physics engine and tinker, improve, or implement it in hopefully creative new ways.
The one limitation, of course, is that in its current form PhysX 4.0 (and version 3.4, which is now open source, too) still references lots of NVIDIA's closed source APIs, notably CUDA. But with the PhysX framework now available to fork, there's nothing to stop an eager company or programmer from creating and implementing their own alternatives to NVIDIA's proprietary tech.
In addition to going open source, PhysX 4.0 introduces a number of new features as outlined on NVIDIA's developer site:
- Temporal Gauss-Seidel Solver (TGS), which makes machinery, characters/ragdolls, and anything else that is jointed or articulated much more robust. TGS dynamically re-computes constraints with each iteration, based on bodies’ relative motion.
- The new reduced coordinate articulations feature makes the simulation of joints possible with no relative position error and realistic actuation.
- New automatic multi-broad phase.
- Increased scalability with new filtering rules for kinematics and statics.
- Actor-centric scene queries significantly improve performance for actors with many shapes.
- Build system now based on CMake.
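NVIDIA's TGS is its own constraint solver, but the "sweep and immediately reuse updated values" structure it builds on is classic Gauss-Seidel iteration. A purely illustrative toy version on a small linear system (not PhysX's actual solver, which operates on contacts and joints rather than a dense matrix):

```python
# Toy Gauss-Seidel iteration on a small linear system Ax = b.
# Illustrative only: PhysX's TGS solves contact/joint constraints, not a
# dense matrix, but the iterative structure is the same.

def gauss_seidel(A, b, iterations=50):
    """Iteratively solve Ax = b, reusing freshly updated values in each sweep."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            # Sum contributions of the other unknowns using the most
            # recently updated values -- the defining Gauss-Seidel trait.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

# Diagonally dominant 2x2 system, so the iteration converges quickly.
x = gauss_seidel([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

The "temporal" part of TGS is the twist described above: constraint targets are re-derived from the bodies' relative motion on every iteration rather than computed once per frame, which is what makes jointed assemblies more robust.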
BSD 3 licensed platforms:
- Apple iOS
- Apple MacOS
- Google Android ARM
- Microsoft Windows
Unchanged NVIDIA EULA platforms:
- Microsoft XBox One
- Sony Playstation 4
- Nintendo Switch
Subject: Graphics Cards, Memory | December 17, 2018 - 04:33 PM | Sebastian Peak
Tagged: Vega, radeon, JESD235, jedec, high bandwidth memory, hbm, DRAM, amd
In a press release today JEDEC has announced an update to the HBM standard, with potential implications for graphics cards utilizing the technology (such as an AMD Radeon Vega 64 successor, perhaps?).
"This update extends the per pin bandwidth to 2.4 Gbps, adds a new footprint option to accommodate the 16 Gb-layer and 12-high configurations for higher density components, and updates the MISR polynomial options for these new configurations."
Original HBM graphic via AMD
The revised spec brings the JEDEC standard up to the level we saw with Samsung's "Aquabolt" HBM2 and its 307.2 GB/s per-stack bandwidth, and adds 12-high TSV stacks (up from 8), which raises maximum memory capacity from 8GB to a whopping 24GB per stack.
The full press release from JEDEC follows:
ARLINGTON, Va., USA – DECEMBER 17, 2018 – JEDEC Solid State Technology Association, the global leader in the development of standards for the microelectronics industry, today announced the publication of an update to JESD235 High Bandwidth Memory (HBM) DRAM standard. HBM DRAM is used in Graphics, High Performance Computing, Server, Networking and Client applications where peak bandwidth, bandwidth per watt, and capacity per area are valued metrics to a solution’s success in the market. The standard was developed and updated with support from leading GPU and CPU developers to extend the system bandwidth growth curve beyond levels supported by traditional discrete packaged memory. JESD235B is available for download from the JEDEC website.
JEDEC standard JESD235B for HBM leverages Wide I/O and TSV technologies to support densities up to 24 GB per device at speeds up to 307 GB/s. This bandwidth is delivered across a 1024-bit wide device interface that is divided into 8 independent channels on each DRAM stack. The standard can support 2-high, 4-high, 8-high, and 12-high TSV stacks of DRAM at full bandwidth to allow systems flexibility on capacity requirements from 1 GB – 24 GB per stack.
This update extends the per pin bandwidth to 2.4 Gbps, adds a new footprint option to accommodate the 16 Gb-layer and 12-high configurations for higher density components, and updates the MISR polynomial options for these new configurations. Additional clarifications are provided throughout the document to address test features and compatibility across generations of HBM components.
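The headline numbers are easy to verify from the interface width and per-pin rate quoted in the release (a quick arithmetic sketch; all input values are taken from the press release itself):

```python
# JESD235B headline numbers, reconstructed from the press release figures.
pins = 1024            # HBM device interface width, in bits
pin_rate_gbps = 2.4    # updated per-pin data rate

stack_bandwidth_gb_s = pins * pin_rate_gbps / 8   # ~307.2 GB/s per stack

layer_gbit = 16        # new 16 Gb layer density option
layers = 12            # new 12-high TSV stack option
stack_capacity_gb = layer_gbit * layers / 8       # 24.0 GB per stack
```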