Subject: Graphics Cards | March 25, 2019 - 03:03 PM | Jeremy Hellstrom
Tagged: RTX 2080 Max-Q, RTX 2080, nvidia
TechSpot posted an investigation into the differences between the 215W desktop RTX 2080 and the 80W mobile Max-Q version. As you might expect, quite a bit had to be done to drop the power so precipitously, including dropping the boost clock from 1515 MHz to just 1095 MHz. To make things even more confusing, there is a 90W variant of the Max-Q with a base clock of 990 MHz and a boost of 1230 MHz, and for the most part you will not know which one is installed until you buy the laptop and open it up.
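To get a rough sense of what that clock drop costs, here is a back-of-the-envelope FP32 estimate using the standard cores × 2 FLOPs (FMA) × clock formula. It ignores real boost behavior, memory bandwidth, and power limits, so treat it as a sketch rather than a benchmark:

```python
# Back-of-envelope FP32 throughput: CUDA cores x 2 FLOPs (FMA) x clock.
def tflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

desktop = tflops(2944, 1515)    # desktop RTX 2080 at 1515 MHz
max_q_80w = tflops(2944, 1095)  # 80W Max-Q at its 1095 MHz boost

print(f"desktop: {desktop:.2f} TFLOPS")      # ~8.92
print(f"80W Max-Q: {max_q_80w:.2f} TFLOPS")  # ~6.45
print(f"ratio: {max_q_80w / desktop:.0%}")   # ~72%
```

On paper, then, the 80W part gives up roughly a quarter of the desktop card's shader throughput before memory and sustained-clock differences are even considered.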
"Following our coverage into Nvidia's laptop RTX GPUs, today we're reviewing the top-end RTX 2080 Max-Q. As an "RTX 2080" Turing part, this GPU comes with 2944 CUDA cores, 368 Tensor cores and 46 ray tracing cores. But that's where the similarities between the RTX 2080 Max-Q and the desktop RTX 2080 end."
Here are some more Graphics Card articles from around the web:
- Gigabyte RTX 2060 Gaming OC 6G @ Modders-Inc
- GeForce GTX 1660 Tested: 33 Game Benchmarks @ TechSpot
- MSI Nvidia GeForce GTX 1660 Gaming X Review – an Even Lower Cost Turing Option? @ Bjorn3d
- Adrenalin Software Edition 19.3.3 Driver Performance Analysis @ BabelTechReview
Subject: Graphics Cards | March 25, 2019 - 02:14 PM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce
Five days after they released the Creator Ready 419.67 drivers, NVIDIA has published their Game Ready equivalent, which is also version 419.67. It can be downloaded from GeForce Experience, or you can get it from their website (alongside the release notes).
In terms of features: this is another entry in the R418 branch. Often, but not always, major new features are merged in at the start, and subsequent releases add game-specific optimizations, bug fixes, and so forth. This is one of those drivers. It adds two new G-SYNC compatible monitors, the ASUS VG278QR and the ASUS VG258, as well as “Game Ready” tweaks for Battlefield V: Firestorm, Anthem, Shadow of the Tomb Raider, and Sekiro: Shadows Die Twice.
Also, if you have multiple G-SYNC compatible displays, NVIDIA now supports using them in Surround on Turing GPUs.
There are also several bug fixes, although most of them are relatively specific. They do have fixes for DaVinci Resolve and multiple Adobe applications, which I believe were first introduced in the Creator Ready driver. If you’re concerned about “missing out” on fixes, it looks like they’re intending to keep Game Ready up-to-date with Creator Ready, just with less QA on professional applications and timed to game releases (versus professional software releases).
Subject: General Tech | March 25, 2019 - 01:47 PM | Jeremy Hellstrom
Tagged: ShadowHammer, security, Kaspersky Labs, asus
Update, 3/26/19: As reported by TechRadar this morning ASUS has responded to the issue and implemented a fix to the latest version of Live Update (version 3.6.8) which provides "an enhanced end-to-end encryption mechanism" for the software. ASUS states that they "have also updated and strengthened our server-to-end-user software architecture to prevent similar attacks from happening in the future”. The company has also released a software tool to see if your system is affected, available directly from ASUS here (ZIP file).
Further, Bloomberg reports today that ASUS has disputed the numbers from the Kaspersky report, stating the attacks impacted only several hundred devices - and not "over a million" as had been estimated by Kaspersky. An ASUS spokesperson also said that "the company had since helped customers fix the problem, patched the vulnerability and updated their servers," in a statement quoted in the Bloomberg report.
The original news post follows.
Today, unfortunately, we have a perfect example of a supply chain attack posted at Slashdot and a very good reason for anyone using ASUS products to do a full scan on their systems as soon as they can. It seems that attackers compromised the ASUS update server, forged two different ASUS digital certificates and pushed out malware to about a half million customers when their machines ran an auto-update. Kaspersky Labs published details on their findings this afternoon as well, cautioning that "the investigation is still in progress and full results and technical paper will be published during SAS 2019 conference in Singapore".
What makes this even more interesting is that the infection was looking for 600 specific MAC addresses; when it found one, it would immediately reach out to another server to install an additional payload. This does not mean those without one of the listed MAC addresses are safe, as the infection could still be present and modified to install additional nastiness on all infected machines. According to the information from Motherboard, Kaspersky first detected this in January and has reached out to ASUS several times, as did Motherboard, which "has not heard back from the company".
"The researchers estimate half a million Windows machines received the malicious backdoor through the ASUS update server, although the attackers appear to have been targeting only about 600 of those systems. The malware searched for targeted systems through their unique MAC addresses."
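To illustrate the targeting mechanism described above, here is a minimal, hypothetical sketch of that kind of MAC-address gate. Kaspersky described the samples as carrying hashed MAC addresses; the hash list, helper names, and choice of MD5 below are illustrative assumptions, not the actual malware's code:

```python
# Hypothetical sketch of MAC-gated payload logic: the second stage only
# triggers when the machine's MAC address hashes to an entry on a target
# list. All names and hash values here are illustrative, not real IOCs.
import hashlib
import uuid

TARGET_HASHES = {
    "0f343b0931126a20f133d67c2b018a3b",  # hypothetical target entry
}

def mac_is_targeted(mac: str) -> bool:
    digest = hashlib.md5(mac.lower().encode()).hexdigest()
    return digest in TARGET_HASHES

# uuid.getnode() returns the local MAC as an int; format it as aa:bb:cc:...
local_mac = ":".join(f"{(uuid.getnode() >> s) & 0xff:02x}" for s in range(40, -8, -8))
print(mac_is_targeted(local_mac))
```

The point of hashing the list is that researchers (or victims) can check whether a given machine was targeted, but cannot trivially enumerate who the 600 targets were.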
Here is some more Tech News from around the web:
- Nintendo planning two new Switch models @ Ars Technica
- Netflix wants to choose its own adventure where Bandersnatch trademark case magically vanishes @ The Register
- Apple is reportedly working on App Store games by subscription @ The Inquirer
- We fought through the crowds to try Oculus's new VR goggles so you don't have to bother (and frankly, you shouldn't) @ The Register
- Microsoft's Chromium version of the Edge browser has leaked all over the internet @ The Inquirer
- Improved Estimates of the Distance To the Large Magellanic Cloud @ Slashdot
- 2019 Samsung Forum - QLED 8K + 4K TVs, iTunes & More @ TechARP
Subject: Processors | March 25, 2019 - 09:46 AM | Sebastian Peak
Tagged: shopping, sale, ryzen 7, ryzen 5, ryzen, processor, price drop, cpu, APU, amd, amazon, 2700x, 2700, 2600x, 2400G
If you haven't looked at AMD Ryzen processor listings over the weekend you might be surprised to see prices reduced across the board on Amazon specifically, with some pretty significant discounts, including a Ryzen 7 2700 for only $219.99 (list price is $299). While we could debate whether these price changes signal the coming of 3000-series Ryzen CPUs sooner rather than later, the price drops are great for consumers regardless.
Here's a current list of the best deals on Ryzen 2000-series processors from Amazon, which seems to have the best prices (with Newegg's discounts far less dramatic).
AMD Ryzen 7 2700X Processor with Wraith Prism LED Cooler
- List price $329, currently $289.99 on Amazon.com
AMD Ryzen 7 2700 Processor with Wraith Spire LED Cooler
- List price $299, currently $219.99 on Amazon.com
AMD Ryzen 5 2600X Processor with Wraith Spire Cooler
- List price $249, currently $189.99 on Amazon.com
AMD Ryzen 5 2600 Processor with Wraith Stealth Cooler
- List price $199, currently $164.99 on Amazon.com
AMD Ryzen 5 2400G Processor with Radeon RX Vega 11 Graphics
- List price $169, currently $134.99 on Amazon.com
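For reference, the cuts above work out to anywhere from about 12% to 26% off list, with the Ryzen 7 2700 being the deepest discount:

```python
# Discount percentages for the Amazon Ryzen price drops listed above.
deals = {
    "Ryzen 7 2700X": (329, 289.99),
    "Ryzen 7 2700":  (299, 219.99),
    "Ryzen 5 2600X": (249, 189.99),
    "Ryzen 5 2600":  (199, 164.99),
    "Ryzen 5 2400G": (169, 134.99),
}

for name, (list_price, sale) in deals.items():
    pct = (list_price - sale) / list_price * 100
    print(f"{name}: ${sale:.2f} ({pct:.0f}% off ${list_price})")
```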
It's always tough to consider a build or upgrade when a new CPU launch is imminent, but the flipside is that previous-gen parts get cheaper (well, at least with these Ryzen parts they do). Here's to more price drops throughout the year.
Subject: Graphics Cards | March 22, 2019 - 04:32 PM | Sebastian Peak
Tagged: processor, Intel, integrated graphics, iGPU, hardware, graphics, gpu, Gen11
Intel has published a whitepaper on their new Gen11 processor graphics, providing details about the underlying architecture. The upcoming Sunny Cove processors and Gen11 graphics were unveiled back in December at Intel's Architecture Day, where Intel had stated that Gen11 was "expected to double the computing performance-per-clock compared to Intel Gen9 graphics", which would obviously be a massive improvement over their current offerings. Intel promises up to 1 TFLOP of performance from Gen11, with its 64 EUs (execution units) and other improvements providing up to a 2.67x increase over Gen9 - though Intel does clarify that "there may be different configurations" so we will very likely see the usual segmentation.
"The architecture implements multiple unique clock domains, which have been partitioned as a per-CPU core clock domain, a processor graphics clock domain, and a ring interconnect clock domain. The SoC architecture is designed to be extensible for a range of products and enable efficient wire routing between components within the SoC."
Gen11 graphics will be based on Intel’s 10nm process, with architectural refinements that promise significant performance-per-watt improvements, according to Intel. Intel also states that memory bandwidth has been addressed to meet the demands of the increased potency of the GPU, with improvements to compression, larger L3 cache size, and increased peak memory bandwidth. All major graphics APIs are supported including DirectX, OpenGL, Vulkan, OpenCL, and Metal - the last of which makes sense as these will very likely be powering the next generation of Apple's MacBook line.
Intel states that beyond the increases in compute and memory bandwidth, Gen11 will introduce "key new features that enable higher performance by reducing the amount of redundant work", and lists Coarse Pixel Shading (CPS) and Position Only Shading Tile Based Rendering (PTBR) among them. Many more details are provided in the document, available at the source link (warning, PDF).
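The 1 TFLOP claim checks out against the EU count, assuming the usual Gen execution unit layout of two 4-wide SIMD FPUs per EU with FMA counted as two FLOPs. A quick sanity check (clocks here are my illustrative assumptions, not Intel's stated frequencies):

```python
# Sanity-checking Intel's 1 TFLOP claim for 64 EUs. Each Gen EU has two
# 4-wide SIMD FPUs, and an FMA counts as 2 FLOPs: 2 * 4 * 2 = 16 FLOPs/clock.
FLOPS_PER_EU_PER_CLOCK = 2 * 4 * 2

def gen_tflops(eus, clock_ghz):
    return eus * FLOPS_PER_EU_PER_CLOCK * clock_ghz / 1000

print(gen_tflops(64, 1.0))   # 1.024 -> Gen11's 64 EUs at ~1 GHz
print(gen_tflops(24, 1.15))  # ~0.44 -> a 24-EU Gen9 GT2 part, for comparison
```

In other words, 64 EUs cross the 1 TFLOP line right around a 1 GHz graphics clock, which is plausible territory for an integrated part.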
Subject: General Tech | March 22, 2019 - 02:18 PM | Jeremy Hellstrom
Tagged: razer, nari ultimate THX wireless chroma, wireless headset, Hypersense, audio
Many moons ago several companies released headsets with what was essentially force feedback and, as it turned out, not many people enjoyed it. Razer is bringing that back with their Razer Nari Ultimate THX Wireless Chroma headset, which comes with a price as large as its name. They call this feature HyperSense and, as you might expect, turning it on in tandem with the blinken-lighten significantly reduces the amount of time the battery lasts. On the plus side, the headset uses 50mm drivers with neodymium magnets and offers a frequency response of 20 Hz to 20 kHz.
You can see them in use over at KitGuru, or at least take a look at the freeze frame as it is worth the click.
"Ever wished your headset vibrated and you could feel what was happening in game? The Razer Nari headset was released back in September and in this review we take a look at the recently released Razer Nari Ultimate that has all the features of the original plus Hypersense technology."
Here is some more Tech News from around the web:
- FiiO M6 Portable High Resolution Music Player Review @ NikKTech
- Razer Kraken Gaming Headset @ KitGuru
- AVerMedia GC573 4K Capture Card & Live Streamer Mic 133 @ Kitguru
Subject: General Tech | March 22, 2019 - 01:43 PM | Jeremy Hellstrom
Tagged: flashbolt, Samsung, HBM2, HBM2E, aquabolt, flarebolt
Last year Samsung released Aquabolt, their HBM2 modules which offered 2.4 Gbps per pin data transfer rates at 1.2V, rather impressive since the specification stated 2.0 Gbps per pin was the top transfer rate. Today they announced the next generation, Flashbolt, which uses HBM2E and, unlike certain other standards which recently bolted an E onto the end of their name, offers an actual improvement.
To make sure you are confused this Friday, the new chips are rated for 410GB/s of bandwidth and 16GB of capacity; go yell at The Inquirer but it isn't a mistake. To put it more helpfully, HBM2E can accommodate 16Gb per die, with transfer rates of 3.2Gbps per pin. That is an immense amount of bandwidth for GPUs and HPC cards, though we won't be seeing any products in the near future, nor are you likely to be able to insert it into your system any time soon. AMD adopted a winning technology first, but as often seems to be the case, perhaps they moved a little too early.
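The 410GB/s and 16GB figures fall straight out of the HBM stack arithmetic: each stack has a 1024-bit interface, so per-stack bandwidth is just pin rate times pin count:

```python
# Where the 410GB/s and 16GB figures come from: an HBM stack exposes a
# 1024-bit interface, and Flashbolt runs each pin at 3.2 Gbps.
PINS_PER_STACK = 1024

def stack_bandwidth_gbs(gbps_per_pin):
    return gbps_per_pin * PINS_PER_STACK / 8  # bits -> bytes

print(stack_bandwidth_gbs(3.2))  # 409.6 GB/s per stack (the "410GB/s")
print(stack_bandwidth_gbs(2.4))  # 307.2 GB/s for Aquabolt, for comparison

# Capacity: 16Gb per die x 8 dies in a stack = 128Gb = 16GB
print(16 * 8 / 8)  # 16.0 GB
```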
"The name follows the same conventions as Samsung's previous memory products, Aquabolt and Flarebolt, but relies on a radically different architecture: HBM2E can accommodate 16Gb per die, twice as much as HBM2 used in Aquabolt, and offer transfer rates of 3.2Gbps per pin."
Here is some more Tech News from around the web:
- AT&T's 5G E Falls Short of T-Mobile and Verizon 4G Speeds: OpenSignal @ Slashdot
- Make Your Own Quantum Dots @ Hackaday
- Most Bitcoin Trading Faked by Unregulated Exchanges, Study Finds @ Slashdot
- Azure thing at last: Windows Virtual Desktop takes to the cloudy stage @ The Register
- 750,000 Medtronic Defibrillators Vulnerable To Hacking @ Slashdot
- Reprogrammable self-assembly makes molecular computer @ Physicsworld
- Clippy briefly resurrected as Teams add-on, brutally taken down by brand police @ Ars Technica
- Microsoft successfully creates an end-to-end storage device using DNA @ The Inquirer
- Grandson of Legendary John Deere Inventor Calls Out Company On Right To Repair @ Slashdot
- NETGEAR Orbi Voice (RBK50V) AC2200 Mesh Wi-Fi System @ Kitguru
- Tacoma Is FREE For A Limited Time
Subject: Motherboards | March 21, 2019 - 03:00 PM | Jeremy Hellstrom
Tagged: PRIME Z390-A, asus
Here it is, the last motherboard review you will see from [H]ard|OCP as the sun sets on another review site for Kyle has chosen to join the dark side. The ASUS PRIME Z390-A has a rather unique looking silkscreen pattern but RGB lovers will find the placement of the glowy bits unfortunate. At $183 currently on Amazon it sits in the middle of the Z390 family as far as pricing, and that does mean it lacks some of the fancier features offered by flagship class motherboards. If stability, decent overclocking and a bargain price matter more to you than wireless connectivity and a well developed case of RGB-itis then this last review from [H] is well worth a look.
"Sometimes less is more. We typically work with a lot of motherboards that fall into the more is more category and some that have a high price that can’t easily be justified by most people. The ASUS PRIME Z390-A seems to lean towards less is more. On paper the Z390-A seems to have a lot going for it and is light on fluff. How does it stack up? "
Here are some more Motherboard articles from around the web:
- ASUS ROG Maximus XI Hero (Wi-Fi) Z390 @ Kitguru
- ASUS ROG Rampage VI Extreme Omega @ The Guru of 3D
- Gigabyte Z390 Aorus Master @ Kitguru
Subject: General Tech | March 21, 2019 - 01:02 PM | Jeremy Hellstrom
Tagged: Windows 7, windows 10, microsoft, EoL
It does have to be said that running a 10 year old Microsoft OS might not be the wisest decision; though it is better than running one 18 years old. However, as we learned in 2017 many businesses are not even close to adopting Windows 10 on the majority of their systems. There are numerous reasons for that delay, from licensing through security to privacy not to mention the interface is different enough to upset many executive level users.
That hasn't stopped Microsoft from once again displaying splash screens on Windows 7 machines, as KB4493132 rolls out to those with automatic updates enabled. Thankfully it does not attempt to fool you into updating by changing the way the close window button works, but then again, the update is no longer free. The Inquirer, as you might expect, is as enthused about this as most users.
"HERE WE GO AGAIN. Two years on from Updategate, Microsoft is back to posting nag screens on its outgoing operating system."
Here is some more Tech News from around the web:
- It’s been a long wait, but Bill and Ted 3: Face the Music is happening @ Ars Technica
- Stop us if you're getting deja-vu: Uber used spyware to nobble dial-a-ride rival, this time Down Under, allegedly @ The Register
- Microsoft brings Defender AV software to Macs for the first time @ The Inquirer
- Falling NAND flash prices to drive SSD adoption in enterprise, datacenter apps @ DigiTimes
- Our Skyborg (actual US govt program) will be just like IBM Watson, beams Air Force bod @ The Register
- For Years, Hundreds of Millions of Facebook Users Had Their Account Passwords Stored in Plain Text and Searchable By Thousands of Facebook Employees @ Slashdot
Subject: General Tech | March 21, 2019 - 12:05 PM | Jim Tanous
Tagged: podcast, The Division 2, stadia, razer, ray tracing, Oculus, Intel, hardocp, game streaming, DirectX 12, Basilisk
PC Perspective Podcast #537 - 3/20/2019
Join us this week as we review the new NVIDIA GTX 1660 and a high-end case from Corsair, discuss NVIDIA's Mellanox acquisition, get excited over Halo for PC, and more!
Subscribe to the PC Perspective Podcast
Check out previous podcast episodes: http://pcper.com/podcast
Today's sponsor is KiwiCo. Change the way your kids learn and play. Get your first KiwiCo Crate free: https://www.kiwico.com/pcper
00:00:22 - Intro
00:02:10 - Review: Razer Basilisk Essential Gaming Mouse
00:05:49 - Review: The Division 2 Performance Preview
00:18:34 - News: Real-Time Ray Tracing for Pascal
00:29:19 - News: Google Stadia
00:46:19 - News: GameWorks RTX & Unreal/Unity DXR Support
00:48:54 - News: Crytek Neon Noir Ray Tracing Demo
00:50:48 - News: Variable Rate Shading for DirectX 12
00:54:06 - News: NVIDIA T4 GPUs for the Server Room
01:00:10 - News: NVIDIA Creator Ready Drivers
01:04:29 - News: Oculus Rift S
01:08:39 - News: AMD Immune to SPOILER
01:18:13 - News: Windows 7 DX12 Support in Adrenalin Drivers
01:22:32 - News: Kyle Bennett Joins Intel
01:34:28 - Picks of the Week
01:44:58 - Outro
Subject: Processors | March 20, 2019 - 04:34 PM | Jeremy Hellstrom
Tagged: amd, coreprio, threadripper 2, 2990wx, dynamic local mode
Owning a Threadripper is not boring; the new architecture offers a variety of interesting challenges to keep your attention. One of these quirks is the lack of direct memory access for two of the dies, which can cause some performance issues and was at least partially addressed by introducing Dynamic Local Mode into Ryzen Master. On Windows boxes, enabling that feature ensures your hardest working cores have direct memory access; on Linux systems the problem simply doesn't exist. Another option is Coreprio, developed by Bitsum, which accomplishes the same task but without the extras included in Ryzen Master.
"Performance regression issues in Windows on AMD’s top-end Ryzen Threadripper CPUs haven’t gone unnoticed by those who own them, and six months after launch, the issues remain. Fortunately, there’s a new tool making the rounds that can help smooth out those regressions. We’re taking an initial look."
Here are some more Processor articles from around the web:
- AMD Ryzen Threadripper 2970WX & 2920X Workstation Performance @ Techgage
- AMD Ryzen Memory Tweaking & Overclocking Guide @ TechPowerUp
- Testing Intel Whiskey Lake CPUs: Core i7-8565U @ Techspot
Subject: Graphics Cards | March 20, 2019 - 04:03 PM | Scott Michaud
Tagged: nvidia, graphics drivers
At the ongoing Game Developers Conference, GDC 2019, NVIDIA has announced a “Creator Ready Driver” program. This branch targets the users that sit between the GeForce and Quadro line, who use GeForce cards for professional applications – such as game developers.
Both Game Ready Drivers and the Creator Ready Drivers will support all games and creative applications. The difference is that Creator Ready drivers will be released according to the release schedule of popular creative tools, and they will receive more strict testing with creative applications.
The first release, Creator Ready Drivers 419.67, lists a 13% performance increase with the Blender Cycles renderer, and an 8% to 9% increase for CINEMA 4D, Premiere Pro CC, and Photoshop CC, all relative to the 415-branch GeForce drivers.
You can choose between either branch using the GeForce Experience interface.
Image Credit: NVIDIA
As for supported products? The Creator Ready Drivers are available for Pascal-, Volta-, and Turing-based GeForce and TITAN GPUs. “All modern Quadro” GPUs are also supported on the Creator Ready branch.
Personally, my brain immediately thinks “a more-steady driver schedule and some creative application performance enhancements at the expense of a little performance near the launch of major video games”. I could be wrong, but that’s what I think when I read this news.
Subject: General Tech | March 20, 2019 - 02:15 PM | Jeremy Hellstrom
Tagged: roguetech, mod, gaming, battletech
Have you already collected every 'mech available in the game and bashed your way through the Flashpoint expansion and are looking for something new in your BattleTech gaming time? How does gunning down a CattleMaster, Uriel or Phoenix Hawk with an LBX Autocannon sound? RogueTech is a huge mod for Harebrained Schemes' BattleTech which adds over 1000 'mechs, just about every weapon added into the tabletop game over the years, a fair number of new crunchies and even those annoying Power Armour nitwits.
The changes go deeper than that, with total overhauls to movement, gunnery and how damage is inflicted, by yourself as well as others. Expect your first missions to go poorly as you stumble your way through the complete revamp of the game, as you can read about over at Rock, Paper, SHOTGUN; installation instructions included.
"It brings the combat a little more in line with the full, expanded tabletop rule-set (which no sane person ever used, thanks to requiring dozens of skill checks and countless dice rolled a turn), but through the magic of computers, we can experience all the thrills of full simulation-level ‘Mech combat without putting you to sleep or frying your brain."
Here is some more Tech News from around the web:
- The Division 2: PC graphics performance benchmark analysis @ The Guru of 3D
- Tom Clancy's The Division 2 @ BabelTechReviews
- Shadow of the Tomb Raider perf review update: RTX & DLSS @ Guru of 3D
- Shadow of the Tomb Raider Ultra RTX Performance and IQ Analysis @ BabelTechReviews
- RTX and DLSS in Shadow of the Tomb Raider @ TechPowerUp
- Humble Strategy Bundle
- The System Shock remake drips with neon nostalgia in new footage @ Rock, Paper, SHOTGUN
- Vehicular battle royale 'Notmycar' hits Steam on April 5th @ Engadget
- System Shock 3 first trailer: she's back (and so's he) @ Rock, Paper, SHOTGUN
Subject: General Tech | March 20, 2019 - 01:20 PM | Jeremy Hellstrom
Tagged: Oculus Rift S, Oculus, vr headset, gdc 2019
The brand new Oculus Rift S is being shown off at GDC and Ars Technica had a chance to play with it. The new implementation is rather impressive: a single wire connects you to your PC and there are now no external cameras; instead, they have been relocated to the headset itself. From the description in the review it seems they have done so very successfully, with location tracking improving rather than degrading due to that change. Your eyeballs also get an upgraded experience, with each eye having a 1280x1440 display, though as of yet Oculus has not changed to AMOLED screens.
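For a sense of scale, comparing the Rift S panel against the original Rift's 1080x1200-per-eye OLEDs works out to roughly 42% more pixels to render:

```python
# Per-eye pixel counts: Rift S (1280x1440 LCD) vs the original
# Rift's 1080x1200-per-eye OLED panels.
def total_pixels(width, height, eyes=2):
    return width * height * eyes

rift_s = total_pixels(1280, 1440)    # 3,686,400
rift_cv1 = total_pixels(1080, 1200)  # 2,592,000

print(rift_s, rift_cv1)
print(f"{rift_s / rift_cv1 - 1:.0%} more pixels")  # ~42% more
```

That is a meaningful jump in GPU load, which is presumably part of why inside-out tracking and a single cable were prioritized over a panel-technology change.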
"This is out-of-the-box room scale," Oculus co-founder Nate Mitchell said as he gestured to the Oculus Rift S, the new PC-focused VR headset launching "this spring" for $399. This headset will effectively replace the standard Oculus Rift headset, which has had production all but halted to make way for the new model."
Here is some more Tech News from around the web:
- Qualcomm's latest chip will stop your smart speaker from fluffing commands @ The Inquirer
- Opera Adds Free and Unlimited VPN Service To Its Android Browser @ Slashdot
- EC whacks Google with €1.49bn antitrust fine over AdSense @ The Inquirer
- Hands-On: New Nvidia Jetson Nano is More Power In A Smaller Form Factor @ Hackaday
- LLVM 8.0 Released With Cascade Lake Support, Better Diagnostics, More OpenMP/OpenCL @ Slashdot
Subject: Graphics Cards | March 19, 2019 - 08:29 PM | Scott Michaud
Tagged: microsoft, DirectX 12, turing, nvidia
To align with the Game Developers Conference, GDC 2019, Microsoft has announced Variable Rate Shading for DirectX 12. This feature increases performance by allowing the GPU to lower its shading resolution for specific parts of the scene (without the developer doing explicit, render texture-based tricks).
An NVIDIA speech from SIGGRAPH 2018 (last August)
The feature is divided into three parts:
- Lowering the resolution of specific draw calls (tier 1)
- Lowering the resolution within a draw call by using an image mask (tier 2)
- Lowering the resolution within a draw call per primitive (ex: triangle) (tier 2)
The last two points are tagged as “tier 2” because they can reduce the workload within a single draw call, which is an item of work that is sent to the GPU. A typical draw call for a 3D engine is a set of triangles (vertices and indices) paired with a material (a shader program, textures, and properties). While it is sometimes useful to lower the resolution for particularly complex draw calls that take up a lot of screen space but whose output is also relatively low detail, such as water, there are real benefits to being more granular.
The second part, an image mask, allows detail to be reduced for certain areas of the screen. This can be useful in several situations:
- The edges of a VR field of view
- Anywhere that will be brutalized by a blur or distortion effect
- Objects behind some translucent overlays
- Even negating a tier 1-optimized section to re-add quality where needed
The latter example was the one that Microsoft focused on with their blog. Unfortunately, I am struggling to figure out what specifically is going on, because the changes that I see (ex: the coral reef, fish, and dirt) don’t line up with their red/blue visualizer. The claim is that they use an edge detection algorithm to force high-frequency shading where there would be high-frequency detail.
Right side reduces shading by 75% for terrain and water
Right side reclaims some lost fidelity based on edge detection algorithm
Visualization of where shading complexity is spent.
(Red is per-pixel. Blue is 1 shade per 2x2 block.)
Images Source: Firaxis via Microsoft
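The 75% figure in those captions is simply the coarse-rate arithmetic: at a 2x2 shading rate, one pixel-shader invocation covers a 2x2 block, so shaded samples drop to a quarter. A minimal sketch of that bookkeeping (the helper is mine, not part of the DirectX API):

```python
# How coarse shading rates cut pixel-shader work: at a 2x2 rate, one
# shader invocation covers a 2x2 block, reducing invocations by 75%.
import math

def shader_invocations(width, height, rate=(1, 1)):
    rate_x, rate_y = rate
    return math.ceil(width / rate_x) * math.ceil(height / rate_y)

full = shader_invocations(1920, 1080)            # 2,073,600 at 1x1
coarse = shader_invocations(1920, 1080, (2, 2))  # 518,400 at 2x2

print(f"saved: {1 - coarse / full:.0%}")  # 75%
```

Tier 2's image mask just applies this per 2D region instead of per draw, which is why it can claw quality back around edges while keeping the savings elsewhere.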
Microsoft claims that this feature will only be available for DirectX 12. That said, NVIDIA, when Turing launched, claimed that Variable Rate Shading would be available for DirectX 11, DirectX 12, Vulkan, and OpenGL. I'm not sure what is different about Microsoft's implementation that separates it from NVIDIA's extension.
Microsoft will have good tools support, however. They claim that their PIX for Windows performance analysis tool will support this feature on Day 1.
Subject: General Tech, Graphics Cards, Networking, Shows and Expos | March 19, 2019 - 06:16 PM | Jeremy Hellstrom
Tagged: nvidia, t4, amazon, microsoft, NGC, Mellanox, CUDA-X, GTC, jen-hsun huang, DRIVE Constellation, ai
As part of their long list of announcements yesterday, NVIDIA revealed they are partnering with Cisco, Dell EMC, Fujitsu, Hewlett Packard Enterprise, Inspur, Lenovo and Sugon to provide servers powered by T4 Tensor Core GPUs, optimized to run their CUDA-X AI accelerators.
Those T4 GPUs have been on the market for a while, but this marks the first major success for NVIDIA in the server room, with models available for purchase from those aforementioned companies. Those who prefer other people's servers can also benefit from these new products, with Amazon and Microsoft offering cloud-based solutions. Setting yourself up to run NVIDIA's NGC software may save a lot of money down the road; the cards sip a mere 70W of power, which is rather more attractive than the consumption of a gaggle of Tesla V100s. One might be guilty of suspecting this offers an explanation for their recent acquisition of Mellanox.
NGC software offers more than just a platform to run optimizations on; it also offers a range of templates to start from, covering classification and object detection through sentiment analysis and most other basic starting points for training a machine. Customers will also be able to upload their own models to share internally or, if in the mood, externally with other users and companies. It supports existing products such as TensorFlow and PyTorch but also offers access to CUDA-X AI, which as the name suggests takes advantage of the base design of the T4 GPU to reduce the amount of time spent waiting for results, letting users advance designs quickly.
If you are curious what particular implementations of everyone's favourite buzzword might look like, NVIDIA's DRIVE Constellation is an example after JoshTekk's own heart: literally a way to create open, scalable simulations for large fleets of self-driving cars to train them ... for good, one hopes. Currently the Toyota Research Institute-Advanced Development utilizes these products in the development of their next self-driving fleet, and NVIDIA obviously hopes others will follow suit.
There is not much to see from the perspective of a gamer in the short term, but considering NVIDIA's work at shifting the horsepower from the silicon you own to their own Cloud this will certainly impact the future of gaming from both a hardware and gameplay perspective. GPUs as a Service may not be the future many of us want but this suggests it could be possible, not to mention the dirty tricks enemy AIs will be able to pull with this processing power behind them.
One might even dream that escort missions could become less of a traumatic experience!
Subject: Cases and Cooling | March 19, 2019 - 03:35 PM | Sebastian Peak
Tagged: tempered glass, mesh, MasterBox NR600, MasterBox NR400, enclosure, cooler master, chassis, case, airflow
Cooler Master has launched a pair of new cases today with the MasterBox NR400 and the MasterBox NR600. These budget-friendly cases offer a minimalist approach with clean lines and no flashy lighting effects, and should offer good airflow with their full mesh front panels.
The Cooler Master MasterBox NR600 is an ATX mid-tower
The MasterBox NR400 is the smaller of the two cases, supporting mini ITX and micro ATX motherboards, with the larger MasterBox NR600 offering support for standard ATX motherboards.
Cooler Master provides this list of features for these two new NR series cases:
- Minimalistic Mesh Design - Elegant design elements are applied to mesh for optimal thermal performance.
- Optimal Thermal Performance – The full mesh front panel and ventilated top panel provide a high potential for thermal performance.
- Flush Tempered Glass Side Panel – The tempered glass side panel, fastened by thumbscrews on the rear panel, keeps the surface completely flush.
- Headset Jack – The single 4 pole headset jack features both audio and microphone capabilities simultaneously so separate jacks are not needed.
- Graphics Card Support Up to 346mm (NR400) / 410mm (NR600) – Generous clearance space is provided to support the latest graphics cards.
- Cable Management – High quality, longer length rubber grommets paired with generous clearance behind the motherboard offers ample room for cable management.
"Thermal performance is at the core of the NR Series, with the combination of two pre-installed fans for intake and exhaust and the fine mesh front panel. Generous support for radiator mounting and unrestricted ventilation on the top and front panel ensure that even the most demanding components are efficiently cooled while the two USB 3.1 gen 1 ports at the top of the case provide ample connectivity for everyday use."
The Cooler Master MasterBox NR400 is a smaller mATX design
The MasterBox NR400 has an MSRP of $59.99, with the NR600 at $69.99. Both cases are available for pre-sale today, though no listings have yet found their way to Amazon or Newegg at time of writing.
Subject: General Tech | March 19, 2019 - 03:28 PM | Scott Michaud
Tagged: google, stadia
Google Stadia, a new gaming service, was announced at the 2019 Game Developers Conference. Much like OnLive and PlayStation Now, users connect to an instance of video games running in datacenters. Because it is a conference for game developers, they also mentioned a few added features, such as directly transcoding for YouTube and allowing the audience to wait in line and jump into a streamer’s game session.
Requirements and price were not mentioned, but it will arrive sometime in 2019.
Google also mentioned that applicable games would run on Linux and the Vulkan API. Given their Android initiative, and their desire to not pay Microsoft a license fee for VMs, this makes a lot of sense. It also forces major engine developers to target and optimize Linux-focused APIs, which will be good in the long run, especially if Google starts adding GPUs from Intel and NVIDIA in the coming years. I'm not sure how much it will push Linux ports, however, because that's not really why publishers avoid the platform.
In terms of hardware, Google claims that an instance will have about 10.7 teraflops of GPU performance on an AMD platform. In terms of compute, this puts it roughly on par with a GTX 1080 Ti, although AMD tends to have lower fillrate and so on, which keeps their parts behind NVIDIA parts of equivalent compute performance. (Those equivalent AMD parts also tend to be significantly cheaper, so comparing flop to flop isn't fair in most other circumstances.)
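For context on where that 10.7 TFLOPS figure sits, the usual back-of-the-envelope math is just shader count times clock times two FLOPs per FMA. A quick sketch, using the GTX 1080 Ti's publicly listed specs (3584 CUDA cores, ~1582 MHz boost):

```python
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # Theoretical peak FP32 throughput: each shader retires one
    # fused multiply-add (2 FLOPs) per clock cycle.
    return shaders * clock_ghz * 2 / 1000.0

# GTX 1080 Ti: 3584 CUDA cores at roughly 1.582 GHz boost.
print(round(fp32_tflops(3584, 1.582), 1))  # ~11.3 TFLOPS, close to Stadia's claimed 10.7
```

This is a theoretical peak, of course; real games rarely sustain it, which is exactly why flop-for-flop comparisons across vendors only go so far.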
As long-time readers know, I am also very cautious about streaming services because they are just about the epitome of DRM. When the content is also available on other, local platforms? Not really a big deal. But if a game is developed for Google's service and requires Google's API, then games that leave the service, either because Google's license ends or because Google simply stops hosting them, will be just about impossible to preserve. Games are not movies or sound files that can be transcoded into another format.
Linux and Vulkan do provide a bit more confidence, but you never know what will happen when a company, with no legal (ex: the GPL) obligation to remain open, gets large enough.
It’s something that I’m still worried about, though.
Subject: General Tech | March 19, 2019 - 02:34 PM | Sebastian Peak
Tagged: RGB, nzxt, lighting, HUE 2 Ambient RGB Kit V2, HUE 2, ambient
NZXT has announced version 2 of the HUE 2 Ambient RGB Lighting Kit, an update which the company says "brings product improvements that make installation easier and help ensure that the included LED strips will be securely attached to your monitor". The lighting kit uses NZXT's CAM software to select between lighting modes - which include an ambient mode that changes the lighting to complement the images on your monitor.
"The HUE 2 ecosystem delivers the broadest line of RGB lighting accessories for PC Builders with over 10 different accessories such as the HUE 2 RGB lighting kit, underglow kit, AER RGB 2 fans, and cable combs."
NZXT provides this list of features for the new HUE 2 kit:
HUE 2 Ambient RGB Kit V2 New Features:
- Stronger Adhesive: Upgraded the 3M LED strip adhesive backing tape to be thicker and stickier. It is more compatible with the different surfaces and textures of monitors on the market.
- L-Shape Corner Connector: For easier setup, we replaced the 150mm corner connectors with L-shaped corner connectors for the top left and bottom right of your monitor.
- Alcohol Wipes: As your monitor may be dusty or dirty, we added alcohol-based cleaning wipes so you can clean the back of the monitor before adhering the LED strips.
HUE 2 RGB Ambient Lighting Kit Features
- Available in two configurations, one for 21” to 25”/34”-35” UltraWide monitors and one for 26” to 32” monitors, the HUE 2 RGB Ambient Lighting Kit adds gorgeous lighting to your gaming PC and provides a more immersive in-game experience* by projecting the colors from the edges of your display to your walls.
- HUE 2 RGB Ambient Lighting controller
- Faster microprocessor enables incredibly smooth lighting effects and virtually instantaneous response changes to colors on the screen in ambient mode
- Multiple LED strips in various lengths, along with easy-to-follow configuration and installation instructions for various monitor sizes and aspect ratios
- AC power adapter
- All necessary cables
- Compatible with all HUE 2 accessories and CAM lighting modes
* Ambient lighting modes are available only with the HUE 2 Ambient Lighting controller and require the use of CAM as a registered user, including acceptance of the then-current NZXT Terms of Service.
The HUE 2 lighting kits are available now in the USA with pricing as follows:
- HUE 2 Ambient RGB Lighting Kit V2 (21”-25”, 34”-35” UltraWide) - $99.99
- HUE 2 Ambient RGB Lighting Kit V2 (26”-32”) - $99.99
Subject: Cases and Cooling | March 19, 2019 - 01:57 PM | Jeremy Hellstrom
Tagged: thermaltake, Water 3.0 360 ARGB Sync, watercooler, 360mm radiator, RGB
Thermaltake's Water 3.0 360 ARGB Sync is a water cooler for those who live and breathe RGB, as it is compatible with the lighting software suites found on current motherboards so you can create your own lightshow. It supports all current sockets from both AMD and Intel; the only compatibility issue you might have is fitting the 360mm radiator into your case. In KitGuru's testing it performed well at cooling too, but really it's all about the RGBs.
"There’s a lot to unpack from the name of the Thermaltake Water 3.0 360 ARGB Sync. The ‘360’ obviously refers to its 360mm radiator, and we know ARGB = Addressable RGB lighting. The ‘Sync’ aspect references software compatibility for the Water 3.0 360 with current motherboards ..."
Here are some more Cases & Cooling reviews from around the web:
- ID-Cooling Auraflow X 240 @ TechPowerUp
- Noctua NF-F12 PWM chromax Fan @ TechPowerUp
- Corsair Carbide 678C @ The Guru of 3D
- CORSAIR CRYSTAL 680X RGB ATX Smart Case Review @ NikKTech
- Cooler Master MasterBox NR400 @ Kitguru