
NVIDIA to Add Real-Time Ray Tracing Support to Pascal GPUs via April Driver Update

Subject: Graphics Cards | March 18, 2019 - 09:41 PM |
Tagged: unreal engine, Unity, turing, rtx, ray tracing, pascal, nvidia, geforce, GTC 19, GTC, gaming, developers

Today at GTC NVIDIA announced a few things of particular interest to gamers, including GameWorks RTX and the implementation of real-time ray tracing in upcoming versions of both Unreal Engine and Unity (we already posted the news that CRYENGINE will be supporting real-time ray tracing as well). But there is something else... NVIDIA is bringing ray tracing support to GeForce GTX graphics cards.

DXR_GPUs.png

This surprising turn means that real-time ray tracing support won’t be limited to RTX cards after all, as the install base of NVIDIA ray-tracing GPUs “grows to tens of millions” with a simple driver update next month, adding the feature both to previous-gen Pascal and to the new Turing GTX GPUs.

How is this possible? It’s all about the programmable shaders:

“NVIDIA GeForce GTX GPUs powered by Pascal and Turing architectures will be able to take advantage of ray tracing-supported games via a driver expected in April. The new driver will enable tens of millions of GPUs for games that support real-time ray tracing, accelerating the growth of the technology and giving game developers a massive installed base.

With this driver, GeForce GTX GPUs will execute ray traced effects on shader cores. Game performance will vary based on the ray-traced effects and on the number of rays cast in the game, along with GPU model and game resolution. Games that support the Microsoft DXR and Vulkan APIs are all supported.

However, GeForce RTX GPUs, which have dedicated ray tracing cores built directly into the GPU, deliver the ultimate ray tracing experience. They provide up to 2-3x faster ray tracing performance with a more visually immersive gaming environment than GPUs without dedicated ray tracing cores.”

A very important caveat is the “2-3x faster ray tracing performance” for GeForce RTX graphics cards mentioned in the last paragraph; expectations will need to be tempered, as RT features will be less efficient running on shader cores (Pascal and Turing GTX) than they are on dedicated RT cores, as demonstrated by these charts:
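To get a feel for why shader-core ray tracing is slower, consider the kind of per-ray work involved. The RT cores in RTX cards accelerate BVH traversal and intersection testing in fixed-function hardware; on GTX cards that same math must run as ordinary shader arithmetic, for every ray cast. The sketch below is purely illustrative (it is not NVIDIA's driver code), showing the classic ray/AABB "slab test" that sits at the heart of BVH traversal:

```python
# Illustrative only: the ray/AABB "slab test" used during BVH traversal.
# RTX cards execute this kind of intersection math in dedicated RT cores;
# on GTX cards the April driver runs it on the ordinary shader cores,
# repeated for every ray and every bounding box visited.

def ray_aabb_intersect(origin, direction, box_min, box_max):
    """Return True if a ray hits an axis-aligned bounding box."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if d == 0.0:
            if not (lo <= o <= hi):  # ray parallel to, and outside, this slab
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far and t_far >= 0.0

# A ray fired along +X from the origin hits a unit box centered at (5, 0, 0):
print(ray_aabb_intersect((0, 0, 0), (1, 0, 0), (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))
```

Multiply a few dozen of these tests per ray by millions of rays per frame and the gap between dedicated hardware and general-purpose shaders becomes obvious.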

BFV_CHART.png

METRO_EXODUS_CHART.png

SOTTR_CHART.png

It's going to be a busy April.

Source: NVIDIA
Manufacturer: PC Perspective

AMD and NVIDIA GPUs Tested

Tom Clancy’s The Division 2 launched over the weekend and we've been testing it out over the past couple of days with a collection of currently-available graphics cards. Of interest to AMD fans, this game joins the ranks of those well optimized for Radeon graphics, and with a new driver (Radeon Software Adrenalin 2019 Edition 19.3.2) released over the weekend it was a good time to run some benchmarks and see how some AMD and NVIDIA hardware stack up.

d2-key-art-1920x600.jpg

The Division 2 offers DirectX 11 and 12 support, and uses Ubisoft's Snowdrop engine to provide some impressive visuals, particularly at the highest detail settings. We found the "ultra" preset to be quite attainable with very playable frame rates from most midrange-and-above hardware even at 2560x1440, though bear in mind that this game uses quite a bit of video memory. We hit a performance ceiling at 4GB with the "ultra" preset even at 1080p, so we opted for 6GB+ graphics cards for our final testing. And while most of our testing was done at 1440p we did test a selection of cards at 1080p and 4K, just to provide a look at how the GPUs on test scaled when facing different workloads.

Tom Clancy's The Division 2

d2-screen1-1260x709.jpg

Washington D.C. is on the brink of collapse. Lawlessness and instability threaten our society, and rumors of a coup in the capitol are only amplifying the chaos. All active Division agents are desperately needed to save the city before it's too late.

d2-screen4-1260x709.jpg

Developed by Ubisoft Massive and the same teams that brought you Tom Clancy’s The Division, Tom Clancy’s The Division 2 is an online open world, action shooter RPG experience set in a collapsing and fractured Washington, D.C. This rich new setting combines a wide variety of beautiful, iconic, and realistic environments where the player will experience the series’ trademark for authenticity in world building, rich RPG systems, and fast-paced action like never before.

d2-screen3-1260x709.jpg

Play solo or co-op with a team of up to four players to complete a wide range of activities, from the main campaign and adversarial PvP matches to the Dark Zone – where anything can happen.

Continue reading our preview of GPU performance with The Division 2

Nice OS you have there; shame if something happened to it

Subject: General Tech | March 21, 2019 - 01:02 PM |
Tagged: Windows 7, windows 10, microsoft, EoL

It does have to be said that running a 10-year-old Microsoft OS might not be the wisest decision, though it is better than running one 24 years old.  However, as we learned in 2017, many businesses are not even close to adopting Windows 10 on the majority of their systems.  There are numerous reasons for that delay, from licensing through security to privacy, not to mention that the interface is different enough to upset many executive-level users. 

That hasn't stopped Microsoft from once again displaying splash screens on Windows 7 machines, as KB4493132 rolls out to those with automatic updates enabled.  Thankfully it does not attempt to fool you into updating by changing the way the close window button works, but then again, the update is no longer free.  The Inquirer, as you might expect, is as enthused about this as most users.

buggers.PNG

"HERE WE GO AGAIN. Two years on from Updategate, Microsoft is back to posting nag screens on its outgoing operating system."

Here is some more Tech News from around the web:

Tech Talk


Source: The Inquirer

NZXT Announces the HUE 2 RGB Ambient Lighting Kit V2

Subject: General Tech | March 19, 2019 - 02:34 PM |
Tagged: RGB, nzxt, lighting, HUE 2 Ambient RGB Kit V2, HUE 2, ambient

NZXT has announced version 2 of the HUE 2 Ambient RGB Lighting Kit, an update which the company says "brings product improvements that make installation easier and help ensure that the included LED strips will be securely attached to your monitor". The lighting kit uses NZXT's CAM software to select between lighting modes - which include an ambient mode that changes the lighting to complement the images on your monitor.

HUE 2 Ambient RGB Lighting Kit.jpg

"The HUE 2 ecosystem delivers the broadest line of RGB lighting accessories for PC Builders with over 10 different accessories such as the HUE 2 RGB lighting kit, underglow kit, AER RGB 2 fans, and cable combs."

NZXT provides this list of features for the new HUE 2 kit:

HUE 2 Ambient RGB Kit V2 New Features:

  • Stronger Adhesive: Upgraded the 3M LED strip adhesive backing tape to be thicker and stickier. It is more compatible with the different surfaces and textures of monitors on the market.
  • L-Shape Corner Connector: For easier setup, we replaced the 150mm corner connectors with an L-shape corner connector for the top left and bottom right of your monitor.
  • Alcohol Wipes: As your monitor may be dusty or dirty, we added alcohol-based cleaning wipes so you can clean the back of the monitor before adhering the LED strips.

HUE 2 RGB Ambient Lighting Kit Features

  • Available in two configurations, one for 21” to 25”/34”-35” UltraWide monitors and one for 26” to 32” monitors, the HUE 2 RGB Ambient Lighting Kit adds gorgeous lighting to your gaming PC and provides a more immersive in-game experience* by projecting the colors from the edges of your display to your walls.
  • HUE 2 RGB Ambient Lighting controller
  • Faster microprocessor enables incredibly smooth lighting effects and virtually instantaneous response changes to colors on the screen in ambient mode
  • Multiple LED strips in various lengths, along with easy-to-follow configuration and installation instructions for various monitor sizes and aspect ratios
  • AC power adapter
  • All necessary cables
  • Compatible with all HUE 2 accessories and CAM lighting modes

* Ambient lighting modes are available only with the HUE 2 Ambient Lighting controller and require the use of CAM as a registered user, including acceptance of the then-current NZXT Terms of Service.
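Ambient (or "bias") lighting modes like the one CAM provides generally work by sampling the colors along each edge of the framebuffer and driving the LED segments behind the corresponding edge of the monitor with the averaged result. The sketch below is a hypothetical illustration of that idea, not NZXT's actual CAM implementation:

```python
# Hypothetical sketch of an ambient-lighting mode: average the colors
# along one edge of the frame and send that color to the LED strip
# mounted behind that edge. This is NOT NZXT's CAM code, just the
# general technique such modes are built on.

def edge_color(frame, edge, samples=8):
    """Average RGB color along one edge ('top', 'bottom', 'left', 'right')
    of a frame given as a 2D list of (r, g, b) tuples."""
    h, w = len(frame), len(frame[0])
    if edge == "top":
        pixels = [frame[0][i * (w - 1) // (samples - 1)] for i in range(samples)]
    elif edge == "bottom":
        pixels = [frame[h - 1][i * (w - 1) // (samples - 1)] for i in range(samples)]
    elif edge == "left":
        pixels = [frame[i * (h - 1) // (samples - 1)][0] for i in range(samples)]
    else:  # right
        pixels = [frame[i * (h - 1) // (samples - 1)][w - 1] for i in range(samples)]
    return tuple(sum(p[ch] for p in pixels) // len(pixels) for ch in range(3))

# A frame whose top row is pure red lights the top LED strip red:
frame = [[(255, 0, 0)] * 16] + [[(0, 0, 0)] * 16 for _ in range(8)]
print(edge_color(frame, "top"))
```

The "faster microprocessor" bullet above makes sense in this light: the controller has to resample and push new colors to the strips every frame to keep the lighting in sync with the screen.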

HUE-2-Ambient-RGB-Lighting-Kit.jpg

The HUE 2 lighting kits are available now in the USA with pricing as follows:

  • HUE 2 Ambient RGB Lighting Kit V2 (21”-25”, 34”-35” UltraWide) - $99.99
  • HUE 2 Ambient RGB Lighting Kit V2 (26”-32”) - $99.99
Source: NZXT

Oculus shows off their new headgear at GDC

Subject: General Tech | March 20, 2019 - 01:20 PM |
Tagged: Oculus Rift S, Oculus, vr headset, gdc 2019

The brand new Oculus Rift S is being shown off at GDC, and Ars Technica had a chance to play with it.  The new implementation is rather impressive: a single wire connects you to your PC, and there are now no external cameras; instead they have been relocated to the headset itself.  From the description in the review it seems they have done so very successfully, with location tracking improving rather than degrading due to that change.  Your eyeballs also get an upgraded experience, with each eye having a 1280x1440 display, though Oculus has moved away from the original Rift's AMOLED screens to LCD.

Check it out here.

oculus_gdc19-7-1440x960.jpg

"This is out-of-the-box room scale," Oculus co-founder Nate Mitchell said as he gestured to the Oculus Rift S, the new PC-focused VR headset launching "this spring" for $399. This headset will effectively replace the standard Oculus Rift headset, which has had production all but halted to make way for the new model."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica

The Atari VCS is delayed again, but you might not be as mad about it as you think

Subject: General Tech | March 19, 2019 - 01:19 PM |
Tagged: atari, delay, amd, Vega, ryzen

You may remember the announcement of the re-launch of the Atari Video Console System back in the summer of 2017, though by now you may have decided that it is going the way of the ZX Spectrum Vega+.  If you do still hold out hope, Atari is once again testing your patience by announcing another delay, to the end of 2019.  There is a reason, however, which you may or may not find acceptable: they will be upgrading the AMD Ryzen chip at the heart of the system, with a new generation of Vega graphics offering modern performance.  Atari also suggests, in a quote over at The Inquirer, that this will make the system much quieter and cooler.

atari-vcs-ataribox-linux-pc-video-game-console-gaming.jpg

"The Atari VCS launched on Indiegogo and was originally set to arrive in spring 2018, but the company has announced that it will now arrive at the butt-end of 2019 (and that projection is just for the US and Canada)."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

Google Announces Stadia at GDC 2019

Subject: General Tech | March 19, 2019 - 03:28 PM |
Tagged: google, stadia

Google Stadia, a new gaming service, was announced at the 2019 Game Developers Conference. Much like OnLive and PlayStation Now, users connect to an instance of video games running in datacenters. Because it is a conference for game developers, they also mentioned a few added features, such as directly transcoding for YouTube and allowing the audience to wait in line and jump into a streamer’s game session.

Requirements and price were not mentioned, but it will arrive sometime in 2019.

Google also mentioned that applicable games would run on Linux and the Vulkan API. Given their Android initiative, and their desire to not pay Microsoft a license fee for VMs, this makes a lot of sense. It also forces major engine developers to target and optimize Linux-focused APIs, which will be good in the long run, especially if Google starts adding GPUs from Intel and NVIDIA in the coming years. I'm not sure how much it will push Linux ports, however, because that's not really why publishers avoid the platform.

In terms of hardware, Google claims that an instance will have about 10.7 teraflops of GPU performance on an AMD platform. In terms of compute, this puts it equivalent to a GTX 1080 Ti, although AMD tends to have reduced fillrate, etc., which keeps them behind NVIDIA parts with equivalent compute performance. (Those equivalent AMD parts also tend to be significantly cheaper, and thus comparing flop to flop isn’t fair in most other circumstances.)
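The compute comparison above is simple arithmetic: peak FP32 throughput is two operations (a fused multiply-add) per shader core per clock. The GTX 1080 Ti figures below (3584 CUDA cores, roughly 1.58 GHz boost) are public spec-sheet numbers; Google did not detail Stadia's GPU configuration beyond the 10.7 TFLOPS claim, so this is just a sanity check on the equivalence:

```python
# Back-of-the-envelope check on the teraflops comparison.
# Peak FP32 TFLOPS = cores x clock (GHz) x 2 ops per clock (FMA) / 1000.

def peak_tflops(cores, clock_ghz, ops_per_clock=2):
    """Theoretical peak single-precision throughput in TFLOPS."""
    return cores * clock_ghz * ops_per_clock / 1000.0

# GTX 1080 Ti: 3584 CUDA cores at ~1.582 GHz boost clock.
gtx_1080_ti = peak_tflops(3584, 1.582)
print(f"GTX 1080 Ti peak: {gtx_1080_ti:.1f} TFLOPS vs Stadia's claimed 10.7")
```

The 1080 Ti lands around 11.3 TFLOPS, so "roughly a GTX 1080 Ti" is a fair flop-for-flop description of Stadia's claimed instance, with the architectural caveats noted above.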

google-2019-stadia-controller.png

As long-time readers know, I am also very cautious about streaming services because they are about the epitome of DRM. While the content is available on other, local platforms? Not really a big deal. If a game is developed for Google’s service, and requires Google’s API, then games that leave the service, either by Google’s license ending or Google simply not wanting to host it anymore, will be just about impossible to preserve. Games are not movies or sound files that can be transcoded into another format.

Linux and Vulkan do provide a bit more confidence, but you never know what will happen when a company with no legal obligation (ex: the GPL) to remain open gets large enough.

It’s something that I’m still worried about, though.

Crytek's Neon Noir is a Platform Agnostic Real-Time Ray Tracing Demo

Subject: General Tech | March 18, 2019 - 09:03 AM |
Tagged: vulkan, RX Vega 56, rtx, ray tracing, radeon, nvidia, Neon Noir, dx12, demo, crytek, CRYENGINE, amd

Crytek has released video of a new demo called Neon Noir, showcasing real-time ray tracing with a new version of CRYENGINE Total Illumination, slated for release in 2019. The big story here is that this is platform agnostic, meaning both AMD and NVIDIA (including non-RTX) graphics cards can produce the real-time lighting effects. The video was rendered in real time using an AMD Radeon RX Vega 56 (!) at 4K30, with Crytek's choice in GPU seeming to assuage fears of any meaningful performance penalty with this feature enabled (video embedded below):

“Neon Noir follows the journey of a police drone investigating a crime scene. As the drone descends into the streets of a futuristic city, illuminated by neon lights, we see its reflection accurately displayed in the windows it passes by, or scattered across the shards of a broken mirror while it emits a red and blue lighting routine that will bounce off the different surfaces utilizing CRYENGINE's advanced Total Illumination feature. Demonstrating further how ray tracing can deliver a lifelike environment, neon lights are reflected in the puddles below them, street lights flicker on wet surfaces, and windows reflect the scene opposite them accurately.”

Crytek is calling the new ray tracing features “experimental” at this time, but the implications of ray tracing tech working beyond proprietary hardware and even a specific graphics API (it runs on both DirectX 12 and Vulkan) are obviously a very big deal.

crytek_demo.png

“Neon Noir was developed on a bespoke version of CRYENGINE 5.5, and the experimental ray tracing feature based on CRYENGINE’s Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.”

You can read the full announcement from Crytek here.

Source: Crytek

Microsoft Announces Variable Rate Shading for DirectX 12

Subject: Graphics Cards | March 19, 2019 - 08:29 PM |
Tagged: microsoft, DirectX 12, turing, nvidia

To align with the Game Developers Conference, GDC 2019, Microsoft has announced Variable Rate Shading for DirectX 12. This feature increases performance by allowing the GPU to lower its shading resolution for specific parts of the scene (without the developer doing explicit, render texture-based tricks).

An NVIDIA speech from SIGGRAPH 2018 (last August)

The feature is divided into three parts:

  • Lowering the resolution of specific draw calls (tier 1)
  • Lowering the resolution within a draw call by using an image mask (tier 2)
  • Lowering the resolution within a draw call per primitive (ex: triangle) (tier 2)

The last two points are tagged as “tier 2” because they can reduce the workload within a single draw call, which is an item of work that is sent to the GPU. A typical draw call for a 3D engine is a set of triangles (vertices and indices) paired with a material (a shader program, textures, and properties). While it is sometimes useful to lower the resolution for particularly complex draw calls that take up a lot of screen space but whose output is also relatively low detail, such as water, there are real benefits to being more granular.

The second part, an image mask, allows detail to be reduced for certain areas of the screen. This can be useful in several situations:

  • The edges of a VR field of view
  • Anywhere that will be brutalized by a blur or distortion effect
  • Objects behind some translucent overlays
  • Even negating a tier 1-optimized section to re-add quality where needed

The latter example was the one that Microsoft focused on with their blog. Unfortunately, I am struggling to figure out what specifically is going on, because the changes that I see (ex: the coral reef, fish, and dirt) don’t line up with their red/blue visualizer. The claim is that they use an edge detection algorithm to force high-frequency shading where there would be high-frequency detail.
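The tier-2 image mask boils down to a simple accounting trick: every 2x2 block flagged "coarse" is shaded once and the result broadcast to all four pixels, so shader invocations drop wherever detail won't be missed. The sketch below is a conceptual simulation of that bookkeeping (mirroring the red/blue visualizer: red = per-pixel, blue = one shade per 2x2 block); it is not the D3D12 API itself:

```python
# Conceptual simulation of tier-2 variable rate shading with an image mask:
# blocks marked coarse in the mask are shaded once per 2x2 block instead of
# once per pixel, trading shading detail for fewer shader invocations.

def shade_with_mask(width, height, coarse_mask, shade):
    """coarse_mask[by][bx] is True where a 2x2 block may be shaded coarsely.
    Returns (framebuffer, number_of_shader_invocations)."""
    fb = [[None] * width for _ in range(height)]
    invocations = 0
    for by in range(height // 2):
        for bx in range(width // 2):
            if coarse_mask[by][bx]:
                color = shade(bx * 2, by * 2)  # one sample, broadcast to 4 pixels
                invocations += 1
                for dy in range(2):
                    for dx in range(2):
                        fb[by * 2 + dy][bx * 2 + dx] = color
            else:
                for dy in range(2):          # full-rate: one sample per pixel
                    for dx in range(2):
                        fb[by * 2 + dy][bx * 2 + dx] = shade(bx * 2 + dx, by * 2 + dy)
                        invocations += 1
    return fb, invocations

# Shading a 4x4 target with the left half coarse saves 3 invocations per block:
mask = [[True, False], [True, False]]
_, n = shade_with_mask(4, 4, mask, lambda x, y: (x, y))
print(n)  # 10 invocations instead of 16
```

Microsoft's screenshots claim a 75% shading reduction for terrain and water, which is exactly the 4-to-1 ratio this coarse mode produces in the masked regions.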

microsoft-2019-dx12-vrs-1.png

Right side reduces shading by 75% for terrain and water

microsoft-2019-dx12-vrs-2.png

Right side reclaims some lost fidelity based on edge detection algorithm

microsoft-2019-dx12-vrs-3.png

Visualization of where shading complexity is spent.

(Red is per-pixel. Blue is 1 shade per 2x2 block.)

Images Source: Firaxis via Microsoft

Microsoft claims that this feature will only be available for DirectX 12. That said, when Turing launched, NVIDIA claimed that Variable Rate Shading would be available for DirectX 11, DirectX 12, Vulkan, and OpenGL. I’m not sure what’s different about Microsoft’s implementation that separates it from NVIDIA’s extension.

Microsoft will have good tools support, however. They claim that their PIX for Windows performance analysis tool will support this feature on Day 1.

Source: Microsoft

Going out on a Prime note, their last Z390 review

Subject: Motherboards | March 21, 2019 - 03:00 PM |
Tagged: PRIME Z390-A, asus

Here it is, the last motherboard review you will see from [H]ard|OCP; the sun sets on another review site as Kyle has chosen to join the dark side.  The ASUS PRIME Z390-A has a rather unique-looking silkscreen pattern, but RGB lovers will find the placement of the glowy bits unfortunate.  At $183 currently on Amazon it sits in the middle of the Z390 family as far as pricing, and that does mean it lacks some of the fancier features offered by flagship-class motherboards.  If stability, decent overclocking and a bargain price matter more to you than wireless connectivity and a well-developed case of RGB-itis, then this last review from [H] is well worth a look.

op01.jpg

"Sometimes less is more. We typically work with a lot of motherboards that fall into the more is more category and some that have a high price that can’t easily be justified by most people. The ASUS PRIME Z390-A seems to lean towards less is more. On paper the Z390-A seems to have a lot going for it and is light on fluff. How does it stack up? "

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP

NVIDIA is ready to storm the server room

Subject: General Tech, Graphics Cards, Networking, Shows and Expos | March 19, 2019 - 06:16 PM |
Tagged: nvidia, t4, amazon, microsoft, NGC, Mellanox, CUDA-X, GTC, jen-hsun huang, DRIVE Constellation, ai

As part of their long list of announcements yesterday, NVIDIA revealed they are partnering with Cisco, Dell EMC, Fujitsu, Hewlett Packard Enterprise, Inspur, Lenovo and Sugon to provide servers powered by T4 Tensor Core GPUs, optimized to run their CUDA-X AI accelerators. 

t4.PNG

Those T4 GPUs have been on the market for a while, but this marks the first major success for NVIDIA in the server room, with models available for purchase from those aforementioned companies.  Those who prefer other people's servers can also benefit from these new products, with Amazon and Microsoft offering cloud-based solutions.  Setting yourself up to run NVIDIA's NGC software may save a lot of money down the road; the cards sip a mere 70W of power, which is rather more attractive than the consumption of a gaggle of Tesla V100s.  One might be guilty of suspecting this offers an explanation for their recent acquisition of Mellanox.

purty.PNG

NGC software offers more than just a platform to run optimizations on; it also offers a range of templates to start from, covering classification, object detection, sentiment analysis, and most other basic starting points for training a machine.  Customers will also be able to upload their own models to share internally or, if in the mood, externally with other users and companies.  It supports existing products such as TensorFlow and PyTorch, but also offers access to CUDA-X AI, which as the name suggests takes advantage of the base design of the T4 GPU to reduce the time spent waiting for results, letting users advance designs quickly. 

memorex.PNG

If you are curious what particular implementations of everyone's favourite buzzword might look like, NVIDIA's DRIVE Constellation is an example after JoshTekk's own heart: literally a way to create open, scalable simulations for large fleets of self-driving cars to train them ... for good, one hopes.  Currently the Toyota Research Institute-Advanced Development utilizes these products in the development of their next self-driving fleet, and NVIDIA obviously hopes others will follow suit. 

replaced.PNG

There is not much to see from the perspective of a gamer in the short term, but considering NVIDIA's work at shifting the horsepower from the silicon you own to their own Cloud this will certainly impact the future of gaming from both a hardware and gameplay perspective.  GPUs as a Service may not be the future many of us want but this suggests it could be possible, not to mention the dirty tricks enemy AIs will be able to pull with this processing power behind them.

One might even dream that escort missions could become less of a traumatic experience!

Source: NVIDIA

How else would you get a Comet in your lake, if not from the Sky

Subject: General Tech | March 18, 2019 - 01:27 PM |
Tagged: Comet Lake-S, Skylake, Intel, 14nm, Comet Lake-U, Comet Lake-H, rumour, leak

Intel's new Comet Lake families of chips will be an update to the existing Skylake architecture and will share the same 14nm process node, according to what The Inquirer have discovered from leaked documents.  On desktop parts they refer to 10+2 and 8+2 SKUs, from which we can infer the presence of GT2 graphics, with a 5GHz part likely topping that line.  Notebook chips are expected to top out at six cores, as are the ultra-low-power models.  In theory we should see these arrive some time this year, around the release of Zen 2, though we lack hard dates on either release at this time.

splash.PNG

"According to reports, Comet Lake-S will be based on the Skylake microarchitecture and will be created using Intel's now-ageing 14nm manufacturing process."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

NVIDIA Introduces Creator Ready Driver Program (419.67)

Subject: Graphics Cards | March 20, 2019 - 04:03 PM |
Tagged: nvidia, graphics drivers

At the ongoing Game Developers Conference, GDC 2019, NVIDIA has announced a “Creator Ready Driver” program. This branch targets the users that sit between the GeForce and Quadro line, who use GeForce cards for professional applications – such as game developers.

nvidia-2019-creatorreadylogo.png

Both Game Ready Drivers and the Creator Ready Drivers will support all games and creative applications. The difference is that Creator Ready drivers will be released according to the release schedule of popular creative tools, and they will receive more strict testing with creative applications.

The first release, Creator Ready Drivers 419.67, lists a 13% performance increase with the Blender Cycles renderer, and an 8% to 9% increase for CINEMA 4D, Premiere Pro CC, and Photoshop CC, all relative to the 415-branch GeForce drivers.

You can choose between either branch using the GeForce Experience interface.

nvidia-2019-creatorreadygfe.png

Image Credit: NVIDIA

As for supported products? The Creator Ready Drivers are available for Pascal-, Volta-, and Turing-based GeForce and TITAN GPUs. “All modern Quadro” GPUs are also supported on the Creator Ready branch.

Personally, my brain immediately thinks “a more-steady driver schedule and some creative application performance enhancements at the expense of a little performance near the launch of major video games”. I could be wrong, but that’s what I think when I read this news.

Source: NVIDIA

Ripping threads with Coreprio and DLM

Subject: Processors | March 20, 2019 - 04:34 PM |
Tagged: amd, coreprio, threadripper 2, 2990wx, dynamic local mode

Owning a Threadripper is not boring; the new architecture offers a variety of interesting challenges to keep your attention.  One of those quirks is the lack of direct memory access for two of the dies, which can cause some performance issues and was at least partially addressed by the introduction of Dynamic Local Mode in Ryzen Master.  On Windows boxes, enabling that feature ensures your hardest-working cores have direct memory access; on Linux systems the problem simply doesn't exist.  Another option is Coreprio, developed by Bitsum, which accomplishes the same task without the extras included in Ryzen Master. 

Techgage ran a series of benchmarks comparing the differences in performance between the default setting, DLM and Coreprio.

AMD-Ryzen-Threadripper-2990WX-CCX-Diagram.png

"Performance regression issues in Windows on AMD’s top-end Ryzen Threadripper CPUs haven’t gone unnoticed by those who own them, and six months after launch, the issues remain. Fortunately, there’s a new tool making the rounds that can help smooth out those regressions. We’re taking an initial look."

Here are some more Processor articles from around the web:

Processors

Source: Techgage

PC Perspective Podcast #537 - Division 2 GPU Testing, Google Stadia, and the End of [H]ardOCP

Subject: General Tech | March 21, 2019 - 12:05 PM |
Tagged: The Division 2, stadia, razer, ray tracing, Oculus, Intel, hardocp, game streaming, DirectX 12, Basilisk

PC Perspective Podcast #537 - 3/20/2019

Join us this week as we review the new NVIDIA GTX 1660 and a high-end case from Corsair, discuss NVIDIA's Mellanox acquisition, get excited over Halo for PC, and more!

Subscribe to the PC Perspective Podcast

Check out previous podcast episodes: http://pcper.com/podcast

Today's sponsor is KiwiCo. Change the way your kids learn and play. Get your first KiwiCo Crate free: https://www.kiwico.com/pcper

Show Topics
00:00:22 - Intro
00:02:10 - Review: Razer Basilisk Essential Gaming Mouse
00:05:49 - Review: The Division 2 Performance Preview
00:18:34 - News: Real-Time Ray Tracing for Pascal
00:29:19 - News: Google Stadia
00:46:19 - News: GameWorks RTX & Unreal/Unity DXR Support
00:48:54 - News: Crytek Neon Noir Ray Tracing Demo
00:50:48 - News: Variable Rate Shading for DirectX 12
00:54:06 - News: NVIDIA T4 GPUs for the Server Room
01:00:10 - News: NVIDIA Creator Ready Drivers
01:04:29 - News: Oculus Rift S
01:08:39 - News: AMD Immune to SPOILER
01:18:13 - News: Windows 7 DX12 Support in Adrenalin Drivers
01:22:32 - News: Kyle Bennett Joins Intel
01:34:28 - Picks of the Week
01:44:58 - Outro

Picks of the Week
Jim: Cheap Threadripper
Jeremy: RogueTech Mod for BattleTech
Josh: Intel 660p SSD
Sebastian: OCAT 1.4

Today's Podcast Hosts
Sebastian Peak
Josh Walrath
Jeremy Hellstrom
Jim Tanous

Subject: General Tech
Manufacturer: Razer

Razer is never one to shy away from reinventing and refreshing its products. Every year or two, we find ourselves receiving a new press release on a familiar item that’s been updated or made new again with a fresh feature or new design. Today’s review is exactly one such item with the Razer Basilisk Essential. The design of the original Basilisk proved to be quite popular amongst gamers. Today’s update takes that same shape and intriguing multi-function paddle and trims it down to the titular essentials, landing at just under the fifty-dollar price point.

Is it worth adding to your Amazon wishlist? Join us as we find out!

Specifications

  • Current Pricing: $49.99
  • Sensor: 6400 DPI Optical Sensor
  • Gaming Grade Tactile Scroll Wheel
  • Multi-function paddle (single length)
  • Razer Mechanical Switches
  • 20 Million Click Lifespan
  • 7 programmable buttons
  • Customizable backlit logo
  • Weight: 95g

Razer_Basilisk_Essential_1.jpg

Starting, as always, with packaging, Razer is keeping to the mold here. We have the usual high-end product shot on the front and the specific feature highlights on the back. Inside, we see the first hints of the budget-oriented nature of the mouse in that it ships in a brown cardboard tray and styrofoam sleeve. This kind of packaging is perfectly fine, and transports the mouse safely, but it doesn’t offer the same kind of presentation found on some of Razer’s more expensive products.

Razer_Basilisk_Essential_2.jpg

Continue reading our review of the Razer Basilisk Essential gaming mouse

Cooler Master Launches MasterBox NR400 and NR600 Cases

Subject: Cases and Cooling | March 19, 2019 - 03:35 PM |
Tagged: tempered glass, mesh, MasterBox NR600, MasterBox NR400, enclosure, cooler master, chassis, case, airflow

Cooler Master has launched a pair of new cases today with the MasterBox NR400 and the MasterBox NR600. These budget-friendly cases offer a minimalist approach with clean lines and no flashy lighting effects, and should offer good airflow with their full mesh front panels.

Cooler_Master_NR_0.jpg

The Cooler Master MasterBox NR600 is an ATX mid-tower

The MasterBox NR400 is the smaller of the two cases, supporting mini ITX and micro ATX motherboards, with the larger MasterBox NR600 offering support for standard ATX motherboards.

Cooler_Master_NR_1.jpg

Cooler Master provides this list of features for these two new NR series cases:

  • Minimalistic Mesh Design - Elegant design elements are applied to mesh for optimal thermal performance.
  • Optimal Thermal Performance – The full mesh front panel and ventilated top panel provide a high potential for thermal performance.
  • Flush Tempered Glass Side Panel – The tempered glass side panel, fastened by thumbscrews on the rear panel, keeps the surface completely flush.
  • Headset Jack – The single 4 pole headset jack features both audio and microphone capabilities simultaneously so separate jacks are not needed.
  • Graphics Card Support Up to 346mm (NR400) / 410mm (NR600) – Generous clearance space is provided to support the latest graphics cards.
  • Cable Management – High quality, longer length rubber grommets paired with generous clearance behind the motherboard offers ample room for cable management.

Cooler_Master_NR_2.jpg

"Thermal performance is at the core of the NR Series, with the combination of two pre-installed fans for intake and exhaust and the fine mesh front panel. Generous support for radiator mounting and unrestricted ventilation on the top and front panel ensure that even the most demanding components are efficiently cooled while the two USB 3.1 gen 1 ports at the top of the case provide ample connectivity for everyday use."

Cooler_Master_NR.jpg

The Cooler Master MasterBox NR400 is a smaller mATX design

The MasterBox NR400 has an MSRP of $59.99, with the NR600 at $69.99. Both cases are available for pre-sale today, though no listings have yet found their way to Amazon or Newegg at time of writing.

A peek at MSI's new RTX powered laptops

Subject: General Tech | March 15, 2019 - 03:46 PM |
Tagged: msi, gaming laptop, rtx, E75 Raider RGB 8SG, GL73 8SE, GS75 Stealth 8SG, P75 Creator

Kitguru wasn't able to run benchmarks on MSI's four new laptops yet, but they did get a chance to check them out and posted a video overview.  The E75 Raider RGB 8SG, GL73 8SE, GS75 Stealth 8SG, and P75 Creator all contain either an RTX 2060, 2070, or 2080, as well as an Intel Core i7-8750H and a variety of NVMe storage options. 

Head on over for a tour.

GL73-pic-6.jpg

"Earlier in the week, I took a trip down to Leo’s studio where I was joined by Natalie McMorrow from MSI, who took us through an overview of some of the upcoming gaming laptops that MSI will have to offer. We also managed to get a sneak preview of a completely different type of laptop made specifically for content creators."

Here is some more Tech News from around the web:

Tech Talk

Source: Kitguru

Analogue supplies the truest input on Cooler Master's MK850

Subject: General Tech | March 15, 2019 - 04:41 PM |
Tagged: MK850, analogue, cooler master, gaming keyboard, aimpad, input

We've seen Aimpad's analogue input in action before, on other Cooler Master products as well as a Wooting one.  Cooler Master has released the MK850, which utilizes this technology, along with providing RGBs for you to gaze longingly at.  You might have noticed that there are more than a few keys on this board; the extras allow you to toggle analogue input off and on, as well as modify the sensitivity of the keys which do have analogue capability: Q, W, E, R, A, S, D, and F specifically.  M1 through M5 give you different profiles for different games.
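What analogue input actually buys you is that key travel maps onto a gamepad-style axis instead of a binary pressed/released state, so a half-pressed W can mean half throttle. Aimpad's actual sensing and response curves are proprietary; the deadzone and travel figures in this sketch are purely illustrative values:

```python
# Hypothetical sketch of analogue key input: map key depression depth
# onto a 0.0-1.0 axis with a small deadzone, the way a gamepad trigger
# works. The 4.0mm travel and 0.4mm deadzone are illustrative numbers,
# not Aimpad's actual calibration.

def key_to_axis(travel_mm, full_travel_mm=4.0, deadzone_mm=0.4):
    """Map key depression depth (mm) to a 0.0-1.0 analogue axis value."""
    if travel_mm <= deadzone_mm:
        return 0.0  # ignore resting wobble near the top of the travel
    span = full_travel_mm - deadzone_mm
    return min(1.0, (travel_mm - deadzone_mm) / span)

# Half-pressing a key gives partial input rather than full speed:
print(round(key_to_axis(2.2), 2))  # 0.5
```

The per-key sensitivity adjustment mentioned above would amount to reshaping this curve, and the analogue toggle simply falls back to treating anything past an actuation point as 1.0.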

Take a gander at the full review The Tech Report posted if you are interested.

cooler master mk850 1_result.jpg

"Forget, for a moment, what the Cooler Master MK850 looks like. Forget about its RGB lighting, its brushed metal top, its many keys and buttons, and the software that comes with it, because none of those things make it a keyboard of consequence. The only thing we really care about is whether or not the MK850 can deliver on the tantalizing promise of analog keyboard input for gaming."

Here is some more Tech News from around the web:

Tech Talk

Source: The Tech Report

LG's V50 ThinQ '5G' phone

Subject: Mobile | March 15, 2019 - 05:07 PM |
Tagged: V50 ThinQ, LG, 5G

We know for sure which came first in this case: the 5G modem arrived before 5G service, no matter what certain companies' branding might say.  LG's new V50 ThinQ contains a Qualcomm Snapdragon 855 chipset and X50 modem inside its shell.  It has already been announced that LG will not be releasing a folding phone this year, but this phone comes with an interesting compromise: you can purchase a second device, a screen with POGO pin connectors that attaches to the phone.  The hinge technically makes it a folding phone, just not in the way that others are developing.

Take a quick peek at it over at The Inquirer.

IMG20190225160745-540x334.jpg

"That's not a bad thing, but it serves as a reminder that the days when a new technology meant a retrograde in form factor are gone. There's nothing about the V50 ThinQ 5G that sets it apart from any 4G phone - at least on the outside."

Here are some more Mobile articles from around the web:

More Mobile Articles

Source: The Inquirer