Subject: General Tech | March 20, 2019 - 01:20 PM | Jeremy Hellstrom
Tagged: Oculus Rift S, Oculus, vr headset, gdc 2019
The brand new Oculus Rift S is being shown off at GDC, and Ars Technica had a chance to play with it. The new implementation is rather impressive: a single wire connects you to your PC, and there are no longer any external cameras; instead they have been relocated to the headset itself. From the description in the review it seems the change was very successful, with location tracking improving rather than degrading. Your eyeballs also get an upgraded experience, with each eye getting a 1280x1440 display, though Oculus has not yet switched to AMOLED screens.
"This is out-of-the-box room scale," Oculus co-founder Nate Mitchell said as he gestured to the Oculus Rift S, the new PC-focused VR headset launching "this spring" for $399. This headset will effectively replace the standard Oculus Rift headset, which has had production all but halted to make way for the new model.
Here is some more Tech News from around the web:
- Qualcomm's latest chip will stop your smart speaker from fluffing commands @ The Inquirer
- Opera Adds Free and Unlimited VPN Service To Its Android Browser @ Slashdot
- EC whacks Google with €1.49bn antitrust fine over AdSense @ The Inquirer
- Hands-On: New Nvidia Jetson Nano is More Power In A Smaller Form Factor @ Hackaday
- LLVM 8.0 Released With Cascade Lake Support, Better Diagnostics, More OpenMP/OpenCL @ Slashdot
Subject: Graphics Cards | March 19, 2019 - 08:29 PM | Scott Michaud
Tagged: microsoft, DirectX 12, turing, nvidia
To align with the Game Developers Conference, GDC 2019, Microsoft has announced Variable Rate Shading for DirectX 12. This feature increases performance by allowing the GPU to lower its shading resolution for specific parts of the scene (without the developer doing explicit, render texture-based tricks).
An NVIDIA speech from SIGGRAPH 2018 (last August)
The feature is divided into three parts:
- Lowering the resolution of specific draw calls (tier 1)
- Lowering the resolution within a draw call by using an image mask (tier 2)
- Lowering the resolution within a draw call per primitive (ex: triangle) (tier 2)
The last two points are tagged as “tier 2” because they can reduce the workload within a single draw call, which is an item of work that is sent to the GPU. A typical draw call for a 3D engine is a set of triangles (vertices and indices) paired with a material (a shader program, textures, and properties). While it is sometimes useful to lower the resolution for particularly complex draw calls that take up a lot of screen space but whose output is also relatively low detail, such as water, there are real benefits to being more granular.
The second part, an image mask, allows detail to be reduced for certain areas of the screen. This can be useful in several situations:
- The edges of a VR field of view
- Anywhere that will be brutalized by a blur or distortion effect
- Objects behind some translucent overlays
- Even negating a tier 1-optimized section to re-add quality where needed
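For a sense of what the tier-2 image mask buys, here is a minimal Python sketch of the idea (this is not the DirectX 12 API; the stand-in shader and mask layout are invented purely for illustration): a per-tile mask decides whether a 2x2 block runs the pixel shader once and broadcasts the result, or shades every pixel at full rate.

```python
# Illustrative sketch of tier-2 Variable Rate Shading with an image mask.
# NOT the D3D12 API -- just the concept: a per-tile mask picks whether a
# 2x2 block runs the pixel shader once (coarse) or four times (full rate).

def shade(x, y):
    """Stand-in pixel shader: any pure function of screen position."""
    return (x * 7 + y * 13) % 256

def render(width, height, coarse_mask):
    """coarse_mask[ty][tx] is True where a 2x2 tile may be shaded coarsely."""
    image = [[0] * width for _ in range(height)]
    invocations = 0
    for ty in range(height // 2):
        for tx in range(width // 2):
            if coarse_mask[ty][tx]:
                # Coarse rate: one invocation, broadcast to the 2x2 block.
                value = shade(tx * 2, ty * 2)
                invocations += 1
                for dy in range(2):
                    for dx in range(2):
                        image[ty * 2 + dy][tx * 2 + dx] = value
            else:
                # Full rate: one invocation per pixel.
                invocations += 4
                for dy in range(2):
                    for dx in range(2):
                        image[ty * 2 + dy][tx * 2 + dx] = shade(tx * 2 + dx, ty * 2 + dy)
    return image, invocations
```

With an all-coarse mask, an 8x8 render needs 16 shader invocations instead of 64, the same 75% reduction Microsoft's terrain and water example claims.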
The last example is the one that Microsoft focused on in their blog. Unfortunately, I am struggling to figure out what specifically is going on, because the changes that I see (ex: the coral reef, fish, and dirt) don’t line up with their red/blue visualizer. The claim is that they use an edge detection algorithm to force high-frequency shading where there would be high-frequency detail.
Right side reduces shading by 75% for terrain and water
Right side reclaims some lost fidelity based on edge detection algorithm
Visualization of where shading complexity is spent.
(Red is per-pixel. Blue is 1 shade per 2x2 block.)
Images Source: Firaxis via Microsoft
Microsoft claims that this feature will only be available for DirectX 12. That said, NVIDIA claimed when Turing launched that Variable Rate Shading would be available for DirectX 11, DirectX 12, Vulkan, and OpenGL. I’m not sure what is different about Microsoft’s implementation that separates it from NVIDIA’s extension.
Microsoft will have good tools support, however. They claim that their PIX for Windows performance analysis tool will support this feature on Day 1.
Subject: General Tech, Graphics Cards, Networking, Shows and Expos | March 19, 2019 - 06:16 PM | Jeremy Hellstrom
Tagged: nvidia, t4, amazon, microsoft, NGC, Mellanox, CUDA-X, GTC, jen-hsun huang, DRIVE Constellation, ai
As part of their long list of announcements yesterday, NVIDIA revealed they are partnering with Cisco, Dell EMC, Fujitsu, Hewlett Packard Enterprise, Inspur, Lenovo and Sugon to provide servers powered by T4 Tensor Core GPUs, optimized to run their CUDA-X AI accelerators.
Those T4 GPUs have been on the market for a while, but this marks the first major success for NVIDIA in the server room, with models available for purchase from the aforementioned companies. Those who prefer other people's servers can also benefit from these new products, with Amazon and Microsoft offering cloud-based solutions. Setting yourself up to run NVIDIA's NGC software may save a lot of money down the road; the cards sip a mere 70W of power, which is rather more attractive than the consumption of a gaggle of Tesla V100s. One might also suspect this offers an explanation for NVIDIA's recent acquisition of Mellanox.
NGC software offers more than just a platform to run optimizations on; it also offers a range of templates to start from, covering classification, object detection, sentiment analysis, and most other basic starting points for training a machine. Customers will also be able to upload their own models to share internally or, if in the mood, externally with other users and companies. It supports existing products such as TensorFlow and PyTorch but also offers access to CUDA-X AI, which, as the name suggests, takes advantage of the base design of the T4 GPU to reduce the time spent waiting for results, letting users advance designs quickly.
If you are curious what a particular implementation of everyone's favourite buzzword might look like, NVIDIA's DRIVE Constellation is an example after JoshTekk's own heart: literally a way to create open, scalable simulations for large fleets of self-driving cars to train them ... for good, one hopes. Currently the Toyota Research Institute-Advanced Development utilizes these products in the development of their next self-driving fleet, and NVIDIA obviously hopes others will follow suit.
There is not much to see from the perspective of a gamer in the short term, but considering NVIDIA's work at shifting the horsepower from the silicon you own to their own cloud, this will certainly impact the future of gaming from both a hardware and gameplay perspective. GPUs as a Service may not be the future many of us want, but this suggests it could be possible, not to mention the dirty tricks enemy AIs will be able to pull with this processing power behind them.
One might even dream that escort missions could become less of a traumatic experience!
Subject: Cases and Cooling | March 19, 2019 - 03:35 PM | Sebastian Peak
Tagged: tempered glass, mesh, MasterBox NR600, MasterBox NR400, enclosure, cooler master, chassis, case, airflow
Cooler Master has launched a pair of new cases today with the MasterBox NR400 and the MasterBox NR600. These budget-friendly cases offer a minimalist approach with clean lines and no flashy lighting effects, and should offer good airflow with their full mesh front panels.
The Cooler Master MasterBox NR600 is an ATX mid-tower
The MasterBox NR400 is the smaller of the two cases, supporting mini ITX and micro ATX motherboards, with the larger MasterBox NR600 offering support for standard ATX motherboards.
Cooler Master provides this list of features for these two new NR series cases:
- Minimalistic Mesh Design - Elegant design elements are applied to mesh for optimal thermal performance.
- Optimal Thermal Performance – The full mesh front panel and ventilated top panel provide a high potential for thermal performance.
- Flush Tempered Glass Side Panel – The tempered glass side panel, fastened by thumbscrews on the rear panel, keeps the surface completely flush.
- Headset Jack – The single 4 pole headset jack features both audio and microphone capabilities simultaneously so separate jacks are not needed.
- Graphics Card Support Up to 346mm (NR400) / 410mm (NR600) – Generous clearance space is provided to support the latest graphics cards.
- Cable Management – High quality, longer length rubber grommets paired with generous clearance behind the motherboard offers ample room for cable management.
"Thermal performance is at the core of the NR Series, with the combination of two pre-installed fans for intake and exhaust and the fine mesh front panel. Generous support for radiator mounting and unrestricted ventilation on the top and front panel ensure that even the most demanding components are efficiently cooled while the two USB 3.1 gen 1 ports at the top of the case provide ample connectivity for everyday use."
The Cooler Master MasterBox NR400 is a smaller mATX design
The MasterBox NR400 has an MSRP of $59.99, with the NR600 at $69.99. Both cases are available for pre-sale today, though no listings have yet found their way to Amazon or Newegg at time of writing.
Subject: General Tech | March 19, 2019 - 03:28 PM | Scott Michaud
Tagged: google, stadia
Google Stadia, a new gaming service, was announced at the 2019 Game Developers Conference. Much like OnLive and PlayStation Now, users connect to an instance of video games running in datacenters. Because it is a conference for game developers, they also mentioned a few added features, such as directly transcoding for YouTube and allowing the audience to wait in line and jump into a streamer’s game session.
Requirements and price were not mentioned, but it will arrive sometime in 2019.
Google also mentioned that applicable games would run on Linux and the Vulkan API. Given their Android initiative, and their desire to not pay Microsoft a license fee for VMs, this makes a lot of sense. It also forces major engine developers to target and optimize Linux-focused APIs, which will be good in the long run, especially if Google starts adding GPUs from Intel and NVIDIA in the coming years. I'm not sure how much it will push Linux ports, however, because that's not really why publishers avoid the platform.
In terms of hardware, Google claims that an instance will have about 10.7 teraflops of GPU performance on an AMD platform. In terms of compute, this puts it equivalent to a GTX 1080 Ti, although AMD tends to have reduced fillrate, etc., which keeps them behind NVIDIA parts with equivalent compute performance. (Those equivalent AMD parts also tend to be significantly cheaper, and thus comparing flop to flop isn’t fair in most other circumstances.)
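Those teraflop figures are simple arithmetic: peak FP32 throughput is two operations (one fused multiply-add) per shader per clock. A quick sketch, using approximate shader counts and boost clocks from public spec sheets (my numbers, not Google's):

```python
# Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock.
def peak_tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000.0

# Approximate public specs (assumptions for illustration):
gtx_1080_ti = peak_tflops(3584, 1.582)   # ~11.3 TFLOPS
rx_vega_56  = peak_tflops(3584, 1.471)   # ~10.5 TFLOPS

# Working backwards: Google's claimed 10.7 TFLOPS, assuming a 56-CU
# (3584-shader) AMD part, would imply a clock of roughly:
implied_clock_ghz = 10.7 * 1000.0 / (2 * 3584)   # ~1.49 GHz
```

If the 56-CU assumption holds, Google's figure works out to a clock just shy of 1.5 GHz, a touch above a stock Vega 56.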
As long-time readers know, I am also very cautious about streaming services because they are just about the epitome of DRM. While the content is available on other, local platforms? Not really a big deal. But if a game is developed for Google’s service and requires Google’s API, then games that leave the service, whether because Google’s license ends or Google simply stops wanting to host them, will be just about impossible to preserve. Games are not movies or sound files that can be transcoded into another format.
Linux and Vulkan do provide a bit more confidence, but you never know what will happen when a company with no legal obligation (ex: the GPL) to remain open gets large enough.
It’s something that I’m still worried about, though.
Subject: General Tech | March 19, 2019 - 02:34 PM | Sebastian Peak
Tagged: RGB, nzxt, lighting, HUE 2 Ambient RGB Kit V2, HUE 2, ambient
NZXT has announced version 2 of the HUE 2 Ambient RGB Lighting Kit, an update which the company says "brings product improvements that make installation easier and help ensure that the included LED strips will be securely attached to your monitor". The lighting kit uses NZXT's CAM software to select between lighting modes, which include an ambient mode that changes the lighting to complement the images on your monitor.
"The HUE 2 ecosystem delivers the broadest line of RGB lighting accessories for PC Builders with over 10 different accessories such as the HUE 2 RGB lighting kit, underglow kit, AER RGB 2 fans, and cable combs."
NZXT provides this list of features for the new HUE 2 kit:
HUE 2 Ambient RGB Kit V2 New Features:
- Stronger Adhesive: Upgraded the 3M LED strip adhesive backing tape to be thicker and stickier. It is more compatible with the different surfaces and textures of monitors on the market.
- L-Shape Corner Connector: For easier setup, we replaced the 150mm corner connectors with an L-shape corner connector for the top left and bottom right of your monitor.
- Alcohol Wipes: As your monitor may be dusty or dirty, we added alcohol-based cleaning wipes so you can clean the back of the monitor before adhering the LED strips.
HUE 2 RGB Ambient Lighting Kit Features
- Available in two configurations, one for 21” to 25”/34”-35” UltraWide monitors and one for 26” to 32” monitors, the HUE 2 RGB Ambient Lighting Kit adds gorgeous lighting to your gaming PC and provides a more immersive in-game experience* by projecting the colors from the edges of your display to your walls.
- HUE 2 RGB Ambient Lighting controller
- Faster microprocessor enables incredibly smooth lighting effects and virtually instantaneous response changes to colors on the screen in ambient mode
- Multiple LED strips in various lengths, along with easy-to-follow configuration and installation instructions for various monitor sizes and aspect ratios
- AC power adapter
- All necessary cables
- Compatible with all HUE 2 accessories and CAM lighting modes
* Ambient lighting modes are available only with the HUE 2 Ambient Lighting controller and require the use of CAM as a registered user, including acceptance of the then-current NZXT Terms of Service.
The HUE 2 lighting kits are available now in the USA with pricing as follows:
- HUE 2 Ambient RGB Lighting Kit V2 (21”-25”, 34”-35” UltraWide) - $99.99
- HUE 2 Ambient RGB Lighting Kit V2 (26”-32”) - $99.99
Subject: Cases and Cooling | March 19, 2019 - 01:57 PM | Jeremy Hellstrom
Tagged: thermaltake, Water 3.0 360 ARGB Sync, watercooler, 360mm radiator, RGB
Thermaltake's Water 3.0 360 ARGB Sync is a watercooler for those who live and breathe RGBs, as it is compatible with all the lighting software suites found on motherboards so you can create your own lightshow. It supports all current sockets from both AMD and Intel; the only compatibility issue you might have is fitting the 360mm radiator into your case. In Kitguru's testing it even performed well while cooling, but really it's all about the RGBs.
"There’s a lot to unpack from the name of the Thermaltake Water 3.0 360 ARGB Sync. The ‘360’ obviously refers to its 360mm radiator, and we know ARGB = Addressable RGB lighting. The ‘Sync’ aspect references software compatibility for the Water 3.0 360 with current motherboards ..."
Here are some more Cases & Cooling reviews from around the web:
- ID-Cooling Auraflow X 240 @ TechPowerUp
- Noctua NF-F12 PWM chromax Fan @ TechPowerUp
- Corsair Carbide 678C @ The Guru of 3D
- CORSAIR CRYSTAL 680X RGB ATX Smart Case Review @ NikKTech
- Cooler Master MasterBox NR400 @ Kitguru
Subject: General Tech | March 19, 2019 - 01:19 PM | Jeremy Hellstrom
Tagged: atari, delay, amd, Vega, ryzen
You may remember the announcement of the re-launch of the Atari VCS back in the summer of 2017, though by now you may have decided that it is going the way of the ZX Spectrum Vega+. If you do still hold hope, Atari is once again testing your patience by announcing another delay, to the end of 2019. There is a reason, however, which you may or may not find acceptable: they will be upgrading the AMD Ryzen chip at the heart of the system, with a new generation of Vega graphics offering modern performance. Atari also suggests this will offer much quieter and cooler operation, in a quote over at The Inquirer.
"The Atari VCS launched on Indiegogo and was originally set to arrive in spring 2018, but the company has announced that it will now arrive at the butt-end of 2019 (and that projection is just for the US and Canada)."
Here is some more Tech News from around the web:
- NVIDIA GTC 2019: RTX Servers, Omniverse Collaboration, CUDA-X AI, And More @ Techgage
- Corporations, not consumers, drive demand for HP’s new VR headset @ Ars Technica
- MacBook users have taken to giving oral relief to frustrated keyboards @ The Inquirer
- Firefox 66 Arrives With Autoplaying Blocked by Default, Smoother Scrolling, and Better Search @ Slashdot
- NVIDIA Jetson Nano: A Feature-Packed Arm Developer Kit For $99 USD @ Phoronix
- This headline is proudly brought to you by wired keyboards: Wireless Fujitsu model hacked @ The Register
- Apple finally updates the iMac with significantly more powerful CPU and GPU options @ Ars Technica
- TSMC seeing chip orders for Android devices ramp up @ DigiTimes
- QNAP QSW-1208-8C-US 12-Port Unmanaged 10GbE Switch @ Modders-Inc
- ASUS RT-AX88U Dual band AX6000 router @ Guru of 3D
Subject: General Tech | March 19, 2019 - 01:20 AM | Scott Michaud
Tagged: Khronos, openxr
At the Game Developers Conference in San Francisco, the Khronos Group has published the first, provisional release of its augmented reality and virtual reality API: OpenXR 0.90. The goal is to allow the general public to implement it in their software and provide feedback (via the official forums) before it reaches 1.0.
The last time we mentioned OpenXR was at SIGGRAPH back in August. That event had a demo area on the show floor with Epic's Showdown VR demo, which evolved into Robo Recall. On that note, Epic plans to continue supporting OpenXR, and they pride themselves on being the engine that powered the original demo. Unity has also responded positively to OpenXR, but their quoted statement doesn’t make any specific promises. Microsoft and Collabora are each, separately, providing hardware implementations of the new API. Oculus plans to support the API when it reaches 1.0 status. HTC also wants to support OpenXR with the Vive, although Valve is nowhere to be seen on the list of quotes. Neither is Google.
Subject: General Tech | March 18, 2019 - 10:00 PM | Sebastian Peak
Tagged: gameworks, unreal engine, Unity, rtx, ray tracing, nvidia, GTC 19, GTC, dxr, developers
Today at GTC NVIDIA announced GameWorks RTX and the implementation of real-time ray tracing in the upcoming Unreal Engine 4.22 and the latest version of Unity, currently in the 2019.03 preview.
NVIDIA Announces GameWorks RTX
While Pascal and non-RTX Turing support for real-time ray tracing is something of a bombshell from NVIDIA, the creation of GameWorks tools for such effects is not surprising.
“NVIDIA GameWorks RTX is a comprehensive set of tools that help developers implement real time ray-traced effects in games. GameWorks RTX is available to the developer community in open source form under the GameWorks license and includes plugins for Unreal Engine 4.22 and Unity’s 2019.03 preview release.”
NVIDIA lists these components of GameWorks RTX:
- RTX Denoiser SDK – a library that enables fast, real-time ray tracing by providing denoising techniques to lower the required ray count and samples per pixel. It includes algorithms for ray traced area light shadows, glossy reflections, ambient occlusion and diffuse global illumination.
- Nsight for RT – a standalone developer tool that saves developers time by helping to debug and profile graphics applications built with DXR and other supported APIs.
Unreal Engine and Unity Gaining Real-Time Ray Tracing Support
And while not specific to NVIDIA hardware, news of more game engines offering integrated DXR support was also announced during the keynote:
“Unreal Engine 4.22 is available in preview now, with final release details expected in Epic’s GDC keynote on Wednesday. Starting on April 4, Unity will offer optimized, production-focused, realtime ray tracing support with a custom experimental build available on GitHub to all users with full preview access in the 2019.03 Unity release. Real-time ray tracing support from other first-party AAA game engines includes DICE/EA’s Frostbite Engine, Remedy Entertainment’s Northlight Engine and engines from Crystal Dynamics, Kingsoft, Netease and others.”
RTX may have been off to a slow start, but this will apparently be the year of real-time ray tracing after all - especially with the upcoming NVIDIA driver update adding support to the GTX 10-series and new GTX 16-series GPUs.
Subject: Graphics Cards | March 18, 2019 - 09:41 PM | Sebastian Peak
Tagged: unreal engine, Unity, turing, rtx, ray tracing, pascal, nvidia, geforce, GTC 19, GTC, gaming, developers
Today at GTC NVIDIA announced a few things of particular interest to gamers, including GameWorks RTX and the implementation of real-time ray tracing in upcoming versions of both Unreal Engine and Unity (we already posted the news that CRYENGINE will be supporting real-time ray tracing as well). But there is something else... NVIDIA is bringing ray tracing support to GeForce GTX graphics cards.
This surprising turn means that hardware RT support won’t be limited to RTX cards after all, as the install base of NVIDIA ray-tracing GPUs “grows to tens of millions” with a simple driver update next month, adding the feature both to previous-gen Pascal and to the new Turing GTX GPUs.
How is this possible? It’s all about the programmable shaders:
“NVIDIA GeForce GTX GPUs powered by Pascal and Turing architectures will be able to take advantage of ray tracing-supported games via a driver expected in April. The new driver will enable tens of millions of GPUs for games that support real-time ray tracing, accelerating the growth of the technology and giving game developers a massive installed base.
With this driver, GeForce GTX GPUs will execute ray traced effects on shader cores. Game performance will vary based on the ray-traced effects and on the number of rays cast in the game, along with GPU model and game resolution. Games that support the Microsoft DXR and Vulkan APIs are all supported.
However, GeForce RTX GPUs, which have dedicated ray tracing cores built directly into the GPU, deliver the ultimate ray tracing experience. They provide up to 2-3x faster ray tracing performance with a more visually immersive gaming environment than GPUs without dedicated ray tracing cores.”
A very important caveat is the “2-3x faster ray tracing performance” figure for GeForce RTX graphics cards mentioned in the last paragraph; expectations will need to be tempered, as RT effects will be less efficient running on shader cores (Pascal and GTX Turing) than they are with dedicated RT cores, as demonstrated by these charts:
It's going to be a busy April.
Subject: Graphics Cards | March 18, 2019 - 03:13 PM | Jeremy Hellstrom
Tagged: fxaa, SMAA, Anti-aliasing, MLAA, taa, amd, nvidia
Apart from the new DLSS available on NVIDIA's RTX cards, it has been a very long time since we looked at anti-aliasing implementations and the effect your choice has on performance and visual quality. You are likely familiar with the four most common implementations, from AMD's MLAA and NVIDIA's FXAA, which rarely appear in new-generation games, through to TAA/TXAA and SMAA; but when was the last time you refreshed your memory on what they actually do and how they compare?
Not only did Overclockers Club look into those, they also discuss some of the other attempted implementations, as well as the sampling types that lie behind these technologies. Check out their deep dive here.
"One setting present in many if not all modern PC games that can dramatically impact performance and quality is anti-aliasing and, to be honest, I never really understood how it works. Sure we have the general idea that super-sampling is in effect running at a higher resolution and then downscaling, but then what is multi-sampling? How do post-processing methods work, like the very common FXAA and often favored SMAA?"
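As a taste of what the article digs into, here is a minimal sketch of the step most post-process methods share (loosely modeled on FXAA; the luma weights and threshold here are illustrative, not the exact values of any shipping implementation): convert the neighborhood to luma, measure local contrast, and only spend blending work where that contrast exceeds a threshold.

```python
# Minimal sketch of the local-contrast test at the heart of post-process AA.
# Loosely modeled on FXAA; weights and threshold are illustrative only.

def luma(rgb):
    """Approximate perceptual brightness of an (r, g, b) tuple in [0, 1]."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def is_edge(image, x, y, threshold=0.125):
    """Flag pixel (x, y) when contrast across its 4-neighborhood exceeds
    the threshold; only flagged pixels would receive the (costly) blend."""
    center = luma(image[y][x])
    neighbors = [luma(image[y - 1][x]), luma(image[y + 1][x]),
                 luma(image[y][x - 1]), luma(image[y][x + 1])]
    lo = min(neighbors + [center])
    hi = max(neighbors + [center])
    return (hi - lo) > threshold
```

A flat region produces no contrast and is skipped entirely, which is why these methods are so much cheaper than super-sampling: the expensive work only happens along detected edges.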
Here are some more Graphics Card articles from around the web:
- MSI GTX 1660 Ti Gaming X – Turing Without The RTX @ Bjorn3d
- MSI GeForce GTX 1660 Gaming X 6 GB @ TechPowerUp
- The GTX 1660 41 game OC Shootout vs. the RX 590 @ BabelTechReviews
- MSI Nvidia GeForce GTX 1660 Gaming X Review – an Even Lower Cost Turing Option? @ Bjorn3d
Subject: General Tech | March 18, 2019 - 01:27 PM | Jeremy Hellstrom
Tagged: Comet Lake-S, Skylake, Intel, 14nm, Comet Lake-U, Comet Lake-H, rumour, leak
Intel's new Comet Lake families of chips will be an update to the existing Skylake architecture and will share the same 14nm process node, according to what The Inquirer have discovered from leaked documents. On the desktop side they refer to 10+2 and 8+2 SKUs, from which we can infer the presence of GT2 graphics, with a 5GHz part likely topping that line. Notebook chips are expected to top out at six cores, as are the ultra-low power models. In theory we should see these arrive some time this year, around the release of Zen 2, though we lack hard dates for either at this time.
"According to reports, Comet Lake-S will be based on the Skylake microarchitecture and will be created using Intel's now-ageing 14nm manufacturing process."
Here is some more Tech News from around the web:
- D-Wave 2000Q hands-on: Steep learning curve for quantum computing @ Ars Technica
- Microsoft Edges its bets on new Chrome and Firefox Defender extension @ The Inquirer
- Apple quietly launches 10.5in iPad Air and new iPad Mini @ The Inquirer
- It’s time to start caring about “VR cinema,” and SXSW’s stunners are proof @ Ars Technica
- Chip flinger Broadcom says its software unit's doing great. Wait, what? @ The Register
- Forget that rare-earth element crunch – we can now just extract them from industrial waste @ The Register
- Q&A: Crypto-guru Bruce Schneier on teaching tech to lawmakers, plus privacy failures – and a call to techies to act @ The Register
- The World Wide Web Turns 30: A Timeline @ Techspot
- Razer 3 gaming handset coming, as Nintendo mulls joining the race @ DigiTimes
- Before Google+ Shuts Down, The Internet Archive Will Preserve Its Posts @ Slashdot
- MySpace Has Reportedly Lost All Photos, Videos and Songs Uploaded Over 12 Years Due To Data Corruption During a Server Migration Project @ Slashdot
- How To Interface Sega Controllers, And Make Them Wireless @ Hackaday
Subject: General Tech | March 18, 2019 - 09:03 AM | Sebastian Peak
Tagged: vulkan, RX Vega 56, rtx, ray tracing, radeon, nvidia, Neon Noir, dx12, demo, crytek, CRYENGINE, amd
Crytek has released video of a new demo called Neon Noir, showcasing real-time ray tracing with a new version of CRYENGINE Total Illumination, slated for release in 2019. The big story here is that this is platform agnostic, meaning both AMD and NVIDIA (including non-RTX) graphics cards can produce the real-time lighting effects. The video was rendered in real time using an AMD Radeon RX Vega 56 (!) at 4K30, with Crytek's choice in GPU seeming to assuage fears of any meaningful performance penalty with this feature enabled (video embedded below):
“Neon Noir follows the journey of a police drone investigating a crime scene. As the drone descends into the streets of a futuristic city, illuminated by neon lights, we see its reflection accurately displayed in the windows it passes by, or scattered across the shards of a broken mirror while it emits a red and blue lighting routine that will bounce off the different surfaces utilizing CRYENGINE's advanced Total Illumination feature. Demonstrating further how ray tracing can deliver a lifelike environment, neon lights are reflected in the puddles below them, street lights flicker on wet surfaces, and windows reflect the scene opposite them accurately.”
Crytek is calling the new ray tracing features “experimental” at this time, but the implications of ray tracing tech beyond proprietary hardware and even graphics API (it works with both DirectX 12 and Vulkan) are obviously a very big deal.
“Neon Noir was developed on a bespoke version of CRYENGINE 5.5, and the experimental ray tracing feature based on CRYENGINE’s Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.”
You can read the full announcement from Crytek here.
Subject: Processors | March 18, 2019 - 08:38 AM | Jim Tanous
Tagged: spoiler, speculation, spectre, rowhammer, meltdown, amd
AMD has issued a support article stating that its CPUs are not susceptible to the recently disclosed SPOILER vulnerability. Support Article PA-240 confirms initial beliefs that AMD processors were immune from this specific issue due to the different ways that AMD and Intel processors store and access data:
We are aware of the report of a new security exploit called SPOILER which can gain access to partial address information during load operations. We believe that our products are not susceptible to this issue because of our unique processor architecture. The SPOILER exploit can gain access to partial address information above address bit 11 during load operations. We believe that our products are not susceptible to this issue because AMD processors do not use partial address matches above address bit 11 when resolving load conflicts.
SPOILER, one of the latest in the line of speculative execution vulnerabilities that have called into question years of processor architecture design, describes a process that can expose the mappings between virtual and physical memory. That's not a complete attack in and of itself, but it allows other attacks such as Rowhammer to be executed much more quickly and easily.
The research paper that initially disclosed SPOILER earlier this month states that Intel CPUs dating as far back as the first generation Core-series processors are affected. Intel, however, has stated that the vulnerabilities described in the paper can be avoided. The company provided a statement to PC Perspective following our initial SPOILER reporting:
Intel received notice of this research, and we expect that software can be protected against such issues by employing side channel safe software development practices. This includes avoiding control flows that are dependent on the data of interest. We likewise expect that DRAM modules mitigated against Rowhammer style attacks remain protected. Protecting our customers and their data continues to be a critical priority for us and we appreciate the efforts of the security community for their ongoing research.
Subject: Mobile | March 15, 2019 - 05:07 PM | Jeremy Hellstrom
Tagged: V50 ThinQ, LG, 5G
We know for sure which came first in this case: the 5G modem arrived before 5G service, no matter what certain companies' branding might say. LG's new V50 ThinQ contains a Qualcomm Snapdragon 855 chipset and an X50 modem inside its shell. LG has already announced it will not be releasing a folding phone this year, but this phone comes with an interesting compromise: you can purchase a second screen with POGO pin connectors that attaches to the phone. The hinge technically makes it a folding phone, though not in the way that others are developing.
Take a quick peek at it over at The Inquirer.
"That's not a bad thing, but it serves as a reminder that the days when a new technology meant a retrograde in form factor are gone. There's nothing about the V50 ThinQ 5G that sets it apart from any 4G phone - at least on the outside."
Here are some more Mobile articles from around the web:
More Mobile Articles
Subject: General Tech | March 15, 2019 - 04:41 PM | Jeremy Hellstrom
Tagged: MK850, analogue, cooler master, gaming keyboard, aimpad, input
We've seen Aimpad's analogue input in action before, on other Cooler Master products as well as the Wooting One. Cooler Master has now released the MK850, which utilizes this technology along with providing RGBs for you to gaze longingly at. You might have noticed that there are more than a few keys on this board; the extras let you toggle analogue input on and off as well as modify the sensitivity of the keys which do have analogue capability, specifically Q, W, E, R, A, S, D, and F. M1 through M5 let you assign different profiles for different games.
"Forget, for a moment, what the Cooler Master MK850 looks like. Forget about its RGB lighting, its brushed metal top, its many keys and buttons, and the software that comes with it, because none of those things make it a keyboard of consequence. The only thing we really care about is whether or not the MK850 can deliver on the tantalizing promise of analog keyboard input for gaming."
Here is some more Tech News from around the web:
- Razer BlackWidow @ Kitguru
- Patriot Viper V765 mechanical keyboard @ The Tech Report
- Cooler Master MK730 @ Kitguru
- HyperX Alloy FPS RGB Mechanical Gaming Keyboard Review @ NikKTech
- ASUS ROG Balteus Qi Mouse Pad @ TechPowerUp
- Razer Basilisk Essential Mouse @ Kitguru
- ASUS ROG Gladius II Wireless @ TechPowerUp
Subject: General Tech | March 15, 2019 - 03:46 PM | Jeremy Hellstrom
Tagged: msi, gaming laptop, rtx, E75 Raider RGB 8SG, GL73 8SE, GS75 Stealth 8SG, P75 Creator
Kitguru wasn't able to run benchmarks on MSI's four new laptops yet, but they did get a chance to check them out and posted a video overview. The E75 Raider RGB 8SG, GL73 8SE, GS75 Stealth 8SG, and P75 Creator all contain an RTX 2060, 2070, or 2080, as well as an Intel Core i7-8750H and a variety of NVMe storage options.
"Earlier in the week, I took a trip down to Leo’s studio where I was joined by Natalie McMorrow from MSI, who took us through an overview of some of the upcoming gaming laptops that MSI will have to offer. We also managed to get a sneak preview of a completely different type of laptop made specifically for content creators."
Here is some more Tech News from around the web:
- Overhyped 5G is being 'rushed', Britain's top comms boffin reckons @ The Register
- Steam now lets you stream anywhere in the world @ The Inquirer
- Beto O'Rourke's Secret Membership in America's Oldest Hacking Group @ Slashdot
- Intel to start engineering projects for 5G modem chips in 2Q19, say sources @ DigiTimes
- Kids are taking to Google Docs to message each other undetected @ The Inquirer
- Facebook blames 'server config change' for 14-hour outage. Someone run that through the universal liar translator @ The Register
- Humble Bundle is giving away Grid 2
Subject: Graphics Cards | March 14, 2019 - 08:38 PM | Sebastian Peak
Tagged: Windows 7, The Division 2, radeon, graphics, gpu, gaming, dx12, driver, DirectX 12, amd, Adrenalin, 19.3.2
AMD has released Radeon 19.3.2 drivers, adding support for Tom Clancy’s The Division 2 and offering a performance boost with Civilization VI: Gathering Storm. This update also adds a number of new Vulkan extensions. But wait, there's more: "DirectX 12 on Windows 7 for supported game titles." The DX12-ening is upon us.
Here are AMD's release notes for 19.3.2:
Radeon Software Adrenalin 2019 Edition 19.3.2 Highlights
- Tom Clancy’s The Division® 2
- Sid Meier’s Civilization® VI: Gathering Storm
- Up to 4% average performance gains on AMD Radeon VII with Radeon™ Software Adrenalin 2019 Edition 19.3.2 vs 19.2.3. RS-288
- DirectX® 12 on Windows® 7 for supported game titles
- AMD is thrilled to help expand DirectX® 12 adoption across a broader range of Windows operating systems with Radeon Software Adrenalin 2019 Edition 18.12.2 and onward, which enables consumers to experience exceptional levels of detail and performance in their games.
- Radeon ReLive for VR may sometimes fail to install during Radeon Software installation.
- Fan curve may fail to switch to manual mode after the manual toggle is switched when fan curve is still set to default behavior.
- Changes made in Radeon WattMan settings via Radeon Overlay may sometimes not save or take effect once Radeon Overlay is closed.
- Rainbow Six Siege™ may experience intermittent corruption or flickering on some game textures during gameplay.
- DOTA™2 VR may experience stutter on some HMD devices when using the Vulkan® API.
- Mouse cursors may disappear or move out of the boundary of the top of a display on AMD Ryzen Mobile Processors with Radeon Vega Graphics.
- Performance metrics overlay and Radeon WattMan gauges may experience inaccurate fluctuating readings on AMD Radeon VII.
Subject: Graphics Cards | March 14, 2019 - 01:33 PM | Jeremy Hellstrom
Tagged: video card, turing, rtx, nvidia, gtx 1660 ti, gtx 1660, gtx 1060, graphics card, geforce, GDDR5, gaming, 6Gb
Sebastian has given you a look at the triple slot EVGA GTX 1660 XC Black as well as the dual fan, dual slot MSI GTX 1660 GAMING X, both of which did well in benchmarks, especially when overclocked. The new GTX 1660 comes in other shapes and sizes too, like the dual slot, single fan GTX 1660 StormX OC 6G from Palit, which The Guru of 3D reviewed. Do not underestimate it because of its diminutive size: the Boost Clock is 1830 MHz out of the box, and with some tweaking it will sit around 2070 MHz with the GDDR5 pushed up to 9800 MHz.
Check out even more models below.
"We review a GeForce GTX 1660 that is priced spot on that 219 USD marker, the MSRP of the new non-Ti model, meet the petite Palit GeForce GTX 1660 StormX OC edition. Based on a big single fan and a small form factor you should not be fooled by its looks. It performs well on all fronts, including cooling acoustic levels."
Here are some more Graphics Card articles from around the web:
- MSI GeForce GTX 1660 VENTUS XS 6G OC @ Guru of 3D
- Palit GeForce GTX 1660 StormX 6 GB @ TechPowerUp
- MSI GeForce GTX 1660 Ti GAMING X 6G @ Guru of 3D
- Zotac GeForce GTX 1660 Twin Fan 6 GB @ TechPowerUp
- EVGA GTX 1660 XC takes on the Red Devil RX 590 in 41 @ BabelTechReviews
- EVGA GTX 1660 XC Ultra 6 GB @ TechPowerUp
- MSI GTX 1660 Gaming X 6G @ Kitguru
- ZOTAC GAMING GeForce RTX 2070 MINI @ Modders-Inc