When PC monitors made the mainstream transition to widescreen aspect ratios in the mid-2000s, many manufacturers opted for resolutions at a 16:10 ratio. My first widescreen displays were a pair of Dell monitors with a 1920x1200 resolution and, as time and technology marched forward, I moved to larger 2560x1600 monitors.
I grew to rely on and appreciate the extra vertical resolution that 16:10 displays offer, but as the production and development of "widescreen" PC monitors matured, it naturally began to merge with the television industry, which had long since settled on a 16:9 aspect ratio. This led to the introduction of PC displays with native resolutions of 1920x1080 and 2560x1440, keeping things simple for activities such as media playback but robbing consumers of pixels in terms of vertical resolution.
I was well-accustomed to my 16:10 monitors when the 16:9 aspect ratio took over the market, and while I initially thought that the 120 or 160 missing rows of pixels wouldn't be missed, I was unfortunately mistaken. Those seemingly insignificant pixels turned out to make a noticeable difference in terms of on-screen productivity real estate, and my 1080p and 1440p displays have always felt cramped as a result.
I was therefore sad to see that the relatively new ultrawide monitor market continued the trend of limited vertical resolutions. Most ultrawides feature a 21:9 aspect ratio with resolutions of 2560x1080 or 3440x1440. While this gives users extra resolution on the sides, it maintains the same limited height options of those ubiquitous 1080p and 1440p displays. The ultrawide form factor is fantastic for movies and games, but while some find them perfectly acceptable for productivity, I still felt cramped.
Thankfully, a new breed of ultrawide monitors is here to save the day. In the second half of 2017, display manufacturers such as Dell, Acer, and LG launched 38-inch ultrawide monitors with a 3840x1600 resolution. Just as the early ultrawides "stretched" a 1080p or 1440p monitor, these 38-inch versions do the same for my beloved 2560x1600 displays.
The Acer XR382CQK
I've had the opportunity to test one of these new "taller" displays thanks to a review loan from Acer of the XR382CQK, a curved 37.5-inch behemoth. It shares the same glorious 3840x1600 resolution as others in its class, but it also offers some unique features, including a 75Hz refresh rate, USB-C input, and AMD FreeSync support.
Based on my time with the XR382CQK, my hopes for those extra 160 rows of resolution were fulfilled. The height of the display area felt great for tasks like video editing in Premiere and referencing multiple side-by-side documents and websites, and the gaming experience was just as satisfying. And at 38 inches, the display is quite usable at 100 percent scaling.
There's also an unexpected benefit for video content that I hadn't originally considered. I was so focused on regaining that missing vertical resolution that I initially failed to appreciate the jump in horizontal resolution from 3440px to 3840px. This is the same horizontal resolution as the consumer UHD standard, which means that 4K movies in a 21:9 or similar aspect ratio will be viewable in their full size with a 1:1 pixel ratio.
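As a quick back-of-the-envelope check (this is just standard aspect-ratio arithmetic, not anything from Acer's spec sheet), a short Python sketch shows why cinematic 2.40:1 "scope" content at UHD width maps 1:1 onto a 3840x1600 panel, while full 16:9 UHD would not fit:

```python
# Illustrative aspect-ratio arithmetic for a 3840x1600 ultrawide panel.
panel_w, panel_h = 3840, 1600

def fitted_height(width, aspect):
    """Height of content at the given aspect ratio, scaled to fill the width."""
    return round(width / aspect)

print(fitted_height(panel_w, 2.40))   # 1600 -> a 2.40:1 film fills the panel exactly
print(fitted_height(panel_w, 16/9))   # 2160 -> full 16:9 UHD is taller than the panel
```

In other words, the panel's own 3840:1600 ratio reduces to exactly 2.40:1, which is why ultrawide-format 4K video lands pixel-for-pixel.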
One of the promises of moving to interfaces like USB 3.1 Gen 2 and Thunderbolt 3 on notebooks is the idea of the "one cable future." For the most part, I think we are starting to see some of those benefits. It's nice that with USB Power Delivery, users aren't tied to buying chargers directly from their notebook manufacturer or hunting for oddball third-party chargers with the exact barrel connector. I also find it a great feature when laptops have USB-C charging ports on opposing sides of the notebook, allowing me greater flexibility to plug in a charger without putting additional strain on the cable.
For years, the end-game for mobile versatility has been a powerful thin-and-light notebook that you can connect to a dock at home and use as a desktop PC. With more powerful notebook processors like Intel's quad-core 8th-generation parts coming out, we are beginning to reach a point where we have the processing power; the next step is having a quality dock into which to plug these notebooks.
While USB-C can support DisplayPort, Power Delivery, and 10 Gbit/s transfer speeds in its highest-end configuration, this would still be a bit lacking for power users. Thunderbolt 3, which offers the same display and power delivery capabilities but with 40 Gbit/s data transfer, is a more suitable option.
Today, we are taking a look at the CalDigit Thunderbolt Station 3 Plus, a Thunderbolt 3-enabled device that provides a plethora of connectivity options for your notebook.
Introduction, Specifications and Packaging
Intel has wanted 3D XPoint to go 'mainstream' for some time now. Their last big mainstream part, the X25-M, launched 10 years ago. It was available in relatively small capacities of 80GB and 160GB, but it brought incredible performance at a time when most other early SSDs were mediocre at best. The X25-M brought NAND flash memory to the masses, and now 10 years later we have another vehicle that hopes to bring 3D XPoint to the masses - the Intel Optane SSD 800P:
Originally dubbed 'Brighton Beach', the 800P comes in at capacities smaller than its decade-old counterpart - only 58GB and 118GB. The 'odd' capacities are due to Intel playing it extra safe with additional ECC and some space to hold metadata related to wear leveling. Even though 3D XPoint media has great endurance that runs circles around NAND flash, it can still wear out, and therefore the media must still be managed similarly to NAND. 3D XPoint can be written in place, meaning far less juggling of data while writing, allowing for far greater performance consistency across the board. Consistency and low latency are the strongest traits of Optane, to the point where Intel was bold enough to launch an NVMe part with half of the typical PCIe 3.0 x4 link available in most modern SSDs. For Intel, the 800P is more about being nimble than having straight line speed. Those after higher throughputs will have to opt for the SSD 900P, a device that draws more power and requires a desktop form factor.
- Capacities: 58GB, 118GB
- PCIe 3.0 x2, M.2 2280
- Sequential: Up to 1200/600 MB/s (R/W)
- Random: 250K+ / 140K+ IOPS (R/W) (QD4)
- Latency (average sequential): 6.75us / 18us (R/W) (TYP)
- Power: 3.75W Active, 8mW L1.2 Sleep
Specs are essentially what we would expect from an Optane Memory type device. Capacities of 58GB and 118GB are welcome additions over the prior 16GB and 32GB Optane Memory parts, but the 120GB capacity point is still extremely cramped for those who would typically desire such a high performing / low latency device. We had 120GB SSDs back in 2009, after all, and nowadays we have 20GB Windows installs and 50GB game downloads.
Before moving on, I need to call out Intel on their latency specification here. To put it bluntly, sequential transfer latency is a crap spec. Nobody cares about the latency of a sequential transfer, especially for a product that touts its responsiveness - a quality based on *random* access latency. If the 6.75us figure above were a random access latency, it would translate to roughly 150,000 QD1 IOPS (the 800P is fast, but it's not *that* fast). Most storage devices/media will internally 'read ahead' so that sequential latencies at the interface are as low as possible, increasing sequential throughput. Sequential latency is simply the inverse of throughput, meaning any SSD with a higher sequential throughput than the 800P should beat it on this particular spec. To drive the point home further, consider that an HDD's average sequential latency can beat the random read latency of a top-tier NVMe SSD like the 960 PRO. It's just a bad way to spec a storage device, and it won't do Intel any favors if competing products adopt this same method of rating latency in the future.
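To make the arithmetic behind that complaint concrete, here is a small Python sketch (illustrative only): inverting the 6.75us figure yields the roughly-150K "IOPS" number mentioned above, and per-block sequential latency falls directly out of rated throughput:

```python
# Why sequential latency is just the inverse of throughput (illustrative).
US = 1e-6  # one microsecond in seconds

def qd1_iops(latency_us):
    """IOPS implied by a per-command latency at queue depth 1."""
    return 1 / (latency_us * US)

# If 6.75us were a *random* access latency, QD1 IOPS would be:
print(round(qd1_iops(6.75)))          # ~148,148 -- the "roughly 150K" figure

def seq_latency_us(throughput_mb_s, block_kb=4):
    """Average per-block 'latency' implied by a sequential throughput rating."""
    return (block_kb * 1024) / (throughput_mb_s * 1e6) / US

# Any drive with higher sequential throughput 'wins' this spec automatically:
print(round(seq_latency_us(1200), 2))  # ~3.41 us per 4KB at 1200 MB/s
```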
Our samples came in white/brown box packaging, but I did snag a couple of photos of what should be the retail box this past CES:
Don't Call It SPIR of the Moment
Vulkan 1.0 released a little over two years ago. The announcement, with conformant drivers, conformance tests, tools, and a patch for The Talos Principle, made for a successful launch for the Khronos Group. Of course, games weren’t magically three times faster or anything like that, but it got the API out there; it also redrew the line between game and graphics driver.
The Khronos Group repeats this “hard launch” with Vulkan 1.1.
First, the specifications for both Vulkan 1.1 and SPIR-V 1.3 have been published. We will get into the details of those two standards later. Second, a suite of conformance tests has also been included with this release, which helps prevent an implementation bug from becoming an implied API that software relies upon ad infinitum. Third, several developer tools have been released, mostly by LunarG, into the open-source ecosystem.
Fourth – conformant drivers. The following companies have Vulkan 1.1-certified drivers:
There are two new additions to the API:
The first is Protected Content. This allows developers to restrict access to rendering resources (DRM). Moving on!
The second is Subgroup Operations. We mentioned that they were added to SPIR-V back in 2016 when Microsoft announced HLSL Shader Model 6.0, and some of the instructions were available as OpenGL extensions. They are now a part of the core Vulkan 1.1 specification. This allows the individual threads of a GPU in a warp or wavefront to work together on specific instructions.
Shader compilers can use these intrinsics to speed up operations such as:
- Finding the min/max of a series of numbers
- Shuffle and/or copy values between lanes of a group
- Adding several numbers together
- Multiply several numbers together
- Evaluate whether any, all, or which lanes of a group evaluate true
In other words, shader compilers can do more optimizations, which boosts the speed of several algorithms and should translate to higher performance when shader-limited. It also means that DirectX titles using Shader Model 6.0 should be able to compile into their Vulkan equivalents when using the latter API.
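To illustrate the idea (this is a CPU-side sketch in Python, not actual shader code), a subgroup "add" can be modeled as a butterfly of shuffle operations across lanes: after log2(n) steps, every lane holds the total without any round-trip through shared memory.

```python
# Conceptual model of a subgroup reduction (e.g. subgroupAdd in GLSL).
# Each list element stands for the value held by one lane of a warp/wavefront.
def subgroup_add(lanes):
    """Butterfly reduction mirroring a shuffle-XOR pattern across lanes."""
    lanes = list(lanes)
    n = len(lanes)            # assume a power-of-two subgroup size (e.g. 32 or 64)
    offset = 1
    while offset < n:
        # each lane i adds the value "shuffled in" from lane (i XOR offset)
        lanes = [lanes[i] + lanes[i ^ offset] for i in range(n)]
        offset *= 2
    return lanes              # every lane now holds the full sum

print(subgroup_add([1, 2, 3, 4]))  # [10, 10, 10, 10]
```

The min/max, multiply, and any/all ballot operations listed above follow the same pattern with a different combining step.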
This leads us to SPIR-V 1.3. (We’ll circle back to Vulkan later.) SPIR-V is the shading language that Vulkan relies upon, which grew out of the earlier, LLVM-based SPIR intermediate representation. SPIR-V is the code that is actually run on the GPU hardware – Vulkan just deals with how to get this code onto the silicon as efficiently as possible. In a video game, this would be whatever code the developer chose to represent lighting, animation, particle physics, and almost anything else done on the GPU.
The Khronos Group is promoting that SPIR-V can be generated from GLSL, OpenCL C, or even HLSL. In other words, the developer will not need to rewrite their DirectX shaders to operate on Vulkan. This isn’t particularly new – Unity has done this sort of HLSL-to-SPIR-V conversion ever since they added Vulkan support – but it’s good to mention that it’s a promoted workflow. OpenCL C will also be useful for developers who want to move existing OpenCL code into Vulkan on platforms where the latter is available but the former rarely is, such as Android.
Speaking of which, that’s exactly what Google, Codeplay, and Adobe are doing. Adobe wrote a lot of OpenCL C code for their Creative Cloud applications, and they want to move it elsewhere. This ended up being a case study for an OpenCL to Vulkan run-time API translation layer and the Clspv OpenCL C to SPIR-V compiler. The latter is open source, and the former might become open source in the future.
Now back to Vulkan.
The other major change with this new version is the absorption of several extensions into the core, 1.1 specification.
The first is Multiview, which allows multiple projections to be rendered at the same time, as seen in the GTX 1080 launch. This can be used for rendering VR, stereoscopic 3D, cube maps, and curved displays without extra draw calls.
The second is device groups, which allows multiple GPUs to work together.
The third allows data to be shared between APIs and even whole applications. The Khronos Group specifically mentions that Steam VR SDK uses this.
The fourth is 16-bit data types. While most GPUs operate on 32-bit values, it might be beneficial to pack data into 16-bit values in memory for algorithms that are limited by bandwidth. It also helps Vulkan be used in non-graphics workloads.
We already discussed HLSL support, but that’s an extension that’s now core.
The sixth extension is YCbCr support, which is required by several video codecs.
The last thing that I would like to mention is the Public Vulkan Ecosystem Forum. The Khronos Group has regularly mentioned that they want to get the open-source community more involved in reporting issues and collaborating on solutions. In this case, they are working on a forum where both members and non-members will collaborate, as well as the usual GitHub issues tab and so forth.
You can check out the details here.
Introduction and First Impressions
Launching today, Corsair’s new Carbide Series 275R case is a budget-friendly option that still offers plenty of understated style with clean lines and the option of a tempered glass side panel. Corsair sent us a unit to check out, so we have a day-one review to share. How does it compete against recent cases we’ve looked at? Find out here!
The Carbide 275R is a compact mid-tower design that still accommodates standard ATX motherboards, large CPU coolers (up to 170 mm tall), and long graphics cards, and it includes a pair of Corsair’s SP120 fans for intake/exhaust. The price tag? $69.99 for the version with an acrylic side, and $79.99 for the version with a tempered glass side panel (as reviewed). Let’s dive in, beginning with a rundown of the basic specs.
Introduction and First Impressions
HyperX announced the Cloud Flight at CES, marking the first wireless headset offering from the gaming division of Kingston. HyperX already enjoys a reputation for solid sound and build quality, so we'll see how that translates into a wireless product that boasts some pretty incredible battery life (up to 30 hours without LED lighting).
The HyperX Cloud Flight features a closed-cup design that looks like a pair of studio headphones, and in addition to the 2.4 GHz wireless connection it offers the option of a 3.5 mm connection, making it compatible with anything that supports traditional wired audio. The lighting effects are understated and adjustable, and the detachable noise-cancelling mic is certified by TeamSpeak and Discord.
The big questions to answer in this review: how does it sound, how comfortable is it, and how well does the wireless mode work? Let's get started!
Introduction and Features
Seasonic’s updated power supply lineup now includes the new PRIME Titanium Fanless model, which can deliver up to 600W. The SSR-600TL is one of the latest members of Seasonic’s popular PRIME series, and this new flagship fanless model boasts 80 Plus Titanium certification for the highest level of efficiency.
Sea Sonic Electronics Co., Ltd has been designing and building PC power supplies since 1981 and they are one of the most highly respected manufacturers in the world. Not only do they market power supplies under their own Seasonic name but they are the OEM for numerous big name brands.
The Seasonic PRIME 600W Titanium Fanless power supply features Seasonic’s Micro Tolerance Load Regulation (MTLR) for ultra-stable voltage regulation and comes with fully modular cables, all Japanese made capacitors, and is backed by a 12-year warranty.
Seasonic PRIME 600W Titanium Fanless PSU Key Features:
• Fanless operation (0 dBA)
• 600W continuous DC output
• Ultra-high efficiency, 80 PLUS Titanium certified
• Micro-Tolerance Load Regulation (MTLR)
• Excellent AC ripple and noise suppression
• Fully modular cabling design
• Multi-GPU technology support
• Conductive polymer aluminum solid capacitors
• High reliability 105°C Japanese made electrolytic capacitors
• Ultra-Ventilation honeycomb structure
• Dual sided PCB layout
• High capacity +12V output
• Active PFC (0.99 PF typical) with Universal AC input
• Protections: OPP, OVP, UVP, SCP, OCP, and OTP
• 12-Year Manufacturer’s warranty
Here is what Seasonic has to say about their new PRIME Fanless PSU:
“The PRIME 600 Titanium Fanless utilizes fanless technology, which eliminates fan noise completely, ensuring truly quiet operation. The unit is not only rated 80 PLUS Titanium efficient, but it also has the highest power output on the fanless power supply market at the moment. The power supply is ideal for any situation that demands silence from the equipment. The high quality components inside and the innovative circuit design result in clean and stable power output. The fully modular cables allow for better cable management in the computer case.
Seasonic employs the most efficient manufacturing methods, uses the best materials, and works with the most reliable suppliers to produce reliable products. The PRIME Series layout, revolutionary manufacturing solutions, and solid design attest to the highest level of ingenuity of Seasonic’s engineers and product developers. Demonstrating confidence in its power supplies, Seasonic stands out in the industry by offering the PRIME Series a generous 12-year manufacturer’s warranty period.”
Compared to manufacturers like Dell, HP, and ASUS, Razer is a relative newcomer to the notebook market, having shipped their first notebook models in 2013. Starting with gaming-focused designs like the Razer Blade and Blade Pro, Razer branched out to a more general notebook audience in 2016 with the launch of the Razer Blade Stealth.
Even though Razer is a primarily gamer-centric brand, the Razer Blade Stealth does not feature a discrete GPU for gaming. Instead, Razer advertises using their Razer Core V2 external Thunderbolt 3 enclosure to add your own full-size GPU, giving users the flexibility of a thin-and-light ultrabook, but with the ability to play games when docked.
Compared to my previous daily driver notebook, the "Space Gray" MacBook Pro, the Razer Blade Stealth shares a lot of industrial design similarities, even down to the "Gunmetal" colorway featured on our review unit. The aluminum unibody construction, large touchpad, hinge design, and more all clearly take inspiration from Apple's notebooks over the years. In fact, I've actually mistaken this notebook for a MacBook Pro in a few quick glances around the office in recent weeks.
As someone who is a fan of the industrial design of the MacBook Pro lineup, but not necessarily Apple's recent hardware choices, these design cues are a good thing. In some ways, the Razer Blade Stealth feels like Apple had continued with their previous Retina MacBook Pro designs instead of moving into the current Touch Bar-sporting iteration.
| Razer Blade Stealth (Early 2018) | |
|---|---|
| Screen | 13.3" QHD+ (3200x1800) IGZO Touch Screen |
| GPU | Intel UHD Graphics 620 |
| RAM | 16GB LPDDR3-2133MHz (non-upgradeable) |
| Storage | 256 GB / 512 GB / 1 TB PCIe |
| Network | Killer™ 1535 Wireless-AC (802.11a/b/g/n/ac + Bluetooth® 4.1) |
| Connectivity | 1 x Thunderbolt 3, 2 x USB 3.0 (Type-A) |
| Audio | Stereo Speakers, Array Microphone |
| Weight | 2.98 lbs. / 1.35 kg |
| Dimensions | 0.54” / 13.8 mm (Height) x 12.6” / 321 mm (Width) x 8.1” / 206 mm (Depth) |
| Operating System | Windows 10 Home |
One of the things that surprised me most when researching the Razer Blade Stealth was just how equipped the base model was. All models include 16 GB of RAM, a QHD+ touch screen, and at least 256 GB of PCIe NVMe flash storage. However, I would have actually liked to see a 1080p screen option, be it with or without touch. For such a small display size, I would rather gain the battery life advantages of the lower resolution.
Overshadowing the Previous Gen
To say that sim racing has had a banner year is perhaps an understatement. We have an amazingly robust ecosystem of titles and hardware that accentuate each other to provide outstanding experiences for those willing to invest. This past year has seen titles such as Project CARS 2, Forza 7, DiRT 4, and F1 2017 released, as well as stalwarts such as iRacing getting major (and consistent) updates. We have also seen the rise of esports in racing titles, most recently with the F1 series and the WRC games. These have become flashy affairs with big sponsors and some significant prizes.
Racing has always had a niche on the PC, but titles such as Forza on Xbox and Gran Turismo on PlayStation have ruled the roost. The joy of PC racing is the huge range of accessories that can be applied to the platform without having to pay expensive licenses to the console makers. Over the past decade we have seen the rise of companies like Thrustmaster and Fanatec, which provide a lot of focus and support to the PC world.
This past year has seen a pretty impressive lineup of new products addressing racing on both PC and console. One of the first big releases is what I will be covering today. It has been a while since Thrustmaster released the TS-PC wheel set, and it has set itself up to be the product to beat in an increasingly competitive marketplace.
So Long, Battery Stress
Wireless peripherals can be stressful. Sure, we all love being free from the tether, but as time goes on, worries about responsiveness linger in the back of the mind like an unwelcome friend. Logitech is here with an impressive answer: the G613 Wireless Mechanical Gaming Keyboard and the G603 Lightspeed Wireless Gaming Mouse. This pair of peripherals promises an astounding 18 months of battery life with performance that’s competitive with their wire-bound cousins. Did Logitech succeed?
G613 Wireless Mechanical Gaming Keyboard
- MSRP: $149.99
- Key Switch: Romer-G
- Durability: 70 million keypresses
- Actuation distance: 0.06 in (1.5 mm)
- Actuation force: 1.6 oz (45 g)
- Total travel distance: 0.12 in (3.0 mm)
- Keycaps: ABS, Pad Printed Legends
- Battery Life: 18 months
- Connectivity: Wireless, Bluetooth
- Dimensions: 18.8 x 8.5 inches
G603 LIGHTSPEED Wireless Gaming Mouse
- MSRP: $69.99 ($59.97 on Amazon as of this writing)
- Sensor: HERO
- Resolution: 200 – 12,000 dpi
- Max. acceleration: tested at >40 G
- Max. speed: tested at >400 IPS
- USB data format: 16 bits/axis
- USB report rate: HI mode: 1000 Hz (1ms), LO mode: 125 Hz (8 ms)
- Bluetooth report rate: 88-133 Hz (7.5-11.25 ms)
- Microprocessor: 32-bit ARM
- Main buttons: 20 million clicks with precision mechanical button tensioning
- Battery life: HI mode: 500 hours (non-stop gaming), LO mode: 18 months (standard usage)
- Weight: 3.14 oz (88.9 g) mouse only, 4.79 oz (135.7 g), with 2 AA batteries
Starting with the G613, we find a full-size keyboard that is both longer and wider than average. This is due to a set of six programmable macro keys (highlighted in blue, G1-G6, assignable in Logitech’s Gaming Software) along the left side. There is also a non-detachable wrist rest along the bottom made of hard plastic.
With its 18.8 x 8.5 inch dimensions, the overall footprint isn’t much larger than a standard full-size keyboard with a wrist rest, but it’s definitely something to consider if you’re space constrained. I appreciate that Logitech included the wrist rest, but with more comfortable padded options out there, it would have been nice to be able to swap it out.
It's clear by now that AMD's latest CPU releases, the Ryzen 3 2200G and the Ryzen 5 2400G, are compelling products. We've already taken a look at them in our initial review, as well as investigated how memory speed affects the graphics performance of the integrated GPU, but it seemed there was something missing.
Recently, it's been painfully clear that GPUs excel at more than just graphics rendering. With the rise of cryptocurrency mining, OpenCL and CUDA performance are as important as ever.
Cryptocurrency mining certainly isn't the only application where having a powerful GPU can help system performance. We set out to see how much of an advantage the Radeon Vega 11 graphics in the Ryzen 5 2400G provided over the significantly less powerful UHD 630 graphics in the Intel i5-8400.
| Test System Setup | |
|---|---|
| CPU | AMD Ryzen 5 2400G / Intel Core i5-8400 |
| Motherboard | Gigabyte AB350N-Gaming WiFi / ASUS STRIX Z370-E Gaming |
| Memory | 2 x 8GB G.SKILL FlareX DDR4-3200 (all memory running at 3200 MHz) |
| Storage | Corsair Neutron XTi 480 SSD |
| Graphics Card | AMD Radeon Vega 11 Graphics / Intel UHD 630 Graphics |
| Graphics Drivers | AMD 17.40.3701 |
| Power Supply | Corsair RM1000x |
| Operating System | Windows 10 Pro x64 RS3 |
Before we take a look at some real-world examples of where a powerful GPU can be utilized, let's look at the relative power of the Vega 11 graphics on the Ryzen 5 2400G compared to the UHD 630 graphics on the Intel i5-8400.
SiSoft Sandra is a suite of benchmarks covering a wide array of system hardware and functionality, including an extensive range of GPGPU tests, which we are looking at today.
Comparing the raw shader performance of the Ryzen 5 2400G and the Intel i5-8400 provides a clear snapshot of what we are dealing with. In every precision category, the Vega 11 graphics in the AMD part are significantly more powerful than the Intel UHD 630 graphics. This all combines to provide a 175% increase in aggregate shader performance over Intel for the AMD part.
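For clarity on how that figure reads (using hypothetical scores here, not Sandra's actual numbers), a "175% increase" means the AMD part's aggregate score is roughly 2.75x the Intel baseline:

```python
# Relating "percent increase" to a performance ratio (illustrative numbers).
def percent_increase(new, old):
    return (new - old) / old * 100

intel = 100.0           # hypothetical baseline aggregate shader score
amd = intel * 2.75      # a 175% increase means ~2.75x the baseline

print(percent_increase(amd, intel))  # 175.0
```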
Now that we've taken a look at the theoretical power of these GPUs, let's see how they perform in real-world applications.
Delivering on the Promise of Thunderbolt 3
Despite the greatly increased adoption of Thunderbolt 3 over the previous two Thunderbolt standards, the market is still lacking devices that take advantage of the full 40 Gbps of bandwidth that Thunderbolt 3 offers.
External storage seems like a natural use of the PCIe 3.0 x4 interface available with the Thunderbolt 3 standard, but storage devices that take advantage of it are few and far between. Most of the devices currently on the market are merely bridges from SATA M.2 drives to Thunderbolt 3, which are limited by the SATA 6Gb/s interface.
However, this market gap seems poised to change. Today, we are taking a look at the TEKQ Rapide Thunderbolt 3 Portable SSD, which advertises sequential transfer speeds up to 2.3 GB/s Read and 1.3 GB/s Write.
Introduction and Features
BitFenix is a relatively new player in the PC power supply market. Formed in 2014 and based in New Taipei, Taiwan, BitFenix started their PC hardware business with a focus on power supplies, cases, lighting accessories, and LED fans targeted towards enthusiasts, gamers and modders. They currently have four different power supply lines: the Formula Gold, BitFenix BPA, Whisper M, and the Fury. The Whisper M series includes five models ranging from 450W up to 850W. We will be taking a detailed look at the BitFenix Whisper M 850W power supply in this review.
The BitFenix Whisper M Series power supplies are all modular and certified to comply with the 80 Plus Gold standard for high efficiency. They feature all Japanese made capacitors and a 135mm fluid dynamic bearing (FDB) fan, and come backed by a 7-year warranty.
BitFenix Whisper M 850W PSU Key Features:
• 850W Continuous DC output at up to 50°C
• 80 PLUS Gold certified for high efficiency
• Fully-modular cables
• Dedicated quad +12V rails
• 135mm Cooling fan with FDB
• Intelligent Fan Control for quiet operation
• Japanese made capacitors
• Complies with Intel ATX12V v2.4 and Haswell C6/C1
• Active Power Factor correction with Universal AC input (100 to 240 VAC)
• Safety protections: OCP, OVP, UVP, OPP, SCP, OTP, and SIP
• No Load Operation (NLO)
• 2013 ErP Lot 6 ready
• 7-Year warranty
• MSRP: $119.99 USD
A Different Kind of Productivity Mouse
Logitech has been a major player in the world of computer mice for years. In fact, if you’re reading this, there’s a good chance you’ve used one yourself. Never one to rest on their laurels, one of Logitech’s latest entries, the MX Master 2S, puts creatives and professionals square in its sights and aims to change the way you compute.
Through a suite of interesting control features, a high precision Darkfield sensor, three-system connectivity, and the unique functionality afforded by Logitech’s Options software, the MX Master 2S is more than a little interesting. Read on to see exactly what this mouse has to offer.
- MSRP: $99.99
- Connectivity: Wireless Receiver, Bluetooth
- Sensor technology: Darkfield high precision
- Nominal value: 1000 dpi
- DPI: 200 to 4000 dpi (can be set in increments of 50 dpi)
- Battery life: up to 70 days on a single full charge
- Battery: rechargeable Li-Po (500 mAh) battery
- Number of buttons: 7
- Gesture button: Yes
- Scroll Wheel: Yes, with auto-shift
- Standard and Special buttons: Back/Forward and middle click
- Wireless operating distance: 10m
- Wireless technology: Advanced 2.4 GHz wireless technology
- Optional software: Logitech Options and Logitech Flow
- Dimensions (HxWxD): 3.4 in (85.7 mm) x 5.0 in (126.0 mm) x 2.0 in (48.4 mm)
- Weight: 5.1oz (145g)
- Warranty: 1-year
The MX Master 2S features retail packaging common to Logitech mice, with the book-like cover and inner display bubble. Behind the tray holding the mouse, you’ll also find the micro-USB cable for charging and some brief documentation. Here you’ll also find the features spotlighted, and Logitech makes a special point of showcasing the software suite. Right away, it’s clear how important the software package is to the 2S.
Taking the mouse out of its packaging, the first thing you’ll notice is how large it is. The main body is wide and suited to palm and claw grips. The left side also features a textured wing for a thumb rest and access to the gesture button. The mouse is heavier than many, coming in at 145g, so gamers will want to take note: it’s not the best for rapid-response gaming. For productivity and creative work, I found this weight to be a good compromise, accommodating an expansive battery without negatively impacting the smooth glide of its Teflon feet.
Memory speed is not a factor that the average gamer thinks about when building their PC. For the most part, memory performance hasn't had much of an effect on modern processors running high-speed memory such as DDR3 and DDR4.
With the launch of AMD's Ryzen processors last year, a platform emerged that was more sensitive to memory speeds. By running Ryzen processors with higher frequency and lower latency memory, users should see significant performance improvements, especially in 1080p gaming scenarios.
However, the Ryzen processors are not the only ones to exhibit this behavior.
Gaming on integrated GPUs is a perfect example of a memory-starved situation. Take, for instance, the new AMD Ryzen 5 2400G and its Vega-based GPU cores. In a full Vega 56 or 64, these Vega cores utilize blazingly fast HBM2 memory. However, due to constraints such as die space and cost, this processor does not integrate HBM.
Instead, both the CPU and graphics portions of the APU depend on the same pool of DDR4 system memory. DDR4 is significantly slower than memory traditionally found on graphics cards, such as GDDR5 or HBM. As a result, APU performance is usually memory limited to some extent.
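For a rough sense of the gap, here is the peak-bandwidth arithmetic (using typical published transfer rates and bus widths as illustrative inputs, not measurements from this system):

```python
# Peak memory bandwidth = transfer rate x bus width x channels (illustrative).
def bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits, channels=1):
    """Peak bandwidth in GB/s from MT/s, bus width in bits, and channel count."""
    return transfer_rate_mt_s * (bus_width_bits / 8) * channels / 1000

# Dual-channel DDR4-3200 (64-bit per channel), as used in this review:
print(bandwidth_gb_s(3200, 64, channels=2))  # 51.2 GB/s

# A typical entry-level GDDR5 card (7 Gbps on a 128-bit bus, for comparison):
print(bandwidth_gb_s(7000, 128))             # 112.0 GB/s
```

Even against modest discrete cards, the APU's shared DDR4 pool has roughly half the bandwidth, and it must feed the CPU cores at the same time.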
In the past, we've done memory speed testing with AMD's older APUs; however, with the launch of the new Ryzen- and Vega-based R3 2200G and R5 2400G, we decided to take another look at this topic.
For our testing, we are running the Ryzen 5 2400G at three different memory speeds: 2400 MHz, 2933 MHz, and 3200 MHz. While the maximum supported JEDEC memory speed for the R5 2400G is 2933 MHz, the memory AMD provided for our processor review supports overclocking to 3200 MHz just fine.
Introduction and Technical Specifications
The newly released Vue Coolant from Primochill is one of the company's most unique products to date. Vue appears as a solid color at rest and "fractalizes" when in motion, creating random visualizations in your water loop under power. The coolant is water based and tested by Primochill as safe for same-metal, copper-based loops, as well as safe to use with acrylic components. Currently, Primochill offers Vue coolant in a large variety of colors, all with an MSRP of $24.95 for a 1-quart container.
Courtesy of Primochill
One of the coolest properties of Primochill's Vue coolant is how it fractalizes when in motion. The fluid shown is at rest on the left and in motion on the right; notice how random patterns emerge in the right bottle, forming unique visuals. The visualizations are caused by particles held in suspension in the coolant itself.
Courtesy of Primochill
Primochill currently offers Vue coolant in the 18 colors shown, ranging from dark red (Crimson) to yellow, with even more colors promised in the future. With this broad color palette, they have ensured that Vue coolant will fit into almost any system build imaginable.
Courtesy of Primochill
To prep an existing system and cooling loop for use with Vue coolant, Primochill developed System Reboot. You simply add the bottle's contents to a gallon of distilled water and run the mixture through your loop for up to 24 hours. It is designed to remove any residue or staining from your system components, leaving them like new. Primochill created this cleaning agent specifically for Vue system preparation because of the coolant's fragile nature when exposed to other coolant compounds.
Cherry is one of the most well-known brands in the mechanical keyboard industry. The company, based in Germany, is best known for their MX key switches, which have become the gold standard in the premium keyboard market. As a result of their high standards, tight quality control, and even the occasional scarcity, “genuine Cherry key switches” has become a veritable marketing point on more than a few feature lists.
Since they make their own switches, it should come as no surprise that Cherry also produces their own keyboards. Today, we’re looking at the G80-3494, a new entry in the G80-3000 line and one of the few keyboards in the United States to feature Cherry MX Silent Black key switches. Do their full-fledged boards live up to the lofty standards of their switches?
- MSRP: $149.99 (currently sale price: $111.56)
- Layout: ANSI, 104-key
- Key Switch: Cherry MX Silent Black (linear)
- Key Lifespan: 50 million keystrokes
- Actuation Force: 60cN
- N-Key Rollover: 14-key simultaneous
- Cable: 1.75m, non-detachable, PVC coated
- Dimensions: 470 x 195 x 44 mm
- Weight: 935g
Addressing New Markets
Machine learning is one of the hot topics in technology, and certainly one that is growing at a very fast rate. Applications such as facial recognition and self-driving cars are driving much of the development in this area. So far we have seen CPUs and GPUs used in ML applications, but in most cases these are not the most efficient options for such highly parallel but computationally simple workloads. New chips far more focused on machine learning have been introduced, and now it seems that ARM is throwing their hat into the ring.
ARM is introducing three products under the Project Trillium brand: an ML processor, an OD (object detection) processor, and an ARM-developed neural network software stack. The project came as a surprise to most of us, but in hindsight it is a logical avenue for the company to pursue, as it will be incredibly important moving forward. Currently, many applications that require machine learning are not processed at the edge, that is, on the device in the consumer's hand or right next to them. Workloads may be requested from the edge, but most of the heavy-duty processing occurs in datacenters located all around the world. This requires communication, and sometimes pretty hefty levels of bandwidth; if either of those things is missing, applications requiring ML break down.
Introduction and First Impressions
NZXT has proven to be willing to adapt and innovate in the competitive DIY PC space, introducing their own software control suite (CAM) to control cooling and lighting effects in 2014, and launching their first motherboard this year. We have seen CAM in action with products like the Kraken AiO liquid CPU coolers, which require the software to fully unlock their potential, both thermally and visually (RGB) speaking, and it's an integral part of the new H700i enclosure.
“The H700i showcases NZXT’s vision for modern PC building. This premium mid-tower case features a unique CAM Powered Smart Device that digitally drives RGB lighting and fan performance. You can effortlessly control RGB lighting and fans, while Adaptive Noise Reduction optimizes your build’s acoustics through machine learning and ideal fan settings. Includes four integrated Aer F fans and two RGB LED to enhance the aesthetics of your build as seen through the H700i’s stunning tempered glass panel.”
Now that NZXT has brought that CAM software feature-set to enclosures beginning with the H700i mid-tower we have for you today, we will pay close attention to the way the integrated "Smart Device" - a module that controls fans and lighting - fits into the usual thermal/noise equation. OEM systems from the likes of Dell with their Alienware desktops have used similar dedicated hardware for cooling and lighting control, and it's interesting to see this enter the DIY space. How important is software control of cooling and RGB effects to you? That depends, of course, and partly on how easy it is to use.
We will take a close look in and around this new enclosure, and while it’s on the test bench we will see how the stylish H700i stacks up with thermal and noise results vs. some other recent cases - and test the H700i both with and without CAM software optimization to see what sort of difference it makes in practice. Let’s get started!
Raven Ridge Desktop
As we approach the one-year anniversary of the release of the Ryzen family of processors, the full breadth of the releases AMD put forth inside of 12 months is more apparent than ever. Though I feel like I have written summations of 2017 for AMD numerous times, it still feels like an impressive accomplishment as I reflect on it for today’s review. Starting with the Ryzen 7 family of processors targeting enthusiasts, AMD iterated through Ryzen 5, Ryzen 3, Ryzen Threadripper, Ryzen Pro, EPYC, and Ryzen Mobile.
Today, though they are labeled as 2000-series parts, we are completing what most would consider the first full round of the Ryzen family. As the first consumer desktop APUs (AMD’s term for a processor with tightly integrated on-die graphics), the Ryzen 5 2400G and the Ryzen 3 2200G look very much like the Ryzen parts before them, and like the Ryzen Mobile APUs that we previously looked at in notebook form. In fact, from an architectural standpoint, these are the same designs.
Before diving into the hardware specifications and details, I think it is worth discussing the opportunity that AMD has with the Ryzen with Vega graphics desktop part. By most estimates, more than 30% of the desktop PCs sold around the world ship without a discrete graphics card installed. This means they depend on the integrated graphics of the processor to handle general compute and any and all gaming that might happen locally. Until today, AMD has been unable to address that market with its current family of Ryzen processors, as they require discrete graphics solutions.
While most of our readers fall into the camp of not just using a discrete solution but requiring one for gaming purposes, there are a lot of locales and situations where the Ryzen APU is going to provide more than enough graphics horsepower. The emerging markets in China and India, for example, are regularly using low-power systems with integrated graphics, often based on Intel HD Graphics or previous generation AMD solutions. These gamers and consumers will see dramatic increases in performance with the Zen + Vega solution that today’s processor releases utilize.
Let’s not forget about secondary systems, small form factor designs, and PCs designed for the entertainment center as possible outlets for Ryzen APUs, even for the most hardcore of enthusiasts. Mom or Dad need a new PC for basic tasks on a budget? Again, AMD is hoping to make a case today for those sales.