Not Just a Better Camera
Samsung’s updated Galaxy phones are available now, and while the external design, beautiful as it is, looks the same as last year's, the Galaxy S9 and S9+ feature faster internals and an improved camera system. Is it worth an upgrade over the Galaxy S8? How does this new flagship from Samsung compare to Apple’s more expensive iPhone X? Read on to find out!
During the Galaxy S9 unveiling at Samsung’s “Unpacked” event, much was made of the GS9’s camera, and particularly its video recording capability, which features an ultra slow-motion mode. While the camera is a vital part of the experience, and can make or break a handset for many people, it is the application processor that constitutes the bigger upgrade from last year’s Galaxy S8 phones.
In the USA, Samsung is using Qualcomm’s new Snapdragon 845, while many of the international versions of the phone use Samsung’s own Exynos SoC. We took an early look at performance with the Snapdragon 845 during Qualcomm’s recent media day, and now with shipping hardware and far more time for benchmarking we can really put this new mobile platform to the test. You can take or leave synthetic benchmark results, of course; I can offer my own subjective impressions of overall responsiveness, which is as much a test of software optimization as hardware.
| Samsung Galaxy S9+ Specifications (US Version) | |
|---|---|
| Display | 6.2-inch 1440x2960 AMOLED |
| SoC | Qualcomm Snapdragon 845 (SDM845) |
| CPU Cores | 8x Kryo 385 up to 2.8 GHz |
| GPU Cores | Adreno 630 |
| RAM | 6 GB LPDDR4X |
| Storage | 64 / 128 / 256 GB |
| Network | Snapdragon X20 LTE; Bluetooth 5.0 (A2DP, aptX); USB 3.1 (Type-C) |
| Battery | 3500 mAh Li-Ion |
| Dimensions | 158.1 x 73.8 x 8.5 mm, 189 g |
Samsung has opted to bring back the same industrial design introduced with last year’s Galaxy S8/S8+, but this was already a class-leading design so that is not a bad thing.
The recent launch of the high-powered Hades Canyon NUC doesn't mean the traditional NUC form factor is dead; quite the opposite, in fact. Intel continues to iterate on the core 4-in x 4-in NUC design, adding new features and updating to current Intel processor families.
Today, we are taking a look at one of the newest iterations of desktop NUC, the NUC7i7DNHE, also known as the Dawson Canyon platform.
While this specific NUC is segmented more towards business and industrial applications, we think it has a few tricks up its sleeve that end users will appreciate.
| Processor | Intel Core i7-8650U (Kaby Lake Refresh) |
|---|---|
| Graphics | Intel UHD 620 Integrated |
| Memory | 2 x DDR4 SODIMM slots |
| Storage | M.2 SATA/PCIe drive slot; 2.5" drive slot |
| Wireless | Intel Wireless-AC 8265 vPro |
| Ports | 2 x HDMI 2.0a; 4 x USB 3.0 |
| Price | $595 - SimplyNUC |
Introduction and Technical Specifications
Courtesy of ECS
The ECS Z370 Lightsaber motherboard is the latest offering in ECS' L337 product line, offering support for the Intel Z370 chipset. Similar to previous iterations of the Lightsaber board, the Z370 Lightsaber builds on those designs by adding dual M.2 slot support, an enhanced power delivery design, and support for the latest Intel Coffee Lake-based processors. With an MSRP of $199, ECS priced the Z370 Lightsaber to be competitive with other mid-tier Z370-based offerings.
Courtesy of ECS
ECS designed the Z370 Lightsaber with a 14-phase digital power delivery system, using high efficiency chokes and MOSFETs, as well as solid core capacitors for optimal board performance. ECS built the following features into the Z370 Lightsaber board: six SATA 3 ports; two PCIe x2 M.2 ports; a Rivet Networks Killer E2500 GigE NIC; three PCI-Express x16 slots; three PCI-Express x1 slots; a 3-digit diagnostic LED display; on-board power, reset, quick overclock, BIOS set, BIOS update, BIOS backup, and clear CMOS buttons; a dual BIOS switch; a Realtek audio solution; integrated DVI and HDMI video ports; and USB 2.0, 3.0, and 3.1 Gen2 port support.
Courtesy of ECS
For the integrated audio solution, ECS used a Realtek chipset on a separate PCB to minimize audio crosstalk and interference. They also include a removable audio amplifier chip and high-end Nichicon audio capacitors for a superior audio experience.
Introduction, Specifications, and Packaging
A while back, we reviewed the ICY DOCK ToughArmor MB998SP-B and MB993SK-B hot-swap SATA docks. These were well built, high-density docks meant for 7mm height SSDs and HDDs. The former part was unique in that it let you squeeze eight drives in a single 5.25” drive bay, all while enabling you to hot swap all of them at the front panel. The ToughArmor line has been pushing into higher and higher bay counts, so it only made sense that we eventually saw something higher than an 8-bay unit:
Enter the ToughArmor MB516SP-B. While it looks like two MB998SP-B’s stacked on top of each other, there is more than meets the eye in order to pull this trick off properly. We'll focus on that further into the review, but for now, let us get through the specs.
Introduction and Design
Azulle might not be a familiar name unless you have been browsing for mini PCs lately, as the company offers various small form-factor computers and accessories on Amazon.
Today we will take a close look at their Intel Apollo Lake-powered Byte3 mini PC which starts at $179.99 (and goes up to $337.99 depending on configuration), and provides another fanless solution to this category. Does our $199.99 quad-core version, which includes Windows 10 Pro, stand out? Read on to find out!
- Processor: Quad-core Intel Apollo Lake N3450
- RAM: 4 GB / 8 GB
- Storage: eMMC 32 GB / 2.5" SSD or M.2 SSD Supported
- M.2 Slot: AHCI (SATA)
- GPU: Intel HD Graphics 500
- Wi-Fi: Dual-Band 2.4 GHz / 5.0 GHz
- Ethernet: 1 Gigabit
- Bluetooth: 4.0
- Display Output: 1x HDMI (4K @60Hz), 1x VGA
- USB Ports: 3x USB 3.0 / 1x USB 2.0 / 1x USB Type-C
- SD Slot: Up to 256 GB
- BIOS: Wake on LAN / PXE / BIOS Reset
- IR: IR Control
- Audio Output: 3.5 mm jack
- OS Support: Windows 10 Pro / Ubuntu Linux
- Power Supply: 12V
- Dimensions: 5.6 x 4 x 1.5 inches
Thanks to Azulle for providing the Byte3 for our review!
- Azulle Byte3 Mini PC (N3450/4GB/32GB/Win 10 Pro): $199.99 - Amazon.com
We'll start with a quick look inside the box:
The Byte3 has a small, rectangular form factor, measuring some 5.6 inches wide and 4 inches deep, with a height of 1.5 inches.
Takes a lickin' and keeps on clickin'
Over the years, Corsair has developed a name for itself as one of the premier manufacturers of mechanical gaming keyboards on the market. One could argue that their K70 series is the leading inspiration for gaming keyboard design today. Their dominance isn’t just limited to physical design, however. RGB illumination and powerful software programming have also defined their keyboards and set them apart from the competition.
Today, we’re looking at a newer entry in the mechanical keys catalog with the K68 RGB. The K68 is more of a budget entry, but still packs a suite of premium features to please gaming fans. It’s also water and dust resistant with an IP32 rating, and we put that claim to the test. Without further ado, let’s take a close look.
Specifications and Design
- MSRP: $119.99
- Keyboard Size: Standard
- Key Switches: Cherry MX Red
- Keyboard Backlighting: RGB
- Switch Lifespan: 50-million actuations
- Report Rate: 1000Hz
- Matrix: Full Key (NKRO), 100% anti-ghosting
- Water/Dust Resistance: IP32
- Media Keys: Dedicated
- Wrist Rest: Yes
- Cable Type: Tangle-free rubber
- WIN Lock: Yes
- Software: CUE Enabled
- Dimensions: 455mm x 170mm x 39mm
- Weight: 1.41kg
- Warranty: Two years
The K68 arrives in standard Corsair keyboard packaging. The box is rich with feature-highlights and definitely plays up the RGB illumination. This is 2018 and a Corsair product, so that should come as no surprise.
Everything is well packed inside the box. The keyboard ships with the usual plastic dust sleeves on the keyboard itself, the cable, and the plastic wrist rest. We also get a pair of small documentation inserts that describe the warranty and unlabeled hotkeys.
Taking it out of the box, we get the first indicators of how Corsair managed to cut $50 off the price of the K70 RGB. The cable, rather than coming braided, is standard rubber. Likewise, the included wrist rest is a lighter-weight plastic, felt especially in the more flexible arms attaching it to the keyboard’s body. Neither of these is a bad thing, especially when many gaming keyboards don’t include a wrist rest at all.
Introduction and Features
Enermax is an established player in the PC peripherals market with a full line of power supplies for enthusiasts, gamers and professionals alike. The Platimax DF Series currently includes five models: 1200W, 1050W, 850W, 600W and 500W. They all feature Platinum level efficiency and come housed in a relatively compact chassis, measuring only 160mm deep. The Platimax DF Series includes Enermax’s latest D.F. (Dust Free Rotation™) fan technology, which works by briefly spinning the fan in the opposite direction to keep dust from building up on the fan blades.
All of the Platimax DF Series power supplies incorporate fully modular cables, Japanese-made 105°C electrolytic capacitors, and a 140mm Twister-bearing fan with semi-fanless operation, and they come backed by either a 10-year (1200W, 1050W, and 850W) or 5-year (600W and 500W) warranty. We will be taking a detailed look at the Platimax DF 850W power supply in this review.
Enermax Platimax DF 850W PSU Key Features:
• 850W Continuous DC output at up to 50°C
• 80 PLUS® Platinum certified for high efficiency
• Enermax patented Dust Free Rotation™ fan technology
• Multi GPU support with six PCI-E 6+2 pin connectors
• Fully-modular cables
• Dedicated quad +12V rails (70A/840W combined)
• 140mm Twister-bearing fan (160,000-hour MTBF)
• Semi-fanless operation (below 30-40% load)
• Japanese-made 105°C electrolytic capacitors
• Active Power Factor correction with Universal AC input (100 to 240 VAC)
• Safety protections: OCP, OVP, UVP, OPP, SCP, OTP, and SIP
• Zero Load Operation ready
• 2013 ErP Lot 6 ready
• 10-Year warranty
• MSRP: $249.99 USD
Introduction, Specifications and Packaging
While Western Digital has a huge history with spinning disks, their experience with SSDs has been touch and go. They expanded further into the HDD arena with their protracted merger with HGST, but they have only really dabbled in the solid-state arena. Their earliest attempt was with the Black2 back in 2013, which was a novel concept that never really caught mainstream fame. WD acquired SanDisk a few years back, but SanDisk was better known for SD cards and OEM SATA SSDs. More recently we began seeing WD test the waters with PCIe / NVMe parts, with a WD Black and Blue launching at CES 2017. Those were 'ok', but were more of a budget SSD than a powerhouse class-leading product worthy of the Black moniker. Today we see WD take another stab at a WD Black NVMe SSD:
Enter the WD Black NVMe and SanDisk Extreme PRO M.2 NVMe 3D 1TB SSDs. Yes, I know the names are a mouthful, but I would be more worried about the potential for confusion when looking for a WD Black SSD on the market (as there are now two *very* similarly named products). Technically the new part is the 'Western Digital WD Black NVMe SSD'. Yes I know don't tell me - they said Western Digital twice.
We will also be reviewing the SanDisk Extreme PRO M.2 NVMe 3D SSD today. I'm including those results as well, but just as they did with their previous SATA SSD release, these are identical parts with different packaging and labeling. The specs are the same. Heck, the firmware is the same minus the bits that report the device name to the host. For the sake of simplicity, and the fact that the WD part is meant for retail/gamers (SanDisk for creative pros and OEMs), I'll stick with referring mostly to the WD side throughout this review.
Strong specs here. Fast sequentials, but random IOPS is rated at QD32 across 8 threads (QD=256), which is, well, just silly. I know WD is doing this because 'everyone is doing it', and they have to compete, but I have a feeling we will also be seeing very good low QD performance today.
It doesn't get much more no frills than this.
The tides are turning. Over the last few years, the technology industry sang with praise and predictions for virtual reality. Over the past year, however, the mood has begun to shift. While VR remains prohibitively expensive and still wanting in the kind of experiences gamers crave, augmented reality is becoming the head-mounted hope for mainstream adoption.
Today, we’re taking a look at one of the first major consumer AR products with Lenovo Star Wars: Jedi Challenges. The set marries exciting technology with exciting IP, but is it enough to justify the $199 MSRP?
MSRP: $199.99 ($169.99 on Amazon as of this writing)
Lightsaber Controller
- Dimensions: 315.5mm x 47.2mm
- Weight: 275g
- Buttons: Power, Activation Matrix, Control Button
- Battery: Micro-USB Rechargeable
Lenovo Mirage AR Headset
- Dimensions: 209.2mm x 83.4mm x 154.8mm
- Weight: 477g
- Buttons: Select, Cancel, Menu
- Camera: Dual motion tracking cameras
- Battery: Micro-USB Rechargeable
Tracking Beacon
- Dimensions: 94.1mm x 76.7mm
- Weight: 117g
- Buttons: Power/color switch
- AA batteries (x2) required
- Connection: Bluetooth connection to phone
- Languages: English, German, Japanese, French, Spanish
The set comes in a large box that doubles as a storage container when the headset isn’t in use. Everything is nicely packaged, especially the lightsaber, which rests in a nice foam cut-out just under the top half of the box. The unboxing experience is befittingly premium for a product such as this.
The attention to detail on the lightsaber is impressive. It’s a loving recreation of Luke’s lightsaber from A New Hope. The top illuminates white or blue to indicate when it’s paired with your phone. In-game, pressing the side buttons causes the blade to rise up with the iconic sound effect; if you’re a Star Wars fan, it’s beyond neat.
Introduction and Motherboard Layout
For the launch of the Intel H370 chipset motherboards, GIGABYTE chose their AORUS brand to lead the charge. The AORUS branding differentiates the enthusiast and gamer friendly products from other GIGABYTE product lines, similar to how ASUS uses the ROG branding to differentiate their high performance product line. The H370 AORUS Gaming 3 WIFI is among GIGABYTE's initial release boards offering support for the latest Intel consumer chipset and processor lines. Built around the Intel H370 chipset, the board supports the Intel LGA1151 Coffee Lake processor line and Dual Channel DDR4 memory running at up to 2667MHz speeds. The H370 AORUS Gaming 3 WIFI can be found in retail with an MSRP of $139.99.
The H370 AORUS Gaming 3 WIFI motherboard features a black PCB with black and chrome colored heat sinks covering all the necessary board components. The AORUS series logos are emblazoned on the chipset heat sink and the rear panel cover. Further, a large rendering of the logo is silk-screened in the upper left quadrant of the board. The ATX form factor provides more than enough surface area to house the integrated features, as well as giving the board compatibility with most available consumer enclosures.
The board's back is completely free of components, posing no problems with case mounting or mounting the CPU backplate.
GIGABYTE designed the H370 AORUS Gaming 3 WIFI motherboard with a 10-phase digital power system in an 8+2 configuration. The CPU VRMs are passively cooled by dual aluminum heat sinks above and to the upper right of the CPU socket.
Introduction and Case Exterior
The Meshify C - TG from Fractal Design is a high-airflow ATX case design with some added style from its unique angled front panel. Throw in a tempered glass side panel and a pair of pre-installed Dynamic X2 GP-12 120 mm fans and the $89.99 price tag looks pretty good - but how did it perform? We'll find out.
Having reviewed a few Fractal Design cases in the past three years I have come to expect a few things from their enclosures: solid construction, intelligent internal layouts, and excellent cable management. As to style, their cases are generally understated, and the Meshify's black color scheme with a tinted glass side certainly fits the bill - though the angled front mesh design catches the light and does add some visual interest.
More than a single enclosure, Meshify is now a dedicated line from Fractal Design, with a new Meshify C Mini for mATX/mITX motherboards, as well as variants of this Meshify C including a model with a solid side panel (the standard Meshify C) and one with dark-tinted glass (Meshify C - Dark TG). Regardless of which model you might be considering, they share a common design focused on high airflow (with a full complement of filters), flexible storage options, and maximizing component space within their compact dimensions.
OK, call me crazy (you wouldn’t be the first) but this is something I’ve wanted to try for years, and I bet I’m not the only one. Each time a new power supply comes across the lab bench with ever increasing output capacities, I find myself thinking, “I could weld with this beast.” Well the AX1600i pushed me over the edge and I decided to give it a go; what could possibly go wrong?
133.3 Amps on the +12V outputs!
The Corsair AX1600i Digital power supply can deliver up to 133 Amps on the combined +12V rails, more than enough amperage for welding. There are dozens of PC power supplies on the market today that can deliver 100 Amps or more on the +12V output, but the AX1600i has another feature that might help make this project a success: the ability to manually set current limits on the +12V outputs. Because the AX1600i is a digital power supply whose +12V current limits can be set via the Corsair Link data acquisition and control software, I might be able to select a desired amperage to weld with. Yes!
Just because the AX1600i “can” deliver 133A doesn’t mean I want that much current available for welding. I typically only use that much power when I’m welding heavy steel pieces using ¼” rod. For this experiment I would like to be able to start out at a lot lower amperage, and I’m hoping the Corsair Link software will provide that ability.
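As a sanity check on the sort of current limits I'm hoping for, here's a quick sketch of how much DC power the supply would deliver at a few candidate limits. The specific limit values are just examples of my own; the 12 V rail voltage and 133.3 A maximum come from the AX1600i's spec.

```python
# Power available for welding at various +12V current limits (P = V * I).
RAIL_VOLTAGE = 12.0  # volts, the ATX +12V rail
MAX_CURRENT = 133.3  # amps, the AX1600i's combined +12V rating

def watts_at_limit(amps: float) -> float:
    """DC power delivered at a given current limit, capped at the PSU maximum."""
    return RAIL_VOLTAGE * min(amps, MAX_CURRENT)

for limit in (30, 60, 100, 133.3):  # example limits, in amps
    print(f"{limit:>6.1f} A -> {watts_at_limit(limit):.0f} W")
```

Even a modest 30 A limit is 360 W of DC power at the electrode, which is why starting low seems prudent.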
Stick Welding with a PC Power Supply!
My first thought was to try to adapt a TIG (Tungsten Inert Gas) welder for use with the AX1600i. I figured using a TIG torch (Tungsten electrode shrouded with Argon gas instead of a flux coated rod) might give better control, especially at the lower voltage and currents where I plan to start testing. TIG welders are commonly used to weld small stainless steel parts and sheet metal. But then I remembered the TIG welder power supply has a high voltage pulse built-in to initiate the plasma arc. Without that extra kick-start, it might be difficult to strike an arc without damaging the fine pointed tip of the Tungsten electrode. So I decided to just go with a conventional stick welding setup. The fact that PC power supplies put out DC is an advantage over the more common AC buzz-box arc welders, offering better arc stability and higher quality welds.
Obviously, trying to convert a PC power supply into an arc welding power supply will require a few modifications. Here is a quick list of the main challenges I think we will have to overcome.
• Higher capacity fan for better cooling
• Terminate all the PSU’s +12V cables into welding leads
• Disable the Short Circuit protection feature
• Implement selecting the desired current output
• Strike and maintain a stable arc with only 12 volts
Bloody Gaming is no newcomer to the world of PC gaming peripherals. As a subsidiary of A4Tech, they’re one of the few peripheral manufacturers to own their own assembly lines. Controlling their own manufacturing allows them to take risks and attempt new approaches the competition may not. Coming from a rich heritage of innovation at A4Tech, it comes as no surprise that Bloody has consistently sought to push the boundaries of the technology we use to game.
At the same time, the brand has taken a uniquely aggressive approach from name to design. Today, we’re looking at the company’s next-generation keyboard, the B975. With this release, we find a more restrained design coupled with the freshly redesigned Light Strike 3 optical switches and full RGB backlighting.
But is it enough for Bloody to challenge the heavy hitters like Logitech, Razer, and Corsair? Let’s find out.
Announced at Intel's Developer Forum in 2012, and launched later that year, the Next Unit of Computing (NUC) project was initially a bit confusing to the enthusiast PC press. In a market that appeared to be discarding traditional desktops in favor of notebooks, it seemed a bit odd to launch a product that still depended on a monitor, mouse, and keyboard, yet didn't provide any more computing power.
Despite this criticism, the NUC lineup has rapidly expanded over the years, seeing success in areas such as digital signage and enterprise environments. However, the enthusiast PC market has mostly eluded the lure of the NUC.
Intel's Skylake-based Skull Canyon NUC was the company's first attempt to cater to the enthusiast market, straying slightly from the traditional 4-in x 4-in form factor and adopting their best-ever integrated graphics solution in Iris Pro. Additionally, the ability to connect external GPUs via Thunderbolt 3 meant Skull Canyon offered more of a focus on high-end PC graphics.
However, Skull Canyon mostly fell on deaf ears among hardcore PC users, and it seemed that Intel lacked the proper solution to make a "gaming-focused" NUC device—until now.
Announced at CES 2018, the lengthily named 8th Gen Intel® Core™ processors With Radeon™ RX Vega M Graphics (henceforth referred to as the code name, Kaby Lake-G) marks a new direction for Intel. By partnering with one of the leaders in high-end PC graphics, AMD, Intel can now pair their processors with graphics capable of playing modern games at high resolutions and frame rates.
The first product to launch using the new Kaby Lake-G family of processors is Intel's own NUC, the NUC8i7HVK (Hades Canyon). Will the marriage of Intel and AMD finally provide a NUC capable of at least moderate gaming? Let's dig a bit deeper and find out.
Introduction and Technical Specifications
Courtesy of Noctua
Noctua is a well respected manufacturer in the highly competitive CPU cooler space, offering products optimized for high efficiency and low noise. Their latest release for AMD Ryzen processors offers good stock performance at minimal noise levels. The cooler's minimalistic dimensions also ensure broad compatibility with AM4-based systems. Unlike other members of the Noctua cooler line, the L9a-AM4 uses a proprietary mounting system, not the standard SecuFirm2™ mounting mechanism. With an MSRP of $39.99, the NH-L9a-AM4 comes at a premium price for its performance goals.
Courtesy of Noctua
Courtesy of Noctua
The NH-L9a-AM4 CPU cooler is a single-radiator cooler placed in a horizontal orientation with a single included fan. The radiator's horizontal orientation gives the cooler a lower height in comparison to a cooler with a traditional vertical radiator while maintaining equivalent cooling performance. In typical Noctua fashion, the NH-L9a-AM4 combines a copper base plate and heat pipes with aluminum finned cooling towers for an optimal hybrid cooling solution. The base plate and heat pipes are nickel-plated for looks and to prevent corrosion.
Since its introduction in early 2015, the modern iteration of the Dell XPS 13 has been one of the most influential computers in recent history. An example of the rise of desirable Windows-based notebooks back into the premium market, the XPS 13 has done what only a few OEMs have been able to—inspire knockoffs. Now, the market is filled with similar designs including ultrathin bezels (and some even copying the compromises of webcam placement), at similar price points.
Even though it's been regarded as one of the best PC notebooks for its entire tenure, it was clear for a while that Dell needed to move their flagship notebook's design forward. Here it is: the redesigned XPS 13 9370 for 2018.
From a quick glance, the 2018 XPS 13 is quite similar to the outgoing 9360 model from last year. Apart from this new, radical Alpine White and Rose Gold color scheme of our particular review unit, you would be hard-pressed to spot it as unique in public. However, once you start to dig in, the changes become quite evident.
While the new XPS 13 maintains the same physical footprint as the previous iterations, it loses a significant amount of thickness. Still retaining the wedge shape, although much less exaggerated now, the XPS 13 9370 measures only 0.46" at its thickest point, compared to 0.6" on the previous design. While tenths of inches may not seem like a huge difference, this amounts to a 23% reduction in thickness, which is noticeable for a highly portable item like a notebook.
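A quick check of that figure, using the two thickness measurements above:

```python
# Thickness reduction from the previous XPS 13 design to the 9370,
# measured at the thickest point of the wedge.
old_thickness = 0.60  # inches (previous design)
new_thickness = 0.46  # inches (XPS 13 9370)

reduction = (old_thickness - new_thickness) / old_thickness
print(f"Reduction: {reduction:.1%}")  # about 23%
```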
Introduction and Features
Corsair is a well-respected name in the PC industry and they continue to offer a complete line of products for enthusiasts, gamers, and professionals alike. Today we are taking a detailed look at Corsair’s latest flagship power supply, the AX1600i Digital ATX power supply unit. This is the most technologically advanced power supply we have reviewed to date. Over time, we often grow numb to marketing terms like “most technologically advanced”, “state-of-the-art”, “ultra-stable”, “super-high efficiency”, etc., but in the case of the AX1600i Digital PSU, we have seen these claims come to life before our eyes.
1,600 Watts: 133.3 Amps on the +12V outputs!
The AX1600i Digital power supply is capable of delivering up to 1,600 watts of continuous DC power (133.3 Amps on the +12V rails) and is 80 Plus Titanium certified for super-high efficiency. If that’s not impressive enough, the PSU can do it while operating on 115 VAC mains and with an ambient temperature up to 50°C (internal case temperature). This beast was made for multiple power-hungry graphic adapters and overclocked CPUs.
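That headline amperage follows directly from the continuous wattage spread across the 12 V rail; a trivial sketch:

```python
# The +12V amperage rating follows directly from the continuous wattage (I = P / V).
continuous_watts = 1600.0  # Corsair's continuous DC rating
rail_voltage = 12.0        # the ATX +12V rail

amps = continuous_watts / rail_voltage
print(f"{amps:.1f} A")  # 133.3 A, matching Corsair's spec
```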
The AX1600i is a digital power supply, which provides two distinct advantages. First, it incorporates Digital Signal Processing (DSP) on both the primary and secondary sides, which allows the PSU to deliver extremely tight voltage regulation over a wide range of loads. And second, the AX1600i features the digital Corsair Link, which enables the PSU to be connected to the PC’s motherboard (via USB) for real-time monitoring (efficiency, voltage regulation, and power usage) and control (over-current protection and fan speed profiles).
Quiet operation with a semi-fanless mode (zero-rpm fan mode up to ~40% load) might not be at the top of your feature list when shopping for a 1,600 watt PSU, but the AX1600i is up to the challenge.
(Courtesy of Corsair)
Corsair AX1600i Digital ATX PSU Key Features:
• Digital Signal Processor (DSP) for extremely clean and efficient power
• Corsair Link Interface for monitoring and adjusting performance
• 1,600 watts continuous power output (50°C)
• Dedicated single +12V rail (133.3A) with user-configurable virtual rails
• 80 Plus Titanium certified, delivering up to 94% efficiency
• Ultra-low noise 140mm Fluid Dynamic Bearing (FDB) fan
• Silent, Zero RPM mode up to ~40% load (~640W)
• Self-test switch to verify power supply functionality
• Premium components (GaN transistors and all Japanese made capacitors)
• Fully modular cable system
• Conforms to ATX12V v2.4 and EPS 2.92 standards
• Universal AC input (100-240V) with Active PFC
• Safety Protections: OCP, OVP, UVP, SCP, OTP, and OPP
• Dimensions: 150mm (W) x 86mm (H) x 200mm (L)
• 10-Year warranty and legendary Corsair customer service
• $449.99 USD
It's all fun and games until something something AI.
Microsoft announced the Windows Machine Learning (WinML) API about two weeks ago, but they did so in a sort-of abstract context. This week, alongside the 2018 Game Developers Conference, they are grounding it in a practical application: video games!
Specifically, the API provides the mechanisms for game developers to run inference on the target machine. The trained models that it runs against would be in the Open Neural Network Exchange (ONNX) format from Microsoft, Facebook, and Amazon. Like the initial announcement suggests, it can be used for any application, not just games, but… you know. If you want to get a technology off the ground, and it requires a high-end GPU, then video game enthusiasts are good lead users. When run in a DirectX application, WinML kernels are queued on the DirectX 12 compute queue.
We’ve discussed the concept before. When you’re rendering a video game, simulating an accurate scenario isn’t your goal – the goal is to look like you are. The direct way of looking like you’re doing something is to do it. The problem is that some effects are too slow (or, sometimes, too complicated) to correctly simulate. In these cases, it might be viable to make a deep-learning AI hallucinate a convincing result, even though no actual simulation took place.
Fluid dynamics, global illumination, and up-scaling are three examples.
Previously mentioned SIGGRAPH demo of fluid simulation without fluid simulation...
... just a trained AI hallucinating a scene based on input parameters.
Another place where AI could be useful is… well… AI. One way of making AI is to give it some set of data from the game environment, often including information that a player in its position would not be able to know, and having it run against a branching logic tree. Deep learning, on the other hand, can train itself on billions of examples of good and bad play, and make results based on input parameters. While the two methods do not sound that different, assembling logic from an abstract good/bad dataset, rather than designing it by hand, abstracts away the potential for assumptions and programmer error. Of course, it abstracts that potential for error into the training dataset, but that’s a whole other discussion.
The third area that AI could be useful is when you’re creating the game itself.
There’s a lot of grunt and grind work when developing a video game. Licensing prefab solutions (or commissioning someone to do a one-off asset for you) helps ease this burden, but that gets expensive in terms of both time and money. If some of those assets could be created by giving parameters to a deep-learning AI, then those are assets that you would not need to make, allowing you to focus on other assets and how they all fit together.
These are three of the use cases that Microsoft is aiming WinML at.
Sure, these are smooth curves of large details, but the antialiasing pattern looks almost perfect.
For instance, Microsoft is pointing to an NVIDIA demo where they up-sample a photo of a car, once with bilinear filtering and once with a machine learning algorithm (although not WinML-based). The bilinear algorithm behaves exactly as someone who has used Photoshop would expect. The machine learning algorithm, however, was able to identify the objects that the image intended to represent, and it drew the edges that it thought made sense.
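For reference, bilinear filtering is just a local weighted average of the four nearest source pixels, which is why it blurs edges rather than reconstructing them the way the learned approach does. A minimal pure-Python sketch (the function name and toy input are mine, not from the demo):

```python
def bilinear_upsample(src, new_w, new_h):
    """Upsample a 2D grayscale image (list of rows) with bilinear filtering."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for j in range(new_h):
        # Map each output pixel back to fractional source coordinates.
        y = j * (src_h - 1) / (new_h - 1) if new_h > 1 else 0.0
        y0 = int(y)
        y1 = min(y0 + 1, src_h - 1)
        fy = y - y0
        row = []
        for i in range(new_w):
            x = i * (src_w - 1) / (new_w - 1) if new_w > 1 else 0.0
            x0 = int(x)
            x1 = min(x0 + 1, src_w - 1)
            fx = x - x0
            # Weighted average of the four neighbours: no new detail is created.
            top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# A hard black/white edge turns into a smooth (blurry) ramp:
print(bilinear_upsample([[0.0, 1.0]], 5, 1))  # [[0.0, 0.25, 0.5, 0.75, 1.0]]
```

A machine learning upscaler, by contrast, can output values that never appeared in the source neighbourhood at all, which is how it "redraws" edges.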
Like their DirectX Raytracing (DXR) announcement, Microsoft plans to have PIX support WinML “on Day 1”. As for partners? They are currently working with Unity Technologies to provide WinML support in Unity’s ML-Agents plug-in. That’s all the game industry partners they have announced at the moment, though. It’ll be interesting to see who jumps in and who doesn’t over the next couple of years.
O Rayly? Ya Rayly. No Ray!
Microsoft has just announced a raytracing extension to DirectX 12, called DirectX Raytracing (DXR), at the 2018 Game Developers Conference in San Francisco.
The goal is not to completely replace rasterization… at least not yet. Instead, raytracing will mostly be used for effects that require supplementary datasets, such as reflections, ambient occlusion, and refraction. Rasterization, the typical way that 3D geometry gets drawn on a 2D display, converts triangle coordinates into screen coordinates, and then a point-in-triangle test runs across every sample. This will likely occur once per AA sample (minus pixels that the triangle can’t possibly cover – such as a pixel outside of the triangle’s bounding box – but that’s just an optimization).
For rasterization, each triangle is laid on a 2D grid corresponding to the draw surface.
If any sample is in the triangle, the pixel shader is run.
This example shows the rotated grid MSAA case.
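The coverage test described above can be sketched with edge functions – my own minimal Python version, not production rasterizer code:

```python
def edge(ax, ay, bx, by, px, py):
    # Signed area: positive if point p lies to the left of edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def point_in_triangle(p, a, b, c):
    """Point-in-triangle via three edge-function sign tests - the core
    of the rasterizer's per-sample coverage check."""
    d0 = edge(*a, *b, *p)
    d1 = edge(*b, *c, *p)
    d2 = edge(*c, *a, *p)
    # Inside if the point is on the same side of all three edges
    # (handles both clockwise and counter-clockwise winding).
    has_neg = d0 < 0 or d1 < 0 or d2 < 0
    has_pos = d0 > 0 or d1 > 0 or d2 > 0
    return not (has_neg and has_pos)
```

A real GPU evaluates these edge functions in parallel, once per AA sample, for every sample position inside the triangle’s bounding box.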
A program called a pixel shader is then run with some set of data that the GPU can gather for every valid pixel in the triangle. This set of data typically includes things like world coordinates, screen coordinates, texture coordinates, nearby vertices, and so forth. That still lacks a lot of information, however, especially about parts of the scene that are not visible to the camera. The application is free to provide other sources of data for the shader to crawl… but what?
- Cubemaps are useful for reflections, but they don’t necessarily match the scene.
- Voxels are useful for lighting, as seen with NVIDIA’s VXGI and VXAO.
This is where DirectX Raytracing comes in. There are quite a few components to it, but it’s basically a new pipeline that handles how rays are cast into the environment. After being queued, it starts out with a ray-generation stage, and then, depending on what happens to the ray in the scene, closest-hit, any-hit, and miss shaders are invoked. The ray-generation shader lets the developer set up how rays are cast by calling an HLSL intrinsic instruction, TraceRay (which is a clever way of invoking them, by the way). This function takes an origin and a direction, so you could choose to cast rays only in the direction of lights if, for instance, your algorithm approximated partially occluded soft shadows from a non-point light. (There are better algorithms for that, but it’s just the first example that came to mind.) The closest-hit, any-hit, and miss shaders execute at the point where the traced ray ends.
To connect this with current technology, imagine that ray generation is like a vertex shader in rasterization: it sets up the triangle to be rasterized, which leads to pixel shaders being called.
Even more interesting – the closest-hit, any-hit, and miss shaders can call TraceRay themselves, which enables multi-bounce and other recursive algorithms (see the figure above). The obvious use case is reflections, which is the headline of the GDC talk, but Microsoft wants the feature to be as general as possible, aligning with the evolution of GPUs. Looking at NVIDIA’s VXAO implementation, ambient occlusion also seems like a natural fit for a raytracing algorithm.
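As a rough mental model – and only that – the recursive flow can be mimicked in plain Python. The scene, shading values, and function names below are all invented for illustration; real DXR shaders are written in HLSL and traversal runs against a GPU-built acceleration structure:

```python
import math

# Toy scene: a single sphere. A stand-in for acceleration structure
# traversal - we just intersect the sphere analytically.
CENTER, RADIUS = (0.0, 0.0, 5.0), 1.0

def intersect(o, d):
    """Nearest positive hit distance along normalized ray d, or None (miss)."""
    oc = [o[i] - CENTER[i] for i in range(3)]
    b = sum(oc[i] * d[i] for i in range(3))
    c = sum(v * v for v in oc) - RADIUS * RADIUS
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def miss_shader(d):
    return 0.1  # flat background value

def closest_hit_shader(o, d, t, depth):
    hit = [o[i] + d[i] * t for i in range(3)]
    n = [(hit[i] - CENTER[i]) / RADIUS for i in range(3)]
    ndotd = sum(n[i] * d[i] for i in range(3))
    refl = [d[i] - 2.0 * ndotd * n[i] for i in range(3)]
    # A hit shader may call trace_ray again for multi-bounce effects -
    # exactly the recursion described above.
    return 0.5 + 0.25 * trace_ray(hit, refl, depth - 1)

def trace_ray(o, d, depth):
    """Stand-in for HLSL's TraceRay: dispatch to hit or miss shader."""
    if depth == 0:
        return 0.0
    t = intersect(o, d)
    return miss_shader(d) if t is None else closest_hit_shader(o, d, t, depth)
```

The key structural point is that closest_hit_shader calls trace_ray again – the recursion that makes multi-bounce reflections possible.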
Speaking of data structures, Microsoft also detailed what they call the acceleration structure, which is composed of two levels. The top level contains per-object metadata, like the object’s transformation and whatever other data the developer wants to attach. The bottom level contains the geometry. The briefing describes it as “essentially vertex and index buffers”, so we asked for clarification. DXR requires that triangle geometry be specified as vertex positions in either 32-bit float3 or 16-bit float3 values. There is also a stride property, so developers can tweak data alignment and reuse their rasterization vertex buffer, as long as the positions are HLSL float3, either 16-bit or 32-bit.
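A CPU-side sketch of that two-level layout might look like this in Python (class names and fields are hypothetical; the real structures are opaque GPU objects built through the API):

```python
class BottomLevel:
    """Bottom level: 'essentially vertex and index buffers' - float3
    positions, with a stride so an existing buffer can be reused."""
    def __init__(self, vertices, indices, stride=3):
        self.vertices = vertices   # flat list of floats
        self.indices = indices     # 3 indices per triangle
        self.stride = stride       # floats between successive positions

    def triangle(self, n):
        """Return the n-th triangle as three (x, y, z) tuples."""
        tri = []
        for idx in self.indices[3 * n: 3 * n + 3]:
            base = idx * self.stride
            tri.append(tuple(self.vertices[base: base + 3]))
        return tri

class TopLevelInstance:
    """Top level: per-object transform plus whatever metadata the
    developer wants to attach, referencing bottom-level geometry."""
    def __init__(self, geometry, translation, metadata=None):
        self.geometry = geometry       # a BottomLevel
        self.translation = translation # simplified transform: (tx, ty, tz)
        self.metadata = metadata or {}

    def world_triangle(self, n):
        tx, ty, tz = self.translation
        return [(x + tx, y + ty, z + tz)
                for x, y, z in self.geometry.triangle(n)]
```

The stride is what lets an existing rasterization vertex buffer – positions interleaved with normals and UVs, say – be reused without repacking, as long as the positions themselves are float3.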
As for the tools to develop this in…
Microsoft announced PIX back in January 2017. It is a debugging and performance analyzer for 64-bit DirectX 12 applications. Microsoft will upgrade it to support DXR as soon as the API is released (specifically, “Day 1”). This includes the API calls, the raytracing pipeline resources, the acceleration structure, and so forth. As usual, you can expect Microsoft to support their APIs with quite decent – not perfect, but decent – documentation and tools. They do it well, and they want to make sure those tools are available when the API is.
Example of DXR via EA's in-development SEED engine.
In short, raytracing is here, but it’s not taking over rasterization. It doesn’t need to. Microsoft is just giving game developers another, standardized mechanism for gathering supplementary data for their games. Several game engines have already announced support for the technology, including the usual suspects for any top-tier game technology:
- Frostbite (EA/DICE)
- SEED (EA)
- 3DMark (Futuremark)
- Unreal Engine 4 (Epic Games)
- Unity Engine (Unity Technologies)
They also said, “and several others we can’t disclose yet”, so this list is not even complete. But, yeah, if you have Frostbite, Unreal Engine, and Unity, then you have a sizeable market as it is. There is always the question of how deeply each of these engines will support the technology. Currently, this style of raytracing is not portable outside of DirectX 12 – it is literally being announced today – and each of these engines intends to support more than just Windows 10 and Xbox.
Still, we finally have a standard for raytracing, which should drive vendors to optimize in a specific direction. From there, it's just a matter of someone taking the risk to actually use the technology for a cool work of art.
If you want to read more, check out Ryan's post about the also-announced RTX, NVIDIA's raytracing technology.
CalDigit Tuff Rugged External Drive
There are myriad options when it comes to portable external storage, but if you value durability as much as portability, those options quickly dry up. Combining a cheap 2.5-inch hard drive with an AmazonBasics enclosure is often just fine for an external storage solution that sits in your climate-controlled office all day, but it's probably not the best choice for field use during your national park photography trip, your scuba diving expedition, or your on-site construction management.
For situations like these where the elements become a factor and the chance of an accidental drop skyrockets, it's a good idea to invest in "ruggedized" equipment. Companies like Panasonic and Dell have long offered laptops custom-designed to withstand unusually harsh environments, and accessory makers have followed suit with ruggedized hard drives.
Today we're taking a look at one such ruggedized hard drive, the CalDigit Tuff. Released in 2017, the CalDigit Tuff is a 2.5-inch bus-powered external drive available in both HDD and SSD options. CalDigit loaned us the 2TB HDD model for testing.