Introduction

Gaming headsets are an ever-growing segment, with seemingly every hardware company now offering its own take on the concept. Logitech is far from a new player in this space, with a number of headsets on the market over the years. Their most recent lineup included the top-end G930, a headset that has now been superseded by the new G933 (wireless) and G633 (wired) models. We’ll take a look - and listen - in this review.

artemis_cover.jpg

With the new Artemis Spectrum headsets Logitech is introducing their new 40 mm Pro-G drivers, which the company says will offer high-fidelity sound:

"Patent pending advanced Pro-G audio drivers are made with hybrid mesh materials that provide the audiophile-like performance gaming fans have been demanding. From your favorite music to expansive game soundtracks, the Pro-G drivers deliver both clean and accurate highs as well as a deep rich bass that you would expect from premium headphones."

More than a pair of stereo headphones, of course, the Artemis Spectrum G933 and G633 feature (simulated) 7.1 channel surround via selectable Dolby or DTS Headphone:X technology. How convincing this effect might be is a focus of the review, and we will take a close look at audio performance.

While these two pairs of gaming headphones might look identical, the G933 differentiates itself from the G633 by offering 2.4 GHz wireless capability. Both headsets also feature two fully customizable RGB lighting zones, with 16.8 million colors controlled through the Logitech Gaming Software on your PC. But a computer isn't required to use these headsets; both the G933 and G633 are fully compatible with the Xbox One and PlayStation 4, and with a 3.5 mm audio cable (included with both) they can be used as a stereo headset with just about anything, including smartphones.

artemis_main.jpg

Continue reading our review of the Logitech Artemis Spectrum G933/G633 headsets!!

Subject: Storage
Manufacturer: Samsung

Introduction, Specifications and Packaging

Introduction

Around this time last year, Samsung launched their Portable SSD T1. This was a nifty little external SSD with some very good performance and capabilities. Despite its advantages and the cool factor of a thin and light 1TB SSD barely noticeable in your pocket, there was some feedback from consumers that warranted a few tweaks to the design. There was also the need for a new line, as Samsung was switching their VNAND over from 32 to 48 layers, enabling a higher capacity tier for this portable SSD. All of these changes were wrapped up into the new Samsung Portable SSD T3:

160217-180734.jpg

Specifications

T3 specs.png

Most of these specs are identical to the previous T1, with some notable exceptions. Consumer feedback prompted a new, heavier metal housing, as the T1 (coming in at only 26 grams) was almost too light. With that new housing came a slight increase in dimensions. We will do some side-by-side comparisons later in the review.

Read on for our full review of the new Samsung T3!

Author:
Subject: Systems
Manufacturer: Various

Part 1 - Picking the Parts

I'm guilty. I am one of those PC enthusiasts who think everyone knows how to build a PC. Everyone has done it before, and all you need from the tech community is a recommendation for parts, right? Turns out that isn't the case at all, and as more and more gamers and users come into our community, they are overwhelmed and often underserved. It's time to fix that.

This cropped up for me personally when my nephew asked me about getting him a computer. At just 14 years old, he had never built a PC, or even watched a PC being constructed - nothing of that sort. Even though his uncle had built computers nearly every week for 15 years or more, he had little to no background on what the process was like. I decided that this was the perfect opportunity to teach him and create a useful resource for the community at large, helping empower another generation to adopt the DIY mindset.

I decided to start with three specific directions:

  • Part 1 - Introduce the array of PC components, what the function of each is, and why we picked the specific hardware we did.

  • Part 2 - Show him the process of actual construction, from CPU install to cable routing.

  • Part 3 - Walk through the installation of Windows and get him set up with Steam and the idea of modern PC gaming.

Each of the above sections was broken up into a separate video during our day at the office, and will be presented here and on our YouTube channel.

I would like to thank Gigabyte for sponsoring this project with us, providing the motherboard and graphics card and helping work with the other vendors to get us a great combination of hardware. Visit them at Gigabyte.com for the full lineup of motherboards, graphics cards and more!!

gbsponsor.jpg

Part 1 - Picking the Parts

Selecting the parts to build a PC can be a daunting task for a first timer. What exactly is a motherboard and do you need one? Should you get 2 or 4 or more memory modules? SSD vs HDD? Let's lay it all out there for you.

The specific configuration used in Austin's PC build is pretty impressive!

  Austin's First PC Build
Processor Intel Core i5-6600K - $249
Motherboard Gigabyte Z170X-Gaming 5 - $189
Memory Corsair Vengeance LPX 16GB DDR4-3200 - $192
Graphics Card Gigabyte GTX 970 Gaming Xtreme - $374
Storage Corsair Neutron XT 480GB - $184
Western Digital 3TB Red - $109
Case Corsair Obsidian 450D - $119
Power Supply Corsair RM550x - $117
Keyboard Logitech G910 Orion Spark - $159
Mouse Logitech G602 - $51
Headset Logitech G933 Artemis Spectrum - $192
Monitor Acer XB280HK - $699
OS Windows 10 Home - $119
Total Price $2054 (not including the monitor) - Amazon.com Cart
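
For anyone double-checking the math - a worthwhile habit when budgeting a build - the total is just the sum of the component prices with the monitor left out. A minimal sketch in Python, with the prices taken from the table above:

```python
# Sum of the build's component prices as listed above; the $699 monitor
# is intentionally excluded, matching the quoted total.
parts = {
    "Processor": 249, "Motherboard": 189, "Memory": 192,
    "Graphics Card": 374, "SSD": 184, "HDD": 109,
    "Case": 119, "Power Supply": 117, "Keyboard": 159,
    "Mouse": 51, "Headset": 192, "Windows 10 Home": 119,
}
print(sum(parts.values()))  # 2054
```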

Continue reading My First PC Build on PC Perspective!!

Subject: Mobile
Manufacturer: Apple

It's Easier to Be Convincing than Correct

This is a difficult topic to discuss. Some perspectives assume that law enforcement has terrible, Orwellian intentions. Meanwhile, law enforcement officials, with genuinely good intentions, don't understand that the road to Hell is paved with those. Bad things are much more likely to happen when human flaws are justified away, which is easy to do when your job is preventing mass death and destruction. Human beings like to use large pools of evidence to validate assumptions rather than to discover truth, often without realizing it.

Ever notice how essays can always find sources, regardless of thesis? With increasing amounts of data, you are progressively more likely to make a convincing argument, but not necessarily a more truthful one. Mix in good intentions, which promote complacency, and mistakes can happen.

apple-2016-iphonenot.png

HOPEFULLY NOT...

But this is about Apple. Recently, the FBI demanded that Apple create a version of iOS that can be broken into by law enforcement. Apple frequently uses the term “back door,” while the government prefers other terminology. Really, words are words and the only thing that matters is what they describe -- and they describe a mechanism to compromise the device's security in some way.

This introduces several problems.

The common line that I hear is, “I don't care, because I have nothing to hide.” Well... that's wrong in a few ways. First, having nothing to hide is irrelevant if the person who wants access to your data assumes that you have something to hide, and is looking for evidence that convinces them that they're right. Second, you need to consider all the people who want access to this data. The FBI will not be the only one demanding a back door, nor will the United States as a whole. There are a whole lot of nations that trust individuals, including their own respective citizens, less than the United States does. You can expect that each of them would request a backdoor.

You can also expect each of them, and organized criminals, to want to break into each other's.

Lastly, we've been here before, and what it comes down to is criminalizing math. Encryption is just a mathematical process that is easy to perform, but hard to invert. It all started because it is easy to multiply two numbers together, but hard to factor the product. The most straightforward method is dividing by every possible number that's smaller than the square root of said product. If the two factors are prime, then you are stuck finding one number out of all those possibilities (the other prime will be greater than the square root). In the 90s, encryption with keys over a certain size was legally classified as a munition. That may sound ridiculous, and there would be good reason for that feeling. Either way, it changed; as a result, online banks and retailers thrived.
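
To make that asymmetry concrete, here is a minimal sketch in Python with toy-sized primes (real keys are hundreds of digits long): the multiplication is a single operation, while undoing it by trial division takes on the order of the square root of the product in steps.

```python
import time

# Toy scale only: multiplying two primes is one operation, while
# recovering them by trial division takes ~sqrt(n) steps.
p, q = 1_000_003, 1_000_033
n = p * q                          # the easy direction

def trial_factor(n):
    d = 3
    while d * d <= n:              # only need to test up to sqrt(n)
        if n % d == 0:
            return d, n // d
        d += 2                     # n is odd, so skip even divisors
    return None                    # no divisor found: n is prime

start = time.perf_counter()
print(trial_factor(n))             # the hard direction: ~500,000 divisions
print(f"{time.perf_counter() - start:.3f} seconds")
```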

Apple closes their letter with the following statement:

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Good intentions lead to complacency, which is where the road to (metaphorical) Hell starts.

Subject: Storage
Manufacturer: Samsung

Introduction, Specifications and Packaging

Introduction

The steady increase in flash memory capacity per die is necessary for bringing SSD costs down, but SSDs need a minimum number of dies present to maintain good performance. Back when Samsung announced their 48-layer VNAND, their Senior VP of Marketing assured me that the performance drop that comes along with the low die count of lower capacity models would be dealt with properly. At the time, Unsoo Kim mentioned the possibility of Samsung producing 128Gbit 48-layer VNAND, but it now appears that they have opted to put everything into 256Gbit on the 3D side. Fortunately, they still have a planar (2D) NAND production line going, and they will be using that same flash in a new line of low capacity models. When their 850 Series transitions over to 48-layer (enabling 2TB capacities), Samsung will drop the 120GB capacity of that line and replace it with a new 750 EVO destined for OEMs and system builders:

160210-175142.jpg

The SSD 750 EVO Series is essentially a throwback to the 840 EVO, but without all of the growing pains experienced by that line. Samsung assured me that the same corrections that ultimately fixed the long-term read-based slowdown issues with the 840 EVO also apply to the 750 EVO, and despite the model number being smaller, these should actually perform a bit better than their predecessor. Since it would be silly to launch just a single 120GB capacity to make up for the soon-to-be-dropped 850 EVO 120GB, we also get a 250GB model, which should make for an interesting price point.

Specifications

specs.png

Baseline specs are very similar to the older 840 EVO series, with some minor differences (to be shown below). There are some unlisted specs that are carried over from the original series. For those we need to reference the slides from the 840 EVO launch:

DSC04638.JPG

Read on for the full review of these two new models!

Manufacturer: PC Perspective

Caught Up to DirectX 12 in a Single Day

The wait for Vulkan is over.

I'm not just talking about the specification. Members of the Khronos Group have also released compatible drivers, SDKs and tools to support them, conformance tests, and a proof-of-concept patch for Croteam's The Talos Principle. To reiterate, this is not a soft launch. The API, and its entire ecosystem, is out and ready for the public on Windows (at least 7+ at launch but a surprise Vista or XP announcement is technically possible) and several distributions of Linux. Google will provide an Android SDK in the near future.

khronos-2016-vulkan-why.png

I'm going to editorialize for the next two paragraphs. There was a concern that Vulkan would be too late. The thing is, as of today, Vulkan is now just as mature as DirectX 12. Of course, that could change at a moment's notice; we still don't know how the two APIs are being adopted behind the scenes. A few DirectX 12 titles are planned to launch in a few months, but no full, non-experimental, non-early access game currently exists. Each time I say this, someone links the Wikipedia list of DirectX 12 games. If you look at each entry, though, you'll see that all of them are either early access, awaiting an unreleased DirectX 12 patch, or using a third-party engine (like Unreal Engine 4) that only lists DirectX 12 as an experimental preview. No full, released, non-experimental DirectX 12 game exists today. Besides, if the latter counts, then you'll need to accept The Talos Principle's proof-of-concept patch, too.

But again, that could change. While today's launch speaks well to the Khronos Group and the API itself, it still needs to be adopted by third party engines, middleware, and software. These partners could, like the Khronos Group before today, be privately supporting Vulkan with the intent to flood out announcements; we won't know until they do... or don't. With the support of popular engines and frameworks, dependent software really just needs to enable it. This has not happened for DirectX 12 yet, and, now, there doesn't seem to be anything keeping it from happening for Vulkan at any moment. With the Game Developers Conference just a month away, we should soon find out.

khronos-2016-vulkan-drivers.png

But back to the announcement.

Vulkan-compatible drivers are launching today across multiple vendors and platforms, but I do not have a complete list. On Windows, I was told to expect drivers from NVIDIA for Windows 7, 8.x, 10 on Kepler and Maxwell GPUs. The standard is compatible with Fermi GPUs, but NVIDIA does not plan on supporting the API for those users due to its low market share. That said, they are paying attention to user feedback and they are not ruling it out, which probably means that they are keeping an open mind in case some piece of software gets popular and depends upon Vulkan. I have not heard from AMD or Intel about Vulkan drivers as of this writing, one way or the other. They could even arrive day one.

On Linux, NVIDIA, Intel, and Imagination Technologies have submitted conformant drivers.

Drivers alone do not make a hard launch, though. SDKs and tools have also arrived, including the LunarG SDK for Windows and Linux. LunarG is a company co-founded by Jens Owen, who had a previous graphics software company that was purchased by VMware. LunarG is backed by Valve, who also backed Vulkan in several other ways. The LunarG SDK helps developers validate their code, inspect what the API is doing, and otherwise debug. Even better, it is also open source, which means that the community can rapidly enhance it, even though it's already in a releasable state. RenderDoc, the open-source graphics debugger by Crytek, will also add Vulkan support. ((Update (Feb 16 @ 12:39pm EST): Baldur Karlsson has just emailed me to let me know that it was a personal project at Crytek, not a Crytek project in general, and their GitHub page is much more up-to-date than the linked site.))

vulkan_gltransition_maintenance1.png

The major downside is that Vulkan (like Mantle and DX12) isn't simple.
These APIs are verbose and very different from previous ones, which requires more effort.

Image Credit: NVIDIA

There really isn't much to say about the Vulkan launch beyond this. What graphics APIs really try to accomplish is standardizing signals that enter and leave video cards, such that the GPUs know what to do with them. For the last two decades, we've settled on an arbitrary, single, global object that you attach buffers of data to, in specific formats, and call one of a half-dozen functions to send it.

Compute APIs, like CUDA and OpenCL, decided it was more efficient to handle queues, allowing the application to write commands and send them wherever they need to go. Multiple threads can write commands, and multiple accelerators (GPUs in our case) can be targeted individually. Vulkan, like Mantle and DirectX 12, takes this metaphor and adds graphics-specific instructions to it. Moreover, GPUs can schedule memory, compute, and graphics instructions at the same time, as long as the graphics task has leftover compute and memory resources, and / or the compute task has leftover memory resources.
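
As a rough illustration of that metaphor shift - a toy model in Python, not real Vulkan or CUDA calls - each thread records its own self-contained command list and submits it to an explicit queue, with no single global context being mutated:

```python
import threading
from queue import Queue

# Toy model only: real APIs use command buffers and device queues, but
# the shape is the same -- record per-thread, then submit explicitly.
submit_queue = Queue()

def record_commands(object_id):
    # Each thread builds a self-contained list of commands...
    return [("bind_buffer", object_id), ("draw", object_id)]

def worker(object_id):
    submit_queue.put(record_commands(object_id))  # ...then submits it.

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The "device" drains whole batches; nothing depended on shared global state.
while not submit_queue.empty():
    print(submit_queue.get())
```

The point is the shape of the workflow: recording is thread-local and cheap, and submission is explicit, which is what lets the driver stop guessing about the application's intent.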

This is not necessarily a “better” way to do graphics programming... it's different. That said, it has the potential to be much more efficient when dealing with lots of simple tasks that are sent from multiple CPU threads, especially to multiple GPUs (which currently require the driver to figure out how to convert draw calls into separate workloads -- leading to simplifications like mirrored memory and splitting workloads by neighboring frames). Lots of small tasks align well with video games, especially ones with many simple objects: strategy games, shooters with lots of debris, or any game with large crowds of people. As this becomes ubiquitous, we'll see this bottleneck disappear, and games will no longer need to be designed around these limitations. It might even be used for drawing with cross-platform 2D APIs, like Qt or even webpages, although those two examples (especially the Web) each have other, higher-priority bottlenecks. There are also other benefits to Vulkan.

khronos-2016-vulkan-middleware.png

The WebGL comparison is probably not as common knowledge as Khronos Group believes.
Still, Khronos Group was criticized when WebGL launched because "it was too tough for Web developers".
It didn't need to be easy. Frameworks arrived and simplified everything. It's now ubiquitous.
In fact, Adobe Animate CC (the successor to Flash Pro) is now a WebGL editor (experimentally).

Open platforms are required for this to become commonplace. Engines will probably target several APIs from their internal management APIs, but you can't target users who don't fit in any bucket. Vulkan brings this capability to basically any platform, as long as it has a compute-capable GPU and a driver developer who cares.

Thankfully, it arrived before any competitor established market share.

Author:
Subject: Displays
Manufacturer: ASUS

A unique combo of size and resolution

We see all kinds of monitors at PC Perspective; honestly, it's probably too many. It's rare when a form factor or combination of features really feels unique, but today's review of the ASUS PB328Q is exactly that. Have we seen 2560x1440 displays? Countless. More than a few VA panels have graced our test benches. And 30-32 inch monitors were all the rage in screen technology as far back as 2007. A refresh rate of 75Hz is no longer as novel a feature as it used to be either.

01.jpg

The ASUS PB328Q combines all of that into a package that stands out from other professional, low cost monitor options. The largest 2560x1440 monitor that I have used previously is 27 inches, and the 5-inch difference between that and what the PB328Q offers is an immediately obvious change. The question is, though: do the size and resolution combination, along with the panel technology, combine to form a product that is good for productivity, gaming, both, or neither? With a price of just $539 on Amazon, many users might be interested in the answer.

Here are the specifications for the ASUS PB328Q display.

  ASUS PB328Q Specifications
Screen Size 32 inch
Screen Mode WQHD
Response Time 4ms
Aspect Ratio 16:9
Backlight Technology LED
Panel Technology VA (vertical alignment)
Tilt Angle -5 to +20 degrees
Adjustable Height Yes
Video
Maximum Resolution 2560x1440
Standard Refresh Rate 75 Hz
Color Supported 1073.1M (10-bit) with 12-bit Look-up Table
Contrast Ratio 100,000,000:1 (ASCR)
Brightness 300 nits
Tearing Prevention Technology None
Audio
Speakers 3W x 2 Stereo RMS
Interfaces/Ports
DisplayPort Yes
HDMI Yes
DVI Yes
3.5mm Audio Output Yes
Physical Characteristics
Color Black
Miscellaneous
Package Contents Dual-link DVI cable
VGA cable
Audio cable
Power cord
DisplayPort cable
USB 3.0 cable
HDMI cable

For those new to VA panel technology, it helps to have some background before we start testing the PB328Q. Vertical alignment panels are very good at blocking the backlight from coming through the screen to the user's eyes, making them excellent at producing strong blacks and high contrast ratios when compared to other LCD technologies. VA also offers color reproduction and viewing angles well above TN and (usually) just below IPS screens.
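
For a concrete sense of why those deep blacks matter - a quick sketch with assumed luminance values, not measurements of the PB328Q - contrast ratio is simply peak white luminance divided by black luminance:

```python
# Contrast ratio = white luminance / black luminance. The nit values
# below are illustrative assumptions, not measurements of the PB328Q.
def contrast(white_nits, black_nits):
    return white_nits / black_nits

print(f"VA-like panel:  {contrast(300, 0.10):,.0f}:1")   # ~3,000:1 static
print(f"IPS-like panel: {contrast(300, 0.30):,.0f}:1")   # ~1,000:1 static
```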

Continue reading our review of the ASUS PB328Q Monitor!!

Manufacturer: Phanteks

Introduction and First Impressions

The Enthoo EVOLV ITX is not a new enclosure, but this striking color scheme - black with a glossy red interior - is. We'll take a thorough look at this mini-ITX enclosure in this review, and see how well it performs housing a gaming build.

DSC_0355.jpg

The EVOLV series from Phanteks includes ATX, micro-ATX, and this mini-ITX version, with all three sharing a common design language, though some of the features naturally differ. With this smallest design Phanteks decided to retain enough size to permit the use of standard components, with room for ATX power supplies, full length graphics cards, and liquid CPU cooling with up to a 280 mm radiator.

The EVOLV ATX was my first experience with a Phanteks enclosure, and I was impressed with the build quality and thoughtful design touches. Building with mini-ITX is a different experience that introduces new challenges, including keeping a system cool and quiet with components in much tighter quarters.

DSC_0400.jpg

Continue reading our review of the Phanteks Enthoo EVOLV ITX case!!

Author:
Manufacturer: EVGA

Introduction and Features

Introduction

2-750-GQ-Banner.jpg

(Courtesy of EVGA)

EVGA continues to expand their already huge PC power supply line with the introduction of the GQ series, which is aimed at price-conscious consumers who want good value while still maintaining many of the performance features found in EVGA’s premium models. The GQ Series contains four models ranging from 650W up to 1000W: the EVGA 650 GQ, 750 GQ, 850 GQ and 1000 GQ. We will be taking a detailed look at the 750 GQ in this review.

3-GQ-Banner.jpg

The GQ series power supplies are 80 Plus Gold certified for high efficiency and feature all modular cables, high-quality Japanese brand capacitors, and a quiet 135mm cooling fan with a fluid dynamic bearing. All GQ series power supplies are NVIDIA SLI and AMD Crossfire Ready and are backed by a 5-year warranty.

4a-750GQ.jpg

EVGA 750W GQ PSU Key Features:

•    Fully modular cables to reduce clutter and improve airflow
•    80 PLUS Gold certified, with up to 90%/92% efficiency (115VAC/240VAC); see the quick math after this list
•    100% Japanese brand capacitors ensure long-term reliability
•    135mm fluid dynamic bearing fan for reliable, quiet operation
•    ECO Intelligent Thermal Control allows silent, fan-less operation at low power
•    NVIDIA SLI & AMD Crossfire Ready
•    Ready for 4th Generation Intel Core Processors (C6/C7 Idle Mode)
•    Compliant with ErP Lot 6 2013 Requirement
•    Active Power Factor correction (0.99) with Universal AC input
•    5-Year warranty and EVGA Customer Support
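
To put those Gold efficiency figures in practical terms, here is a quick back-of-the-envelope sketch using the 50% load point at which 80 Plus Gold is rated:

```python
# Efficiency is DC output / AC input, so wall draw = DC load / efficiency.
def wall_draw_watts(dc_load_w, efficiency):
    return dc_load_w / efficiency

# 50% load on this 750W unit is 375W DC, using EVGA's listed figures.
print(round(wall_draw_watts(375, 0.90)))   # ~417W from a 115 VAC outlet
print(round(wall_draw_watts(375, 0.92)))   # ~408W from a 240 VAC outlet
```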

EVGA was founded in 1999 with headquarters in Brea, California. They continue to specialize in producing NVIDIA-based graphics adapters and Intel-based motherboards while expanding their PC power supply product line, which currently includes thirty-eight models ranging from the high-end 1,600W SuperNOVA T2 to the budget-minded EVGA 400W power supply.

4b-750W-Compare.jpg

(Courtesy of EVGA)

As you can see in the table above, EVGA currently offers seven different variations of 750W power supplies. Let’s get started with the review and see what makes this new 750W GQ model stand out from the rest.

Please continue reading our review of the EVGA 750W GQ PSU!!!

Author:
Subject: Editorial
Manufacturer: ARM

28HPCU: Cost Effective and Power Efficient

Have you ever been approached about something that did not seem very exciting at first, only to find it much more interesting once you dug in?  This happened to me with this announcement.  At first blush, who really cares that ARM is partnering with UMC at 28 nm?  Well, once I was able to chat with the people at ARM, it became much more interesting than I initially expected.

icon_arm.jpg

The new hotness in fabrication is the latest 14 nm and 16 nm processes from Samsung/GF and TSMC, respectively.  It has been a good 4+ years since we last had a new process node that actually performed as expected.  The planar 22/20 nm products just were not entirely suitable for mass production.  Apple was one of the few to develop a part for TSMC’s 20 nm process that actually sold in the millions.  The main problem was a lack of power and speed scaling as compared to 28 nm processes.  Planar was a bad choice at that geometry, but FinFET technology could not be implemented in time for 20 nm parts from third-party manufacturers.

There is a problem with the latest process generations, though.  They are new, expensive, and production constrained.  They also may not be entirely appropriate for the applications being developed.  By comparison, 28 nm has several strengths.  These are mature processes with an excess of line space.  The major fabs are offering very competitive pricing structures for 28 nm as they see space clearing up on the lines, with higher-end SoCs, GPUs, and assorted ASICs migrating to the new process nodes.

umc_01.png

TSMC has typically been on the forefront of R&D with advanced nodes.  UMC is not as aggressive with their development; they tend to let others do some of the heavy lifting and then integrate the new nodes when it fits their pricing and business models.  TSMC is on their third generation of 28 nm.  UMC is on their second, but that generation encompasses many of the advanced features of TSMC’s third generation, so it is actually quite competitive.

Click here to continue reading about ARM, UMC, and the 28HPCU process!

Author:
Manufacturer: Various

Early testing for higher end GPUs

UPDATE 2/5/16: Nixxes released a new version of Rise of the Tomb Raider today with some significant changes. I have added another page at the end of this story that looks at results with the new version of the game and a new AMD driver, and I've also included some SLI and CrossFire results.

I will fully admit to being jaded by the industry on many occasions. I love my PC games and I love hardware, but it takes a lot for me to get genuinely excited about anything. After hearing game reviewers talk up the newest installment of the Tomb Raider franchise, Rise of the Tomb Raider, since its release on the Xbox One last year, I've been waiting for its PC release to give it a shot with real hardware. As you'll see in the screenshots and video in this story, the game doesn't appear to disappoint.

rotr-screen1.jpg

Rise of the Tomb Raider takes the exploration and "tomb raiding" aspects that made the first games in the series successful and applies them to the visual quality and character design brought in with the reboot of the series a couple of years back. The result is a PC game that looks stunning at any resolution, even more so in 4K, and one that pushes your hardware to its limits. For single GPU performance, even the GTX 980 Ti and Fury X struggle to keep their heads above water.

In this short article we'll look at the performance of Rise of the Tomb Raider with a handful of GPUs, leaning towards the high end of the product stack, and offer up my view on whether each hardware vendor is living up to expectations.

Continue reading our look at GPU performance in Rise of the Tomb Raider!!

Author:
Subject: General Tech
Manufacturer: Logitech G

A mix of styles

Logitech continues its push and re-entry into the gaming peripherals market in 2016, this time adding another keyboard under the Orion brand to the mix. The Logitech G G810 Orion Spectrum is, as the name implies, an RGB mechanical keyboard using the company's proprietary Romer-G switches. But despite the similarity in model numbers to the G910 Orion Spark announced in late 2014, the G810 has some significant design and functionality changes.

11.jpg

This new offering is cleaner and less faceted (both in key caps and overall design), coming much closer to the feel and function of the tenkeyless G410 from last year. Let's take a look at how the G810 changes things up for Logitech G.

Keyboard Design

The G810 Orion Spectrum is a full-size keyboard with a tenkey (also known as the numeric keypad) that has sleeker, more professional lines than its big brother. The black finish is matte on the keys and framing, but the outside edges of the keyboard have a gloss to them. It's a very minimal part of the design, though, so you shouldn't have to worry about fingerprints.

01.jpg

At first glance, you can see that Logitech toned down some of the gamer-centric accents when compared to either the G910 or the G410. There is no wrist rest, no PCB-trace inspired lines, no curves and no sharp edges. What you get instead is a keyboard that is equally well placed in a modern office or in an enthusiast's gaming den. To me, there are a lot of touches that remind me of the Das Keyboard - understated design that somehow makes it more appealing to the educated consumer.

02.jpg

This marks the first keyboard with the new Logitech G logo on it, though you are likely more concerned about the lack of G-Keys, the company's name for the macro-capable buttons on the G910. For users who still want that capability, Logitech G allows you to reprogram the function keys along the top for macros, and has a pretty simple switch in software to enable or disable them. This means you can maintain the F-row of keys for Windows applications but still use macros for gaming.

Continue reading our review of the Logitech G810 Orion Spectrum keyboard!!

Subject: Systems
Manufacturer: PC Perspective

That Depends on Whether They Need One

Ars Technica UK published an editorial called, Hey Valve: What's the point of Steam OS? The article does not actually pose the question in its text -- it mostly rants about technical problems with a Zotac review unit -- but the headline is interesting nonetheless.

Here's my view of the situation.

steam-os.png

The Death of Media Center May Have Been...

There are two parts to this story, and both center on Windows 8. The first was addressed in an editorial that I wrote last May, titled The Death of Media Center & What Might Have Been. Microsoft wanted to expand the PC platform into the living room. Beyond the obvious support for movies, TV, and DVR, they also pushed PC gaming in a few subtle ways. The Games for Windows certification required games to be launchable by Media Center and support Xbox 360 peripherals, which pressured game developers to make PC games comfortable to play on a couch. They also created Tray and Play, an optional feature that allowed PC games to be played from the disc while they installed in the background. Back in 2007, before Steam and other digital distribution services really took off, this eliminated install time, which was a major user experience problem with PC gaming (and a major hurdle for TV-connected PCs).

It also had a few nasty implications. Games for Windows Live tried to eliminate modding by requiring all content to be certified (or severely limiting the tools as seen in Halo 2 Vista). Microsoft was scared about the content that users could put into their games, especially since Hot Coffee (despite being locked, first-party content) occurred less than two years earlier. You could also argue that they were attempting to condition PC users to accept paid DLC.

Windows_Media_Center_Logo.png

Regardless of whether it would have been positive or negative for the PC industry, the Media Center initiative launched with Windows Vista, which is another way of saying “exploded on the launch pad, leaving no survivors.” Windows 7 cleared the wreckage with a new team, who aimed for the stars with Windows 8. They ignored the potential of the living room PC, preferring devices and services (i.e. Xbox) over an ecosystem provided by various OEMs.

If you look at the goals of Steam OS, they align pretty well with the original, Vista-era ambitions. Valve hopes to create a platform that hardware vendors could compete on. Devices, big or small, expensive or cheap, could fill all of the various needs that users have in the living room. Unfortunately, unlike Microsoft, they cannot be (natively) compatible with the catalog of Windows software.

This may seem like Valve is running toward a cliff, but keep reading.

What If Steam OS Competed with Windows Store?

Windows 8 did more than just abandon the vision of Windows Media Center. Driven by the popularity of the iOS App Store, Microsoft saw a way to end the public perception that Windows is hopelessly insecure. With the Windows Store, all software needs to be reviewed and certified by Microsoft. Software based on the Win32 API, which is all software for Windows 7 and earlier, was only allowed within the “Desktop App,” which was a second-class citizen and could be removed at any point.

mozilla-2016-donothurt.png

This potential made the PC software industry collectively crap themselves. Mozilla was particularly freaked out, because Windows Store demanded (at the time) that all web browsers become reskins of Internet Explorer. This meant that Firefox would not be able to implement any new Web standards on Windows, because it could only present what Internet Explorer (Trident) draws. Mozilla's mission is to develop a strong, standards-based web browser that forces all others to interoperate or die.

Remember: “This website is best viewed with Internet Explorer”?

Executives from several PC gaming companies, including Valve, Blizzard, and Mojang, spoke out against Windows 8 at the time (along with browser vendors and so forth). Steam OS could be viewed as a fire escape for Valve if Microsoft decided to try its luck and kill, or further deprecate, Win32 support. In the meantime, Windows PCs could stream to it until Linux gained a sufficient catalog of software.

microsoft-2016-windowsrt.png

Image Credit: Wikipedia

This is where Steam OS gets interesting. Its software library cannot compete against Windows with its full catalog of Win32 applications, at least not for a long time. On the other hand, if Microsoft continues to support Win32 as a first-class citizen, and they returned to the level of openness with software vendors that they had in the Windows XP era, then Valve doesn't really have a reason to care about Steam OS as anything more than a hobby anyway. Likewise, if doomsday happens and something like Windows RT ends up being the future of Windows, as many feared, then Steam OS wouldn't need to compete against Windows. Its only competition from Microsoft would be Windows Store apps and first-party software.

I would say that Valve might even have a better chance than Microsoft in that case.

Author:
Subject: Processors
Manufacturer: AMD

AMD Keeps Q1 Interesting

CES 2016 was not a watershed moment for AMD.  They showed off their line of current video cards and, perhaps more importantly, showed off working Polaris silicon, which will be their workhorse for 2016 in the graphics department.  They did not show off Zen, a next generation APU, or any AM4 motherboards.  Nothing revolutionary was presented on the CPU and APU side.  What they did show off, however, hinted at the things to come to help keep AMD relevant in the desktop space.

AMD_NewQ1.jpg

It was odd to see an announcement about a stock cooler, but the more we learned about it, the more important it seemed for AMD’s reputation moving forward.  The Wraith cooler is a new unit to help control the noise and temperatures of the latest AMD CPUs and select APUs.  This is a fairly beefy unit with a large, slow moving fan that produces very little noise.  This is a big change from the variable speed fans on previous coolers, which could get rather noisy while still leaving temperatures higher than comfortable.  There has been some derision aimed at AMD for providing “just a cooler” for their top end products, but it is a push that makes them more user and enthusiast friendly without breaking the bank.

Socket AM3+ is not dead yet.  Though we have been commenting on the health of the platform for some time, AMD and its partners continue to improve and iterate upon these products, adding technologies such as USB 3.1 and M.2 support.  While these chipsets are limited to PCI-E 2.0 speeds, the four lanes available to most M.2 controllers allow these boards to provide substantial bandwidth to the latest NVMe based M.2 drives.  We likely will not see a faster refresh on AM3+, but we will see new SKUs utilizing the Wraith cooler as well as a price break for the processors that exist in this socket.
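
For a rough sense of the ceiling involved - a back-of-the-envelope sketch, not a benchmark - PCIe 2.0 moves 5 GT/s per lane with 8b/10b encoding, so a four-lane M.2 link tops out around 2 GB/s:

```python
# PCIe 2.0 runs at 5 GT/s per lane; 8b/10b encoding leaves 80% as payload.
def pcie2_bandwidth_gb_s(lanes):
    return lanes * 5e9 * 0.8 / 8 / 1e9   # payload bits/s -> usable GB/s

print(pcie2_bandwidth_gb_s(4))   # 2.0 -- roughly 2 GB/s for an x4 link
```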

Click here to continue reading about AMD's latest offerings for Q1 2016!

Subject: Storage
Manufacturer: Gigabyte

Introduction

NVMe was a great thing to happen to SSDs. The per-IO reduction in latency and CPU overhead was more than welcome, as PCIe SSDs were previously using the antiquated AHCI protocol, a carryover from the SATA HDD days. With NVMe came additional required support in operating systems and UEFI BIOS implementations. We did some crazy experiments with arrays of these new devices, but we were initially limited by the lack of native hardware-level RAID support to tie multiple PCIe devices together. The launch of the Z170 chipset remedied this by including the ability to tie as many as three PCIe SSDs into a chipset-configured array. The recent C600 server chipset also saw the addition of RSTe capability, expanding this functionality to enterprise devices like the Intel SSD P3608, which was actually a pair of SSDs on a single PCB.

Most Z170 motherboards have come with one or two M.2 slots, meaning that enthusiasts wanting to employ the 3x PCIe RAID made possible by this new chipset would have to get creative with the use of interposer / adapter boards (or use a combination of PCIe and U.2 connected Intel SSD 750s). With the Samsung 950 Pro available, as well as the slew of other M.2 SSDs we saw at CES 2016, it’s safe to say that U.2 is going to push back into the enterprise sector, leaving M.2 as the choice for consumer motherboards moving forward. It was therefore only a matter of time before a triple-M.2 motherboard was launched, and that just recently happened - behold the Gigabyte Z170X-SOC Force!

160128-170345.jpg

This new motherboard sits at the high end of Gigabyte’s lineup, with a water-capable VRM cooler and other premium features. We will be passing this board on to Morry for a full review, but this piece will be focusing on one section in particular:

160128-170427.jpg

I have to hand it to Gigabyte for this functional and elegant design choice. The spacing between the four required full-length PCIe slots looks as if it was chosen specifically to fit M.2 SSDs between them. I should also note that it would be possible to use three U.2 adapters linked to three U.2 Intel SSD 750s, but native M.2 devices make for a significantly more compact and consumer-friendly package.

160122-181745.jpg

With the test system set up, let’s get right into it, shall we?

Read on for our look at triple M.2 in action!

Author:
Manufacturer: Dell

Overview

Dell has never exactly been a brand that gamers gravitate towards. While we have seen some very high quality products out of Dell in the past few years, including the new XPS 13, and people have loved their Ultrasharp monitor line, neither of these targets gamers directly. Dell acquired Alienware in 2006 in order to enter the gaming market and continues to make some great products, but those retain the Alienware branding. It seems to me that a gaming-centric notebook with just the Dell brand could be a hard sell.

However, that's exactly what we have today with the Dell Inspiron 15 7000. Equipped with an Intel Core i5-6300HQ and NVIDIA GTX 960M for $799, has Dell created a contender in the entry-level gaming notebook race?

IMG_4126.JPG

For years, the Inspiron line has been Dell's entry level option for notebooks, and it consequently has a questionable reputation for quality and lifespan. With the Inspiron 15 7000 being the most expensive product offering in the Inspiron line, though, I was excited to see if it could sway my opinion of the brand.

Click here to continue reading about the Dell Inspiron 15 7000!

Manufacturer: Corsair

Introduction and First Impressions

The new Corsair Carbide 600Q and 600C enclosures are the company's first inverted ATX designs, and the layout promises improved airflow for better cooling.

DSC_0228.jpg

The Carbide Series from Corsair has encompassed enclosures from the company's least expensive budget-friendly options such as the $59 Carbide 100R, to high-performance options like the $159 Carbide Air 540. This new Carbide 600 enclosure is available in two versions, the 600C and 600Q, which both carry an MSRP of $149. This positions the 600C/600Q enclosures near the Graphite and Obsidian series models, but this is only fitting as there is nothing "budget" about these new Carbide 600 models.

The Carbide Series 600Q we have in for review differs from the 600C most obviously in its lack of the latter's hinged, latching side panel, which also contains a large window. But the differences extend to the internal makeup of the enclosure, as the 600Q includes significant noise damping inside the front, top, and side panels. We'll be taking a close look at noise levels along with thermal performance of this "Q" version of the new enclosure in our review.

Carbide_600Q_AIRFLOW.jpg

Continue reading our review of the Corsair Carbide Series 600Q enclosure!!

Author:
Subject: General Tech
Manufacturer: Logitech G

My new desk mate

Earlier this month at the 2016 edition of the Consumer Electronics Show, Logitech released a new product for the gaming market that might have gone unnoticed by some. The G502 Proteus Spectrum is a new gaming mouse that takes an amazing product and makes it just a little better with the help of some RGB goodness. The G502 Proteus Core has been around for a while now and has quickly become one of the best selling gaming mice on Amazon, a testament to its quality and popularity. (It has been as high as #1 overall in recent days.)

01.jpg

We have been using the G502 Proteus Core in our gaming test beds at the office for some months, and during that time I often lamented that I wanted to upgrade the mouse on my own desk to one. While I waited for myself to stop being lazy and simply swap one in for the G402 currently in use at my workstation, Logitech released the new G502 Proteus Spectrum and handed me a sample at CES to bring home. Perfect!

02.jpg

  Logitech G502 Proteus Spectrum Specifications
Resolution 200 - 12,000 DPI
Max Acceleration >40G
Max Speed >300 IPS
USB Data 16 bits/axis
USB Report Rate 1000 Hz (1 ms)
Processor 32-bit ARM
Button rating 20 million clicks
Feet rating 250 kilometers
Price $79 - Amazon.com

The G502 Proteus Spectrum is very similar to the Core model, with the only difference being the addition of an RGB light under the G logo and DPI resolution indicators. This allows you to use the Logitech Gaming Software to customize its color, its pattern (breathing, still or rotating) as well as pair it up and sync with the RGB lights of other Logitech accessories you might have. If you happen to own a Logitech G910 or G410 keyboard, or one of the new headsets (G633/933) then you'll quickly find yourself in color-coordinated heaven.

03.jpg

In the box you'll find the mouse, attached to a lengthy cable that works great even with my standing desk, and a set of five weights that you can install on the bottom if you like a heavier feel to your mousing action. I installed as many as I could under the magnetic door on the underside of the mouse and definitely prefer it. The benefit of the weights (as opposed to just a heavier mouse out of the box) is that users can customize it as they see fit.

Continue reading our short review of the Logitech G502 Proteus Spectrum mouse!!

Manufacturer: Scythe

Introduction and First Impressions

The Scythe Ninja 4 (SCNJ-4000) is the latest model in the Ninja series, and an imposing air cooler with dimensions similar to Noctua's massive NH-D14. But there's more to the story than size, as this is engineered for silence above all else. Read on to see just how quiet it is, and of course how well it's able to cope with CPU loads.

NINJA4-Side_View.jpg

"The Ninja 4 is the latest model in the Ninja CPU Cooler Series, developed for uncompromising performance. It features the new T-M.A.P.S technology, an optimized alignment of heatpipes, and the back-plate based Hyper Precision Mounting System (H.P.M.S) for firm mounting and easy installation procedure. These improvements and a special, adjustable Glide Stream 120mm PWM fan result in an increased cooling performance while reducing the weight compared to his predecessor. Also the design of the heat-sink allows fan mounting on all four sides. This enables the optimal integration of the Ninja 4 in the air flow of the pc-case and reduces turbulence and the emergence of hotspots."

The Ninja 4 is built around a very large, square heatsink, which allows the single 120 mm fan to be mounted on any side, and this PWM fan offers three speed settings to further control noise. And noise is what the Ninja is all about, with some very low minimum speeds possible on what is a very quiet Scythe fan to begin with.

NINJA4-Frontview.jpg

Will a single low-speed fan design affect the ability to keep a CPU cool under stress? Will the Ninja 4's fan spin up and become less quiet under full load? These questions will soon be answered.

Continue reading our review of the Scythe Ninja 4 CPU cooler!!

Author:
Subject: Editorial
Manufacturer: AMD

Fighting for Relevance

AMD is still kicking.  While the results of this past year have been forgettable, they have overcome some significant hurdles and appear to be improving their position, cutting costs while extracting as much revenue as possible.  There were plenty of ups and downs this past quarter, but compared to the rest of 2015 there were some solid steps forward.

AMD-Logo.jpg

The company reported revenues of $958 million, down from $1.06 billion last quarter.  The company also recorded a $103 million loss, but that is down significantly from the $197 million loss the quarter before.  Q3 did have a $65 million write-down due to unsold inventory.  Though the company made far less in revenue, they also shored up their losses.  The company is still bleeding money, but they have plenty of cash on hand to survive the next several quarters.  In non-GAAP terms, AMD reports a $79 million loss for this past quarter.

For the entire year AMD recorded $3.99 billion in revenue with a net loss of $660 million.  This is down from FY 2014 revenues of $5.51 billion and a net loss of $403 million.  AMD certainly is trending downwards year over year, but they are hoping to reverse that come 2H 2016.

amd-financial-analyst-day-2015-11-1024.jpg

Graphics continues to be solid for AMD, as they increased their sales from last quarter but are down year on year.  Holiday sales were brisk, but with only the high end Fury series as a truly new card this season, its impact was not as great as the company would have seen from a new mid-range series like the newly introduced R9 380X.  The second half of 2016 will see the introduction of the Polaris based GPUs for both mobile and desktop applications.  Until then, AMD will continue to provide the current 28 nm lineup of GPUs to the market.  At this point we are under the assumption that AMD and NVIDIA are looking at the same timeframe for introducing their next generation parts due to process technology advances.  AMD already has working samples on Samsung/GLOBALFOUNDRIES’ 14nm LPP (Low Power Plus) process that they showed off at CES 2016.

Click here to continue reading about AMD's Q4 2015 and FY 2015 results!