Introduction and Technical Specifications
Courtesy of Reeven
Reeven is a newcomer to the computer cooling space, offering a wide range of coolers and fans for all aspects of the industry. The latest members of their cooler lineup, the Okeanos and Brontes, offer high performance in two very different form factors. The Okeanos is a large dual-tower, dual-fan cooler, while the Brontes is a low-profile C-shaped cooler touting compatibility across most form factors. While the Okeanos supports all current CPU and board types, the Brontes supports all mainstream systems with the exception of Intel LGA 2011-based CPUs and boards.
Things are about to get...complicated
Earlier this week, the team behind Ashes of the Singularity released an updated version of its early access game, expanding its features and capabilities. With support for DirectX 11 and DirectX 12, and the addition of multiple graphics card support, the game featured a benchmark mode that got quite a lot of attention. We saw stories based on that software posted by AnandTech, Guru3D and ExtremeTech, all of which had varying views on the advantages of one GPU or another.
That isn’t the focus of my editorial here today, though.
Shortly after the initial release, a discussion began around results from the Guru3D story that measured frame time consistency and smoothness with FCAT, a capture-based testing methodology much like the Frame Rating process we have here at PC Perspective. In a post on ExtremeTech, Joel Hruska claims that the results and conclusion from Guru3D are wrong because the FCAT capture method assumes the captured output matches what the user actually experiences. Maybe everyone is wrong?
First a bit of background: I have been working with Oxide and the Ashes of the Singularity benchmark for a couple of weeks, hoping to get a story that I was happy with and felt was complete, before having to head out the door to Barcelona for the Mobile World Congress. That didn’t happen – such is life with an 8-month old. But, in my time with the benchmark, I found a couple of things that were very interesting, even concerning, that I was working through with the developers.
FCAT overlay as part of the Ashes benchmark
First, the initial implementation of the FCAT overlay (which Oxide should be PRAISED for including, since we don't have, and likely won't have, a universal DX12 variant) was done incorrectly, with duplicated color swatches that made the results from capture-based testing inaccurate. I don't know if Guru3D used that version for its FCAT testing, but I was able to get updated EXEs of the game from the developer in order to get the overlay working correctly. Once that was corrected, I found yet another problem: an issue of frame presentation order on NVIDIA GPUs that likely has to do with asynchronous shaders. Whether that issue is on the NVIDIA driver side or the game engine side is still being investigated by Oxide, but it's interesting to note that this problem couldn't have been found without a proper FCAT implementation.
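To see why duplicated swatches matter, here is a small sketch (not the actual FCAT tooling) of how capture-based analysis reads the colored overlay bar: each rendered frame is tagged with the next color in a repeating palette, and the analysis walks the captured scanlines top to bottom, treating each color change as a frame boundary. The color names and scanline counts below are made up for the example.

```python
def count_frames(scanline_colors):
    """Count rendered frames visible in one captured video frame."""
    frames = 1
    for prev, cur in zip(scanline_colors, scanline_colors[1:]):
        if cur != prev:
            frames += 1  # a color change means a new frame began scanning out
    return frames

# A correct overlay: three rendered frames are visible in this capture.
good = ["white"] * 400 + ["lime"] * 500 + ["blue"] * 180
print(count_frames(good))    # 3

# A broken overlay that repeats a swatch across consecutive frames:
# two real frames merge into one, and the analysis under-counts.
broken = ["white"] * 400 + ["white"] * 500 + ["blue"] * 180
print(count_frames(broken))  # 2
```

With a broken overlay, every duplicated swatch makes one frame invisible to the analysis, which skews both frame rates and frame time variance.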
With all of that water under the bridge, I set out to benchmark this latest version of Ashes and DX12 to measure performance across a range of AMD and NVIDIA hardware. The data showed some abnormalities, though. Some results just didn't make sense in the context of what I was seeing in the game and what the overlay results were indicating. It appeared that Vsync (vertical sync) was working differently than in any other game I had seen on the PC.
For the NVIDIA platform, tested using a GTX 980 Ti, the game seemingly randomly started up with Vsync on or off, with no clear indicator of the cause, despite the in-game settings being set how I wanted them. But the Frame Rating capture data was still working as I expected; just because Vsync is enabled doesn't mean you can't look at the results from capture-based testing. I have written stories on what Vsync-enabled captured data looks like, and what it means, as far back as April 2013. Obviously, to get the best and most relevant data from Frame Rating, setting vertical sync off is ideal. Running into more frustration than answers, I moved over to an AMD platform.
Introduction and Technical Specifications
***Editor's Note*** - Before getting into the nuts and bolts of the Core X9, please understand that this initial review is meant as a detailed introduction to the capabilities and build strengths of the Core X9 E-ATX Cube Chassis. A deeper look into the advanced capabilities of this monstrous case will be explored in a soon-to-be-released follow-up article. Stay tuned to PC Perspective for the follow-up.
Courtesy of Thermaltake
The Thermaltake Core X9 E-ATX Cube Chassis is one of the largest and most configurable cases they've developed. The case is roughly cube shaped with a steel and plastic construction. The height and depth of the unit allows the Core X9 to support up to quad-fan radiators mounted to its top or sides and up to a tri-fan radiator in front. At an MSRP of $169.99, the Core X9 E-ATX Cube Chassis features a competitive price in light of its size and configurability.
Courtesy of Thermaltake
The Core X9 case was designed to be fully modular, supporting a variety of build configurations to adapt to whatever build style the end user can dream up. The case comes with a variety of mounts for fans or liquid cooling radiators on the top, side, or bottom of the case. Additionally, Thermaltake integrated three 5.25" device bays as well as two hard drive bays supporting up to three drives each. The motherboard tray is removable as well, for easy installation of the motherboard outside the chassis. The chassis itself can be easily segregated into upper and lower sections for controlling system and component heat flow if desired.
Courtesy of Thermaltake
Until you can accurately visualize just how many radiators and fans this case supports, you don't really have a feel for the immense size of the Core X9. From front to back, the case supports four 120mm fans or a 480mm radiator along either of its lower sides or in the dual top mounts. On top, you can actually mount a total of eight 120mm fans or dual 480mm radiators if you so choose. And that doesn't take into account the additional two 140mm fans that can be mounted in the upper and lower sections of the case's rear panel, nor the three 120mm fans, dual 200mm fans, or 360mm radiator that can be mounted to the case's front panel.
3D printing has been an interest of the staff here at PC Perspective for a few years now. Seeing how inexpensive it has gotten to build your own or buy an entry level 3D printer we were interested in doing some sort of content around 3D printing, but we weren't quite sure what to do.
However, an idea arose after seeing Monoprice's new 3D printer offerings at this year's CES: what if we put Ryan, someone who has no idea how 3D printers actually work, in front of one of their entry-level models and told him to print something?
Late last week we received the Maker Select 3D printer from Monoprice, along with a box full of different types of filament and proceeded to live stream us attempting to make it all work.
And thus, the new series "Ryan 3D Prints" was born.
While I've had some limited 3D printing experience in the past, Ryan honestly went into this knowing virtually nothing about the process.
The Maker Select printer isn't ready to print out of the box and requires a bit of assembly, with the setup time from unboxing to first print ultimately taking about 90 minutes for us on the stream. Keep in mind that we were going pretty slow and attempting to explain as best as we could as we went, so someone working by themselves could probably get up and running a bit quicker.
I was extremely impressed with how quickly we were printing successful, high-quality objects. Beyond having to take a second try at leveling the print bed, we ran into no issues during setup.
Monoprice includes a microSD card with 4 sample models you can print on the Maker Select, and we went ahead and printed an example of each. While I don't know at what resolution these models were sliced, I am impressed with the quality considering the $350 price tag on the Maker Select.
This certainly isn't the end of our 3D printing experience. Our next steps involve taking this printer and hooking it up to a PC and attempting to print our own models with an application like Cura.
Beyond that, we plan to compare different types of filament, take a look at the Dual Extruder Monoprice printer, and maybe even future offerings like the SLA printer they showed off at CES. Stay tuned to see what we end up making!
Gaming headsets are an ever-growing segment, with seemingly every hardware company offering their own take on this popular concept these days. Logitech is far from a new player in this space, with a number of headsets on the market over the years. Their most recent lineup included the top-end G930, and this headset has been superseded by the new G933 (wireless) and G633 (wired) models. We’ll take a look - and listen - in this review.
With the new Artemis Spectrum headsets Logitech is introducing their new 40 mm Pro-G drivers, which the company says will offer high-fidelity sound:
"Patent pending advanced Pro-G audio drivers are made with hybrid mesh materials that provide the audiophile-like performance gaming fans have been demanding. From your favorite music to expansive game soundtracks, the Pro-G drivers deliver both clean and accurate highs as well as a deep rich bass that you would expect from premium headphones."
More than a pair of stereo headphones, of course, the Artemis Spectrum G933 and G633 feature (simulated) 7.1 channel surround via selectable Dolby or DTS Headphone:X technology. How convincing this effect might be is a focus of the review, and we will take a close look at audio performance.
While these two pairs of gaming headphones might look identical, the G933 differentiates itself from the G633 by offering 2.4 GHz wireless capability. Both headsets also feature two fully customizable RGB lighting zones, with 16.8 million colors controlled through the Logitech Gaming Software on your PC. But a computer isn't required to use these headsets; both the G933 and G633 are fully compatible with the Xbox One and PlayStation 4, and with a 3.5 mm audio cable (included with both) they can be used as a stereo headset with just about anything, including smartphones.
Introduction, Specifications and Packaging
Around this same time last year, Samsung launched their Portable SSD T1. This was a nifty little external SSD with some very good performance and capabilities. Despite its advantages and the cool factor of having a thin and light 1TB SSD barely noticeable in your pocket, there was some feedback from consumers that warranted a few tweaks to the design. There was also the need for a new line as Samsung was switching over their VNAND from 32 to 48 layer, enabling a higher capacity tier for this portable SSD. All of these changes were wrapped up into the new Samsung Portable SSD T3:
Most of these specs are identical to the previous T1, with some notable exceptions. Consumer feedback prompted a newer, heavier metal housing, as the T1 (coming in at only 26 grams) was almost too light. With that newer housing came a slight increase in dimensions. We will do some side-by-side comparisons later in the review.
Part 1 - Picking the Parts
I'm guilty. I am one of those PC enthusiasts that thinks everyone knows how to build a PC. Everyone has done it before, and all you need from the tech community is the recommendation for parts, right? Turns out that isn't the case at all, and as more and more gamers and users come into our community, they are overwhelmed and often underserved. It's time to fix that.
This cropped up for me personally when my nephew asked me about getting him a computer. At just 14 years old, he had never built a PC, watched a PC be constructed - nothing of that sort. Even though his uncle had built computers nearly every week for 15 years or more, he had little to no background on what the process was like. I decided that this was the perfect opportunity to teach him and create a useful resource for the community at large to help empower another generation to adopt the DIY mindset.
I decided to start with three specific directions:
- Part 1 - Introduce the array of PC components, what the function of each is and why we picked the specific hardware we did.
- Part 2 - Show him the process of actual construction from CPU install to cable routing
- Part 3 - Walk through the installation of Windows and get him set up with Steam and the idea of modern PC gaming.
Each of the above sections was broken up into a separate video during our day at the office, and will be presented here and on our YouTube channel.
I would like to thank Gigabyte for sponsoring this project with us, providing the motherboard and graphics card and helping work with the other vendors to get us a great combination of hardware. Visit them at Gigabyte.com for the full lineup of motherboards, graphics cards and more!
Part 1 - Picking the Parts
Selecting the parts to build a PC can be a daunting task for a first timer. What exactly is a motherboard and do you need one? Should you get 2 or 4 or more memory modules? SSD vs HDD? Let's lay it all out there for you.
The specific configuration used in Austin's PC build is pretty impressive!
|Austin's First PC Build| |
|---|---|
|Processor|Intel Core i5-6600K - $249|
|Motherboard|Gigabyte Z170X-Gaming 5 - $189|
|Memory|Corsair Vengeance LPX 16GB DDR4-3200 - $192|
|Graphics Card|Gigabyte GTX 970 Gaming Xtreme - $374|
|Storage|Corsair Neutron XT 480GB - $184, Western Digital 3TB Red - $109|
|Case|Corsair Obsidian 450D - $119|
|Power Supply|Corsair RM550x - $117|
|Keyboard|Logitech G910 Orion Spark - $159|
|Mouse|Logitech G602 - $51|
|Headset|Logitech G933 Artemis Spectrum - $192|
|Monitor|Acer XB280HK - $699|
|OS|Windows 10 Home - $119|
|Total Price|$2054 (not including the monitor) - Amazon.com Cart|
It's Easier to Be Convincing than Correct
This is a difficult topic to discuss. Some perspectives assume that law enforcement has terrible, Orwellian intentions. Meanwhile, law enforcement officials, with genuinely good intentions, don't understand that the road to Hell is paved with those. Bad things are much more likely to happen when human flaws are justified away, which is easy to do when your job is preventing mass death and destruction. Human beings tend, without realizing it, to use large pools of evidence to validate their assumptions rather than to discover truth.
Ever notice how essays can always find sources, regardless of thesis? With increasing amounts of data, you are progressively more likely to make a convincing argument, but not necessarily a more true one. Mix in good intentions, which promotes complacency, and mistakes can happen.
But this is about Apple. Recently, the FBI demanded that Apple create a version of iOS that can be broken into by law enforcement. Critics frequently use the term “back door,” while the government prefers other terminology. Really, words are words and the only thing that matters is what they describe -- and they describe a mechanism to compromise the device's security in some way.
This introduces several problems.
The common line that I hear is, “I don't care, because I have nothing to hide.” Well... that's wrong in a few ways. First, having nothing to hide is irrelevant if the person who wants access to your data assumes that you have something you want to hide, and is looking for evidence that convinces themselves that they're right. Second, you need to consider all the people who want access to this data. The FBI will not be the only one demanding a back door, nor will the United States as a whole. There are a whole lot of nations that trust individuals, including their own respective citizens, less than the United States does. You can expect that each of them would request a back door.
You can also expect each of them, along with organized criminals, to try to break into each other's.
Lastly, we've been here before, and what it comes down to is criminalizing math. Encryption is just a mathematical process that is easy to perform, but hard to invert. It all started because it is easy to multiply two numbers together, but hard to factor the result. The most straightforward attack is trial division: dividing by every possible number smaller than the square root of said number. If the two factors are prime, then you are stuck testing every one of those possibilities (the other prime will be greater than the square root). In the 90s, numbers over a certain size were legally classified as weapons. That may sound ridiculous, and there would be good reason for that feeling. Either way, it changed; as a result, online banks and retailers thrived.
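The asymmetry described above can be shown in a toy example (this illustrates the principle behind public-key systems like RSA, not a real implementation): multiplying two primes takes one operation, while recovering them by trial division means testing candidates all the way up to the square root of the product.

```python
import math

def trial_factor(n):
    """Find a factor of n by dividing by every candidate up to sqrt(n)."""
    for candidate in range(2, math.isqrt(n) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    return None  # n is prime

p, q = 1_000_003, 1_000_033  # two (small) primes
n = p * q                    # easy: a single multiplication
print(trial_factor(n))       # hard: roughly a million divisions to recover (p, q)
```

Even with these tiny six-digit primes, inverting the multiplication takes about a million times more work than performing it; real encryption keys use primes hundreds of digits long, where trial division becomes hopeless.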
"While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect." (Apple, "A Message to Our Customers")
Good intentions lead to complacency, which is where the road to (metaphorical) Hell starts.
Introduction, Specifications and Packaging
The steady increase in flash memory capacity per die is necessary for bringing SSD costs down, but SSDs need a minimum number of dies present to maintain good performance. Back when Samsung announced their 48-layer VNAND, their Senior VP of Marketing assured me that the performance drop that comes along with the low die count present in lower capacity models would be dealt with properly. At the time, Unsoo Kim mentioned the possibility of Samsung producing 128Gbit 48-layer VNAND, but it now appears that they have opted to put everything into 256Gbit on the 3D side. Fortunately, they still have a planar (2D) NAND production line going, and they will be using that same flash in a newer line of low capacity models. When their 850 Series transitions over to 48-layer (enabling 2TB capacities), Samsung will drop the 120GB capacity of that line and replace it with a new OEM / system builder destined 750 EVO:
The SSD 750 EVO Series is essentially a throwback to the 840 EVO, but without all of the growing pains experienced by that line. Samsung assured me that the same corrections that ultimately fixed the long-term read-based slow down issues with the 840 EVO also apply to the 750 EVO, and despite the model number being smaller, these should actually perform a bit better than their predecessor. Since it would be silly to just launch a single 120GB capacity to make up for the soon to be dropped 850 EVO 120GB, we also get a 250GB model, which should make for an interesting price point.
Baseline specs are very similar to the older 840 EVO series, with some minor differences (to be shown below). There are some unlisted specs that are carried over from the original series. For those we need to reference the slides from the 840 EVO launch:
Caught Up to DirectX 12 in a Single Day
I'm not just talking about the specification. Members of the Khronos Group have also released compatible drivers, SDKs and tools to support them, conformance tests, and a proof-of-concept patch for Croteam's The Talos Principle. To reiterate, this is not a soft launch. The API, and its entire ecosystem, is out and ready for the public on Windows (at least 7+ at launch but a surprise Vista or XP announcement is technically possible) and several distributions of Linux. Google will provide an Android SDK in the near future.
I'm going to editorialize for the next two paragraphs. There was a concern that Vulkan would be too late. The thing is, as of today, Vulkan is just as mature as DirectX 12. Of course, that could change at a moment's notice; we still don't know how the two APIs are being adopted behind the scenes. A few DirectX 12 titles are planned to launch in the next few months, but no full, released, non-experimental, non-early-access game currently exists. Each time I say this, someone links the Wikipedia list of DirectX 12 games. If you look at each entry, though, you'll see that all of them are either early access, awaiting an unreleased DirectX 12 patch, or using a third-party engine (like Unreal Engine 4) that only lists DirectX 12 as an experimental preview. Besides, if the latter counts, then you'll need to accept The Talos Principle's proof-of-concept patch, too.
But again, that could change. While today's launch speaks well to the Khronos Group and the API itself, it still needs to be adopted by third party engines, middleware, and software. These partners could, like the Khronos Group before today, be privately supporting Vulkan with the intent to flood out announcements; we won't know until they do... or don't. With the support of popular engines and frameworks, dependent software really just needs to enable it. This has not happened for DirectX 12 yet, and, now, there doesn't seem to be anything keeping it from happening for Vulkan at any moment. With the Game Developers Conference just a month away, we should soon find out.
But back to the announcement.
Vulkan-compatible drivers are launching today across multiple vendors and platforms, but I do not have a complete list. On Windows, I was told to expect drivers from NVIDIA for Windows 7, 8.x, 10 on Kepler and Maxwell GPUs. The standard is compatible with Fermi GPUs, but NVIDIA does not plan on supporting the API for those users due to its low market share. That said, they are paying attention to user feedback and they are not ruling it out, which probably means that they are keeping an open mind in case some piece of software gets popular and depends upon Vulkan. I have not heard from AMD or Intel about Vulkan drivers as of this writing, one way or the other. They could even arrive day one.
On Linux, NVIDIA, Intel, and Imagination Technologies have submitted conformant drivers.
Drivers alone do not make a hard launch, though. SDKs and tools have also arrived, including the LunarG SDK for Windows and Linux. LunarG is a company co-founded by Jens Owen, who had a previous graphics software company that was purchased by VMware. LunarG is backed by Valve, which also backed Vulkan in several other ways. The LunarG SDK helps developers validate their code, inspect what the API is doing, and otherwise debug. Even better, it is also open source, which means that the community can rapidly enhance it, even though it's already in a releasable state. RenderDoc, the open-source graphics debugger developed at Crytek, will also add Vulkan support. ((Update (Feb 16 @ 12:39pm EST): Baldur Karlsson has just emailed me to let me know that it was a personal project at Crytek, not a Crytek project in general, and their GitHub page is much more up-to-date than the linked site.))
The major downside is that Vulkan (like Mantle and DX12) isn't simple.
These APIs are verbose and very different from previous ones, which requires more effort.
Image Credit: NVIDIA
There really isn't much to say about the Vulkan launch beyond this. What graphics APIs really try to accomplish is standardizing signals that enter and leave video cards, such that the GPUs know what to do with them. For the last two decades, we've settled on an arbitrary, single, global object that you attach buffers of data to, in specific formats, and call one of a half-dozen functions to send it.
Compute APIs, like CUDA and OpenCL, decided it was more efficient to handle queues, allowing the application to write commands and send them wherever they need to go. Multiple threads can write commands, and multiple accelerators (GPUs in our case) can be targeted individually. Vulkan, like Mantle and DirectX 12, takes this metaphor and adds graphics-specific instructions to it. Moreover, GPUs can schedule memory, compute, and graphics instructions at the same time, as long as the graphics task has leftover compute and memory resources, and / or the compute task has leftover memory resources.
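The command-queue metaphor described above can be sketched with plain Python stand-ins (no real Vulkan bindings; names like CommandBuffer and the queue structure are illustrative, not actual Vulkan API calls): multiple CPU threads record their own command buffers independently, and submission to a device queue is the only shared step.

```python
from concurrent.futures import ThreadPoolExecutor
from queue import Queue

class CommandBuffer:
    """Stand-in for a recorded list of GPU commands."""
    def __init__(self):
        self.commands = []
    def record(self, command):
        self.commands.append(command)

device_queue = Queue()  # stands in for a hardware queue on one GPU

def record_scene_chunk(object_ids):
    # Each thread records draw commands for its own slice of the scene,
    # with no locking or global state involved in the recording itself.
    cmd_buf = CommandBuffer()
    for obj in object_ids:
        cmd_buf.record(f"draw object {obj}")
    device_queue.put(cmd_buf)  # submission is the only shared step

objects = list(range(8))
with ThreadPoolExecutor(max_workers=4) as pool:
    for start in range(0, len(objects), 2):
        pool.submit(record_scene_chunk, objects[start:start + 2])

# The "GPU" drains the queue; all 8 draws arrive, recorded in parallel.
total_draws = 0
while not device_queue.empty():
    total_draws += len(device_queue.get().commands)
print(total_draws)
```

Contrast this with the older model, where every thread would have to funnel its draw calls through one global context; here the expensive recording work parallelizes, and only the cheap enqueue is serialized.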
This is not necessarily a “better” way to do graphics programming... it's different. That said, it has the potential to be much more efficient when dealing with lots of simple tasks that are sent from multiple CPU threads, especially to multiple GPUs (which currently require the driver to figure out how to convert draw calls into separate workloads, leading to simplifications like mirrored memory and splitting workloads across neighboring frames). Lots of simple tasks align well with video games, especially ones with lots of simple objects: strategy games, shooters with lots of debris, or any game with large crowds of people. As the API becomes ubiquitous, we'll see this bottleneck disappear, and games will no longer need to be designed around these limitations. It might even be used for drawing with cross-platform 2D APIs, like Qt, or even webpages, although those two examples (especially the Web) each have other, higher-priority bottlenecks. There are also other benefits to Vulkan.
The WebGL comparison is probably not as common knowledge as Khronos Group believes.
Still, Khronos Group was criticized when WebGL launched for being "too tough for Web developers".
It didn't need to be easy. Frameworks arrived and simplified everything. It's now ubiquitous.
In fact, Adobe Animate CC (the successor to Flash Pro) is now a WebGL editor (experimentally).
Open platforms are required for this to become commonplace. Engines will probably target several APIs from their internal management APIs, but you can't target users who don't fit in any bucket. Vulkan brings this capability to basically any platform, as long as it has a compute-capable GPU and a driver developer who cares.
Thankfully, it arrived before any competitor established market share.
A unique combo of size and resolution
We see all kinds of monitors at PC Perspective; honestly it's probably too many. It's rare when a form factor or combination of features really feels unique, but today's review of the ASUS PB328Q is exactly that. Have we seen 2560x1440 displays? Countless. More than a few VA panels have graced our test benches. And 30-32 inch monitors were the biggest rage in screen technology as far back as 2007. A refresh rate of 75Hz is no longer as novel a feature as it used to be either.
The ASUS PB328Q combines all of that into a package that stands out from other professional, low cost monitor options. The largest 2560x1440 monitor that I have used previously is 27 inches, and the 5-inch difference between that and what the PB328Q offers is an immediately obvious change. The question is, though: does the size and resolution combination, along with the panel technology, combine to form a product that is good for productivity, gaming, both, or neither? With a price of just $539 on Amazon, many users might be interested in the answer.
Here are the specifications for the ASUS PB328Q display.
|ASUS PB328Q Specifications| |
|---|---|
|Screen Size|32 inch|
|Panel Technology|VA (vertical alignment)|
|Tilt Angle|-5 to +20 degrees|
|Standard Refresh Rate|75 Hz|
|Colors Supported|1073.7M (10-bit) with 12-bit Look-up Table|
|Contrast Ratio|100,000,000:1 (ASCR)|
|Tearing Prevention Technology|None|
|Speakers|3W x 2 Stereo RMS|
|3.5mm Audio Output|Yes|
|Package Contents|Dual-link DVI cable, USB 3.0 cable|
For those new to VA panel technology, it helps to have some background before we start testing the PB328Q. Vertical alignment panels are very good at blocking the backlight coming through the screen to the user's eyes, making them excellent at producing strong blacks and high contrast ratios when compared to other LCD technology. VA also offers vastly improved color reproduction and viewing angles, falling above TN and (usually) below IPS screens in that area.
Introduction and First Impressions
The Enthoo EVOLV ITX is not a new enclosure, but this striking color scheme - black with a glossy red interior - is. We'll take a thorough look at this mini-ITX enclosure in this review, and see how well it performs enclosing a gaming build.
The EVOLV series from Phanteks includes ATX, micro-ATX, and this mini-ITX version, with all three sharing a common design language, though some of the features naturally differ. With this smallest design, Phanteks decided to retain enough size to permit the use of standard components, with room for ATX power supplies, full-length graphics cards, and liquid CPU cooling with up to a 280 mm radiator.
The EVOLV ATX was my first experience with a Phanteks enclosure, and I was impressed with the build quality and thoughtful design touches. There is a different approach to building with mini-ITX that introduces new elements, including the ability of a system to remain cool and quiet with components in much tighter quarters.
Introduction and Features
(Courtesy of EVGA)
EVGA continues to expand their already huge PC power supply line with the introduction of the GQ series, which are aimed at price conscious consumers who want good value while still maintaining many of the performance features found in EVGA’s premium models. The GQ Series contains four models ranging from 650W up to 1000W: the EVGA 650 GQ, 750 GQ, 850 GQ and 1000 GQ. We will be taking a detailed look at the 750 GQ in this review.
The GQ series power supplies are 80 Plus Gold certified for high efficiency and feature all modular cables, high-quality Japanese brand capacitors, and a quiet 135mm cooling fan with a fluid dynamic bearing. All GQ series power supplies are NVIDIA SLI and AMD Crossfire Ready and are backed by a 5-year warranty.
EVGA 750W GQ PSU Key Features:
• Fully modular cables to reduce clutter and improve airflow
• 80 PLUS Gold certified, with up to 90%/92% efficiency (115VAC/240VAC)
• 100% Japanese brand capacitors ensure long-term reliability
• Quiet 135mm Fluid Dynamic bearing fan for reliability and quiet operation
• ECO Intelligent Thermal Control allows silent, fan-less operation at low power
• NVIDIA SLI & AMD Crossfire Ready
• Ready for 4th Generation Intel Core Processors (C6/C7 Idle Mode)
• Compliant with ErP Lot 6 2013 Requirement
• Active Power Factor correction (0.99) with Universal AC input
• 5-Year warranty and EVGA Customer Support
EVGA was founded in 1999 with headquarters in Brea, California. They continue to specialize in producing NVIDIA based graphics adapters and Intel based motherboards and keep expanding their PC power supply product line, which currently includes thirty-eight models ranging from the high-end 1,600W SuperNOVA T2 to the budget minded EVGA 400W power supply.
(Courtesy of EVGA)
As you can see in the table above, EVGA currently offers seven different variations of 750W power supplies. Let’s get started with the review and see what makes this new 750W GQ model stand out from the rest.
28HPCU: Cost Effective and Power Efficient
Have you ever been approached about something and upon first hearing about it, the opportunity just did not seem very exciting? Then upon digging into things, it became much more interesting? This happened to me with this announcement. At first blush, who really cares that ARM is partnering with UMC at 28 nm? Well, once I was able to chat with the people at ARM, it is much more interesting than initially expected.
The new hotness in fabrication is the latest 14 nm and 16 nm processes from Samsung/GF and TSMC, respectively. It has been a good 4+ years since we last had a new process node that actually performed as expected. The planar 22/20 nm products just were not entirely suitable for mass production. Apple was one of the few to actually develop a part for TSMC’s 20 nm process that sold in the millions. The main problem was a lack of power and speed scaling as compared to 28 nm processes. Planar was a bad choice at that geometry, but FinFET technology had not been implemented in time to show up at those nodes from third-party manufacturers.
There is a problem with the latest process generations, though. They are new, expensive, and production constrained. They also may not be entirely appropriate for the applications being developed. By comparison, 28 nm has several strengths. It is a mature process with an excess of line space, and the major fabs are offering very competitive pricing structures for 28 nm as space clears up on the lines, with higher-end SOCs, GPUs, and assorted ASICs migrating to the new process nodes.
TSMC has typically been at the forefront of R&D on advanced nodes. UMC is not as aggressive with its development; it tends to let others do the heavy lifting and then integrate new nodes when they fit its pricing and business models. TSMC is on its third generation of 28 nm. UMC is on its second, but that generation encompasses many of the advanced features of TSMC's third generation, so it is actually quite competitive.
Early testing for higher end GPUs
UPDATE 2/5/16: Nixxes released a new version of Rise of the Tomb Raider today with some significant changes. I have added another page at the end of this story that looks at results with the new version of the game, a new AMD driver and I've also included some SLI and CrossFire results.
I will fully admit to being jaded by the industry on many occasions. I love my PC games and I love hardware, but it takes a lot for me to get genuinely excited about anything. After hearing game reviewers talk up the newest installment of the Tomb Raider franchise, Rise of the Tomb Raider, since its release on the Xbox One last year, I've been waiting for its PC release to give it a shot with real hardware. As you'll see in the screenshots and video in this story, the game doesn't appear to disappoint.
Rise of the Tomb Raider takes the exploration and "tomb raiding" aspects that made the first games in the series successful and pairs them with the visual quality and character design brought in with the reboot of the series a couple of years back. The result is a PC game that looks stunning at any resolution, but even more so in 4K, and that pushes your hardware to its limits. In single GPU performance, even the GTX 980 Ti and Fury X struggle to keep their heads above water.
In this short article we'll look at the performance of Rise of the Tomb Raider with a handful of GPUs, leaning towards the high end of the product stack, and offer up my view on whether each hardware vendor is living up to expectations.
A mix of styles
Logitech continues its push and re-entry into the gaming peripherals market in 2016, this time adding another keyboard under the Orion brand to the mix. The Logitech G G810 Orion Spectrum is, as the name implies, an RGB mechanical keyboard using the company's proprietary Romer-G switches. But despite the similarity in model numbers to the G910 Orion Spark announced in late 2014, the G810 has some significant design and functionality changes.
This new offering is cleaner and less faceted (both in key caps and overall design), and comes much closer in feel and function to the tenkeyless G410 from last year. Let's take a look at how the G810 changes things up for Logitech G.
The G810 Orion Spectrum is a full size keyboard with tenkey (also known as the numeric keypad) that has sleeker, more professional lines than its big brother. The black finish is matte on the keys and framing, but the outside edges of the keyboard have a gloss to them. It's a very minimal part of the design though, so you shouldn't have to worry about fingerprints.
At first glance, you can see that Logitech toned down some of the gamer-centric accents when compared to either the G910 or the G410. There is no wrist rest, no PCB-trace inspired lines, no curves and no sharp edges. What you get instead is a keyboard that is equally well placed in a modern office or in an enthusiast's gaming den. To me, there are a lot of touches that remind me of the Das Keyboard - understated design that somehow makes it more appealing to the educated consumer.
This marks the first keyboard with the new Logitech G logo on it, though you are likely more concerned about the lack of G-Keys, the company's name for its macro-capable buttons on the G910. For users that still want that capability, Logitech G allows you to reprogram the function keys along the top for macro capability, and has a pretty simple switch in software to enable or disable those macros. This means you can maintain the F-row of keys for Windows applications but still use macros for gaming.
That Depends on Whether They Need One
Ars Technica UK published an editorial called, Hey Valve: What's the point of Steam OS? The article does not actually pose the question in its text -- it mostly rants about technical problems with a Zotac review unit -- but the headline is interesting nonetheless.
Here's my view of the situation.
The Death of Media Center May Have Been...
There are two parts to this story, and both center around Windows 8. The first was addressed in an editorial that I wrote last May, titled The Death of Media Center & What Might Have Been. Microsoft wanted to expand the PC platform into the living room. Beyond the obvious support for movies, TV, and DVR, they also pushed PC gaming in a few subtle ways. The Games for Windows certification required games to be launchable by Media Center and support Xbox 360 peripherals, which pressured game developers to make PC games comfortable to play on a couch. They also created Tray and Play, an optional feature that allowed PC games to be played from the disc while they installed in the background. Back in 2007, before Steam and other digital distribution services really took off, this eliminated install time, which was a major user experience problem with PC gaming (and a major hurdle for TV-connected PCs).
It also had a few nasty implications. Games for Windows Live tried to eliminate modding by requiring all content to be certified (or severely limiting the tools as seen in Halo 2 Vista). Microsoft was scared about the content that users could put into their games, especially since Hot Coffee (despite being locked, first-party content) occurred less than two years earlier. You could also argue that they were attempting to condition PC users to accept paid DLC.
Regardless of whether it would have been positive or negative for the PC industry, the Media Center initiative launched with Windows Vista, which is another way of saying “exploded on the launch pad, leaving no survivors.” Windows 7 cleared the wreckage with a new team, who aimed for the stars with Windows 8. They ignored the potential of the living room PC, preferring devices and services (ie: Xbox) over an ecosystem provided by various OEMs.
If you look at the goals of Steam OS, they align pretty well with the original, Vista-era ambitions. Valve hopes to create a platform that hardware vendors could compete on. Devices, big or small, expensive or cheap, could fill all of the various needs that users have in the living room. Unfortunately, unlike Microsoft, they cannot be (natively) compatible with the catalog of Windows software.
This may seem like Valve is running toward a cliff, but keep reading.
What If Steam OS Competed with Windows Store?
Windows 8 did more than just abandon the vision of Windows Media Center. Driven by the popularity of the iOS App Store, Microsoft saw a way to end the public perception that Windows is hopelessly insecure. With the Windows Store, all software needs to be reviewed and certified by Microsoft. Software based on the Win32 API, which is all software for Windows 7 and earlier, was only allowed within the “Desktop App,” which was a second-class citizen and could be removed at any point.
This potential made the PC software industry collectively crap themselves. Mozilla was particularly freaked out, because Windows Store demanded (at the time) that all web browsers become reskins of Internet Explorer. This meant that Firefox would not be able to implement any new Web standards on Windows, because it could only present what Internet Explorer (Trident) drew. Mozilla's mission is to develop a strong, standards-based web browser that forces all others to interoperate or die.
Remember: “This website is best viewed with Internet Explorer”?
Executives from several PC gaming companies, including Valve, Blizzard, and Mojang, spoke out against Windows 8 at the time (along with browser vendors and so forth). Steam OS could be viewed as a fire escape for Valve if Microsoft decided to try its luck and kill, or further deprecate, Win32 support. In the meantime, Windows PCs could stream to it until Linux gained a sufficient catalog of software.
Image Credit: Wikipedia
This is where Steam OS gets interesting. Its software library cannot compete against Windows with its full catalog of Win32 applications, at least not for a long time. On the other hand, if Microsoft continues to support Win32 as a first-class citizen, and returns to the level of openness with software vendors that it had in the Windows XP era, then Valve doesn't really have a reason to care about Steam OS as anything more than a hobby anyway. Likewise, if doomsday happens and something like Windows RT ends up being the future of Windows, as many feared, then Steam OS wouldn't need to compete against Windows. Its only competition from Microsoft would be Windows Store apps and first-party software.
I would say that Valve might even have a better chance than Microsoft in that case.
AMD Keeps Q1 Interesting
CES 2016 was not a watershed moment for AMD. They showed off their line of current video cards and, perhaps more importantly, showed off working Polaris silicon, which will be their workhorse for 2016 in the graphics department. They did not show off Zen, a next generation APU, or any AM4 motherboards. The CPU and APU world was not presented in a way that was revolutionary. What they did show off, however, hinted at the things to come to help keep AMD relevant in the desktop space.
It was odd to see an announcement about the stock cooler that AMD was introducing, but the more we learned about it, the more important it seemed for AMD's reputation moving forward. The Wraith cooler is a new unit to help control the noise and temperatures of the latest AMD CPUs and select APUs. This is a fairly beefy unit with a large, slow moving fan that produces very little noise. This is a big change from the variable speed fans on previous coolers, which could get rather noisy while still leaving temperatures higher than is comfortable. There has been some derision aimed at AMD for providing "just a cooler" for their top end products, but it is a push that makes them more user and enthusiast friendly without breaking the bank.
Socket AM3+ is not dead yet. Though we have been commenting on the health of the platform for some time, AMD and its partners continue to improve and iterate upon these products to include technologies such as USB 3.1 and M.2 support. While these chipsets are limited to PCI-E 2.0 speeds, the four lanes available to most M.2 controllers allow these boards to provide enough bandwidth to fully utilize the latest NVMe-based M.2 drives available. We likely will not see a faster refresh on AM3+, but we will see new SKUs utilizing the Wraith cooler as well as a price break for the processors that exist in this socket.
NVMe was a great thing to happen to SSDs. The per-IO reduction in latency and CPU overhead was more than welcome, as PCIe SSDs were previously using the antiquated AHCI protocol, a carryover from the SATA HDD days. With NVMe came additional required support in operating systems and UEFI BIOS implementations. We did some crazy experiments with arrays of these new devices, but we were initially limited by the lack of native hardware-level RAID support to tie multiple PCIe devices together. The launch of the Z170 chipset remedied this by adding the ability to tie as many as three PCIe SSDs together into a chipset-configured array. The recent C600 server chipset also saw the addition of RSTe capability, expanding this functionality to enterprise devices like the Intel SSD P3608, which was actually a pair of SSDs on a single PCB.
Most Z170 motherboards have come with one or two M.2 slots, meaning that enthusiasts wanting to employ the 3x PCIe RAID made possible by this new chipset would have to get creative with interposer / adapter boards (or use a combination of PCIe and U.2-connected Intel SSD 750s). With the Samsung 950 Pro available, as well as the slew of other M.2 SSDs we saw at CES 2016, it's safe to say that U.2 is going to push back into the enterprise sector, leaving M.2 as the choice for consumer motherboards moving forward. It was therefore only a matter of time before a triple-M.2 motherboard was launched, and that just recently happened - behold the Gigabyte Z170X-SOC Force!
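For readers unfamiliar with how a striped (RAID-0) array of three SSDs actually scales bandwidth, here is a minimal conceptual sketch of the address mapping involved. This is not Intel RST code; the stripe size and drive count are illustrative assumptions, and real chipset RAID handles this in firmware and drivers.

```python
# Conceptual sketch of RAID-0 striping across three SSDs (illustrative only).
# A logical byte offset is mapped to one member drive and an offset on it.
STRIPE_SIZE = 128 * 1024   # assumed 128 KiB stripe; real arrays vary
NUM_DRIVES = 3             # e.g. three M.2 SSDs behind the Z170 chipset

def stripe_target(logical_offset: int):
    """Return (drive index, byte offset on that drive) for a logical offset."""
    stripe_index = logical_offset // STRIPE_SIZE   # which stripe we are in
    within = logical_offset % STRIPE_SIZE          # position inside the stripe
    drive = stripe_index % NUM_DRIVES              # stripes rotate across drives
    drive_offset = (stripe_index // NUM_DRIVES) * STRIPE_SIZE + within
    return drive, drive_offset

# A large sequential transfer touches every drive in turn, which is
# where the (roughly) 3x bandwidth scaling of a triple-SSD array comes from:
for off in range(0, 4 * STRIPE_SIZE, STRIPE_SIZE):
    print(off, stripe_target(off))
```

Note how consecutive stripes land on drives 0, 1, 2, and then wrap back to drive 0, so a queue of large reads keeps all three SSDs busy at once.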
This new motherboard sits at the high end of Gigabyte’s lineup, with a water-capable VRM cooler and other premium features. We will be passing this board onto Morry for a full review, but this piece will be focusing on one section in particular:
I have to hand it to Gigabyte for this functional and elegant design choice. The spacing of the four required full-length PCIe slots looks as if it was chosen specifically to fit M.2 SSDs between them. I should also note that it would be possible to use three U.2 adapters linked to three U.2 Intel SSD 750s, but native M.2 devices make for a significantly more compact and consumer-friendly package.
With the test system set up, let’s get right into it, shall we?
Dell has never exactly been a brand that gamers gravitate towards. While we have seen some very high quality products out of Dell in the past few years, including the new XPS 13, and people have loved their UltraSharp monitor line, neither of these targets gamers directly. Dell acquired Alienware in 2006 in order to enter the gaming market and continues to make some great products there, but those retain the Alienware branding. It seems to me that a gaming-centric notebook with just the Dell brand could be a hard sell.
However, that's exactly what we have today with the Dell Inspiron 15 7000. Equipped with an Intel Core i5-6300HQ and NVIDIA GTX 960M for $799, has Dell created a contender in the entry-level gaming notebook race?
For years, the Inspiron line has been Dell's entry-level option for notebooks and consequently has a questionable reputation for quality and lifespan. With the Inspiron 15 7000 being the most expensive product offering in the Inspiron line, though, I was excited to see if it could sway my opinion of the brand.