Subject: Displays
Manufacturer: ASUS

Introduction and First Impressions

The ASUS PB258Q is a "frameless" monitor that packs a full 2560x1440 resolution into a fairly compact 25-inch size, and at first glance it might appear to be a bare LCD panel affixed to a stand. This attractive design also offers 100% sRGB coverage and full height, tilt, swivel, and rotation adjustment. The price? Less than $400. We'll put it to the test to see just what kind of value to expect here.

DSC_0265.jpg

A beautiful looking monitor even with nothing on the display

The ASUS PB258Q came out of nowhere one day when I was looking to replace a smaller 1080p display on my desk. Given some pretty serious size constraints, I was hesitant to move up to the 27 - 30 inch range for 2560x1440 monitors, but I didn't want to settle for 1920x1080 again. The ASUS PB258Q intrigued me immediately, not only due to its interesting 25-inch/1440p size and resolution combination, but also for the claimed 100% sRGB coverage and fully adjustable stand. And then I looked over at the price. $376.99 from Amazon with Prime shipping? Done.

pb258q_amazon.PNG

The pricing (and compact 25-inch size) made it a more compelling choice for me than the PB278Q, ASUS's "professional graphics monitor" that uses a PLS panel, though that larger display has recently dropped in price to the $400 range. When the PB258Q arrived a couple of days later, I was first struck by how compact it is, and by how nice the monitor looked without even being powered up.

Read on for our complete review of the ASUS PB258Q frameless IPS monitor!!

Manufacturer: NVIDIA

Another Maxwell Iteration

The mainstream end of the graphics card market is about to get a bit more complicated with today’s introduction of the GeForce GTX 950. Based on a slightly cut-down GM206 chip, the same GPU used in the GeForce GTX 960 that was released almost 8 months ago, the new GTX 950 will fill a gap in the product stack for NVIDIA, resting right at $160-170 MSRP. Until today, that next-down spot from the GTX 960 was filled by the GeForce GTX 750 Ti, the very first iteration of Maxwell (we usually call it Maxwell 1) that came out in February of 2014!

Even though that is a long time to go without refreshing the GTX x50 part of the lineup, NVIDIA was likely hesitant to do so based on the overwhelming success of the GM107 for mainstream gaming. It was low cost, incredibly efficient, and didn’t require any external power to run. That led us down the path of upgrading OEM PCs with the GTX 750 Ti, an article and video that still draws hundreds of views and dozens of comments a week.

IMG_3123.JPG

The GTX 950 has some pretty big shoes to fill. I can tell you right now that it uses more power than the GTX 750 Ti, and it requires a 6-pin power connector, but it does so while increasing gaming performance dramatically. The primary competition from AMD is the Radeon R7 370, a Pitcairn GPU that is long in the tooth and missing many of the features that Maxwell provides.

And NVIDIA is taking a secondary angle with the GTX 950 launch: targeting MOBA players (DOTA 2 in particular) directly and aggressively. With the success of this style of game over the last several years, and the impressive $18M+ purse for the largest DOTA 2 tournament just behind us, there isn’t a better area of PC gaming to be going after today. But are the tweaks and changes to the card and software really going to make a difference for MOBA gamers, or is it just marketing fluff?

Let’s dive into everything GeForce GTX 950!

Continue reading our review of the NVIDIA GeForce GTX 950 2GB Graphics Card!!

Manufacturer: Intel

Core and Interconnect

The Skylake architecture is Intel’s first to get a full release on the desktop in more than two years. While that might not seem like a long time in the grand scheme of technology, for our readers and viewers it is a noticeable shift from the recent history Intel has created with its tick-tock release model. Yes, Broadwell was released last year and was a solid product, but Intel focused almost exclusively on mobile platforms (notebooks and tablets) with it. Skylake will become far more ubiquitous, and far more quickly, than even Haswell did.

Skylake represents Intel’s most scalable architecture to date. I don’t mean only frequency scaling, though that is an important part of this design, but rather scaling across market segments. Thanks to brilliant engineering and design from Intel’s Israeli group, Intel will be launching Skylake designs ranging from 4.5 watt TDP Core M solutions all the way up to the 91 watt desktop processors like the Core i7-6700K we have already reviewed. That’s a range we really haven’t seen before, and in the past Intel has depended on the Atom architecture to make up ground on the lowest power platforms. While I don’t know for sure whether Atom is finally trending towards the dodo once Skylake’s reign is fully implemented, it does make me wonder how much life is left there.

skylakeicon.jpg

Scalability also refers to package size, something that ensures the designs the engineers created can actually be built and run in the platform segments they target. Starting from the desktop designs for LGA platforms (the DIY market), which fit in a 1400 mm2 package for the 91 watt TDP implementation, Intel scales all the way down to a 330 mm2 BGA1515 package for the 4.5 watt TDP designs. Only with a package that small can you hope to get Skylake into a form factor like the Compute Stick, which is exactly what Intel is doing. And note that the smaller packages must include the platform I/O chip as well, something the H- and S-series CPUs can depend on the motherboard to integrate.

Finally, scalability also includes performance scaling. Clearly the 4.5 watt part will not offer the same performance as the 91 watt Core i7-6700K, nor does it target the same goals. The screen resolution, attached accessories, and target applications of each segment allow Intel to be selective about how much power each series of Skylake CPUs requires.

Core Microarchitecture

The fundamental design theory in Skylake is very similar to what exists today in Broadwell and Haswell, with a handful of significant changes, and hundreds of minor ones, that make Skylake a large step ahead of previous designs.

skylake-16.jpg

This slide from Julius Mandelblat, Intel Senior Principal Engineer, shows a high-level overview of the entire consumer implementation of Skylake. You can see that Intel’s goals included a bigger and wider core design, higher frequency, an improved ring architecture and fabric design, and more options for eDRAM integration. Readers of PC Perspective will already know that Skylake supports both DDR3L and DDR4 memory technologies, but the inclusion of the camera ISP is new information for us.

Continue reading our overview of the Intel Skylake microarchitecture!!

Manufacturer: Stardock

Benchmark Overview

I knew that the move to DirectX 12 was going to be a big shift for the industry. Since the introduction of the AMD Mantle API along with the Hawaii GPU architecture, we have been inundated with game developers and hardware vendors talking about the potential benefits of lower level APIs, which give game developers and game engines more direct access to GPU hardware and more flexible threading on the CPU. The results, we were told, would mean that your current hardware could take you further, and that future games and applications could fundamentally change how they are built to enhance gaming experiences tremendously.

I knew that reader interest in DX12 was outstripping my expectations when I did a live blog of the official DX12 unveil by Microsoft at GDC. In a format that consisted simply of my text commentary and photos of the slides that were being shown (no video at all), we had more than 25,000 live readers that stayed engaged the whole time. Comments and questions flew into the event – more than me or my staff could possibly handle in real time. It turned out that gamers were indeed very much interested in what DirectX 12 might offer them with the release of Windows 10.

game3.jpg

Today we are taking a look at the first real-world gaming benchmark to utilize DX12. Back in March I was able to do some early testing with an API-specific test from Futuremark’s 3DMark that evaluates the overhead implications of DX12, DX11, and even AMD Mantle. That first look at DX12 was interesting and painted an amazing picture of the potential benefits of the new API from Microsoft, but it wasn’t built on a real game engine. In our Ashes of the Singularity benchmark testing today, we finally get an early look at what a real implementation of DX12 looks like.

And as you might expect, not only are the results interesting, but there is a significant amount of controversy about what those results actually tell us. AMD has one story, NVIDIA another, and Stardock and the Nitrous engine developers yet another. It’s all incredibly intriguing.

Continue reading our analysis of the Ashes of the Singularity DX12 benchmark!!

Manufacturer: Intel

It comes after 8, but before 10

As the week of Intel’s Developer Forum (IDF) begins, you can expect to see a lot of information about Intel’s 6th Generation Core architecture, codenamed Skylake, finally revealed. When I posted my review of the Core i7-6700K, the first product based on that architecture to be released in any capacity, I was surprised that Intel was willing to ship product without the normal amount of background information for media and developers. Rather than give us the details and then ship product, which has happened for essentially every consumer product release I have been a part of, Intel did the reverse: ship a consumer friendly CPU and then promise to tell us how it all works later in the month at IDF.

Today I came across a document posted on Intel’s website that dives into very specific detail on the new Gen9 graphics and compute architecture of Skylake. Details on the Core architecture changes are not present, and instead we are given details on how the traditional GPU portion of the SoC has changed. To be clear: I haven’t had any formal briefing from Intel on this topic or anything surrounding the architecture of Skylake or the new Gen9 graphics system but I wanted to share the details we found available. I am sure we’ll learn more this week as IDF progresses so I will update this story where necessary.

What Intel calls Processor Graphics is what we simply called integrated graphics for the longest time. The purpose and role of processor graphics has changed drastically over the years, and it is now responsible not only for 3D graphics rendering but also for the compute, media, and display capabilities of the Intel Skylake SoC (when discrete add-in graphics is not used). The architecture document used to source this story focuses on Gen9 graphics, the compute architecture utilized in the latest Skylake CPUs. The Intel HD Graphics 530 on the Core i7-6700K / Core i5-6600K is the first product released and announced using Gen9 graphics and is also the first to adopt Intel’s new 3-digit naming scheme.

skylakegen9-4.jpg

This die shot of the Core i7-6700K shows the increased size and prominence of the Gen9 graphics in the overall SoC design. Containing four traditional x86 CPU cores and 1 “slice” implementation of Gen9 graphics (with three visible sub-slices we’ll describe below), this is not likely to be the highest performing iteration of the latest Intel HD Graphics technology.

skylakegen9-4.2.jpg

Like the Intel processors before it, the Skylake design utilizes a ring bus architecture to connect the different components of the SoC. This bi-directional interconnect has a 32-byte wide data bus and connects to multiple “agents” on the CPU. Each individual CPU core is considered its own agent, while the Gen9 compute architecture is considered one complete agent. The system agent bundles the DRAM memory controller, the display controller, PCI Express, and the other I/O interfaces that communicate with the rest of the PC. Any off-chip memory requests and transactions occur through this bus, while on-chip data transfers tend to be handled differently.

Continue reading our look at the new Gen9 graphics and compute architecture on Skylake!!

Manufacturer: PC Perspective

It's Basically a Function Call for GPUs

Mantle, Vulkan, and DirectX 12 all claim to reduce overhead and provide a staggering increase in “draw calls”. As mentioned in the previous editorial, the way graphics cards are loaded with tasks changes drastically in these new APIs. With DirectX 10 and earlier, applications would assign attributes to (what they are told is) the global state of the graphics card. After everything is configured and bound, one of a few “draw” functions is called, which queues the task in the graphics driver as a “draw call”.

While this suggests that just a single graphics device is to be defined, which we also mentioned in the previous article, it also implies that one thread needs to be the authority. This limitation was known for a while, and it contributed to the meme that consoles can squeeze out all the performance they have, but PCs are “too high level” for that. Microsoft tried to combat this with “Deferred Contexts” in DirectX 11. This feature allows virtual, shadow states to be loaded from secondary threads, which can then be appended to the global state, whole. It was a compromise between each thread being able to create its own commands, and the legacy decision to have a single, global state for the GPU.
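
To make that compromise concrete, here is a minimal sketch (ours, not from the article) of the deferred context pattern in C++; device creation and error handling are omitted for brevity:

    #include <d3d11.h>

    // Worker thread: record commands against a private, "shadow" state.
    void RecordOnWorkerThread(ID3D11Device* device,
                              ID3D11CommandList** outCommandList)
    {
        ID3D11DeviceContext* deferred = nullptr;
        device->CreateDeferredContext(0, &deferred);

        // State set here does not touch the global (immediate) state.
        // deferred->IASetPrimitiveTopology(...);
        // deferred->Draw(...);

        // Bake everything recorded so far into a command list.
        deferred->FinishCommandList(FALSE, outCommandList);
        deferred->Release();
    }

    // Main thread: only the immediate context can actually submit, which
    // appends the recorded commands to the single global state, whole.
    void SubmitOnMainThread(ID3D11DeviceContext* immediate,
                            ID3D11CommandList* commandList)
    {
        immediate->ExecuteCommandList(commandList, TRUE);
        commandList->Release();
    }

Recording scales across threads, but that final ExecuteCommandList funnel is exactly where the single-authority bottleneck survives.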

Some developers experienced gains, while others lost a bit. It didn't live up to expectations.

pcper-2015-dx12-290x.png

The paradigm used to load graphics cards is the problem. It doesn't make sense anymore. A developer might not want to draw a primitive with every poke of the GPU. At times, they might want to shove a workload of simple linear algebra through it, while other requests could simply be pushing memory around to set up a later task (or to read the result of a previous one). More importantly, any thread could want to do this to any graphics device.

pcper-2015-dx12-980.png

The new graphics APIs allow developers to submit their tasks quicker and smarter, and they allow the drivers to schedule compatible tasks better, even simultaneously. In fact, the driver's job has been massively simplified altogether. When we tested 3DMark back in March, two interesting things were revealed:

  • Both AMD and NVIDIA are only a two-digit percentage of draw call performance apart
  • Both AMD and NVIDIA saw an order of magnitude increase in draw calls
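
To see why, compare the model above with a hedged DirectX 12 sketch (again ours, with setup, synchronization, and error checks omitted; the device, queue, and pipeline state are assumed to already exist). Any thread can record work, and submission is just handing finished lists to a queue:

    #include <d3d12.h>

    void RecordAndSubmit(ID3D12Device* device, ID3D12CommandQueue* queue,
                         ID3D12PipelineState* pso)
    {
        ID3D12CommandAllocator* allocator = nullptr;
        ID3D12GraphicsCommandList* list = nullptr;

        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocator));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocator, pso, IID_PPV_ARGS(&list));

        // All state is bound locally to this list; there is no global
        // state for the driver to guard behind a lock.
        // list->DrawInstanced(...);
        list->Close();

        // Submission: hand the finished list to the queue.
        ID3D12CommandList* lists[] = { list };
        queue->ExecuteCommandLists(1, lists);
    }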

Read on to see what this means for games and game development.

Subject: General Tech
Manufacturer: Gigabyte

Killing those end of summer blues

As we approach the end of summer and the beginning of the life of Windows 10, PC Perspective and Gigabyte (along with Thermaltake and Kingston) have teamed up to bring our readers a system build guide and giveaway that is sure to get your gears turning. If you think that an X99-based system with an 8-core Intel Extreme processor, SLI graphics, a 480GB SSD, and 32GB of memory sounds up your alley... pay attention.

01.jpg

Deep in thought...

Even with the dawn of Skylake nearly upon us, there is no debate that the Haswell-E platform will continue to be the basis of the enthusiast's dream system for a long time. Lower power consumption is great, but nothing is going to top 8 cores, 16 threads, and all the PCI Express lanes you could need for expansion to faster storage and accessories. With that in mind, Gigabyte has partnered with PC Perspective to showcase the power of X99 and what a builder today can expect when putting together a system with a fairly high budget, but with lofty goals in mind as well.

Let's take a look at the components we are using today.

Gigabyte X99 System Build
Processor: Intel Core i7-5960X - $1048
Motherboard: Gigabyte X99 Gaming 5P - $309
Memory: Kingston HyperX Fury DDR4-2666 32GB - $325
Graphics Cards: 2 x Gigabyte G1 Gaming GTX 960 2GB - $199 each
Storage: Kingston HyperX Savage 480GB SSD - $194
Case: Thermaltake Core V51 - $82
Power Supply: Thermaltake Toughpower Grand 850 watt - $189
CPU Cooler: Thermaltake Water 3.0 Extreme S - $94
Total Price: $1591 - Amazon full cart (except CPU)
$1048 - Intel Core i7-5960X (Amazon)
Grand Total: $2639

Continue reading our system build and find out how you can WIN this PC!!

Going Beyond the Reference GTX 970

Zotac has been an interesting company to watch for the past few years. It is a company that has made a name for itself in the small form factor community with some really interesting designs and products. They continue down that path, but they have increasingly focused on high quality graphics cards that address a pretty wide market. They provide unique products from the $40 level up through the latest GTX 980 Ti with hybrid water and air cooling for $770. The company used to focus on reference designs, but some years back they widened their appeal by applying their own design decisions to the latest NVIDIA products.

zot_01.jpg

Catchy looking boxes for people who mostly order online! Still, nice design.

The beginning of this year saw Zotac introduce their latest “Core” brand products, which aim to provide high end features in more modestly priced parts. The Core series makes some compromises to hit price points that are more desirable for a larger swath of consumers. The cards often rely on more reference-style PCBs with good quality components and advanced cooling solutions. This equation has been used before, but Zotac is treading some new ground by offering very highly clocked cards right out of the box.

Overall Zotac has a very positive reputation in the industry for quality and support.

zot_02.jpg

Plenty of padding in the box to protect your latest investment.

 

Zotac GTX 970 AMP! Extreme Core Edition

The product we are looking at today is the somewhat long-named AMP! Extreme Core Edition. It is based on the NVIDIA GTX 970 chip, which features 56 ROPs, 1.75 MB of L2 cache, and 1664 CUDA cores. The GTX 970 has of course been scrutinized heavily due to the unique nature of its memory subsystem. While it does physically have a 256-bit bus, the last 512 MB (out of 4GB) is addressed by a significantly slower unit due to shared memory controller capacity. In theory the reference design supports up to 224 GB/sec of memory bandwidth. There are obviously some very unhappy people out there about this situation, but much of this could have been avoided if NVIDIA had disclosed the exact nature of the GTX 970 configuration from the start.
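
For context, that 224 GB/sec figure is easy to sanity check (our arithmetic, not Zotac's): 7.0 Gbps GDDR5 across a 256-bit bus works out to 7.0 x (256 / 8) = 224 GB/sec. And since only seven of the eight 32-bit memory controller segments serve the fast 3.5 GB partition, peak bandwidth to that region is 7/8 of the total, or 196 GB/sec, with the remaining 512 MB limited to the last 28 GB/sec segment.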

Click here to read the entire Zotac GTX 970 AMP! Extreme Core Edition Review!

Subject: Storage
Manufacturer: Intel
Tagged: Z170, Skylake, rst, raid, Intel

A quick look at storage

** This piece has been updated to reflect changes since first posting. See page two for PCIe RAID results! **

Our Intel Skylake launch coverage is intense! Make sure you hit up all the stories and videos that are interesting for you!

When I saw the small amount of press information provided with the launch of Intel Skylake, I was both surprised and impressed. The new Z170 chipset was going to have an upgraded DMI link, nearly doubling throughput. DMI has, for a long time, been suspected as the reason Intel SATA controllers have pegged at ~1.8 GB/sec, which limits the effectiveness of a RAID with more than 3 SSDs. Improved DMI throughput could enable the possibility of a 6-SSD RAID-0 that exceeds 3GB/sec, which would compete with PCIe SSDs.
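
For a sense of scale (our math, based on the published signaling rates): DMI 2.0 is effectively a PCI Express 2.0 x4 link, 4 lanes x 5 GT/s with 8b/10b encoding, for about 2 GB/sec each way, while Z170's DMI 3.0 moves to PCIe 3.0-style signaling, 4 lanes x 8 GT/s with 128b/130b encoding, for roughly 3.9 GB/sec. That is the "nearly doubling" in question.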

ASUS PCIe RAID.png

Speaking of PCIe SSDs, that’s the other big addition to Z170. Intel’s Rapid Storage Technology is being expanded to include PCIe (even NVMe) SSDs, with the caveat that they must be physically connected to PCIe lanes falling under the DMI-connected chipset. This is not as big of an issue as you might think, as Skylake does not have the 28 or 40 PCIe lanes seen with X99 solutions. Z170 motherboards only have to route 16 PCIe lanes from the CPU to either two (8x8) or three (8x4x4) PCIe slots, and the remaining slots must all hang off of the chipset. This includes the PCIe portion of M.2 and SATA Express devices.

PCH storage config.png

Continue reading our preview of the new storage options on the Z170 chipset!!

Subject: Processors
Manufacturer: Intel

Light on architecture details

Our Intel Skylake launch coverage is intense! Make sure you hit up all the stories and videos that are interesting for you!

The Intel Skylake architecture has been on our radar for quite a long time as Intel's next big step in CPU design. Through leaks and some official information discussed by Intel over the past few months, we know at least a handful of details: DDR4 memory support, 14nm process technology, modest IPC gains, and impressive GPU improvements. But how the "tock" of Skylake on the 14nm process will differ from Broadwell and Haswell has remained a mystery.

Interestingly, due to some shifts in how Intel is releasing Skylake, we are going to be doing a review today with very little information on the Skylake architecture and design (at least officially). While we are very used to the company releasing new information at the Intel Developer Forum along with the launch of a new product, Intel has instead decided to time the release of the first Skylake products with Gamescom in Cologne, Germany. Parts will go on sale today (August 5th) and we are reviewing a new Intel processor without the background knowledge and details that will be needed to really explain any of the changes or differences in performance that we see. It's an odd move honestly, but it has some great repercussions for the enthusiasts that read PC Perspective: Skylake will launch first as an enthusiast-class product for gamers and DIY builders.

For many of you this won't change anything. If you are curious about the performance of the new Core i7-6700K, power consumption, clock-for-clock IPC improvements, and anything else that is measurable, then you'll get exactly what you want from today's article. If you are a gear-head looking for more granular details on how the inner workings of Skylake function, you'll have to wait a couple of weeks longer - Intel plans to release that information on August 18th during IDF.

01.jpg

So what does the addition of DDR4 memory, full range base clock manipulation and a 4.0 GHz base clock on a brand new 14nm architecture mean for users of current Intel or AMD platforms? Also, is it FINALLY time for users of the Core i7-2600K or older systems to push that upgrade button? (Let's hope so!)

Continue reading our review of the Intel Core i7-6700K Skylake processor!!

Manufacturer: Intel

Bioshock Infinite Results

Our Intel Skylake launch coverage is intense! Make sure you hit up all the stories and videos that are interesting for you!

Today marks the release of Intel's newest CPU architecture, codenamed Skylake. I already posted my full review of the Core i7-6700K processor, so if you are looking for CPU performance and specification details on that part, you should start there. What we are looking at in this story is the answer to a very simple, but also very important question:

Is it time for gamers using Sandy Bridge systems to finally bite the bullet and upgrade?

I think you'll find that answer will depend on a few things, including your gaming resolution and your aptitude for multi-GPU configurations, but even I was surprised by the differences I saw in testing.

sli1.jpg

Our testing scenario was quite simple. Compare the gaming performance of an Intel Core i7-6700K processor and Z170 motherboard running both a single GTX 980 and a pair of GTX 980s in SLI against an Intel Core i7-2600K and Z68 motherboard using the same GPUs. I installed both the latest NVIDIA GeForce drivers and the latest Intel system drivers for each platform.

                 Skylake System              Sandy Bridge System
Processor        Intel Core i7-6700K         Intel Core i7-2600K
Motherboard      ASUS Z170-Deluxe            Gigabyte Z68-UD3H B3
Memory           16GB DDR4-2133              8GB DDR3-1600
Graphics Cards   1x GeForce GTX 980          1x GeForce GTX 980
                 2x GeForce GTX 980 (SLI)    2x GeForce GTX 980 (SLI)
OS               Windows 8.1                 Windows 8.1

Our testing methodology follows our Frame Rating system, which uses capture-based analysis to measure frame times at the screen (rather than trusting the software's interpretation).

If you aren't familiar with it, you should probably do a little research into our testing methodology, as it is quite different from others you may see online. Rather than using FRAPS to measure frame rates or frame times, we use a secondary PC to capture the output from the tested graphics card directly, then use post-processing on the resulting video to determine frame rates, frame times, frame variance, and much more.

This amount of data can be pretty confusing if you attempt to read it without the proper background, but I strongly believe that the results we present paint a much more thorough picture of performance than other options. So please, read up on the full discussion about our Frame Rating methods before moving forward!!

While there are literally dozens of files created for each “run” of benchmarks, there are several resulting graphs that FCAT produces, as well as several more that we generate with additional code of our own.
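
To give a rough idea of what that extra code does, here is a simplified sketch (not our production pipeline) of the core math, assuming the overlay extraction step has already produced per-frame display timestamps in milliseconds:

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main()
    {
        // Hypothetical capture timestamps (ms) for five displayed frames.
        std::vector<double> stamps = { 0.0, 16.9, 33.1, 50.4, 66.8 };

        // Frame time = gap between consecutive frames hitting the screen.
        std::vector<double> frameTimes;
        for (size_t i = 1; i < stamps.size(); ++i)
            frameTimes.push_back(stamps[i] - stamps[i - 1]);

        // Sort a copy to pull a 95th percentile, a simple stutter metric.
        std::vector<double> sorted = frameTimes;
        std::sort(sorted.begin(), sorted.end());
        double p95 = sorted[static_cast<size_t>(0.95 * (sorted.size() - 1))];

        for (double ft : frameTimes)
            std::printf("frame time: %.2f ms (%.1f FPS)\n", ft, 1000.0 / ft);
        std::printf("95th percentile frame time: %.2f ms\n", p95);
        return 0;
    }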

If you need some more background on how we evaluate gaming performance on PCs, just check out my most recent GPU review for a full breakdown.

I only had time to test four different PC titles:

  • Bioshock Infinite
  • Grand Theft Auto V
  • GRID 2
  • Metro: Last Light

Continue reading our look at discrete GPU scaling on Skylake compared to Sandy Bridge!!

Subject: Motherboards
Manufacturer: ASUS

Introduction and Technical Specifications

Our Intel Skylake launch coverage is intense! Make sure you hit up all the stories and videos that are interesting for you!

Introduction

ASUS used the introduction of the Z170 chipset and Intel LGA1151 processors to revamp their motherboard product lines, updating the aesthetics and features with the latest innovations. We are giving you a preview of what to expect from the ASUS Z170 chipset boards, using the Z170-A board provided for evaluation. A more detailed evaluation of the ASUS Z170-A will be released in the coming weeks as we receive evaluation samples from other manufacturers with which we can compare it.

02-board.jpg

Courtesy of ASUS

03-platform-overview.jpg

Courtesy of ASUS

ASUS updated the aesthetics of the channel series boards, changing the previous generation's black and gold design to a much more eye-catching black and white with blue accents. The Z170-A motherboard even comes with an integrated rear panel cover and large black-chromed heat sinks for its VRMs. The board offers full support for Intel's new line of LGA1151 "Skylake-S" processors, paired with dual channel DDR4 memory.

04-oc-expl.jpg

Courtesy of ASUS

ASUS integrated their latest overclocking technology into the Z170-A motherboard, giving their "entry-level" model the same overclocking potential as offered with their flagship board. ASUS uses a tri-chip design to ensure the board's overclocking capabilities - the Pro Clock control chip, the TPU chip, and the EPU chip. The three chips work in concert to unlock ultra-high frequencies for pushing your Skylake processor to its limits.

05-audio.jpg

Courtesy of ASUS

The Z170-A motherboard also has the latest ASUS sound technology integrated - Crystal Sound 3. Like its predecessors, Crystal Sound 3 places the audio components on a section of the PCB isolated from the other main board components, minimizing noise generated by those other integrated devices. ASUS designed the audio subsystem with high-quality Japanese-sourced audio and power circuitry for a top-notch audio experience.

Continue reading our preview of the ASUS Z170-A motherboard!

Manufacturer: Lenovo

Introduction

After spending some time in the computer hardware industry, it's easy to become jaded about trade shows and unannounced products. The vast majority of hardware we see at events like CES every year is completely expected beforehand. While this doesn't mean that these products are bad by any stretch, they can be difficult to get excited about.

Every once in a while, however, we find ourselves with our hands on something completely unexpected. Hidden away in a back room of Lenovo's product showcase at CES this year was a product that, we were told, would amaze us: the LaVie.

IMG_3037.JPG

And they were right.

Unfortunately, the Lenovo LaVie-Z is one of those products that you can't truly understand until you get it in your hands. Billed as the world's lightest 13.3" notebook, the standard LaVie-Z comes in at a weight of just 1.87 lbs. The touchscreen-enabled LaVie-Z 360 gains a bit of weight, coming in at 2.04 lbs.

While these numbers are a bit difficult to wrap your head around, I'll try to provide a bit of context. For example, the Google Nexus 9 weighs 0.94 lbs. For just over twice the weight of Google's flagship tablet, Lenovo has provided a full Windows notebook with an ultra-mobile Core i7 processor.

IMG_2181.JPG

Furthermore, the new 12" Apple MacBook, which people are touting as extremely light, comes in at 2.03 lbs, almost the same weight as the touchscreen version of the LaVie-Z. For the same weight, you also gain a much more powerful Intel Core i7 processor in the LaVie compared to the Intel Core M option in the MacBook.

All of this comes together to provide an experience that is quite unbelievable. Anyone that I have handed one of these notebooks to has been absolutely amazed that it's a real, functioning computer. The closest analog that I have been able to come up with for picking up the LaVie-Z is one of the cardboard placeholder laptops they have at furniture stores.

The personal laptop that I carry day-to-day is a 11" MacBook Air, which only weighs 2.38 lbs, but the LaVie-Z feels infinitely lighter.

However, as impressive as the weight (or lack thereof) of the LaVie-Z is, let's dig deeper into the experience of using the world's lightest notebook.

Click here to continue reading our review of the Lenovo LaVie-Z and LaVie-Z 360

Manufacturer: Antec

Introduction and First Impressions

The Antec Signature Series S10 is the company's new flagship enclosure, and it looks every bit the part. A massive full-tower design with seemingly no expense spared in its design and construction, the S10 boasts many interesting design details. So is it worth the staggering $499 price tag? (Update: A day after our review was published Newegg cut the $499 MSRP by $150, taking the S10 down to $299 after a $50 rebate.) 

DSC_0749.jpg

The Signature S10 is an interesting product to be sure. Antec, long renowned as a maker of premium cases, has in recent years lost some of the cachet it once had with enthusiasts. This is less a reflection on Antec and more a result of the industry's flood of enclosures into the market, with virtually every brand filling all price segments. Corsair, SilverStone, Fractal Design, Lian Li, Cooler Master, In Win, NZXT, BitFenix, Phanteks, and the list goes on and on...

So where does the new S10 enclosure fit into this market? Antec made the daring move of placing the Signature enclosure directly at the top with a shocking $499 retail price - which subsequently dropped to $449 and then again to $349 before a $50 rebate. I can think of no other recent enclosure this expensive at launch other than the In Win S-Frame, and it positioned the S10 as an unattainable object for most builders. So was Antec successful in creating an aspirational product - even before the recent price cuts?

DSC_0773.jpg

Is that... Batman??

Continue reading our review of the Antec Signature S10 enclosure!!

Manufacturer: Cyonic

Introduction and Features

Introduction

2-Banner.jpg

Introducing Cyonic, a new player in the global PC power supply arena. Founded in 2013, Cyonic aims to become a global brand of high performance computer parts and accessories. The business is starting off with power supplies, and Cyonic will soon have three product lines: the AU Series (fixed cables), the AUx Series (all modular cables), and the Arise Series (sold exclusively in Japan). The AU and AUx Series will each contain three models, with output capacities of 450W, 550W, and 650W. In this review we will take a detailed look at the AU-450x we received for evaluation.

3-AU-450x-side.jpg

The new Cyonic AUx Series power supplies feature fully modular cables, quiet operation and high efficiency. And they are housed in a compact chassis that measures only 140mm (5.5”) deep. Cyonic suggests the AUx Series power supplies are “the ideal choice for office use, casual gaming, and Home Theater PCs”. Getting into the PC power supply market might seem a rather daunting task, but Cyonic has partnered with Seasonic as their OEM, which is certainly a good start.

Cyonic AUx PSU Key Features:

•    450W, 550W and 650W models
•    Fully modular cable design
•    Compact ATX chassis: only 140mm (5.5”) deep
•    80 PLUS Gold certified, at least 90% efficiency under 50% load (see the quick arithmetic after this list)
•    120mm Fluid Dynamic Bearing fan for long life and quiet operation
•    Intelligent Fan Manager for very quiet operation
•    High quality components including 105°C Japanese electrolytic capacitors
•    Compatibility with Intel's 4th Generation Core processors
•    Safety Protections: OPP, OVP, UVP and SCP
•    Conforms to ATX12V v2.31 and EPS 2.92 standards
•    5-Year warranty
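
To put that 80 PLUS Gold line in perspective (our arithmetic, not Cyonic's): at 50% load, the AU-450x would deliver 225 watts DC while drawing no more than 225 / 0.90 = 250 watts at the wall, dissipating roughly 25 watts as heat.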

Please continue reading our review of the Cyonic AU-450x PSU!!!

Subject: General Tech
Manufacturer: Logitech

Introduction, Packaging, and A Closer Look

Introduction

We haven’t had a chance to sit down with any racing wheels for quite some time here at PC Perspective. We have an old Genius wheel on a shelf in the back of our closet here at the office. Ryan played around with that a few years back, and that was the extent of the racing wheel usage here at home base. Josh, on the other hand, frequents driving sims with a Thrustmaster F430. I hadn’t ventured into racing sims, though I do dabble with the real thing a bit.

Given our previous Logitech coverage, and especially following our recent Q&A covering the new Logitech G line, it only made sense for us to take a look at the new Logitech G29 Driving Force Racing Wheel.

DSC03140.jpg

Today we are covering the G29, which is a PS3/PS4/PC-specific model from Logitech. There is an Xbox/PC variant coming soon in the form of the G920, which will have a different (fewer) button layout and no LED RPM/shift display.

Read on for our review of the Logitech G29 Driving Force Racing Wheel!

Manufacturer: Wasabi Mango

Overview

A few years ago, we took our first look at the inexpensive 27" 1440p monitors that were starting to flood the market via eBay sellers located in Korea. These monitors proved immensely popular and are largely credited with moving a large number of gamers past 1080p.

However, in the past few months we have seen a new trend from some of these same Korean monitor manufacturers. Just like the Seiki Pro SM40UNP 40" 4K display that we took a look at a few weeks ago, the new trend is large 4K monitors.

Built around a 42-inch LG AH-IPS panel, the Wasabi Mango UHD420 is an impressive display. The inclusion of HDMI 2.0 and DisplayPort 1.2 allows you to achieve 4K at a full 60Hz with full 4:4:4 chroma. At a cost of just under $800 on Amazon, this is an incredibly appealing value.
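
A quick bit of bandwidth math shows why HDMI 2.0 is the headline feature here (our back-of-the-envelope figures): 3840 x 2160 pixels x 60 Hz x 24 bits per pixel is roughly 11.9 Gbps of raw pixel data before blanking and protocol overhead. That fits comfortably within HDMI 2.0's 18 Gbps but is well beyond HDMI 1.4's ~10.2 Gbps, which is why older inputs top out at 4K/30Hz or fall back to 4:2:0 chroma at 60Hz.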

IMG_2939.JPG

Whether the UHD420 is a TV or a monitor is actually quite the tossup. The lack of a tuner might initially lead you to believe it's not a TV. The inclusion of a DisplayPort connector and a USB 3.0 hub might make you believe it's a monitor, but it's bundled with a remote control (entirely in Korean). In reality, this display could be used for either case (unless you need OTA tuning), and it really starts to blur the lines between a "dumb" TV and a monitor. You'll also find VESA 400x400mm mounting holes on this display for easy wall mounting.

Continue reading our overview of the Wasabi Mango UHD420 4K HDMI 2.0 FreeSync Display!!

Subject: Mobile
Manufacturer: MSI

Introduction and First Impressions

The MSI GT72 Dominator Pro G gaming laptop is a beast of a portable, with a GeForce GTX 980M graphics card and a 5th-Gen Intel Core i7 processor within its massive frame. And this iteration of the GT72 features NVIDIA's G-SYNC technology, which should help provide smooth gameplay on its 75 Hz IPS display.

gt72g_01.jpg

The gaming laptop market is filled with options at just about any price you can imagine (as long as your imagination starts at around $1000), and there are seemingly limitless combinations of specs and minute configuration differences even within a particular brand’s offerings. A few names stand out in this market, and MSI has created a product meant to stand tall against the likes of Alienware and ASUS ROG. And it doesn’t just stand tall, it stands wide - and deep for that matter. Roughly the size of home plate on a regulation baseball diamond (well, approximately anyway), this is nearly 8 ½ lbs of PC gaming goodness.

Not everyone needs a 17-inch notebook, but there’s something awesome about these giant things when you see them in person. The design of this GT72 series is reminiscent of an exotic sports car (gaming laptops in general seem to have fully embraced the sports car theme), and if you’re considering completely replacing a desktop for gaming and all of your other computing, the extra space it takes up is more than worth it if you value a large display and full keyboard. Doubtless there are some who would simply be augmenting a desktop experience with a supremely powerful notebook like this, but for most people laptops like this are a major investment that generally replaces the need for a dedicated PC tower.

g72g_logo.jpg

What about the cost? It certainly isn’t “cheap” considering the top-of-the-line specs, and price is clearly the biggest barrier to entry with a product like this - far beyond the gargantuan size. Right off the bat I’ll bring up this laptop’s $2099 retail price - and not because I think it’s high. It’s actually very competitive as equipped. And in addition to competitive pricing MSI is also ahead of the curve a bit with its adoption of the 5th-Gen Core i7 Broadwell mobile processors, while most gaming laptops are still on Haswell. Broadwell’s improved efficiency should help with battery life a bit, but your time away from a power plug is always going to be limited with gaming laptops!

Continue reading our review of the MSI GT72 Dominator Pro G G-Sync Notebook!!

Manufacturer: PC Perspective

... But Is the Timing Right?

Windows 10 is about to launch and, with it, DirectX 12. Apart from the massive increase in draw calls, Explicit Multiadapter, both Linked and Unlinked, has been the cause of a few pockets of excitement here and there. I am a bit concerned, though. People seem to find this a new, novel concept that gives game developers tools they've never had before. It really isn't. Depending on what you want to do with secondary GPUs, game developers could have used them for years. Years!

Before we talk about the cross-platform examples, we should talk about Mantle. It is the closest analog to DirectX 12 and Vulkan that we have. It served as the base specification for Vulkan that the Khronos Group modified with SPIR-V instead of HLSL and so forth. Some claim that it was also the foundation of DirectX 12, which would not surprise me given what I've seen online and in the SDK. Allow me to show you how the API works.

amd-2015-mantle-execution-model.png

Mantle is an interface that mixes Graphics, Compute, and DMA (memory access) into queues of commands. This is easily done in parallel, as each thread can create commands on its own, which is great for multi-core processors. Each queue, a list of commands leading to the GPU, can be handled independently, too. An interesting side-effect is that, since each device uses standard data structures, such as IEEE 754 floating-point numbers, no-one cares where these queues go as long as the work is done quickly enough.

Since each queue is independent, an application can choose to manage many of them. None of these lists really need to know what is happening to any other. As such, they can be pointed to multiple, even wildly different graphics devices. Different model GPUs with different capabilities can work together, as long as they support the core of Mantle.
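
Mantle's actual entry points were never widely published, so as an illustration here is the same queue metaphor sketched in Vulkan, its direct descendant (our example; device setup, synchronization, and error handling omitted, with the queues and command buffers assumed to already exist):

    #include <vulkan/vulkan.h>

    void SubmitToIndependentQueues(VkQueue gfxQueue, VkQueue dmaQueue,
                                   VkCommandBuffer gfxCmds,
                                   VkCommandBuffer dmaCmds)
    {
        VkSubmitInfo submit = {};
        submit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
        submit.commandBufferCount = 1;

        // Graphics work goes to one queue...
        submit.pCommandBuffers = &gfxCmds;
        vkQueueSubmit(gfxQueue, 1, &submit, VK_NULL_HANDLE);

        // ...while transfer (DMA) work proceeds independently on another.
        // Each queue could even belong to a different physical device.
        submit.pCommandBuffers = &dmaCmds;
        vkQueueSubmit(dmaQueue, 1, &submit, VK_NULL_HANDLE);
    }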

microsoft-dx12-build15-ue4frame.png

DirectX 12 and Vulkan took this metaphor so their respective developers could use this functionality across vendors. Mantle did not invent the concept, however. What Mantle did is expose this architecture to graphics, which can make use of all the fixed-function hardware that is unique to GPUs. Prior to AMD's usage, this was how GPU compute architectures were designed. Game developers could have spun up an OpenCL workload to process physics, audio, pathfinding, visibility, or even lighting and post-processing effects... on a secondary GPU, even from a completely different vendor.
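
As a sketch of that long-available path (ours, not from any shipping game), the OpenCL host API will enumerate GPUs across every vendor's platform and let you stand up a context on a secondary device in just a few calls; error handling is trimmed for brevity:

    #include <CL/cl.h>
    #include <cstdio>

    int main()
    {
        cl_platform_id platforms[8];
        cl_uint numPlatforms = 0;
        clGetPlatformIDs(8, platforms, &numPlatforms);

        // Walk every installed platform (AMD, NVIDIA, Intel...).
        for (cl_uint p = 0; p < numPlatforms; ++p) {
            cl_device_id devices[8];
            cl_uint numDevices = 0;
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                               8, devices, &numDevices) != CL_SUCCESS)
                continue;

            for (cl_uint d = 0; d < numDevices; ++d) {
                char name[256];
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                sizeof(name), name, nullptr);
                std::printf("GPU: %s\n", name);

                // Any of these -- including a second GPU from a different
                // vendor -- can host a compute workload.
                cl_int err = CL_SUCCESS;
                cl_context ctx = clCreateContext(nullptr, 1, &devices[d],
                                                 nullptr, nullptr, &err);
                if (err == CL_SUCCESS)
                    clReleaseContext(ctx);
            }
        }
        return 0;
    }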

Vista's multi-GPU bug might get in the way, but it was possible in 7 and, I believe, XP too.

Read on to see a couple reasons why we are only getting this now...

Subject: Displays
Manufacturer: BenQ

Overdrive initialized

We have been tracking the differences between AMD’s FreeSync and NVIDIA’s G-Sync for some time now. The launch of FreeSync-capable displays started out a bit shaky, as some features we took for granted went missing. The first round of FreeSync displays we reviewed came with non-functional overdrive when the display / GPU pipeline was operating in FreeSync mode.

ghost1.jpg

Comparison of overdrive response in first round FreeSync displays. Images should look like the ROG Swift (left), which was correctly applying overdrive.

While AMD apparently fixed a portion of this problem in a subsequent driver update, getting overdrive to function in these early displays would require a firmware update. Unlike what you may be used to with a motherboard or SSD firmware, displays are not typically end-user upgradeable. This meant that even if manufacturers produced a fix, owners would have to send in their display to be updated (and be without it for several weeks).

The only manufacturer to step forward and retroactively support overdrive in their first gen FreeSync panel was BenQ. In a statement issued via TFTCentral:

BenQ have confirmed that the FreeSync/AMA issue which affected their XL2730Z display has now been fixed. This issue caused the overdrive (AMA) feature to not function when the screen was connected to a FreeSync capable system. As a result, users could not make use of the AMA feature and benefit from the improved response times that the 'normal' AMA mode offered, as compared with AMA Off. See our review for more information.

A driver update from AMD is already available and should be downloaded from their website. In addition BenQ will be releasing a firmware update for the monitor itself to fix this issue. Current stocks in distribution are being recalled and updated with retailers so future purchases should already carry this new firmware. This is expected to apply for stock purchased AFTER 1st July, as V002 firmware screens should be shipped by BenQ to distributors in late June.

For those who already have an XL2730Z if you want to, you can return it to BenQ for them to carry out the firmware update for you. This only applies if the user is experiencing issues with the performance of the screen. There is no simple way for the end user to update the firmware themselves and it is not encouraged. Users should contact BenQ support through their relevant country website for more information on how to return their screen for the update.

The catch with the above is that the statement came from BenQ PR for Europe, and neither we nor TFTCentral have been able to confirm an equivalent upgrade process in place for the USA. We did note in various online reviews that those receiving their BenQ XL2730Z in the last week of June confirmed having the new V002 firmware.

If you have one of these panels, verifying your firmware is simple. Hold down the menu button while powering up the display (you will have to hold the power button for a few seconds before you hear a beep).

service-menu.jpg

The display will power up and appear as normal, except that now pressing the menu button again will bring up the above service menu. Those with the update will have “V002” as the starting text of the ‘F/W Version’ result.

ASUS-overdrive.jpg

Overdrive functioning on the ASUS MG279Q IPS FreeSync display, showing an odd simultaneous ‘negative ghost’ outline of a slightly ghosted image.

We have been eager to retest the BenQ since hearing of this updated firmware revision. While we have seen overdrive functioning in the recent ASUS MG279Q, it was not a perfect implementation, and we were curious to know if BenQ’s implementation fared any better.

BenQ.jpg

Continue reading for the results of our testing!