Early testing for higher end GPUs
UPDATE 2/5/16: Nixxes released a new version of Rise of the Tomb Raider today with some significant changes. I have added another page at the end of this story that looks at results with the new version of the game, a new AMD driver and I've also included some SLI and CrossFire results.
I will fully admit to being jaded by the industry on many occasions. I love my PC games and I love hardware, but it takes a lot for me to get genuinely excited about anything. After hearing game reviewers talk up the newest installment of the Tomb Raider franchise, Rise of the Tomb Raider, since its release on the Xbox One last year, I've been waiting for its PC release to give it a shot with real hardware. As you'll see in the screenshots and video in this story, the game doesn't appear to disappoint.
Rise of the Tomb Raider combines the exploration and "tomb raiding" aspects that made the first games in the series successful with the visual quality and character design brought in with the series reboot a couple of years back. The result is a PC game that looks stunning at any resolution, even more so in 4K, and that pushes your hardware to its limits. In single GPU configurations, even the GTX 980 Ti and Fury X struggle to keep their heads above water.
In this short article we'll look at the performance of Rise of the Tomb Raider with a handful of GPUs, leaning towards the high end of the product stack, and offer up my view on whether each hardware vendor is living up to expectations.
A mix of styles
Logitech continues its push and re-entry into the gaming peripherals market in 2016, this time adding another keyboard under the Orion brand to the mix. The Logitech G G810 Orion Spectrum is, as the name implies, an RGB mechanical keyboard using the company's proprietary Romer-G switches. But despite the similarity in model numbers to the G910 Orion Spark announced in late 2014, the G810 has some significant design and functionality changes.
This new offering is cleaner and less faceted (in both key caps and overall design), and it comes much closer in feel and function to the tenkeyless G410 from last year. Let's take a look at how the G810 changes things up for Logitech G.
The G810 Orion Spectrum is a full size keyboard with a tenkey (also known as a numeric keypad), and it has sleeker, more professional lines than its big brother. The black finish is matte on the keys and framing, but the outside edges of the keyboard have a gloss to them. It's a very minimal part of the design, though, so you shouldn't have to worry about fingerprints.
At first glance, you can see that Logitech toned down some of the gamer-centric accents when compared to either the G910 or the G410. There is no wrist rest, no PCB-trace inspired lines, no curves and no sharp edges. What you get instead is a keyboard that is equally well placed in a modern office or in an enthusiast's gaming den. To me, there are a lot of touches that remind me of the Das Keyboard - an understated design that somehow makes it more appealing to the educated consumer.
This marks the first keyboard with the new Logitech G logo on it, though you are likely more concerned about the lack of G-Keys, the company's name for its macro-capable buttons on the G910. For users that still want that capability, Logitech G allows you to reprogram the function keys along the top for macro capability, and has a pretty simple switch in software to enable or disable those macros. This means you can maintain the F-row of keys for Windows applications but still use macros for gaming.
That Depends on Whether They Need One
Ars Technica UK published an editorial called "Hey Valve: What's the point of Steam OS?" The article does not actually pose the question in its text -- it mostly rants about technical problems with a Zotac review unit -- but the headline is interesting nonetheless.
Here's my view of the situation.
The Death of Media Center May Have Been...
There are two parts to this story, and both center around Windows 8. The first was addressed in an editorial that I wrote last May, titled The Death of Media Center & What Might Have Been. Microsoft wanted to expand the PC platform into the living room. Beyond the obvious support for movies, TV, and DVR, they also pushed PC gaming in a few subtle ways. The Games for Windows certification required games to be launchable from Media Center and to support Xbox 360 peripherals, which pressured game developers to make PC games comfortable to play on a couch. Microsoft also created Tray and Play, an optional feature that allowed PC games to be played from the disc while they installed in the background. Back in 2007, before Steam and other digital distribution services really took off, this eliminated install time, which was a major user experience problem with PC gaming (and a major hurdle for TV-connected PCs).
It also had a few nasty implications. Games for Windows Live tried to eliminate modding by requiring all content to be certified (or severely limiting the tools as seen in Halo 2 Vista). Microsoft was scared about the content that users could put into their games, especially since Hot Coffee (despite being locked, first-party content) occurred less than two years earlier. You could also argue that they were attempting to condition PC users to accept paid DLC.
Regardless of whether it would have been positive or negative for the PC industry, the Media Center initiative launched with Windows Vista, which is another way of saying “exploded on the launch pad, leaving no survivors.” Windows 7 cleared the wreckage with a new team, who aimed for the stars with Windows 8. They ignored the potential of the living room PC, preferring devices and services (i.e. Xbox) over an ecosystem provided by various OEMs.
If you look at the goals of Steam OS, they align pretty well with the original, Vista-era ambitions. Valve hopes to create a platform that hardware vendors could compete on. Devices, big or small, expensive or cheap, could fill all of the various needs that users have in the living room. Unfortunately, unlike Microsoft, they cannot be (natively) compatible with the catalog of Windows software.
This may seem like Valve is running toward a cliff, but keep reading.
What If Steam OS Competed with Windows Store?
Windows 8 did more than just abandon the vision of Windows Media Center. Driven by the popularity of the iOS App Store, Microsoft saw a way to end the public perception that Windows is hopelessly insecure. With the Windows Store, all software needed to be reviewed and certified by Microsoft. Software based on the Win32 API, which is to say all software for Windows 7 and earlier, was only allowed within the “Desktop App,” which was a second-class citizen and could be removed at any point.
This potential made the PC software industry collectively crap themselves. Mozilla was particularly freaked out, because the Windows Store demanded (at the time) that all web browsers become reskins of Internet Explorer. This meant that Firefox would not have been able to implement any new Web standards on Windows, because it could only present what Internet Explorer (Trident) drew. Mozilla's mission is to develop a strong, standards-based web browser that forces all others to interoperate or die.
Remember: “This website is best viewed with Internet Explorer”?
Executives from several PC gaming companies, including Valve, Blizzard, and Mojang, spoke out against Windows 8 at the time (along with browser vendors and so forth). Steam OS could be viewed as a fire escape for Valve if Microsoft decided to try its luck and kill, or further deprecate, Win32 support. In the meantime, Windows PCs could stream to it until Linux gained a sufficient catalog of software.
This is where Steam OS gets interesting. Its software library cannot compete against Windows with its full catalog of Win32 applications, at least not for a long time. On the other hand, if Microsoft continues to support Win32 as a first-class citizen, and they returned to the level of openness with software vendors that they had in the Windows XP era, then Valve doesn't really have a reason to care about Steam OS as anything more than a hobby anyway. Likewise, if doomsday happens and something like Windows RT ends up being the future of Windows, as many feared, then Steam OS wouldn't need to compete against Windows. Its only competition from Microsoft would be Windows Store apps and first-party software.
I would say that Valve might even have a better chance than Microsoft in that case.
AMD Keeps Q1 Interesting
CES 2016 was not a watershed moment for AMD. They showed off their line of current video cards and, perhaps more importantly, working Polaris silicon, which will be their workhorse for 2016 in the graphics department. They did not show off Zen, a next generation APU, or any AM4 motherboards, and nothing they presented on the CPU and APU side was revolutionary. What they did show off, however, hinted at things to come that will help keep AMD relevant in the desktop space.
It was odd to see an announcement about a stock cooler, but the more we learned about it, the more important it seemed for AMD’s reputation moving forward. The Wraith cooler is a new unit to help control the noise and temperatures of the latest AMD CPUs and select APUs. It is a fairly beefy unit with a large, slow moving fan that produces very little noise. This is a big change from the variable speed fans on previous coolers, which could get rather noisy while still letting temperatures climb higher than is comfortable. There has been some derision aimed at AMD for providing “just a cooler” for their top end products, but it is a push that makes them more user and enthusiast friendly without breaking the bank.
Socket AM3+ is not dead yet. Though we have been commenting on the health of the platform for some time, AMD and its partners continue to improve and iterate upon these products, adding technologies such as USB 3.1 and M.2 support. While these chipsets are limited to PCI-E 2.0 speeds, the four lanes available to most M.2 controllers allow these boards to provide enough bandwidth to make good use of the latest NVMe based M.2 drives. We likely will not see another silicon refresh for AM3+, but we will see new SKUs utilizing the Wraith cooler as well as price breaks for the processors that exist in this socket.
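A quick back-of-the-envelope calculation shows what those four PCI-E 2.0 lanes are actually worth (this sketch assumes 5 GT/s per lane with 8b/10b line encoding and ignores protocol overhead, so real-world throughput will be somewhat lower):

```python
# Rough ceiling for an M.2 drive behind four PCIe 2.0 lanes.
GT_PER_S = 5e9      # transfers per second per PCIe 2.0 lane
ENCODING = 8 / 10   # 8b/10b line encoding overhead
LANES = 4

bytes_per_s = GT_PER_S * ENCODING / 8 * LANES
print(f"PCIe 2.0 x4 ~ {bytes_per_s / 1e9:.1f} GB/s")  # prints ~2.0 GB/s
```

Roughly 2 GB/s is plenty for most NVMe workloads, though the very fastest drives can exceed it in sequential reads.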
NVMe was a great thing to happen to SSDs. The per-IO reduction in latency and CPU overhead was more than welcome, as PCIe SSDs were previously using the antiquated AHCI protocol, which was a carryover from the SATA HDD days. With NVMe came additional required support in Operating Systems and UEFI BIOS implementations. We did some crazy experiments with arrays of these new devices, but we were initially limited by the lack of native hardware-level RAID support to tie multiple PCIe devices together. The launch of the Z170 chipset saw a remedy to this, by including the ability to tie as many as three PCIe SSDs behind a chipset-configured array. The recent C600 server chipset also saw the addition of RSTe capability, expanding this functionality to enterprise devices like the Intel SSD P3608, which was actually a pair of SSDs on a single PCB.
Most Z170 motherboards have come with one or two M.2 slots, meaning that enthusiasts wanting to employ the 3x PCIe RAID made possible by this new chipset would have to get creative with the use of interposer / adapter boards (or use a combination of PCIe and U.2 connected Intel SSD 750s). With the Samsung 950 Pro available, as well as the slew of other M.2 SSDs we saw at CES 2016, it’s safe to say that U.2 is going to be pushed back into the enterprise sector, leaving M.2 as the choice for consumer motherboards moving forward. It was therefore only a matter of time before a triple-M.2 motherboard was launched, and that just recently happened - behold the Gigabyte Z170X-SOC Force!
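One caveat worth keeping in mind for a three-drive chipset array: everything hanging off the Z170 shares the DMI 3.0 uplink to the CPU. Here is a rough sketch of that ceiling, assuming DMI 3.0 is electrically equivalent to PCIe 3.0 x4 (8 GT/s per lane with 128b/130b encoding) and taking ~2.5 GB/s as a ballpark sequential read figure for a 950 Pro class drive:

```python
# DMI 3.0 uplink ceiling vs. aggregate throughput of three fast M.2 drives.
GT_PER_S = 8e9          # transfers per second per lane (PCIe 3.0 rate)
ENCODING = 128 / 130    # 128b/130b line encoding overhead
LANES = 4

dmi_bytes_per_s = GT_PER_S * ENCODING / 8 * LANES
drive_read = 2.5e9      # ballpark per-drive sequential read, bytes/s

print(f"DMI 3.0 ceiling ~ {dmi_bytes_per_s / 1e9:.2f} GB/s")   # prints ~3.94 GB/s
print(3 * drive_read > dmi_bytes_per_s)                        # True: DMI is the bottleneck
```

In other words, three fast drives in RAID-0 can saturate the chipset link well before the drives themselves run out of steam.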
This new motherboard sits at the high end of Gigabyte’s lineup, with a water-capable VRM cooler and other premium features. We will be passing this board on to Morry for a full review, but this piece will be focusing on one section in particular:
I have to hand it to Gigabyte for this functional and elegant design choice. The spacing between the four required full length PCIe slots looks like it was chosen specifically to fit M.2 SSDs in between them. I should also note that it would be possible to use three U.2 adapters linked to three U.2 Intel SSD 750s, but native M.2 devices make for a significantly more compact and consumer friendly package.
With the test system set up, let’s get right into it, shall we?
Dell has never exactly been a brand that gamers gravitate towards. While we have seen some very high quality products out of Dell in the past few years, including the new XPS 13, and people have loved the Ultrasharp monitor line, neither of these targets gamers directly. Dell acquired Alienware in 2006 in order to enter the gaming market and continues to make some great products, but those products retain the Alienware branding. It seems to me that a gaming-centric notebook with just the Dell brand could be a hard sell.
However, that's exactly what we have today with the Dell Inspiron 15 7000. Equipped with an Intel Core i5-6300HQ and NVIDIA GTX 960M for $799, has Dell created a contender in the entry-level gaming notebook race?
For years, the Inspiron line has been Dell's entry level option for notebooks and subsequently has a questionable reputation as far as quality and lifespan. With the Inspiron 15 7000 being the most expensive product offering in the Inspiron line though, I was excited to see if it could sway my opinion of the brand.
Introduction and First Impressions
The new Corsair Carbide 600Q and 600C enclosures are the company's first inverted ATX designs, and the layout promises improved airflow for better cooling.
The Carbide Series from Corsair has encompassed enclosures from the company's least expensive budget-friendly options such as the $59 Carbide 100R, to high-performance options like the $159 Carbide Air 540. This new Carbide 600 enclosure is available in two versions, the 600C and 600Q, which both carry an MSRP of $149. This positions the 600C/600Q enclosures near the Graphite and Obsidian series models, but this is only fitting as there is nothing "budget" about these new Carbide 600 models.
The Carbide Series 600Q we have in for review differs from the 600C most obviously in its lack of the latter's hinged, latching side panel, which also contains a large window. But the differences extend to the internal makeup of the enclosure, as the 600Q includes significant noise damping inside the front, top, and side panels. We'll be taking a close look at noise levels along with thermal performance with this "Q" version of the new enclosure in our review.
My new desk mate
Earlier this month at the 2016 edition of the Consumer Electronics Show, Logitech released a new product for the gaming market that might have gone unnoticed by some. The G502 Proteus Spectrum is a new gaming mouse that takes an amazing product and makes it just a little better with the help of some RGB goodness. The G502 Proteus Core has been around for a while now and has quickly become one of the best selling gaming mice on Amazon, a testament to its quality and popularity. (It has been as high as #1 overall in recent days.)
We have been using the G502 Proteus Core in our gaming test beds at the office for some months, and during that time I often lamented that I wanted to upgrade the mouse on my own desk to one. While I waited for myself to stop being lazy and simply swap one in for the G402 currently in use at my workstation, Logitech released the new G502 Proteus Spectrum and handed me a sample at CES to bring home. Perfect!
Logitech G502 Proteus Spectrum Specifications:
• Resolution: 200 - 12,000 DPI
• Max Speed: >300 IPS
• USB Data: 16 bits/axis
• USB Report Rate: 1000 Hz (1 ms)
• Button rating: 20 million clicks
• Feet rating: 250 kilometers
• Price: $79 - Amazon.com
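Those specifications hang together nicely. A quick sanity check, using the table's own numbers and taking 300 IPS as the worst-case tracking speed, shows why 16 bits per axis is plenty of headroom at a 1,000 Hz report rate:

```python
# How many sensor counts can accumulate in a single 1 ms USB report?
MAX_SPEED_IPS = 300   # spec: >300 inches per second
MAX_DPI = 12000       # spec: maximum sensor resolution (counts per inch)
REPORT_HZ = 1000      # spec: USB report rate

counts_per_report = MAX_SPEED_IPS * MAX_DPI / REPORT_HZ
print(counts_per_report)           # prints 3600.0
print(counts_per_report < 2**15)   # True: fits easily in a signed 16-bit axis field
```

Even flinging the mouse at maximum speed and maximum DPI, each report only needs to carry 3,600 counts per axis, well under the 32,767 a signed 16-bit field can hold.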
The G502 Proteus Spectrum is very similar to the Core model, with the only difference being the addition of an RGB light under the G logo and DPI resolution indicators. This allows you to use the Logitech Gaming Software to customize its color, its pattern (breathing, still or rotating) as well as pair it up and sync with the RGB lights of other Logitech accessories you might have. If you happen to own a Logitech G910 or G410 keyboard, or one of the new headsets (G633/933) then you'll quickly find yourself in color-coordinated heaven.
In the box you'll find the mouse, attached to a lengthy cable that works great even with my standing desk, and a set of five weights that you can install on the bottom if you like a heavier feel to your mousing action. I installed as many as I could under the magnetic door on the underside of the mouse and definitely prefer it. The benefit of the weights (as opposed to just a heavier mouse out of the box) is that users can customize it as they see fit.
Introduction and First Impressions
The Scythe Ninja 4 (SCNJ-4000) is the latest model in the Ninja series, and an imposing air cooler with dimensions similar to Noctua's massive NH-D14. But there's more to the story than size, as this is engineered for silence above all else. Read on to see just how quiet it is, and of course how well it's able to cope with CPU loads.
"The Ninja 4 is the latest model in the Ninja CPU Cooler Series, developed for uncompromising performance. It features the new T-M.A.P.S technology, an optimized alignment of heatpipes, and the back-plate based Hyper Precision Mounting System (H.P.M.S) for firm mounting and easy installation procedure. These improvements and a special, adjustable Glide Stream 120mm PWM fan result in an increased cooling performance while reducing the weight compared to his predecessor. Also the design of the heat-sink allows fan mounting on all four sides. This enables the optimal integration of the Ninja 4 in the air flow of the pc-case and reduces turbulence and the emergence of hotspots."
The Ninja 4 is built around a very large, square heatsink, which allows the single 120 mm fan to be mounted on any side, and this PWM fan offers three speed settings to further control noise. And noise is really what the Ninja is all about, with some really low minimum speeds possible on what is a very quiet Scythe fan to begin with.
Will a single low-speed fan design affect the ability to keep a CPU cool under stress? Will the Ninja 4's fan spin up and become less quiet under full load? These questions will soon be answered.
Fighting for Relevance
AMD is still kicking. While the results of this past year have been forgettable, they have overcome some significant hurdles and look like they are improving their position in terms of cutting costs while extracting as much revenue as possible. There were plenty of ups and downs for this past quarter, but when compared to the rest of 2015 there were some solid steps forward here.
The company reported revenues of $958 million, which is down from $1.06 billion last quarter. It also recorded a $103 million loss, though that is down significantly from the $197 million loss the quarter before (Q3 included a $65 million write-down for unsold inventory). So while the company took in less revenue, it also narrowed its losses. AMD is still bleeding, but it has enough cash on hand to survive for the next several quarters. In non-GAAP terms, AMD reported a $79 million loss for the quarter.
For the entire year AMD recorded $3.99 billion in revenue with a net loss of $660 million. This is down from FY 2014 revenues of $5.51 billion and a net loss of $403 million. AMD certainly is trending downwards year over year, but they are hoping to reverse that come 2H 2016.
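To put the full-year numbers in perspective, here is a quick calculation of the year-over-year changes using the figures above:

```python
# Year-over-year comparison using AMD's reported FY figures.
rev_2015, rev_2014 = 3.99, 5.51    # revenue, billions USD
loss_2015, loss_2014 = 660, 403    # net loss, millions USD

rev_change = (rev_2015 - rev_2014) / rev_2014 * 100
print(f"Revenue: {rev_change:.1f}% year over year")   # prints -27.6%
print(f"Net loss grew by ${loss_2015 - loss_2014}M")  # $257M wider
```

A revenue drop of nearly 28% alongside a loss that widened by more than 60% underscores why 2H 2016, with Polaris and Zen on the horizon, matters so much to the company.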
Graphics continues to be a solid segment for AMD; sales increased from last quarter but are down year on year. Holiday sales were brisk, but with only the high end Fury series being a truly new card this season, its impact was not as great as a new mid-range series would have been, the recently introduced R9 380X notwithstanding. The second half of 2016 will see the introduction of Polaris based GPUs for both mobile and desktop applications. Until then, AMD will continue to provide the current 28 nm lineup of GPUs to the market. At this point we are under the assumption that AMD and NVIDIA are looking at the same timeframe for introducing their next generation parts due to process technology advances. AMD already has working samples built on the Samsung/GLOBALFOUNDRIES 14 nm LPP (Low Power Plus) process, which they showed off at CES 2016.
Introduction and Technical Specifications
Courtesy of ASUS
The ASUS X99-M WS is an mATX version of their workstation line for the Intel X99 chipset, with enhanced support for USB 3.1 devices. The board fully supports all Intel LGA2011-v3 processors paired with DDR4 memory operating in up to quad channel configurations. At a reasonable $279.99, the X99-M WS/USB 3.1 is priced competitively considering the feature set and performance designed into the board.
The X99-M WS motherboard features an 8-phase digital power system with high efficiency Beat Thermal chokes, Dr. MOS MOSFETs, and 12k rated solid capacitors. Additionally, ASUS integrated their CPU OC Socket into the board's design, enhancing the board's overclocking potential. ASUS chose to integrate the following features into the X99-M WS/USB 3.1: eight SATA 3 ports; one M.2 PCIe x2 capable port; Intel I218LM and I210-AT Gigabit NICs; a tri-port 802.11ac WiFi adapter; three PCI-Express x16 slots; a PCI-Express x1 slot; a 2-digit diagnostic LED display; on-board power, reset, CMOS clear, MemOK!, and BIOS Flashback buttons; TPU, EPU, Dr. Power, and EZ_XMP switches; the Crystal Sound 2 audio subsystem; and USB 2.0, 3.0, and 3.1 Type-A ports.
Introduction and Features
Be Quiet! introduced the Silent Base 800 mid-tower case well over a year ago and later released the Silent Base 600 mid-tower case in 2015. Earlier this year we took a detailed look at the Silent Base 600 and found it to be a full-featured mid-tower enclosure that focuses on quiet, virtually silent operation while delivering excellent cooling performance, usability, and support for high-end hardware. The two cases are functionally very similar, with only a few minor changes differentiating them. Because the two cases are so similar, we are going to highlight the Silent Base 800’s features and specifications and then point out the main differences using the Silent Base 600 as a reference.
Be Quiet!’s Silent Base Series currently includes two cases: the Silent Base 800 and the Silent Base 600. As you might expect, the Silent Base Series is designed for very quiet operation while still offering excellent performance and cooling. Both cases are targeted at users looking to build a quiet, high-end gaming or multimedia system.
Silent Base 800 Key Features
The Be Quiet! Silent Base 800 ATX Mid-Tower enclosure comes in four different color schemes (Black/Black, Orange/Black, Silver/Black, and Red/Black) and is available with or without a side window. The Silent Base 800 comes with three Be Quiet! Pure Wings 2 fans (two 140mm intakes and one 120mm exhaust) pre-installed along with numerous options that support additional fans or liquid cooling if desired.
“The Be Quiet! Silent Base 800 offers the perfect symbiosis of noise prevention and cooling performance, good usability, and an extensive capacity for high-end hardware.”
Be Quiet! Silent Base 800 Mid-Tower Case Main Features:
• Mid-Tower ATX enclosure available in four different color schemes (with or without a side window)
• Supports ATX, Micro-ATX and Mini-ITX motherboards
• Innovative construction assures excellent cooling efficiency and air circulation
• Easily removed dust filters on front and bottom panels
• Sound dampening mats used on front panel and both side panels
• Anti-vibration decoupling provided for fans, HDDs and power supply
• Double-glazed side panel window provides superb soundproofing
• Three included Pure Wings fans: (2) 140mm intakes and (1) 120mm exhaust
• Removable top panel, with top fan mounts pre-drilled for 240mm or 280mm fans and/or liquid cooling radiators
• Excellent cooling and low noise levels with up to six fan mounting locations
o Front: two 140mm fans included
o Top: Dual 120mm or 140mm
o Rear: 120mm fan included
o Bottom: 120mm or 140mm
• (2) USB 3.0, (2) USB 2.0 and audio jacks on the top panel
• Seven internal 3.5” hard drive bays
• Four internal 2.5” SSD mounting locations
• Three external 5.25” drive bays
• Tool-free mounting for all 3.5”/2.5” internal drives
• Up to 290mm (11.4”) clearance for graphics cards
• Up to 400mm (15.7”) for long graphics cards (with HDD cage removed)
• Up to 170mm (6.7”) of space for CPU coolers
• 3-Year manufacturer’s warranty
• MSRP: $149.99 USD ($139.99 without side window)
UltraWide G-Sync Arrives
When NVIDIA first launched G-Sync monitors, they had the advantage of being first to literally everything: the first variable refresh rate technology, the first displays of any kind that supported it, and the first ecosystem to enable it. AMD talked about FreeSync just a few months later, but it wasn't until March of 2015 that we got our hands on the first FreeSync enabled display, and it was very much behind the experience provided by G-Sync displays. That said, what we saw with that launch, and continue to see as time goes on, is that there is a much larger selection of FreeSync options, with varying specifications and capabilities, compared to what NVIDIA has built out.
This is important to note because, as we look at the Acer Predator X34 monitor today, the first 34-in curved panel to support G-Sync, it comes three months after the release of a similarly specced Acer monitor that works with AMD FreeSync. The not-as-sexily-named Acer XR341CK offers a 3440x1440 resolution, a 34-in curved IPS panel and a 75Hz refresh rate.
But, as NVIDIA tends to do, it found a way to differentiate its own product, with the help of Acer. The Predator X34 monitor has a unique look and style to it, and it raises the maximum refresh rate to 100Hz (although that is considered overclocking). The price is a bit higher too, coming in at $1300 or so on Amazon.com; the FreeSync-enabled XR341CK sells for just $941.
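For context on what that refresh rate bump actually buys you, here is a quick frame-time comparison:

```python
def frame_time_ms(hz):
    """Time between display refreshes at a given refresh rate."""
    return 1000 / hz

print(f"{frame_time_ms(75):.2f} ms")                       # prints 13.33 (per frame at 75 Hz)
print(f"{frame_time_ms(100):.2f} ms")                      # prints 10.00 (per frame at 100 Hz)
print(f"{frame_time_ms(75) - frame_time_ms(100):.2f} ms")  # prints 3.33 (savings per frame)
```

Shaving 3.33 ms off every frame is a modest but perceptible improvement in smoothness and input latency, assuming your GPU can actually sustain 100 FPS at 3440x1440.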
Thank you for all you do!
Much of what I am going to say here is repeated from the description on our brand new Patreon support page, but I think a direct line to our readers is in order.
First, I think you may need a little back story. Ask anyone that has been doing online media in this field for any length of time and they will tell you that getting advertisers to sign on and support the production of "free" content has been getting more and more difficult. You'll see this proven out in the transition of several key personalities of our industry away from media and into the companies they used to cover. You'll see it in the absorption of some of our favorite media outlets, purchased by larger entities with the promise of being able to continue doing what they have been doing. Or maybe you've seen it show up as more interstitial ads, road blocks, sponsored site sections, etc.
At PC Perspective we've seen the struggle firsthand, but I have done my best to keep as much of that influence away from my team as possible. We are not immune - several years ago we started doing site skins, something we didn't initially plan for. I do think I have done a better than average job keeping the lights on here, though, so to speak. We have good sell-through on our ad inventory, and some of the best companies in our industry support the work we do.
Some of the PC Perspective team at CES 2016
Let me be clear though - we aren't on the verge of going out of business, and I am not asking for Patreon support in order to avoid firing anyone. We just want to maintain and grow our content library and capabilities, and it seemed like the audience that benefits from and enjoys that content might be the best place to start.
Some of you are likely asking yourselves whether supporting PC Perspective is really necessary. After all, anyone can chug out a 400 word blog post in no time! The truth is that high quality, technical content takes a lot of man hours, and those hours are expensive. Our problem is that to advertisers, a page view is a page view; they don't really care how much time and effort went into creating the content on that page. If we spend 20 hours developing a way to evaluate variable refresh rate monitors with an oscilloscope, but put the results on a single page at pcper.com, we get the same amount of traffic as someone that just posts an hour's worth of gameplay impressions. Both are valuable to the community, but one costs a lot more to produce.
Frame Rating testing methodology helped move the industry forward
The easy way out is to create click-bait style content (have you seen the new Marvel trailer??!?) and hope for enough extra page views to make up the difference. Many outlets find the allure of those cheap, easy posts too strong and quickly devolve into press releases and marketing vomit. No one at PC Perspective wants to see that happen here.
Not only do we want to avoid a slide into that fate but we want to improve on what we are doing, going further down the path of technical analysis with high quality writing and video content. Very few people are working on this kind of writing and analysis yet it is vitally important to those of you that want the information to make critical purchasing decisions. And then you, in turn, pass those decisions on to others with less technical interest (brothers, mothers, friends).
We have ideas for new regular shows including a PC Perspective Mailbag, a gaming / Virtual LAN Party show and even an old hardware post-mortem production. All of these take extra time beyond what each person has dedicated today and the additional funding provided by a successful Patreon campaign will help us towards those goals.
I don't want anyone to feel that they are somehow less of a fan of PC Perspective if you can't help - that's not what we are about and not what I stand for. Just being here, reading and commenting on our work means a lot to us. You can still help by spreading the word about stories you find interesting or even doing your regular Amazon.com shopping through our link on the right side bar.
But for those of you that can afford a monthly contribution, consider a "value for value" amount. How much do you think the content we have produced and will produce is worth to you? If that's $3/month, thank you! If that's $20/month, thank you as well!
Support PC Perspective through Patreon
The team and I spent a lot of our time in the last several weeks talking through this Patreon campaign and we are proud to offer ourselves up to our community. PC Perspective is going to be here for a long time, and support from readers like you will help us be sure we can continue to improve and innovate on the information and content we provide.
Again, thank you so much for your support over the last 16 years!
Laptops and Monitors
Dell kicked off their CES presence with a presentation that featured actor Josh Brener of “Silicon Valley” fame. His monologues were entertaining, but unfortunately he was performing in front of a pretty tough crowd. It was 10:30 in the morning and people were still scarfing down coffee and breakfast goods that were provided by Dell. Not exactly a group receptive to humorous monologues at that time of the morning. Oddly enough, I was seated next to Josh's wife, Meghan Falcone, who helped provide the laugh track for his presentation. She was kind enough to place my dirty, germ-ridden coffee cup right next to the AV equipment table when I was finished with it. Probably a poor move on her part.
The presentation itself covered some pretty interesting products coming from Dell this year. It was held in a restaurant in The Venetian, where space was rather limited; Dell did what they could with the room available and entertained some 60+ reporters and editors with the latest and greatest technology from the company.
Dell had a runaway success last year with their latest XPS laptops featuring the InfinityEdge display. The 13” model was a huge hit, with even Ryan buying one. These products offered quick processors and graphics, outstanding screen quality, and excellent battery life considering their weight and performance. Dell has now applied this design to their business-class Latitude laptops. The big mover is expected to be the new Dell Latitude 13” 7000 series Ultrabook. It will come in a variety of configurations, but all will be based on the same chassis, which features the 13” InfinityEdge display and a carbon fiber lid. It will host all of the business-class security features those customers expect, and it also includes USB Type-C connectors as well as Thunderbolt 3.
The Latitude 12 7000 series is a business-oriented 2-in-1 device with a 12.5” screen. It easily converts from a laptop to a tablet and follows the same design lines as the latest Surface Pro 4. It features a 4K touch display covered by a large piece of Gorilla Glass. The magnesium unibody build provides a great amount of rigidity while keeping weight low, and the attachable base/keyboard is a backlit unit that is extremely thin.
Finally, we have the smaller Latitude 11 5000 series 2-in-1, which features a 10.8 inch touch display, hardened glass, and a magnesium frame. It weighs only 1.56 pounds and provides all the business and security features demanded by that market.
Introduction and Features
As you might expect, be quiet! is focused on virtually silent power supplies, and the company continues to be one of the top-selling brands in Europe. The Dark Power Pro 11 Series occupies the top tier in be quiet!’s PC power supply lineup. All of the Dark Power Pro 11 models are certified for high efficiency (80 Plus Platinum) and come with modular cables. In this review we will be taking a detailed look at the be quiet! Dark Power Pro 11 1,000W power supply with Cable Management. There are six power supplies in the Dark Power Pro 11 CM Series: 550W, 650W, 750W, 850W, 1,000W, and 1,200W models.
be quiet! designed the Dark Power Pro 11 CM Series to provide high efficiency with minimal noise for systems that demand whisper-quiet operation without compromising on power quality. In addition to the Dark Power Pro 11 Series, be quiet! offers a full range of power supplies in ATX, SFX, and TFX form factors.
(Courtesy of be quiet!)
All of the Dark Power Pro 11 Cable Management Series power supplies are semi-modular (all cables are modular except for the fixed 24-pin ATX cable). Along with 80 Plus Platinum certified efficiency and quiet operation, the Dark Power Pro 11 1,000W modular power supply features an overclocking key to select between multi-rail and single-rail +12V output. The power supply uses be quiet!’s latest SilentWings3 135mm fan for virtually silent operation; fan speed starts out very low and stays low and quiet through mid power levels. The Dark Power Pro 11 power supplies also allow connecting up to four case fans, whose speeds are controlled by the PSU.
be quiet! Dark Power Pro 11 1,000W CM PSU Key Features:
• 1,000W continuous DC output (ATX12V v2.4, EPS 2.92 compliant)
• Virtually inaudible SilentWings3 135mm cooling fan
• 80 PLUS Platinum certified efficiency (up to 94%)
• Premium 105°C rated parts enhance stability and reliability
• Powerful GPU support with nine PCI-E connectors
• User-friendly cable management reduces clutter and improves airflow
• NVIDIA SLI Ready and AMD CrossFire X certified
• ErP 2014 ready and meets Energy Star 6.0 guidelines
• Zero load design supports Intel’s Deep Power Down C6 & C7 modes
• Fully Intel Haswell compatible
• Active Power Factor correction (0.99) with Universal AC input
• German product conception, design and quality control
• Safety Protections : OCP, OVP, UVP, SCP, OTP, and OPP
• 5-Year warranty
• MSRP for the Dark Power Pro 11 1,000W CM PSU: $239.00 USD
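As a rough illustration of what that 94% peak efficiency means in practice, here is a quick back-of-the-envelope sketch. The numbers are illustrative only, not measurements from this unit, and real efficiency varies with load:

```python
# Back-of-the-envelope sketch: AC wall draw for a given DC load at a
# given efficiency. Illustrative numbers, not measured values.

def ac_draw_watts(dc_load_w: float, efficiency: float) -> float:
    """AC input power required to deliver dc_load_w at the given efficiency."""
    return dc_load_w / efficiency

# At the rated 1,000W DC output and the claimed 94% peak efficiency,
# the difference between wall draw and output is dissipated as heat.
full_load = ac_draw_watts(1000, 0.94)
print(f"AC draw at full load: {full_load:.0f} W")
print(f"Heat dissipated: {full_load - 1000:.0f} W")
```

This is why higher efficiency directly helps a "virtually silent" design: less waste heat at a given load means the fan can spin slower.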
Are Computers Still Getting Faster?
It looks like CES is starting to wind down, which makes sense because it ended three days ago. Now that we're mostly caught up, I found a new video from The 8-Bit Guy. He doesn't really explain any old technologies in this one. Instead, he poses an open question about computer speed. He was able to have a functional computing experience on a ten-year-old Apple laptop, which made him wonder if the rate of computer advancement is slowing down.
I believe that he (and his guest hosts) made great points, but also missed a few important ones.
One of his main arguments is that software seems to have slowed down relative to hardware. I don't believe that is true, but I do believe it's looking in the right area. PCs these days are more than capable of doing just about anything we would want in terms of a 2D user interface, with plenty of overhead left for inefficient platforms and sub-optimal programming (relative to the '80s and '90s, at the very least). The areas that require extra horsepower usually involve large batches of many related tasks. GPUs are key here, and they are keeping up as fast as they can despite some stagnation in fabrication processes and difficulty (at least until HBM takes hold) keeping up with memory bandwidth demands.
For the last five to ten years or so, CPUs have been evolving toward efficiency while GPUs are being adopted for the tasks that need to scale up. I'm guessing that AMD, when they designed the Bulldozer architecture, hoped GPUs would be adopted much more aggressively; even just as graphics devices, though, they now have a huge effect on Web, UI, and media applications.
These are also tasks that can scale well between devices by lowering resolution (and so forth). The primary thing that a main CPU thread needs to do is figure out the system's state and keep the graphics card fed before the frame-train leaves the station. In my experience, that doesn't scale well (although you can sometimes reduce the number of tracked objects in games and so forth). Moreover, it is easier to add GPU performance than single-threaded CPU performance, because increasing frequency and single-threaded IPC is more complicated than laying out more duplicated blocks of shaders. These factors combine to give lower-end hardware a similar experience in the most noticeable areas.
So, up to this point, we have discussed:
- Software is often scaling in ways that are GPU (and RAM) limited.
- CPUs are scaling down in power more than up in performance.
- GPU-limited tasks can often be approximated with smaller workloads.
- Software gets heavier, but it doesn't need to be "all the way up" (ex: resolution).
- Some latencies are hard to notice anyway.
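The "smaller workloads" point is easy to quantify for resolution, since per-frame shading work roughly tracks pixel count. A minimal sketch of that first-order approximation (the resolutions and comparison are my own illustration, not from the video):

```python
# First-order approximation: per-frame pixel counts at common resolutions.
# GPU shading work roughly tracks pixel count, so dropping from 4K to
# 1080p cuts that per-frame workload to exactly one quarter.

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    ratio = count / pixels["1080p"]
    print(f"{name}: {count / 1e6:.1f} MP, {ratio:.2f}x 1080p")
```

This is exactly why resolution is the first knob to turn on lower-end hardware: it scales the GPU-limited portion of the work without touching the CPU-side simulation cost.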
Back to the Original Question
This is where “Are computers still getting faster?” can be open to interpretation.
Tasks are diverging from one class of processor into two, and both have separate industries, each with their own multiple goals. As stated, CPUs are mostly progressing in power efficiency, which extends a (presumably) sufficient amount of performance downward to multiple types of devices. GPUs are definitely getting faster, but they can't do everything. At the same time, RAM is plentiful, but its contribution to performance can be approximated by paging unused chunks to the hard disk or, more recently on Windows, compressing them in place. Newer computers with extra RAM won't help as long as each individual task only uses a manageable amount of it -- unless it's seen from a viewpoint that cares about multi-tasking.
In short, computers are still progressing, but the paths are now forked and winding.
Introduction and Technical Specifications
Courtesy of SUPERMICRO
SUPERMICRO's latest venture into the consumer realm is the C7Z170-SQ motherboard, featuring an appealing black-and-red enthusiast aesthetic and a gamer-friendly feature set. Built around the Intel Z170 chipset, the board offers full support for the Intel Skylake-S processor line as well as DDR4 memory in dual channel mode. At a $219.99 MSRP, the SUPERMICRO C7Z170-SQ is competitively priced considering its features and performance potential.
Courtesy of SUPERMICRO
The C7Z170-SQ was designed with an 8+1 phase digital power delivery system to power the CPU and integrated graphics. SUPERMICRO integrated the following features into the board: six SATA 3 ports; a PCIe x4 M.2 slot; an Intel i219-V Gigabit NIC; three PCI-Express x16 slots; three PCI-Express x4 slots; a 2-digit diagnostic LED display; on-board power, CMOS clear, and BIOS restore buttons; and USB 2.0, 3.0, and USB 3.1 Type-C port support.
Courtesy of SUPERMICRO
Looking Towards 2016
ARM invited us to a short conversation with them on the prospects of 2016. The initial answer as to how they feel the upcoming year will pan out is, “Interesting”. We covered a variety of topics ranging from VR to process technology. ARM is not announcing any new products at this time, but throughout this year they will continue to push their latest Mali graphics products as well as the Cortex A72.
Trends to Watch in 2016
The one overriding trend we will see is “good phones at every price point”. ARM’s IP scales from very low-end to very high-end mobile SoCs, and their partners are taking advantage of the length and breadth of these technologies. High-end phones based on custom cores (Apple, Qualcomm) will compete against those licensing the Cortex A72 and A57 for their phones. Lower-end options that are less expensive and pull less power (and therefore require less battery) will flesh out the midrange and budget segments. Unlike several years ago, the products from top to bottom are eminently usable and relatively powerful.
Camera improvements will also take center stage for many products and continue to be a selling point and an area of differentiation between competitors. Improved sensors and software will obviously be the areas the ARM partners focus on, but ARM is putting some work into this area as well. Post-processing requires quite a bit of power to do quickly and effectively, and ARM is helping here by leveraging the Neon SIMD engine and the power of the Mali GPU.
4K video is becoming more and more common on handhelds as well, and ARM is hoping to leverage that capability for shooting still pictures. A single 4K frame is around 8 megapixels in size, so instead of taking a single photo, the handheld can offer a “best shot” type of functionality: the phone captures 4K video and the user then chooses the best frame available from that period of time. This is a simple idea that will be a nice feature for those with a product that can capture 4K video.
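That 8-megapixel figure is easy to verify, since a UHD "4K" frame is 3840 x 2160 pixels. A quick check of the arithmetic:

```python
# Quick arithmetic check: a UHD "4K" frame is 3840 x 2160 pixels,
# which works out to roughly 8.3 megapixels per frame.
width, height = 3840, 2160
megapixels = width * height / 1_000_000
print(f"A single 4K frame is {megapixels:.1f} MP")
```

At 30 frames per second, that gives the user dozens of ~8 MP candidate stills for every second of capture, which is the whole appeal of the "best shot" approach.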
AMD Polaris Architecture Coming Mid-2016
In early December, I was able to spend some time with members of the newly formed Radeon Technologies Group (RTG), which is a revitalized and compartmentalized section of AMD that is taking over all graphics work. During those meetings, I was able to learn quite a bit about the plans for RTG going forward, including changes for AMD FreeSync and implementation of HDR display technology, and their plans for the GPUOpen open-sourced game development platform. Perhaps most intriguing of all: we received some information about the next-generation GPU architecture, targeted for 2016.
Codenamed Polaris, this new architecture will be the 4th generation of GCN (Graphics Core Next), and it will be the first AMD GPU that is built on FinFET process technology. These two changes combined promise to offer the biggest improvement in performance per watt, generation to generation, in AMD’s history.
Though the amount of information provided about the Polaris architecture is light, RTG does promise some changes in the 4th iteration of its GCN design. Those include primitive discard acceleration, an improved hardware scheduler, better pre-fetch, increased shader efficiency, and stronger memory compression. We have already discussed in a previous story that the new GPUs will include HDMI 2.0a and DisplayPort 1.3 display interfaces, which offer some impressive new features and bandwidth. From a multimedia perspective, Polaris will be the first GPU to include support for H.265 4K decode and encode acceleration.
This slide shows quite a few changes coming to Polaris, most of which were never discussed in enough specific detail for us to report. Geometry processing and the memory controller stand out as potentially interesting to me – AMD’s Fiji design continues to lag behind NVIDIA’s Maxwell in tessellation performance, and we would love to see that shift. I am also very curious to see how the memory controller is configured across the entire Polaris lineup of GPUs – we saw the introduction of HBM (high bandwidth memory) with the Fury line of cards.