Subject: Motherboards | April 8, 2016 - 05:27 PM | Jeremy Hellstrom
Tagged: gigabyte, Z170X Gaming 6, Intel, Z170
At the moment the Gigabyte Z170X Gaming 6 costs $170 on Amazon, which gets you support for two-way SLI or three-way CrossFire, a Killer NIC E2200, a pair of M.2 slots, three SEx ports and even a Type-C USB 3.1 port, among other USB, A/V out and SATA connections. [H]ard|OCP tested the performance of this board and found the overclocking potential to be somewhat disappointing, although achievable with some effort. After dealing with BIOS issues and some very warm MOSFETs the reviewer settled on running an i7-6700K at 4.6GHz (100x46) with DDR4 set at 2666MHz. In the end this board is a good value for someone who wants a wide variety of features and is either disinclined to overclock or willing to put effort into tweaking the UEFI to achieve a decent overclock.
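For readers new to the 100x46 notation above, the final core clock is simply the base clock (BCLK) multiplied by the CPU multiplier. A minimal sketch (the helper name is ours, not anything from the board's UEFI):

```python
def core_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    """CPU core clock is the base clock (BCLK) times the multiplier."""
    return bclk_mhz * multiplier

# The 100x46 setting from the review: 100 MHz BCLK x 46 multiplier
print(core_clock_mhz(100, 46))  # 4600.0 MHz, i.e. 4.6 GHz
```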
"GIGABYTE is back with its $165 Z170X Gaming 6 motherboard today. It’s a full featured motherboard that won’t break the bank and has a lot to offer. While many enthusiasts need what is considered high end, there are a lot of enthusiasts just looking for something that will get the job done with a few extra bells and whistles."
Here are some more Motherboard articles from around the web:
- MSI Z170A SLI Plus @ Kitguru
- ASUS Sabertooth Z170 S @ eTeknix
- MSI Z170A Gaming Pro Carbon @ Modders-Inc
- ASRock Z170 Extreme4+ @ Hardware Canucks
- Gigabyte Z170-Gaming K3 @ eTeknix
Subject: General Tech | April 1, 2016 - 01:55 PM | Jeremy Hellstrom
Tagged: Xeon E5-2600 v4, Intel, Broadwell-EP
Yesterday, towards the end of the day, Intel announced the arrival of its newest Xeon chips, the v4 series of Xeon E5 CPUs. As you would expect of server chips there is no GPU present, but there are new features to improve your server's performance. The new Broadwell-EP chips will have up to 22 cores and 44 threads, an impressive 55MB of cache on some models and support for DDR4-2400. As far as raw performance goes, Intel advertises these chips as delivering roughly 5% more instructions per clock than Haswell, and they handle AVX instructions more efficiently, allowing cores not running those workloads to remain at full speed. The Register has a great breakdown of the other new features these Xeons provide.
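The AVX behaviour described above can be sketched as a toy model: only the cores actually executing heavy AVX code drop to the lower AVX frequency, while the rest keep the normal clock. The frequencies and function names here are illustrative assumptions, not Intel's published turbo bins:

```python
# Toy model of per-core clocks where only AVX-busy cores are slowed.
NORMAL_GHZ = 2.4   # illustrative non-AVX clock (not a published bin)
AVX_GHZ = 2.0      # illustrative reduced AVX clock

def core_clocks(avx_cores: set, total_cores: int) -> list:
    """Return the clock of each core; only AVX-busy cores are slowed."""
    return [AVX_GHZ if c in avx_cores else NORMAL_GHZ
            for c in range(total_cores)]

# Cores 0 and 1 run AVX; the other two keep the full clock.
print(core_clocks({0, 1}, 4))  # [2.0, 2.0, 2.4, 2.4]
```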
"These chips follow up 2014’s Xeon E5 v3 parts, which used a 22nm process size and the Haswell micro-architecture. Intel shrunk Haswell to 14nm, and after some tinkering, codenamed the resulting design Broadwell."
Here is some more Tech News from around the web:
- Google Updates: beef, Trump and Snoop @ The Inquirer
- iOS 9.3.1 arrives with fix for link-crashing glitch @ The Inquirer
- Reddit's warrant canary shuffles off this mortal coil @ The Register
- HPE adds power-fail-protected NVDIMM tech to servers @ The Register
- Stacked NAND Technology TRIPLES Flash Capacity @ TechARP
- BlackBerry Priv will get Marshmallow in May, but sales remain a mystery @ The Inquirer
- Nvidia Shield Android TV @ Kitguru
- Impressive StarCraft 2 AI More Fair to Fleshy Opponents @ Hack a Day
A new fighter has entered the ring
When EVGA showed me that it was entering the world of gaming notebooks at CES in January, I must admit, I questioned the move. A company that at one point only built and distributed graphics cards based on NVIDIA GeForce GPUs, and had since moved into mice, power supplies, tablets (remember that?) and even cases, was now getting into the cutthroat world of notebooks. But I was promised that EVGA had an angle; it would not be cutting any corners in order to bring a truly competitive and aggressive product to the market.
Just a couple of short months later (seriously, is it the end of March already?) EVGA presented us with a shiny new SC17 Gaming Notebook to review. It’s thinner than you might expect, heavier than I would prefer and packs some impressive compute power, along with unique features and overclocking capability, that will put it on your short list of portable gaming rigs for 2016.
Let’s start with a dive into the spec table and then go from there.
|EVGA SC17 Specifications||
|---|---|
|Processor|Intel Core i7-6820HK|
|Memory|32GB G.Skill DDR4-2666|
|Graphics Card|GeForce GTX 980M 8GB|
|Storage|256GB M.2 NVMe PCIe SSD<br>1TB 7200 RPM SATA 6G HDD|
|Display|Sharp 17.3-inch UHD 4K with matte finish|
|Connectivity|Intel I219-V Gigabit Ethernet<br>Intel AC-8260 802.11ac<br>2x USB 3.0 Type-A<br>1x USB 3.1 Type-C|
|Audio|Realtek ALC 255|
|Video|1x HDMI 1.4<br>2x mini DisplayPort (1x G-Sync support)|
|Dimensions|16-in x 11.6-in x 1.05-in|
|OS|Windows 10 Home|
With a price tag of $2,699, EVGA owes you a lot – and it delivers! The processor of choice is the Intel Core i7-6820HK, an unlocked, quad-core, Hyper-Threaded processor that brings desktop-class computing capability to a notebook. The base clock speed is 2.7 GHz but the Turbo clock reaches as high as 3.6 GHz out of the box, supplying games, rendering programs and video editors with plenty of horsepower for production on the go. And don't forget that this is one of the first unlocked processors from Intel for mobile computing – multipliers and voltages can all be tweaked in the UEFI or through the Precision X Mobile software to push it even further.
Based on EVGA’s relationship with NVIDIA, it should surprise exactly zero people that a mobile GeForce GPU is found inside the SC17. The GTX 980M is based on the Maxwell 2.0 design and falls slightly under the desktop consumer class GeForce GTX 970 card in CUDA core count and clock speed. With 1536 CUDA cores and a 1038 MHz base clock, with boost capability, the discrete graphics will have enough juice for most games at very high image quality settings. EVGA has configured the GPU with 8GB of GDDR5 memory, more than any desktop GTX 970… so there’s that. Obviously, it would have been great to see the full powered GTX 980 in the SC17, but that would have required changes to the thermal design, chassis and power delivery.
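For a rough sense of what those GPU numbers mean, peak single-precision throughput can be estimated as cores x clock x 2 (one fused multiply-add, counted as two operations, per core per cycle). This is a back-of-the-envelope sketch of our own, not an EVGA or NVIDIA figure:

```python
def peak_gflops(cuda_cores: int, clock_mhz: float) -> float:
    """Peak FP32 throughput: cores * clock * 2 ops (FMA) per cycle."""
    return cuda_cores * clock_mhz * 2 / 1000.0

# GTX 980M at its 1038 MHz base clock (boost clocks would raise this)
print(round(peak_gflops(1536, 1038), 1))  # 3188.7 GFLOPS, about 3.2 TFLOPS
```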
Subject: Mobile | March 23, 2016 - 12:32 PM | Jeremy Hellstrom
Tagged: surface, surface book, tablet, Skylake, notebook, microsoft, Intel
The Register is not exaggerating in the quote below; the new Microsoft Surface Book ranges from $1500-$3200 depending on the model you choose, surpassing even the overpriced Chromebook Pixel by quite a sum of money. For that price you get a 3200x2000 (267ppi) 13.5" display on a tablet which weighs 3.34lbs (1.5kg), the detachable keyboard with an optional NVIDIA GPU and an extra battery, as well as a Surface Pen. If you want the dock which adds more connectivity options, well, that is another $200, and seeing as how there are only two USB 3.0 ports, a single MiniDP and an SD card reader on the keyboard, you are likely to want it.
Certainly The Register liked the looks, design and power of this ultrabook, but with the competition, up to and including Apple, offering similar products at half the price, it is a hard sell in the end. Ryan expressed a similar opinion when he reviewed the Surface Book.
"Sumptuous and slightly absurd, Microsoft's Surface Book is the most expensive laptop you can get, short of ordering a 24-carat custom gold plated jobbie."
Here are some more Mobile articles from around the web:
- Microsoft Surface Book @ The Inquirer
- Dell XPS 15 @ Kitguru
- SilverStone Reversible Phone Charging & Data Cord @ [H]ard|OCP
- Razer Nabu Watch Review @ Hardware Canucks
- ASUS ZenPad 7.0 @ Tech ARP
Subject: Processors | March 22, 2016 - 05:08 PM | Ryan Shrout
Tagged: Intel, tick tock, tick-tock, process technology, kaby lake
It should come as little surprise to readers who have followed news about Kaby Lake, Intel's extension of the Skylake architecture, that the company has officially broken nearly a decade of tick-tock processor design. With tick-tock, Intel would iterate in subsequent years between a new processor microarchitecture (Sandy Bridge, Ivy Bridge, etc.) and a new process technology (45nm, 32nm, 22nm, etc.). According to this story over at Fool.com, Intel is officially ending that pattern of production.
From the company's latest 10-K filing:
"We expect to lengthen the amount of time we will utilize our 14 [nanometer] and our next-generation 10 [nanometer] process technologies, further optimizing our products and process technologies while meeting the yearly market cadence for product introductions."
It is likely that the graphic above, which showcases the change from Tick-Tock to what is going on now, isn't "to scale," and we may see more than three steps in each iteration along the way. Intel still believes that it has, and will continue to have, the best process technology in the world, and that its processors will benefit.
Continuing further, the company indicates that "this competitive advantage will be extended in the future as the costs to build leading-edge fabrication facilities increase, and as fewer semiconductor companies will be able to leverage platform design and manufacturing."
Kaby Lake details leaking out...
As Scott pointed out in our discussions about this news, it might mean consumers will see advantages in longer socket compatibility going forward, though I would still see this as a net negative for technology. As process technology improvements slow down, whether due to complexity or a lack of competition in the market, we will see less innovation in key areas of performance and power consumption.
Introduction and First Impressions
The CRYORIG C7 is a compact air cooler for Intel and AMD processors, designed to fit anywhere a stock solution will. Standing just 47 mm tall, and featuring a footprint close in size to an Intel stock cooler, CRYORIG claims this ultra-compact design will still outperform the stock solution.
An attractive design, the C7 is further sweetened by a $29.99 retail price, which places it in a favorable position in the compact CPU cooler market. Designs like these are rarely useful for enthusiasts, but there is certainly a need for good aftermarket options when overclocking isn't a consideration. There was a time when the stock Intel cooler was sufficient for many basic builds, and for some that may still be the case. But if you've spent a little more to get higher performance, a better heatsink can certainly help; and if you're an enthusiast, the stock cooler was never adequate anyway (even before Intel stopped shipping one with K-series CPUs).
In this review we'll find out if this small cooler can deliver on its performance promise, and see just how much noise it might make in the process.
Subject: General Tech | March 17, 2016 - 11:07 PM | Ken Addison
Tagged: podcast, video, amd, XConnect, gdc 2016, Vega, Polaris, navi, razer blade, Sulon Q, Oculus, vive, raja koduri, GTX 1080, msi, vortex, Intel, skulltrail, nuc
PC Perspective Podcast #391 - 03/17/2016
Join us this week as we discuss AMD's news from GDC, the MSI Vortex, and Q&A!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store (audio only)
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:28:26
News items of interest:
Hardware/Software Picks of the Week
Jeremy: QLEDs are real!
Subject: Shows and Expos | March 16, 2016 - 09:00 PM | Jeremy Hellstrom
Tagged: skulltrail, Skull Canyon, nuc, Intel, GDC
No, we are not talking about the motherboard from 2008 which was going to compete with AMD's QuadFX platform and worked out just as well. We are talking about a brand new Skull Canyon NUC powered by an i7-6770HQ with Iris Pro 580 graphics and up to 32GB of DDR4-2133. The NUC6i7KYK will also be the first system we have seen with a fully capable USB Type-C port: it will offer Thunderbolt 3, USB 3.1 and DisplayPort 1.2 connectivity, not simultaneously, but the flexibility is nothing less than impressive. It will also sport a full-size HDMI 2.0 port and a Mini DisplayPort 1.2 output so you can still send video while using the Type-C port for data transfer. The Type-C port will also support external graphics card enclosures if you plan on using this as a gaming machine.
The internal storage subsystem is equally impressive: dual M.2 slots will give you great performance; the SD card slot not so much, but it is still a handy feature. Connectivity is supplied by an Intel Dual Band Wireless-AC 8260 (802.11ac) adapter and Bluetooth 4.2, and an infrared sensor will let you use your favourite remote control if you set up the Skull Canyon NUC as a media server. All of these features come in a device less than 0.7 litres in size, with your choice of two covers and support for your own if you desire to personalize your system. The price is not unreasonable: the MSRP for a barebones system is $650, and one with 16GB of memory, a 256GB SSD and Windows 10 should retail for about $1000. You can expect to see these for sale on Newegg in April, shipping in May.
Subject: Motherboards | March 16, 2016 - 02:43 PM | Jeremy Hellstrom
Tagged: Intel, gigabyte, X99P-SLI, LGA2011-v3
X99-based systems do not come cheap; with some boards costing well over $300 and very few under $200, the X99P-SLI could be considered mid-range. The board doesn't skimp on much at this price either: an M.2 slot, a pair of USB 3.1 ports, OP-AMP-based onboard sound, a conveniently placed header for front panel USB 3.0 and yes, it does have a single SEx port. Hardware Canucks breaks down how the PCIe slots are shared, along with many of the board's other features, in their review, which you should check out; the board was determined to be a Dam Good Value.
"The X99 platform may not be known for affordability but Gigabyte's new X99P-SLI aims to change that opinion with USB 3.1, M.2, great overclocking, quad GPU support and more for less than $250."
Here are some more Motherboard articles from around the web:
- Maximus VIII Formula LGA 1151 @ [H]ard|OCP
- ASRock E3V5 WS @ Kitguru
- ASRock Fatal1ty Z170 Gaming-ITX/AC @ Modders-Inc
- Supermicro X11SAE Workstation @ eTeknix
- ASUS B150 PRO GAMING/AURA @ eTeknix
- Gigabyte 990FX-Gaming @ Kitguru
Things are about to get...complicated
Earlier this week, the team behind Ashes of the Singularity released an updated version of its early access game with expanded features and capabilities. With support for DirectX 11 and DirectX 12, and the addition of multiple graphics card support, the game features a benchmark mode that has received quite a lot of attention. We saw stories based on that software posted by Anandtech, Guru3D and ExtremeTech, all of which had varying views on the advantages of one GPU or another.
That isn’t the focus of my editorial here today, though.
Shortly after the initial release, a discussion began around results from the Guru3D story that measured frame time consistency and smoothness with FCAT, a capture-based testing methodology much like the Frame Rating process we use here at PC Perspective. In a post on ExtremeTech, Joel Hruska claims that the results and conclusion from Guru3D are wrong because the FCAT capture method assumes the display output matches what the user actually experiences. Maybe everyone is wrong?
First a bit of background: I have been working with Oxide and the Ashes of the Singularity benchmark for a couple of weeks, hoping to get a story that I was happy with and felt was complete, before having to head out the door to Barcelona for the Mobile World Congress. That didn’t happen – such is life with an 8-month old. But, in my time with the benchmark, I found a couple of things that were very interesting, even concerning, that I was working through with the developers.
FCAT overlay as part of the Ashes benchmark
First, the initial implementation of the FCAT overlay, which Oxide should be PRAISED for including since we don't have, and likely won't have, a universal DX12 variant, was implemented incorrectly, with duplicated color swatches that made the results from capture-based testing inaccurate. I don't know if Guru3D used that version to do its FCAT testing, but I was able to get some updated EXEs of the game through the developer in order to get the overlay working correctly. Once that was corrected, I found yet another problem: an issue of frame presentation order on NVIDIA GPUs that likely has to do with asynchronous shaders. Whether that issue is on the NVIDIA driver side or the game engine side is still being investigated by Oxide, but it's interesting to note that this problem couldn't have been found without a proper FCAT implementation.
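To see why duplicated swatches break things, here is a minimal sketch of how FCAT-style analysis works: each rendered frame tints a small overlay bar with the next color in a known repeating sequence, and the capture analysis measures how long each color persists on screen to recover per-frame times. If two frames share a color, they merge into one apparent frame. The color list and function names here are illustrative, not the real FCAT sequence:

```python
# Minimal sketch of FCAT-style overlay analysis: consecutive captured
# scans showing the same overlay color belong to the same rendered frame.
COLORS = ["red", "green", "blue", "teal"]  # illustrative, not the real list

def frame_times_ms(captured: list, scan_ms: float) -> list:
    """Collapse runs of identical overlay colors into per-frame times."""
    runs = []
    for color in captured:
        if runs and color == runs[-1][0]:
            runs[-1][1] += scan_ms          # same frame still on screen
        else:
            runs.append([color, scan_ms])   # a new frame appeared
    return [round(t, 1) for _, t in runs]

# Two scans of red, one of green, three of blue at a 16.7 ms scan interval
capture = ["red"] * 2 + ["green"] + ["blue"] * 3
print(frame_times_ms(capture, 16.7))  # [33.4, 16.7, 50.1]
```

With duplicated swatches, two distinct slow frames could read as one long frame, which is exactly the kind of inaccuracy described above.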
With all of that under the bridge, I set out to benchmark this latest version of Ashes and DX12 to measure performance across a range of AMD and NVIDIA hardware. The data showed some abnormalities, though. Some results just didn’t make sense in the context of what I was seeing in the game and what the overlay results were indicating. It appeared that Vsync (vertical sync) was working differently than I had seen with any other game on the PC.
For the NVIDIA platform, tested using a GTX 980 Ti, the game seemingly randomly starts up with Vsync on or off, with no clear indicator of what is causing it, despite the in-game settings being set the way I wanted them. But the Frame Rating capture data was still working as I expected – just because Vsync is enabled doesn't mean you can't look at the results in capture formats. I have written stories on what Vsync-enabled captured data looks like and what it means as far back as April 2013. Obviously, to get the best and most relevant data from Frame Rating, setting vertical sync off is ideal. Running into more frustration than answers, I moved over to an AMD platform.
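The reason Vsync matters so much for capture analysis is that it quantizes on-screen frame times to multiples of the display's refresh interval: a frame that misses a refresh waits for the next one. A toy model, assuming simple double-buffered behaviour (the function name is ours):

```python
import math

REFRESH_MS = 1000 / 60  # 60 Hz display: ~16.67 ms per refresh

def displayed_time_ms(render_ms: float) -> float:
    """With Vsync on, a frame is held until the next refresh boundary,
    so its on-screen time rounds up to a multiple of the interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# A 20 ms frame misses one refresh and occupies two intervals (~33.3 ms)
print(round(displayed_time_ms(20), 1))  # 33.3
print(round(displayed_time_ms(10), 1))  # 16.7
```

This is why Vsync-on capture data shows frame times clustered at 16.7, 33.3, 50.0 ms and so on, rather than the smooth distribution you get with Vsync disabled.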