Author:
Subject: Mobile
Manufacturer: EVGA

A new fighter has entered the ring

When EVGA showed me at CES in January that it was entering the world of gaming notebooks, I must admit, I questioned the move. A company that at one point only built and distributed graphics cards based on NVIDIA GeForce GPUs, and had since branched out into mice, power supplies, tablets (remember that?) and even cases, was now going to enter the cutthroat world of notebooks. But I was promised that EVGA had an angle; it would not be cutting any corners in bringing a truly competitive and aggressive product to market.

06.jpg

Just a couple of short months later (seriously, is it the end of March already?) EVGA presented us with a shiny new SC17 Gaming Notebook to review. It’s thinner than you might expect, heavier than I would prefer, and packs some impressive compute power, along with unique features and overclocking capability that will put it on your short list of portable gaming rigs for 2016.

Let’s start with a dive into the spec table and then go from there.

  EVGA SC17 Specifications
Processor: Intel Core i7-6820HK
Memory: 32GB G.Skill DDR4-2666
Graphics Card: GeForce GTX 980M 8GB
Storage: 256GB M.2 NVMe PCIe SSD; 1TB 7200 RPM SATA 6G HDD
Display: Sharp 17.3-inch UHD 4K with matte finish
Connectivity: Intel I219-V Gigabit Ethernet; Intel AC-8260 802.11ac; Bluetooth 4.2; 2x USB 3.0 Type-A; 1x USB 3.1 Type-C
Audio: Realtek ALC255; integrated subwoofer
Video: 1x HDMI 1.4; 2x mini DisplayPort (1x G-Sync support)
Dimensions: 16-in x 11.6-in x 1.05-in
OS: Windows 10 Home
MSRP: $2,699

With a price tag of $2,699, EVGA owes you a lot – and it delivers! The processor of choice is the Intel Core i7-6820HK, an unlocked, quad-core, Hyper-Threaded processor that brings desktop-class computing capability to a notebook. The base clock speed is 2.7 GHz, but the Turbo clock reaches as high as 3.6 GHz out of the box, giving games, rendering programs and video editors plenty of horsepower for production on the go. And don’t forget that this is one of the first unlocked processors from Intel for mobile computing – multipliers and voltages can all be tweaked in the UEFI or through the Precision X Mobile software to push it even further.

Based on EVGA’s relationship with NVIDIA, it should surprise exactly zero people that a mobile GeForce GPU is found inside the SC17. The GTX 980M is based on the Maxwell 2.0 design and falls slightly under the desktop consumer-class GeForce GTX 970 in CUDA core count and clock speed. With 1536 CUDA cores and a 1038 MHz base clock (plus boost capability), the discrete GPU will have enough juice for most games at very high image quality settings. EVGA has configured the GPU with 8GB of GDDR5 memory, more than any desktop GTX 970… so there’s that. Obviously, it would have been great to see the full-powered GTX 980 in the SC17, but that would have required changes to the thermal design, chassis and power delivery.

Continue reading our review of the EVGA SC17 gaming notebook!!

Microsoft Surface Book, taking the prize for most expensive laptop and running with it

Subject: Mobile | March 23, 2016 - 12:32 PM |
Tagged: surface, surface book, tablet, Skylake, notebook, microsoft, Intel

The Register is not exaggerating in the quote below: the new Microsoft Surface Book ranges from $1500 to $3200 depending on the model you choose, surpassing even the overpriced Chromebook Pixel by quite a sum of money.  For that price you get a 3200x2000 (267ppi) 13.5" display on a tablet that weighs 3.34lbs (1.5kg), a detachable keyboard with an optional NVIDIA GPU and an extra battery, as well as a Surface Pen.  If you want the dock, which adds more connectivity options, that is another $200, and seeing as there are only two USB 3.0 ports, a single Mini DisplayPort and an SD card reader on the keyboard, you are likely to want it.

Certainly The Register liked the looks, design and power of this ultrabook, but with the competition, up to and including Apple, offering similar products at half the price, it is a hard sell in the end.  Ryan expressed a similar opinion when he reviewed the Surface Book.

surface_book_product_detached.png

"Sumptuous and slightly absurd, Microsoft's Surface Book is the most expensive laptop you can get, short of ordering a 24-carat custom gold plated jobbie."

Here are some more Mobile articles from around the web:

More Mobile Articles

 

Source: The Register

Intel officially ends the era of "tick-tock" processor production

Subject: Processors | March 22, 2016 - 05:08 PM |
Tagged: Intel, tick tock, tick-tock, process technology, kaby lake

It should come as little surprise to our readers who have followed news about Kaby Lake, Intel's extension of the Skylake architecture, that the company is officially breaking with nearly a decade of tick-tock processor design. With tick-tock, Intel would alternate in successive years between a new processor microarchitecture (Sandy Bridge, Ivy Bridge, etc.) and a new process technology (45nm, 32nm, 22nm, etc.). According to this story over at Fool.com, Intel is officially ending that pattern of production.

From the company's latest 10-K filing:

"We expect to lengthen the amount of time we will utilize our 14 [nanometer] and our next-generation 10 [nanometer] process technologies, further optimizing our products and process technologies while meeting the yearly market cadence for product introductions."

ticktockout.JPG

It is likely that the graphic above, which showcases the change from tick-tock to the new cadence, isn't "to scale," and we may see more than three steps in each iteration along the way. Intel still believes that it has, and will continue to have, the best process technology in the world and that its processors will benefit.

Continuing further, the company indicates that "this competitive advantage will be extended in the future as the costs to build leading-edge fabrication facilities increase, and as fewer semiconductor companies will be able to leverage platform design and manufacturing."

intel-roadmap-5q-002-1920x1080.jpg

Kaby Lake details leaking out...

As Scott pointed out in our discussions about this news, it might mean consumers will see advantages in longer socket compatibility going forward, though I would still see this as a net negative for technology. As process technology improvements slow down, either due to complexity or a lack of competition in the market, we will see less innovation in key areas of performance and power consumption. 

Source: Fool.com
Manufacturer: CRYORIG

Introduction and First Impressions

The CRYORIG C7 is a compact air cooler for Intel and AMD processors, designed to fit anywhere a stock solution will. Standing just 47 mm tall, with a footprint close in size to an Intel stock cooler, this ultra-compact design will, CRYORIG claims, still outperform the stock solution.

c7_01.jpg

An attractive design, the C7 is further sweetened by a $29.99 retail price, which places it in a favorable position in the compact CPU cooler market. Designs like these are rarely useful for enthusiasts, but there is certainly a need for good aftermarket options when overclocking isn't a consideration. There was a time when the stock Intel cooler was sufficient for many basic builds, and for some that may still be the case. But if you've spent a little more to get higher performance, a better heatsink can certainly help; and if you're an enthusiast, the stock cooler was never adequate anyway (even before Intel stopped shipping it with K-series CPUs).

c7_02.jpg

In this review we'll find out if this small cooler can deliver on its performance promise, and see just how much noise it might make in the process.

Continue reading our review of the CRYORIG C7 Ultra-Compact CPU Cooler!!

Podcast #391 - AMD's news from GDC, the MSI Vortex, and Q&A!

Subject: General Tech | March 17, 2016 - 11:07 PM |
Tagged: podcast, video, amd, XConnect, gdc 2016, Vega, Polaris, navi, razer blade, Sulon Q, Oculus, vive, raja koduri, GTX 1080, msi, vortex, Intel, skulltrail, nuc

PC Perspective Podcast #391 - 03/17/2016

Join us this week as we discuss AMD's news from GDC, the MSI Vortex, and Q&A!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Meet the new Intel Skulltrail NUC; Changing the Game

Subject: Shows and Expos | March 16, 2016 - 09:00 PM |
Tagged: skulltrail, Skull Canyon, nuc, Intel, GDC

board.jpg

No, we are not talking about the motherboard from 2008 that was going to compete with AMD's QuadFX platform and worked out just about as well.  We are talking about a brand new Skull Canyon NUC powered by an i7-6770HQ with Iris Pro 580 graphics and up to 32GB of DDR4-2133.  The NUC6i7KYK will also be the first system we have seen with a fully capable USB Type-C port; it will offer Thunderbolt 3, USB 3.1 and DisplayPort 1.2 connectivity, not simultaneously, but the flexibility is nothing less than impressive.  It will also sport a full-size HDMI 2.0 port and a Mini DisplayPort 1.2 output, so you can still send video while using the Type-C port for data transfer.  The port will also support external graphics card enclosures if you plan on using this as a gaming machine.

SkullCanyon-NUC-extremeangle1-wht.jpg

The internal storage subsystem is equally impressive: dual M.2 slots will give you great performance, and while the SD card slot is not as fast, it is still a handy feature.  Connectivity is supplied by Intel Dual Band Wireless-AC 8260 (802.11ac) and Bluetooth 4.2, and an infrared sensor will let you use your favourite remote control if you set up the Skull Canyon NUC as a media server.  All of these features come in a device less than 0.7 litres in size, with your choice of two covers and support for your own if you desire to personalize your system.  The price is not unreasonable either; the MSRP for a barebones system is $650, while one with 16GB of memory, a 256GB SSD and Windows 10 should retail for about $1000.  You can expect to see these for sale on Newegg in April, with shipments in May.

All this and more can be found on Intel's news room, and you can click here for the full system specs.

Source: Intel

Gigabyte's X99P-SLI; does $230 count as mid-range?

Subject: Motherboards | March 16, 2016 - 02:43 PM |
Tagged: Intel, gigabyte, X99P-SLI, LGA2011-v3

X99-based systems do not come cheap; with some boards costing well over $300 and very few under $200, the X99P-SLI could be considered mid-range.  The board doesn't skimp on a lot at this price either: an M.2 slot, a pair of USB 3.1 ports, OP-AMP based onboard sound, a conveniently placed header for USB 3.0 on your front panel and yes, it does have a single SATA Express (SEx) port.  Hardware Canucks breaks down how the PCIe slots are shared, along with many of the board's other features, in their review, which you should check out; the board was determined to be a Dam Good Value.

info_chart.jpg

"The X99 platform may not be known for affordability but Gigabyte's new X99P-SLI aims to change that opinion with USB 3.1, M.2, great overclocking, quad GPU support and more for less than $250."

Here are some more Motherboard articles from around the web:

Motherboards

Author:
Manufacturer: Microsoft

Things are about to get...complicated

Earlier this week, the team behind Ashes of the Singularity released an updated build of its early access game, expanding its features and capabilities. With support for DirectX 11 and DirectX 12, and with multiple graphics card support added, the game's benchmark mode got quite a lot of attention. We saw stories based on that software posted by Anandtech, Guru3D and ExtremeTech, all of which had varying views on the advantages of one GPU or another.

That isn’t the focus of my editorial here today, though.

Shortly after the initial release, a discussion began around results from the Guru3D story that measured frame time consistency and smoothness with FCAT, a capture-based testing methodology much like the Frame Rating process we use here at PC Perspective. In a post on ExtremeTech, Joel Hruska claims that the results and conclusions from Guru3D are wrong because the FCAT capture method assumes the captured output matches what the user actually experiences.  Maybe everyone is wrong?

First, a bit of background: I have been working with Oxide and the Ashes of the Singularity benchmark for a couple of weeks, hoping to get a story that I was happy with and felt was complete before having to head out the door to Barcelona for Mobile World Congress. That didn’t happen – such is life with an 8-month-old. But, in my time with the benchmark, I found a couple of things that were very interesting, even concerning, that I was working through with the developers.

vsyncoff.jpg

FCAT overlay as part of the Ashes benchmark

First, the initial FCAT overlay, which Oxide should be PRAISED for including since we don’t have, and likely won’t have, a universal DX12 variant, was implemented incorrectly, with duplicated color swatches that made the results from capture-based testing inaccurate. I don’t know if Guru3D used that version to do its FCAT testing, but I was able to get some updated EXEs of the game from the developer in order to get the overlay working correctly. Once that was corrected, I found yet another problem: an issue of frame presentation order on NVIDIA GPUs that likely has to do with asynchronous shaders. Whether that issue is on the NVIDIA driver side or the game engine side is still being investigated by Oxide, but it’s interesting to note that this problem couldn’t have been found without a proper FCAT implementation.
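For readers unfamiliar with capture-based testing, here is a heavily simplified, hypothetical sketch of the idea behind an FCAT-style overlay analysis. It is not the actual FCAT or Frame Rating tooling (the palette and capture rate below are assumptions, and the real pipeline reads the overlay per scanline to catch mid-refresh frame changes and tearing), but it shows how frame times fall out of how long each color persists in the capture, and why duplicated swatches corrupt the result:

# Simplified, hypothetical sketch of FCAT-style overlay analysis (not the
# actual FCAT/Frame Rating tools). Each rendered frame draws the next color
# from a known repeating palette into an on-screen bar; a capture card records
# the display output, and per-frame display times are recovered by measuring
# how long each color persists in the capture.

CAPTURE_RATE_HZ = 60.0  # assumed capture rate of the recording hardware

# Example 16-color repeating sequence; the real overlay palette may differ.
PALETTE = ["white", "lime", "blue", "red", "teal", "navy", "green", "aqua",
           "darkred", "gray", "yellow", "fuchsia", "olive", "purple",
           "maroon", "silver"]

def frame_times_ms(captured_colors):
    """Collapse per-capture-sample overlay colors into per-frame display times.

    captured_colors: the overlay color sampled once per captured frame, in order.
    Returns a list of (color, display_time_ms) tuples.
    """
    if not captured_colors:
        return []
    sample_ms = 1000.0 / CAPTURE_RATE_HZ
    runs = []
    current, length = captured_colors[0], 0
    for color in captured_colors:
        if color == current:
            length += 1
        else:
            runs.append((current, length * sample_ms))
            current, length = color, 1
    runs.append((current, length * sample_ms))
    return runs

def sequence_is_valid(frame_colors):
    """Check that successive frames advance through PALETTE in order.

    A duplicated swatch (the bug described above) makes two adjacent frames
    indistinguishable: their runs merge, one frame "disappears" and its
    neighbor's measured time doubles, corrupting the analysis.
    """
    indices = [PALETTE.index(c) for c in frame_colors]
    return all((b - a) % len(PALETTE) == 1 for a, b in zip(indices, indices[1:]))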

With all of that water under the bridge, I set out to benchmark this latest version of Ashes and DX12 to measure performance across a range of AMD and NVIDIA hardware. The data showed some abnormalities, though. Some results just didn’t make sense in the context of what I was seeing in the game and what the overlay results were indicating. It appeared that Vsync (vertical sync) was working differently than I had seen with any other game on the PC.

For the NVIDIA platform, tested using a GTX 980 Ti, the game seemingly randomly started up with Vsync on or off, with no clear indicator of what was causing it, despite the in-game settings being set how I wanted them. But the Frame Rating capture data was still working as I expected – just because Vsync is enabled doesn’t mean you can’t look at the results in capture form. I have written stories on what Vsync-enabled captured data looks like and what it means as far back as April 2013. Obviously, to get the best and most relevant data from Frame Rating, setting vertical sync off is ideal. Running into more frustration than answers, I moved over to an AMD platform.
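As a rough illustration of how Vsync shows up in that kind of data, one simple heuristic (again, my own hedged sketch rather than anything in the Frame Rating tools, with a 60 Hz refresh rate assumed) is to check whether the measured frame times quantize to multiples of the display's refresh interval:

# Hedged illustration: with Vsync on and a 60 Hz panel, frame times cluster at
# multiples of the ~16.67 ms refresh interval; with Vsync off they vary
# continuously. This flags captures that look Vsync-locked.

REFRESH_MS = 1000.0 / 60.0  # assumed 60 Hz display
TOLERANCE_MS = 1.0          # how close to a refresh multiple counts as "locked"

def looks_vsynced(frame_times_ms, threshold=0.95):
    """Return True if at least `threshold` of frames land on refresh multiples."""
    def near_multiple(ft):
        quotient = ft / REFRESH_MS
        return abs(quotient - round(quotient)) * REFRESH_MS <= TOLERANCE_MS
    locked = sum(1 for ft in frame_times_ms if near_multiple(ft))
    return locked / len(frame_times_ms) >= threshold

# Example: 16.7, 16.6 and 33.4 ms all sit on refresh multiples -> likely Vsync on.
print(looks_vsynced([16.7, 16.6, 33.4, 16.7, 16.8]))  # True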

Continue reading PC Gaming Shakeup: Ashes of the Singularity, DX12 and the Microsoft Store!!

Video Perspective: Huawei MateBook 2-in-1 Hands-on from MWC 2016

Subject: Systems, Mobile | February 25, 2016 - 11:42 AM |
Tagged: MWC, MWC 2016, Huawei, matebook, Intel, core m, Skylake, 2-in-1

Huawei is getting into the PC business with the MateBook 2-in-1, built in the same vein as the Microsoft Surface Pro 4. Can they make a splash with impressive hardware and Intel Core m processors?

Intel insists their clock is still running

Subject: General Tech | February 19, 2016 - 01:34 PM |
Tagged: Intel, delay, 10nm

Today Intel has insisted that the rumours of a further delay in their scheduled move to a 10nm process are greatly exaggerated.  They had originally hoped to make this move in the latter half of this year, but difficulties in the design process moved that target into 2017.  They have assured The Inquirer and others that the speculation, based on information in a job vacancy posting, is inaccurate and that they still plan on releasing processors built on a 10nm node by the end of next year.  You can still expect Kaby Lake before the end of the year, and Intel also claims to have found promising techniques to shrink their processors below 10nm in the future.

intel_10nm_panel2-Copy.png

"INTEL HAS moved to quash speculation that its first 10nm chips could be pushed back even further than the second half of 2017, after already delaying them from this year."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer