Author:
Subject: Mobile
Manufacturer: MSI

Overview

Thanks go out to CUK (Computer Upgrade King) for supplying the MSI GS63VR notebook for our testing and evaluation.

It's been a few weeks since we took a look at our first gaming notebook with NVIDIA's Max-Q design, the ASUS ROG Zephyrus. We briefly touched on the broad array of announced Max-Q notebooks in that review, and today we are taking a look at the MSI GS63VR Stealth Pro.

IMG_4857.JPG

One of the first notebooks to feature the GTX 1070 with Max-Q Design, the MSI GS63VR has a more traditional notebook form factor than the GTX 1080-toting ASUS ROG Zephyrus. In fact, the GS series has been a long-running line of thin-and-light gaming notebooks from MSI. What is new, though, is the availability of a GTX 1070-class option in this chassis; the GS63VR previously topped out with the GTX 1060 as the highest-end option.

MSI GS63VR Stealth Pro-002 (configuration as reviewed)
Processor Intel Core i7-7700HQ (Kaby Lake)
Graphics NVIDIA GeForce GTX 1070 with Max-Q Design (8GB)
Memory 32GB DDR4
Screen 15.6-in 1920x1080 120Hz
Storage 512GB Samsung PM871a M.2 SATA SSD + 1TB Seagate 5400RPM HDD
Camera 1080p
Wireless Intel 8265 802.11ac (2x2) + BT 4.1
Connections Ethernet, HDMI 2.0, 3x USB 3.0, Thunderbolt 3, Mini DisplayPort, 1x USB 2.0, audio combo jack
Battery 57 Wh
Dimensions 379.98mm x 248.92mm x 17.53mm (14.96" x 9.80" x 0.69")
Weight 3.96 lbs. (1792 g)
OS Windows 10 Pro
Price $2399 - Newegg.com / CUKUSA

Taking a look at the exact notebook configuration we are testing, we find a well-equipped gaming notebook. In addition to the GTX 1070 Max-Q, we find a 45W quad-core mobile CPU from Intel, 32GB of system RAM, and plentiful storage options, with both M.2 SSD and traditional 2.5" SATA drive configurations. This specific notebook is equipped with a SATA M.2 SSD, but the same M.2 slot also supports PCIe drives.

Continue reading our review of the MSI GS63VR Gaming Notebook!

Author:
Manufacturer: NVIDIA

Can you hear me now?

One of the more significant downsides to modern gaming notebooks is noise. These devices normally have small fans that have to spin quickly to cool the high-performance components found inside. While the answer to a loud gaming desktop might be a nice set of headphones, for notebooks used in more public spaces that does nothing for the friends or loved ones around you.

Attempting to address the problem of loud gaming notebooks, NVIDIA released a technology called WhisperMode. WhisperMode launched alongside NVIDIA's Max-Q design notebooks earlier this year, but it will work with any notebook equipped with an NVIDIA GTX 1060 or higher. This software solution aims to limit the noise and power consumption of notebooks by restricting your game's frame rate to a reasonable compromise between performance, noise, and power. NVIDIA has profiled over 400 games to find this sweet spot and added profiles for those games to WhisperMode.

WhisperMode is enabled through the NVIDIA GeForce Experience application.

GFE-whisper.PNG

From GFE, you can also choose to "Optimize games for WhisperMode." This will automatically adjust in-game settings to complement WhisperMode's frame rate target control.

NVCP_whisper.PNG

If you want to adjust the frame rate target, that must be done in the traditional NVIDIA Control Panel, and on a per-app basis. The target can be set in intervals of 5 FPS, from 30 FPS up to the maximum refresh rate of your display. Having to move between two pieces of software to tweak these settings seems overly complex; hopefully an upcoming revamp of the NVIDIA software stack will address this user interface shortcoming.
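NVIDIA hasn't published WhisperMode's internals, but the mechanism its per-game profiles sit on top of is a frame rate cap, which is simple to sketch. Below is a minimal, hypothetical Python illustration of the idea; the 5 FPS granularity and the 30-FPS-to-refresh clamp mirror the control panel behavior described above, and none of this is NVIDIA's actual code:

import time

def run_capped(render_frame, target_fps, max_refresh, n_frames=1000):
    # Snap the requested target to 5 FPS steps and clamp it to the
    # 30-FPS-to-max-refresh range, mirroring the control panel's granularity.
    target_fps = max(30, min(max_refresh, round(target_fps / 5) * 5))
    frame_budget = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.monotonic()
        render_frame()                    # stand-in for the game's frame work
        leftover = frame_budget - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)          # idle instead of racing ahead; the GPU
                                          # draws less power and the fans slow down

# Example: hold a trivial "render" to 40 FPS on a 120 Hz panel for ~3 seconds.
run_capped(lambda: None, target_fps=40, max_refresh=120, n_frames=120)

Sleeping off the unused frame budget is where the noise and power savings come from: the GPU spends part of every frame idle instead of rendering frames the compromise target says you don't need.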

To put WhisperMode through its paces, we tried it on two notebooks - one with a GTX 1070 Max-Q (the MSI GS63VR) and one with a GTX 1080 Max-Q (the ASUS ROG Zephyrus). Our testing consisted of two games, Metro: Last Light and Hitman. Both games were run for 15 minutes to get the system up to temperature and achieve sound measurements more representative of extended gameplay sessions. Sound levels were measured with our Extech 407739 Sound Level Meter placed at a distance of 6 inches from the given notebook, above the keyboard and offset to the right.

Continue reading our review of the new NVIDIA WhisperMode technology!

Author:
Subject: Mobile
Manufacturer: Various

Specifications

In the original premise for today's story, I had planned to do a standard, straightforward review of the iPad Pro 10.5-inch model, the latest addition to Apple's line of tablet devices. After receiving the 12.9-in variant, with the same processor upgrade but a larger and much more substantial screen, I started using them both as my daily-driver computing devices. I was surprised at how well both handled the majority of tasks I tossed their way, but there was still some lingering doubt in my mind about the usefulness of iOS as it exists today for my purposes.

The next step was for me to acquire an equivalent Windows 10-based tablet, try making THAT my everyday computer, and see how my experience changed. I picked up the new Surface Pro (2017) model that was priced nearly identically to the iPad Pro 12.9-in device. That did mean sacrificing some specifications I normally wouldn't, including moving down to 4GB of memory and a 128GB SSD. This brought the total of the iPad Pro + Pencil + keyboard within $90 of the Surface Pro and matching accessories.

IMG_4814.JPG

I should mention at the outset that with the pending release of iOS 11 due in the fall, the Apple iPad Pro line could undergo enough of a platform upgrade to change some of the points in this story. At that time, we can reevaluate our stance and conclusions.

Specifications

Let's start our editorial with a comparison of the hardware being tested in the specification department. Knowing that we are looking at two ARM-based devices and an x86 system, we should realize core counts, clocks, and the like are even less comparable than in the Intel/AMD debates. However, it does give us a good bearing on how the hardware landscape looks when we get into the benchmarking section of this story.

Surface Pro (2017) vs. iPad Pro (2017) Comparison
  Surface Pro (2017) | iPad Pro (2017)
Processor Intel Core i5-7300U (Kaby Lake), 2-core/4-thread | Apple A10X (3x high-performance Hurricane + 3x high-efficiency Zephyr cores)
Graphics Intel HD Graphics 620 | 12-core custom PowerVR
Memory 4GB | 4GB
Screen 12.3-in 2736x1824 IPS | 12.9-in 2732x2048 IPS 120 Hz; 10.5-in 2224x1668 IPS 120 Hz
Storage 128GB SSD | 256GB SSD
Camera 5MP front / 8MP rear | 7MP front / 12MP rear + OIS
Wireless 802.11ac | 802.11ac
Connections USB 3.0, Mini DisplayPort, headphone | Lightning, headphone
Battery 45 Wh | 41 Wh (12.9-in); 30.4 Wh (10.5-in)
Dimensions 11.50-in x 7.93-in x 0.33-in | 12.04-in x 8.69-in x 0.27-in (12.9-in); 9.87-in x 6.85-in x 0.24-in (10.5-in)
OS Windows 10 | iOS 10
Price $999 - Amazon.com | $899 (12.9-in); $749 (10.5-in) - Amazon.com

Continue reading our comparison of the 2017 Surface Pro and iPad Pro!

Author:
Subject: Mobile
Manufacturer: ASUS

Overview

A few months ago at Computex, NVIDIA announced their "GeForce GTX with Max-Q Design" initiative. Essentially, the heart of this program is the use of specifically binned GTX 1080, 1070 and 1060 GPUs. These GPUs have been tested and selected during the manufacturing process to ensure lower power draw at the same performance levels when compared to the GPUs used in more traditional form factors like desktop graphics cards.

slide1.png

In order to gain access to these "Max-Q" binned GPUs, notebook manufacturers have to meet specific NVIDIA guidelines on noise levels at thermal load (sub-40 dBA). To be clear, NVIDIA doesn't seem to be offering partners reference notebook designs (as demonstrated by the variability in design across the Max-Q notebooks), but rather ideas on how they can accomplish the given goals.

slide2.png

At the show, NVIDIA and some of their partners showed off several Max-Q notebooks. We hope to take a look at all of these machines in the coming weeks, but today we're focusing on one of the first, the ASUS ROG Zephyrus.

IMG_4744.JPG

ASUS ROG Zephyrus (configuration as reviewed)
Processor Intel Core i7-7700HQ (Kaby Lake)
Graphics NVIDIA GeForce GTX 1080 with Max-Q Design (8GB)
Memory 24GB DDR4 (8GB soldered + 2x 8GB DIMM)
Screen 15.6-in 1920x1080 120Hz G-SYNC
Storage 512GB Samsung SM961 NVMe
Camera HD webcam
Wireless 802.11ac
Connections Thunderbolt 3, HDMI 2.0, 4x USB 3.0, audio combo jack
Power 50 Wh battery, 230W AC adapter
Dimensions 378.9mm x 261.9mm x 17.01-17.78mm (14.92" x 10.31" x 0.67"-0.70")
Weight 4.94 lbs. (2241 g)
OS Windows 10 Home
Price $2700 - Amazon.com

As you can see, the ASUS ROG Zephyrus has specifications that would be at home in a high-end gaming desktop, let alone a gaming notebook. In some gaming notebook designs, the bottleneck comes down to CPU horsepower more than GPU horsepower; that doesn't seem to be the case here. The powerful GTX 1080 GPU is paired with a quad-core, Hyper-Threaded Intel processor capable of boosting up to 3.8 GHz.

Continue reading our review of the ASUS Zephyrus Max-Q Gaming Notebook!

Author:
Subject: General Tech
Manufacturer: SILVIA

Intelligent Gaming

Kal Simpson recently had the chance to sit down for an extensive interview with Alex Mayberry, Chief Product Officer at Cognitive Code. The company specializes in SILVIA, a conversational AI that can be adapted to a variety of platforms and applications. Kal's comments are in bold while Alex's are in italics.

SILVIA virtual assistant.jpg

Always good to speak with you, Alex. Whether it's the latest triple-A video game release or the progress being made in changing the way we play – virtual reality, for instance – your views on and contributions to the gaming space as a whole remain impressive. Before we begin, I'd like to give the audience a brief flashback of your career history. Prominent within the video game industry, you've been involved with many, many titles, primarily within the PC gaming space: Quake 2: The Reckoning, America's Army, a plethora of World of Warcraft titles.

Those more familiar with your work know you as the lead game producer for Diablo 3 / Reaper of Souls, as well as the executive producer for Star Citizen – the former of which we spoke about during the game's 2014 release for PC, PlayStation 4, and Xbox One.

So I ask: given your huge involvement with some of the most popular titles, what sparked your interest in the development of intelligent computing platforms? No doubt the technology can be adapted to applications within gaming, but what was the initial factor that drove you to Cognitive Code and the SILVIA technology?

AM: Conversational intelligence was something that I had never even thought about in terms of game development. My experience arguing with my Xbox and trying to get it to change my television channel left me pretty sceptical about the technology. But after leaving Star Citizen, I crossed paths with Leslie Spring, the CEO and founder of Cognitive Code and the creator of the SILVIA platform. Initially, Leslie was helping me out with some engineering work on VR projects I was spinning up. After collaborating for a bit, he introduced me to his AI, and I became intrigued by it. Although I was still very focused on VR at the time, my mind kept drifting to SILVIA.

I kept pestering Leslie with questions about the technology, and he continued to share some of the things that it could do. It was when I saw one of his game engine demos showing off a sci-fi world with freely conversant robots that the light went on in my head, and I suddenly got way more interested in artificial intelligence. At the same time, I was discovering challenges in VR that needed solutions. Not having a keyboard in VR creates an obstacle for capturing user input, and floating text in your field of view is really detrimental to the immersion of the experience. Also, when you have life-size characters in VR, you naturally want to speak to them. This is when I got interested in using SILVIA to introduce an entirely new mechanic to gaming and interactive entertainment. No more do we have to rely on conversation trees and scripted responses.

how-silvia-work1.jpg

No more do we have to read a wall of text from a quest giver. With this technology, we can have a realistic and free-form conversation with our game characters, and speak to them as if they are alive. This is such a powerful tool for interactive storytelling, and it will allow us to breathe life into virtual characters in a way that’s never before been possible. Seeing the opportunity in front of me, I joined up with Cognitive Code and have spent the last 18 months exploring how to design conversationally intelligent avatars. And I’ve been having a blast doing it.

Click here to continue reading the entire interview!

Subject: Mobile
Manufacturer: ASUS

Introduction and Specifications

The ZenBook 3 UX390UA is a 12.5-inch thin-and-light that offers a 1920x1080 IPS display, a choice of 7th-generation Intel Core i5 or Core i7 processors, 16GB of DDR4 memory, and a roomy 512GB PCIe SSD. It also features just a single USB Type-C port, eschewing additional I/O in the vein of recent Apple MacBooks (more on this trend later in the review). How does it stack up? I had the pleasure of using it for a few weeks and can offer my own usage impressions (along with those ever-popular benchmark numbers) to try to answer that question.

DSC_0144.jpg

A thin-and-light (a.k.a. ‘Ultrabook’) is certainly an attractive option when it comes to portability, and the ZenBook 3 delivers with a slim 0.5-inch thickness and 2 lb weight from its aluminum frame. Another aspect of thin-and-light designs is the typically low-power processor, though the “U” series in Intel’s 7th-generation lineup still offers good performance for portable machines. Looking at the spec sheet, it is clear that ASUS paid attention to performance with this ZenBook, and we will see later on if a good balance has been struck between performance and battery life.

Our review unit was equipped with a Core i7-7500U processor, a 2-core/4-thread part with a 15W TDP and speeds ranging from 2.70 - 3.50 GHz, along with the above-mentioned 16GB of RAM and 512GB SSD. With an MSRP of $1599 for this configuration, it faces stiff competition from the likes of the Dell XPS line and recent Lenovo ThinkPad and Apple MacBook offerings, though it can of course be found for less than MSRP (this configuration currently sells on Amazon for about $1499). The ZenBook 3 certainly offers style if you are into blade-like aluminum designs, and, while this is not a touchscreen, nothing short of Gorilla Glass 4 was employed to protect the LCD display.

“ZenBook 3’s design took some serious engineering prowess and craftsmanship to realize. The ultra-thin 11.9mm profile meant we had to invent the world’s most compact laptop hinge — just 3mm high — to preserve its sleek lines. To fit in the full-size keyboard, we had to create a surround that’s just 2.1mm wide at the edges, and we designed the powerful four-speaker audio system in partnership with audiophile specialists Harman Kardon. ZenBook is renowned for its unique, stunning looks, and you’ll instantly recognize the iconic Zen-inspired spun-metal finish on ZenBook 3’s all-metal unibody enclosure — a finish that takes 40 painstaking steps to create. But we’ve added a beautiful twist, using a special 2-phase anodizing process to create stunning golden edge highlights. To complete this sophisticated new theme, we’ve added a unique gold ASUS logo and given the keyboard a matching gold backlight.”

DSC_0147.jpg

Continue reading our review of the ASUS ZenBook 3 UX390UA laptop!

Subject: Mobile
Manufacturer: Samsung

Introduction and Specifications

The Galaxy S8 Plus is Samsung's first ‘big’ phone since the Note7 fiasco, and just looking at it, the design and engineering process seems to have paid off. Simply put, the GS8/GS8+ might just be the most striking handheld devices ever made. The U.S. version sports the newest and fastest Qualcomm platform, the Snapdragon 835, while the international version of the handset uses Samsung’s Exynos 8895 Octa SoC. We have the former on hand, and it was this MSM8998-powered version of the 6.2-inch GS8+ that I spent some quality time with over the past two weeks.

DSC_0836.jpg

There is more to a phone than its looks, and even in that department the Galaxy S8+ raises questions about durability with that large, curved glass screen. With the front and back panels wrapping around as they do, the phone has a very slim, elegant look that feels fantastic in hand. And while one drop could easily ruin your day with any smartphone, this design is particularly unforgiving - screen replacement costs on these new S8 phones are especially high due to the difficulty of the repair and the need to replace the AMOLED display along with the laminated glass.

Setting aside the fragility for a moment and embracing the case-free lifestyle I was so tempted to adopt (I didn't want to compromise the best in-hand feel I've ever had from a handset, and I didn't want to hide its knockout design, either), I got down to objectively assessing the phone's performance. This is the first production phone we have had on hand with the new Snapdragon 835 platform, so we will be able to draw some definitive performance conclusions compared to the SoCs in other shipping phones.

DSC_0815.jpg

Samsung Galaxy S8+ Specifications (US Version)
Display 6.2-inch 1440x2960 AMOLED
SoC Qualcomm Snapdragon 835 (MSM8998)
CPU Cores 4x 2.45 GHz Kryo
4x 1.90 GHz Kryo
GPU Cores Adreno 540
RAM 4 / 6 GB LPDDR4 (6 GB with 128 GB storage option)
Storage 64 / 128 GB
Network Snapdragon X16 LTE
Connectivity 802.11ac Wi-Fi
2x2 MU-MIMO
Bluetooth 5.0; A2DP, aptX
USB 3.1 (Type-C)
NFC
Battery 3500 mAh Li-Ion
Dimensions 159.5 x 73.4 x 8.1 mm, 173 g
OS Android 7.0

Continue reading our review of the Samsung Galaxy S8+ smartphone!

Author:
Subject: Mobile
Manufacturer: Dell

Overview

Editor’s Note: After our review of the Dell XPS 13 2-in-1, Dell contacted us about our performance results. They found our numbers were significantly lower than their own internal benchmarks. They offered to send us a replacement notebook to test, and we have done so. After spending some time with the new unit we have seen much higher results, more in line with Dell’s performance claims. We haven’t been able to find any differences between our initial sample and the new notebook, and our old sample has been sent back to Dell for further analysis. Due to these changes, the performance results and conclusion of this review have been edited to reflect the higher performance results.

It's difficult to believe that it's only been a little over 2 years since we got our hands on the revised Dell XPS 13. Placing an emphasis on minimalistic design, large displays in small chassis, and high-quality construction, the Dell XPS 13 seems to have influenced the "thin and light" market in some noticeable ways.

IMG_4579.JPG

Aiming their sights at a slightly different corner of the market, this year Dell unveiled the XPS 13 2-in-1, a convertible tablet with a 360-degree hinge. However, instead of just putting a new hinge on the existing XPS 13, Dell designed the all-new XPS 13 2-in-1 from the ground up to be even more "thin and light" than its older sibling, which has meant some substantial design changes.

Since we are a PC hardware-focused site, let's take a look under the hood to get an idea of what exactly we are talking about with the Dell XPS 13 2-in-1.

Dell XPS 13 2-in-1
MSRP $999 / $1199 / $1299 / $1399
Screen 13.3” FHD (1920 x 1080) InfinityEdge touch display
CPU Core i5-7Y54 or Core i7-7Y75
GPU Intel HD Graphics 615
RAM 4GB / 8GB / 16GB
Storage 128GB SATA or 256GB PCIe
Network Intel 8265 802.11ac MIMO (2.4 GHz, 5.0 GHz), Bluetooth 4.2
Display Output 1x Thunderbolt 3, 1x USB 3.1 Type-C (DisplayPort)
Connectivity USB 3.0 Type-C, 3.5mm headphone
Audio Dual-array digital microphone, stereo speakers (1W x 2)
Weight 2.7 lbs (1.24 kg)
Dimensions 11.98-in x 7.81-in x 0.32-0.54-in (304mm x 199mm x 8-13.7mm)
Battery 46 Wh
Operating System Windows 10 Home / Pro (+$50)

One of the more striking design decisions from a hardware perspective is the choice of the low-power Core i5-7Y54 processor, which you may be familiar with from its older naming scheme, Core M. In the Kaby Lake generation, Intel decided to drop the Core M branding (though, oddly, Core m3 still exists) and integrate these lower-power parts into the regular Core branding scheme.

Click here to continue reading our review of the Dell XPS 13 2-in-1

Subject: Mobile
Manufacturer: Google

Introduction and Design

In case you have not heard by now, the Pixel is Google's re-imagining of the Nexus phone concept: a fully stock version of the Android experience on custom, Google-authorized hardware, with the promise of the latest OS updates as they are released. So how does the hardware stack up? We are late into the life of the Pixel by now, and this is more of a long-term review, as I have had the smaller version of the phone on hand for some weeks. As a result, I can offer my candid view of the less-covered of the two Pixel handsets (most reviews center on the Pixel XL) and its performance.

DSC_0186.jpg

There was always a certain cachet to owning a Nexus phone, and you could rest assured that you would be running the latest version of Android before anyone on operator-controlled hardware. The Nexus phones were sold primarily by Google, unlocked, with operator/retail availability at times during their run. Things took a turn when Google opted to offer a carrier-branded version of the Nexus 6 back in November of 2014, along with their usual unlocked Google Play store offering. But this departure was not just an issue of branding, as the price jumped to a full $649; the off-contract cost of premium handsets such as Apple’s iPhone. How could Google hope to compete in a space dominated by Apple and Samsung phones purchased by and large with operator subsidies and installment plans? They did not compete, of course, and the Nexus 6 flopped.

Pixel, coming after the Huawei-manufactured Nexus 6P and LG-manufactured Nexus 5X, drops the “Nexus” branding while continuing the tradition of a reference Android experience - and the more recent tradition of premium pricing. As we have seen in the months since its release, the Pixel has not put much of a dent in the Apple/Samsung-dominated handset market. But even during the budget-friendly Nexus era, which offered a compelling mix of day-one Android OS updates and inexpensive, unlocked hardware (think Nexus 4 at $299 and Nexus 5 at $349), Google's own phones were never mainstream. Still, in keeping with iPhone and Galaxy flagships, $649 nets you a Pixel, which also launched through Verizon in an exclusive operator deal. Of course a larger version of the Pixel exists, and I would be remiss if I did not mention the Pixel XL. Unfortunately, I would also be remiss if I didn't mention that stock of the XL has been quite low, with availability constantly in question.

DSC_0169.jpg

The Pixel is hard to distinguish from an iPhone 7 from a distance (other than the home button)

Google Pixel Specifications
Display 5.0-inch 1080x1920 AMOLED
SoC Qualcomm Snapdragon 821 (MSM8996)
CPU Cores 2x 2.15 GHz Kryo
2x 1.60 GHz Kryo
GPU Cores Adreno 530
RAM 4GB LPDDR4
Storage 32 / 128 GB
Network Snapdragon X12 LTE
Connectivity 802.11ac Wi-Fi
2x2 MU-MIMO
Bluetooth 4.2
USB 3.0
NFC
Dimensions 143.8 x 69.5 x 8.5 mm, 143 g
OS Android 7.1

Continue reading our review of the Google Pixel smartphone!

Author:
Manufacturer: ARM

ARM Refreshes All the Things

This past April, ARM invited us to visit Cambridge, England, to discuss their plans for the coming year.  Quite a bit has changed for the company since our last ARM Tech Day in 2016.  They were acquired by SoftBank but continue to operate essentially as their own company.  They now have access to more funds, are less risk-averse, and have a greater ability to expand in the ever-growing mobile and IoT marketplaces.

dynamiq_01.png

The ARM of today is certainly quite different from the company we knew 10 years ago, when we saw their technology used in the first iPhone.  Back then, ARM had good technology but a relatively small headcount.  They kept pace with the industry but were not nearly as aggressive as other chip companies in some areas.  Over the past 10 years they have grown not only in numbers, but in the technologies they have constantly expanded upon.  The company became more PR-savvy and communicated more effectively with the press and, ultimately, their primary users.  Where once ARM would announce new products and not expect shipping products for upwards of 3 years, we now see the company being much more aggressive with their designs and getting them out to partners, so that production happens in months rather than years.

Several days of meetings and presentations left us a bit overwhelmed by what ARM is bringing to market towards the end of 2017 and, most likely, the beginning of 2018.  On the surface it appears that ARM has only done a refresh of its CPU and GPU products, but once we look at these products in the greater scheme and how they interact with DynamIQ, we see that ARM has changed the mobile computing landscape dramatically.  This new computing concept allows greater performance, flexibility, and efficiency in designs.  Partners will have far more control over these licensed products, letting them create more value and differentiation than in years past.

dynamiq_02.png

We previously covered DynamIQ at PCPer this past March.  ARM wanted to seed that concept before jumping into deeper discussion of their latest CPUs and GPUs.  Previous Cortex products cannot be used with DynamIQ; to leverage that technology, we need new CPU designs.  In this article we are covering the Cortex-A55 and Cortex-A75.  On the surface these two new CPUs look like a simple refresh, but when we dig in we see that massive changes have been wrought throughout.  ARM has taken the concepts of the previous A53 and A73 and expanded upon them fairly dramatically, not only to work with DynamIQ but also to remove significant bottlenecks that have impeded theoretical performance.

Continue reading our overview of the new family of ARM CPUs and GPU!

Author:
Subject: Systems, Mobile
Manufacturer: Apple

What have we here?

The latest iteration of the Apple MacBook Pro has been a polarizing topic for both Mac and PC enthusiasts. Replacing the aging Retina MacBook Pro introduced in 2012, the Apple MacBook Pro 13-inch with Touch Bar introduced late last year offered some radical design changes. After much debate (and a good Open Box deal), I decided to pick up one of these MacBooks to see if it could replace my 11" MacBook Air from 2013, which was certainly starting to show its age.

DSC02852.JPG

I'm sure that a lot of our readers, even if they aren't Mac users, are familiar with some of the major changes Apple made with this new MacBook Pro. One of the biggest comes when you take a look at the available connectivity on the machine. Gone are the ports you might expect, like USB Type-A, HDMI, and Mini DisplayPort. These have been replaced with 4 Thunderbolt 3 ports and a single 3.5mm headphone jack.

While it seems like USB-C (which is compatible with Thunderbolt 3) is eventually poised to take over the peripheral market, there are obvious issues with replacing all of the connectivity on a machine aimed at professionals with Type-C connectors. Currently, Type-C devices are few and far between, meaning you will have to rely on a series of dongles to connect the devices you already own.

DSC02860.JPG

I will say, however, that it ultimately hasn't been much of an issue for me so far in the limited time I've owned this MacBook. To evaluate how bad the dongle issue was, I purchased only a single, simple adapter with my MacBook, which provides a Type-A USB port and a pass-through Type-C port for charging.

Continue reading our look at using the MacBook Pro with Windows!

Author:
Subject: Mobile
Manufacturer: Dell

Overview

The Dell Inspiron 15 7000 Gaming series has been part of the increasingly interesting sub-$1000 gaming notebook market since its introduction in 2015. We took a look at last year’s offering and were very impressed with the performance it had to offer, but slightly disappointed in the build quality.

DSC02807.JPG

Dell is back this year with an all-new industrial design for the Inspiron 15 7000 Gaming, along with updated graphics in the form of the GeForce GTX 1050 Ti. Can an $850 gaming notebook possibly live up to expectations? Let’s take a closer look.

After three generations of the Dell Inspiron 15 Gaming product, it’s evident that Dell takes this market segment seriously. Alienware seems to have lost a bit of its hold on the hearts and minds of gamers in the high-end segment, but Dell has carved out a nice corner of the gaming market.

Dell Inspiron 15 7567 Gaming (configuration as reviewed)
Processor Intel Core i5-7300HQ (Kaby Lake)
Graphics NVIDIA GeForce GTX 1050 Ti (4GB)
Memory 8GB DDR4-2400 (one DIMM)
Screen 15.6-in 1920x1080
Storage 256GB SanDisk X400 SATA M.2 + available 2.5" drive slot
Camera 720p / dual digital-array microphone
Wireless Intel 3165 802.11ac + BT 4.2 (dual band, 1x1)
Connections Ethernet, HDMI 2.0, 3x USB 3.0, SD, audio combo jack
Battery 74 Wh
Dimensions 384.9mm x 274.73mm x 25.44mm (15.15" x 10.82" x 1")
Weight 5.76 lbs. (2620 g)
OS Windows 10 Home
Price $849 - Dell.com

Let's just get this out of the way: for the $850 price tag of the model Dell sent us for review, this is an amazing collection of hardware. Traditionally, laptops under $1000 have an obvious compromise, but it's difficult to find one here. Dedicated graphics, flash storage, a 1080p screen, and a large battery are all features I look for in notebooks. Needless to say, my expectations for the Inspiron 15 Gaming are quite high.

Click here to continue reading our review of the Dell Inspiron 15 7000 Gaming.

Author:
Subject: Processors, Mobile
Manufacturer: Qualcomm

A new start

Qualcomm is finally ready to show the world how the Snapdragon 835 Mobile Platform performs. After months of teases and previews, including the reveal that it was the first processor built on Samsung’s 10nm process technology and a mostly in-depth look at the architectural changes to the CPU and GPU portions of the SoC, the company let a handful of media get some hands-on time with the development reference platform and run some numbers.

To frame the discussion as best I can, I am going to include some sections from my technology overview. This should give some idea of what to expect from the Snapdragon 835 and which areas Qualcomm sees providing the widest variation from the previous SD 820/821 products.

Qualcomm frames the story around the Snapdragon 835 processor with what they call the “five pillars” – five different aspects of mobile processor design that they have addressed with updates and technologies. Qualcomm lists them as battery life (efficiency), immersion (performance), capture, connectivity, and security.

slides1-6.jpg

Starting where they start, on battery life and efficiency, the SD 835 has a unique focus that might surprise many. Rather than talking up the improvements in performance of the new processor cores, or the power of the new Adreno GPU, Qualcomm is firmly planted on looking at Snapdragon through the lens of battery life. Snapdragon 835 uses half of the power of Snapdragon 801.

slides2-11.jpg

Since we already knew that the Snapdragon 835 would be built on Samsung's 10nm process, the first such high-performance part to do so, I was surprised to learn that Qualcomm doesn’t attribute much of the power efficiency improvement to the move from 14nm to 10nm. It makes sense – most in the industry see this transition as modest in comparison to what we’ll see at 7nm. Unlike the move from 28nm to 14/16nm for discrete GPUs, where process technology was a huge reason for the dramatic power drop we saw, the Snapdragon 835’s gains come from a combination of advancements in the power management system and the offloading of work from the primary CPU cores to other processors like the GPU and DSP. The more a workload takes advantage of heterogeneous computing, the more it benefits from Qualcomm's technology as opposed to process technology.

slides2-22.jpg

Continue reading our preview of Qualcomm Snapdragon 835 performance!

Author:
Subject: Mobile
Manufacturer: Lenovo

Overview

If you look at the current 2-in-1 notebook market, it is clear that the single greatest influence is the Lenovo Yoga. Despite initial efforts to differentiate convertible notebook-tablet designs, newly released machines such as the HP Spectre x360 series and the Dell XPS 13 2-in-1 make it clear that the 360-degree "Yoga-style" hinge is the preferred approach.

DSC02488.JPG

Today, we are looking at a unique application of the 360-degree hinge, the Lenovo Yoga Book. Will this new take on the 2-in-1 concept prove as influential?

The Lenovo Yoga Book is a 10.1" tablet that aims to find a unique way to implement a stylus on a modern touch device. The device itself is a super-thin clamshell design, featuring an LCD on one side and a large touch-sensitive area on the other.

lenovo-yoga-book-feature-notetaking-android-full-width.jpg

This large touch area serves two purposes. Primarily, it acts as a surface for the included stylus, which Lenovo calls the Real Pen. Using the Real Pen, users can do things such as sketch in Adobe Photoshop and Illustrator or take notes in an application such as Microsoft OneNote.

The Real Pen has more tricks up its sleeve than a normal stylus. It can be converted from a pen with a stylus tip to a full ballpoint pen. When paired with the "Create Pad" included with the Yoga Book, you can write on an actual piece of paper with the ballpoint pen and still have the device pick up what you are drawing.

Click here to continue reading our review of the Lenovo Yoga Book.

Manufacturer: PC Perspective

Living Long and Prospering

The open fork of AMD’s Mantle, the Vulkan API, was released exactly a year ago with, as we reported, a hard launch. This meant public (but not main-branch) drivers for developers, a few public SDKs, a proof-of-concept patch for The Talos Principle, and, of course, the ratified specification. This set the API up to find success right out of the gate, and we can now look back over the year since.

khronos-2017-vulkan-alt-logo.png

Thor's hammer, or a tempest in a teapot?

The elephant in the room is DOOM. This game has successfully integrated the API and it uses many of its more interesting features, like asynchronous compute. Because the API is designed in a sort-of “make a command, drop it on a list” paradigm, the driver is able to select commands based on priority and available resources. AMD’s products got a significant performance boost, relative to OpenGL, catapulting their Fury X GPU up to the enthusiast level that its theoretical performance suggested.
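That "make a command, drop it on a list" model is what makes scheduling tricks like async compute possible: work is recorded up front, and the driver decides when each submission's commands actually run. Here is a toy Python analogy of the record-then-submit idea (this illustrates only the scheduling concept, with lower numbers meaning higher priority; it is not the Vulkan API):

import heapq
from itertools import count

submissions = []          # min-heap of (priority, submit_order, command_list)
order = count()

def submit(command_list, priority):
    # Recording was done earlier and is cheap; submission just queues the
    # work, and execution is deferred until the "driver" pops this entry.
    heapq.heappush(submissions, (priority, next(order), command_list))

submit(["shadow pass", "main color pass"], priority=1)   # graphics queue work
submit(["particle sim dispatch"], priority=0)            # async compute, higher priority

while submissions:
    _, _, commands = heapq.heappop(submissions)
    for cmd in commands:
        print("executing:", cmd)   # a real driver interleaves these on GPU hardware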

Mobile developers have been picking up the API, too. Google, who is known for banishing OpenCL from their Nexus line and challenging OpenGL ES with their Android Extension Pack (later integrated into OpenGL ES with version 3.2), has strongly backed Vulkan. The API was integrated as a core feature of Android 7.0.

On the engine and middleware side of things, Vulkan is currently “ready for shipping games” as of Unreal Engine 4.14. It is also included in Unity 5.6 Beta, which is expected for full release in March. Frameworks for emulators are also integrating Vulkan, often just to say they did, but sometimes to emulate the quirks of these systems’ offbeat graphics co-processors. Many other engines, from Source 2 to Torque 3D, have also announced or added Vulkan support.

Finally, for the API itself, The Khronos Group announced (pg 22 from SIGGRAPH 2016) areas that they are actively working on. The top feature is “better” multi-GPU support. While Vulkan, like OpenCL, allows developers to enumerate all graphics devices and target them individually with work, it doesn’t have certain mechanisms, like being able to directly ingest output from one GPU into another. They haven’t announced a timeline for this.

Subject: Mobile
Manufacturer: Huawei

Introduction and Specifications

The Mate 9 is the current version of Huawei’s signature 6-inch smartphone, building on last year’s iteration with the company’s new Kirin 960 SoC (featuring ARM's next-generation Bifrost GPU architecture), improved industrial design, and exclusive Leica-branded dual camera system.

Mate9_Main.jpg

In the ultra-competitive smartphone world there is little room at the top, and most companies are simply looking for a share of the market. Apple and Samsung have occupied the top two spots for some time, with HTC, LG, Motorola, and others far behind. But the new #3 emerged not from the usual suspects but from a name many of us in the USA had not heard of until recently - and it is the manufacturer of the Mate 9. Compared to the preceding Mate 8 (which we looked at this past August), this new handset is a significant improvement in most respects.

With this phone Huawei has really come into their own with their signature phone design, and 2016 was a very good year for the company’s smartphone offerings. The P9 handset launched early in 2016, offering not only solid specs and impressive industrial design, but a unique camera that was far more than a gimmick. Huawei’s partnership with Leica has resulted in a dual-camera system that operates differently than systems found on phones such as the iPhone 7 Plus, and the results are very impressive. The Mate 9 is an extension of that P9 design, adapted for their larger Mate smartphone series.

DSC_0649.jpg

Continue reading our review of the Huawei Mate 9!

Subject: Mobile
Manufacturer: Lenovo
Tagged: yoga, X1, Thinkpad, oled, Lenovo

Intro, Exterior and Internal Features

Lenovo sent over an OLED-equipped ThinkPad X1 Yoga a while back. I was mid-development on our client SSD test suite and had some upcoming travel. Given that the new suite’s result-crunching spreadsheet extends out to column FHY (4289 for those counting), I really needed a higher-res screen and improved computing horsepower in a mobile package. I commandeered the X1 Yoga OLED for the trip, and to say it grew on me quickly is an understatement. While I do tend to reserve my heavier-duty computing tasks and crazy spreadsheets for desktop machines and 40” 4K displays, the compute power of the X1 Yoga proved quite reasonable for a mobile platform. Sure, there is a built-in pen that comes in handy when employing the Yoga’s flip-over convertibility into tablet mode, but the real beauty of this particular laptop is its optional 2560x1440 14” OLED display.

170209-180903.jpg

OLED is just one of those things you need to see in person to truly appreciate. Photos of these screens just can’t capture the perfect blacks and vivid colors. In productivity use, something about either the pixel pattern or the amazing contrast made me feel like the effective resolution of the panel was higher than its rating. It really is a shame that you are likely reading this article on an LCD, because the OLED panel on this particular model of Lenovo laptop really is the superstar. I’ll dive more into the display later on, but for now let’s cover the basics:

Read on for our review of the ThinkPad X1 Yoga!

Subject: Mobile
Manufacturer: Qualcomm

Introduction

In conjunction with Ericsson, Netgear, and Telstra, Qualcomm officially unveiled the first Gigabit LTE-ready network. Sydney, Australia is the first city to have the new cellular spec deployed, through Telstra. Gigabit LTE, dubbed 4GX by Telstra, offers up to 1 Gbps download speeds and 150 Mbps upload speeds with a supported device. Making Gigabit LTE a reality took a partnership between all four companies: Ericsson provided the backend hardware and software infrastructure and upgrades, Qualcomm designed its next-gen Snapdragon 835 SoC and Snapdragon X16 modem for Gigabit LTE support, Netgear developed the Nighthawk M1 mobile router that leverages the Snapdragon 835, and Telstra brought it all together on its Australian cellular network. Qualcomm, Ericsson, and Telstra all see the 4GX implementation as a solid step forward on the path to 5G, with 4GX acting as the foundation layer for next-gen 5G networks and providing a fallback, much as 3G acted as a fallback for current 4G LTE networks.

Gigabit LTE Explained

02-telstra-gigabit-lte-explained.jpg

Courtesy of Telstra

What exactly is meant by Gigabit LTE (or 4GX, as Telstra has dubbed the new cellular technology)? Gigabit LTE increases the download and upload speeds of current-generation 4G LTE to 1 Gbps and 150 Mbps, respectively, by leveraging several technologies that optimize the signal transmission between the consumer device and the cellular network. Qualcomm designed the Snapdragon X16 modem to operate on dual 60MHz signals with 4x4 MIMO support, or dual 80MHz signals without 4x4 MIMO. Further, they increased the modem's QAM support from the current 64-QAM (6 bits per symbol) to 256-QAM (8 bits per symbol), enabling 33% more data per stream - an increase from 75 Mbps to 100 Mbps per stream. The X16 modem leverages a total of 10 communication streams to deliver up to 1 Gbps, and also offers access to previously inaccessible frequency bands using LAA (License Assisted Access) to meet the increased power and speed needs of Gigabit LTE.
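To make that arithmetic concrete, here is a quick back-of-the-envelope check of the figures above in Python (the numbers are the ones quoted in this article; the script is our own sanity check, not anything from Qualcomm):

# Sanity check of the Gigabit LTE arithmetic quoted above.
bits_per_symbol_64qam = 6      # 64-QAM carries 6 bits per symbol
bits_per_symbol_256qam = 8     # 256-QAM carries 8 bits per symbol
per_stream_64qam_mbps = 75     # per-stream rate at 64-QAM, per the article

gain = bits_per_symbol_256qam / bits_per_symbol_64qam    # 1.33x, i.e. "33% more data"
per_stream_256qam_mbps = per_stream_64qam_mbps * gain    # 100 Mbps per stream
streams = 10                                             # total streams on the X16

total_mbps = per_stream_256qam_mbps * streams
print(f"{per_stream_256qam_mbps:.0f} Mbps/stream x {streams} streams = {total_mbps:.0f} Mbps")
# -> 100 Mbps/stream x 10 streams = 1000 Mbps, the advertised 1 Gbps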

Continue reading our coverage of the Gigabit LTE technology!

Author:
Subject: Processors, Mobile
Manufacturer: Qualcomm

Semi-custom CPU

With the new year comes a new push for performance, efficiency, and feature leadership from Qualcomm and its Snapdragon line of mobile SoCs. The Snapdragon 835 was officially announced in November of last year, when the partnership with Samsung on 10nm process technology was revealed, but we now have the freedom to share more of the details on this new part and how it changes Qualcomm’s position in the ultra-device market. Devices built on the new 835 won’t reach the market for several more months, though, with announcements likely coming at CES this year.

slides1-5.jpg

Qualcomm frames the story around the Snapdragon 835 processor with what they call the “five pillars” – five different aspects of mobile processor design that they have addressed with updates and technologies. Qualcomm lists them as battery life (efficiency), immersion (performance), capture, connectivity, and security.

slides1-6.jpg

Starting where they start, on battery life and efficiency, the SD 835 has a unique focus that might surprise many. Rather than talking up the improvements in performance of the new processor cores, or the power of the new Adreno GPU, Qualcomm is firmly planted on looking at Snapdragon through the lens of battery life. Snapdragon 835 uses half of the power of Snapdragon 801.

slides2-2.jpg

The company touts usage claims of 1+ day of talk time, 5+ days of music playback, 11 hours of 4K video playback, 3 hours of 4K video capture, and 2+ hours of sustained VR gaming. These sound impressive, but as we must always do in this market, you have to wait for consumer devices from Qualcomm partners to really measure how well the platform does. Walking through a typical power-user comparison of a device built on the Snapdragon 835 to one using the 820, Qualcomm thinks it could result in 2 or more hours of additional battery life at the end of the day.

We have already discussed the new Quick Charge 4 technology, which can offer 5 hours of use from just 5 minutes of charge time.

Continue reading our preview of the Qualcomm Snapdragon 835 SoC!

Subject: Systems, Mobile

Vulkan 1.0, OpenGL 4.5, and OpenGL ES 3.2 on a console

A few days ago, sharp eyes across the internet noticed that Nintendo’s Switch console had been added to the lists of compliant hardware at The Khronos Group. Vulkan 1.0 was the eye-catcher, although the other tabs also claim conformance with OpenGL 4.5 and OpenGL ES 3.2. The device is not listed as compatible with OpenCL, although that does not really surprise me for a single-GPU gaming system; the other three APIs have compute shaders designed around the needs of game developers. So the Nintendo Switch conforms to the latest standards of the three most important graphics APIs that a gaming device should use -- awesome.

But what about performance?

In other news, Eurogamer / Digital Foundry and VentureBeat uncovered information about the hardware. It will apparently use a Tegra X1, based around second-generation Maxwell, that is under-clocked relative to what we see in the Shield TV. When docked, the GPU will be able to reach 768 MHz on its 256 CUDA cores. When undocked, this drops to 307.2 MHz (although the system can utilize this mode while docked, too). This puts the performance at ~315 GFLOPS mobile, pushing up to ~785 GFLOPS when docked.

You might compare this to the Xbox One, which runs at ~1310 GFLOPS, and the PlayStation 4, which runs at ~1840 GFLOPS. This puts the Nintendo Switch somewhat behind both, although the difference is even greater than it appears. The FLOPS calculation for Sony and Microsoft is 2 x Shader Count x Frequency, but the calculation for Nintendo’s Switch is 4 x Shader Count x Frequency. FMA accounts for the factor of two, but the extra factor of two in Nintendo’s case... ...

Yup, the Switch’s performance rating is calculated as FP16, not FP32.
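All of the figures above fall out of a few lines of arithmetic. A small Python sketch (the clocks and core count are from the leak; reading the extra 2x as packed FP16 on Maxwell is our interpretation):

# Reproducing the quoted GFLOPS figures (clocks in GHz, 256 CUDA cores).
CUDA_CORES = 256

def gflops(clock_ghz, ops_per_core_per_clock):
    """GFLOPS = ops/core/clock x cores x clock (GHz); an FMA counts as 2 ops."""
    return ops_per_core_per_clock * CUDA_CORES * clock_ghz

print(gflops(0.768, 2))    # docked, FP32 (FMA = 2 ops)        -> ~393 GFLOPS
print(gflops(0.768, 4))    # docked, FP16 (2 ops x 2-wide)     -> ~786 GFLOPS
print(gflops(0.3072, 4))   # undocked, FP16                    -> ~315 GFLOPS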

nintendo-2016-switch-gpu.png

Snippet from an alleged leak of what Nintendo is telling developers.
If true, it's very interesting that FP16 values are being discussed as canonical.

Reducing shader precision to 16-bit is common for mobile devices. It takes fewer transistors to store and operate on half-precision values, and accumulated error is muted by the fact that you’re viewing the result on a mobile screen. The Switch isn’t always a mobile device, though, so it will be interesting to see how this reduction of lighting and shading precision affects games on your home TV, especially in titles that don’t follow Nintendo’s art styles. That said, shaders can use 32-bit values, but then you are cutting your performance for those instructions in half when you are already somewhat behind your competitors.

As for the loss of performance when undocked, it shouldn’t be too much of an issue if Nintendo pressures developers to hit 1080p when docked. If that’s the case, the lower-resolution 720p mobile screen roughly scales with the difference in clock.
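A quick check of that scaling claim, using the clocks quoted above (our own back-of-envelope comparison):

# Undocked/docked clock ratio vs. 720p/1080p pixel ratio.
print(307.2 / 768)                   # 0.40 of the docked GPU clock
print((1280 * 720) / (1920 * 1080))  # ~0.44 of the docked pixel count

The GPU sheds 60% of its clock while a 720p screen sheds about 56% of 1080p's pixels, so the per-pixel shader budget stays roughly constant between modes.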

Lastly, there are a bunch of questions surrounding Nintendo’s choice of operating system: basically, all the questions. It’s being developed by Nintendo, but we have no idea what they forked it from. NVIDIA supports the Tegra SoC on both Android and Linux, so it would be legal for Nintendo to fork either one, and Nintendo could have just asked for drivers even if NVIDIA didn’t already support the platform in question. Basically, anything is possible from the outside, and I haven’t seen any solid leaks from the inside.

The Nintendo Switch launches in March.