Podcast #376 - Intel Speed Shift, CPU Coolers from Noctua and DEEPCOOL, Broadwell-E Rumors, and more!

Subject: General Tech | November 19, 2015 - 02:42 PM |
Tagged: podcast, video, noctua, Deepcool, Gamer Storm Gabriel, Intel, speed shift, amd, R9, fury x, trixx, Broadwell-E, kaby lake, nvidia, shield tablet k1, knights landing, asus, chromebit

PC Perspective Podcast #376 - 11/19/2015

Join us this week as we discuss Intel Speed Shift, CPU Coolers from Noctua and DEEPCOOL, Broadwell-E Rumors, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Sebastian Peak

Program length: 1:19:22

  1. Week in Review:
  2. 0:32:10 This episode of PC Perspective Podcast is brought to you by Braintree. Even the best mobile app won’t work without the right payments API. That’s where the Braintree v.0 SDK comes in. One amazingly simple integration gives you every way to pay. Try out the sandbox and see for yourself at braintreepayments.com/pcper
  3. News item of interest:
  4. Hardware/Software Picks of the Week:
  5. Closing/outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

NVIDIA Re-Releases SHIELD Tablet as K1 - Cuts Price to $199

Subject: Mobile | November 18, 2015 - 06:41 PM |
Tagged: tegra k1, tablet, shield tablet k1, shield controller, shield, nvidia, gaming tablet, Android

NVIDIA has re-released its SHIELD tablet, and very little has changed other than the name - it's now the SHIELD tablet K1 - and the price, which drops $100 to $199.99.


The SHIELD tablet K1 (the pictured case and controller are not included)

Under the hood the 8-inch Android-powered tablet is identical to its predecessor, with the quad-core Tegra K1 processor and its 192-CUDA-core GPU powering the gaming action on the 1920x1200 display. The controller is still a separate $59.99 purchase, but of course it is not required to use the tablet.

Here are full specs from NVIDIA:

  • Processor: NVIDIA Tegra K1 with 192-core Kepler GPU (2.2 GHz ARM Cortex A15 CPU, 2 GB RAM)
  • Display: 8-inch 1920x1200 multi-touch full-HD display
  • Audio: Front-facing stereo speakers with built-in microphone
  • Storage: 16 GB
  • Wireless: 802.11n 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.0 LE, GPS/GLONASS
  • I/O: Mini-HDMI output, Micro-USB 2.0, MicroSD slot, 3.5 mm stereo headphone jack with microphone support
  • Motion Sensors: 3-axis gyro, 3-axis accelerometer, 3-axis compass
  • Cameras: Front, 5MP HDR; Back, 5MP auto-focus HDR
  • Battery: 19.75 Watt Hours
  • Weight and Dimensions: 12.6 oz (356 g); H x W x D: 8.8 in (221 mm) x 5.0 in (126 mm) x 0.36 in (9.2 mm)
  • Operating System: Android Lollipop
  • Gaming Features: SHIELD controller compatible, GeForce NOW cloud gaming service, Console Mode, NVIDIA ShadowPlay
  • Included Apps: Google Play, NVIDIA SHIELD Hub, Fallout Shelter, NVIDIA Dabbler, Squid, Twitch


This update really comes down to price, as NVIDIA is being more aggressive about driving adoption of its gaming tablet with the new MSRP. It doesn't come without some concessions, however, as the SHIELD tablet K1 ships without any accessories (no USB cable or charger). It's a move reminiscent of Nintendo's "New 3DS XL", which also shipped without a charger, and a standard micro-USB cable should be readily at hand for most of the target audience.

The question, of course, is whether this is now a more compelling product at $199. It does make the controller seem a bit more affordable, considering the tablet-plus-controller bundle now runs about $260 - $40 below the previous tablet-only price. Time will tell (and of course you can let us know in the comments below!).

NVIDIA is selling the SHIELD tablet K1 directly from their web store, and it's already on Amazon for the same $199.99 price.

Source: NVIDIA

Podcast #375 - Snapdragon 820, Lenovo Yoga 900, R9 380X and more!

Subject: General Tech | November 12, 2015 - 02:47 PM |
Tagged: podcast, video, qualcomm, snapdragon 820, Lenovo, yoga 900, be quiet!, amd, r9 380x, GLOBALFOUNDRIES, 14nm, FinFET, nvidia, asus, Maximus VIII Extreme, Thrustmaster, T300

PC Perspective Podcast #375 - 11/12/2015

Join us this week as we discuss the Snapdragon 820, Lenovo Yoga 900, R9 380X and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Sebastian Peak

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!


NVIDIA Jetson TX1 Will Power Autonomous Embedded Devices With Machine Learning

Subject: General Tech | November 12, 2015 - 02:46 AM |
Tagged: Tegra X1, nvidia, maxwell, machine learning, jetson, deep neural network, CUDA, computer vision

Nearly two years ago, NVIDIA unleashed the Jetson TK1, a tiny module for embedded systems based around the company's Tegra K1 "super chip." It was the company's first foray into CUDA-powered embedded systems capable of machine learning tasks such as object recognition and 3D scene processing, enabling things like accident avoidance and self-parking cars.

Now, NVIDIA is releasing an even more powerful kit called the Jetson TX1. The new development platform covers two pieces of hardware: the credit-card-sized Jetson TX1 module, and a larger Jetson TX1 Development Kit that the module plugs into, which provides plenty of I/O options and pin-outs. The dev kit can be used by software developers or for prototyping, while the module alone can be integrated into finalized embedded products.


NVIDIA foresees the Jetson TX1 being used in drones, autonomous vehicles, security systems, medical devices, and IoT devices coupled with deep neural networks, machine learning, and computer vision software. Devices would be able to learn from the environment in order to navigate safely, identify and classify objects of interest, and perform 3D mapping and scene modeling. NVIDIA partnered with several companies, including Kespry and Stereolabs, on proofs of concept.

Using the TX1, Kespry was able to have drones classify and track construction equipment moving around a construction site in real time. Because sites and weather conditions vary, the drone could not be programmed for its exact environment in advance; instead, machine learning and computer vision allowed it to navigate the site, while a deep neural network identified and classified the types of equipment it saw through its cameras. Meanwhile, Stereolabs used high resolution cameras and depth sensors to capture photos of buildings, then used software to reconstruct the 3D scene virtually for editing and modeling. You can find other proof-of-concept videos, including upgrading existing drones to be more autonomous, posted here.

From the press release:

"Jetson TX1 will enable a new generation of incredibly capable autonomous devices," said Deepu Talla, vice president and general manager of the Tegra business at NVIDIA. "They will navigate on their own, recognize objects and faces, and become increasingly intelligent through machine learning. It will enable developers to create industry-changing products."

But what about the hardware side of things? Well, the TX1 is a respectable leap in hardware and compute performance. Rated at 1 TFLOPS of FP16 compute, the TX1 pairs four ARM Cortex A57 and four ARM Cortex A53 64-bit CPU cores with a 256-core Maxwell-based GPU. That is definitely respectable for its size and low power consumption, especially considering NVIDIA claims the SoC can best the Intel Skylake Core i7-6700K in certain workloads (thanks to the GPU portion). The module further contains 4GB of LPDDR4 memory and 16GB of eMMC flash storage.

In short, versus the Jetson TK1's 2GB of DDR3 and 192-core Kepler GPU, on-module storage has not increased, but RAM has doubled, FP16 compute performance has roughly tripled, and FP32 performance is up by roughly 57% (going by AnandTech's 512 GFLOPS estimate in the chart below). The TX1 also uses a smaller 20nm process node (versus 28nm), and the chip is said to use "very little power." Networking support includes 802.11ac and Gigabit Ethernet. The chart below outlines the major differences between the two platforms.

                                Jetson TX1                          Jetson TK1
  GPU (Architecture)            256-core (Maxwell)                  192-core (Kepler)
  CPU                           4 x ARM Cortex A57 + 4 x A53        "4+1" ARM Cortex A15 "r3"
  eMMC                          16 GB                               16 GB
  Compute Performance (FP16)    1 TFLOPS                            326 GFLOPS
  Compute Performance (FP32)    512 GFLOPS (AnandTech's estimate)   326 GFLOPS (NVIDIA's number)
  Manufacturing Node            20nm                                28nm
  Launch Pricing                $299                                $192
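For context on those FP16 numbers: the TX1's Maxwell GPU (compute capability 5.3) reaches its doubled FP16 rate by packing two half-precision values into each 32-bit register and operating on both lanes at once. Below is a minimal sketch of that packed arithmetic - note it assumes CUDA 7.5 or later for the cuda_fp16.h intrinsics (NVIDIA's TX1 materials cite CUDA 7.0), and the kernel and file names are our own.

```cuda
// fp16_add.cu - packed FP16 add, the building block behind the TX1's
// 1 TFLOPS FP16 rating. Build: nvcc -arch=sm_53 fp16_add.cu
#include <cstdio>
#include <cuda_fp16.h>

__global__ void add_fp16(const float2 *a, const float2 *b, float2 *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        __half2 ha = __floats2half2_rn(a[i].x, a[i].y);  // pack two floats
        __half2 hb = __floats2half2_rn(b[i].x, b[i].y);
        __half2 hc = __hadd2(ha, hb);   // one instruction, two FP16 adds
        c[i] = __half22float2(hc);      // unpack so the host can print it
    }
}

int main() {
    const int n = 256;
    float2 *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float2));  // unified memory: one pointer
    cudaMallocManaged(&b, n * sizeof(float2));  // visible to both CPU and GPU
    cudaMallocManaged(&c, n * sizeof(float2));
    for (int i = 0; i < n; ++i) {
        a[i] = make_float2(1.0f, 2.0f);
        b[i] = make_float2(3.0f, 4.0f);
    }
    add_fp16<<<(n + 127) / 128, 128>>>(a, b, c, n);
    cudaDeviceSynchronize();
    printf("c[0] = (%.1f, %.1f)\n", c[0].x, c[0].y);  // expect (4.0, 6.0)
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

That half2 packing is exactly why the FP16 figure in the chart is double the FP32 one: each 32-bit lane performs two half-precision operations per clock.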

The TX1 runs the Linux for Tegra operating system and supports the usual suspects of CUDA 7.0, cuDNN, and VisionWorks development software, as well as the latest OpenGL drivers (OpenGL 4.5, OpenGL ES 3.1, and Vulkan).

NVIDIA is continuing to push for CUDA Everywhere, and the Jetson TX1 looks to be a more mature product that builds on the TK1. The huge leap in compute performance should enable even more interesting projects and bring more sophisticated automation and machine learning to smaller and more intelligent devices.

For those interested, the Jetson TX1 Development Kit (the full I/O development board with bundled module) will be available for pre-order today at $599 while the TX1 module itself will be available soon for approximately $299 each in orders of 1,000 or more (like Intel's tray pricing).

With CUDA 7 it is apparently possible for the GPU to be used for general-purpose processing as well, which may open up some doors that were not possible before in such a small device. I am interested to see what happens with NVIDIA's embedded device play and what kinds of automated hardware end up powered by the tiny SoC and its beefy graphics.

Source: NVIDIA

Fallout 4 performance at the high end

Subject: General Tech | November 11, 2015 - 06:36 PM |
Tagged: R9 FuryX, nvidia, GTX 980 Ti, gaming, fallout 4, amd

[H]ard|OCP tested the performance of the 980 Ti and Fury X in single-card configurations, as multi-GPU support is non-existent in Fallout 4; some have had moderate success with workarounds, which [H] mentions at the end of the review.  At launch it seems NVIDIA's card offers significantly better performance overall; hopefully that delta will decrease as patches and drivers are rolled out.  As far as features go, enabling godrays has a huge effect on performance on both cards, and FXAA is the best-performing AA when displaying a wide variety of terrain, though close forested areas allowed TAA to narrow the gap.  As for the game itself, they do not yet sound overly impressed.


"Fallout 4 is out on the PC, in this preview we will take a look at performance between GeForce GTX 980 Ti and Radeon R9 Fury X as well as some in-game feature performance comparisons. We'll also take a look at some in-game feature screenshots and find out what settings are best for an enjoyable gaming experience."

Here is some more Tech News from around the web:


Source: [H]ard|OCP

NVIDIA's new Tesla M40 series

Subject: General Tech | November 11, 2015 - 06:12 PM |
Tagged: nvidia, Tesla M40, neural net, JetsonTX1

There are a lot of loose terms tossed about, such as AI research and machine learning, which refer to the work of designing neural nets: feeding huge amounts of data into an architecture capable of forming and weighting connections, in an attempt to create a system that can process that input in a meaningful way.  You might be familiar with some of the more famous experiments, such as Google's Deep Dream and the Wolfram Language Image Identification Project.  As you might expect, this takes a huge amount of computational power, and NVIDIA has just announced the Tesla M40 accelerator card for training deep neural nets.  It is fairly low powered at 50-75W of draw, and NVIDIA claims it will be able to deal with five times more simultaneous video streams than previous products.  Along with it comes Hyperscale Suite software, designed specifically for the new hardware, which Jen-Hsun Huang comments on over at The Inquirer.
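To make "forming and weighting connections" a bit more concrete, here is a toy CUDA sketch (our illustration, not NVIDIA's software) of the primitive that training hardware like this runs billions of times: nudging a neuron's connection weights in whatever direction reduces the prediction error. Every name in it is invented for the example.

```cuda
// toy_sgd.cu - one gradient-descent step for a single linear neuron.
// Real frameworks run this across millions of weights and many layers.
#include <cstdio>

// w[j] -= lr * err * x[j], where err = prediction - target.
__global__ void sgd_step(float *w, const float *x, float err, float lr, int n) {
    int j = blockIdx.x * blockDim.x + threadIdx.x;
    if (j < n)
        w[j] -= lr * err * x[j];  // strengthen or weaken each connection
}

int main() {
    const int n = 4;
    float h_x[n] = {1.f, 2.f, 3.f, 4.f};   // one training input
    float h_w[n] = {0.f, 0.f, 0.f, 0.f};   // connection weights, start at zero
    float *d_w, *d_x;
    cudaMalloc(&d_w, n * sizeof(float));
    cudaMalloc(&d_x, n * sizeof(float));
    cudaMemcpy(d_w, h_w, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_x, h_x, n * sizeof(float), cudaMemcpyHostToDevice);

    // prediction = dot(w, x) = 0 at the start; if the target is 1,
    // the error is -1 and the weights get pushed toward the input.
    sgd_step<<<1, 32>>>(d_w, d_x, /*err=*/-1.f, /*lr=*/0.1f, n);
    cudaMemcpy(h_w, d_w, n * sizeof(float), cudaMemcpyDeviceToHost);
    for (int j = 0; j < n; ++j)
        printf("w[%d] = %.2f\n", j, h_w[j]);  // 0.10, 0.20, 0.30, 0.40
    cudaFree(d_w); cudaFree(d_x);
    return 0;
}
```

Repeat that step over enough examples and the weights converge on a useful mapping - that repetition is the "huge amount of computational power" a card like the M40 is built to supply.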

At the end of the presentation he also mentioned the tiny Jetson TX1 SoC.  It has a 256-core Maxwell GPU capable of 1 TFLOPS, a 64-bit ARM A57 CPU, and 4GB of memory, communicates via Ethernet or Wi-Fi, and fits on a card measuring just 50 x 87 mm (2 x 3.4 in).  It will be available at $300 when released sometime early next year.


"Machine learning is the grand computational challenge of our generation. We created the Tesla hyperscale accelerator line to give machine learning a 10X boost. The time and cost savings to data centres will be significant."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

NVIDIA Releases Driver 358.91 for Fallout 4, Star Wars Battlefront, Legacy of the Void

Subject: Graphics Cards | November 9, 2015 - 01:44 PM |
Tagged: nvidia, geforce, 358.91, fallout 4, Star Wars, battlefront, starcraft, legacy of the void

It's a huge month for PC gaming with the release of Bethesda's Fallout 4 and EA's Star Wars Battlefront likely to take up hours and hours of your (and my) time in the lead up to the holiday season. NVIDIA just passed over links to its latest "Game Ready" driver, version 358.91.


Fallout 4 is going to be impressive graphically

Here's the blurb from NVIDIA directly:

Continuing to fulfill our commitment to GeForce gamers to have them Game Ready for the top Holiday titles, today we released a new Game Ready driver.  This Game Ready driver will get GeForce Gamers set-up for tomorrow’s release of Fallout 4, as well as Star Wars Battlefront, StarCraft II: Legacy of the Void. WHQLed and ready for the Fallout wasteland, driver version 358.91 will deliver the best experience for GeForce gamers in some of the holiday’s hottest titles.

Other than learning that NVIDIA considers "WHQLed" to be a verb now, this is good news for PC gamers looking to dive into the world of Fallout or take up arms against the Empire on the day of release. I honestly believe that these kinds of software updates and frequent driver improvements timed to major game releases are one of the biggest advantages that GeForce gamers have over Radeon users, though I hold out hope that the red team will get on the same cadence with one Raja Koduri in charge.

You can also find more information from NVIDIA about configuration with its own GPUs for Fallout 4 and for Star Wars Battlefront on GeForce.com.

Source: NVIDIA

NVIDIA Confirms Clock Speed, Power Increases at High Refresh Rates, Promises Fix

Subject: Graphics Cards | November 6, 2015 - 04:05 PM |
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz

Last month I wrote a story detailing some odd behavior with NVIDIA's GeForce GTX graphics cards and high refresh rate monitors, in particular the new ASUS ROG Swift PG279Q and its rated 165Hz refresh rate. We found that when running this monitor at a 144Hz or higher refresh rate, the idle clock speeds and power consumption of the graphics card increased dramatically.

The results are much more interesting than I expected! At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.


But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.

When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.
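If you want to watch the same behavior on your own card, NVIDIA's NVML library (installed with the driver and used by many monitoring tools) exposes the clock and power readings involved. A minimal sketch follows - this is not the tooling we used for the story (that was GPU-Z and a power meter at the wall), and the file name is arbitrary.

```cuda
// watch_clocks.cu - print the current GPU core/memory clocks and board
// power via NVML. Build: nvcc watch_clocks.cu -lnvidia-ml
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);            // first GPU in the system

    unsigned int coreMHz = 0, memMHz = 0, mW = 0;
    nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &coreMHz);
    nvmlDeviceGetClockInfo(dev, NVML_CLOCK_MEM, &memMHz);
    nvmlDeviceGetPowerUsage(dev, &mW);              // board power, milliwatts

    // An idle GTX 980 Ti on a 60-120Hz desktop should sit near 135MHz;
    // the bug described above pins it at ~885MHz once you hit 144Hz.
    printf("core: %u MHz  mem: %u MHz  power: %.1f W\n",
           coreMHz, memMHz, mW / 1000.0);
    nvmlShutdown();
    return 0;
}
```

Note that board power from NVML covers the card alone; our wall numbers above include the whole system.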

We put NVIDIA on notice with the story and followed up with emails that included more information from other users, as well as additional testing completed after the story was posted. The result: NVIDIA has confirmed the issue exists and has a fix incoming!

In an email we got from NVIDIA PR last night: 

We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors.
Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates.
As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays.
We’ll have this fixed in an upcoming driver.

This actually lines up with an oddity we found before: the PG279Q at 144Hz refresh was pushing GPU clocks up pretty high, while a monitor without G-Sync support running at 144Hz did not. We'll see if this addresses the entire gamut of experiences users have had (and have emailed me about) with high refresh rate displays and power consumption, but at the very least NVIDIA is aware of the problems and working to fix them.

I don't have confirmation of WHEN I'll be able to test out that updated driver, but hopefully it will be soon, so we can confirm the fix works with the displays we have in-house. NVIDIA also hasn't confirmed what the root cause of the problem is - was it related to the clock domains as we had theorized? Maybe not, since this was a G-Sync specific display issue (based on the quote above). I'll try to weasel out the technical reasoning for the bug if we can and update the story later!

NVIDIA Promoting Game Ready Drivers with Giveaway

Subject: Graphics Cards | November 4, 2015 - 09:01 AM |
Tagged: nvidia, graphics drivers, geforce, game ready

In mid-October, NVIDIA announced that Game Ready drivers would only be available through GeForce Experience with a registered email address, which we covered at the time. Users are able to opt out of NVIDIA's mailing list, though. NVIDIA said that this would provide early access to new features, chances to win free hardware, and the ability to participate in the driver development process.


Today's announcement follows up on the “win free hardware” part. The company will be giving away $100,000 worth of prizes, including graphics cards up to the GeForce GTX 980 Ti, game keys, and SHIELD Android TV boxes. To be eligible, users need to register with GeForce Experience and use it to download the latest Game Ready driver.

Speaking of Game Ready drivers, the main purpose of this blog post is to share the list of November/December games that are in this program. NVIDIA pledges to have optimized drivers for these titles on or before their release date:

  • Assassin's Creed: Syndicate
  • Call of Duty Black Ops III
  • Civilization Online
  • Fallout 4
  • Just Cause 3
  • Monster Hunter Online
  • Overwatch
  • RollerCoaster Tycoon World
  • StarCraft II: Legacy of the Void
  • Star Wars Battlefront
  • Tom Clancy's Rainbow Six Siege
  • War Thunder

As has been the case recently, NVIDIA also plans to get every Game Ready driver certified through Microsoft's WHQL driver certification program.

Source: NVIDIA

Podcast #373 - Samsung 950 Pro, ASUS ROG Swift PG279Q, Steam Link and more!

Subject: General Tech | October 29, 2015 - 03:22 PM |
Tagged: podcast, video, Samsung, 950 PRO, NVMe, asus, ROG Swift, pg279q, g-sync, nvidia, amd, steam, steam link, valve

PC Perspective Podcast #373 - 10/29/2015

Join us this week as we discuss the Samsung 950 Pro, ASUS ROG Swift PG279Q, Steam Link and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano, and Sebastian Peak

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!