Podcast #497 - Ryzen X470 NVMe performance, Samsung 970 performance, and more!

Subject: General Tech | April 26, 2018 - 11:35 AM |
Tagged: Samsung, ryzen, rtx, philips, nvidia, logitech, K95, Intel, Hydro PTM, fsp, craft, corsair, Cannon Lake-U, battletech, amd, 970 PRO, 970 EVO, 8086K

PC Perspective Podcast #497 - 04/26/18

Join us this week for discussion on Ryzen X470 NVMe performance, Samsung 970 performance, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Allyn Malventano, Jeremy Hellstrom, Josh Walrath, Ken Addison

Peanut Gallery: Alex Lustenberg

Program length: 1:39:00

Podcast topics of discussion:

  1. Week in Review:
  2. Thanks to Away for supporting PC Perspective. Go to https://www.awaytravel.com/pcper and use the promo code pcper to get $20 off a suitcase!
  3. News items of interest:
  4. Picks of the Week:
    1. 1:27:00 Allyn: Cool USB external enclosure
    2. 1:30:30 Josh: I love this case.
    3. 1:32:15 Ken: CloudHQ
  5. Closing/outro
 
Source:

NVIDIA Releases GeForce 397.31. RTX for Developers.

Subject: Graphics Cards | April 25, 2018 - 08:27 PM |
Tagged: nvidia, graphics drivers, rtx, Volta

It’s quite the jump in version number from 391.35 to 397.31, but NVIDIA has just released a new graphics driver. Interestingly, it is “Game Ready” for Battletech, which I have been looking forward to, though I was always under the impression that no one else was. Apparently not.

nvidia-geforce.png

As for its new features? The highlight is a developer preview of NVIDIA RTX Technology. This requires a Volta GPU, which currently means Titan V unless your team was seeded something that doesn’t necessarily exist, as well as 396.xx+ drivers, the new Windows 10 update, and the Microsoft DXR developer package. Speaking of which, I’m wondering how much of the version number bump can be attributed to RTX being on the 396.xx branch. Even then, it still feels like a branch or two never left NVIDIA’s dev team. Hmm.

Moving on, the driver also conforms to the Vulkan 1.1 test suite (version 1.1.0.3). If you remember back in early March, the Khronos Group released the new standard, which integrated a bunch of features into core and brought Subgroup Operations into the mix. This could allow future shaders to run faster by being compiled with the new intrinsic functions.

Also – the standalone installer will apparently clean up after itself better than it used to. I can often find a few gigabytes of old NVIDIA folders when I’m looking for space to reclaim, so it’s good to see NVIDIA finally address at least some of that.

Pick up the new drivers on NVIDIA’s website or through GeForce Experience.

Source: NVIDIA

Dell and HP ain't down with the GPP

Subject: General Tech | April 12, 2018 - 04:29 PM |
Tagged: transparency, nvidia, gpp, dirty pool

Curious about the state of NVIDIA's new GeForce Partner Program?  There is definitely news, as both Dell and HP do not seem to have joined up, though neither will confirm nor deny it.  The evidence comes from the availability of AMD GPUs in both of their current gaming lineups.  The new HP Omens do not offer AMD, but this is theorized to be a supply issue, or it could simply be down to the better performance offered by NVIDIA's current mobile parts.  Lenovo likewise still offers AMD in their Legion gaming systems, so for now it seems they are not interested either.

This is very good news for the consumer. If these three big suppliers are not signing on, the supposed benefits of joining the GPP are obviously not that attractive to them, in much the same way that the 'transparency' offered by this program does not appeal to enthusiasts.

simon+thumbs-down.jpg

"Since we found out about NVIDIA's GeForce Partner Program, aka GPP, a couple of months ago, we have seen changes implemented by NVIDIA's partners, but what has not happened is far more important to point out at this time."

Here is some more Tech News from around the web:

Tech Talk

Source: [H]ard|OCP

Dell Adds New G Series Of Budget Gaming Laptops To Its Portfolio

Subject: General Tech | April 5, 2018 - 01:25 PM |
Tagged: dell, UHD, gaming laptop, coffee lake h, nvidia, max-q, gtx 1060

Dell recently announced updates to its budget laptop lineup, introducing a new G series that replaces the previous generation of Inspiron Gaming branded products. The new brand comes in three tiers and two form factors: the 15.6" G3, G5, and G7 and the 17" G3, all of which utilize various levels of 8th Generation Intel Core CPUs and mid-range 1000 series NVIDIA mobile GPUs. There is a lot of overlap in hardware, build, and pricing depending on the specific configuration you build.

These budget gaming laptops are fairly thin – ranging from 22.7mm on the G3 15 to 25mm on the G3 17, G5 15, and G7 15 – but they do make compromises in the build quality department, with most of the body being plastic rather than metal (the higher-end components and prices remain reserved for the Alienware models). On the bright side, Dell appears to be taking cooling seriously, making liberal use of vents both front and rear along with dual fans. The G series also all feature dual drive bays, backlit spill-resistant keyboards, dual Waves MaxxAudio Pro speakers, webcams, fingerprint sensors, and matte exterior finishes.

G3-Family.png

The G3 series features up to an 8th Generation Core i7 processor and either a GTX 1050, GTX 1050 Ti, or GTX 1060 Max-Q graphics card along with a full HD (1920x1080) anti-glare display. The G3 15 comes in black, recon blue, or alpine white while the G3 17 comes in either black or recon blue. Three USB 3.1 ports, USB-C / Thunderbolt 3, SD, HDMI, Ethernet, and one audio jack account for the external I/O that lines the edges of the notebook. Note that the G3 15 has a normal hinge while the higher-end models have a smaller centered hinge that leaves gaps on either side, presumably for larger vents.

G5 15_image 1.png

Stepping things up a bit, the G5 15 comes in Licorice Black or Beijing Red and features a quad- or hexa-core Coffee Lake H processor and up to a GTX 1060 Max-Q 6GB. Much like the G3, it has two drive bays for up to two SSDs, but it adds 1x1 Killer Networking (up to 2x2 Wi-Fi supported) and the option of a 4K UHD IPS panel.

Moving from the G5 15 to the G7 15, in true "but wait, there's more" infomercial style, offers the ability to configure the Licorice Black or Alpine White laptop with a Core i9 Coffee Lake H processor, 32GB of RAM, a GTX 1060 Max-Q, and dual SSDs in addition to the 4K display and Killer networking options of the G5 15. The G7 15 also gets a larger 56 WHr 4-cell battery.

G7 15_image 1.png

Limited configurations of the G3 15, G3 17, G5 15, and G7 15 are set to be available later this month (with two options for the G7 15 available now on Dell's website), with additional configuration options to follow. The G3 series starts at $749, the G5 at $799, and the G7 at $849 (though that base model is not yet up on Dell's site). As you can see with the G7 on Dell's site, adding SSDs and RAM brings the pricing up quite a bit (the $1099 model has an i7 8750H, GTX 1060, 8GB of RAM, and a 256 GB SSD, for example).

It is refreshing to see Dell move away from the Inspiron brand for gaming, and I hope the fresh brand also brings fresh build quality, although you can't ask for too much at these prices with this hardware inside, at least for the base models (I am mostly concerned about the small hinge on the higher-end models). We will have to wait for reviews to know for sure, though. Cnet and The Verge have galleries of hands-on photos if you are curious what these machines look like.

Source: Dell

Samsung Launches Coffee Lake Powered Notebook Odyssey Z Gaming Laptop

Subject: General Tech, Systems | April 4, 2018 - 11:03 PM |
Tagged: Samsung Odyssey, Samsung, nvidia, max-p, Intel, coffee lake h

During Intel's launch event for its new Coffee Lake H processors in Beijing, China, notebook manufacturers took the wraps off of their latest thin and light offerings. The latest announcement comes from Samsung, which launched its Notebook Odyssey Z gaming notebook. Measuring 375.6 x 255 x 17.9 mm and weighing 2.4 kg (5.29 pounds), it may not be particularly thin or light by most standards, but it is a unique design that brings a lot of mobile horsepower to bear for gaming tasks.

Samsung Notebook Odyssey Z.jpg

The Notebook Odyssey Z comes in Titan Silver with red accents and a red backlit keyboard. The top cover of the notebook has a silver and white repeating gradient design, and the bottom is covered almost entirely in mesh, with the top half venting to the inside of the computer. Inside, the top half holds the 15.6" 1920x1080 display and a 720p webcam, while the bottom half hosts two 1.5W speakers with angled grills and a red logo up top. The keyboard is moved up to the front of the notebook, with the trackpad relocated to the right side of the keyboard. The keyboard uses Crater keycaps, and there are shortcut keys to record gameplay and change power modes (e.g. Silent Mode clocks things down and changes the power envelope such that the notebook gets down to a quiet 22 decibels).

Around the edges there are Gigabit Ethernet, two USB 3.0, one USB Type-C, one USB 2.0, one HDMI, one audio jack, and one DC-in for external I/O.

Samsung Notebook Odyssey Z Underside.jpg

Internally, the Odyssey Z is powered by Intel's new 6-core Core i7 "Coffee Lake H" processor (Samsung doesn't mention which model, but the 45W i7 8750H is a likely option) and an NVIDIA GTX 1060 graphics card. Other hardware includes up to 16 GB of DDR4 2400 MHz memory and 1 TB of NVMe storage. The system is cooled by Samsung's Z AeroFlow cooler, which includes vapor chamber heatsinks for the processors and two blower fans. There is a 54 WHr battery, and it comes with a 180W AC power adapter.

Samsung's Notebook Odyssey Z will be available in certain countries including Korea and China this month with US availability in Q3 2018. No word yet on pricing, however.

Source: Samsung

Making the most of murder and mayhem in Montana

Subject: General Tech | April 4, 2018 - 02:28 PM |
Tagged: gaming, far cry 5, amd, nvidia

Far Cry 5 has received mixed reviews, with some loving the new addition to the series while others are less than amused by the AI's behaviour and a story line that frequently interrupts your wandering and side quests.  Regardless of how you feel about the content, it is a pretty game, and finding the right settings for your particular GPU can only enhance your gameplay.  [H]ard|OCP took some time to run through the game with an RX Vega 64 as well as a GTX 1080 and 1080 Ti at 1440p and 4K resolutions.  They also compared the performance impact of SMAA versus TAA on the NVIDIA cards at 1440p.  Check out the full review, especially the strong showing by the AMD card.

1522467206ygpnsnqhoc_1_1.png

"Far Cry 5 is finally in retail, and in this preview of video card performance we will compare performance of a few video cards, using real gameplay. This will give you an idea what you need to play Far Cry 5, what image quality settings are playable, and what resolutions work best with each video card. Our full evaluation will come later."

Here is some more Tech News from around the web:

Tech Talk

Source: [H]ard|OCP

Eight-GPU SLI in Unreal Engine 4 (Yes There Is a Catch)

Subject: Graphics Cards | March 29, 2018 - 09:52 PM |
Tagged: nvidia, GTC, gp102, quadro p6000

At GTC 2018, Walt Disney Imagineering unveiled a work-in-progress clip of their upcoming Star Wars: Galaxy’s Edge attraction, which is expected to launch next year at Disneyland and Walt Disney World Resort. The cool part about this ride is that it will be using Unreal Engine 4 with eight GP102-based Quadro P6000 graphics cards. NVIDIA also reports that Disney has donated the code back to Epic Games to help them with their multi-GPU scaling in general – a win for us consumers… in a more limited fashion.

nvidia-2018-GTC-starwars-8-way-sli.jpg

See? SLI doesn’t need to be limited to two cards if you have a market cap of $100 billion USD.

Another interesting angle to this story is how typical PC components are contributing to these large experiences. Sure, Quadro hardware isn’t exactly cheap, but it can be purchased through typical retail channels and it allows the company to focus their engineering time elsewhere.

Ironically, this also comes about two decades after location-based entertainment started to decline… but, you know, it’s Disneyland and Disney World. They’re fine.

Source: NVIDIA

GTC 2018: NVIDIA and ARM Integrating NVDLA Into Project Trillium For Inferencing at the Edge

Subject: General Tech | March 29, 2018 - 03:10 PM |
Tagged: project trillium, nvidia, machine learning, iot, GTC 2018, GTC, deep learning, arm, ai

During GTC 2018, NVIDIA and ARM announced a partnership that will see ARM integrate NVIDIA's NVDLA deep learning inferencing accelerator into the company's Project Trillium machine learning processors. The NVIDIA Deep Learning Accelerator (NVDLA) is an open source modular architecture specifically optimized for inferencing operations such as object and voice recognition. Bringing that acceleration to the wider ARM ecosystem through Project Trillium will enable a massive number of smarter phones, tablets, Internet-of-Things, and embedded devices to do inferencing at the edge, which is to say without the complexity and latency of having to rely on cloud processing. This means potentially smarter voice assistants (e.g. Alexa, Google Assistant), doorbell cameras, lighting, and security around the home, and, out and about on your phone, better AR, natural translation, and assistive technologies.

NVIDIAandARM_NVDLA.jpg

Karl Freund, lead analyst for deep learning at Moor Insights & Strategy, was quoted in the press release as stating:

“This is a win/win for IoT, mobile and embedded chip companies looking to design accelerated AI inferencing solutions. NVIDIA is the clear leader in ML training and Arm is the leader in IoT end points, so it makes a lot of sense for them to partner on IP.”

ARM's Project Trillium, announced back in February, is a suite of processor IP optimized for parallel, low-latency workloads; it includes a Machine Learning processor, an Object Detection processor, and neural network software libraries. NVDLA is a hardware and software platform, based upon the accelerator in the Xavier SoC, that is highly modular and configurable: it can feature a convolution core, single data processor, planar data processor, channel data processor, and data reshape engines. The NVDLA can be configured with all or only some of those elements, and designers can independently scale them up or down depending on what processing acceleration they need for their devices. NVDLA connects to the main system processor over a control interface and through two AXI memory interfaces (one optional) that connect to system memory and (optionally) dedicated high bandwidth memory (not necessarily HBM, but its own SRAM, for example).
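To make the "pick the blocks you need, scale them independently" idea above concrete, here is a toy Python sketch. The module names mirror the blocks described in the paragraph (convolution core, single/planar/channel data processors, reshape engine, and the optional second AXI interface to private SRAM), but the class, field names, and defaults are illustrative assumptions, not NVDLA's actual configuration format.

```python
# Illustrative sketch only: models the configurability described above,
# not NVDLA's real spec-file format or parameter names.
from dataclasses import dataclass

@dataclass
class NVDLAConfig:
    """Toy model of selecting and scaling NVDLA's optional blocks."""
    conv_mac_units: int = 1024     # convolution core size (scale up or down)
    enable_sdp: bool = True        # single data processor (activations)
    enable_pdp: bool = True        # planar data processor (pooling)
    enable_cdp: bool = True        # channel data processor (normalization)
    enable_reshape: bool = True    # data reshape engine
    secondary_axi: bool = False    # optional second AXI port to private SRAM

    def interfaces(self):
        # One control interface plus one mandatory AXI memory interface;
        # the second AXI interface (to dedicated SRAM) is optional.
        ports = ["control", "axi_primary"]
        if self.secondary_axi:
            ports.append("axi_secondary")
        return ports

# A minimal, low-area build: keep a small convolution core, drop the rest.
small = NVDLAConfig(conv_mac_units=256, enable_pdp=False,
                    enable_cdp=False, enable_reshape=False)
print(small.interfaces())
```

The point of the sketch is simply that a licensee choosing a configuration is picking a subset of independent blocks plus sizing parameters, with the interface list growing only when the optional SRAM port is enabled.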

arm project trillium integrates NVDLA.jpg

NVDLA is presented as a free and open source architecture that promotes a standard way to design deep learning inferencing hardware that can accelerate the operations used to infer results from trained neural networks (with the training itself done on other devices, perhaps a DGX-2). The project, which hosts its code on GitHub and encourages community contributions, goes beyond the Xavier-based hardware and includes things like drivers, libraries, upcoming TensorRT support for Google's TensorFlow acceleration, testing suites and SDKs, a deep learning training infrastructure (for the training side of things) that is compatible with the NVDLA software and hardware, and system integration support.

Bringing the "smarts" of smart devices to the local hardware and closer to the users should mean much better performance and using specialized accelerators will reportedly offer the performance levels needed without blowing away low power budgets. Internet-of-Things (IoT) and mobile devices are not going away any time soon, and the partnership between NVIDIA and ARM should make it easier for developers and chip companies to offer smarter (and please tell me more secure!) smart devices.


Source: NVIDIA

Podcast #493 - New XPS 13, Noctua NH-L9a, News from NVIDIA GTC and more!

Subject: General Tech | March 29, 2018 - 02:37 PM |
Tagged: podcast, nvidia, GTC 2018, Volta, quadro gv100, dgx-2, noctua, NH-L9a-AM4

PC Perspective Podcast #493 - 03/29/18

Join us this week for our review of the new XPS 13, Noctua NH-L9a, news from NVIDIA GTC and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Allyn Malventano, Jeremy Hellstrom, Josh Walrath

Peanut Gallery: Ken Addison

Program length: 0:59:35

Podcast topics of discussion:

  1. Week in Review:
  2. News items of interest:
  3. Picks of the Week:
    1. Allyn: retro game music remixed - ocremix.org (torrents)

How to see Hope County, Montana in all its glory

Subject: General Tech | March 28, 2018 - 02:08 PM |
Tagged: gaming, amd, nvidia, far cry 5

Looking to get Far Cry 5 running with the highest settings your GPU can handle?  The Guru of 3D have done a lot of the heavy lifting for you, testing the performance of thirteen cards each from NVIDIA and AMD at 1080p, 1440p, and 4K resolutions.  This game needs some juice; even the mighty Titans cannot reach 60fps with Ultra settings at 4K.  In the review they also take a look at the effect the number of cores and the frequency of your CPU have on performance – not much, but enough that you might notice.  Check out the full details here.

bull_gold_1080p_1495792038.jpg

"Tomorrow Far Cry 5 will become available to the masses, we put it through our testing paces with almost 30 graphics cards, CPU performance and frame times. The game looks great, and will offer great game play. Join us in this PC performance analysis."

Here is some more Tech News from around the web:

Tech Talk


Source: Guru of 3D