Dell and HP ain't down with the GPP
Subject: General Tech | April 12, 2018 - 04:29 PM | Jeremy Hellstrom
Tagged: transparency, nvidia, gpp, dirty pool
Curious about the state of NVIDIA's new GeForce Partner Program? There is definite news: neither Dell nor HP appears to have joined up, though neither company will confirm or deny it. The evidence comes from the availability of AMD GPUs in both of their current gaming lineups. The new HP Omens do not offer AMD, but this is theorized to be a supply issue, or it could simply come down to the better performance of NVIDIA's current mobile parts. Lenovo also still offers AMD in its Legion gaming systems, so for now it seems they are not interested either.
This is very good news for consumers: if these three big suppliers are not signing on, the supposed benefits of joining the GPP are simply not that attractive to them, in much the same way that the 'transparency' offered by this program does not appeal to enthusiasts.
"Since we found out about NVIDIA's GeForce Partner Program, aka GPP, a couple of months ago, we have seen changes implemented by NVIDIA's partners, but what has not happened is far more important to point out at this time."
Here is some more Tech News from around the web:
- Building the Perfect Home Router @ Hack a Day
- Gmail is on the cusp of its biggest revamp in years @ The Inquirer
- Skype for Business has nasty habit of closing down… for business @ The Register
- PC shipments slump for the 14th consecutive quarter, surprising no one @ The Inquirer
- Imagine you're having a CT scan and malware alters the radiation levels – it's doable @ The Register
Dell Adds New G Series Of Budget Gaming Laptops To Its Portfolio
Subject: General Tech | April 5, 2018 - 01:25 PM | Tim Verry
Tagged: dell, UHD, gaming laptop, coffee lake h, nvidia, max-q, gtx 1060
Dell recently announced updates to its budget laptop lineup, introducing a new G series that replaces the previous-generation Inspiron Gaming branded products. The new brand comes in three tiers and two form factors: the 15.6" G3, G5, and G7 and the 17" G3, all of which use various levels of 8th Generation Intel Core CPUs and mid-range 1000-series NVIDIA mobile GPUs. There is a lot of overlap in hardware, build, and pricing depending on the specific configuration you choose.
These budget gaming laptops are fairly thin – ranging from 22.7mm on the G3 15 to 25mm on the G3 17, G5 15, and G7 15 – but they do make compromises in build quality, with most of the body being plastic rather than metal (the higher-end components and prices remain reserved for the Alienware models). On the bright side, Dell appears to be taking cooling seriously, making liberal use of vents both front and rear along with dual fans. The G series also all feature dual drive bays, backlit spill-resistant keyboards, dual Waves MaxxAudio Pro speakers, webcams, fingerprint sensors, and matte exterior finishes.
The G3 series features up to an 8th Generation Core i7 processor and either a GTX 1050, GTX 1050 Ti, or GTX 1060 Max-Q graphics card along with a full HD (1920x1080) anti-glare display. The G3 15 comes in black, recon blue, or alpine white while the G3 17 comes in either black or recon blue. Three USB 3.1 ports, USB-C / Thunderbolt 3, an SD card reader, HDMI, Ethernet, and an audio jack account for the external I/O lining the edges of the notebook. Note that the G3 15 has a normal hinge while the higher-end models have a smaller centered hinge that leaves gaps on either side, presumably for larger vents.
Stepping things up a bit to the G5 tier, the G5 15 comes in Licorice Black or Beijing Red and features a quad- or hexa-core Coffee Lake H processor, up to a GTX 1060 Max-Q 6GB, and two drive bays for up to two SSDs, much like the G3, but adds Killer Networking 1x1 (with up to 2x2 Wi-Fi supported) and the option of a 4K UHD IPS panel.
Moving from the G5 15 to the G7 15 in "but wait, there's more" infomercial style, you gain the ability to configure the Licorice Black or Alpine White laptop with a Core i9 Coffee Lake H processor, 32GB of RAM, a GTX 1060 Max-Q, and dual SSDs in addition to the 4K display and Killer networking options of the G5 15. The G7 15 also gets a larger 56 WHr 4-cell battery.
Limited configurations of the G3 15, G3 17, G5 15, and G7 15 are set to be available later this month (two options for the G7 15 are available now on Dell's website), with additional configuration options to follow. The G3 series starts at $749, the G5 at $799, and the G7 at $849 (though that base model is not yet up on Dell's site). As you can see with the G7 on Dell's site, adding SSDs and RAM brings the pricing up quite a bit (the $1,099 model has an i7 8750H, GTX 1060, 8GB of RAM, and a 256GB SSD, for example).
It is refreshing to see Dell move away from the Inspiron brand for gaming, and I hope the fresh brand also brings fresh build quality, although you can't ask for too much at these prices with this hardware inside, at least for the base models (I am mostly concerned about the small hinge on the higher-end models). We will have to wait for reviews to know for sure, though. CNET and The Verge both have galleries of hands-on photos if you are curious what these machines look like.
Samsung Launches Coffee Lake Powered Notebook Odyssey Z Gaming Laptop
Subject: General Tech, Systems | April 4, 2018 - 11:03 PM | Tim Verry
Tagged: Samsung Odyssey, Samsung, nvidia, max-p, Intel, coffee lake h
During Intel's launch event for its new Coffee Lake H processors in Beijing, China, notebook manufacturers took the wraps off their latest thin-and-light offerings. The latest announcement is from Samsung, which launched its Notebook Odyssey Z gaming notebook. Measuring 375.6 x 255 x 17.9mm and weighing 2.4 kg (5.29 pounds), it may not be particularly thin or light by most standards, but it is a unique design that brings a lot of mobile horsepower to bear for gaming tasks.
The Notebook Odyssey Z comes in Titan Silver with red accents and a red backlit keyboard. The top cover of the notebook has a silver and white repeating gradient design, and the bottom is covered almost entirely in mesh, with the top half venting to the inside of the computer. Inside, the top half holds the 15.6" 1920x1080 display and a 720p webcam, while the bottom half hosts two 1.5W speakers with angled grills and a red logo up top; the keyboard is moved up to the front of the notebook and the trackpad to the right side of the keyboard. The keyboard uses Crater keycaps, and there are shortcut keys to record gameplay and change power modes (e.g. Silent Mode clocks things down and changes the power envelope such that the notebook gets down to a quiet 22 decibels).
Around the edges, external I/O includes Gigabit Ethernet, two USB 3.0 ports, one USB Type-C, one USB 2.0, one HDMI, one audio jack, and a DC-in.
Internally, the Odyssey Z is powered by Intel's new 6-core Core i7 "Coffee Lake H" processor (Samsung doesn't mention which model, but the 45W i7 8750H is a likely option) and an NVIDIA GTX 1060 graphics card. Other hardware includes up to 16 GB of DDR4 2400 MHz memory and 1 TB of NVMe storage. The system is cooled by Samsung's Z AeroFlow cooler, which includes vapor chamber heatsinks for the processors and two blower fans. There is a 54 WHr battery, and it comes with a 180W AC power adapter.
Samsung's Notebook Odyssey Z will be available in certain countries, including Korea and China, this month, with US availability following in Q3 2018. No word yet on pricing, however.
Making the most of murder and mayhem in Montana
Subject: General Tech | April 4, 2018 - 02:28 PM | Jeremy Hellstrom
Tagged: gaming, far cry 5, amd, nvidia
Far Cry 5 has received mixed reviews, with some loving the new addition to the series while others are less than amused by the AI's behaviour and a story line which frequently interrupts your wandering and side quests. Regardless of how you feel about the content, it is a pretty game, and finding the right settings for your particular GPU can only enhance your gameplay. [H]ard|OCP took some time to run through the game with an RX Vega 64 as well as a GTX 1080 and 1080 Ti at 1440p and 4K resolutions. They also compared the impact SMAA and TAA have on the performance of the NVIDIA cards at 1440p. Check out the full review, especially the strong showing by the AMD card.
"Far Cry 5 is finally in retail, and in this preview of video card performance we will compare performance of a few video cards, using real gameplay. This will give you an idea what you need to play Far Cry 5, what image quality settings are playable, and what resolutions work best with each video card. Our full evaluation will come later."
Here is some more Tech News from around the web:
- Valve says it’s “still working hard” on Linux gaming @ Ars Technica
- Far Cry 5’s interrupting story ruins everything @ Rock, Paper, SHOTGUN
- Humble Indie Bundle 19
- Sunless Skies visits some strange place named London @ Rock, Paper, SHOTGUN
- Kalypso Week @ Humble Bundle
Eight-GPU SLI in Unreal Engine 4 (Yes There Is a Catch)
Subject: Graphics Cards | March 29, 2018 - 09:52 PM | Scott Michaud
Tagged: nvidia, GTC, gp102, quadro p6000
At GTC 2018, Walt Disney Imagineering unveiled a work-in-progress clip of their upcoming Star Wars: Galaxy’s Edge attraction, which is expected to launch next year at Disneyland and Walt Disney World Resort. The cool part about this ride is that it will be using Unreal Engine 4 with eight GP102-based Quadro P6000 graphics cards. NVIDIA also reports that Disney has donated the code back to Epic Games to help them with their multi-GPU scaling in general – a win for us consumers… if in a more limited fashion.
See? SLI doesn’t need to be limited to two cards if you have a market cap of $100 billion USD.
Another interesting angle to this story is how typical PC components are contributing to these large experiences. Sure, Quadro hardware isn’t exactly cheap, but it can be purchased through typical retail channels and it allows the company to focus their engineering time elsewhere.
Ironically, this also comes about two decades after location-based entertainment started to decline… but, you know, it’s Disneyland and Disney World. They’re fine.
GTC 2018: NVIDIA and ARM Integrating NVDLA Into Project Trillium For Inferencing at the Edge
Subject: General Tech | March 29, 2018 - 03:10 PM | Tim Verry
Tagged: project trillium, nvidia, machine learning, iot, GTC 2018, GTC, deep learning, arm, ai
During GTC 2018, NVIDIA and ARM announced a partnership that will see ARM integrate NVIDIA's NVDLA deep learning inferencing accelerator into the company's Project Trillium machine learning processors. The NVIDIA Deep Learning Accelerator (NVDLA) is an open source modular architecture that is specifically optimized for inferencing operations such as object and voice recognition. Bringing that acceleration to the wider ARM ecosystem through Project Trillium will enable a massive number of smarter phones, tablets, Internet-of-Things, and embedded devices to do inferencing at the edge, which is to say without the complexity and latency of having to rely on cloud processing. This means potentially smarter voice assistants (e.g. Alexa, Google), doorbell cameras, lighting, and security around the home, and, out and about on your phone, better AR, natural translation, and assistive technologies.
Karl Freund, lead analyst for deep learning at Moor Insights & Strategy, was quoted in the press release as stating:
“This is a win/win for IoT, mobile and embedded chip companies looking to design accelerated AI inferencing solutions. NVIDIA is the clear leader in ML training and Arm is the leader in IoT end points, so it makes a lot of sense for them to partner on IP.”
ARM's Project Trillium was announced back in February and is a suite of processor IP optimized for parallel, low latency workloads; it includes a Machine Learning processor, an Object Detection processor, and neural network software libraries. NVDLA is a hardware and software platform, based upon the Xavier SoC, built from highly modular and configurable hardware that can feature a convolution core, single data processor, planar data processor, channel data processor, and data reshape engines. An NVDLA instance can be configured with all or only some of those elements, and designers can independently scale them up or down depending on what processing acceleration they need for their devices. NVDLA connects to the main system processor over a control interface and through two AXI memory interfaces (one optional) that connect to system memory and (optionally) dedicated high bandwidth memory (not necessarily HBM, but, for example, its own SRAM).
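To make that modularity concrete, here is a purely illustrative Python sketch – the field names are hypothetical and this is not NVDLA's actual spec format – of how a designer might describe which functional units a given NVDLA instance includes and how large each one is:

```python
# Hypothetical sketch of an NVDLA-style hardware configuration.
# Field names are illustrative only; the real project defines its own spec format.
from dataclasses import dataclass

@dataclass
class NVDLAConfig:
    convolution_macs: int         # size of the convolution core (number of MAC units)
    single_data_processor: bool   # per-element ops such as activation functions
    planar_data_processor: bool   # planar ops such as pooling
    channel_data_processor: bool  # cross-channel ops such as normalization
    data_reshape_engine: bool     # tensor reshape/transpose
    secondary_axi_port: bool      # optional second AXI interface to dedicated SRAM

# A beefy configuration for a phone SoC versus a trimmed-down IoT variant:
phone_dla = NVDLAConfig(2048, True, True, True, True, secondary_axi_port=True)
iot_dla = NVDLAConfig(64, True, False, False, False, secondary_axi_port=False)
```

The point is simply that each block can be included, omitted, or sized independently, which is how a single architecture can span everything from phones down to tiny embedded sensors.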
NVDLA is presented as a free and open source architecture that promotes a standard way to design deep learning inferencing hardware that can accelerate operations to infer results from trained neural networks (with the training done on other devices, perhaps by the DGX-2). The project, which hosts its code on GitHub and encourages community contributions, goes beyond the Xavier-based hardware and includes drivers, libraries, upcoming TensorRT support for Google's TensorFlow acceleration, testing suites and SDKs, a deep learning training infrastructure (for the training side of things) that is compatible with the NVDLA software and hardware, and system integration support.
Bringing the "smarts" of smart devices to the local hardware and closer to the users should mean much better performance, and using specialized accelerators will reportedly offer the performance levels needed without blowing away low power budgets. Internet-of-Things (IoT) and mobile devices are not going away any time soon, and the partnership between NVIDIA and ARM should make it easier for developers and chip companies to offer smarter (and, please tell me, more secure!) smart devices.
Also read:
- NVDLA Primer
- Project Trillium: Machine Learning on ARM
- NVIDIA Announces DGX-2 with 16 GV100s & 8 100Gb NICs
- GTC 2018: NVIDIA Announces Volta-Powered Quadro GV100
- NVIDIA Teases Low Power, High Performance Xavier SoC That Will Power Future Autonomous Vehicles
- NVIDIA Launches Jetson TX2 With Pascal GPU For Embedded Devices
- ARM Announces Project Trillium, a New Dedicated AI Processing Family
Podcast #493 - New XPS 13, Noctua NH-L9a, News from NVIDIA GTC and more!
Subject: General Tech | March 29, 2018 - 02:37 PM | Ken Addison
Tagged: podcast, nvidia, GTC 2018, Volta, quadro gv100, dgx-2, noctua, NH-L9a-AM4
PC Perspective Podcast #493 - 03/29/18
Join us this week for our review of the new XPS 13, Noctua NH-L9a, news from NVIDIA GTC and more!
You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Allyn Malventano, Jeremy Hellstrom, Josh Walrath
Peanut Gallery: Ken Addison
Program length: 0:59:35
Podcast topics of discussion:
- Merch! http://bit.ly/pcpermerch
- Week in Review
- News items of interest
- Picks of the Week:
- Allyn: retro game music remixed - ocremix.org (torrents)
- Jeremy: I can’t tell if I like this or not, the DeepCool QuadStellar
- Ken: K40 Laser Engraver - Wiki
How to see Hope County, Montana in all its glory
Subject: General Tech | March 28, 2018 - 02:08 PM | Jeremy Hellstrom
Tagged: gaming, amd, nvidia, far cry 5
Looking to get Far Cry 5 running with the highest settings your GPU can handle? The Guru of 3D have done a lot of the heavy lifting for you, testing the performance of thirteen cards each from NVIDIA and AMD at 1080p, 1440p, and 4K resolutions. This game needs some juice; even the mighty Titans cannot reach 60fps with Ultra settings at 4K. In the review they also take a look at the effect the number of cores and the frequency of your CPU have on performance – not much, but enough that you might notice. Check the full details here.
"Tomorrow Far Cry 5 will become available to the masses, we put it through our testing paces with almost 30 graphics cards, CPU performance and frame times. The looks great, and will offer great game play. Join us in this PC performance analysis."
Here is some more Tech News from around the web:
- Far Cry 5 Benchmark Performance Analysis @ TechPowerUp
- Far Cry 5 Benchmarked: 50 GPUs Tested @ TechSpot
- Far Cry 5 PC Performance Analysis @ Kitguru
- Far Cry 5 review in progress @ Rock, Paper, SHOTGUN
- Humble Indie Bundle 19
- Into the Breach’s interface was a nightmare to make and the key to its greatness @ Rock, Paper, SHOTGUN
- The Best PC Games (You Should Be Playing) @ TechSpot
- ARK Park VR Game & Performance @ BabelTechReviews
- Cyberpunk 2077 on track for release ahead of 2021 @ HEXUS
- Neverwinter Nights: Enhanced Edition has re-launched @ Rock, Paper, SHOTGUN
NVIDIA Announces DGX-2 with 16 GV100s & 8 100Gb NICs
Subject: Systems | March 27, 2018 - 08:04 PM | Scott Michaud
Tagged: Volta, nvidia, dgx-2, DGX
So… this is probably not for your home.
NVIDIA has just announced their latest pre-built system for enterprise customers: the DGX-2. In it, sixteen Volta-based Tesla V100 graphics devices are connected using NVSwitch. This allows groups of graphics cards to communicate to and from every other group at 300GB/s, which, to give a sense of scale, is about as much bandwidth as the GTX 1080 has available to communicate with its own VRAM. NVSwitch treats all 512GB as a unified memory space, too, which means that developers don't need redundant copies across multiple boards just so data can be seen by the target GPU.
Note: 512GB is 16 x 32GB. This is not a typo. 32GB Tesla V100s are now available.
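For reference, the GTX 1080 comparison above works out as follows – a quick sketch assuming the card's usual 10 Gbps GDDR5X signaling on a 256-bit bus:

```python
# GTX 1080 VRAM bandwidth for comparison: 10 Gbps GDDR5X on a 256-bit bus.
GDDR5X_GBPS_PER_PIN = 10
BUS_WIDTH_BITS = 256

bandwidth_gb_per_s = GDDR5X_GBPS_PER_PIN * BUS_WIDTH_BITS / 8  # bits -> bytes
print(f"GTX 1080 memory bandwidth: {bandwidth_gb_per_s:.0f} GB/s")  # 320 GB/s
```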
For a little recap, Tesla V100 cards run a Volta-based GV100 GPU, which has 5120 CUDA cores delivering ~15 TeraFLOPs of 32-bit performance. These cores also scale exactly to FP64 and FP16, as has been the case since Pascal's high-end offering, leading to ~7.5 TeraFLOPs of 64-bit or ~30 TeraFLOPs of 16-bit computational throughput per card. Multiply that by sixteen and you get 480 TeraFLOPs of FP16, 240 TeraFLOPs of FP32, or 120 TeraFLOPs of FP64 performance for the whole system. If you count the tensor units, then we're just under 2 PetaFLOPs of tensor instructions. This is powered by a pair of Xeon Platinum CPUs (Skylake) and backed by 1.5TB of system RAM – which is only 3x the amount of RAM that the GPUs have, if you stop and think about it.
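If you want to sanity-check that arithmetic, here is a minimal sketch using the rounded ~15 TFLOPs per-card figure from above (not an official spec sheet number):

```python
# Back-of-the-envelope FLOPs math for the DGX-2 (16x Tesla V100).
FP32_TFLOPS_PER_CARD = 15.0  # the ~15 TFLOPs approximation used above
NUM_GPUS = 16

fp32_total = FP32_TFLOPS_PER_CARD * NUM_GPUS  # 240 TFLOPs
fp64_total = fp32_total / 2                   # FP64 at half rate: 120 TFLOPs
fp16_total = fp32_total * 2                   # FP16 at double rate: 480 TFLOPs
print(f"FP16/FP32/FP64: {fp16_total:.0f}/{fp32_total:.0f}/{fp64_total:.0f} TFLOPs")

# The unified memory pool NVSwitch exposes: sixteen 32GB cards.
print(f"Total GPU memory: {NUM_GPUS * 32} GB")  # 512 GB
```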
The device communicates with the outside world through eight EDR InfiniBand NICs. NVIDIA claims that this yields 1600 gigabits of bi-directional bandwidth. Given how much data this device is crunching, it makes sense to keep data flowing in and out as fast as possible, especially for real-time applications. While the Xeons are fast and have many cores, I’m curious to see how much overhead the networking adds to the system when under full load, minus any actual processing.
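That 1600-gigabit claim checks out if you assume each EDR InfiniBand link carries 100Gb/s in each direction – a quick sketch:

```python
# Aggregate NIC bandwidth: eight EDR InfiniBand links at 100 Gb/s per direction.
NUM_NICS = 8
EDR_GBPS_PER_DIRECTION = 100
DIRECTIONS = 2  # bi-directional

total_gbps = NUM_NICS * EDR_GBPS_PER_DIRECTION * DIRECTIONS
print(f"Aggregate bi-directional bandwidth: {total_gbps} Gb/s")  # 1600 Gb/s
```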
NVIDIA’s DGX-2 is expected to ship in Q3.
GTC 2018: NVIDIA Announces Volta-Powered Quadro GV100
Subject: General Tech | March 27, 2018 - 03:30 PM | Ken Addison
Tagged: nvidia, GTC, quadro, gv100, GP100, tesla, titan v, v100, volta
One of the big missing markets for NVIDIA with their slow rollout of the Volta architecture was professional workstations. Today, NVIDIA announced they are bringing Volta to the Quadro family with the Quadro GV100 card.
Powered by the same GV100 GPU that was announced at last year's GTC in the Tesla V100, and late last year in the TITAN V, the Quadro GV100 represents a leap forward in computing power for workstation-level applications. While these users could currently be using the TITAN V for similar workloads, as we've seen in the past, Quadro drivers generally provide big performance advantages in these sorts of applications. Still, we'd love to see NVIDIA repeat their move of bringing these optimizations to the TITAN lineup, as they did with the TITAN Xp.
As it is a Quadro, we expect this to be NVIDIA's first Volta-powered product to provide certified, professional driver code paths for applications such as CATIA, Solid Edge, and more.
NVIDIA also heavily promoted the idea of using two of these GV100 cards in one system, connected via NVLink. Considering the lack of NVLink support on the TITAN V, this is also the first time we've seen a Volta card with display outputs supporting NVLink in more standard workstations.
More importantly, this announcement brings NVIDIA's RTX technology to the professional graphics market.
With popular rendering applications like V-Ray already announcing and integrating support for NVIDIA's OptiX ray tracing denoiser in their beta branch, it seems only a matter of time before we see a broad suite of professional applications supporting RTX technology in real time – for example, raytraced renders of items being designed in CAD and modeling applications.
This sort of speed represents a potentially massive win for professional users, who won't have to waste time waiting for preview renderings to complete before continuing to iterate on their projects.
The NVIDIA Quadro GV100 is available now directly from NVIDIA for $8,999, which puts it squarely in the same price range as the previous highest-end Quadro, the GP100.