Subject: Graphics Cards | August 14, 2018 - 01:08 AM | Jeremy Hellstrom
Tagged: Siggraph, ray tracing, quadro rtx 8000, quadro rtx 5000, nvidia, jensen
Any attempt to describe the visual effects Jensen Huang showed off at his SIGGRAPH keynote is bound to fail, not that this has ever stopped any of us before. If you have seen the short demo movie NVIDIA released earlier this year in cooperation with Epic and ILMxLAB, you have an idea of what they can do with ray tracing. However, they pulled a fast one on us: the footage was not pre-rendered, and the hardware behind it was kept hidden, making it our first look at their real-time ray tracing. The hardware required for this feat is the brand new RTX series, and the specs are impressive.
The ability to process 10 GigaRays per second means that each and every pixel can be influenced by numerous rays of light: perhaps 100 per pixel in a perfect scenario with clean inputs, or 5-20 in cases where the AI denoiser is required to calculate missing light sources or occlusions, all in real time. The card itself also functions well as a light source. Peak rates of 16 TFLOPS and 16 TIPS mean this card is happy doing floating point and integer calculations simultaneously.
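To put that ray budget in perspective, the rays available per pixel are simply the ray rate divided by the pixels drawn per second. A quick back-of-the-envelope sketch (the resolutions and refresh rates below are illustrative assumptions, not NVIDIA figures):

```python
# Ray-budget arithmetic: rays per pixel from a fixed rays-per-second rate.
RAY_RATE = 10e9  # rays per second, NVIDIA's quoted 10 GigaRays figure

def rays_per_pixel(width: int, height: int, fps: int) -> float:
    """Rays available per pixel per frame at a given resolution and frame rate."""
    return RAY_RATE / (width * height * fps)

for label, (w, h, fps) in {
    "1080p @ 60 Hz": (1920, 1080, 60),
    "4K @ 60 Hz": (3840, 2160, 60),
}.items():
    print(f"{label}: {rays_per_pixel(w, h, fps):.1f} rays/pixel")
```

At 1080p60 this works out to roughly 80 rays per pixel, dropping to about 20 at 4K60, which lines up with the quoted range where the AI denoiser has to fill in the gaps.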
The die itself is significantly larger than the previous generation at 754 mm², and will sport a 300W TDP to keep it in line with the PCIe spec; though if we get the chance, we will run it through the same tests as the RX 480 to see how well they did. 30W of that total is devoted to the onboard USB controller, which implies support for VirtualLink.
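For context on that 300W ceiling, PCIe board power is additive across the slot and the auxiliary connectors. A minimal sketch of the usual per-source limits (the connector mix shown is an assumption; NVIDIA has not published the board layout):

```python
# PCIe power budget: each source contributes a specified maximum wattage.
PCIE_SLOT_W = 75   # PCIe x16 slot limit per the PCIe CEM spec
AUX_8PIN_W = 150   # 8-pin PCIe auxiliary connector
AUX_6PIN_W = 75    # 6-pin PCIe auxiliary connector

def board_budget(connectors: list[int]) -> int:
    """Total board power available from the slot plus auxiliary connectors."""
    return PCIE_SLOT_W + sum(connectors)

# One plausible connector mix that lands exactly on the quoted 300W TDP:
print(board_budget([AUX_8PIN_W, AUX_6PIN_W]))  # 75 + 150 + 75 = 300
```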
The cards can be used in pairs, utilizing Jensen's chest decoration, more commonly known as an NVLink bridge. More than one pair can be run in a system, but you will not be able to connect three or more cards directly.
As that will give you up to 96GB of GDDR6 for your processing tasks, it is hard to consider that limiting. The price is rather impressive as well; compared to previous render farms, such as the rather tiny one below, you are looking at a tenth the cost to power your movie with RTX cards. Nor is the card limited to proprietary engines or programs, with the DirectX and Vulkan APIs supported in addition to Pixar's software. NVIDIA's Material Definition Language will be made open source, allowing even broader usage for those who so desire.
You will of course wonder what this means in terms of graphical eye candy, either pre-rendered quickly for your later enjoyment or else in real time if you have the hardware. The image below attempts to show the various features which RTX can easily handle. Mirrored surfaces can be emulated with multiple reflections accurately represented, again handled on the fly instead of being preset, so soon you will be able to see around corners.
It also introduces a new type of anti-aliasing called DLAA, and there are no prizes for guessing what the DL stands for. DLAA works by taking an already anti-aliased image and training itself to provide even better edge smoothing, though at a processing cost. As with most other features on these cards, it is not the complexity of the scene that has the biggest impact on calculation time but rather the number of pixels, as each pixel has numerous rays associated with it.
All of this adds up to significantly faster processing than Pascal: not the small evolutionary changes we have become accustomed to, but more of a revolutionary one.
In addition to effects in movies and other video, there is another possible use for Turing-based chips which might appeal to the gamer, if the architecture reaches the mainstream. With the ability to render existing sources with added ray tracing and de-noising features, it might be possible for an enterprising soul to take an old game and remaster it in a way never before possible. Perhaps one day people who try to replay the original System Shock or Deus Ex will make it past the first few hours before the graphical deficiencies overwhelm their senses.
We expect to see more from NVIDIA tomorrow so stay tuned.
Subject: General Tech | August 13, 2018 - 07:43 PM | Ken Addison
Tagged: turing, siggraph 2018, rtx, quadro rtx 8000, quadro rtx 6000, quadro rtx 5000, quadro, nvidia
Today at the professional graphics-focused SIGGRAPH conference, NVIDIA's Jen-Hsun Huang unveiled details of the company's much-rumored next GPU architecture, codenamed Turing.
At the core of the Turing architecture are what NVIDIA refers to as two "engines": one for accelerating ray tracing, and the other for accelerating AI inferencing.
The ray tracing units are called RT Cores, and are not to be confused with the NVIDIA RTX real-time ray tracing technology announced at GDC this year. There, NVIDIA was using its OptiX AI-powered denoising filter to clean up ray-traced images, allowing them to save on rendering resources, but the actual ray tracing was still being done on the GPU's existing cores.
Now, these RT Cores will perform the ray calculations themselves, at what NVIDIA claims is up to 10 GigaRays/second, or up to 25X the performance of the current Pascal architecture.
Just like we saw in the Volta-based Quadro GV100, these new Quadro RTX cards will also feature Tensor Cores for deep learning acceleration. It is unclear if these tensor cores remain unchanged from what we saw in Volta or not.
In addition to the RT Cores and Tensor Cores, Turing also features an all-new design for the traditional Streaming Multiprocessor (SM) GPU units. Changes include an integer execution unit that runs in parallel with the floating point datapath, and a new unified cache architecture with double the bandwidth of the previous generation.
NVIDIA claims these changes combined with the up to 4,608 available CUDA cores in the highest configuration will enable up to 16 TFLOPS and 16 trillion integer operations per second.
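Those two figures imply a clock speed. Assuming the conventional two floating-point operations per core per clock (one fused multiply-add), the quoted peak rate can be worked backwards into a rough clock estimate; this is a back-of-the-envelope sketch, not a confirmed spec:

```python
# Back-calculate the implied GPU clock from the quoted peak FP32 throughput.
CUDA_CORES = 4608   # highest Turing configuration per NVIDIA
FLOPS_PER_CORE = 2  # one fused multiply-add counts as two FLOPs per clock

def implied_clock_ghz(tflops: float) -> float:
    """Clock speed (GHz) needed to hit the quoted TFLOPS at this core count."""
    return tflops * 1e12 / (CUDA_CORES * FLOPS_PER_CORE) / 1e9

print(f"{implied_clock_ghz(16):.2f} GHz")  # roughly 1.74 GHz
```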
Alongside the Turing architecture announcement, NVIDIA unveiled the Quadro RTX 5000, 6000, and 8000 products, due in Q4 2018.
In addition to tonight's announcements at SIGGRAPH, NVIDIA is expected to announce consumer GeForce products featuring the Turing architecture next week at an event in Germany.
PC Perspective is at SIGGRAPH and will also be at NVIDIA's event in Germany next week, so stay tuned for more details!
Subject: General Tech | August 3, 2018 - 03:09 PM | Jeremy Hellstrom
Tagged: rumours, nvidia, GV104, gtx 1180
Stare deeply at the PCB picture, which purports to be of a GTX 1180. The layout implies many things, such as the amount of GDDR we are likely to see, with 16GB seeming more likely than 8GB. The golden SLI fingers have a different design from previous generations, which could signal the arrival of NVLink for consumers: a great feature for those who can buy their GPUs in pairs. The Inquirer has some more prognostications, as well as links to even more leaked pictures.
"It's also visible from the leaked pictures that this model has different Scalable Link Interface (SLI) fingers than the firm's previous GPU cards. This could be a sign of Nvidia implementing NVLink for gaming GPUs, and because the cut out for the GPU is rather small, it could be a GV104 core chip."
Here is some more Tech News from around the web:
- AMD Creates Quad Core Zen SoC with 24 Vega CUs for Chinese Consoles @ Slashdot
- Leaked video shows the Galaxy Note9 from every angle, reveals 512GB version @ Ars Technica
- BlackBerry's Evolve smartphones ditch physical keys in favour of touchscreens @ The Inquirer
- Arm reckons its 'any device, any data, any cloud' IoT tech has legs @ The Register
- Get iflix FREE + New Local Content With iflix 3.0! @ TechARP
Subject: General Tech | July 31, 2018 - 01:17 PM | Jeremy Hellstrom
Tagged: turing, rumours, nvidia, Intel, gtx 1180
Two rumours are circulating today: one about Intel's upcoming year, and one about NVIDIA and their new Turing GPUs. From what The Inquirer has determined, there is a possibility that we will see the launch of the GTX 1180 on August 10th, shortly before Gamescom 2018 kicks off. Over the coming months we will see more models arrive, as well as the possible disappearance of the Ti brand, as the rumour includes a GTX 1180+.
[H]ard|OCP picked up on a different story: the leak of Intel's coming processors and at least some specifications. One definite piece of good news is that there is only one new chipset listed in the leak, the expected Z390, which suggests it is unlikely we will see yet another socket change.
"The launch will likely see the firm reveal the GTX 1180, GTX 1170, GTX 1160 and GTX 1180+. First out of the door, according to an email to partners leaked last week, will be the GTX 1180, which will replace the popular GTX 1080."
Here is some more Tech News from around the web:
- Add-On Board Brings Xbox 360 Controllers to N64 @ Hackaday
- HP launches 'first of its kind' bug bounty program for, er, printers @ The Inquirer
- Microsoft devises new way of making you feel old: Windows NT is 25 @ The Register
- How hack on 10,000 WordPress sites was used to launch an epic malvertising campaign @ The Register
Subject: Graphics Cards | July 30, 2018 - 03:32 PM | Ken Addison
Tagged: nvidia, geforce, gaming celebration, gamescom, cologne
Earlier today, NVIDIA announced the GeForce Gaming Celebration, taking place August 20th-21st, in Cologne, Germany.
NVIDIA promises that this open-to-the-public event, taking place before the Gamescom convention, "will be loaded with new, exclusive, hands-on demos of the hottest upcoming games, stage presentations from the world’s biggest game developers, and some spectacular surprises."
For any readers who might be in the area and interested in attending, first-come, first-served registration can be found here. For readers outside of the area, the event will also be live streamed.
PC Perspective will be attending the event, so stay tuned for more news and details! We can't possibly imagine what NVIDIA could be getting ready to announce.
Subject: General Tech | July 27, 2018 - 12:54 PM | Jeremy Hellstrom
Tagged: nvidia, rumour, gtx 1180, turing
Hopefully this rumour is wrong, as prices for NVIDIA's current high-end cards have finally stabilized at levels bearing some slight resemblance to their original MSRPs. If it is true, the next generation of GPUs from NVIDIA will be priced on the assumption that cryptomining will come back into vogue and people will be willing to pay a higher price for the new cards. We are still expecting these Turing cards to arrive at the end of August, built on TSMC's 12nm process and introducing a new connector for VR headsets which requires only a single cord.
While The Inquirer doesn't have any guesses on what the prices may be, they do have links to the source of the rumours.
"Digitimes has some additional info, though, and claims to have heard from 'industry sources' that the incoming Turing-based GPUs will be "priced higher" than Nvidia's current GTX 1080/1070 graphics cards."
Here is some more Tech News from around the web:
- Intel delays 10nm Cannon Lake processors, again, until late 2019 @ The Inquirer
- Spectre/Meltdown fixes in HPC: Want the bad news or the bad news? It's slower, say boffins @ The Register
- As Windows 10 turns three, we look back at its stormy first 1,000 days @ The Inquirer
- Google Bans Cryptocurrency Mining Apps From the Play Store @ Slashdot
- Windows 10 Insiders see double as new builds hit the deck – with promises to end Update Rage @ The Register
- Get Guns of Icarus Alliance FREE for a Limited Time! @ TechARP
- Netgear XR500 router, EX8000 Extender and SX10 (10 Gbit/s) Switch @ Guru of 3D
Subject: Graphics Cards | July 22, 2018 - 03:10 PM | Scott Michaud
Tagged: nvidia, gtx 1170, geforce
Take these numbers with a grain of salt, but WCCFTech has published what they claim is leaked GeForce GTX 1170 benchmarks, found “on Polish hardware forums”. If true, the results show that the graphics card, which would be below the GTX 1180 in performance, is still above the enthusiast-tier GTX 1080 Ti (at least on 3DMark FireStrike). It also suggests that both the GPU core and 16GB of memory are running at ~2.5 GHz.
Image Credit: “Polish Hardware Forums” via WCCFTech
So not only would the GTX 1180 be above the GTX 1080 Ti… but the GTX 1170 apparently is too? Also… 16GB on the second-tier card? Yikes.
Beyond the raw performance, new architectures also give NVIDIA the chance to add new features directly to the silicon. That said, FireStrike is an old-enough benchmark that it won’t take advantage of tweaks for new features, like NVIDIA RTX, so those should be above-and-beyond the increase seen in the score.
Don’t trust every screenshot you see…
Again, that is if this is true. The source is a picture of a computer monitor, which raises the question: why didn't they just take a screenshot? Beyond that, it's easy to make a website say whatever you want with the F12 developer tools of any mainstream web browser these days… as I've demonstrated in the image above.
Editor's Note: The initial version of this review incorrectly listed the Tiki as having 16GB of RAM, it actually has 32GB of memory.
Looking back through the PC Perspective archives as I prepared for this review, I was shocked to find we've never actually tested a Falcon Northwest Tiki system. Since its introduction in 2012, the Tiki has been a mainstay at conventions like CES, providing a compact solution for manufacturers to provide demos of their hardware and software.
With a base milled out of solid aluminum and a GPU cutout window, the Tiki provides modest design flair while remaining relatively tame and "adult-like" compared to many premium gaming PC options.
The Tiki is available with three different CPU platforms. Users have their pick of Intel Z370 and X299, and even X470 platforms built around AMD’s Ryzen CPUs. It’s great to see system builders like Falcon Northwest embracing Ryzen CPUs in some of their flagship models like the Tiki.
|Falcon Northwest Tiki (configuration as reviewed)|
|Processor||Intel Core i7-8086K (Coffee Lake)|
|Motherboard||ROG STRIX Z370-I GAMING|
|Cooler||Asetek 550LC 120mm AIO Water Cooler|
|Graphics||NVIDIA TITAN Xp 12GB|
|Memory||32GB (2x16GB) G.SKILL RIPJAWS V DDR4-3000|
|Storage||Intel Optane SSD 905P 1.5TB U.2|
|Power Supply||Silverstone SFX-650W|
|Dimensions||4" Wide x 13.5" Deep x 13.25" Tall. (715 cubic inches)|
|OS||Windows 10 Pro|
|Price||$6,242 (as configured) - Falcon NW|
By looking at the specs, it’s clear that the Tiki configuration we were sent for review packs a lot of punch into its relatively small form factor. Not only is the Core i7-8086K the highest-end offering for the Z370 platform, but Falcon Northwest has further overclocked the CPU to 5.3 GHz (single-thread maximum).
The CPU isn’t the only high-end component found in the Tiki either. Both the graphics card and the storage solution are nearing “overkill” level with the inclusion of an NVIDIA Titan Xp as well as 1.5TB of 3D XPoint storage in the form of an Intel Optane 905P U.2 drive.
Subject: Graphics Cards | July 17, 2018 - 12:38 PM | Ken Addison
Tagged: VR, VirtualLink, valve, usb 3.1, Type-C, Oculus, nvidia, microsoft, DisplayPort, amd
Today, NVIDIA, Oculus, Valve, AMD, and Microsoft, members of the VirtualLink consortium, have announced the VirtualLink standard, which aims to unify physically connecting Virtual Reality headsets to devices.
Based upon the physical USB Type-C connector, VirtualLink combines the bandwidth of DisplayPort 1.4 (32.4 Gbit/s) with a USB 3.1 data connection and the ability to deliver up to 27W of power.
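As a rough sanity check that a single DisplayPort 1.4 stream can feed a current headset, compare the link's effective data rate against a headset's raw pixel demand. The headset figures below are illustrative assumptions (roughly a Vive Pro-class panel), and the sketch ignores blanking intervals and protocol overhead:

```python
# Does DisplayPort 1.4 have headroom for a VR headset's video stream?
LINK_RAW_GBPS = 32.4                          # DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s
LINK_EFFECTIVE_GBPS = LINK_RAW_GBPS * 8 / 10  # after 8b/10b line coding

def headset_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Raw video bandwidth in Gbit/s, ignoring blanking and overhead."""
    return width * height * hz * bpp / 1e9

demand = headset_gbps(2880, 1600, 90)  # assumed Vive Pro-class panel
print(f"{demand:.1f} of {LINK_EFFECTIVE_GBPS:.1f} Gbit/s used")
```

Even at roughly 10 Gbit/s for a panel of that class, the link has ample headroom, which bodes well for the higher-resolution headsets the standard presumably anticipates.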
VirtualLink aims to simplify the setup of current VR headsets
Given the current "Medusa-like" nature of VR headsets with multiple cables needing to feed video, audio, data, and power to the headset, simplifying to a single cable should provide a measurable benefit to the VR experience. In addition, having a single, unified connector could provide an easier method for third parties to provide wireless solutions, like the current TPCast device.
VirtualLink is an open standard, and the initial specifications can currently be found on the consortium website.
Subject: General Tech | June 28, 2018 - 02:31 PM | Alex Lustenberg
Tagged: video, thermaltake, qualcomm, podcast, PG27UQ, nvidia, micron, K70, Intel, gddr6, g-sync, Elgato, corsair, asus
PC Perspective Podcast #505 - 06/28/18
Join us this week for discussion on ASUS G-SYNC HDR, Logitech G305, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Ken Addison
Peanut Gallery: Alex Lustenberg
Program length: 1:26:36
Podcast topics of discussion: