Matt Pharr Returns to NVIDIA by Joining NVIDIA Research

Subject: General Tech | May 28, 2018 - 08:43 PM |
Tagged: pixar, nvidia, matt pharr, Intel, google

NVIDIA Research has another industry veteran working for them: Matt Pharr.

According to his blog post on the topic, he will be working on some balance of ray tracing, neural networks, and how they can work together for computer graphics.


Moving on to Green... er... pastures.

Matt Pharr has been in the industry for quite some time. In the 90s, he worked at Pixar on A Bug’s Life and Toy Story 2. He then co-founded a company that made rendering software, which was bought by NVIDIA and eventually led to Gelato. From there, he founded another company, Neoptica, which was acquired by Intel. While there, he worked alongside the Larrabee team. He then joined Google in 2013, which has been his employer for the last five years.

He has been partially credited with physically based rendering, a way of defining computer-generated materials independently of any particular lighting setup. This allows artists to create content once and reuse it across multiple scenes, be it indoor or outdoor, light or dark.
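To make the "lighting independent" idea concrete, here is a minimal sketch of a physically based diffuse (Lambertian) material: the material is authored as a physical parameter (albedo), and the lighting is supplied separately at shading time. The function name and values are illustrative, not taken from any real renderer.

```python
import math

def shade_lambertian(albedo, light_intensity, cos_theta):
    """Outgoing radiance for a diffuse surface: (albedo / pi) * L_i * cos(theta).

    albedo          -- per-channel surface reflectivity (the material)
    light_intensity -- incident radiance from the light (the scene)
    cos_theta       -- cosine of the angle between surface normal and light
    """
    f_r = [c / math.pi for c in albedo]  # Lambertian BRDF is albedo / pi
    return [f * light_intensity * max(cos_theta, 0.0) for f in f_r]

# The material is defined once, with no reference to lighting...
red = [0.8, 0.1, 0.1]

# ...and then responds plausibly under whatever lighting a scene supplies.
outdoor = shade_lambertian(red, light_intensity=5.0, cos_theta=0.9)  # bright sun
indoor  = shade_lambertian(red, light_intensity=0.5, cos_theta=0.4)  # dim lamp
```

Because the albedo never changes between the two calls, the same asset works in both scenes; only the lighting inputs differ.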

We’re at an interesting point in time. We’re beginning to see hardware that can reasonably shoot rays into an environment to augment the data that rasterization provides us. At the same time, we’re also seeing the rise of neural networks that can hallucinate convincing but physically inaccurate effects relatively cheaply. Graphics isn’t just evolving forward, it’s mixing laterally, too. There’s room for engines and technologies to behave wildly differently from everyone else.
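The "rays augmenting rasterization" idea boils down to cheap per-pixel visibility queries: after the rasterizer decides what is on screen, a single ray per pixel can answer questions rasterization struggles with, such as whether a point is shadowed. A minimal sketch of the core ray-sphere test (illustrative only, with a normalized ray direction assumed):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    assuming direction is a unit vector.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A shadow query: shoot a ray from a shaded point toward the light.
# If it hits an occluder first, the point is in shadow.
hit = ray_sphere_hit(origin=(0, 0, 0), direction=(0, 0, 1),
                     center=(0, 0, 5), radius=1.0)
```

A real engine would trace against an acceleration structure over millions of triangles, but each individual query reduces to intersection tests like this one.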


May 29, 2018 | 09:21 AM - Posted by ThatsSomeHeavyCalculationsThemRaysRequire (not verified)

So all those AI-related gaming effects run on the Tensor cores, and the ray tracing acceleration is done on the compute shaders. From some of the work I have seen AI used for in image processing, separating out objects (people, animals, other objects) in images from their backgrounds without the need for green/blue screens, maybe they can come up with some better, more edge-targeted AA using AI and save on the amount of AA calculations needed.

Really, Imagination Technologies' PowerVR hardware ray tracing acceleration was ahead of its time, and maybe there can be some movement towards getting dedicated ray tracing hardware on future desktop GPUs. But for now, the GPUs with the most available shader cores will be better for accelerating ray tracing calculations on their compute shaders.

Ray tracing is not exactly real time even on the fastest of GPUs, but that ray tracing denoising via post-processing may be able to be accelerated via Tensor cores and compute shaders, in addition to doing some traditional Raster Operations Pipeline methods.
