Rumor: Intel Adds New Codecs with Kaby Lake-S iGPU

Subject: Processors | June 24, 2016 - 11:15 PM
Tagged: Intel, kaby lake, iGPU, h.265, hevc, vp8, vp9, codec, codecs

Fudzilla isn't really talking about their sources, so it's difficult to gauge how confident we should be, but they claim to have information about the video codecs supported by Kaby Lake's iGPU. This update is supposed to include hardware support for HDR video, the Rec.2020 color gamut, and HDCP 2.2, because, if videos are pirated prior to their release date, the solution is clearly to punish your paying customers with restrictive, compatibility-breaking technology. Time-traveling pirates are the worst.


According to their report, Kaby Lake-S will support VP8, VP9, HEVC 8b, and HEVC 10b, both encode and decode. However, they then go on to say that 10-bit VP9 and 10-bit HEVC do not include hardware encoding. I'm not too knowledgeable about video codecs, but I don't know of any benefit to an HEVC Main 10 encoder that is limited to 8-bit output. Perhaps someone in our comments can clarify.

Source: Fudzilla

June 25, 2016 | 01:21 AM - Posted by Anonymous (not verified)

Yeah, not a whole lot of benefit to encoding 10-bit video through hardware. The purpose of 10-bit encoding is typically to get better compression efficiency, which basically goes against the tradeoff of hardware video encoding. The two could potentially cancel each other out to some degree, giving you a bit more compression efficiency without sacrificing speed, but the complexity of supporting it in hardware is probably not worth it.

June 25, 2016 | 02:40 AM - Posted by Anonymous (not verified)

But 8-bit encoding combined with a wide-gamut colour space like Rec.2020 will blow serious chunks. I can see massive posterization as it is on normal HD source material. 10-bit is a must going forward, and 8-bit should have been eliminated years ago.
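The banding concern above is easy to sketch with back-of-the-envelope arithmetic. This is a plain-Python illustration (no codec libraries; the function name is made up for this example): quantizing a smooth 0-to-1 brightness ramp at 8 and 10 bits shows the 4x difference in available gradations. Stretching a wider gamut like Rec.2020 over the same 8-bit range makes each of those coarse steps cover an even larger visible color difference, which is where posterization comes from.

```python
# Hypothetical sketch: count how many distinct levels survive when a smooth
# ramp is quantized to a given bit depth (as a video encoder's output would be).

def distinct_steps(bit_depth, num_samples=100_000):
    """Quantize a 0.0-1.0 ramp to `bit_depth` bits; return distinct level count."""
    levels = (1 << bit_depth) - 1  # max code value: 255 for 8-bit, 1023 for 10-bit
    ramp = (i / (num_samples - 1) for i in range(num_samples))
    return len({round(v * levels) for v in ramp})

print(distinct_steps(8))   # 8-bit: 256 levels
print(distinct_steps(10))  # 10-bit: 1024 levels, 4x finer gradations
```

The same arithmetic explains the 12-bit provision mentioned below: each extra bit doubles the number of steps available to cover the gamut.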

June 25, 2016 | 02:55 PM - Posted by BlackDove (not verified)

Exactly. And Rec.2020 has a provision for 12-bit encoding as well. It's about time sRGB and 8-bit got replaced.

June 26, 2016 | 03:57 AM - Posted by Anonymous (not verified)

Missing the point entirely. You can still encode in software, and if you care about avoiding artifacts, you're not using hardware encoding anyway. Hardware encoding is only useful when you need real-time encode speed for things like streaming. You don't care whether it looks great, so long as it goes out at a consistent rate. 8-bit is perfectly fine in those cases.

Of course, as they said, the CPU can decode 10-bit in hardware, though most CPUs could honestly probably decode it in software too if it were that big a deal. You won't be missing out on playback of this material.

June 25, 2016 | 07:46 AM - Posted by Anonymous (not verified)

I don't understand the article's HDCP comment. Do you have any links proving that paying customers will be affected by this?

I personally ripped some of my Blu-ray collection only to find out that I couldn't play several movies on my Samsung BD player due to Cinavia (an audio watermark), so there are scenarios where this is an issue. It would be nice to know EXACTLY what is being referred to.

I can play my stuff on my PC, but it looks like in the future we'll get DRM forced into more hardware. Again, though, the HDCP issue in this article is uncertain. If you read more here, it's vague and confusing->

June 25, 2016 | 09:58 AM - Posted by Ipkh (not verified)

You need HDCP 2.2 for 4K Blu-ray sources, so an HTPC with a 4K drive would need it to avoid buying a discrete GPU. Subscription video might eventually require HDCP as well.

June 25, 2016 | 01:34 PM - Posted by Scott Michaud

I was speaking generally about technologies that are useless outside of sating the fears of content creators, whose content is often pirated before release, and thus from within their own supply chain.

At the very least, paying customers will be affected by the need to update or purchase equipment for no end-user benefit (some of which, like an AV receiver, might be expensive). It also hinders legitimate uses, such as fair-use mash-ups and repairing damaged discs by cloning them. Worse still is if the DRM ends up broken or buggy in a way that prevents playback.

HDCP hasn't been 100% perfect throughout its history, but I wasn't intending to call out any specific issue. Again, I was just referring to its consumer-agnostic at best, and anti-consumer at worst, nature, which is particularly annoying when piracy demonstrably occurs before consumers can even get their hands on the content.

June 26, 2016 | 01:26 AM - Posted by DK76 (not verified)

This explains it all:

June 26, 2016 | 09:01 PM - Posted by Anonymous (not verified)

Fudzilla is wrong and doesn't know what they're talking about; Kaby Lake does support HEVC Main 10 hardware encoding in addition to decoding.

"HEVC Main 10 (10-bit) encoder and decoder support."

"Full HW HEVC encode MAIN10 profile supported on 7th Generation Intel Core and Core M platforms with limited feature coverage, please refer to limitations section"

"Full HW HEVC decode MAIN10 supported on 7th Generation Intel Intel Core and Core-M platforms."

"Full HW VP9 8 bit and 10 bit decode supported on 7th Generation Intel Core and Core-M platforms."
