When Kaby Lake G debuted at CES 2018, it arrived with a bang. No one expected sworn rivals Intel and AMD to collaborate on a CPU package, marrying a 7th-gen Kaby Lake CPU with a custom AMD Radeon RX Vega M GPU.
But what began with a bang ended Monday with an unceremonious memo. A Product Change Notification published by Intel on Monday, first noticed by Paul Alcorn of Tom’s Hardware, confirmed that virtually every Kaby Lake G part, including the Core i7-8706G, the Core i7-8705G, and the Core i5-8305G, would be discontinued. Last call for orders is January 17, 2020, and final shipments are scheduled for July 31, 2020.

Intel officially killed its ambitious Kaby Lake G line on Monday.
You’ll get drivers for five years
While the end of life of a processor isn’t typically a big deal to the consumers who own it, one sticking point here could have been driver support. Specifically, drivers for Kaby Lake G’s custom AMD Radeon RX Vega M graphics come only from Intel. With a normal discrete GPU, the consumer would download drivers from the original company, such as Nvidia or AMD.
With Kaby Lake G kaput, where does that leave Kaby Lake G owners? Intel said it will follow its standard policy and provide driver support for Kaby Lake G for five years from the product’s launch. All told, that probably means another 3.5 years of driver updates.

A failure or success?
Kaby Lake G was the part no one expected, and our own review found it quite promising. It seemed like a legitimate future threat to Nvidia’s hegemony in laptop graphics.
“As good as it is, Kaby Lake G isn’t going to shake up the CPU+GeForce scene today,” we wrote at the time. “But tomorrow, if there’s a Cannon Lake G or a Whiskey Lake G with more cores and better graphics, AMD and Nvidia should be worried.”
Nvidia, shut out of the relationship entirely, had the most to lose. “What’s bad for Nvidia,” we wrote, “is how the integrated-CPU-and-GPU design concentrates power with Intel. If Intel buys the graphics chip and adds it, the laptop vendor is no longer making the choice, potentially freezing out Nvidia.”
Even though AMD had some skin in the game, it also faced risks. “AMD isn’t sitting pretty either,” we wrote. “Today Intel is buying Radeon graphics, but the company recently announced its intent to make its own discrete graphics. It’s entirely possible a future ‘G’ chip will feature Intel discrete graphics, not AMD’s.”

Kaby Lake G greatly reduced the real estate it took to build a discrete GPU laptop.
Obviously, Nvidia doesn’t seem to have much to fear from a G chip today. In fact, as of this week, only five products that we know of even used the chip, and one of those was made by Intel itself. Contrast that with the dozens and dozens of GeForce+CPU designs.
While Dell, HP, and Acer built laptops using the lower-performance 65-watt chip, no laptop vendor made one based on the much faster 100-watt chip. Why not?
To be honest, we’ll never know. Were there technical issues? Availability problems? Did Nvidia figure out a way to thwart Intel? Did AMD decide not to feed its primary competitor any more Radeon chips? Or did Intel decide that its own graphics efforts would take precedence?
Intel’s response to our questions about the death of Kaby Lake G seems to imply that it just may not need AMD’s graphics anymore.
“Intel is refocusing its product portfolio,” an Intel spokesman told PCWorld. “Our 10th Gen Intel Core processors with Iris Plus graphics are built on the new Gen11 graphics architecture that nearly doubled graphics performance. We have more in store from our graphics engine that will bring further enhancements to PCs in the future.”
Perhaps the most important legacy of Kaby Lake G will be its use of Intel’s EMIB technology to join the 7th-gen Kaby Lake cores with the Radeon RX Vega M chip. By using EMIB, Intel greatly reduced the size of the package typically needed for a discrete-GPU-and-CPU design. The single package also gave Intel the ability to monitor and balance power consumption and heat production across both chips.
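Because both dies share one package and one thermal budget, Kaby Lake G could shift power between the CPU and GPU depending on the workload. Here’s a minimal sketch of that power-sharing concept in Python, assuming a simple proportional arbitration policy; the names, numbers, and logic are our own illustrative assumptions, not Intel’s actual power-management firmware.

```python
# Hypothetical sketch of shared-package power balancing, loosely inspired by
# Kaby Lake G's design. Everything here is an illustrative assumption, not
# Intel's actual firmware logic.

PACKAGE_TDP_W = 65.0  # total package budget; shipping parts were 65 W or 100 W


def rebalance(cpu_demand_w: float, gpu_demand_w: float) -> tuple[float, float]:
    """Split one package-level power budget between the CPU and GPU dies.

    If both dies together fit under the budget, each gets what it asks for.
    Otherwise the budget is divided in proportion to demand, one simple
    arbitration policy among the many a real controller might use.
    """
    total = cpu_demand_w + gpu_demand_w
    if total <= PACKAGE_TDP_W:
        return cpu_demand_w, gpu_demand_w
    scale = PACKAGE_TDP_W / total
    return cpu_demand_w * scale, gpu_demand_w * scale


# Example: a GPU-heavy gaming load squeezes the CPU's share of the budget.
cpu_w, gpu_w = rebalance(cpu_demand_w=35.0, gpu_demand_w=55.0)
print(f"CPU limit: {cpu_w:.1f} W, GPU limit: {gpu_w:.1f} W")
```

The point of the sketch is the single budget: in a traditional two-package design, each chip manages its own limit independently, while here one arbiter sees both.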
In the end, Kaby Lake G’s real impact was proving that it was possible for Intel and AMD to work together on a product—something we may never see again.