
[Nov 2021] – It’s no easier to identify inflection points in semiconductor history in hindsight than it is at the time they occur.
Who thinks much about the 80486 today? It was the first Intel CPU to include an on-chip floating point unit; on PCs before that, difficult math was handed off to a separate coprocessor. So it was a move noted by IC market watchers. Soon, however, the more raucous controversy revolved around up-and-coming Reduced Instruction Set Computer (RISC) chips like MIPS and SPARC versus Complex Instruction Set Computer (CISC) chips like Intel’s.
Andy Grove thought about RISC, then decided the game was elsewhere, and forged ahead with 80486 successors. Grove immortalized his ‘aha’ moment in tech marketing lore as an ‘Inflection Point’ in his business autobiography, “Only the Paranoid Survive.” RISC was more noise than signal, as he had it.
[His company fielded a RISC player in the form of the i860 – it’s not hard even now to recall the excitement of my trade press colleagues when that chip was unveiled (classic trade press term, that) at ISSCC at the midtown Hilton in N.Y. in 1989.]
Andy Grove and company eventually discerned that RISC advantages were not going to change the all-important PC market. The 80486 family (and, eventually, the Pentium) began a long run, one that held up even as server, cloud, big data and – especially – cell phone architectures came to account for a grand share of the IC market. Now we are looking at AI and Edge AI as a new inflection point for semiconductor schemes.
Today, the hyperscaler cloud providers call the tune. They have written the latest chapter in the book on dedicated AI processors that hold neural nets, and they are also actively pursuing low-power Edge AI versions of same. A mighty little Acorn, in the form of the ARM chip, found a home, and a standalone floating-point unit [well, sort of, arguably, for purposes of discussion], in the form of the GPU, came on the scene, followed by TPUs and EBTKS (Everything But The Kitchen Sink).
Is the Edge in any way, shape or form a threat to the cloud status quo? It seems unlikely – hyperscalers don’t ask; they act.
Now chip designers must reformulate machine learning architectures built for the power-rich cloud data center to run under low-power constraints on the edge, outpace a mix of competitors, and find willing customers.
From my perspective, that is part of the background to a piece I recently wrote about AI on the Edge.
The central source for “The Shape of Edge AI to Come” on VentureBeat was Marian Verhelst, a circuits and systems researcher at Katholieke Universiteit Leuven and the Imec tech hub in Belgium, as well as an advisor to startup Axelera.AI and a member of the tinyML Foundation.
“Big companies like Google, Apple, and Facebook are all investing in making chips. The AI wave caused them to start developing chips because it makes a big difference,” Verhelst told me. Moving neural-net-style AI/ML from the cloud to the edge brings new requirements, which the story outlines.
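By way of illustration (this sketch is mine, not something from the VentureBeat piece, and the small model in it is a hypothetical stand-in): one common first step in refitting a cloud-trained neural net for an edge memory and power budget is post-training quantization, shown here with PyTorch’s dynamic quantization.

```python
import io

import torch
import torch.nn as nn

# Hypothetical stand-in model; the article names no specific network.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)
model.eval()

# Post-training dynamic quantization: Linear weights are stored as int8 and
# activations are quantized on the fly at inference time, one common way to
# shrink a cloud-trained model toward edge memory and power budgets.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def serialized_size(m: nn.Module) -> int:
    # Serialized state_dict size in bytes, a rough proxy for model footprint.
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes

print("fp32 model bytes:", serialized_size(model))
print("int8 model bytes:", serialized_size(quantized))
```

Real edge deployments go further, with pruning, operator fusion and compilation to a particular NPU or accelerator target, but the size difference the sketch prints gives a feel for why reduced precision figures so heavily in the new edge silicon.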
Basically, the startups pursue a space pioneered by hyperscale cloud providers that could not wait for mainline chip makers to create new chip architectures. Now the task is to extend such architectures to work on the edge. Hindsight will be the final judge – but it could well be an Inflection Point. – Jack Vaughan
Related
https://venturebeat.com/2021/11/24/the-shape-of-edge-ai-to-come/
https://www.intel.com/pressroom/archive/speeches/ag080998.htm Hey, that’s a pretty old-style URL!
https://ashishb.net/book-summary/book-summary-only-the-paranoid-survives-by-andrew-grove/