Progressive Gauge

Media and Research


Archives for November 2021

Upon inflection

November 27, 2021 By Jack Vaughan

Intel 80486

[Nov 2021] – It’s no easier to identify inflection points in semiconductor history in hindsight than it is at the time they occur.

Who thinks much about the 80486 today? This CPU was the first Intel CPU to include an on-chip floating-point unit. On PCs before that, difficult math problems were handed off to separate coprocessors. So, it was a move noted by IC market watchers. Soon, however, the more raucous controversy revolved around up-and-coming Reduced Instruction Set Computing (RISC) chips like MIPS and SPARC versus Complex Instruction Set Computing (CISC) chips like Intel’s.

Andy Grove thought about RISC, then decided the game was elsewhere, and forged ahead with 80486 follow-ons. Grove immortalized his ‘aha’ moment in tech marketing lore as an ‘Inflection Point’ – this, in his “Only the Paranoid Survive” business autobio. RISC was more noise than signal, as he had it.

[His company fielded a RISC player in the form of the i860 – it’s not hard now to recall the excitement of my trade press colleagues when that chip was unveiled (classic trade press term, that) at ISSCC at the midtown Hilton in N.Y. in 1989.]

Andy Grove and co. eventually discerned that RISC’s advantages were not going to change the all-important PC market. The 80486 family (and the Pentium line that followed) began a long run. That run held up even as server architectures, cloud architectures, big data architectures and – especially – cell phone architectures came to comprise a grand share of the IC market. Now, we are looking at AI and Edge AI as a new inflection point for semiconductor schemes.

Today, the hyperscaler cloud providers call the tune. They have written the latest chapter in the book on dedicated AI processors that hold neural nets, and they are also actively pursuing low-power Edge AI versions of same. A mighty little Acorn in the form of the ARM chip found a home, and a standalone floating-point unit [well, sort of, arguably, for purposes of discussion] in the form of the GPU came on the scene, followed by TPUs and EBTKS (Everything But The Kitchen Sink).

Is Edge in any way, shape, or form a threat to the cloud status quo? It seems unlikely – hyperscalers don’t ask; they act.

Now, chip designers must reformulate machine learning architectures that work in the power-rich cloud data center for use in low-power operation on the edge, outpace a mix of competitors, and find willing customers.
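
To make that reformulation a little more concrete: one common step is shrinking a cloud-trained model’s numerics to fit edge power and memory budgets. Below is a minimal sketch – not from the VentureBeat piece, and not any vendor’s toolchain – of post-training 8-bit weight quantization in plain numpy. The function names, tensor shape, and value ranges are illustrative assumptions.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Affine 8-bit quantization: map w's observed float range onto [-128, 127]."""
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / 255.0 or 1.0   # guard against a constant tensor
    zero_point = round(-128 - lo / scale)
    q = np.clip(np.round(w / scale + zero_point), -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Approximate float32 reconstruction, for accuracy checks."""
    return (q.astype(np.float32) - zero_point) * scale

# A stand-in for one "cloud-trained" float32 layer, shrunk 4x for the edge.
w = np.random.randn(256, 256).astype(np.float32)
q, s, z = quantize_int8(w)
err = float(np.abs(w - dequantize(q, s, z)).max())
print(f"max reconstruction error: {err:.5f}; {w.nbytes} -> {q.nbytes} bytes")
```

The same idea, carried through activations and accumulators, is part of what lets int8 edge silicon stand in for float32 math in the data center.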

From my perspective, that is part of the backdrop to a piece I recently wrote about AI on the Edge.

The central source for “The Shape of Edge AI to Come” on VentureBeat was Marian Verhelst, a circuits and systems researcher at Katholieke Universiteit Leuven and the Imec tech hub in Belgium, as well as an advisor to startup Axelera.AI and a member of the tinyML Foundation.

“Big companies like Google, Apple, and Facebook are all investing in making chips. The AI wave caused them to start developing chips because it makes a big difference,” Verhelst told me. Moving neural net style AI/ML from the cloud to the edge brings new requirements, which the story outlines.

Basically, the startups pursue a space pioneered by hyperscaling cloud providers that could not wait for mainline chip makers to create new chip architectures. Now, the task is to extend such architectures to work on the edge. Hindsight will be the final judge – but it could well be an Inflection Point. – Jack Vaughan

Related
https://venturebeat.com/2021/11/24/the-shape-of-edge-ai-to-come/
https://www.intel.com/pressroom/archive/speeches/ag080998.htm (Hey, that’s a pretty old-style URL!)
https://ashishb.net/book-summary/book-summary-only-the-paranoid-survives-by-andrew-grove/


Trimble works with Microsoft on quantum-inspired logistics

November 18, 2021 By Jack Vaughan

Azure Quantum drives data storage improvements https://t.co/ywLiqKjEbO via @VentureBeat

— Jack Vaughan (@JackIVaughan) November 16, 2021

One among the many takeaways from the COVID-19 pandemic is an awareness that industries needed to take another look at supply-chain optimization. It may not be quite up there with security, but supply-chain optimization is near the top of the charts in many of today’s IT surveys.

Logistics and optimization are oft-cited uses of quantum computing. Though it is still a trickle, there are growing indications quantum algorithms are being applied to research in supply-chain optimization. These are often quantum-inspired efforts, meaning they reap the benefit of research in quantum simulation but don’t require actual quantum computers to do the work.

An example comes by way of Microsoft, which is using its own quantum software to optimize storage – that is, to increase capacity and predictability – on its Azure cloud. You can read about it on VentureBeat.

Computer storage management turns out to be a useful application of quantum-inspired algorithms, in Microsoft’s telling, and Microsoft is certainly well versed in cloud hyperscaling. Some of this is outlined in a Microsoft blog.

But the company said its QIO optimization solvers can work on other domains, and it was joined by geospatial and logistics systems mainstay Trimble in the announcement. Trimble said it is using Azure Quantum QIO to identify the best routes for vehicle fleets, “ensuring fewer trucks run empty, and maximizing operating load for each trip.” Useful trait, that.
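
To give a feel for what “quantum-inspired” means in practice, here is a minimal sketch – emphatically not Microsoft’s QIO code or its API – of the classical simulated-annealing search that this family of solvers builds on, applied to a toy QUBO (binary quadratic) problem. The problem size and random Q matrix are illustrative assumptions; a real routing job would encode truck-to-route costs and constraint penalties in Q.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy QUBO: minimize x^T Q x over binary vectors x. Q is random here,
# purely for illustration.
n = 20
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2  # symmetrize

def energy(x):
    """QUBO objective for a binary vector x."""
    return float(x @ Q @ x)

def simulated_annealing(steps=20_000, t_start=2.0, t_end=0.01):
    """Classical annealing search over single bit flips with geometric cooling."""
    x = rng.integers(0, 2, n)
    e = energy(x)
    best, best_e = x.copy(), e
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # cooling schedule
        i = rng.integers(n)
        x[i] ^= 1                        # propose a single bit flip
        e_new = energy(x)
        if e_new <= e or rng.random() < np.exp((e - e_new) / t):
            e = e_new                    # accept the move
            if e < best_e:
                best, best_e = x.copy(), e
        else:
            x[i] ^= 1                    # reject: flip the bit back
    return best, best_e

best, best_e = simulated_annealing()
print(f"best energy found: {best_e:.3f}")
```
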
When it comes to quantum-inspired algorithms on classical computers – they will just have to do, in the words of Sammy Cahn’s immortal song, until the quantum computing thing comes along. – Jack Vaughan

Copyright © 2023 · Jack Vaughan