Progressive Gauge


Media and Research


Blog

Upon inflection

November 27, 2021 By Jack Vaughan

Intel 80486

[Nov 2021] – It’s no easier to identify inflection points in semiconductor history in hindsight than it is at the time they occur.

Who thinks much about the 80486 today? It was the first Intel CPU to include an on-chip floating-point unit. On PCs before that, difficult math problems were handed off to separate coprocessors. So it was a move noted by IC market watchers. Soon, however, the more raucous controversy revolved around up-and-coming Reduced Instruction Set Computing (RISC) chips like MIPS and SPARC versus Complex Instruction Set Computing (CISC) chips like Intel’s.

Andy Grove thought about RISC, then decided the game was elsewhere, and forged ahead with 80486 successors. Grove immortalized his ‘aha’ moment in tech marketing lore as an ‘Inflection Point’ – this, in his “Only the Paranoid Survive” business autobio. RISC was more noise than signal, as he had it.

[His company fielded a RISC player of its own in the form of the i860 – it’s not hard now to recall the excitement of my trade press colleagues when that chip was unveiled (classic trade press term, that) at ISSCC at the midtown Hilton in N.Y. in 1989.]

Andy Grove and co. eventually discerned that RISC’s advantages were not going to change the all-important PC market. The 80486 family (eventually “Pentium”) began a long run, one that held up even as server architectures, cloud architectures, big data architectures and – especially – cell phone architectures came to comprise a grand share of the IC market. Now, we are looking at AI and Edge AI as a new inflection point for semiconductor schemes.

Today, the hyperscaler cloud providers call the tune. They have written the latest chapter in the book on dedicated AI processors that hold neural nets, and they are also actively pursuing low-power Edge AI versions of same. A mighty little Acorn in the form of the ARM chip found a home, and a standalone floating-point unit [well, sort of, arguably, for purposes of discussion] in the form of the GPU came on the scene, followed by TPUs and EBTKS (Everything But The Kitchen Sink).

Is Edge in any way, shape or form a threat to the cloud status quo? It seems unlikely – hyperscalers don’t ask; they act.

Now, chip designers must reformulate machine learning architectures that work in the power-rich cloud data center for low-power operation on the edge, outpace a mix of competitors, and find willing customers.
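
[An aside on what ‘reformulating for the edge’ can look like on the software side: one common technique – not one the piece itself prescribes – is shrinking a cloud-trained model through post-training quantization. A minimal sketch, assuming TensorFlow/TensorFlow Lite and a toy stand-in for the cloud model:]

```python
# Minimal sketch: post-training quantization of a cloud-trained model for
# low-power edge deployment. The toy network below stands in for a real model;
# this illustrates one common technique, not a recipe from the piece.
import tensorflow as tf

# Stand-in for a model trained in the power-rich data center
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to TensorFlow Lite with default optimizations (dynamic-range weight
# quantization), trading a little accuracy for a much smaller, lower-power model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("edge_model.tflite", "wb") as f:
    f.write(tflite_model)
```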

From my perspective, that is one of several backdrops to a piece I recently wrote about AI on the Edge.

The central source for “The Shape of Edge AI to Come” on VentureBeat was Marian Verhelst, a circuits and systems researcher at Katholieke Universiteit Leuven and the Imec tech hub in Belgium, as well as an advisor to startup Axelera.AI and a member of the tinyML Foundation.

“Big companies like Google, Apple, and Facebook are all investing in making chips. The AI wave caused them to start developing chips because it makes a big difference,” Verhelst told me. Moving neural net style AI/ML from the cloud to the edge brings new requirements, which the story outlines.

Basically, the startups pursue a space pioneered by hyperscaling cloud providers that could not wait for mainline chip makers to create new chip architectures. Now, the task is to extend such architectures to work on the edge. Hindsight will be the final judge – but it could well be an Inflection Point. – Jack Vaughan

Related
https://venturebeat.com/2021/11/24/the-shape-of-edge-ai-to-come/
https://www.intel.com/pressroom/archive/speeches/ag080998.htm Hey that’s a pretty old-style URL!
https://ashishb.net/book-summary/book-summary-only-the-paranoid-survives-by-andrew-grove/


Trimble works with Microsoft on quantum-inspired logistics

November 18, 2021 By Jack Vaughan

Azure Quantum drives data storage improvements https://t.co/ywLiqKjEbO via @VentureBeat

— Jack Vaughan (@JackIVaughan) November 16, 2021

One among the many takeaways from the COVID-19 pandemic is an awareness that industries needed to take another look at supply-chain optimization. It may not be quite up there with security, but supply-chain optimization is near the top of the charts in many of today’s IT surveys.
Logistics and optimization are oft-cited uses of quantum computing. Though it is still a trickle, there are growing indications that quantum algorithms are being applied to research in supply-chain optimization.
These are often quantum-inspired efforts, meaning they reap the benefit of research in quantum simulation, but don’t require actual quantum computers to do the work.
An example comes by way of Microsoft, which is using its own quantum software to optimize storage – that is, to increase capacity and predictability – on its Azure cloud. You can read about it on VentureBeat.
Computer storage management turns out to be a useful application of quantum-inspired algorithms, in Microsoft’s telling, and the company is certainly well versed in cloud hyperscaling. Some of this is outlined in a Microsoft blog.
But the company said its QIO optimization solvers can work in other domains, and it was joined by geospatial and logistics services mainstay Trimble in the announcement. Trimble said it is using Azure Quantum QIO to identify the best routes for vehicle fleets, “ensuring fewer trucks run empty, and maximizing operating load for each trip.” Useful trait, that.
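To make “quantum-inspired” a little more concrete: problems like fleet routing are typically cast as a QUBO – a quadratic unconstrained binary optimization – and handed to an annealing-style solver running on ordinary classical hardware. The toy sketch below is mine, not Microsoft’s QIO API; the two-truck, two-route cost matrix is invented purely for illustration.

```python
# Illustrative only: a tiny QUBO solved with plain simulated annealing on a
# classical machine. This conveys the general flavor of "quantum-inspired"
# optimization; it is not the Azure Quantum QIO API, and the toy problem
# (2 trucks x 2 routes) is invented for the example.
import math
import random

# Binary variables: x0 = truck0->route0, x1 = truck0->route1,
#                   x2 = truck1->route0, x3 = truck1->route1
# Diagonal terms reward making an assignment; off-diagonal terms penalize
# giving a truck two routes or a route two trucks.
Q = {
    (0, 0): -2.0, (1, 1): -2.0, (2, 2): -2.0, (3, 3): -2.0,
    (0, 1): 4.0, (2, 3): 4.0,   # one route per truck
    (0, 2): 4.0, (1, 3): 4.0,   # one truck per route
}

def energy(x):
    """Evaluate the quadratic cost x^T Q x for a binary vector x."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

def anneal(n_vars=4, steps=5000, t_start=2.0, t_end=0.01):
    x = [random.randint(0, 1) for _ in range(n_vars)]
    cur_e = energy(x)
    best, best_e = x[:], cur_e
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = random.randrange(n_vars)
        x[i] ^= 1                          # propose flipping one bit
        new_e = energy(x)
        if new_e <= cur_e or random.random() < math.exp((cur_e - new_e) / t):
            cur_e = new_e                  # accept the move
            if cur_e < best_e:
                best, best_e = x[:], cur_e
        else:
            x[i] ^= 1                      # reject: flip the bit back
    return best, best_e

assignment, cost = anneal()
print("best assignment:", assignment, "cost:", cost)
```
Real solvers are far more sophisticated, but the shape of the problem – binary variables, a quadratic cost, penalties for breaking constraints – is the same.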
When it comes to quantum-inspired algorithms on classical computers – it will just have to do, in the words of Sammy Cahn’s immortal song, until the quantum computing thing comes along. – Jack Vaughan

Quantum computing moves ahead in increments

October 31, 2021 By Jack Vaughan

Q: How many quantum computer coders does it take to change a lightbulb?
A: Both none and one at the same time. (Reddit)

The battle for quantum supremacy ended in 2019. As history shows, sometimes the “mop up” takes longer than the battle. As of today, quantum computing’s journey out of the laboratory has not developed a discernible pace. That makes it a great case for technology assessment.

When Google in 2019 claimed quantum supremacy – that its computer based on qubits could outperform classical machines – IBM and others challenged that notion. No surprise there.

But, at the same time, the rush of breathless announcements subsided. The quantum research army went back to lab work – that is, creating algorithms and solving problems of error correction, reproducibility, quantum mechanical entanglement and much more.

The answer to ‘how’s it going?’ is the same here as in the clogged-up paint shop in an old Detroit auto factory: “We’re working on it.”

Cryptography, materials science, logistics and operations are often cited as near-term targets. Some work is beginning using the relatively meager number of qubits at hand.

The roadmaps of the Honeywells, IBMs and Googles suggest continuing incremental steps forward over this decade, not just in science labs but also in technology applications.

The fact is that people in the quantum business are resolved to pursue the necessary, incremental steps, and that the industries that could be affected are watching closely enough not to be blindsided when quantum computing sees the light of day.

Quantum computing hardware will need to advance before commercial value is achieved, but in the meantime, some new software trailing in the wake of research work may creep into commercial use ahead of that day. – Jack Vaughan

Recent story
Gartner advises tech leaders to prep for action as quantum computing spreads – VentureBeat

The Edge of AI

August 25, 2021 By Jack Vaughan

Since it is early in the era of Edge AI, it is fair to say that vendors are still trying to uncover the best first places to deploy AI. The value in the data center is a given, but outside the (relatively) friendly confines of the data center, the game is a bit unpredictable.

That is a takeaway I gather from my recent research for “Edge AI Chips Take to the Field” for IoT World Today. I’d be glad to predict what’s next, but the tea leaves are a bit too scattered for that.

Edge AI arguably has a longer lineage than people generally realize. Language-specific LISP chips targeted AI in the 1980s, and we at Digital Design devoted attention to the phenomenon at the time – though it was hard to conjure up three players for a well-rounded feature.

The dim consensus today is that a lack of humongous data sets stalled AI back in those days. Whether that is the case or not, it is clear that more powerful 32-bit and 64-bit CPUs proved more capable for whatever AI needs you might have had. More recently, GPUs, DSPs and the TPU became the means for AI models to crunch big data.

As our story shows, new memory approaches are part and parcel of the Edge AI gold rush. As I composed the piece, I looked for an analogy. It was far from a good fit, but the early days of the automobile seemed somewhat apt.

Fate did not foreordain that the gas-powered internal combustion engine would come to epitomize “the auto” for many, many years. Electric, steam and gas-powered engines vied. Gas won, for a good long stay. Why? A number of sympathetic trends aligned.

Poor inter-city roads worked against cars of any kind in the early going. But the gas engine benefited from a better understanding of thermodynamics, proved maintainable and economical, and fit well with emerging mass production practices. This, at a time when the fuel – refined oil – was inexpensive. Still, some hybrid flair was required: the invention of the electric starter eased the way for mass adoption of the internal combustion engine. In recent years, electric vehicles have made inroads, but they still face hurdles to wide acceptance.

Will any of our new Edge AI designs come to resemble the Stanley Steamer? It took the locomotive boiler as its model, but ultimately failed. The internal combustion engine did what the Steamer didn’t. Now, a step or two into the 21st century, it may be due for replacement. What will hold sway next? We will see.

So, let’s get back to Edge AI. The divergent AI approaches now rolling out should see some consolidation over time. It took many years for the internal combustion engine to settle in as the way the world powered automobiles, and many designs have vied to replace it. But the infrastructure surrounding the engine is an important key to how things play out.

Nvidia, which stole a march on competitors with its GPUs for data center-based AI, is often cited as a leader in the development of tools for AI chips. Such “infrastructure” gives it an edge on the edge, too. The company is adapting its hardware and tools for Edge AI. New arrivals bring new approaches at the edge but must catch up, especially in terms of tooling, which will take time.

Finding the killer app is still ahead. Where are the Edge AI use cases? It is still too early to find something on par with the Internet recommendation engines and image recognition applications that powered Nvidia’s first AI forays. One thing that can be said is that we don’t yet see many drones in the sky or self-driving cars on the highway, though these are often cited as the places where Edge AI will thrive.

What we see most immediately is a nation of surveillance cameras, a scattering of self-evaluating joggers and bike riders, and voice-activated speakers powered by assistants (I know what this latter thing is because I just asked the Google orb on my kitchen table what she was.) These are the visible targets, but there may be others near but still less visible.

Note:
The global smart speaker market is expected to reach USD 24.09 billion by 2028, growing at a CAGR of 16.9%.
The global surveillance camera market is expected to reach USD 39.13 billion by 2025, growing at a CAGR of 8.17% over 2021-2025.
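(Those projections compound as future = present × (1 + CAGR)^years. A quick back-of-envelope check – the base-year values here are inferred from the projections, not taken from the cited reports:)

```python
# Back-of-envelope CAGR arithmetic for the market figures above. The implied
# base-year values are inferred from the projections, not from the reports.
def implied_base(future_value, cagr, years):
    """Value implied N years earlier by: future = base * (1 + cagr) ** years."""
    return future_value / (1 + cagr) ** years

# Surveillance cameras: USD 39.13B in 2025 at 8.17% CAGR over 2021-2025
print(round(implied_base(39.13, 0.0817, 4), 2))   # ~28.58 (USD billions, 2021)

# Smart speakers: USD 24.09B by 2028 at 16.9% CAGR (assuming a 2021 base year)
print(round(implied_base(24.09, 0.169, 7), 2))    # ~8.07 (USD billions, 2021)
```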

Lesson of Cassandra in the desert: “First, model”

April 18, 2021 By Jack Vaughan

Cassandra lessons from desert deployment

