Progressive Gauge

Media and Research

  • Home
  • About
  • Blog
  • Projects, Samples and Items
  • Video
  • Contact Jack

Jack Vaughan

Quantum computing moves ahead in increments

October 31, 2021 By Jack Vaughan

Q: How many quantum computer coders does it take to change lightbulb?
A: Both none and one at the same time. (Reddit)

The battle for quantum supremacy ended in 2019. As history shows, sometimes the “mop up” takes longer than the battle. As of today, quantum computing’s journey from the laboratory has not settled into a discernible pace. That makes it a great case for technology assessment.

When Google claimed quantum supremacy in 2019 – asserting that its qubit-based computer could outperform classical machines – IBM and others challenged the notion. No surprise there.

But, at the same time, the rush of breathless announcements subsided. The quantum research army went back to lab work – that is, creating algorithms and solving problems of error correction, reproducibility, quantum mechanical entanglement and much more.

The answer to ‘how’s it going?’ is the same here as in the clogged-up paint shop in an old Detroit auto factory: “We’re working on it.”

Cryptography, materials science, logistics and operations are often cited as near-term targets. Some work is beginning with the relatively meager number of qubits at hand.

The roadmaps of the Honeywells, IBMs and Googles suggest continuing incremental steps forward over this decade, not just in science labs but also in technology applications.

The fact is that people in the quantum business are resolved to pursue the necessary incremental steps, and that the industries that could be affected are watching closely enough not to be blindsided when quantum computing sees the light of day.

Quantum computing hardware will need to advance before commercial value is achieved, but in the meantime, some new software trailing in the wake of research work may creep into commercial use ahead of that day. – Jack Vaughan

Recent story
Gartner advises tech leaders to prep for action as quantum computing spreads – VentureBeat

The Edge of AI

August 25, 2021 By Jack Vaughan

Since it is early in the era of Edge AI, it is fair to say that vendors are still trying to uncover the best first places to deploy AI. The value in the data center is a given, but outside the (relatively) friendly confines of the data center the game is a bit unpredictable.

That is a takeaway from my recent research for “Edge AI chips take to the Field” for IoT World Today. I’d be glad to predict what’s next, but the tea leaves are a bit too scattered for that.

Edge AI arguably has a longer lineage than people generally realize. Language-specific LISP chips targeted AI in the 1980s, and we at Digital Design devoted attention to the phenomenon at the time – though it was hard to conjure up three players for a well-rounded feature.

The dim consensus today is that a lack of humongous data sets stalled AI back in those days. Whether that was the case or not, it is clear that more powerful 32-bit and 64-bit CPUs proved more capable for whatever AI needs you might have. More recently, GPUs, DSPs, and the TPU became the means for AI models to crunch big data.

As our story shows, new memory approaches are part and parcel of the Edge AI gold rush. As I composed the piece, I looked for an analogy. It was far from a good fit, but the early days of the automobile seemed somewhat apt.

Fate did not foreordain that the gas-powered internal combustion engine would come to epitomize “the auto” for many, many years. Electric, steam, and gas-powered engines vied. Gas won a good long stay. Why? A number of sympathetic trends aligned.

Poor inter-city roads augured against cars of any kind in the early going. But the gas engine benefited from better understanding of thermodynamics, and proved maintainable, economical, and fit well with emerging mass production practices. This, at a time when the fuel – refined oil – was inexpensive. But some hybrid flair was required: The invention of electric starters eased the way for mass adoption of the internal combustion engine. In recent years, electric vehicles have made inroads, but still face hurdles to wide acceptance.

Will any of our new Edge AI designs come to resemble the Stanley Steamer? It took the locomotive boiler as its analogy, but ultimately failed. The internal combustion engine did what the Steamer didn’t. Now, a step or two into the 21st century, it may be due for replacement. What next will hold sway? We will see.

So, let’s get back to Edge AI. The divergent AI approaches now rolling out should see some consolidation over time. It took many years for the internal combustion engine to settle in as the way the world powered automobiles, and many designs have vied to replace it. But the infrastructure surrounding the engine is an important key to how things play out.

Nvidia, which gained a march on competitors with its GPUs for data center-based AI, is often cited as a leader in the development of tools for AI chips. Such “infrastructure” gives it an edge on the edge too. The company is adapting its hardware and tools for Edge AI. New arrivals have new approaches at the edge but must catch up, especially in terms of tooling, which will take time.

Finding the killer app is still ahead. Where are the Edge AI use cases? It’s still early to find something on par with the Internet recommendation engines and image recognition applications that powered Nvidia’s first AI forays. One thing that can be said is that we don’t see many drones in the sky or self-driving cars on the highway. These are often cited as the places where Edge AI will thrive.

What we see most immediately is a nation of surveillance cameras, a scattering of self-evaluating joggers and bike riders, and voice-activated speakers powered by assistants (I know what this latter thing is because I just asked the Google orb on my kitchen table what she was.) These are the visible targets, but there may be others near but still less visible.

Note:
The global smart speaker market size is expected to reach USD 24.09 billion by 2028 at a CAGR of 16.9%.
The global surveillance camera market is expected to reach US$39.13 billion in 2025, growing at a CAGR of 8.17% for the time period of 2021-2025.
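Forecasts like these follow the compound-growth formula FV = PV · (1 + r)^n, so the stated figures can be sanity-checked by backing out the implied starting market size. A minimal sketch, assuming the speaker forecast compounds over the seven years 2021–2028 and the camera forecast over the four years 2021–2025 (the underlying reports’ base years are an assumption here):

```python
def implied_base(future_value, cagr, years):
    """Back out the starting value implied by a future value and a CAGR."""
    return future_value / (1 + cagr) ** years

# Smart speakers: USD 24.09B by 2028 at a 16.9% CAGR, assumed 7-year horizon
speakers_2021 = implied_base(24.09, 0.169, 7)

# Surveillance cameras: USD 39.13B by 2025 at 8.17%, assumed 4-year horizon
cameras_2021 = implied_base(39.13, 0.0817, 4)

print(f"Implied 2021 smart speaker market: USD {speakers_2021:.2f}B")
print(f"Implied 2021 surveillance camera market: USD {cameras_2021:.2f}B")
```

Under those assumed horizons, the figures imply 2021 markets of roughly USD 8B for smart speakers and USD 29B for surveillance cameras.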

Lesson of Cassandra in the desert: “First, model”

April 18, 2021 By Jack Vaughan

Cassandra lessons from desert deployment

Mead’s Lessons from the History of Semiconductors

April 4, 2021 By Jack Vaughan

Carver Mead’s Lessons from the History of Semiconductors

Scenes from tinyML 2021

March 27, 2021 By Jack Vaughan

Why tinyML?

 



Copyright © 2025 · Jack Vaughan
