Progressive Gauge

Media and Research

Computing

Progressive Podcast – Digital Twins Meet the Metaverse – With George Lawton

January 31, 2022 By Jack Vaughan

https://progressivegauge.com/wp-content/uploads/2022/01/DigitaTwins-in-Review-2021-2022.mp3

[JANUARY 2022] – When worlds collide, technologies explode. Sometimes. Such may be the case as the metaverse meets digital twins.

Different technologies depend on each other to mutually prosper. The iPhone required advances in display glass, haptics for touch input, and so on. As well, social and business trends need to align beneficially. The iPhone, to use it again as an example, tapped into thriving cellular networks, capable cloud data centers and ubiquitous e-commerce solutions. Today, when technology pundits gather, they look for a similarly striking transformation.

So, when we try to measure the potential of one technology, we had best consider the underlying advances occurring nearby. This is especially true of the metaverse.

No matter the final outcome, by renaming his company “Meta,” Mark Zuckerberg effectively pushed the term to the top of the public consciousness. But let’s be clear that underlying the metaverse are numerous technologies that rely on each other’s advances in order to prosper.

Chief among these are digital twins. These are virtual models that stand in as digital equivalents to real-world processes. Underlying the digital twin are technologies at varied stages of maturity: electronic gaming, AR and VR, AI, lidar and radar, IoT, edge computing, simulation and automation, and more.
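To give a rough flavor of the idea – and this is an illustrative sketch only, not any vendor’s actual API – a digital twin can be thought of as a live model object kept in sync with sensor readings from its physical counterpart. In the minimal Python example below, the class, field and threshold values are all invented for illustration.

```python
from dataclasses import dataclass
import time

@dataclass
class PumpTwin:
    """Minimal digital twin of a physical pump (illustrative only)."""
    asset_id: str
    rpm: float = 0.0
    temperature_c: float = 0.0
    last_update: float = 0.0

    def ingest(self, reading: dict) -> None:
        # Mirror the latest IoT sensor reading into the virtual model.
        self.rpm = reading.get("rpm", self.rpm)
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.last_update = time.time()

    def needs_maintenance(self) -> bool:
        # A toy rule standing in for a real simulation or ML model.
        return self.temperature_c > 80.0 or self.rpm > 3600.0

# Query the model's state without touching the physical pump.
twin = PumpTwin(asset_id="pump-42")
twin.ingest({"rpm": 3650.0, "temperature_c": 76.5})
print(twin.needs_maintenance())  # True: over the rpm threshold
```

The point of the pattern is that analytics, simulation and what-if questions run against the virtual model, while the physical asset keeps working.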

[Photo: Jack Vaughan]

Recently I talked with tech journalist and colleague George Lawton. Together we reviewed work he’d accomplished on the topic at VentureBeat. The mission today: to get a gauge on digital twins. It seems digital twins could well form an environment that brings together disparate technologies now percolating under the banners of IoT and the metaverse. A look at vendor strategies does nothing to dispel this notion, as Lawton indicates. Here is a sample of topics discussed in the podcast above.

[Photo: George Lawton]

“I think it’s really the owner’s manual to the Internet of Things,” Lawton tells us. Vendors are talking about Edge, which is another way of saying they are “pushing the cloud down into the real world.”

“Microsoft and Amazon and Google have pioneered workflows that automate the way you spin up these things,” he said. When you look at AI on the edge, the ‘edge node’ is often a smartphone, he noted, but every machine will take on a role as an edge node, if tech futurists are right.

“I was surprised recently, investigating what people are calling edge AI. What surprised me about that formulation is that it sort of speaks to this push from the cloud into the physical world,” said Lawton.

“Digital twins are going to be a cornerstone.” – Lawton on ESG, metaverse

Digital twins and the digital threads that connect them to workflows and each other will need time to mature. A lot of Silicon Valley’s hopes for a future metaverse will rise or fall as digital twin technology rolls out or stumbles.

“It might be a long way off before people find the best way of connecting the dots,” Lawton said. “But it’s one of the most promising aspects in the long term.”

It’s about bringing digital transformation out from just computer systems into the real world — into our cars and our factories and our buildings and our homes.

It is a promising underpinning for ESG as well, for pulling the carbon out of processes, tracking it, modeling it, and considering various trade-offs.

“Digital twins are going to be a cornerstone. It’s important to think about how to use it as a framework to extend into these different problems that we’re facing right now,” he said.


Related:
Digital Twin coverage on VentureBeat
22 digital twins trends that will shape 2022 – George Lawton December 30, 2021
Siemens and FDA demo digital twin factory line – George Lawton October 1, 2021
Nexar, Las Vegas digital twins direct traffic – George Lawton September 27, 2021
Lidar sensors cruise the metaverse – George Lawton September 13, 2021

Digital Twin coverage on IoT World Today
Precision of Digital Twin Data Models Hold Key to Success – Jack Vaughan January 4, 2021

Upon inflection

November 27, 2021 By Jack Vaughan

[Photo: Intel 80486]

[Nov 2021] – It’s no easier to identify inflection points in semiconductor history in hindsight than it is at the time they occur.

Who thinks much about the 80486 today? It was the first Intel CPU to include an on-chip floating-point unit. Before that, on PCs, difficult math problems were handed off to coprocessors. So, it was a move noted by IC market watchers. Soon, however, the more raucous controversy revolved around up-and-coming Reduced Instruction Set Computing (RISC) chips like MIPS and SPARC versus Complex Instruction Set Computing (CISC) chips like Intel’s.

Andy Grove thought about RISC, then decided the game was elsewhere, and forged ahead with 80486 successors. Grove immortalized his ‘aha’ moment in tech marketing lore as an ‘Inflection Point’ – this, in his “Only the Paranoid Survive” business autobio. RISC was more noise than signal, as he had it.

[His company fielded a RISC player in the form of the i860 – it’s not hard now to recall the excitement of my trade press colleagues when that chip was unveiled (classic trade press term, that) at ISSCC at the midtown Hilton in N.Y. in 1989.]

Andy Grove and co. eventually discerned that RISC’s advantages were not going to change the all-important PC market. The 80486 family (eventually “Pentium”) began a long run, one that continued even as server architectures, cloud architectures, big data architectures and – especially – cell phone architectures came to comprise a grand share of the IC market. Now, we are looking at AI and edge AI as a new inflection point for semiconductor schemes.

Today, the hyperscaler cloud providers call the tune. They have written the latest chapter in the book on dedicated AI processors that hold neural nets, and are also actively pursuing low-power edge AI versions of same. A mighty little Acorn in the form of the ARM chip found a home, and a standalone floating-point unit [well, sort of, arguably, for purposes of discussion], in the form of the GPU, came on the scene, followed by TPUs and EBTKS (Everything But The Kitchen Sink).

Is the edge in any way, shape or form a threat to the cloud status quo? It seems unlikely – hyperscalers don’t ask; they act.

Now, chip designers must reformulate machine learning architectures that work in the power-rich cloud data center for use in low-power operation on the edge, outpace a mix of competitors, and find willing customers.

From my perspective, that is one of several backdrops to a piece I recently wrote about AI on the edge.

The central source for “The Shape of Edge AI to Come” on VentureBeat was Marian Verhelst, a circuits and systems researcher at Katholieke Universiteit Leuven and the Imec tech hub in Belgium and, as well, an advisor to startup Axelera.AI and a member of the TinyML Foundation.

“Big companies like Google, Apple, and Facebook are all investing in making chips. The AI wave caused them to start developing chips because it makes a big difference,” Verhelst told me. Moving neural net style AI/ML from the cloud to the edge brings new requirements, which the story outlines.
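One such requirement, to give a concrete if simplified flavor, is shrinking models to fit edge power and memory budgets. The PyTorch snippet below applies dynamic INT8 quantization to a small stand-in network; it is a generic sketch of the technique, not anything Verhelst or the story prescribes, and the model itself is invented for illustration.

```python
import torch
import torch.nn as nn

# A small model standing in for a cloud-trained network.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)
model.eval()

# Dynamic quantization: weights stored as INT8, activations quantized
# on the fly. This cuts model size and speeds up CPU inference --
# the kind of trimming that edge deployment demands.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])
```

Real edge chips go much further – pruning, custom dataflows, in-memory compute – but trading numeric precision for power is the common first step.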

Basically, the startups are pursuing a space pioneered by hyperscaling cloud providers that could not wait for mainline chip makers to create new chip architectures. Now, the task is to extend such architectures to work on the edge. Hindsight will be the final judge – but it could well be an Inflection Point. – Jack Vaughan

Related
https://venturebeat.com/2021/11/24/the-shape-of-edge-ai-to-come/
https://www.intel.com/pressroom/archive/speeches/ag080998.htm Hey that’s a pretty old-style URL!
https://ashishb.net/book-summary/book-summary-only-the-paranoid-survives-by-andrew-grove/

Trimble works with Microsoft on quantum-inspired logistics

November 18, 2021 By Jack Vaughan

Azure Quantum drives data storage improvements https://t.co/ywLiqKjEbO via @VentureBeat

— Jack Vaughan (@JackIVaughan) November 16, 2021

One among the many takeaways from the COVID-19 pandemic is an awareness that industries needed to take another look at supply-chain optimization. It may not be quite up there with security, but supply-chain optimization is near the top of the charts in many of today’s IT surveys.

Logistics and optimization are oft-cited uses of quantum computing. Though it is still a trickle, there are growing indications that quantum algorithms are being applied to research in supply-chain optimization.

These are often quantum-inspired efforts, meaning they reap the benefit of research in quantum simulation but don’t require actual quantum computers to do the work.

An example comes by way of Microsoft, which is using its own quantum software to optimize storage – that is, to increase capacity and predictability – on its Azure cloud. You can read about it on VentureBeat.

Computer storage management turns out to be a useful use case for quantum-inspired algorithms, in Microsoft’s telling, and certainly the company is well versed in cloud hyperscaling. Some of this is outlined in a Microsoft blog.

But the company said its QIO optimization solvers can work in other domains, and it was joined by geospatial and logistics services mainstay Trimble in the announcement. Trimble said it is using Azure Quantum QIO to identify the best routes for vehicle fleets, “ensuring fewer trucks run empty, and maximizing operating load for each trip.” Useful trait, that.
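To give a hedged sense of what “quantum-inspired” means in practice, the sketch below solves a toy truck-loading problem with simulated annealing, a classical heuristic drawn from the same physics that informs quantum annealers. It is an illustration only – not Microsoft’s QIO solver or Trimble’s routing system – and the loads, capacities and cost weights are made up.

```python
import math
import random

# Toy problem: assign loads (by weight) to trucks so capacity
# is used evenly -- fewer near-empty trucks, fuller trips.
LOADS = [4, 8, 1, 5, 7, 3, 6, 2]   # load weights
TRUCKS = 3
CAPACITY = 14

def cost(assign):
    # Penalize overloads heavily and unused capacity lightly.
    total = 0.0
    for t in range(TRUCKS):
        used = sum(w for w, a in zip(LOADS, assign) if a == t)
        total += 10 * max(0, used - CAPACITY) + abs(CAPACITY - used)
    return total

def anneal(steps=20000, temp=5.0, cooling=0.9995):
    assign = [random.randrange(TRUCKS) for _ in LOADS]
    best, best_cost = assign[:], cost(assign)
    cur_cost = best_cost
    for _ in range(steps):
        i = random.randrange(len(LOADS))
        old = assign[i]
        assign[i] = random.randrange(TRUCKS)  # propose a move
        new_cost = cost(assign)
        # Accept improvements, and worse moves with thermal probability.
        if new_cost <= cur_cost or random.random() < math.exp((cur_cost - new_cost) / temp):
            cur_cost = new_cost
            if new_cost < best_cost:
                best, best_cost = assign[:], new_cost
        else:
            assign[i] = old  # revert the move
        temp *= cooling  # cool the system toward greedy search
    return best, best_cost

solution, c = anneal()
print(solution, c)
```

Production QIO solvers bring far more sophistication, but the shape is the same: encode the business constraint as a cost function, then let a physics-style search settle into a low-cost assignment.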
When it comes to quantum-inspired algorithms on classical computers – it will just have to do, in the words of Sammy Cahn’s immortal song, until the quantum computing thing comes along. – Jqck Vquhqn

Quantum computing moves ahead in increments

October 31, 2021 By Jack Vaughan

Q: How many quantum computer coders does it take to change a lightbulb?
A: Both none and one at the same time. (Reddit)

The battle for quantum supremacy ended in 2019. As history shows, sometimes the “mop up” takes longer than the battle. As of today, quantum computing’s journey from the laboratory has not developed a discernible pace. That makes it a great case for technology assessment.

When Google in 2019 claimed quantum supremacy – that its computer based on qubits could outperform classical machines – IBM and others challenged that notion. No surprise there.

But, at the same time, the rush of breathless announcements subsided. The quantum research army went back to lab work – that is, creating algorithms and solving problems of error correction, reproducibility, quantum mechanical entanglement and much more.

The answer to ‘how’s it going?’ is the same here as in the clogged-up paint shop in an old Detroit auto factory: “We’re working on it.”

Cryptography, materials science, logistics and operations are often cited as near-term targets. Some work is beginning using the relatively meager number of qubits at hand.

The roadmaps of the Honeywells, IBMs and Googles suggest continuing incremental steps forward over this decade, not just in science labs but also in technology applications.

The fact is that the people in the quantum business are resolved to pursue the necessary and incremental steps, and that the industries that could be affected are watching closely enough so as not to be blindsided when quantum computing sees the light of day.

Quantum computing hardware will need to advance before commercial value is achieved, but in the meantime, some new software trailing in the wake of research work may creep into commercial use ahead of that day. – Jack Vaughan

Recent story
Gartner advises tech leaders to prep for action as quantum computing spreads – VentureBeat

The Web has left the lab

July 6, 2019 By Jack Vaughan

Had an opportunity to attend a keynote led by Sir Tim Berners-Lee at the Postgres Vision 2019 conference in Boston last week. At 64, the inventor of the World Wide Web shows the same rambling demeanor as ever, and with some age he may be even more kinetic, in fact.

Time has seen his baby, the Web, dogged by controversy, a great portion of it occurring of late. Read The Web has left the lab.

 
