Progressive Gauge

Media and Research

Will cloud hyperscalers react as Edge erupts?

March 31, 2022 By Jack Vaughan

When the decentralization tsunami of client-server computing first shook the industry, the mainframers responded successfully – well, IBM did, anyway. There was some hemming and hawing, of course. But the IBM PC was a pivotal instrument of client-server’s move away from the domination of centralized, mainframe-based computing.

But a tsunami eventually hits a wall. After that, its energy reflects back toward the open ocean. When that happened – when client-server rolled over into cloud – IBM was busy promoting Watson AI. Big Blue had a heap of trouble when the elastic wave of centralization surged back, taking the name “cloud computing”.

The company cannot claim an adequate response to cloud – it bought SoftLayer; it bought Cloudant; it bought Red Hat. It still doesn’t have a cloud.

Its lunging stumbles are regularly chronicled by Charles Fitzgerald, whom I had the pleasure of speaking with for a recent story I did for VentureBeat. Fitzgerald, a Seattle-area angel investor and former platform strategist at Microsoft and VMware — as well as the proprietor of the Platformonomics blog — holds that reported CAPEX spending is the most capable discerner of a cloud company’s true chops. I second the notion – that, and the number of cloud regions.

I had reason to call on Fitzgerald for the VB article “Edge Computing Will See New Workloads”. The question was: How are the big cloud providers – often called ‘hyperscalers’ – responding to the emerging paradigm known as Edge computing?

Why ask? This could be an “IBM moment” for big cloud companies. Edge methods might gnaw away at cloud’s recently gained hegemony.

These companies know the importance of the Edge, and are responding, Fitzgerald assured me. They take different tacks, of course, but underlying their different products is a common drive to push their own cloud architecture out to the edge, he said. There’s more on VentureBeat.

In my opinion, the hyperscalers will need to keep their eyes on the Edge, and respond with paranoid energy, if they are not to fall into the ranks of low-growth heavyweights from which IBM is still trying to extricate itself. One wonders if a genuinely new approach to Edge would offer IBM an egress from low-growth limbo.

The Edge is percolating. IDC estimates worldwide spending on edge computing will grow to $176 billion in 2022, up 14.8% over 2021. The analyst firm said 69% of organizations plan to increase Edge investments in the next two years.

While researching the VB article and attending IDC Directions 2022 in Boston, I spoke with IDC’s Jennifer Cooke, research director for the group’s Edge strategies, who told me the Edge paradigm will play out differently than client-server did, if only because the workloads involved are so much more expansive. Other presenters at the event convincingly conveyed that networking will undergo great tumult at the hands of the Edge – that the future of Edge will be wireless-first; that advanced observability will be needed on the Edge; that Edge is vital because that is where the data is created and consumed. And more.

The client back in client-server days was likely a PC on a desktop – albeit sometimes hanging off a server at a post office in the Australian Outback. As Lou Reed said in possibly his most accessible song: “Those were different times.”

Do me a favor and check out “Edge Computing Will See New Workloads” – then, let me know what you think!

Shine a Light: Electro-Photonic Computing gets $4.8M IARPA boost

February 19, 2022 By Jack Vaughan

Quantum crystals – orthogonal to this story, but pretty. Source: MIT
https://progressivegauge.com/wp-content/uploads/2022/02/LightMatterInFocus4.mp3

Listen to article  |  5 min

[February 19, 2022] – Earlier this month IARPA announced $4.8 million in funding for a project led by Boston-based electro-photonic computing startup Lightmatter, joined by BU and Harvard researchers. IARPA has conceptual affinity with DARPA, the Defense Department’s long-running technology seeding lab, but in IARPA’s case the funding comes out of the Office of the Director of National Intelligence.

Lightmatter’s photonic computing efforts to date have centered on the idea of a supercomputer based on photonic-silicon hybrid chips that specialize in performing large matrix computations. Proponents see the photonic domain as a logical next step beyond electronics, as the window on Moore’s Law closes. While barriers are significant, the photonic approach arguably requires shorter leaps forward than the quantum computing techniques that have been extensively covered in the press. Photonic computing, as represented in efforts by Lightmatter and others, can reasonably be added to the basket of strange new brews being applied in the search for better AI processing today.
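
To make “large matrix computations” concrete: the heavy kernel such accelerators target is the matrix-vector product – the multiply-accumulate grind at the heart of neural-net inference. A minimal NumPy sketch of that generic kernel follows; the layer sizes are made up for illustration, and this is not Lightmatter’s API.

    # A minimal sketch of the workload such accelerators target: the
    # matrix-vector product at the core of neural-net inference. Layer sizes
    # are made up for illustration; this is generic NumPy, not Lightmatter's API.
    import numpy as np

    in_features, out_features = 1024, 4096              # illustrative layer size
    W = np.random.randn(out_features, in_features)      # layer weights
    x = np.random.randn(in_features)                    # one input activation vector

    y = W @ x  # one dense layer = out_features * in_features multiply-accumulates

    macs = out_features * in_features
    print(f"multiply-accumulate ops in this single layer: {macs:,}")  # ~4.2 million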

While optical data communications are well established – think “the Internet” – it’s early going for computing based on light waves. Lightmatter is one of the few that has created photonic arrays capable of computation, matched with layers of digital and analog electronics that handle much of the activity auxiliary to that computation.

A prototype board based on its chip was shown last year, as was a box with six such chips and boards. So any supercomputer remains in the future.

Note that IARPA’s interest here is at the other end of the computing spectrum — farther from the data center, and closer to the edge. Among the obstacles Edge AI faces is power consumption, and today’s approaches to AI tend to use a lot of it.

Analog processing that calculates in memory has been put forward as one way to avoid the costly (in power terms) movement of data between processor and memory. The Lightmatter crew and others take a different tack. They see the optical domain as another way to save trips to memory for the multiply-accumulate work that marks today’s neural nets for AI. Lightmatter is not alone in the quest.
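
Why those trips to memory matter: in a naive schedule a weight is fetched from memory every time it is used, while in-memory analog arrays – or weights encoded into an optical mesh – load each weight once and reuse it in place. A back-of-envelope sketch, with made-up layer sizes and deliberately simplified schedules:

    # Back-of-envelope look at why trips to memory dominate: bytes of weight
    # movement for one dense layer under two schedules. Sizes are illustrative
    # and the schedules are deliberately simplified.
    in_features, out_features, bytes_per_weight = 1024, 4096, 2   # fp16 weights
    num_weights = out_features * in_features

    batch = 1000  # say, 1,000 inferences
    # Naive schedule: every weight is re-fetched from memory for each inference.
    naive_traffic = num_weights * bytes_per_weight * batch
    # Weight-stationary schedule (in-memory analog array, or weights encoded in
    # an optical mesh): each weight is loaded once, then reused in place.
    stationary_traffic = num_weights * bytes_per_weight

    print(f"naive:      {naive_traffic / 1e9:.1f} GB of weight movement")
    print(f"stationary: {stationary_traffic / 1e6:.1f} MB of weight movement")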

Lightmatter is led by Nicholas Harris, a former MIT researcher whose interest in how computers work pans right down to the atomic scale – an interest that has led him to explore ways of coupling deep learning and electro-photonic computing. His experience is by no means solely in the academic lab: he served for a time as an R&D lead at Micron Technology in the high country of Idaho, where he hails from. Lightmatter has raised more than $113 million from investors including Google Ventures and Spark Capital.

Edge AI-based silicon photonics is a bridge afar. Early going means basic chip architecture and processes need to be worked out before reliability, scalability and cost can be determined. There are still multiple approaches being taken to building quite basic functionality in as on-chip components. For example, how do you assure that you are effectively shuttling light on chip? For its part, Lightmatter uses a MEMS-like structure it calls NOEMS, for “nano-opto-electro-mechanical,” based on phase shifters that operate at CMOS-compatible voltages. In effect, this method guides a wave with the force of electrostatic charge, I think.
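
For a flavor of how a phase shift becomes arithmetic, here is the textbook picture – one common convention, not necessarily Lightmatter’s exact circuit: a phase shifter sits inside a Mach-Zehnder interferometer, and the interferometer’s 2x2 transfer matrix, the thing that actually multiplies the optical amplitudes, is set by that phase. Meshes of such 2x2 blocks compose the larger matrices used for AI math.

    # Textbook sketch (one common convention, not necessarily Lightmatter's
    # exact circuit): a Mach-Zehnder interferometer is a 50:50 coupler, a
    # phase shifter theta in one arm, then another 50:50 coupler. Its 2x2
    # transfer matrix -- the thing that multiplies the optical amplitudes --
    # is set entirely by theta, which is what the phase shifter adjusts.
    import numpy as np

    def mzi(theta):
        bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # ideal 50:50 coupler
        phase = np.diag([np.exp(1j * theta), 1.0])      # phase shift in one arm
        return bs @ phase @ bs

    T = mzi(theta=0.7)                                  # arbitrary phase setting
    print(np.round(T, 3))
    print(np.allclose(T @ T.conj().T, np.eye(2)))       # unitary (lossless): True
    # Meshes of these 2x2 blocks compose the larger matrices used for AI math.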

Cost is not an issue for IARPA, which is likely looking to improve the reach and ability of surveillance drones. But there are a lot of ramifications to this work. If the present project bears fruit, it could further photonic computation in general terms, not just for AI apps. Moreover, what furthers photonic computation could also help spur quantum computing advancement.

A 2021 funding round saw Lightmatter gain backing from no less than Viking Global Investors, which led the round, with participation from GV (formerly Google Ventures), Hewlett Packard Enterprise (HPE), Lockheed Martin, Matrix Partners, SIP Global Partners, Spark Capital, and others. – Jack Vaughan

IARPA project press release – Businesswire
Company site – Lightmatter
Citations/Nick Harris – Google Scholar
Useful silicon photonics backgrounder – Ghent U.
Sally Ward-Foxton Lightmatter coverage from HotChips 2021 – EETimes

Progressive Podcast – Digital Twins Meet the Metaverse – With George Lawton

January 31, 2022 By Jack Vaughan

https://progressivegauge.com/wp-content/uploads/2022/01/DigitaTwins-in-Review-2021-2022.mp3

[JANUARY 2022] – When worlds collide, technologies explode. Sometimes. Such may be the case as the metaverse meets digital twins.

Different technologies depend on each other to mutually prosper. The iPhone required advances in display glass, haptics for touch input, and so on. As well, social and business trends need to align beneficially. The iPhone, used again as an example, tapped into thriving cellular networks, capable cloud data centers and ubiquitous e-commerce solutions. Today, when technology pundits gather, they look for a similarly striking transformation.

So, when we try to measure the potential of one technology, we best consider the underlying advances occurring nearby. This is especially true of the metaverse.

No matter the final outcome, by renaming his company “Meta,” Mark Zuckerberg effectively pushed the term into public consciousness. But let’s be clear that underlying the metaverse are numerous technologies that rely on each other’s advances in order to prosper.

Chief among these are digital twins. These are virtual models that stand in as digital equivalents to real-world processes. Underlying the digital twin are technologies at varied stages of maturity: electronic gaming, AR and VR, AI, lidar and radar, IoT, Edge Computing, simulation and automation, and more.
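
To make the idea concrete, here is a minimal sketch of what a digital twin boils down to in code: a model object kept in sync with telemetry from its physical counterpart, plus a bit of simulation for asking “what if” questions. The class name, fields and toy update rule are all invented for illustration; no vendor’s API is implied.

    # Minimal illustration of the digital twin idea: a virtual model kept in
    # sync with telemetry from its physical counterpart, which can also be run
    # forward to ask "what if" questions. The class, fields and toy physics
    # are invented for illustration; no vendor's API is implied.
    from dataclasses import dataclass

    @dataclass
    class PumpTwin:
        rpm: float = 0.0
        temperature_c: float = 20.0

        def ingest(self, reading: dict) -> None:
            """Sync the model from an IoT telemetry message."""
            self.rpm = reading.get("rpm", self.rpm)
            self.temperature_c = reading.get("temperature_c", self.temperature_c)

        def simulate(self, minutes: float, rpm: float) -> float:
            """Toy what-if: projected temperature after running at rpm for minutes."""
            return self.temperature_c + 0.001 * rpm * minutes

    twin = PumpTwin()
    twin.ingest({"rpm": 1800, "temperature_c": 41.5})  # message from the edge
    print(twin.simulate(minutes=30, rpm=2400))         # projected temperature: 113.5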

Vaughan

Recently I talked with tech journalist and colleague George Lawton. Together we reviewed the coverage he’d accomplished on the topic at VentureBeat. The mission today: to get a gauge on digital twins. It seems digital twins could well form an environment that brings together disparate technologies now percolating under the banners of IoT and metaverse. A look at vendor strategies doesn’t dispel this notion, as Lawton indicates. Here is a sample of topics discussed in the podcast above.

Lawton

“I think it’s really the owner’s manual to the Internet of Things,” Lawton tells us. Vendors are talking about Edge, which is another way of saying they are “pushing the cloud down into the real world.”

“Microsoft and Amazon and Google have pioneered workflows that automate the way you spin up these things,” he said. When you look at AI on the Edge, the ‘edge node’ is often a smartphone, he noted, but every machine will take on a role as an Edge node, if tech futurists are right.

“I was surprised recently investigating what people are calling edge AI. What surprised me about that formulation is that it sort of speaks to this push from the cloud into the physical world,” said Lawton.

“Digital twins are going to be a cornerstone.” – Lawton on ESG, metaverse

Digital twins and the digital threads that connect them to workflows and each other will need time to mature. A lot of Silicon Valley’s hopes for a future metaverse will rise or fall as digital twin technology rolls out or stumbles.

“It might be a long way off before people find the best way of connecting the dots,” Lawton said. “But it’s one of the most promising aspects in the long term.”

It’s about bringing digital transformation out from just computer systems into the real world — into our cars and our factories and our buildings and our homes.

It is a promising underpinning for ESG as well, for pulling the carbon out of processes, tracking it, modeling it, and considering various trade-offs.

“Digital twins are going to be a cornerstone. It’s important to think about how to use it as a framework to extend into these different problems that we’re facing right now,” he said.


Related:
Digital Twin coverage on VentureBeat
22 digital twins trends that will shape 2022 – George Lawton December 30, 2021
Siemens and FDA demo digital twin factory line – George Lawton October 1, 2021
Nexar, Las Vegas digital twins direct traffic – George Lawton September 27, 2021
Lidar sensors cruise the metaverse – George Lawton September 13, 2021

Digital Twin coverage on IoT World Today
Precision of Digital Twin Data Models Hold Key to Success – Jack Vaughan January 4, 2021

Upon inflection

November 27, 2021 By Jack Vaughan

Intel 80486

[Nov 2021] – It’s no easier to identify inflection points in semiconductor history in hindsight than it is at the time they occur.

Who thinks much about the 80486 today? It was the first Intel CPU to include an on-chip floating-point unit. On PCs before that, difficult math problems were handled by coprocessors. So it was a move noted by IC market watchers. Soon, however, the more raucous controversy revolved around up-and-coming Reduced Instruction Set Computing (RISC) chips like MIPS and SPARC versus Complex Instruction Set Computing (CISC) chips like Intel’s.

Andy Grove thought about RISC, then decided the game was elsewhere, and forged ahead with 80486 knockoffs. Grove immortalized his ‘aha’ moment in tech marketing lore as an ‘Inflection Point’ – this, in his “Only the Paranoid Survive” business autobio. RISC was more noise than signal, as he had it.

[His company fielded a RISC player in the form of the i860 – it’s not hard now to recall the excitement of my trade press colleagues when that chip was unveiled (classic trade press term, that) at ISSCC at the midtown Hilton in N.Y. in 1986.]

Andy Grove and co. eventually discerned that RISC advantages were not going to change the all-important PC market. The 80486 (eventually “Pentium”) family began a long run. That run held up even as server architectures, cloud architectures, big data architectures and – especially – cell phone architectures came to comprise a grand share of the IC market. Now we are looking at AI and Edge AI as a new inflection point for semiconductor schemes.

Today, the hyperscaler cloud providers call the tune. They have written the latest chapter in the book on dedicated AI processors that hold neural nets, and are also actively pursuing low-power Edge AI versions of same. A mighty little Acorn in the form of the ARM chip found a home, and a standalone floating-point unit [well, sort of, arguably, for purposes of discussion], in the form of the GPU, came on the scene, followed by TPUs and EBTKS (Everything But The Kitchen Sink).

Is Edge in any way, shape or form a threat to the cloud status quo? It seems unlikely – hyperscalers don’t ask; they act.

Now, chip designers must reformulate machine learning architectures that work in the power-rich cloud data center for use in low-power operation on the edge, outpace a mix of competitors, and find willing customers.
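
One concrete flavor of that reformulation – offered as an illustrative example, not as the specific technique of any company mentioned here – is post-training quantization: shrinking the 32-bit floating-point weights a model was trained with in the data center down to 8-bit integers for a power- and memory-constrained edge device. A minimal NumPy sketch:

    # Illustrative example of one cloud-to-edge reformulation: post-training
    # quantization of float32 weights to int8. Not a claim about any specific
    # vendor's method; the weights here are random stand-ins.
    import numpy as np

    w_fp32 = np.random.randn(4, 4).astype(np.float32)   # "trained" weights

    scale = np.abs(w_fp32).max() / 127.0                 # symmetric per-tensor scale
    w_int8 = np.clip(np.round(w_fp32 / scale), -127, 127).astype(np.int8)

    w_back = w_int8.astype(np.float32) * scale           # dequantized approximation
    print("max abs rounding error:", np.abs(w_fp32 - w_back).max())
    # Payoff: 4x less weight memory, and int8 multiply-accumulates cost far
    # less power than float32 -- the currency that matters on the edge.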

From my perspective, that is one of several backgrounds to a piece I recently wrote about AI on the Edge.

The central source for “The Shape of Edge AI to Come” on VentureBeat was Marian Verhelst, a circuits and systems researcher at Katholieke Universiteit Leuven and the Imec tech hub in Belgium, as well as an advisor to startup Axelera.AI and a member of the tinyML Foundation.

“Big companies like Google, Apple, and Facebook are all investing in making chips. The AI wave caused them to start developing chips because it makes a big difference,” Verhelst told me. Moving neural net style AI/ML from the cloud to the edge brings new requirements, which the story outlines.

Basically, the startups pursue a space pioneered by hyperscaling cloud providers that could not wait for mainline chip makers to create new chip architectures. Now the task is to extend such architectures to work on the edge. Hindsight will be the final judge – but it could well be an Inflection Point. – Jack Vaughan

Related
https://venturebeat.com/2021/11/24/the-shape-of-edge-ai-to-come/
https://www.intel.com/pressroom/archive/speeches/ag080998.htm Hey that’s a pretty old-style URL!
https://ashishb.net/book-summary/book-summary-only-the-paranoid-survives-by-andrew-grove/

Trimble works with Microsoft on quantum-inspired logistics

November 18, 2021 By Jack Vaughan

Azure Quantum drives data storage improvements https://t.co/ywLiqKjEbO via @VentureBeat

— Jack Vaughan (@JackIVaughan) November 16, 2021

One among the many takeaways from the COVID-19 pandemic is an awareness that industries needed to take another look at supply-chain optimization. It may not be quite up there with security, but supply-chain optimization is near the top of the charts in many of today’s IT surveys.

Logistics and optimization are oft-cited uses of quantum computing. Though it is still a trickle, there are growing indications quantum algorithms are being applied to research in supply-chain optimization.

These are often quantum-inspired efforts, meaning they reap the benefit of research in quantum simulation, but don’t require actual quantum computers to do the work.

An example comes by way of Microsoft, which is using its own quantum software to optimize storage – that is, to increase capacity and predictability – on its Azure cloud. You can read about it on VentureBeat.

Computer storage management turns out to be a useful use case for quantum-inspired algorithms, in Microsoft’s telling, and the company is certainly well versed in cloud hyperscaling. Some of this is outlined in a Microsoft blog.

But the company said its QIO optimization solvers can work in other domains, and it was joined by geospatial and logistics services mainstay Trimble in the announcement. Trimble said it is using Azure Quantum QIO to identify the best routes for vehicle fleets, “ensuring fewer trucks run empty, and maximizing operating load for each trip.” Useful trait, that.
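
For a feel of what “quantum-inspired” means in practice, here is a generic sketch – not Microsoft’s Azure Quantum QIO API, and the fleet scenario is invented – in which a toy load-balancing problem is written as a QUBO and handed to classical simulated annealing:

    # A generic sketch of what "quantum-inspired" usually means in practice:
    # write the problem as a QUBO (quadratic unconstrained binary optimization)
    # and hand it to a classical heuristic, here simulated annealing. This is
    # not Microsoft's Azure Quantum QIO API -- just the standard formulation,
    # applied to a toy problem: split package weights across two trucks as
    # evenly as possible so neither runs near-empty.
    import numpy as np

    rng = np.random.default_rng(0)
    weights = np.array([9.0, 7.0, 5.0, 4.0, 3.0, 2.0])  # made-up package weights
    n, total = len(weights), weights.sum()

    # QUBO for balancing: minimize (sum_i w_i * (2*x_i - 1))**2. Expanding it
    # (and using x_i**2 == x_i for binary x) gives the Q matrix below.
    Q = 4 * np.outer(weights, weights)
    Q[np.diag_indices(n)] = 4 * weights**2 - 4 * total * weights

    def energy(x):
        return x @ Q @ x

    x = rng.integers(0, 2, n)            # random initial assignment (1 = truck A)
    for step in range(2000):             # simulated annealing with bit flips
        temp = max(0.1, 50.0 * (1 - step / 2000))
        cand = x.copy()
        cand[rng.integers(n)] ^= 1       # move one package to the other truck
        delta = energy(cand) - energy(x)
        if delta < 0 or rng.random() < np.exp(-delta / temp):
            x = cand

    truck_a = weights[x == 1].sum()
    print("truck A load:", truck_a, "truck B load:", total - truck_a)  # ideally 15 / 15
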
When it comes to quantum-inspired algorithms on classic computers – it will just have to do, in the words of Sammy Cahn’s immortal song, until the quantum computing thing comes along. – Jqck Vquhqn