Progressive Gauge

Media and Research

Reporter’s Notebook – At MIT Tech Review Future Compute 2023: Navigating the straits of semis

May 9, 2023 By Jack Vaughan

When the US last year announced new export rules on advanced chips, the role of semiconductors in modern foreign affairs reached a new zenith. The chips have assumed the stature of oil in today’s geopolitics, and depriving China of advanced chips now appears to be a strategic objective.

Unease has only grown with the appearance of ChatGPT, the AI large language model that is a chip-hungry, power-guzzling presence ready to take over the world, to hear networks of experts and Cassandras tell it. Just as unsettling are Chinese maneuvers around Taiwan, a crucial center of global chip production.

Such activity formed a partial backdrop for the MIT Technology Review’s recent Future Compute 2023 conference at the Cambridge, Mass., campus. Semiconductor issues were probed in a Q&A session featuring Chris Miller, Tufts University lecturer and author.

Miller said the semiconductor has taken on an outsized role in strategizing on China, and that the focus now is on both economics and defense.

“China spends as much money importing chips each year as importing oil,” he said. “You can’t understand the structure of the world economy without putting semiconductors at the center of your analysis.”

That is increasingly true of economic issues, Miller continued, and of defense as well. Semiconductors that drive computers and embedded systems are top of mind when defense ministries and intelligence agencies think about future procurements.

“What they know is that over the past half century one of the key forces that’s transformed the way militaries fight has been computing power,” according to Miller, who traced the developments leading to the present predicament in “Chip War: The Fight for the World’s Most Critical Technology,” a noteworthy recent look at semiconductor industry history and its ever-shifting role in the larger body politic, and a Financial Times Book of the Year for 2022.

“Chip War” is described by a New York Times reviewer as something of a nonfiction thriller in which ‘pocket-protector men’ at Fairchild Semiconductor and Intel tamed the raw transistor, fashioned the integrated circuit, outdid the Soviet Union, and left a war-weary Europe in the dust as they formed what’s now Silicon Valley. Many of those developments bear review as governments and companies take on present complexities.

The complexities extend to seemingly more modest products than high-end processors, Miller indicated. Simpler chips that complement the hot processors are growing in importance as well.

“The entire electronics supply chain is actually beginning to shift. It’s not only at the chip level, it’s also electronics assembly and simpler components,” Miller said, adding that a reduction in China’s level of server assembly has led to a major increase in Mexico’s market share in that field.

He also pointed to the emergence of new market dynamics as large companies take on the design of their own chips, a trend that could spread to a wider range of companies as US CHIPS Act R&D funding addresses the need for less expensive chip design processes.

A qubit for your thoughts

Infant quantum computing looms as an adjacent technology where geopolitical ambitions may play out.

China, the US, the EU, and countries such as Australia, Singapore, and Canada now devote research monies to such quantum efforts. They stir this new ground at the same time they test the limits of Moore’s Law – the perceived dead end for further large-scale silicon chip integration, which Tufts’ Miller cites as a fundamental challenge facing the chip industry.
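
For readers who want the rule of thumb spelled out: Moore’s Law is usually stated as transistor counts doubling roughly every two years. A minimal sketch of that compounding follows – the function name and the 10-billion-transistor starting point are illustrative, not figures from the article:

```python
def projected_transistors(base_count: float, base_year: int, year: int,
                          doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward from a base year, assuming the
    canonical Moore's Law doubling roughly every two years."""
    return base_count * 2 ** ((year - base_year) / doubling_period_years)

# Example: a hypothetical 10-billion-transistor chip in 2020 would be expected
# to reach roughly 40 billion transistors by 2024 if the doubling held --
# exactly the kind of further integration now seen as nearing a dead end.
print(f"{projected_transistors(10e9, 2020, 2024):,.0f}")  # ~40,000,000,000
```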

However, quantum technology is still raw: the quantum researchers, in the main, are still found toiling at the qubit level with lab rigs and signal scopes – the quantum equivalent of the lone-transistor work that preceded development of the integrated circuit.

A high point of the Future Compute 2023 agenda for me was a visit to the labs of MIT’s Engineering Quantum Systems group. Smart people are working hard on this frontier technology. And, with notable exceptions, there is knowledge sharing going on.

But, in a conference panel on quantum at the event, the impression emerged that a large-scale working quantum computer would have to arrive before international competition in the field reached the less sanguine stage that advanced CPU, GPU, NPU and network-processing chips now experience.

For his part, at Future Compute, Chris Miller hesitated somewhat in responding to an audience question on quantum computing.

“I struggle to say anything that intelligent on quantum computing, both because I’m really not an expert in computing, but also because there’s a chip industry that I can study and I know how to talk about, whereas quantum computing is still a prospective industry,” he said. “We all hope it will materialize but it hasn’t materialized in a practical form.”

My take

Global chip wars must be viewed in the context of a real war underway in Ukraine. It has exposed the pivotal role of new technology in the exercise of war, as well as the vulnerability of the supply chains that feed modern commerce. It’s also pushed diplomacy to the sidelines, narrowing the opportunity for maneuver in the semiconductor straits.

Will cloud hyperscalers react as Edge erupts?

March 31, 2022 By Jack Vaughan

When the decentralization tsunami of client-server computing first hit, the mainframers responded successfully – well, IBM did, anyway. With some hemming and hawing, of course. But the IBM PC was a pivotal instrument of client-server’s move away from the domination of centralized mainframe-based computing.

But a tsunami finally hits a wall, and its energy reflects back toward the open ocean. When that happened – when client-server rolled over to cloud – IBM was busy promoting Watson AI. Big Blue had a heap of trouble when the elastic wave of centralization surged backwards, taking the name “cloud computing.”

The company cannot claim an adequate response to cloud – it bought SoftLayer; it bought Cloudant; it bought Red Hat. It still doesn’t have a cloud.

Its lunging stumbles are regularly chronicled by Charles Fitzgerald, whom I had the pleasure of speaking with for a recent story I did for VentureBeat. Fitzgerald, a Seattle-area angel investor and former platform strategist at Microsoft and VMware – as well as the proprietor of the Platformonomics blog – holds that reported CAPEX spending is a most capable discerner of a cloud company’s true chops. I second the notion – that, and the number of cloud regions.
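
To make that yardstick concrete, here is a minimal sketch of the idea – rank providers by reported annual CAPEX, with cloud-region count as a tiebreaker. The provider names and figures are placeholders, not reported data:

```python
# Placeholder figures only -- substitute real reported CAPEX and region counts.
providers = [
    {"name": "Hyperscaler A", "capex_usd_b": 30.0, "regions": 30},
    {"name": "Hyperscaler B", "capex_usd_b": 25.0, "regions": 60},
    {"name": "Legacy vendor", "capex_usd_b": 3.0,  "regions": 10},
]

# Sort by CAPEX first (Fitzgerald's yardstick), region count second.
for p in sorted(providers, key=lambda p: (p["capex_usd_b"], p["regions"]),
                reverse=True):
    print(f'{p["name"]}: ${p["capex_usd_b"]:.1f}B CAPEX, {p["regions"]} regions')
```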

I had reason to call on Fitzgerald for the VB article “Edge Computing Will See New Workloads”. The question was: How are the big cloud providers – often called ‘hyperscalers’ – responding to the emerging paradigm known as Edge computing?

Why ask? This could be an “IBM moment” for the big cloud companies. Edge methods might gnaw away at cloud’s recently gained hegemony.

These companies know the importance of the Edge, and are responding, Fitzgerald assured me. They take different tacks of course, but underlying their different products is a common drive to push their own cloud architecture out to the edge, he said. There’s more on VentureBeat.

In my opinion, the hyperscalers will need to keep their eyes on the Edge, and respond with paranoid energy, if they are not to fall into the ranks of the low-growth heavyweights from which IBM is still trying to extricate itself. One wonders if a genuinely new approach to Edge would offer IBM an egress from low-growth limbo.

The Edge is percolating. IDC estimates worldwide spending on edge computing will grow to $176 billion in 2022 – up 14.8% over 2021 – and the analyst firm says 69% of organizations plan to increase Edge investments in the next two years.

As I researched the VB article, and attended IDC Directions 2022 in Boston, IDC’s Jennifer Cooke, research director for the group’s Edge strategies, told me the Edge paradigm will play out differently than client-server did in the past, if only because the workloads involved are so much more expansive. Other presenters at the event convincingly conveyed that networking will undergo great tumult at the hands of the Edge – that the future of Edge will be wireless-first, that advanced observability will be needed on the Edge, and that Edge is vital because that is where the data is created and consumed. And more.
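
For what it’s worth, those IDC figures imply a 2021 baseline of roughly $153 billion. A quick back-of-the-envelope check – the baseline is derived from the quoted numbers, not an IDC figure:

```python
# IDC figures quoted above: $176B forecast for 2022, up 14.8% over 2021.
spend_2022_usd_b = 176.0
growth_rate = 0.148

# Implied 2021 baseline, derived by dividing out the growth rate.
spend_2021_usd_b = spend_2022_usd_b / (1 + growth_rate)
print(f"Implied 2021 edge spending: ${spend_2021_usd_b:.0f}B")  # ~$153B
```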

The client back in client-server days was likely a PC on a desktop – albeit, sometimes hanging off a server at a post office in the Australian Outback. As Lou Reed said in possibly his most accessible song: “Those were different times.” 

Do me a favor and check out “Edge Computing Will See New Workloads” – then, let me know what you think!

Lesson of Cassandra in the desert: “First, model”

April 18, 2021 By Jack Vaughan

Cassandra lessons from desert deployment

Mead’s Lessons from the History of Semiconductors

April 4, 2021 By Jack Vaughan

Carver Mead’s Lessons from the History of Semiconductors

Scenes from tinyML 2021

March 27, 2021 By Jack Vaughan

Why tinyML?

 


Copyright © 2023 · Jack Vaughan