Progressive Gauge

Media and Research


Gauge Taking – Random notes on Claude Shannon

May 31, 2022 By Jack Vaughan

From Shannon’s Relays paper

[May 31, 2022] – I recently watched a documentary on the life of Claude Shannon. “The Bit Player” is a worthwhile piece on the originator of information theory. It prompted me to retrieve an obituary/appreciation I wrote on his passing in February 2001. Here I present part of the appreciation.

Shannon was born in Petoskey, Mich., and grew up in Gaylord, Mich. He worked as a messenger for Western Union while in Gaylord High School, and attended college at MIT, where he was a member of Tau Beta Pi.

His paper, “A Symbolic Analysis of Relay and Switching Circuits,” which led to a long association with Bell Laboratories, laid out Shannon’s theories on the relationship between symbolic logic and relay circuits.

While at Bell Labs, Shannon wrote the landmark “A Mathematical Theory of Communication.” The information content of a message, he theorized, consists simply of the number of 1’s and 0’s it takes to transmit it. In a real sense, Shannon conceived of the “bit” that is now so widely used to represent data.
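Shannon made that idea precise: a source whose symbols appear with probabilities p_i carries, on average, H = -sum(p_i log2 p_i) bits per symbol. A minimal Python sketch of that formula (my illustration, not something from the original appreciation):

import math

def entropy_bits(probs):
    # Shannon entropy: H = -sum(p * log2(p)) over symbol probabilities p.
    # This is the average number of bits needed to encode one symbol.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0 bit per flip: a fair coin
print(entropy_bits([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable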

Shannon’s work led to many inventions used by both technology developers and end users. His theories can truly be described as pervasive today.

When I was young, Shannon’s work was a tough nut to crack, but it certainly was intriguing. As a high school boy, I was interested in the future — maybe more so than now, when I live and breathe and work in what that future became. Grappling with Shannon’s basic information theories was part of my education about the future.

Growing up in a Wisconsin city across the lake from Shannon’s birthplace, I tried to plow through the town library as best I could. I wanted to learn about computers, automation, and the combination of the two that was known in those days (the 1960s) as cybermation.* I discovered for myself — by chance, really — that the fundamental elements of those ideas were Shannon’s inventions.

Much of his greatest work revolved around defining information in relation to “noise,” a phenomenon quite familiar to anyone who ever tried desperately to home in on radio signals before digital communication filters came into being. I came to appreciate that aspect of Shannon’s work later on when, as a journalist, I had the opportunity to learn and write about digital signal processing.

Day and night, data, messages, music, and more swirl around us — all made possible to some extent by the idea of communicating electronically in 1’s and 0’s. It is something to think that a Western Union messenger could have conceived of this new world.

~~~~~~~~~~~~~~~~~

Postscript: Of course, Shannon did not envisage or invent PacMan or Puff Daddy MP3s – though the documentary shows him to be a great fan of shellac jazz. He imagined the river for others to sail. This conjures the words of Herbert Kroemer, inventor of heterostructures that changed the evolution of ICs and commercial optics. “A pendulum … goes back and forth from science to applications. Science creates applications … [they] stimulate new science – it’s not one way or the other,” Kroemer said in a Nobelist interview.

Now read the rest of the story – Digital pioneer Claude Shannon dead at 84 – Computerworld – Feb 28, 2001

~~~~~~~~~~~~~~~~~

* I think we meant ‘cybernetics’.

Benedict Evans on Steps to the Future

April 16, 2022 By Jack Vaughan

[April 16, 2022] – Had the great good pleasure to sit in on a closing keynote at IDC Directions last month. It was led by Benedict Evans. Primarily he discussed Web 3 and The Metaverse – two areas that now vie for attention in the daily cauldron called high technology.

Who is Benedict Evans? A very witty London-based independent analyst who worked with a16z, NBC Universal and others. He has a thoughtful blog comprising essays, and a nifty email newsletter. And his hand is firmly on the pulse of The Next Big Thing.

There are always new upswelling technologies, shifting about like tectonic plates, as Evans’ presentation makes clear. Some will sputter, some will morph, and some will succeed — though few will thrive outright in the way that the PC and the smartphone thrived.

Evans outlines the major differences between the two Next Big Things presently vying: Web 3 and the Metaverse. The former is a new form of data and compute distribution heavily reliant on self-governing subnetworks based on blockchain. Its potential is not as a mere cryptocurrency exchange but as a wide-spanning platform.

The blockchain underpinning is key to Web 3.0, which, said Evans, is “a vision of an open source distributed virtualized computing system in which anybody can contribute, anybody can connect and all of the participants have some share of governance.”
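For readers new to that underpinning: a blockchain is, at bottom, a hash-linked ledger, so no single participant can quietly rewrite history. A toy sketch in Python (illustrative only, nothing like a production chain):

import hashlib
import json

def make_block(data, prev_hash):
    # Each block commits to its predecessor via prev_hash, so altering
    # any earlier block changes every hash that follows it.
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block("genesis", "0" * 64)
block1 = make_block({"from": "alice", "to": "bob", "amount": 5}, genesis["hash"])
print(block1["hash"][:16])  # any tampering upstream changes this digest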

Meanwhile the latter Next Big Thing candidate – the metaverse — is a way of saying that “VR isn’t just about goggles to play games, or do remote surgery … instead, this might be the next universal platform after smartphones.”

Evans points out these are new morphs of former technologies … or as he puts it:

Both of these in a sense are terms that rebrand, reconceptualize, or reimagine some ideas that have been around for a while.

Thus, the two platforms bring together individual techs a’ percolating, at least in the VC community’s dream.

The Metaverse thesis is the simpler of the two, he said:

VR and AR become the next universal device after smartphones. They break out of games — they break out far beyond games.

Yet, with the Metaverse, Evans said, there is a hurdle it may not vault: electronic gaming remains a small market in comparison to the markets for PCs or smartphones. I suppose a new generation raised on these things may expand that market, and thus expand the potential of the Metaverse. Then too, phenomena like Roblox could combine aspects of blockchain-imbued Web 3 community and Metaversian components.

The point he raises about the Metaverse’s future – which I’d amend to include Web 3, or What Have You – is that the question is not so much whether the technology gets better, but,

If it does, who will care? And, will this break out and become a universal product?

There is no future except in use cases, as the saying goes.

My reflection on Evans’ treatise holds that, for today’s exciting assortment of new technologies (IoT, digital twins, ML, Edge, blockchain, quantum computing) to glom into something new and take hold, good large-scale use cases will need to be imagined, created, and promoted.

My experience speaking with chip builders focused on ML at the IoT edge tells me that they are working on this. Smart speakers seem closest to scaling – FitBits less so. Digital twin makers are likewise looking for large markets, but highly vertical uses seem to diffuse their chances there; they are less likely to find a mass consumer market swath, though it’s not impossible.

*****

Besides being the first chance to get out of the house and to a conference covering Edge Computing, IDC Directions 2022 in Boston’s Seaport District was also a nice return to something I’ve always enjoyed: the keynote on the future. Huzzah to Evans and IDC! Such Think Pieces were once de rigueur for some events, but less so now that everyone is in such a damn hurry. Evans’ critique was compelling, and I think his presentation usefully crystallizes constructive modes of technology analysis. Truly thought leading.

RELATED
Ben Evans Site and Presentation https://www.ben-evans.com/presentations
IDC Directions Site and Video Replay https://www.idc.com/events/directions/proceedings
Progressive Gauge Podcast Metaverse Ruminations https://progressivegauge.com/2022/01/31/progressive-podcast-digital-twins-meet-the-metaverse/

 

Will cloud hyperscalers react as Edge erupts?

March 31, 2022 By Jack Vaughan

When the decentralization tsunami of client-server computing first shook things, the mainframers responded successfully – well, IBM did, anyway. There was some hemming and hawing, of course. But the IBM PC was a pivotal instrument of client-server’s move away from the domination of centralized mainframe-based computing.

But a tsunami finally hits a wall. After that, its energy reflects back toward the open ocean. When that happened – when client-server rolled over to cloud – IBM was busy promoting Watson AI. Big Blue had a heap of trouble when the elastic wave of centralization surged backwards, taking the name “cloud computing.”

The company cannot claim an adequate response to cloud – it bought SoftLayer; it bought Cloudant; it bought Red Hat. It still doesn’t have a cloud.

Its lunging stumbles are regularly chronicled by Charles Fitzgerald, whom I had the good pleasure to speak with for a recent story I did for VentureBeat. Fitzgerald, a Seattle-area angel investor and former platform strategist at Microsoft and VMware – as well as the proprietor of the Platformonomics blog – holds to the notion that reported CAPEX spending is a most capable discerner of a cloud company’s true chops. I second the notion – that, and the number of cloud regions.

I had reason to call on Fitzgerald for the VB article “Edge Computing Will See New Workloads”. The question was: How are the big cloud providers – often called ‘hyperscalers’ – responding to the emerging paradigm known as Edge computing?

Why ask? This could be an “IBM moment” for big cloud companies. Edge methods might gnaw away at cloud’s recently gained hegemony.

These companies know the importance of the Edge, and they are responding, Fitzgerald assured me. They take different tacks, of course, but underlying their different products is a common drive to push their own cloud architecture out to the edge, he said. There’s more on VentureBeat.

In my opinion, the hyperscalers will need to keep their eyes on the Edge, and respond with paranoid energy, if they are not to fall into the ranks of low-growth heavyweights from which IBM is still trying to extricate itself. One wonders if a genuinely new approach to Edge would offer IBM an egress from low-growth limbo.

The Edge is percolating. IDC estimates worldwide spending on edge computing will grow to $176 billion in 2022. That’s up by 14.8% over 2021. The analyst firm said 69% of organizations plan to increase Edge investments in the next two years.

As I researched the VB article, and attended IDC Directions 2022 in Boston, IDC’s Jennifer Cooke, research director for the group’s Edge strategies, told me the Edge paradigm will play out differently than client-server did in the past, if only because the workloads involved are so much more expansive. Other presenters at the event convincingly conveyed that networking will undergo great tumult at the hands of the Edge – that the future of Edge will be wireless-first; that advanced observability will be needed on the Edge; that Edge is vital because that is where the data is created and consumed. And more.

The client back in client-server days was likely a PC on a desktop – albeit sometimes hanging off a server at a post office in the Australian Outback. As Lou Reed said in possibly his most accessible song: “Those were different times.”

Do me a favor and check out “Edge Computing Will See New Workloads” – then, let me know what you think!

Shine a Light: Electro-Photonic Computing gets $4.8M IARPA boost

February 19, 2022 By Jack Vaughan

Quantum crystals – orthogonal to this story, but pretty. Source: MIT
https://progressivegauge.com/wp-content/uploads/2022/02/LightMatterInFocus4.mp3

Listen to article  |  5 min

[February 19, 2022] – Earlier this month IARPA announced $4.8 million in funding for a project that is led by Boston-based electro-photonic computing startup Lightmatter and joined by BU and Harvard researchers. IARPA has conceptual affinity with DARPA, the Defense Dept.’s long-running technology seeding lab, but in IARPA’s case the funding comes out of the Office of the Director of National Intelligence.

Lightmatter’s photonic computing efforts to date have centered on the idea of a supercomputer based on hybrid silicon-photonic chips that specialize in performing large matrix computations. Proponents see the photonic domain as a logical next step beyond electronics, as the window on Moore’s Law closes. While the barriers are significant, the photonic approach arguably requires shorter leaps forward than the quantum computing techniques that have been extensively covered in the press. It seems photonic computing, as represented in efforts by Lightmatter and others, could reasonably be added to the basket of strange new brews being applied in the search for better AI processing today.

While optical data communications are well established – think “The Internet” – it’s early going for computing based on light waves. Lightmatter is one of the few that have created photonic arrays capable of computation, matched with layers of digital and analog electronics that take care of much of the activity auxiliary to such computation.

A prototype board based on its chip was shown last year, as was a box with six such chips and boards. So any supercomputer is still in the future.

Note that IARPA’s interest here is at the other end of the computing spectrum — farther from the data center, and closer to the edge. Among the obstacles Edge AI faces is power consumption, and today’s approaches to AI tend to use a lot of it.

Analog processing that calculates in memory has been put forward as one way to avoid the costly (in power terms) movement of data between processor and memory. The Lightmatter crew and others take a different tack. They see the optical domain as another way to save trips to memory for the multiply-accumulate work that marks today’s neural nets for AI. Lightmatter is not alone in the quest.
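To see why those trips matter, note that a neural-net layer is essentially one big batch of multiply-accumulates, each of which needs a weight fetched from memory on a conventional chip. A quick sketch in plain NumPy (my own example, with illustrative names):

import numpy as np

def dense_forward(x, W, b):
    # Each output element is a dot product: a chain of multiply-accumulate
    # (MAC) operations. For W of shape (n_out, n_in), that is n_out * n_in
    # MACs, and every weight must be read from memory to perform them.
    return W @ x + b

rng = np.random.default_rng(0)
n_in, n_out = 1024, 4096
x = rng.standard_normal(n_in)
W = rng.standard_normal((n_out, n_in))
b = rng.standard_normal(n_out)

y = dense_forward(x, W, b)
print(f"{n_out * n_in:,} MACs for one layer, one input")  # 4,194,304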

Lightmatter is led by Nicholas Harris, a former MIT researcher whose interest in how computers work runs right down to the atomic scale, and has led him to explore ways of coupling deep learning and electro-photonic computing. His experience is by no means solely in the academic lab; he served for a time as an R&D lead at Micron Technology in the high country of Idaho, from which he hails. Lightmatter has raised more than $113 million from investors including Google Ventures and Spark Capital.

Edge AI-based silicon photonics is a bridge afar. Early going means basic chip architecture and processes need to be worked out before reliability, scalability, and cost can be determined. There are still multiple approaches being taken to building quite basic functionality as on-chip components. For example, how do you assure that you are effectively shuttling light on chip? For its part, Lightmatter uses a MEMS-like structure it calls NOEMS, for “nano-opto-electro-mechanical,” based on phase shifters that operate at CMOS-compatible voltages. In effect, this method guides a wave with the force of electrostatic charge, I think.
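Why a phase shifter counts as a compute element: in the textbook picture (a Mach-Zehnder interferometer, not necessarily Lightmatter’s exact circuit), light is split into two arms, one arm is delayed by a phase phi, and the arms are recombined; interference then leaves a cos^2(phi/2) fraction of the power in one output port, so tuning phi multiplies an optical signal by a programmable weight. A few lines to see the numbers:

import numpy as np

def mzi_transmission(phi):
    # Mach-Zehnder interferometer: the fraction of input power emerging
    # from one output port when the two arms differ by phase phi.
    return np.cos(phi / 2) ** 2

for phi in (0.0, np.pi / 2, np.pi):
    print(f"phi = {phi:.2f} rad -> weight {mzi_transmission(phi):.2f}")
# phi = 0 passes everything (weight 1.0); phi = pi blocks it (weight 0.0)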

Cost is not an issue for IARPA, which is likely looking to improve the reach and ability of surveillance drones. But there are a lot of ramifications to this work. If the present project bears fruit, it could further photonic computation in general terms, not just for AI apps. Moreover, what furthers photonic computation could also help spur quantum computing advancement.

A 2021 funding round was led by Viking Global Investors, with participation from GV (formerly Google Ventures), Hewlett Packard Enterprise (HPE), Lockheed Martin, Matrix Partners, SIP Global Partners, Spark Capital, and others. – Jack Vaughan

IARPA project press release – Businesswire
Company site – Lightmatter
Citations/Nick Harris – Google Scholar
Useful silicon photonics backgrounder – Ghent U.
Sally Ward-Foxton Lightmatter coverage from HotChips 2021 – EETimes

 

Progressive Podcast – Digital Twins Meet the Metaverse – With George Lawton

January 31, 2022 By Jack Vaughan

https://progressivegauge.com/wp-content/uploads/2022/01/DigitaTwins-in-Review-2021-2022.mp3

[JANUARY 2022] – When worlds collide, technologies explode. Sometimes. Such may be the case as the metaverse meets digital twins.

Different technologies depend on each other to mutually prosper. The iPhone required advances in display glass, haptics for touch input, and so on. As well, social and business trends need to align beneficially. The iPhone, used again as example, tapped into thriving cellular networks, capable cloud data centers and ubiquitous e-commerce solutions. Today, when technology pundits gather, they look for a similarly striking transformation.

So, when we try to measure the potential of one technology, we best consider the underlying advances occurring nearby. This is especially true of the metaverse.

No matter the final outcome, by renaming his company “Meta,” Mark Zuckerberg effectively pushed the term into public consciousness. But let’s be clear that underlying the metaverse are numerous technologies that rely on each other’s advances in order to prosper.

Chief among these are digital twins. These are virtual models that stand in as digital equivalents of real-world processes. Underlying the digital twin are technologies at varied stages of maturity: electronic gaming, AR and VR, AI, lidar and radar, IoT, Edge Computing, simulation and automation, and more.
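At its simplest, a digital twin is just a virtual object whose state is kept in sync with a physical asset’s sensors, so models can be run against the copy instead of the machine. A toy sketch in Python; every name here is hypothetical, for illustration only:

from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    # A virtual mirror of one physical pump (hypothetical example).
    asset_id: str
    rpm: float = 0.0
    temperature_c: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        # Update the twin's virtual state from an IoT sensor reading.
        self.rpm = reading.get("rpm", self.rpm)
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.history.append(reading)

    def overheating(self, limit_c: float = 80.0) -> bool:
        # A simple check run against the twin, not the real machine.
        return self.temperature_c > limit_c

twin = PumpTwin(asset_id="pump-17")
twin.ingest({"rpm": 1450.0, "temperature_c": 83.5})
print(twin.overheating())  # True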


Recently I talked with tech journalist and colleague George Lawton. Together we reviewed the work he’d accomplished on the topic at VentureBeat. The mission today: to get a gauge on digital twins. It seems digital twins could well form an environment that brings together the disparate technologies now percolating under the banners of IoT and metaverse. A look at vendor strategies doesn’t dispel this notion, as Lawton indicates. Here is a sample of topics discussed in the podcast above.

 


“I think it’s really the owner’s manual to the Internet of Things,” Lawton tells us. Vendors are talking about Edge, which is another way of saying they are “pushing the cloud down into the real world.”

“Microsoft, and Amazon and Google have pioneered workflows that automate the way you spin up these things,” he said. When you look at AI on the Edge, the ‘edge node’ is often a smartphone, he noted, but every machine will take on a role as an Edge node, if the tech futurists are right.

“I was surprised recently investigating what people are calling edge AI. What surprised me about that formulation is that it sort of speaks to this push from the cloud into the physical world,” said Lawton.

“Digital twins are going to be a cornerstone.” – Lawton on ESG, metaverse

Digital twins and the digital threads that connect them to workflows and to each other will need time to mature. A lot of Silicon Valley’s hopes for a future metaverse will rise or fall as digital twin technology rolls out or stumbles.

“It might be a long way off before people find the best way of connecting the dots,” Lawton said. “But it’s one of the most promising aspects in the long term.”

It’s about bringing digital transformation out from just computer systems into the real world — into our cars and our factories and our buildings and our homes.

It is a promising underpinning for ESG as well, for pulling the carbon out of processes, tracking it, modeling it, and considering various trade-offs.

“Digital twins are going to be a cornerstone. It’s important to think about how to use it as a framework to extend into these different problems that we’re facing right now,” he said.


Related:
Digital Twin coverage on VentureBeat
22 digital twins trends that will shape 2022 – George Lawton December 30, 2021
Siemens and FDA demo digital twin factory line – George Lawton October 1, 2021
Nexar, Las Vegas digital twins direct traffic – George Lawton September 27, 2021
Lidar sensors cruise the metaverse – George Lawton September 13, 2021

Digital Twin coverage on IoT World Today
Precision of Digital Twin Data Models Hold Key to Success – Jack Vaughan January 4, 2021



Copyright © 2022 · Jack Vaughan