Progressive Gauge

Media and Research


Blog

Shine a Light: Electro-Photonic Computing gets $4.8M IARPA boost

February 19, 2022 By Jack Vaughan

Quantum crystals – orthogonal to this story, but pretty. Source: MIT
https://progressivegauge.com/wp-content/uploads/2022/02/LightMatterInFocus4.mp3

Listen to article  |  5 min

[February 19, 2022] – Earlier this month IARPA announced $4.8 million in funding for a project led by Boston-based electro-photonic computing startup Lightmatter and joined by BU and Harvard researchers. IARPA has conceptual affinity with DARPA, the Defense Dept.’s long-running technology seeding lab, but in IARPA’s case the funding comes out of the Office of the Director of National Intelligence.

Lightmatter’s photonic computing efforts to date have centered on the idea of a supercomputer based on hybrid silicon photonic chips that specialize in performing large matrix computations. Proponents see the photonic domain as a logical next step beyond electronics, as the window on Moore’s Law closes. While the barriers are significant, the photonic approach arguably requires shorter leaps forward than the quantum computing techniques that have been extensively covered in the press. Photonic computing, as represented in efforts by Lightmatter and others, can reasonably be added to the basket of strange new brews being applied in the search for better AI processing today.

While optical data communications are well established – think “The Internet” – it’s early going for computing based on light waves. Lightmatter is one of the few to have created photonic arrays capable of computation, matched with layers of digital and analog electronics that handle much of the activity auxiliary to that computation.

A prototype board based on its chip was shown last year, as was a box holding six such chips and boards. So any supercomputer remains in the future.

Note that IARPA’s interest here is at the other end of the computing spectrum — farther from the data center, and closer to the edge. Among the obstacles Edge AI faces is power consumption, and today’s approaches to AI tend to use a lot of it.

Analog processing that calculates in memory has been put forward as one way to avoid the costly (in power terms) movement of data between processor and memory. The Lightmatter crew and others take a different tack. They see the optical domain as another way to save trips to memory for the multiply-accumulate work that marks today’s neural nets for AI. Lightmatter is not alone in the quest.
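That multiply-accumulate workload is worth making concrete: a dense neural-net layer reduces to a matrix-vector product, which is just long chains of multiply-accumulate (MAC) operations – exactly the operation photonic arrays target. A minimal NumPy sketch (illustrative only, not a model of Lightmatter’s hardware):

```python
import numpy as np

# A dense layer's core is a matrix-vector product: every output
# element is one chain of multiply-accumulate (MAC) operations.
rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 8))   # 4 outputs, 8 inputs
inputs = rng.standard_normal(8)

# One matrix-vector product = 4 x 8 = 32 MACs
outputs = weights @ inputs

# The same result written as explicit multiply-accumulates:
manual = np.zeros(4)
for i in range(4):
    for j in range(8):
        manual[i] += weights[i, j] * inputs[j]

assert np.allclose(outputs, manual)
```

Scale those 32 MACs up to the billions per inference in a modern network and the appeal of doing them in the optical domain – without shuttling operands to and from memory – becomes clear.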

Lightmatter is led by Nicholas Harris, a former MIT researcher whose interest in how computers work runs right down to the atomic scale, and has led him to explore ways of coupling deep learning and electro-photonic computing. His experience is by no means solely academic: he served for a time as an R&D lead at Micron Technology in the high country of Idaho, where he originated. Lightmatter has raised more than $113 million from investors including Google Ventures and Spark Capital.

Edge AI based on silicon photonics is a bridge yet to be crossed. Early going means basic chip architecture and processes need to be worked out before reliability, scalability and cost can be determined. There are still multiple approaches being taken to building quite basic functionality in as on-chip components. For example, how do you ensure that you are effectively shuttling light on chip? For its part, Lightmatter uses a MEMS-like structure it calls NOEMS, for “nano-opto-electro-mechanical,” based on phase shifters that operate at CMOS-compatible voltages. In effect this method guides a wave with the force of electrostatic charge, I think.
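For the curious, the linear-algebra role of a phase shifter can be sketched numerically. In a textbook Mach-Zehnder interferometer, two 50/50 beamsplitters flank a phase shifter, and the whole element acts on two optical modes as a 2x2 unitary matrix; meshes of such elements can realize larger matrices. This is a toy model of the general technique, not of Lightmatter’s NOEMS device:

```python
import numpy as np

def beamsplitter() -> np.ndarray:
    """Symmetric 50/50 beamsplitter acting on two optical modes."""
    return np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def phase_shift(theta: float) -> np.ndarray:
    """Phase shifter acting on the first mode only."""
    return np.diag([np.exp(1j * theta), 1.0])

def mzi(theta: float) -> np.ndarray:
    """Mach-Zehnder interferometer: beamsplitter, phase, beamsplitter."""
    return beamsplitter() @ phase_shift(theta) @ beamsplitter()

U = mzi(np.pi / 3)
# Unitary check: optical power is conserved through the device
assert np.allclose(U.conj().T @ U, np.eye(2))

light_in = np.array([1.0, 0.0])          # all power in mode 0
light_out = U @ light_in
assert np.isclose(np.sum(np.abs(light_out) ** 2), 1.0)
```

Tuning `theta` steers how much power lands in each output mode, which is how a mesh of these elements can be programmed to apply a chosen matrix to incoming light.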

Cost is not an issue for IARPA, which is likely looking to improve the reach and ability of surveillance drones. But there are a lot of ramifications to this work. If the present project bears fruit, it could further photonic computation in general terms, not just for AI apps. Moreover, what furthers photonic computation could also help spur quantum computing advancement.

The 2021 funding round was led by Viking Global Investors, with participation from GV (formerly Google Ventures), Hewlett Packard Enterprise (HPE), Lockheed Martin, Matrix Partners, SIP Global Partners, Spark Capital, and others. – Jack Vaughan

IARPA project press release – Businesswire
Company site – Lightmatter
Citations/Nick Harris – Google Scholar
Useful silicon photonics backgrounder – Ghent U.
Sally Ward-Foxton Lightmatter coverage from HotChips 2021 – EETimes

 

Progressive Podcast – Digital Twins Meet the Metaverse – With George Lawton

January 31, 2022 By Jack Vaughan

https://progressivegauge.com/wp-content/uploads/2022/01/DigitaTwins-in-Review-2021-2022.mp3

[JANUARY 2022] – When worlds collide, technologies explode. Sometimes. Such may be the case as the metaverse meets digital twins.

Different technologies depend on each other to mutually prosper. The iPhone required advances in display glass, haptics for touch input, and so on. As well, social and business trends need to align beneficially. The iPhone, used again as example, tapped into thriving cellular networks, capable cloud data centers and ubiquitous e-commerce solutions. Today, when technology pundits gather, they look for a similarly striking transformation.

So, when we try to measure the potential of one technology, we best consider the underlying advances occurring nearby. This is especially true of the metaverse.

No matter the final outcome, by renaming his company “Meta,” Mark Zuckerberg effectively pushed the term into public consciousness. But let’s be clear that underlying the metaverse are numerous technologies that rely on each other’s advances in order to prosper.

Chief among these are digital twins: virtual models that stand in as digital equivalents to real-world processes. Underlying the digital twin are technologies at varied stages of maturity: electronic gaming, AR and VR, AI, lidar and radar, IoT, Edge Computing, simulation and automation, and more.
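As a thumbnail of the concept: a digital twin is a virtual model kept in sync with a physical asset through sensor readings, which can then be queried or simulated without touching the machine itself. A minimal sketch with made-up names (not any vendor’s actual API):

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Toy digital twin of an industrial pump (illustrative only)."""
    asset_id: str
    rpm: float = 0.0
    temp_c: float = 20.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Apply an IoT sensor reading to the virtual model."""
        self.rpm = reading.get("rpm", self.rpm)
        self.temp_c = reading.get("temp_c", self.temp_c)
        self.history.append(dict(reading))

    def overheating(self, limit_c: float = 80.0) -> bool:
        """Answer a question of the model, not the physical machine."""
        return self.temp_c > limit_c

twin = PumpTwin("pump-17")
twin.ingest({"rpm": 1450.0, "temp_c": 85.5})
assert twin.overheating()
```

The “digital thread” the podcast touches on is everything around such a model: the pipelines that feed it readings and the workflows that act on its answers.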

Vaughan

Recently I talked with tech journalist and colleague George Lawton. Together we reviewed the work he’d accomplished on the topic at VentureBeat. The mission today: to get a gauge on digital twins. It seems digital twins could well form an environment that brings together disparate technologies now percolating under the banners of IoT and metaverse. A look at vendor strategies does nothing to dissuade this notion, as Lawton indicates. Here is a sample of topics discussed in the podcast above.

 

Lawton

“I think it’s really the owner’s manual to the Internet of Things,” Lawton tells us. Vendors are talking about Edge, which is another way of saying they are “pushing the cloud down into the real world.”

“Microsoft, Amazon and Google have pioneered workflows that automate the way you spin up these things,” he said. When you look at AI on the Edge, the ‘edge node’ is often a smartphone, he noted, but every machine will take on a role as an Edge node, if tech futurists are right.

“I was surprised recently investigating what people are calling edge AI. What surprised me about that formulation is that it sort of speaks to this push from the cloud into the physical world,” said Lawton.

“Digital twins are going to be a cornerstone.” – Lawton on ESG, metaverse

Digital twins and the digital threads that connect them to workflows and each other will need time to mature. A lot of Silicon Valley’s hopes for a future metaverse will rise or fall as digital twin technology rolls out or stumbles.

“It might be a long way off before people find the best way of connecting the dots,” Lawton said. “But it’s one of the most promising aspects in the long term.”

It’s about bringing digital transformation out from just computer systems into the real world — into our cars and our factories and our buildings and our homes.

It is a promising underpinning for ESG as well, for pulling the carbon out of processes, tracking it, modeling it, and considering various trade-offs.

“Digital twins are going to be a cornerstone. It’s important to think about how to use it as a framework to extend into these different problems that we’re facing right now,” he said.


Related:
Digital Twin coverage on VentureBeat
22 digital twins trends that will shape 2022 – George Lawton December 30, 2021
Siemens and FDA demo digital twin factory line – George Lawton October 1, 2021
Nexar, Las Vegas digital twins direct traffic – George Lawton September 27, 2021
Lidar sensors cruise the metaverse – George Lawton September 13, 2021

Digital Twin coverage on IoT World Today
Precision of Digital Twin Data Models Hold Key to Success – Jack Vaughan January 4, 2021

Upon inflection

November 27, 2021 By Jack Vaughan

Intel 80486

[Nov 2021] – It’s no easier to identify inflection points in semiconductor history in hindsight than it is at the time they occur.

Who thinks much about the 80486 today? It was the first Intel CPU to include an on-chip floating point unit. On PCs before that, difficult math problems were handled by coprocessors. So it was a move noted by IC market watchers. Soon, however, the more raucous controversy revolved around up-and-coming Reduced Instruction Set Computing (RISC) chips like MIPS and SPARC versus Complex Instruction Set Computing (CISC) chips like Intel’s.

Andy Grove thought about RISC, then decided the game was elsewhere, and forged ahead with 80486 follow-ons. Grove immortalized his ‘aha’ moment in tech marketing lore as an ‘Inflection Point’ – this, in his “Only the Paranoid Survive” business autobio. RISC was more noise than signal, as he had it.

[His company fielded a RISC player in the form of the i860 – it’s not hard now to recall the excitement of my trade press colleagues when that chip was unveiled (classic trade press term, that) at ISSCC at the midtown Hilton in N.Y. in 1986.]

Andy Grove and co. eventually discerned that RISC’s advantages were not going to change the all-important PC market. The 80486 (eventually “Pentium”) family began a long run, one that continued even as server architectures, cloud architectures, big data architectures and – especially – cell phone architectures came to comprise a grand share of the IC market. Now we are looking at AI and Edge AI as a new inflection point for semiconductor schemes.

Today, the hyperscaler cloud providers call the tune. They have written the latest chapter in the book on dedicated AI processors that hold neural nets, and are also actively pursuing low-power Edge AI versions of same. A mighty little Acorn, in the form of the ARM chip, found a home, and a standalone floating-point unit [well, sort of, arguably, for purposes of discussion], in the form of the GPU, came on the scene, followed by TPUs and EBTKS (Everything But The Kitchen Sink).

Is Edge in any way shape or form a threat to the cloud status quo?  It seems unlikely – hyperscalers don’t ask; they act.

Now, chip designers must reformulate machine learning architectures that work in the power-rich cloud data center for use in low-power operation on the edge, outpace a mix of competitors, and find willing customers.
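One concrete piece of that reformulation is quantization: a net trained in float32 in the power-rich cloud is stored and run in int8 on the edge to cut memory and power. A minimal symmetric-quantization sketch (an illustrative technique in general use, not any specific vendor’s flow):

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float32 weights onto int8 with one shared scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 codes."""
    return q.astype(np.float32) * scale

weights = np.random.default_rng(1).standard_normal(1000).astype(np.float32)
q, scale = quantize_int8(weights)

# 4x smaller in memory, with bounded reconstruction error
assert q.nbytes == weights.nbytes // 4
assert np.abs(dequantize(q, scale) - weights).max() <= scale / 2 + 1e-6
```

The engineering fight on the edge is keeping model accuracy acceptable under exactly this kind of lossy compression, while also beating competitors on watts.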

From my perspective, that is one of several backgrounds to a piece I recently wrote about AI on the Edge.

The central source for “The Shape of AI to Come” on Venture Beat was Marian Verhelst, a circuits and systems researcher at Katholieke Universiteit Leuven and the Imec tech hub in Belgium, and, as well, an advisor to startup Axelera.AI and a member of the Tiny ML foundation.

“Big companies like Google, Apple, and Facebook are all investing in making chips. The AI wave caused them to start developing chips because it makes a big difference,” Verhelst told me. Moving neural net style AI/ML from the cloud to the edge brings new requirements, which the story outlines.

Basically, the startups pursue a space pioneered by hyperscaling cloud providers that could not wait for mainline chip makers to create new chip architectures. Now the task is to extend such architectures to work on the edge. Hindsight will be the final judge – but it could well be an Inflection Point. – Jack Vaughan

Related
https://venturebeat.com/2021/11/24/the-shape-of-edge-ai-to-come/
https://www.intel.com/pressroom/archive/speeches/ag080998.htm Hey that’s a pretty old-style URL!
https://ashishb.net/book-summary/book-summary-only-the-paranoid-survives-by-andrew-grove/


Trimble works with Microsoft on quantum-inspired logistics

November 18, 2021 By Jack Vaughan

Azure Quantum drives data storage improvements https://t.co/ywLiqKjEbO via @VentureBeat

— Jack Vaughan (@JackIVaughan) November 16, 2021

One among the many takeaways from the COVID-19 pandemic is an awareness that industries needed to take another look at supply-chain optimization. It may not be quite up there with security, but supply-chain optimization is near top of charts in many of today’s IT surveys.
Logistics and optimization are oft-cited uses of quantum computing. Though it is still a trickle, there are growing indications quantum algorithms are being applied to research in supply-chain optimization.
These are often quantum-inspired efforts, meaning they reap the benefit of research in quantum simulation, but don’t require actual quantum computers to do the work.
An example comes by way of Microsoft, which is using its own quantum software to optimize storage – that is to increase capacity and predictability — on its Azure cloud. You can read about it on VentureBeat.
Computer storage management turns out to be a useful use case for quantum-inspired algorithms, in Microsoft’s telling, and certainly the company is well versed in cloud hyperscaling. Some of this is outlined in a Microsoft blog.
But the company said its QIO optimization solvers can work in other domains, and it was joined by geospatial and logistics services mainstay Trimble in the announcement. Trimble said it is using Azure Quantum QIO to identify the best routes for vehicle fleets, “ensuring fewer trucks run empty, and maximizing operating load for each trip.” Useful trait, that.
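To make “quantum-inspired” concrete: these solvers typically take a problem cast as a QUBO (quadratic unconstrained binary optimization) and attack it with classical heuristics such as simulated annealing. A toy sketch of the general idea, not Microsoft’s actual QIO solver:

```python
import math
import random

def qubo_energy(x, Q):
    """Energy of binary vector x under QUBO matrix Q."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal(Q, steps=5000, seed=0):
    """Plain simulated annealing over single-bit flips."""
    rng = random.Random(seed)
    n = len(Q)
    x = [rng.randint(0, 1) for _ in range(n)]
    cur = qubo_energy(x, Q)
    best, best_e = x[:], cur
    for step in range(steps):
        temp = max(0.01, 1.0 - step / steps)   # cooling schedule
        i = rng.randrange(n)
        x[i] ^= 1                              # propose one bit flip
        e = qubo_energy(x, Q)
        if e <= cur or rng.random() < math.exp(-(e - cur) / temp):
            cur = e                            # accept the move
            if e < best_e:
                best, best_e = x[:], e
        else:
            x[i] ^= 1                          # reject: flip back
    return best, best_e

# Tiny example: minimize x0 + x1 - 3*x0*x1 (optimum x0 = x1 = 1, energy -1)
Q = [[1, -3], [0, 1]]
sol, energy = anneal(Q)
assert energy == -1 and sol == [1, 1]
```

A fleet-routing problem gets encoded the same way, just with many more binary variables (truck-to-route assignments) and a Q matrix that penalizes empty runs; the “quantum-inspired” part is the research lineage of the solver, not the hardware it runs on.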
When it comes to quantum-inspired algorithms on classic computers – it will just have to do, in the words of Sammy Cahn’s immortal song, until the quantum computing thing comes along. – Jqck Vquhqn

Quantum computing moves ahead in increments

October 31, 2021 By Jack Vaughan

Q: How many quantum computer coders does it take to change lightbulb?
A: Both none and one at the same time. (Reddit)

The battle for quantum supremacy ended in 2019. As history shows, sometimes the “mop up” takes longer than the battle. As of today, quantum computing’s journey from the laboratory has not developed a discernible pace. That makes it a great case for technology assessment.

When Google in 2019 claimed quantum supremacy – that its computer based on qubits could outperform classical machines – IBM and others challenged that notion. No surprise there.

But, at the same time, the rush of breathless announcements subsided. The quantum research army went back to lab work – that is, creating algorithms and solving problems of error correction, reproducibility, quantum mechanical entanglement and much more.

The answer to ‘how’s it going?’ is the same here as in the clogged-up paint shop in an old Detroit auto factory: “We’re working on it.”

Cryptography, materials science, logistics and operations are often cited as near-term targets. Some work is beginning, using the relatively meager number of qubits at hand.

The roadmaps of the Honeywells, IBMs and Googles suggest continuing incremental steps forward over this decade, not just in science labs but also in technology applications.

The fact is that the people in the quantum business are resolved to pursue the necessary and incremental steps, and that the industries that could be affected are watching closely enough so as not to be blindsided when quantum computing sees the light of day.

Quantum computing hardware will need to advance before commercial value is achieved, but in the meantime, some new software trailing in the wake of research work may creep into commercial use ahead of that day. – Jack Vaughan

Recent story
Gartner advises tech leaders to prep for action as quantum computing spreads – VentureBeat



Copyright © 2026 · Jack Vaughan
