Progressive Gauge

Media and Research

Jack Vaughan

Random Notes: Pining for Blackwell, GPT 5

September 2, 2024 By Jack Vaughan

Happy Labor Day 2024 to Workers of the World!
Nvidia hits bumps in overdrive – That Wall Street meme may be about to crest. A flaw in its Blackwell production plan is just that, we are assured. From a newsletter follow-up to a Jensen Huang earnings-report interview, as described by Bloomberg’s Ed Ludlow and Ian King:

Nvidia had to make a change to the design’s lithography mask. This is the template used to burn the lined patterns that make up circuits onto the materials deposited on a disk of silicon. Those circuits are what gives the chip the ability to crunch data.

At the least it is a reminder of the elemental fact that the course of semiconductor manufacturing does not always run smooth. As David Lee reminds us on Bloomberg: Hardware is hard. Elemental facts are the first casualties in bull markets and technology hype cycles.

Even if the Gods of Uncertainty are kind, the educated consumer will allow that “Blackwell will be capacity constrained,” as quite ably depicted in Beth Kindig’s recent Forbes posting.

~~~~~~~~~~~~~

GPT 5, hurry fast! – This Blackwell foreboding is paired with a rumored recapitalization of OpenAI, and with concerns about the delivery of GPT-5. Where is GPT-5? asks Platformonomics. In his Aug. 30 edition of Platformonomics TGIF, Charles Fitzgerald bullet-points the reasons to doubt that GPT-5 can round the bend in time. Possible explanations include:

* GPT-5 is just late — new scale brings new challenges to surmount

* It took time to get that much hardware in place

* Scaling has plateaued

* The organizational chaos at OpenAI had consequences

* OpenAI is doing more than just another scaling turn of the crank with GPT-5?

The skeptical examiner wonders if OpenAI’s valuation won’t edge down a bit, even though it is too big to fail and headed by the smartest man in the world. At the least, again, one has to observe the water level as it declines in OpenAI’s moat.

~~~~~~~~~~~~~

Nunc ad aliquid omnino diversum (And now for something completely different)

Deep Sea Learning – The Chicxulub event doomed 75 percent of Earth’s species. Details of the devastation were gathered from long core tubes drilled into the seafloor by the JOIDES Resolution, a research ship now set to be retired. “It was a punch in the gut,” said one scientist.

Benthic foraminifera from the deep sea off New Zealand.

In Extra Innings

Danny Jansen in Superposition – He played for both teams in the same game. In June he was at bat for the Blue Jays at Fenway when a storm stopped play. Later, he was traded. When the game resumed in August, he was behind the plate as a catcher for the Red Sox. The Jays beat the Red Sox 4-1, and Jansen showed up on both sides of the box score – an MLB first.

Mendelianum Musings

July 15, 2024 By Jack Vaughan

Source: Mendelianum Moravian Museum

I recently picked up for a summer read “The Gene” by Siddhartha Mukherjee. As I began to plow through the nearly 600-page book, it seemed to trace the accidents and unforeseen circumstances that can attend scientific research and technological innovation.


The Gene begins with Gregor Mendel in the monastery in Brno, now part of the Czech Republic. There the eventual founder of the science of genetics is perceived as slow: happy in the garden with his peas, not smart or articulate enough to be more than a substitute teacher. The friary’s abbots try to give him every chance to gain a useful education, and perhaps step up from substitute. By some phenomenal luck, he is sent to study in Vienna, and thus to study under no less than Doppler.

Yes, he comes to study under Christian Doppler, the Austrian mathematician and physicist who proposed that the perceived pitch of a sound or the color of light is not fixed but depends on the relative positions and velocities of the observer and the source. His principles on the nature of change in wave frequency influenced work that led to today’s radio astronomy efforts, radar, sonar, and more. It must be seen as a happy accident for Mendel to learn from Doppler, even if he never passed an exam.

Mendel patiently raised peas in his garden. He experimentally crossbred the pea plants and dutifully documented the results. Some observers have seen him as a plodder, with no theoretical understanding of the underlying forces at work. But author Mukherjee assures us that Mendel knew “he was trying to unlock the material basis and laws of heredity.”

The author also writes that Doppler’s example as a physicist informed Mendel’s efforts. As he arrayed different bits of data on the plants – height, texture, color – Mendel found elements that revealed an underlying pattern, one that could be described numerically. That is, a numerical model that marked the inheritance of traits.
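Mukherjee describes that numerical pattern in prose. As a hypothetical sketch (the function and genotype labels are illustrative, not taken from the book), the kind of regularity Mendel tabulated in a single-trait cross can be reproduced in a few lines:

```python
import itertools

# Illustrative sketch of a Mendelian monohybrid cross: 'A' is the dominant
# allele, 'a' the recessive one. Crossing two Aa heterozygotes enumerates
# the four equally likely offspring genotypes (the Punnett square).
def monohybrid_ratio(parent1="Aa", parent2="Aa"):
    offspring = ["".join(sorted(p)) for p in itertools.product(parent1, parent2)]
    dominant = sum(1 for g in offspring if "A" in g)   # shows dominant trait
    recessive = len(offspring) - dominant              # shows recessive trait
    return dominant, recessive

# The Aa x Aa grid yields the famous 3:1 ratio Mendel counted in his peas.
print(monohybrid_ratio())  # (3, 1)
```

The 3:1 ratio is exactly the sort of count that, repeated across thousands of plants and seven traits, gave Mendel his numerical model of inheritance.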

This ended up in a research paper presented to the Natural Science Society in Brno. But Mendel’s station at the far reaches of the scientific community assigned his work to a type of oblivion that was a long time in lifting.

Mukherjee cites a geneticist describing this period of oblivion as “one of the strangest silences in the history of biology.”

The Mendel story contrasts with Darwin’s story in Mukherjee’s work. Darwin had a position close to the center of the scientific culture of his day. But Darwin and others struggled to move the science of heredity forward after the big bang of On the Origin of Species.

The mechanism was already described — or pointed to — in some measure by Mendel, but his duties as a cleric led him to be “choked by administrative work,” and his paper became for him a capstone as he labored on as a sanctified clerk. Gradually, over decades, his work was rediscovered and replicated, eventually triggering a general evangelization of Mendel.

Yes, the initial wilting on the vine of Mendel’s work could have been foreseen. As Mukherjee observes, Darwin’s reading of his keystone paper took place at the Linnean Society in London. August, no? But Mendel presented at the Natural Science Society of Brno, far afield. That Mendel’s work slashed steadily, like a scythe through the pages of time, until it reached an audience, speaks volumes for its worth. – Jack Vaughan

Related
The Gene – On Amazon
On the Road to the Double-Helix – Progressive Gauge blog post

Noting the passing in May of Neil Raden

July 7, 2024 By Jack Vaughan

Noting the passing in May of Neil Raden, who was one of the most unforgettable characters I ever met in my computer trade press days. His death came after a long illness, the progress of which he shared with the tech communities in which he had long been a notable voice.

Neil led an independent consulting and analysis practice as the head of Hire Brains in Santa Fe, New Mexico. He was his own kind of ’60s guy, in my experience. That is, he was goateed, a bit skeptical, but truly enthusiastic about the technological advances that influenced his era.

Like many others, he came out of fields that weren’t essentially technological, but which were channels to building out new computerized methods, working from first principles. This led him on a winding journey as the mainframe gave way to the PC and the cloud, with problems to solve every step of the way. In his case, the seed soil was advanced mathematical studies and actuarial experience. He forged a home-grown view on technology, with special emphasis on databases, data warehouses and common-sense problem solving.

I got to know Neil on the data warehouse beat, which I covered for Software Magazine, Application Development Trends and SearchDataManagement.com.

When a reporter asked him a question he’d break it down carefully, and look at it from different perspectives…most of which had yet to occur to you. Each question invited yet another strategy for writing your story — that or ten other ones. With Neil, it wasn’t hard to jump from IoT to tensor matrices to federated learning to differential privacy and to data lake houses (tho, the latter was not his favorite!).

With some disappointment, you’d bring the conversation back to the original topic – you got a deadline, right? But, for my money, any conversation with Neil was a master class in technology assessment. And if I wanted to talk about Telstar or the Perceptron, he was down with all that too. Following Neil’s train of thought could be like riding the notes of a jazz player’s solo.

In the late teens, I’d see him at Oracle OpenWorld in San Francisco, and he’d talk about topological algebra – like chaos theory, a lodestone interest of his. A couple of years later, he’s speaking with me for a story on data and IoT, and topological algebra magically comes up again! I still can’t figure it out. But I try.

On his health, I have no way of knowing if Neil saw what was coming back then, but I do know he was into being here and now. I recall how Neil closed one interview, after some flights of technology fancy. He’d just shown me a picture of nature in his adopted home of New Mexico.

He said: “I’ll tell you something really funny, Jack. I’m looking out my window right now. I look out the window and it is beautiful. The land rises up. And on the other side of that is the Rio Grande, and on the other side of the Rio Grande is … ” My recording stopped there.

Well, Neil Raden is surely on the other side of the Rio Grande now. I have to say thanks; I appreciated you sharing your time!

–30–

Neil Raden: From a Reporter’s Notebook

On edge computing and edge AI

One thing I’m concerned about is that the edge is far too important to be controlled by a couple of mega vendors. Once that becomes proprietary it’ll be a disaster.

Why he studied math

I studied math. I studied math because I didn’t want to write papers, and look what I do now!

Question to ask of a new technology paradigm, for example, event processing

The question is ‘can an organization really change the way it operates?’ The technology may not be the hardest part.

On the data lakehouse

That one really cracks me up. Okay, so we build a data lake and now we can’t do anything with it. Oh, don’t worry, we have a new thing — we’re going to call it a data lake house. And we’re going to give you some analytical functions like a data warehouse on top of it. And, you know, I’m trying not to laugh.

There are numerous postings on LinkedIn that readily show how many people Neil touched. You can find that feeling, and a sense of Neil’s dedication, in a recent tribute by diginomica.com Editor John Reed, who made sure to cast a light on Neil’s recent writings on AI Ethics, an area he was especially dedicated to covering. Raden’s writings for the publication can be found here. Some earlier work is also to be found on Medium.

Neil is survived by his wife, TS (Susie) Wiley, his children Mara, Aja, Jacob, Max, and Zoe, eight grandchildren, a brother, Jonathan Raden, and a sister, Audrey Raden.

Cadence discusses AI-driven fit-for-purpose chips

May 22, 2024 By Jack Vaughan

Phonautograph

The era of hyperscalers designing their own fit-for-purpose chips began with a series of AI chips from Google, Microsoft and AWS. Some also cite Apple’s efforts to forge its own destiny with custom-chip designs for smartphones.

The trend has continued, but it is not clear when or if it will spread to other ranks among Information Technology vendors.

The chips specifically built to run the big players’ finely honed AI algorithms are, for now, the sweet spot for fit-for-purpose.

The surge in interest in in-house chip designs got rolling a few years ago with Google’s Tensor Processing Unit, which is specially tuned to meet Google’s AI architecture. The search giant has followed that with the Argos chip, meant to speed YouTube video transcoding, and Axion, said to drive data center energy efficiencies.

Chip design for Google and its ilk is enabled by deep pockets. The big players also have ready mass markets that can justify big outlays on IC design staff and resources.

Chief among those resources is Electronic Design Automation tooling from the likes of Cadence Design Systems. This week, Anirudh Devgan, president and CEO of Cadence, discussed the trend at the J.P. Morgan 52nd Global Technology, Media and Communications Conference in Boston.

He said the key reasons companies go the self-designed route are: achieving domain-specific product differentiation, gaining control over the supply chain and production schedule, and realizing cost benefits at scale when the chip shows it will find use at sufficient volume.

Domain-specific differentiation allows companies to create a chip tailored to their unique needs, according to Devgan.

“It’s a domain specific product. It can do something a regular standard product cannot do,” he said, pointing to Tesla’s work on chips for Full Self Driving, and phone makers’ mobile computing devices that run all day on a single battery charge.

Like all companies dependent on components to power new products, the big players want to have assurance they can meet schedules, and an in-house chip design capability can help there, Devgan continued.

“You have some schedule, you want some control over that,” he told the JP Morgan conference attendees.

For the in-house design to work economically, scale of market is crucial. AI’s apparent boundless opportunity works for the hyperscalers here.

In the end, their in-house designed chip may cost less, once they cut the big chip maker’s oversized role out of the cost equation.
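Devgan’s scale argument reduces to back-of-envelope arithmetic: an in-house design pays off once per-unit savings over a merchant chip recoup the design program’s cost. A hypothetical sketch, with all figures and the function name my own illustrative assumptions rather than Cadence’s or any vendor’s numbers:

```python
# Back-of-envelope break-even for an in-house chip program.
# Illustrative only: design_cost is the total program (NRE) spend,
# merchant_price is the cost of a bought-in part, inhouse_unit_cost
# is the per-unit cost of the self-designed chip.
def breakeven_volume(design_cost, merchant_price, inhouse_unit_cost):
    """Units needed before per-unit savings recoup the design cost."""
    savings_per_unit = merchant_price - inhouse_unit_cost
    if savings_per_unit <= 0:
        return None  # in-house never pays off on unit economics alone
    return design_cost / savings_per_unit

# e.g. a $500M design program, $30,000 merchant accelerator vs $10,000 in-house
units = breakeven_volume(500e6, 30_000, 10_000)
print(f"{units:,.0f} units to break even")  # 25,000 units to break even
```

The point of the sketch is the sensitivity: at hyperscaler volumes the break-even is easily cleared, which is why, as Devgan notes, the economics currently favor only the biggest players.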

Where does this work? As always…”it depends.”

“It depends on each application, how much it costs, but definitely in AI there is volume, and volume is growing,” Devgan said, and he went on to cite mobile phones, laptops and autos as areas where the volume will drive the trend of custom chip creation.

Devgan declined to estimate how far system houses will take on the task of chip design going forward. Cadence wins either way, selling tools to semiconductor manufacturers, hyperscale cloud leaders and system houses.

He said: “We will leave that for the customer and the market to decide. Our job is to support both fully, and we are glad to do that.”

The trend bears watching. Years of technology progress have been based on system houses and their customers working with standard parts. Trends like in-house chip design may have the momentum to drastically rejigger today’s IT vendor Who’s Who, which has already been thoroughly rearranged in the wake of the cloud and the web. -jv

OpenAI GPT-4o Lands with Mini Thud or: Generative AI balances Hype and Reality in Chatbot Market Quest

May 19, 2024 By Jack Vaughan

It’s still too early to gauge Generative AI’s limits. That is another way of saying a circus atmosphere of hyperbole and demo theatrics is far from played out. The word “plateau” is heard today, and maybe a leveling off is only natural.

But now the uncertain space of ‘what it can’t do yet’ is mined each day. If Generative AI efforts plateau, and it merely changes the chatbot market as we know it, Generative AI will go down as a really big large-language disappointment.

This week’s OpenAI rollout of GPT-4o didn’t help. One can’t blame the OpenAI crew for trying their best to present awe-inspiring on-stage demos as they settle onto a Danish Modern furniture set that bodes a comforting future. After all, there’s a need to show that their lab’s work is world-changing or — barring that — fun.

The upstart’s demo was one episode among several in the week’s AI Wars, as OpenAI’s show was joined by Google I/O’s product rollouts on another stage in another corner of the web, as reported by Sean Michael Kerner and others.

For their part, the OpenAI crew walked through pet tricks, such as asking the application to translate “Why do we do linear equations?” into sparkling Italian.

Google’s show was just as breathless.

Yes, for OpenAI, the free app is a step into a new realm. (Although, as George Lawton points out, “free” is always an onion to unpeel.) And, yes, it vastly surpasses a voice-to-text demo of the 1990s. Does it move the bar much further than the Smart Speaker did in the mid-2010s? Let an army of pundits ponder this.

Our take: OpenAI’s announcement of something akin to a free-tier product was a bit short on awe. We’d second Sharon Goldman of Fortune who marked GPT-4o as “OpenAI’s emotive on steroids voice assistant.”

Of course, more accessible and easier entry for a wider range of people is OpenAI’s ticket to broadening into consumer markets. That’s where the killer app that justifies big valuations may be. Gain the consumer, and the enterprise follows.

That’s where OpenAI will meet the public and duke it out with friends like AWS, Google and Microsoft. There’s Apple too, which is likely now prepping spirited demos that show it has heard the bubbling cries of drowning users of Siri.

The next battle will be different from what has come before for the heavily financed OpenAI. This stage in the technology’s evolution brings the OpenAI boffins down from the high ground. They say “hello, whitebread-light demo patter” — just like Google, AWS or Microsoft product managers!

For a company that’s gained outsized attention in headline deals for crucial infrastructure with the big cloud players, it’s time to move toward apps. If it is to gain ground on a big scale, it will have to reach consumers. We take that as a less-than-nuanced theme in the GPT-4o rollout.

Cousin IoT: Brave New World Update

As if by chance, we sat in on Transforma’s report on IoT markets this very same week. While ably detailing the currents and eddies of IoT in the decade to come, it seemed to convey a message relevant to Gen AI’s future course.

It’s been a long time since IoT first promised a brave new technology future — and such promises were never quite on the scale of Generative AI — but IoT has been grinding away gainfully, nevertheless.

IoT industry players have faced the same kind of existential challenge that GenAI is about to encounter. That is: The need to find a killer consumer app that it can power.

Transforma’s recent survey reports that there were 16.1 billion active IoT devices at the end of 2023. Annual device sales will grow from 4.1 billion in 2023 to 8.7 billion in 2033 (a CAGR of 8%).

Yet the world — even the industrial market within the bigger world — seems little changed. IoT’s top use cases, now and looking forward, fall short of the energetic dynamism represented in early visions of IoT that looked more like Star Trek or the Jetsons.

Looking toward 2033, Transforma cited the top three IoT use cases as 1) electronic labels, 2) building lights, and 3) headphones. You can come knocking because the van is not rocking, at least in terms of excitement. Still, these use cases represent real businesses.

Now, the assertion here — that Generative AI will be viewed in the future much as IoT is viewed today — is tentative. The analogy is likely inexact … but it may be useful for predictions. Finally, my purpose here is not to put down these young technologists’ efforts, but just to suggest that OpenAI and underlying Generative AI are in for a tough fight. — Jack Vaughan


Copyright © 2026 · Jack Vaughan