Progressive Gauge


Media and Research

  • Home
  • About
  • Blog
  • Projects, Samples and Items
  • Video
  • Contact Jack

Jack Vaughan

Information Examiner October 2025 – Connectors tackle AI with MCP

October 2, 2025 By Jack Vaughan

The rise of Agentic AI and Large Language Models (LLMs) is transforming classic data integration, with the Model Context Protocol (MCP) emerging as a key piece of the new tooling. This protocol is changing how users interact with model data, and traditional data companies now race to meet the new requirements.

ISVs and enterprises can’t move fast enough on the new AI front, and traditional business databases will often be central. BI reporting will be an early target. Software architecture leads may turn increasingly to data connectivity providers like CData Software if they are going to move fast without breaking things. [Read more…] about Information Examiner October 2025 – Connectors tackle AI with MCP
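For readers new to the protocol: MCP frames its messages as JSON-RPC 2.0, and a client invokes a server-side tool with a `tools/call` request. A minimal sketch of what such a request looks like on the wire — the tool name `query_orders` and its SQL argument are hypothetical stand-ins for what a database connector might expose:

```python
import json

# Illustrative only: MCP messages use JSON-RPC 2.0 framing, and a client
# asks a connector-style MCP server to run a tool via "tools/call".
# The tool name "query_orders" and its arguments are hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_orders",
        "arguments": {"sql": "SELECT region, SUM(total) FROM orders GROUP BY region"},
    },
}

wire = json.dumps(request)  # what actually travels to the MCP server
print(wire)
```

The appeal for BI-style workloads is that an LLM agent only has to emit this one message shape, while the connector behind the server handles the database-specific plumbing.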

Progressive Gauge Information Examiner – Week of Sept 1 2025

September 1, 2025 By Jack Vaughan

Enterprise AI attention turns to data architecture – Getting Generative AI apps up and running is the first problem enterprise teams encounter. Then comes maintaining those systems. As Sanjay Mohan points out in a recent blog post, this makes for a moving target, as data flowing into the AI engines must be continually monitored. Constant change is inherent in active computer data. Read more.

Speaking of Agentic AI Data Architecture – Connecting to and building out Generative AI is going to mean some changes in the way data architects approach solutions. Mohan Varthakavi’s recent Diversity.net post, “Reimagining Data Architecture for Agentic AI,” addresses the data challenge from an overarching perspective, and it is an interesting take. Read more.

Data to the fore as Snowflake and Nvidia report – Development of specialized AI agents will be a key indicator for the future of generative AI, analyst Mandeep Singh tells a Bloomberg podcast audience. Perhaps surprisingly, this puts Snowflake in a more competitive position as AI rules the airwaves. Read more.

Lab note: On Reading Sime’s “Lise Meitner”

August 5, 2025 By Jack Vaughan

I have been reading Lise Meitner: A Life in Physics [U Cal Press; 1996] by Ruth Lewin Sime. This is an important work, vital to understanding the evolving experimentation in pursuit of the nature of the atom in the 20th century. But its greater importance is in the light it shines on Lise Meitner, who continues to rise from the footnotes of too many histories to take a chief place in the history of science — this in the face of a crushing force of dehumanization. [Read more…] about Lab note: On Reading Sime’s “Lise Meitner”

The Last Word on AI This Time

June 24, 2025 By Jack Vaughan

When I first heard of Generative AI, I was skeptical, although it was clearly a gigantic step forward for machine learning. I covered the Hadoop/Big Data era for five years. As noted before, we would ask: what do we do with Big Data? The answer, it turned out, was Machine Learning. But it was complex, hard to develop, difficult to gather data for, and ROI was complicated or ephemeral. People would bemusedly ask if it had uses east of Oakland Bay. My experience with Big Data colored my perspective on Generative AI.

Generative AI requires great mountains of data to work, and herding that data is labor intensive. As with previous machine learning technologies, getting desired results from the model is difficult. Some engineer comes up with a tool to solve the problems, finds some VC backing, and startups race about. Few apps get from prototype proof-of-concept to operational. Few pay their own way.


Benedict Evans, Geoffrey Hinton, and Gary Marcus are just some of the people who critiqued the LLM more ably than I have. But great excitement was unleashed on the global public, and there wasn’t much of an ear for their ‘wait a minute’ commentary.


But early this year, DeepSeek seemed to show that the rush to Generative AI – the rush of money for electricity and land to deploy it widely – should be more carefully considered. DeepSeek was an event that arrived at a receptive moment.


It seems, in a way, a textbook case of technology disruption: advocates were blind to the limits of scalability for LLMs, coming up with greater and greater kluges – think nuclear power, SMRs and the like.


Meanwhile, a crew at a resource-strapped Chinese quant-tank saw ways around those limits. The designers focused on efficiency rather than brute force, employing methods such as reduced floating-point precision, optimized GPU instruction sets, and AI distillation.
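To illustrate the first of those methods: reduced precision trades a little accuracy for half the memory per weight. A minimal sketch using Python’s stdlib `struct` module, which supports IEEE 754 half precision via the "e" format — the weight values here are made up for illustration:

```python
import struct

# Illustrative only: "e" is IEEE 754 half precision (2 bytes per value),
# "f" is single precision (4 bytes). The weights are invented examples.

def quantize_fp16(w):
    """Round-trip a float through half precision to see what survives."""
    return struct.unpack("e", struct.pack("e", w))[0]

weights = [0.1234567, -1.9876543, 3.1415926]
fp32_bytes = len(weights) * struct.calcsize("f")  # 4 bytes per weight
fp16_bytes = len(weights) * struct.calcsize("e")  # 2 bytes per weight

for w in weights:
    q = quantize_fp16(w)
    print(f"{w:+.7f} -> {q:+.7f}  (rounding error {abs(w - q):.1e})")

print(f"storage: {fp32_bytes} bytes at fp32 vs {fp16_bytes} bytes at fp16")
```

Production inference stacks do this at scale with FP16/FP8 tensor kernels on GPUs; the sketch only shows the underlying storage-versus-precision trade-off.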

Engineers love benchmarks – and they love to tear them down! Benchmarks are biased, yes; even a child can figure that. But when you look at DeepSeek’s recipe, it is clever engineering. None of it is new. Others have worked on TinyML for years. The kinds of hardware optimizations they did were bread and butter years ago. There are plenty of computer scientists who are working to get around Generative AI’s issues [scale/cost/hallucination/use cases being the big ones]. These issues make this Silicon Valley baby a suspect redeemer. With respect, Jim Cramer sometimes oversteps his actual areas of expertise.

That DeepSeek moment – don’t let anyone tell you it is more or less than that – has just been followed by an upswing in the contrarian view on LLMs. A world that would have nothing of it a year ago is now seriously discussing “The Illusion of Thinking,” a paper by Apple researchers that questions the true reasoning capabilities of large language models.

This may put a pall on Agentic AI, which has conveniently arisen as the answer to Generative AI’s first task: to finagle its way into real-world workflows. Now, as summer begins, there is more of an ear for voices that cite the challenges, obstacles, and over-sell that have marked the last 24 months or so of the AI era. That can be helpful in the big task of understanding and taming LLMs for greater usage.

Penultimately, we have to hold in mind some contrary points going forward. It is not that LLMs are not valuable, just that they have limits that hyperbole has obscured.

Inspiring to me was a recent post by network systems expert and author Bruce Davie. He reminds us that a rational middle path is often preferable to the extreme predictions of doom, bust, or boom that characterize today’s AI tempest. Humans can skew, but the mean always calls, and we may be seeing that now. [Thanks to Davie for cueing me to New Yorker writer Joshua Rothman and, in turn, F. Scott Fitzgerald, he of the adage of holding “two opposed ideas in the mind at the same time.”]

This seems like a good time to let the Substack cogitate on these great matters. While I may post yet this season, I am kicking up my heels, and dreaming about slapping skeeters in Up North in Wisconsin. And taking “Lise Meitner: A Life in Physics” down from the shelf.

Source Code: Bill Gates’ Harvard Days

May 29, 2025 By Jack Vaughan

With the likes of Sam Altman and Elon Musk dashing about, we crouch for shelter now in an era where well-funded high-tech bros can live a life that was once reserved only for Doctor Strange.

That tends to make Bill Gates’ “Source Code: My Beginnings” (Knopf, 2025) a much warmer and more life-affirming book than it might otherwise have been. In this recounting of his early days and the founding of Microsoft, he paints a colorful picture of a bright and excitable boy making good. Much of Source Code is set in “the green pastures of Harvard University.”

The boy wonder-to-be was born in Seattle in 1955, when computers were room-sized and totally unlike the consumer devices which humans now ponder like prayer books as they walk city streets.

His family was comfortable and gave him a lot of room to engage a very curious imagination. His mother called it precociousness, a trait he damped down when he could. He had a fascination with basic analytical principles, which stood him in good stead when the age of personal computers dawned. [Read more…] about Source Code: Bill Gates’ Harvard Days



Copyright © 2025 · Jack Vaughan
