Progressive Gauge


Media and Research


Jack Vaughan

China Moon Racing

February 4, 2025 By Jack Vaughan

This first ran in January on my Medium Blog.

It’s been something of an afterthought since the US Apollo program ended in December of 1972, but the Moon is moving into the public spotlight again.

What’s becoming clear is that a new Moon race is well underway, one with plenty of participants but one most pointedly pitting China against the US.

The competition has a different character than it had in the now distant past — it’s become more a long-running endurance race and less the clearly defined sprint it was in the 1960s, when fear of Sputnik tended to unite sentiment in the US. [Read more…] about China Moon Racing

Brief exploration: Generative AI circa 2025

December 10, 2024 By Jack Vaughan

Complexities in deploying Gen AI and LLMs dim the light on some initial hype. These are the days of engineering, coordination, and integration.

By Jack Vaughan

The early procession of ‘2025 Outlooks’ seems to start with a lot of looks back. It’s mostly about Generative AI, which has proved to be a stock market mover, stock art maker, and stock item in years in review.

The song remains the same, but the tenor may change. ChatGPT is two years old, and its market-shaking meteoric rise now feels a bit less meteoric, if only because changing the world requires effort.

[Read more…] about Brief exploration: Generative AI circa 2025

Get a grep

November 20, 2024 By Jack Vaughan

Details vary in different tellings, but all agree that Unix operating system co-creator Ken Thompson developed grep while at Bell Labs. His impetus came from a manager’s request for a program that could search files for patterns.


Thompson had written and had been using a program called ‘s’ (for ‘search’), which he debugged and enhanced overnight, the story goes. They nursed it and rehearsed it, and grep sprang forth. The “g” stands for “global,” the “re” for “regular expression,” and the “p” for “print.” To get something to display on screen in those days, you used “print.” Thompson coming up with a software tool and sharing it throughout the office, and perhaps beyond: to me, that captured a moment in time.
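To make the idea concrete, here is a minimal grep-like sketch in Python (my own illustration, not Thompson’s code; the function name and usage are invented). It does the three things the name promises: scan every line globally, match a regular expression, and print the hits.

    import re
    import sys

    # A toy grep: scan every line of every named file ("g" for global),
    # test it against a regular expression ("re"), and print the matches ("p").
    def grep(pattern, paths):
        rx = re.compile(pattern)
        for path in paths:
            with open(path, errors="replace") as f:
                for line in f:
                    if rx.search(line):
                        print(line, end="")

    if __name__ == "__main__":
        # e.g.  python minigrep.py 'pattern' notes.txt log.txt
        grep(sys.argv[1], sys.argv[2:])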


I picked up on this while working on an assigned mini-series for Data Center Knowledge. Also in this mini-series was a look at the roots of the kill command and the birth of SSH security. [links below]. I knew bits of the early Unix history but had to dig for this one.

[Read more…] about Get a grep

AI, Neural Researchers Gain 2024 Nobel for Physics

October 8, 2024 By Jack Vaughan

Updated – The Royal Swedish Academy of Sciences today announced that neural network science pioneers John Hopfield and Geoffrey Hinton will be awarded the Nobel Prize in Physics for 2024. The two researchers are cited for “foundational discoveries and inventions that enable machine learning with artificial neural networks.”

The Nobel award is a capstone of sorts for two premier researchers in neural networks. This computer technology has gained global attention in recent years as the underpinning engine for large-scale advances in computer vision, language processing, prediction and human-machine interaction.

John Hopfield is best known for efforts to re-create, in computational models, the interconnections of the human brain. At the California Institute of Technology in the 1980s, he developed foundational concepts for neural network models.

These efforts led to what became known as a Hopfield network architecture. In effect, these are simulations that work as associative memory models that can process, store and retrieve patterns based on partial and imprecise information inputs.
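A rough sense of how that works can fit in a few lines of Python. The sketch below is my own simplification of a Hopfield-style network (the pattern, sizes and function names are invented for illustration): store a pattern by strengthening connections between units that fire together, then recover it from a corrupted copy by repeatedly updating the state.

    import numpy as np

    # A minimal Hopfield-style associative memory (illustrative simplification).
    def train(patterns):
        # Hebbian rule: units that are active together get a stronger connection.
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0)           # no unit connects to itself
        return W / len(patterns)

    def recall(W, probe, steps=10):
        # Update the state until it settles into a stored pattern (an attractor).
        s = probe.copy()
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1
        return s

    # Store one +/-1 pattern, corrupt part of it, and retrieve the original.
    stored = np.array([[1, -1, 1, 1, -1, -1, 1, -1]])
    W = train(stored)
    noisy = stored[0].copy()
    noisy[:2] *= -1                      # flip two bits: a partial, imprecise input
    print(recall(W, noisy))              # settles back on the stored pattern

The corrupted probe still overlaps the stored pattern enough for the updates to pull it back, which is the “retrieve from partial and imprecise inputs” behavior described above.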

Geoffrey Hinton, now a professor of computer science at the University of Toronto, began studies in the 1970s on neural networks mimicking human brain activity. Such reinterpretations of the brain’s interconnectedness gradually found greater use in pattern recognition over many years, leading to extensive use today on small and large arrays of semiconductors.

In the 1980s, Hinton and fellow researchers focused their efforts on backpropagation algorithms for training neural networks and, later, on the so-called deep learning networks that are at the heart of today’s generative AI models. As implemented on computer chips from Nvidia and others, these models are now anticipated to drive breakthrough innovations in a wide variety of scientific and business fields. Hinton has been vocal about his concerns over the future course of AI and how it may have detrimental effects. He recently left a consulting position at Google so he could speak more freely about those concerns.

Control at issue

What’s now known as artificial intelligence in the form of neural networks began to take shape in the 1950s. Research in both neural networks and competing expert-system AI approaches saw a decline in the 1990s, as disappointing results accompanied the end of the Soviet Union and the Cold War, which had been a big driver of funds.

This period was known as the “AI Winter.” Hinton’s and Hopfield’s work was key in carrying the neural efforts forward until advances in language and image processing inspired new interest in the neural approaches.

Drawbacks still track the neural movement – most notably the frequent difficulty researchers face in understanding the work going on within the “black box” of neural networks.

Both Hinton and Hopfield addressed the issues of AI in society during Nobel Prize-related press conferences. The unknown limits of AI capabilities – particularly a possible future superintelligence – have been widely discussed, and were echoed in reporters’ questions and Nobelists’ replies.

“We have no experience of what it’s like to have things smarter than us,” Hinton told the presser assembled by the Nobel committee. “It’s going to be wonderful in many respects, in areas like healthcare. But we also have to worry about a number of possible bad consequences. Particularly the threat of these things getting out of control.”

Hopfield, by Zoom, echoed those concerns, discussing neural-based AI at an event honoring his accomplishment at Princeton.

Taking a somewhat evolutionary perspective, he said: “I worry about anything which says ‘I’m big. I’m fast. I’m faster than you are. I’m bigger than you are, and I can also outrun you. Now can you peacefully inhabit with me?’ I don’t know.”

But is it physics?

Both Hinton’s and Hopfield’s work spanned fields like biology, physics, computer science, and neuroscience. The multidisciplinary nature of much of this work required Nobel judges to clamber and jimmy and ultimately fit the neural network accomplishments into the Physics category.

“The laureates’ work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties,” said Ellen Moons, Chair of the Nobel Committee for Physics, in a statement. – JV


Picture, shown above: TRW neural network-based pattern recognition, 1990s

Chain of complexities

September 23, 2024 By Jack Vaughan

You say you’re working at the Empire State, and there’s a giant gorilla on the building? Could you try smearing the Chrysler Bldg with bananas?

I was working on a story on Data Lakes Unlocked recently, around the time of Great Midwestern comedian Bob Newhart’s passing. Thinking: the explosion of big web data created challenges that existing technology failed to meet, making room for the data lake, which solved some problems and overlooked others.

Initially, data lakes were perceived as ungoverned repositories where raw data was thrown with the hope of finding insights later, with about as much luck as I might have had with an arcade treasure hunt crane. But the Data Lakers refined their approach over many years to include more structure, governance, and metadata creation. This evolution led to the emergence of the data lakehouse, which combines aspects of both data warehouses and data lakes, and which is being renamed as we speak.

This Newhartian dialog came to me.

What it amounts to is walking through a chain of complexities – the challenges that confront a new version of an old technology. Something like a dialectic. The Apache Iceberg data management platform is a great new tool, but in some ways it should be looked at as an improvement on Hadoop, much as Apache Parquet was, and much as Apache Hive was before it.

This is a Bob Newhart homage. I think the audio version is a good way to engage with this content.

https://progressivegauge.com/wp-content/uploads/2024/09/WhatIf.m4a

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Yeah, hi Bill? Yes, this is Jack in IT. The CFO was just down here and she had some questions on some of the AI GPU analytics processing bill we have.
Yes. You think you have a handle on it?
And so what is the problem?
You say you need a consistent way to manage and query data, and you say you need ACID compliance. Well, it sounds kind of difficult …
To deal with schema evolution?
Well, I know there are a lot of things on your plate – that’s, that’s quite a lot of problems you got there. Go on, I’m sorry.
And oh, but you found a solution and what’s the solution? Apache Iceberg, okay!
Bill, why do they call it Iceberg?
It’s a fitting metaphor for a vast amount of hidden data.
You know, Bill, if it costs us too much the data maybe can just stay hid.
Okay. Well, how much is saving a lot of time and money going to cost us?
You say, the table commits add up to a lot of small files. But that’s okay. Because you’re going to mitigate it with compaction and partitioning and write optimization. Okay.
And you’re going to do data modeling. This time for sure!
Bill, we are on your side. I’m coming down there with the accountants – but we have to know how much this will cost us.
You say you are working remotely from the Cape?
I guess I’ll fire up Zoom.
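~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

For anyone wondering what Bill is actually proposing on that call, here is a rough sketch using Spark SQL from Python. It assumes a Spark session already configured with the Apache Iceberg runtime, its SQL extensions and a catalog named demo; the table and column names are invented for illustration.

    from pyspark.sql import SparkSession

    # Assumes the Iceberg Spark runtime, its SQL extensions and a catalog
    # named "demo" are already configured; table and column names are made up.
    spark = SparkSession.builder.appName("iceberg-sketch").getOrCreate()

    # A consistent way to manage and query data, with ACID-style snapshot commits.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS demo.db.gpu_costs (
            job_id   BIGINT,
            cost_usd DOUBLE,
            run_date DATE
        ) USING iceberg
        PARTITIONED BY (run_date)
    """)

    # Schema evolution: add a column without rewriting the existing data files.
    spark.sql("ALTER TABLE demo.db.gpu_costs ADD COLUMNS (team STRING)")

    # Compaction: fold the many small files left by frequent commits into larger ones.
    spark.sql("CALL demo.system.rewrite_data_files(table => 'db.gpu_costs')")

The partitioning clause and the compaction call are the “mitigate it with compaction and partitioning” step from the routine; what all of that ends up costing is, as Bill suggests, another conversation.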


The Master – YouTube




Copyright © 2025 · Jack Vaughan
