Progressive Gauge

Media and Research

  • Home
  • About
  • Blog
  • Projects, Samples and Items
  • Video
  • Contact Jack

Development

Source Code: Bill Gates’ Harvard Days

May 29, 2025 By Jack Vaughan

With the likes of Sam Altman and Elon Musk dashing about, we crouch for shelter now in an era where well-funded high-tech bros can live a life that was once reserved only for Doctor Strange.

That tends to make Bill Gates’ “Source Code: My Beginnings” (Knopf, 2025) a much warmer and more life-affirming book than it might otherwise have been. In this recounting of his early days and the founding of Microsoft, he paints a colorful picture of a bright and excitable boy making good. Much of Source Code is set in “the green pastures of Harvard University.”

The boy wonder-to-be was born in Seattle in 1955, when computers were room-sized and totally unlike the consumer devices which humans now ponder like prayer books as they walk city streets.

His family was comfortable and gave him a lot of room to engage a very curious imagination. His mother called it precociousness, a trait he dampened down when he could. He had a fascination with basic analytical principles, which stood him in good stead when the age of personal computers dawned.

Get a grep

November 20, 2024 By Jack Vaughan

Details vary across tellings, but all agree that Unix operating system co-creator Ken Thompson developed grep while at Bell Labs. His impetus came from a manager’s request for a program that could search files for patterns.


Thompson had written and had been using a program called ‘s’ (for ‘search’), which, the story goes, he debugged and enhanced overnight. They nursed it and rehearsed it, and grep sprang forth. The “g” stands for “global,” the “re” for “regular expression,” and the “p” for “print”; to get something to display on screen in those days you used “print.” Thompson coming up with a software tool and sharing it throughout the office, and perhaps beyond: to me that captured a moment in time.
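By most accounts, the name comes from the ed editor, where the command g/re/p meant: globally match a regular expression and print the matching lines, which is exactly the behavior the standalone tool took on. A minimal sketch of that behavior in Python (the sample lines and patterns here are my own, for illustration, not Thompson’s code):

```python
import re

def grep(pattern: str, lines: list[str]) -> list[str]:
    """Mimic the g/re/p idiom: globally match a regular
    expression against each line and return the hits."""
    rx = re.compile(pattern)
    return [line for line in lines if rx.search(line)]

text = [
    "Ken Thompson wrote grep at Bell Labs",
    "the ed editor had a g/re/p command",
    "managers sometimes ask for useful things",
]
print(grep(r"g/re/p", text))  # only the ed-idiom line matches
```

The real grep of course streams a file rather than holding it in a list, but the filtering idea is the same.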


I picked up on this while writing an assigned mini-series for Data Center Knowledge, which also included a look at the roots of the kill command and the birth of SSH security [links below]. I knew bits of the early Unix history but had to dig for this one.


Chain of complexities

September 23, 2024 By Jack Vaughan

You say you’re working at the Empire State, and there’s a giant gorilla on the building? Could you try smearing the Chrysler Bldg with bananas?

I was working recently on a story, Data Lakes Unlocked, around the time of the passing of the great Midwestern comedian Bob Newhart. My thinking: the explosion of big web data created challenges that existing technology failed at, making room for the data lake, which solved some problems and overlooked others.

Initially, data lakes were perceived as ungoverned repositories where raw data was thrown with the hope of finding insights later, with about as much luck as I might have had with an arcade treasure hunt crane. But the Data Lakers refined their approach over many years to include more structure, governance, and metadata creation. This evolution led to the emergence of the data lakehouse, which combines aspects of both data warehouses and data lakes, and which is being renamed as we speak.

This Newhartian dialog came to me.

What it amounts to is walking through a chain of complexities, the challenges that confront a new version of an old technology. Something like a dialectic. The Apache Iceberg data management platform is a great new tool, but in some ways it should be looked at as an improvement on Hadoop, much as Apache Parquet was, and, in much the same way, as Apache Hive was.

This is a Bob Newhart homage. I think the sound version is a good way to engage with this content:

https://progressivegauge.com/wp-content/uploads/2024/09/WhatIf.m4a

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Yeah, hi Bill? Yes, this is Jack in IT. The CFO was just down here and she had some questions on some of the AI GPU analytics processing bill we have.
Yes. You think you have a handle on it?
And so what is the problem?
You say you need a consistent way to manage and query data, and you say you need ACID compliance. Well, it sounds kind of difficult …
To deal with schema evolution?
Well, I know there are a lot of things on your plate – that’s, that’s quite a lot of problems you got there. Go on, I’m sorry.
And oh, but you found a solution and what’s the solution? Apache Iceberg, okay!
Bill, why do they call it Iceberg?
It’s a fitting metaphor for a vast amount of hidden data.
You know, Bill, if it costs us too much the data maybe can just stay hid.
Okay. Well, how much is saving a lot of time and money going to cost us?
You say, the table commits add up to a lot of small files. But that’s okay. Because you’re going to mitigate it with compaction and partitioning and write optimization. Okay.
And you’re going to do data modeling. This time for sure!
Bill, we are on your side. I’m coming down there with the accountants – but we have to know how much this will cost us.
You say you are working remotely from the Cape?
I guess I’ll fire up Zoom.
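The small-files point in the dialog is a real operational issue: frequent table commits can each leave behind many small data files, and compaction jobs bin-pack them into fewer, larger ones. A toy sketch of that bin-packing idea in Python (the file sizes and the 128 MB target are made-up illustration, not the Iceberg API):

```python
def compact(file_sizes_mb, target_mb=128):
    """Greedy bin-packing: group small files so each group
    stays at or under the target file size, the core idea
    behind table-format compaction jobs."""
    groups, current, total = [], [], 0
    for size in sorted(file_sizes_mb, reverse=True):
        if total + size > target_mb and current:
            groups.append(current)  # close out the full group
            current, total = [], 0
        current.append(size)
        total += size
    if current:
        groups.append(current)
    return groups

# Many small commit files collapse into a handful of larger ones.
small_files = [4, 8, 2, 16, 1, 32, 8, 4, 64, 2]
print(len(small_files), "files ->", len(compact(small_files)), "files")
```

Real compaction also rewrites the table metadata so readers open two files instead of ten, which is where the query-time savings come from.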


The Master – YouTube


Random Notes: Pining for Blackwell, GPT 5

September 2, 2024 By Jack Vaughan

Happy Labor Day 2024 to Workers of the World!
Nvidia hits bumps in overdrive – That Wall Street meme may be about to crest. A flaw in Nvidia’s Blackwell production plan is just that, we are assured. From a newsletter followup to a Jensen Huang earnings-report interview, as described by Bloomberg’s Ed Ludlow and Ian King:

Nvidia had to make a change to the design’s lithography mask. This is the template used to burn the lined patterns that make up circuits onto the materials deposited on a disk of silicon. Those circuits are what gives the chip the ability to crunch data.

At the least it is a reminder of the elemental fact that the course of semiconductor manufacturing does not always run smooth. As David Lee reminds us on Bloomberg: hardware is hard. Elemental facts are the first casualties of bull markets and technology hype cycles.

Even if the Gods of Uncertainty are kind, the educated consumer will allow that “Blackwell will be capacity constrained,” as quite ably depicted in Beth Kindig’s recent Forbes posting.

~~~~~~~~~~~~~

GPT-5, hurry fast! – This Blackwell foreboding comes amid a rumored recapitalization of OpenAI, and amid concerns about the delivery of GPT-5. Where is GPT-5? asks Platformonomics. In his Aug. 30 edition of Platformonomics TGIF, Charles Fitzgerald bullet-points the reasons to doubt that GPT-5 can round the bend in time. Possible explanations include:

  • GPT-5 is just late; new scale brings new challenges to surmount
  • It took time to get that much hardware in place
  • Scaling has plateaued
  • The organizational chaos at OpenAI had consequences
  • OpenAI is doing more than just another scaling turn of the crank with GPT-5

The skeptical examiner wonders if OpenAI’s valuation won’t edge down a bit, even though it is too big to fail and headed by the smartest man in the world. At the least, again, one has to observe the water level as it declines in OpenAI’s moat.

~~~~~~~~~~~~~

And now for something completely different

Deep Sea Learning – The Chicxulub event doomed 75 percent of Earth’s species. Details of the devastation were gathered from long core tubes drilled into the seafloor by the JOIDES Resolution, a research ship now set to be retired. Its retirement was “a punch in the gut,” one scientist said.

Benthic foraminifera from Deep Sea off New Zealand.

In Extra Innings

Danny Jansen in Superposition – He played for both teams in the same game. In June he was at bat for the Blue Jays at Fenway when a storm stopped play. Later he was traded. When the game resumed in August, he was catching for the Red Sox. The Jays beat the Red Sox 4-1, and Jansen showed up on both sides of the box score, an MLB first.


PASCAL language originator Niklaus Wirth, 89

January 12, 2024 By Jack Vaughan


Niklaus Wirth, whose 1970 creation of the PASCAL programming language dramatically influenced the development of modern software engineering techniques, died January 1. He was 89.

A long-time professor of informatics at ETH Zurich, Wirth produced PASCAL primarily as a teaching tool, but its innovative language constructs set the stage for the C, C++ and Java languages that flourished in the late 20th Century and still see wide use today. Wirth won the 1984 ACM Turing Award, among other honors.

Wirth named the language after the 17th Century philosopher and mathematician Blaise Pascal, who is often credited with inventing one of the first mechanical calculators. The PASCAL language was a bridge of sorts: it attempted to span the styles of business computing, then dominated by COBOL, and scientific computing, as seen in FORTRAN.

The field of software engineering was still in its early days when Wirth began his pioneering work, first as a graduate student and then as an assistant professor at Stanford University. At the time, hardware was still making the transition from numerical calculators to general-purpose computers.

It was becoming apparent at the time that a language focused solely on numerical processing would encounter obstacles as computing evolved.

It was hoped, Wirth wrote in a paper, that “the undesirable canyon between scientific and commercial programming … could be bridged.” [This and other materials cited in this story appeared in “Recollections about the Development of PASCAL,” published as part of “History of Programming Languages,” Addison-Wesley, 1996.]

Wirth also contributed to the development of the ALGOL and MODULA languages, often bringing special focus to the compilers that translate source code into running machine code. His book “Algorithms + Data Structures = Programs” is often cited as a keystone text for those interested in the fundamentals of software design.

Education was the primary goal of work on PASCAL, which Wirth undertook after a disappointing experience on a somewhat squabbling committee of experts looking to standardize a new version of ALGOL 60.

In his words, Wirth decided “to pursue my original goal of designing a general-purpose language without the heavy constraints imposed by the necessity of finding a consensus among two dozen experts about each and every little detail.”

The work of Wirth and his co-conspirators built on ALGOL, but also drew implicitly on emerging thinking about structured approaches to software development, such as those outlined by E.W. Dijkstra. Certainly, the early ‘bubble gum and baling wire’ days of computer software were receding as Wirth began his work.

“Structured programming and stepwise refinement marked the beginnings of a methodology of programming, and became a cornerstone in helping program design become a subject of intellectual respectability,” according to Wirth.

The fact that PASCAL was designed as a teaching tool is important – it was constructed in a way that allowed new programmers to learn sound method, while applying their own enhancements.

“Lacking adequate tools, build your own,” he wrote.

And, while innovation was a goal, pragmatism for Wirth was also a high-order requirement.

Wirth’s work on PASCAL started about the time he headed from Stanford to ETH Zurich, with definition of the language achieved in 1970. It was formally announced in “Communications of the ACM” in 1971. Though never the most popular language in terms of numbers, it was very influential, contributing to a movement that emphasized general applications use, strong typing and human-readable code.

PASCAL gained great currency in the early days of personal computers in the United States. A version that became known as Turbo Pascal was created by Anders Hejlsberg and was licensed and sold by Borland International beginning in 1983. On the wings of Borland ads in Byte magazine, Turbo Pascal became ubiquitous among desktop programmer communities.

The work of pioneers like Wirth gains special resonance today, as a generative AI paradigm appears poised to automate larger portions of the programming endeavor. Automated code generation is by no means completely new, and the surprises and ‘gotchas’ of the pioneers’ era will no doubt be revisited as the understanding of effective software development processes continues to evolve. Wirth’s words and work merit attention in this regard, and with regard as well to a fuller understanding of software evolution. – J.V.

The Progressive Gauge obituary gets some extra time now, with a few minutes of extra material.

My Take: I came to cover software for embedded systems, electronic design automation and, later, business applications, at a time when structured programming and defined methodologies were in full ascendance.

Over many years, methodologies narrowed into a generally accepted path to software modeling. But they tended to flower wildly at first. They started with some philosophic underpinnings, but could just as easily be something that worked for someone who thought it could become a product. The downside was that each new methodology might suggest your problem was the methodology you were already using, which was not always the case. The joke of a bench developer I shared a cube with for a while: “What’s a methodology?” Answer: “A method that went to college.”

A methodology generally carried the name of its inventor, not of a historical figure, as was the case with PASCAL.

The complexity of the deepest levels of the new operating system, compiler, design and language approaches was daunting for this writer.

My work in the computer trade press afforded me the opportunity to meet Unix co-creator Dennis Ritchie of Bell Labs (at the time he was rolling out the Plan 9 OS) and Java author James Gosling (then of Sun Microsystems). I met two of the three “UML Amigos”: Grady Booch, then of Rational Software, and Ivar Jacobson, head of Ivar Jacobson International. Interviewing Ivar on one occasion was particularly memorable. He asked that I interview a software avatar he had just created, an animated figure which spoke and represented his thinking on technology issues, rather than speak with him directly.

These assignments surely were ‘a tall order for a short guy!’

I offer these notes here in my role as a generalist. While computer history doesn’t repeat, it rhymes. This history is always worth a revisit, especially as clearly-new paradigms fly out of coders’ 2024 work cubes. It has been interesting for me to look at the origination of PASCAL, and to learn about Niklaus Wirth, the individual who brought PASCAL into the world.

Pascaline

Links
https://cacm.acm.org/magazines/2021/3/250705-50-years-of-pascal/fulltext
https://www.standardpascaline.org/PascalP.html
https://amturing.acm.org/award_winners/wirth_1025774.cfm
http://PASCAL.hansotten.com/niklaus-wirth/recollections-about-the-development-of-PASCAL/
https://blogs.embarcadero.com/50-years-of-pascal-and-delphi-is-in-power/



Copyright © 2025 · Jack Vaughan · Log in
