Progressive Gauge

Media and Research

Blog

At Dynatrace Perform 2025: Non-breaking break points make their point

February 4, 2025 By Jack Vaughan

Live Debugger captures key performance data as working code does its work — giving developers a real-time view into issues.

By Jack Vaughan

[February 4] – At Dynatrace’s Perform 2025 user conference in Las Vegas, the observability software company announced Live Debugger, said to enable developers to non-invasively access runtime operations. Founder and CTO Bernd Greifeneder described this more succinctly as “non-breaking break points.”

That’s succinct – but it borders on the paradoxical, if not the oxymoronic. What’s going on?

The basic premise of the portfolio update is to allow developers to set a marker without interfering with the runtime. They can capture stack traces, variable values, and process information without the onerous labor of reproducing and redeploying code. The new capabilities are supported directly from the Dynatrace platform or through the company’s native Visual Studio Code and JetBrains IDE plugins. Dynatrace also advises on ways to set breakpoints on interfaced programs where source code is unavailable. [Read more…]
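
How might that look from a developer’s seat? As a rough illustration only – a hypothetical Python sketch invented for this article, not Dynatrace’s API – a non-breaking breakpoint amounts to recording the stack trace and local variables at a marked line, logging them for later inspection, and letting execution carry on:

    # Hypothetical illustration only -- not Dynatrace's API. A "non-breaking
    # breakpoint" records state at a marked line without pausing the process.
    import inspect
    import logging
    import traceback

    logging.basicConfig(level=logging.INFO)

    def snapshot(label: str) -> None:
        """Capture the caller's location, locals and stack, then keep running."""
        caller = inspect.currentframe().f_back
        logging.info("snapshot '%s' at %s:%d", label,
                     caller.f_code.co_filename, caller.f_lineno)
        logging.info("locals: %r", caller.f_locals)
        logging.info("stack:\n%s", "".join(traceback.format_stack(caller)))

    def handle_order(order_id: int, quantity: int) -> int:
        total = quantity * 3
        snapshot("before-discount")   # observes state; execution never stops
        return total

    handle_order(42, 7)

In Dynatrace’s case the capture is driven from the platform or the IDE plugins rather than by code edits like this; the sketch only shows the effect a developer sees – live values, no halt, no redeploy.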

China Moon Racing

February 4, 2025 By Jack Vaughan

This first ran in January on my Medium Blog.

It’s been something of an afterthought since the US Apollo program ended in December of 1972, but the Moon is moving into the public spotlight again.

What’s becoming clear is that a new Moon race is well underway, one with plenty of participants but one most pointedly pitting China against the US.

The competition has a different character than it had in the now distant past — it’s become more a long-running endurance race and less the clearly defined sprint it was in the 1960s, when fear of Sputnik tended to unite sentiment in the US. [Read more…]

Brief exploration: Generative AI circa 2025

December 10, 2024 By Jack Vaughan

Complexities in deploying Gen AI and LLMs dim the light on some initial hype. These are the days of engineering, coordination, and integration.

By Jack Vaughan

The early procession of ‘2025 Outlooks’ seems to start with a lot of looks back. It’s mostly about Generative AI, which has proved to be a stock market mover, stock art maker, and stock item in year-in-review pieces.

The song remains the same but the tenor may change. ChatGPT is two years old, and its market-shaking meteoric rise now feels a bit less meteoric, if only because changing the world requires effort.

[Read more…]

Get a grep

November 20, 2024 By Jack Vaughan

Details vary in different tellings, but all agree that Unix operating system co-creator Ken Thompson developed grep while at Bell Labs. His impetus came from a manager’s request for a program that could search files for patterns.

Thompson had written and had been using a program called ‘s’ (for ‘search’), which he debugged and enhanced overnight, the story goes. They nursed it and rehearsed it and grep sprang forth. “g” stands for “global,” “re” stands for “regular expression,” and “p” stands for “print.” To get something to display on screen in those days, you used “print.” Thompson coming up with a software tool and sharing it throughout the office, and perhaps beyond; to me, that captured a moment in time.
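
The core of that g/re/p idea still fits in a few lines. The bare-bones filter below is a Python illustration of the concept only (not Thompson’s code): apply a regular expression globally to input lines and print the ones that match.

    # A minimal grep-like filter: an illustration of the
    # global / regular expression / print idea, not Thompson's code.
    import re
    import sys

    def grep(pattern: str, paths: list[str]) -> None:
        regex = re.compile(pattern)
        for path in paths:
            with open(path, errors="replace") as handle:
                for line in handle:
                    if regex.search(line):      # the regular expression part
                        print(line, end="")     # the "p" is for print

    if __name__ == "__main__":
        # usage: python minigrep.py PATTERN FILE [FILE ...]
        grep(sys.argv[1], sys.argv[2:])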

I picked up on this based on an assigned mini-series for Data Center Knowledge. Also in this mini-series was a look at the roots of the kill command and the birth of SSH security [links below]. I knew bits of the early Unix history but had to dig for this one.

[Read more…]

AI, Neural Researchers Gain 2024 Nobel for Physics

October 8, 2024 By Jack Vaughan

Updated – The Royal Swedish Academy of Sciences today announced that neural network science pioneers John Hopfield and Geoffrey Hinton will be awarded the Nobel Prize in Physics for 2024. The two researchers are cited for “foundational discoveries and inventions that enable machine learning with artificial neural networks.”

The Nobel award is a capstone of sorts for two premier researchers in neural networks. This computer technology has gained global attention in recent years as the underpinning engine for large-scale advances in computer vision, language processing, prediction and human-machine interaction.

John Hopfield is best known for efforts to re-create models of the human brain’s interconnections. At the California Institute of Technology in the 1980s, he developed foundational concepts for neural network models.

These efforts led to what became known as the Hopfield network architecture. In effect, these networks are simulations that act as associative memories, able to process, store and retrieve patterns from partial and imprecise inputs.
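
The idea is compact enough to sketch. The toy below (a Python/NumPy illustration, not Hopfield’s original formulation) stores two binary patterns with a Hebbian rule and then recovers one of them from a corrupted cue.

    # Toy Hopfield network: store +1/-1 patterns with a Hebbian rule,
    # then let repeated updates pull a noisy cue back to a stored pattern.
    import numpy as np

    def train(patterns: np.ndarray) -> np.ndarray:
        """Hebbian weight matrix from rows of +1/-1 values, zero diagonal."""
        n = patterns.shape[1]
        weights = patterns.T @ patterns / n
        np.fill_diagonal(weights, 0.0)
        return weights

    def recall(weights: np.ndarray, state: np.ndarray, steps: int = 10) -> np.ndarray:
        """Update each unit in turn until the state settles on a stored memory."""
        state = state.copy()
        for _ in range(steps):
            for i in range(len(state)):
                state[i] = 1 if weights[i] @ state >= 0 else -1
        return state

    stored = np.array([[1, -1, 1, -1, 1, -1],
                       [1, 1, 1, -1, -1, -1]])
    weights = train(stored)
    noisy = np.array([1, -1, -1, -1, 1, -1])   # first pattern, one bit flipped
    print(recall(weights, noisy))              # settles back to the first pattern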

Geoffrey Hinton, now a professor of computer science at the University of Toronto, began studies in the 1970s on neural networks mimicking human brain activity. Such reinterpretations of the brain’s interconnectedness gradually found greater use in pattern recognition over the years, leading to extensive use today on semiconductor arrays both small and large.

In the 1980s, Hinton and fellow researchers focused their efforts on backpropagation algorithms for training neural networks and, later, on so-called deep learning networks, which are at the heart of today’s generative AI models. As implemented on computer chips from Nvidia and others, these models are now anticipated to drive breakthrough innovations in a wide variety of scientific and business fields. Hinton has been vocal about his concerns over the future course of AI and how it may have detrimental effects. He left his position at Google in 2023 so that he could speak more freely about those concerns.
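
For readers who have never seen the algorithm spelled out, the core of backpropagation is brief: run a forward pass, push the output error backward through each layer’s derivative, and step the weights downhill. The NumPy toy below is an illustrative sketch only (a one-hidden-layer network learning XOR, not drawn from Hinton’s papers).

    # Illustrative backpropagation toy: a one-hidden-layer network
    # learning XOR by gradient descent. A sketch, not Hinton's code.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input -> hidden
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output
    lr = 1.0

    def sigmoid(z: np.ndarray) -> np.ndarray:
        return 1.0 / (1.0 + np.exp(-z))

    for _ in range(5000):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # backward pass: propagate the output error toward the input layer
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # gradient-descent weight updates
        W2 -= lr * (h.T @ d_out)
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h)
        b1 -= lr * d_h.sum(axis=0)

    print(np.round(out, 2))   # should approach [[0], [1], [1], [0]]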

Control at issue

What’s now known as artificial intelligence in the form of neural networks began to take shape in the 1950s. Research in both neural networks and competing expert-system AI approaches saw a decline in the 1990s, as disappointing results accompanied the end of the Soviet Union and the Cold War, which had been a big driver of funding.

This period was known as the “AI Winter.” Hinton’s and Hopfield’s work was key in carrying the neural efforts forward until advances in language and image processing inspired new interest in neural approaches.

Drawbacks still track the neural movement – most notably the frequent difficulty researchers face in understanding the work going on within the “black box” of neural networks.

Both Hinton and Hopfield addressed the issues of AI in society during Nobel Prize-related press conferences. The unknown limits of AI capabilities – particularly a possible future superintelligence – have been widely discussed, and were echoed in reporters’ questions and Nobelists’ replies.

“We have no experience of what it’s like to have things smarter than us,” Hinton told the presser assembled by the Nobel committee. “It’s going to be wonderful in many respects, in areas like healthcare. But we also have to worry about a number of possible bad consequences, particularly the threat of these things getting out of control.”

Hopfield, appearing by Zoom at a Princeton event honoring his accomplishment, echoed those concerns about neural-based AI.

Taking a somewhat evolutionary perspective, he said: “I worry about anything which says ‘I’m big. I’m fast. I’m faster than you are. I’m bigger than you are, and I can also outrun you. Now can you peacefully inhabit with me?’ I don’t know.”

But is it physics?

Both Hinton’s and Hopfield’s work spanned fields like biology, physics, computer science, and neuroscience. The multidisciplinary nature of much of this work required Nobel judges to clamber and jimmy and ultimately fit the neural network accomplishments into the Physics category.

“The laureates’ work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties,” said Ellen Moons, chair of the Nobel Committee for Physics, in a statement. – JV

Picture, shown above: TRW neural network-based pattern recognition, 1990s

Copyright © 2025 · Jack Vaughan
