Progressive Gauge


Media and Research


Jack Vaughan

AI, Neural Researchers Gain 2024 Nobel for Physics

October 8, 2024 By Jack Vaughan

Updated – The Royal Swedish Academy of Sciences today announced that neural network science pioneers John Hopfield and Geoffrey Hinton will be awarded the Nobel Prize in Physics for 2024. The two researchers are cited for “foundational discoveries and inventions that enable machine learning with artificial neural networks.”

The Nobel award is a capstone of sorts for two premier researchers in neural networks. This computer technology has gained global attention in recent years as the underpinning engine for large-scale advances in computer vision, language processing, prediction and human-machine interaction.

John Hopfield is best known for efforts to re-create the interconnections of the human brain in simple models. At the California Institute of Technology in the 1980s, he developed foundational concepts for neural network models.

These efforts led to what became known as a Hopfield network architecture. In effect, these are simulations that work as associative memory models that can process, store and retrieve patterns based on partial and imprecise information inputs.
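The associative-memory idea can be shown in a few lines. What follows is a minimal, illustrative Hopfield-style network in plain Python – the pattern sizes and values are invented for the example, not drawn from Hopfield’s papers: patterns are stored with a Hebbian outer-product rule, and a stored pattern is recovered from a corrupted cue.

```python
# A minimal Hopfield-style associative memory in plain Python (sizes and
# patterns are invented for illustration). Bipolar (+1/-1) patterns are
# stored with a Hebbian outer-product rule; recall repeatedly applies a
# thresholded update until the state settles on a stored pattern.

def train(patterns):
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]     # symmetric weights, zero diagonal
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / n
    return W

def recall(W, cue, steps=10):
    s = list(cue)
    for _ in range(steps):
        # synchronous sign update: each unit thresholds its weighted input
        s = [1 if sum(w * x for w, x in zip(row, s)) >= 0 else -1 for row in W]
    return s

stored = [[1, 1, 1, -1, -1, -1],
          [1, -1, 1, -1, 1, -1]]
W = train(stored)

noisy = [1, 1, -1, -1, -1, -1]   # first pattern with one bit flipped
print(recall(W, noisy))          # recovers the first stored pattern
```

Even with a flipped bit, the update dynamics settle on the nearest stored pattern, which is the “partial and imprecise inputs” property described above.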

Geoffrey Hinton, now a professor of computer science at the University of Toronto, began studies in the 1970s on neural networks mimicking human brain activity. Such reinterpretations of the brain’s interconnectedness gradually found greater use in pattern recognition over the years, leading to extensive use today on arrays of semiconductors both small and large.

In the 1980s, Hinton and fellow researchers focused their efforts on backpropagation algorithms for training neural networks and, later, on the so-called deep learning networks that are at the heart of today’s generative AI models. As implemented on computer chips from Nvidia and others, these models are now anticipated to drive breakthrough innovations in a wide variety of scientific and business fields. Hinton has been vocal in his concerns about the future course of AI and how it may have detrimental effects. He recently left a consulting position at Google so that he could speak more freely about those concerns.
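The backpropagation idea – pushing the output error backward through the layers via the chain rule – can be sketched briefly. This is a minimal illustration with made-up weights and a single training example, not the laureates’ actual formulation:

```python
import math

# A minimal sketch of backpropagation: one hidden layer of sigmoid units,
# squared-error loss, a single training example. Weights and sizes are
# made up for illustration.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    # hidden activations, then a single sigmoid output
    h = [sigmoid(sum(wij * xi for wij, xi in zip(row, x))) for row in w1]
    y = sigmoid(sum(wj * hj for wj, hj in zip(w2, h)))
    return h, y

def backprop(x, target, w1, w2):
    # loss L = 0.5 * (y - target)^2; push dL back through each layer
    h, y = forward(x, w1, w2)
    delta_out = (y - target) * y * (1 - y)            # dL/d(output preactivation)
    grad_w2 = [delta_out * hj for hj in h]
    grad_w1 = [[delta_out * w2[i] * h[i] * (1 - h[i]) * xj for xj in x]
               for i in range(len(w1))]
    return grad_w1, grad_w2

x, target = [0.5, -0.2], 1.0
w1 = [[0.1, 0.4], [-0.3, 0.2]]    # made-up weights
w2 = [0.7, -0.5]
g1, g2 = backprop(x, target, w1, w2)
print(g2)    # output-layer gradients; training steps weights against these
```

Repeating the gradient step over many examples is, in essence, the training loop that scales up to today’s deep networks.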

Control at issue

What’s now known as artificial intelligence in the form of neural networks began to take shape in the 1950s. Research in both neural networks and competing expert-system AI approaches saw a decline in the 1990s, as disappointing results accompanied the end of the Soviet Union and the Cold War, which had been a big driver of funding.

This period was known as the “AI Winter.” Hinton’s and Hopfield’s work was key in carrying the neural efforts forward until advances in language and imaging processing inspired new interest in the neural approaches.

Drawbacks still track the neural movement – most notably the frequent difficulty researchers face in understanding the work going on within the “black box” of neural networks.

Both Hinton and Hopfield addressed the issues of AI in society during Nobel Prize-related press conferences. The unknown limits of AI capabilities – particularly a possible future superintelligence – have been widely discussed, and were echoed in reporters’ questions and Nobelists’ replies.

“We have no experience of what it’s like to have things smarter than us,” Hinton told the presser assembled by the Nobel committee. “It’s going to be wonderful in many respects, in areas like healthcare. But we also have to worry about a number of possible bad consequences. Particularly the threat of these things getting out of control.”

Hopfield, by Zoom, echoed those concerns, discussing neural-based AI at an event honoring his accomplishment at Princeton.

Taking a somewhat evolutionary perspective, he said: “I worry about anything which says ‘I’m big. I’m fast. I’m faster than you are. I’m bigger than you are, and I can also outrun you. Now can you peacefully inhabit with me?’ I don’t know.”

But is it physics?


Both Hinton’s and Hopfield’s work spanned fields like biology, physics, computer science, and neuroscience. The multidisciplinary nature of much of this work required Nobel judges to clamber and jimmy and ultimately fit the neural network accomplishments into the Physics category.

“The laureates’ work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties,” said Ellen Moons, chair of the Nobel Committee for Physics, in a statement. – JV


Shown above: TRW neural network-based pattern recognition, 1990s.

Chain of complexities

September 23, 2024 By Jack Vaughan

You say you’re working at the Empire State, and there’s a giant gorilla on the building? Could you try smearing the Chrysler Bldg with bananas?

I was working on a story on Data Lakes Unlocked recently, around the time of the great Midwestern comedian Bob Newhart’s passing. Thinking: the explosion of big web data created challenges that existing technology failed to meet, making room for the data lake, which solved some problems and overlooked others.

Initially, data lakes were perceived as ungoverned repositories where raw data was thrown with the hope of finding insights later, with about as much luck as I might have had with an arcade treasure hunt crane. But the Data Lakers refined their approach over many years to include more structure, governance, and metadata creation. This evolution led to the emergence of the data lakehouse, which combines aspects of both data warehouses and data lakes, and which is being renamed as we speak.

This Newhartian dialog came to me.

What it amounts to is walking through a chain of complexities – the challenges that confront a new version of an old technology. Something like a dialectic. The Apache Iceberg table format is a great new tool, but it is in some ways best looked at as an improvement on Hadoop, much as Apache Parquet was, and, in much the same way, as was Apache Hive.
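To make the chain concrete, here is a toy sketch – not the real Apache Iceberg API, just an illustration of the idea Iceberg formalizes: a table whose state is a series of immutable snapshots, so commits are atomic, old snapshots stay queryable, and compaction can fold many small files into one without changing the visible rows.

```python
# Illustrative sketch (NOT the real Apache Iceberg API): an Iceberg-style
# table tracks immutable snapshots, each listing its data files. Readers
# pin a snapshot, writers commit a new one, and compaction rewrites many
# small files into one without changing the rows a query sees.

class TinyTable:
    def __init__(self):
        self.files = {}          # file name -> list of rows
        self.snapshots = []      # each snapshot = list of file names
        self.counter = 0

    def commit(self, rows):
        # each commit adds a (possibly small) file and a new snapshot
        self.counter += 1
        name = f"data-{self.counter}.parquet"   # file names are made up
        self.files[name] = list(rows)
        current = self.snapshots[-1] if self.snapshots else []
        self.snapshots.append(current + [name])

    def scan(self, snapshot=-1):
        # read any snapshot, not just the latest ("time travel")
        return [row for f in self.snapshots[snapshot] for row in self.files[f]]

    def compact(self):
        # fold all current small files into one, in a fresh snapshot
        rows = self.scan()
        self.counter += 1
        name = f"data-{self.counter}-compacted.parquet"
        self.files[name] = rows
        self.snapshots.append([name])

t = TinyTable()
t.commit([("a", 1)])
t.commit([("b", 2)])
t.compact()
print(len(t.snapshots[-1]), t.scan())   # one file, same visible rows
```

This is also why “table commits add up to a lot of small files” and why compaction is the standard mitigation: every commit is cheap metadata, and the file cleanup happens later, out of the write path.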

This is a Bob Newhart homage. I think the audio version is a good way to engage with this content:

https://progressivegauge.com/wp-content/uploads/2024/09/WhatIf.m4a

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Yeah, hi Bill? Yes, this is Jack in IT. The CFO was just down here and she had some questions on some of the AI GPU analytics processing bill we have.
Yes. You think you have a handle on it?
And so what is the problem?
You say you need a consistent way to manage and query data, and you say you need ACID compliance. Well, it sounds kind of difficult …
To deal with schema evolution?
Well I know there are a lot of things on your plate – that’s that’s quite a lot of problems you got there. Go on, I’m sorry.
And oh, but you found a solution and what’s the solution? Apache Iceberg, okay!
Bill, why do they call it Iceberg?
It’s a fitting metaphor for a vast amount of hidden data.
You know, Bill, if it costs us too much the data maybe can just stay hid.
Okay. Well, how much is saving a lot of time and money going to cost us?
You say, the table commits add up to a lot of small files. But that’s okay. Because you’re going to mitigate it with compaction and partitioning and write optimization. Okay.
And you’re going to do data modeling. This time for sure!
Bill, we are on your side. I’m coming down there with the accountants – but we have to know how much this will cost us.
You say you are working remotely from the Cape?
I guess I’ll fire up Zoom.


The Master – YouTube


New tune: Oracle and AWS sign cloud pact at Oracle Cloud World

September 10, 2024 By Jack Vaughan

Since Oracle really got serious about the cloud back in 2018, its “Generation 2 Cloud Platform” has evolved in a number of ways without forestalling AWS’s ascent in the database management space.

So, that made Oracle Cloud World 2024 a great occasion to declare victory and shake hands with AWS, as the company had earlier done with Azure SQL maker Microsoft and Google Cloud.


Oracle’s reported cloud advances made it one of the brighter lights on the stock market this year, but the company still faces the challenge of boosting capex spending in order to go toe-to-toe with the big cloud players. The biggies are feverishly building out bigger cloud data centers as generative AI workloads grow. Oracle, for its part, is redefining a cloud region to include some smaller cloud setups.


This latest handshake includes the launch of Oracle Database@AWS, a new offering that “allows customers to access Oracle Autonomous Database on dedicated infrastructure and Oracle Exadata Database Service within AWS.” Workloads running on Oracle RAC are also covered.


The announcement eases migration headaches, Brian Tilzer, Chief Digital, Analytics and Technology Officer, Best Buy, said in a statement.


“This announcement makes it easier for us to move some of our database workloads to AWS,” concurred Joe Frazier, Head of Architecture and Platform Engineering, Fidelity Investments.


That means the Oracle database’s tight connection to Oracle infrastructure will be supported in all three of the big clouds. And the deal may save Oracle some capex on its own multiyear cloud data center rollout.


“What if we embedded an Oracle data center right into an AWS data center?” asked Oracle Chairman and CTO Larry Ellison at the event in Las Vegas. He outlined benefits to users in terms of workload migration, system integration, low latency and simple billing.


That’s not the tenor of question Ellison asked at past Oracle annual conferences, where he sometimes harshly lectured on alleged shortcomings of AWS offerings. This reporter has written before that Oracle’s pride often borders on arrogance. But a 2022 confab saw some softening of the “Born to Raise Hell” tattooed version of Larry Ellison.


Ellison’s manner was further subdued this year; sitting with new AWS CEO Matt Garman, he was downright cordial.


Ellison told Garman that one of his biggest customers, Jamie Dimon, CEO of JPMorgan Chase, asked when the Oracle database was going to run on AWS each time they met. With the AWS deal, Ellison can scratch that item off the to-do list.


The bottom line: Oracle’s customers want multicloud support, and Oracle had better help by making these kinds of deals. Multicloud means options, but the options still seem to center on three big cloud providers. Oracle’s data prowess alone will not change that. Will its efforts to move customers to its own cloud now be due for reduced attention?


Now that this new rendition of Oracle’s cloud strategy is accomplished, maybe it is time to rename the yearly Oracle conference Oracle AI World. Unsurprisingly, AI was a very major push, both in Oracle’s quarterly report and at its showcase conference this year. – J. Vaughan PG

~~~~~~~~~~~~~~~~~~~~~

Shown above: Crowd awaiting Ellison keynote.

Random Notes: Pining for Blackwell, GPT-5

September 2, 2024 By Jack Vaughan

Happy Labor Day 2024 to Workers of the World!
Nvidia hits bumps in overdrive – That Wall Street meme may be about to crest. A flaw in Nvidia’s Blackwell production plan is just that, we are assured. From a newsletter follow-up to a Jensen Huang earnings-report interview, as described by Bloomberg’s Ed Ludlow and Ian King:

Nvidia had to make a change to the design’s lithography mask. This is the template used to burn the line patterns that make up circuits onto the materials deposited on a disk of silicon. Those circuits are what give the chip the ability to crunch data.

At the least it is a reminder of the elemental fact that the course of semiconductor manufacturing does not always run smooth. As David Lee reminds on Bloomberg: Hardware is hard. Elemental facts are the first casualties in bull markets and technology hype cycles.

Even if the Gods of Uncertainty are kind, the educated consumer will allow that “Blackwell will be capacity constrained,” as quite ably depicted in Beth Kindig’s recent Forbes posting.

~~~~~~~~~~~~~

GPT-5, hurry fast! – The Blackwell boding comes alongside a rumored recapitalization of OpenAI, and with it concerns about the delivery of GPT-5. Where is GPT-5? asks Platformonomics. In his Aug. 30 edition of Platformonomics TGIF, Charles Fitzgerald bullet-points the reasons to doubt that GPT-5 can round the bend in time. Possible explanations include:

*GPT-5 is just late – new scale brings new challenges to surmount

*It took time to get that much hardware in place

*Scaling has plateaued

*The organizational chaos at OpenAI had consequences

*OpenAI is doing more than just another scaling turn of the crank with GPT-5

The skeptical examiner wonders if OpenAI’s valuation won’t edge down a bit, even though it is too big to fail and headed by the smartest man in the world. At the least, again, one has to observe the water level as it declines in OpenAI’s moat.

~~~~~~~~~~~~~

Nunc ad aliquid omnino diversum (now for something completely different)

Deep Sea Learning – The Chicxulub event doomed 75 percent of Earth’s species. Details of the devastation were gathered from long core tubes drilled into the seafloor by the JOIDES Resolution, a research ship now set to be retired. “It was a punch in the gut,” said one scientist.

Shown: Benthic foraminifera from the deep sea off New Zealand.

In extra innings

Danny Jansen in superposition – He plays for both teams in the same game. In June he was at bat for the Blue Jays at Fenway when a storm stopped the game. Later, he was traded. In August the game was resumed, and he was now a catcher for the Red Sox. The Jays beat the Red Sox 4-1, and Jansen showed up on both sides of the box score – an MLB first.


Mendelianum Musings

July 15, 2024 By Jack Vaughan

Source: Mendelianum Moravian Museum

I recently picked up for a summer read “The Gene” by Siddhartha Mukherjee. As I began to plow through the nearly 600-page book, it seemed to display the accidents and unforeseen circumstances that can track scientific research and technological innovation.


The Gene begins with Gregor Mendel in the monastery in Brno, now part of the Czech Republic. There the eventual founder of the science of genetics is perceived as slow: happy in the garden with his peas, not smart or articulate enough to be more than a substitute teacher. The friar abbots try to give him every chance to gain a useful education, and perhaps step up from substitute. By some phenomenal luck, he is sent to study in Vienna – under no less a figure than Doppler.

Yes, he comes to study under Christian Doppler, the Austrian mathematician and physicist who proposed that the perceived pitch of sound or the color of light was not fixed but depended on the relative positions and velocities of the observer and the source. His principles on the nature of change in wave frequency influenced work that led to today’s radio astronomy efforts, radar, sonar and more. It must be seen as a happy accident that Mendel got to learn from Doppler, even if he never passed an exam.
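Doppler’s relation for sound fits in a few lines. A quick numeric sketch, with the speed of sound and the frequencies chosen as arbitrary example values:

```python
# The classic Doppler relation for sound: the observed frequency rises
# as source and observer close on one another. The 343 m/s speed of
# sound and the 440 Hz tone below are arbitrary example values.

def doppler_shift(f_source, v_sound=343.0, v_observer=0.0, v_source=0.0):
    """Observed frequency in Hz; v_observer > 0 means the observer moves
    toward the source, v_source > 0 means the source moves toward the
    observer (both in m/s)."""
    return f_source * (v_sound + v_observer) / (v_sound - v_source)

# A 440 Hz source approaching at 34.3 m/s sounds noticeably sharper:
print(round(doppler_shift(440.0, v_source=34.3), 1))   # 488.9
```

With no relative motion the formula returns the source frequency unchanged, which is the “not fixed but depends on relative motion” insight in miniature.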

Mendel patiently raised peas in his garden. He experimentally crossbred the pea plants and dutifully documented the results. Some observers have seen him as a plodder, with no theoretical understanding of the underlying forces at work. But author Mukherjee assures us that Mendel knew “he was trying to unlock the material basis and laws of heredity.”

The author also writes that Doppler’s example as a physicist informed Mendel’s efforts. Mendel found the elements that could reveal an underlying pattern that could be described numerically as he arrayed different bits of data on plants – height, texture, color. That is, a numerical model that marked the inheritance of traits.
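The numerical pattern Mendel extracted can be illustrated with a toy monohybrid cross: pairing every allele from one Aa hybrid with every allele from another yields dominant and recessive phenotypes in the famous 3:1 ratio.

```python
from collections import Counter
from itertools import product

# The numerical pattern Mendel pulled from his pea counts, in miniature:
# crossing two Aa hybrids pairs every allele from one parent with every
# allele from the other (a Punnett square), giving a 3:1 phenotype ratio.

def cross(parent1, parent2):
    return [a + b for a, b in product(parent1, parent2)]

offspring = cross("Aa", "Aa")     # ['AA', 'Aa', 'aA', 'aa']
phenotypes = Counter("dominant" if "A" in genotype else "recessive"
                     for genotype in offspring)
print(dict(phenotypes))           # 3 dominant : 1 recessive
```

Counting phenotype classes over many such crosses is exactly the kind of arrayed-data tabulation that let Mendel see inheritance as a numerical model.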

This ended up in a research paper presented to the Natural Science Society in Brno. But Mendel’s station at the far reaches of the scientific community assigned his work to a type of oblivion that was a long time in lifting.

Mukherjee cites a geneticist describing this period of oblivion as “one of the strangest silences in the history of biology.”

The Mendel story contrasts with Darwin’s in Mukherjee’s telling. Darwin had a position close to the center of the scientific culture of his day. But Darwin and others struggled to move the science of heredity forward after the big bang of On the Origin of Species.

The mechanism was already described – or pointed to – in some measure by Mendel, but his duties as a cleric led him to be “choked by administrative work,” and his paper became for him a capstone as he labored on as a sanctified clerk. Gradually, over decades, his work was rediscovered and replicated, eventually triggering a general evangelization of Mendel.

Yes, the initial wilting on the vine of Mendel’s work was not anything that couldn’t have been foreseen. As Mukherjee observes, Darwin’s reading of his keystone paper took place at the Linnean Society in London. August, no? Mendel, by contrast, presented at the Natural Science Society of Brno, far afield. That Mendel’s work slashed steadily, like a scythe through the pages of time, until it reached an audience speaks volumes for its worth. – Jack Vaughan

Related
The Gene – On Amazon
On the Road to the Double-Helix – Progressive Gauge blog post


Copyright © 2026 · Jack Vaughan