Progressive Gauge


Media and Research


Computing

AI, Neural Researchers Gain 2024 Nobel for Physics

October 8, 2024 By Jack Vaughan

Updated – The Royal Swedish Academy of Sciences today announced that neural network science pioneers John Hopfield and Geoffrey Hinton will be awarded the Nobel Prize in Physics for 2024. The two researchers are cited for “foundational discoveries and inventions that enable machine learning with artificial neural networks.”

The Nobel award is a capstone of sorts for two premier researchers in neural networks. This computer technology has gained global attention in recent years as the underpinning engine for large-scale advances in computer vision, language processing, prediction and human-machine interaction.

John Hopfield is best known for efforts to re-create the interconnected workings of the human brain in computational models. At the California Institute of Technology in the 1980s, he developed foundational concepts for neural network models.

These efforts led to what became known as the Hopfield network architecture. In effect, these are simulations that work as associative memories, able to process, store and retrieve patterns from partial and imprecise inputs.
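As a rough illustration of that associative-memory idea (a toy sketch for this post, not drawn from Hopfield's papers), a few lines of Python can store a pattern as Hebbian weights and then recover it from a corrupted probe:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: sum of outer products, with a zeroed diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, probe, sweeps=5):
    """Asynchronous updates drive the state toward a stored pattern."""
    s = probe.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# One stored pattern in +1/-1 coding, then a probe with two bits flipped.
stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
W = train_hopfield(stored)
noisy = stored[0].copy()
noisy[:2] *= -1
print(recall(W, noisy))  # settles back onto the stored pattern
```

Even with two bits flipped, the update rule settles back onto the stored pattern – the "partial and imprecise input" behavior described above.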

Geoffrey Hinton, now a professor of computer science at the University of Toronto, began studies in the 1970s on neural networks mimicking human brain activity. Such reinterpretations of the brain's interconnectedness found gradually greater use in pattern recognition over many years, leading to extensive use today in small and large arrays of semiconductors.

In the 1980s, Hinton and fellow researchers focused their efforts on backpropagation algorithms for training neural networks and, later, on so-called deep learning networks, which are at the heart of today's generative AI models. As implemented on computer chips from Nvidia and others, these models are now anticipated to drive breakthrough innovations in a wide variety of scientific and business fields. Hinton has been vocal about his concerns over the future course of AI and how it may have detrimental effects. He recently left a consulting position at Google so that he could speak more freely about those concerns.
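For readers who want to see what backpropagation training actually does, here is a minimal NumPy sketch (an illustration only, not the laureates' code): a tiny two-layer network learns the XOR function by pushing the output error backward through its layers and nudging the weights by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)    # hidden layer
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)    # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the output error toward the inputs
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # should approach [[0], [1], [1], [0]] after training
```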

Control at issue

What's now known as artificial intelligence in the form of neural networks began to take shape in the 1950s. Research in both neural networks and competing expert-system AI approaches saw a decline in the 1990s, as disappointing results accompanied the end of the Soviet Union and the Cold War, which had been a big driver of funds.

This period was known as the “AI Winter.” Hinton's and Hopfield's work was key in carrying the neural efforts forward until advances in language and image processing inspired new interest in the neural approaches.

Drawbacks still track the neural movement – most notably the frequent difficulty researchers face in understanding the work going on within the “black box” of neural networks.

Both Hinton and Hopfield addressed the issues of AI in society during Nobel Prize-related press conferences. The unknown limits of AI capabilities – particularly a possible future superintelligence – have been widely discussed, and were echoed in reporters’ questions and Nobelists’ replies.

“We have no experience of what it’s like to have things smarter than us,” Hinton told the presser assembled by the Nobel committee. “It’s going to be wonderful in many respects, in areas like healthcare. But we also have to worry about a number of possible bad consequences. Particularly the threat of these things getting out of control.”

Hopfield, by Zoom, echoed those concerns, discussing neural-based AI at an event honoring his accomplishment at Princeton.

Taking a somewhat evolutionary perspective, he said: “I worry about anything which says ‘I’m big. I’m fast. I’m faster than you are. I’m bigger than you are, and I can also outrun you. Now can you peacefully inhabit with me?’ I don’t know.”

But is it physics?


Both Hinton’s and Hopfield’s work spanned fields like biology, physics, computer science, and neuroscience. The multidisciplinary nature of much of this work required Nobel judges to clamber and jimmy and ultimately fit the neural network accomplishments into the Physics category.

“The laureates’ work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties,” said Ellen Moons, chair of the Nobel Committee for Physics, in a statement. – JV

 

Picture shown above: TRW neural network-based pattern recognition, 1990s

Chain of complexities

September 23, 2024 By Jack Vaughan

You say you’re working at the Empire State, and there’s a giant gorilla on the building? Could you try smearing the Chrysler Bldg with bananas?

I was working on a story on Data Lakes Unlocked recently, around the time of Great Midwestern comedian Bob Newhart’s passing. Thinking: the explosion of big web data created challenges that existing technology failed at, making room for the data lake, which solved some problems and overlooked others.

Initially, data lakes were perceived as ungoverned repositories where raw data was thrown with the hope of finding insights later, with about as much luck as I might have had with an arcade treasure hunt crane. But the Data Lakers refined their approach over many years to include more structure, governance, and metadata creation. This evolution led to the emergence of the data lakehouse, which combines aspects of both data warehouses and data lakes, and which is being renamed as we speak.

This Newhartian dialog came to me.

What it amounts to is walking through a chain of complexities – the challenges that confront a new version of an old technology. Something like a dialectic. The Apache Iceberg data management platform is a great new tool, but in some ways it is best looked at as an improvement on Hadoop, much as Apache Parquet was, and, before that, Apache Hive.

This is a Bob Newhart homage. I think the audio version is a good way to engage with this content. A straight, non-comedic sketch of the Iceberg features follows the transcript.

https://progressivegauge.com/wp-content/uploads/2024/09/WhatIf.m4a

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Yeah, hi Bill? Yes, this is Jack in IT. The CFO was just down here and she had some questions about the AI GPU analytics processing bill we have.
Yes. You think you have a handle on it?
And so what is the problem?
You say you need a consistent way to manage and query data and you say you need ACID compliance. Well, it sounds kind of difficult …
To deal with schema evolution?
Well, I know there are a lot of things on your plate – that's, that's quite a lot of problems you got there. Go on, I'm sorry.
And oh, but you found a solution and what’s the solution? Apache Iceberg, okay!
Bill, why do they call it Iceberg?
It’s a fitting metaphor for a vast amount of hidden data.
You know, Bill, if it costs us too much the data maybe can just stay hid.
Okay. Well, how much is saving a lot of time and money going to cost us?
You say, the table commits add up to a lot of small files. But that’s okay. Because you’re going to mitigate it with compaction and partitioning and write optimization. Okay.
And you’re going to do data modeling. This time for sure!
Bill, we are on your side. I’m coming down there with the accountants – but we have to know how much this will cost us.
You say you are working remotely from the Cape?
I guess I’ll fire up Zoom.
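For anyone who wants the straight version of what Bill is describing, here is a minimal PySpark sketch. The catalog name demo and the table db.events are invented for illustration, and the snippet assumes a Spark session already configured with the Apache Iceberg runtime and catalog. It creates a partitioned table with atomic, ACID-style commits, evolves the schema in place, and runs the small-file compaction procedure Bill promises will tame all those tiny table commits.

```python
from pyspark.sql import SparkSession

# Assumes Spark is configured with an Iceberg catalog named "demo"
# (e.g., spark.sql.catalog.demo = org.apache.iceberg.spark.SparkCatalog).
spark = SparkSession.builder.appName("iceberg-sketch").getOrCreate()

# A partitioned Iceberg table -- writes arrive as atomic table commits.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.db.events (
        id BIGINT, ts TIMESTAMP, payload STRING)
    USING iceberg
    PARTITIONED BY (days(ts))
""")

# Schema evolution: add a column without rewriting existing data files.
spark.sql("ALTER TABLE demo.db.events ADD COLUMN region STRING")

# Compaction: rewrite the many small files left by frequent commits
# into fewer, larger ones.
spark.sql("CALL demo.system.rewrite_data_files(table => 'db.events')")
```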

 

The Master – YouTube

 

Noting the passing in May of Neil Raden

July 7, 2024 By Jack Vaughan

Noting the passing in May of Neil Raden, who was one of the most unforgettable characters I ever met in my computer trade press days. His death came after a long illness, the progress of which he shared with the tech communities in which he'd long been a notable voice.

Neil led an independent consulting and analysis practice as the head of Hire Brains in Santa Fe, New Mexico. He was his own kind of 60s guy, in my experience. That is, he was goateed, a bit skeptical, but truly enthusiastic about technological advances that influenced his era.

Like many others, he came out of fields that weren’t essentially technological, but which were channels to building out new computerized methods, working from first principles. This led him on a winding journey as the mainframe gave way to the PC and the cloud, with problems to solve every step of the way. In his case, the seed soil was advanced mathematical studies and actuarial experience. He forged a home-grown view on technology, with special emphasis on databases, data warehouses and common-sense problem solving.

I got to know Neil on the data warehouse beat, which I covered for Software Magazine, Application Development Trends and SearchDataManagement.com.

When a reporter asked him a question, he'd break it down carefully and look at it from different perspectives – most of which had yet to occur to you. Each question invited yet another strategy for writing your story – that or ten other ones. With Neil, it wasn't hard to jump from IoT to tensor matrices to federated learning to differential privacy and to data lakehouses (tho the latter was not his favorite!).

With some disappointment, you'd bring the conversation back to the original topic – you got a deadline, right? But, for my money, any conversation with Neil was a master class in technology assessment. And if I wanted to talk about Telstar or the Perceptron, he was down with all that too. Following Neil's train of thought could be like riding the notes of a jazz player's solo.

In the late teens, I'd see him at Oracle Open World in San Francisco, and he'd talk about topological algebra – like chaos theory, a lodestone interest of his. A couple of years later, he's speaking with me for a story on data and IoT, and topological algebra magically comes up again! I still can't figure it out. But I try.

On his health, I have no way of knowing if Neil saw what was coming back then, but I do know he was into being here and now. I recall how Neil closed one interview, after some flights of technology fancy. He'd just shown me a picture of nature in his adopted home of New Mexico.

He said: “I'll tell you something really funny, Jack. I'm looking out my window right now. I look out the window and it is beautiful. The land rises up. And on the other side of that is the Rio Grande, and on the other side of the Rio Grande is … ” My recording stopped there.

Well, Neil Raden is sure on the other side of the Rio Grande now. I have to say thanks, I appreciated you sharing your time!

 

–30–

Neil Raden: From a Reporter’s Notebook –

On edge computing and edge AI

One thing I’m concerned about is that the edge is far too important to be controlled by a couple of mega vendors. Once that becomes proprietary it’ll be a disaster.

Why he studied math

I studied math. I studied math because I didn’t want to write papers, and look what I do now!

Question to ask of a new technology paradigm, for example, event processing

The question is ‘can an organization really change the way it operates?’ The technology may not be the hardest part.

On the data lakehouse

That one really cracks me up. Okay, so we build a data lake and now we can’t do anything with it. Oh, don’t worry we have a new thing — we’re going to call it a data lake house. And we’re going to give you some analytical functions like a data warehouse on top of it. And, you know, I’m trying not to laugh.

 

There are numerous postings on LinkedIn that readily show how many people Neil touched. You can find that feeling, and a sense of Neil's dedication, in a recent tribute by diginomica.com Editor John Reed, who made sure to cast a light on Neil's recent writings on AI Ethics, an area he was especially dedicated to covering. Raden's writings for the publication can be found here. Some earlier work is also to be found on Medium.

Neil is survived by his wife, TS (Susie) Wiley, his children Mara, Aja, Jacob, Max, and Zoe, eight grandchildren, a brother, Jonathan Raden, and a sister, Audrey Raden.

 

 

 

Cadence discusses AI-driven fit-for-purpose chips

May 22, 2024 By Jack Vaughan

Image: Phonautograph

The era of hyperscalers designing their own fit-for-purpose chips began with a series of AI chips from Google, Microsoft and AWS. Some also cite the work of Apple to forge its own destiny in custom-chip designs for smartphones.

The trend has continued, but it is not clear when or if it will spread to other ranks among Information Technology vendors.

The chips specifically built to run the big players’ finely honed AI algorithms are, for now, the sweet spot for fit-for-purpose.

The surge in interest in in-house chip designs got rolling a few years ago with Google’s Tensor Processing Unit, which is specially tuned to meet Google’s AI architecture. The search giant has followed that with the Argos chip, meant to speed YouTube video transcoding, and Axion, said to drive data center energy efficiencies.

Chip design for Google and its ilk is enabled by deep pockets of money. The big players have ready mass markets that can justify big expenses in IC design staff and resources as well.

Chief among those resources is Electronic Design Automation tooling from the likes of Cadence Design Systems. This week, Anirudh Devgan, president and CEO of Cadence, discussed the trend at the J.P. Morgan 52nd Global Technology, Media and Communications Conference in Boston.

He said the key reasons companies go the self-designed route are achieving domain-specific product differentiation, gaining control over the supply chain and production schedule, and realizing cost benefits at scale when the chip shows it will find use at sufficient volume.

Domain-specific differentiation allows companies to create a chip tailored to their unique needs, according to Devgan.

“It’s a domain specific product. It can do something a regular standard product cannot do,” he said, pointing to Tesla’s work on chips for Full Self Driving, and phone makers’ mobile computing devices that run all day on a single battery charge.

Like all companies dependent on components to power new products, the big players want to have assurance they can meet schedules, and an in-house chip design capability can help there, Devgan continued.

“You have some schedule, you want some control over that,” he told the JP Morgan conference attendees.

For the in-house design to work economically, scale of market is crucial. AI's apparently boundless opportunity works for the hyperscalers here.

In the end, their in-house designed chip may cost less, when they cut the big chip maker’s over-size role out of the cost equation.

Where does this work? As always…”it depends.”

“It depends on each application, how much it costs, but definitely in AI there is volume, and volume is growing,” Devgan said, and he went on to cite mobile phones, laptops and autos as areas where the volume will drive the trend of custom chip creation.

Devgan declined to estimate to what extent system houses will take on the task of chip design going forward. Cadence wins in either case, by selling tools to semiconductor manufacturers, hyperscaling cloud leaders and system houses.

He said: “We will leave that for the customer and the market to decide. Our job is to support both fully, and we are glad to do that.”

The trend bears watching. Years of technology progress have been based on system houses and their customers working with standard parts. Trends like in-house chip design may have the momentum to drastically rejigger today's IT vendor Who's Who, which has already been thoroughly rearranged in the wake of the cloud and the web. -jv

Observer Monitor: SolarWinds views observability on multicloud

March 31, 2024 By Jack Vaughan

Today's observability movement pits newbies fresh out of a few consulting gigs against established performance players with the “burden” of long-time customers. The incumbents have revenue, but they surf choppy waters.

The long-time performance players come to the fray with strong roots in tooling for applications, web monitoring and so on.

They continue to update their offerings in pursuit of observability. The observability software space is heavy on the metrics that can be derived from log monitoring and, increasingly, extends to cloud-native development, shift-left security, and multicloud support. This keeps vendors busy.

SolarWinds is among the performance incumbents that have embraced logs and gone cloud native. The company is building toward observability from foundations in mostly mid-market network performance and management products and services.

The company boosted its log analytics efforts markedly in 2018, with its acquisition of Loggly, which it has continued to field as an independent entity, even as it has integrated Loggly capabilities into its SolarWinds Observability platform.

Bridging the cloud gap

We spoke recently with Jeff Stewart, vice president for product management at SolarWinds. Under discussion were recent updates to SolarWinds Observability and to the company's cloud-native and on-premises observability offerings.

The moves build on the 2022 release of SolarWinds Observability, which takes on application, infrastructure, database, network, log, and user experience observability, in the form of a SaaS platform.

Recent updates include query-oriented database monitoring enhancements, as well as improvements to visual explain plan software.

The company is building out to the cloud at the same time that some users are reacting to rising cloud costs by more carefully picking what will be cloud bound. They will also judge what will be most closely monitored.

Said Stewart:

Customers are in different camps on their journey to the cloud, and migration of cloud workloads from on-premises to clouds like  Azure or AWS. We’ve seen customers that have gone full steam to the cloud, only to figure out that maybe it wasn’t the best idea to move all of their workloads, and that they should have been more selective based on security needs, budget needs or even performance needs, depending on where the application sits. And then there are people that have been very successful with their migration to the cloud. For existing customers that need visibility into different clouds, whether Azure or AWS, we’ve added capabilities in our hybrid cloud observability offering to support them on that journey. But we’ve also enabled them, as they make a decision to go more into the cloud to instantaneously start to send their data up to our SaaS offering.

What has appeared is a visibility gap on networks and security as users enter the realm of multicloud, according to Stewart, who touts SolarWinds' lineage in network monitoring here. He said:

When applications or workloads are deployed across multicloud, we see some configurations where a part of the application is talking to another part of the application in a different cloud, which becomes very cost prohibitive. So, providing visibility into how traffic is traversing multicloud environments, as well as traffic that’s going from on-premises to the cloud, is a visibility gap that we see and are working to address with our offerings.

Database visibility

Clearly, visibility of database performance is no longer an isolated, on-premises event. And the database query in the multicloud space can introduce new complexity. The runaway query, which was a feature of early client-server's dark side, is taking on a new tenor as hybrid and multicloud use grows wider.

Stewart said SolarWinds' background in database performance positions it to deal with the new distributed computing paradigm, in which a variety of databases are in place, with locations that can span the globe.

Now, even high-level executives can view this activity, as observability metrics are rolled up. In e-commerce, where efficient customer-facing applications translate directly into revenue, they find query issues particularly telling.

My Take

Despite clear interest in observability tooling, the complex demands for monitoring modern systems challenge vendors and users alike.

Deeper and wider hybrid cloud environments can cause costs to rapidly escalate, requiring that IT users carefully pick and choose what they monitor.

Like others in the market, SolarWinds faces the challenge of keeping best-of-breed tools fresh for compartmentalized networking and database users, while building out an ever-broader platform of capabilities intended to run across broad multiclouds. – Jack Vaughan

 
