Progressive Gauge

Media and Research


Jack Vaughan

Observer Monitor: SolarWinds views observability on multicloud

March 31, 2024 By Jack Vaughan

Today’s observability movement pits newbies fresh out of a few consulting gigs against established performance players carrying the “burden” of long-time customers. The incumbents have revenue, but they surf choppy waters.

The long-time performance players come to the fray with strong roots in tooling for applications, web monitoring and so on.

They continue to update their offerings in pursuit of observability. The observability software space is heavy on metrics derived from log monitoring and, increasingly, has stretched to take in cloud-native development, shift-left security, and multicloud support. This keeps vendors busy.

SolarWinds is among the performance incumbents that have embraced logs and gone cloud native. The company is building toward observability from foundations in mostly mid-market network performance and management products and services.

The company boosted its log analytics efforts markedly in 2018, with its acquisition of Loggly, which it has continued to field as an independent entity, even as it has integrated Loggly capabilities into its SolarWinds Observability platform.

Bridging the cloud gap

We spoke recently with Jeff Stewart, vice president for product management at SolarWinds. Under discussion were recent updates to SolarWinds Observability, which spans the company’s cloud-native and on-premises observability offerings.

The moves build on the 2022 release of SolarWinds Observability, which takes on application, infrastructure, database, network, log, and user experience observability, in the form of a SaaS platform.

Recent updates include query-oriented database monitoring enhancements, as well as improvements to visual explain plan software.

The company is building out to the cloud at the same time that some users, reacting to rising cloud costs, are more carefully picking what will be cloud bound and what will be most closely monitored.

Said Stewart:

Customers are in different camps on their journey to the cloud, and migration of cloud workloads from on-premises to clouds like  Azure or AWS. We’ve seen customers that have gone full steam to the cloud, only to figure out that maybe it wasn’t the best idea to move all of their workloads, and that they should have been more selective based on security needs, budget needs or even performance needs, depending on where the application sits. And then there are people that have been very successful with their migration to the cloud. For existing customers that need visibility into different clouds, whether Azure or AWS, we’ve added capabilities in our hybrid cloud observability offering to support them on that journey. But we’ve also enabled them, as they make a decision to go more into the cloud to instantaneously start to send their data up to our SaaS offering.

What has appeared is a visibility gap in networks and security as users enter the realm of multicloud, according to Stewart, who touts SolarWinds’ lineage in network monitoring here. He said:

When applications or workloads are deployed across multicloud, we see some configurations where a part of the application is talking to another part of the application in a different cloud, which becomes very cost prohibitive. So, providing visibility into how traffic is traversing multicloud environments, as well as traffic that’s going from on-premises to the cloud, is a visibility gap that we see and are working to address with our offerings.

Database visibility

Clearly, visibility into database performance is no longer an isolated, on-premises affair, and the database query in the multicloud space can introduce new complexity. The runaway query, a feature of early client-server computing’s dark side, is taking on a new tenor as hybrid and multicloud use grows wider.

Stewart said SolarWinds’ background in database performance positions it to deal with the new distributed computing paradigm, in which a variety of databases are in place, with locations that can span the globe.

Now, even high-level executives can view this activity, as observability metrics are rolled up. In e-commerce, where efficient customer-facing applications translate directly into revenue, query issues are particularly telling.
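
To make the runaway-query idea concrete, here is a minimal sketch of the kind of check an engineer might run by hand against a PostgreSQL instance, using EXPLAIN ANALYZE to flag a plan that scans far more data than it should. The connection string, table and thresholds are hypothetical, and the snippet stands in for, rather than reproduces, what a commercial database observability product does.

# Minimal sketch: flag a potentially runaway query by inspecting its
# execution plan with EXPLAIN ANALYZE (PostgreSQL). The connection
# details, table and query below are hypothetical placeholders.
import json
import psycopg2

QUERY = "SELECT * FROM orders WHERE customer_email LIKE '%@example.com'"

conn = psycopg2.connect("dbname=shop host=db.example.internal user=monitor")
with conn, conn.cursor() as cur:
    # FORMAT JSON makes the plan easy to inspect programmatically.
    cur.execute(f"EXPLAIN (ANALYZE, FORMAT JSON) {QUERY}")
    raw = cur.fetchone()[0]
    doc = raw if isinstance(raw, list) else json.loads(raw)
    plan = doc[0]["Plan"]

    rows = plan.get("Actual Rows", 0)
    runtime_ms = plan.get("Actual Total Time", 0.0)

    # Crude heuristic: a sequential scan or a long runtime is the classic
    # signature of a query that needs an index or a rewrite.
    if plan.get("Node Type") == "Seq Scan" or runtime_ms > 1000:
        print(f"Suspect query: {plan['Node Type']}, {rows} rows, {runtime_ms:.1f} ms")
    else:
        print("Plan looks reasonable.")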

My Take

Despite clear interest in observability tooling, the complex demands for monitoring modern systems challenge vendors and users alike.

Deeper and wider hybrid cloud environments can cause costs to rapidly escalate, requiring that IT users carefully pick and choose what they monitor.

Like others in the market, SolarWinds faces the challenge of keeping best-of-breed tools fresh for compartmentalized networking and database users, while building out an ever-broader platform of capabilities intended to run across broad multiclouds. – Jack Vaughan

 

Multicloud: Evolving technology paradigms create rising complexity

February 13, 2024 By Jack Vaughan

Late last year I had the opportunity to cover multicloud issues for Muse at SDxCentral, and it was a bit of an eye opener. We were writing about cloud over 30 years ago, and utility and grid computing before that. But a visit to cloud computing today finds obstacles yet to be confronted.

The multicloud milieu today suggested there was still very little in the way of deep integration between clouds, all these years later. It seems another case in the evolution of computing where every new solution holds the seeds of its own obsolescence.

Read Evolving technology paradigms create rising complexity on Medium

 

PASCAL language originator Niklaus Wirth, 89

January 12, 2024 By Jack Vaughan


Niklaus Wirth, whose 1970 creation of the PASCAL programming language dramatically influenced the development of modern software engineering techniques, died January 1. He was 89.

A long-time professor of Informatics at ETH Zurich, Wirth produced PASCAL primarily as a teaching tool, but its innovative language constructs helped set the stage for the C, C++ and Java languages that flourished in the late 20th Century and still have wide use today. Wirth was the winner of the 1984 ACM Turing Award, among other honors.

Wirth named the language after the 17th Century philosopher and mathematician Blaise Pascal, who is often credited as the inventor of the first mechanical calculator. The PASCAL language was a bridge of sorts, as it attempted to span the language styles of business computing, then dominated by COBOL, and scientific computing, as represented by FORTRAN.

The field of software engineering was still in its early days when Wirth began his pioneering work as a graduate student and then as assistant professor at Stanford University. At the time, hardware itself was still somewhat nascent, evolving from numerical calculators toward general-purpose computers.

It was becoming apparent at the time that a language focused solely on numerical processing would encounter obstacles as computing evolved.

It was hoped, Wirth wrote in a paper, that “the undesirable canyon between scientific and commercial programming … could be bridged.” [This and other materials cited in this story appeared in “Recollections about the Development of PASCAL,” published as part of “History of Programming Languages,” in 1996 by Addison-Wesley.]

Wirth also contributed to development of the ALGOL and MODULA languages, often bringing special focus to development of compilers that translated source software into running machine code. His book, “Algorithms + Data Structures = Programs” is often cited as a keystone text for those interested in the fundamentals of software design.

Education was the primary goal of work on PASCAL, which Wirth undertook after a disappointing experience on a somewhat squabbling committee of experts looking to standardize a new version of ALGOL 60.

In his words, Wirth decided “to pursue my original goal of designing a general-purpose language without the heavy constraints imposed by the necessity of finding a consensus among two dozen experts about each and every little detail.”

The work of Wirth and some co-conspirators built on ALGOL, but also drew implicitly on emerging thinking on structured approaches to software development, such as those outlined by E.W. Dijkstra. Certainly, the early ‘bubble gum and baling wire’ days of computer software were receding as Wirth began his work.

“Structured programming and stepwise refinement marked the beginnings of a methodology of programming, and became a cornerstone in helping program design become a subject of intellectual respectability,” according to Wirth.

The fact that PASCAL was designed as a teaching tool is important – it was constructed in a way that allowed new programmers to learn sound method, while applying their own enhancements.

“Lacking adequate tools, build your own,” he wrote.

And, while innovation was a goal, pragmatism for Wirth was also a high-order requirement.

Wirth’s work on PASCAL started about the time he headed from Stanford to ETH Zurich, with definition of the language achieved in 1970. It was formally announced in “Communications of the ACM” in 1971. Though never the most popular language in terms of numbers, it was very influential, contributing to a movement that emphasized general applications use, strong typing and human-readable code.

PASCAL gained great currency in the early days of personal computers in the United States. A version that became known as Turbo Pascal was created by Anders Hejlsberg, and was licensed and sold by Borland International beginning in 1983. On the wings of Borland ads in Byte magazine, Turbo Pascal became ubiquitous among desktop programmer communities.

The work of pioneers like Wirth gains special resonance today, as a Generative AI paradigm appears poised to automate larger portions of the programming endeavor. Automated code generation is by no means completely new, and the surprises and ‘gotchas’ of the pioneers’ era will no doubt be revisited as the understanding of effective software development processes continues to evolve. Wirth’s words and work merit attention in this regard, and with regard as well to a fuller understanding of software evolution. – J.V.

The Progressive Gauge obituary continues here with some additional material.

My Take: I came to cover software for embedded systems, electronic design automation and then business applications, at a time when structured programming and defined methodologies were in full ascendance.
Methodologies narrowed over many years into a generally accepted path to software modeling, but they tended to flower wildly at first. They started with some philosophical underpinnings, but could just as easily be characterized as something that had worked for someone who thought it could become a product. The downside was that each new methodology might suggest your problem was the methodology you were already using, which was not always the case. The joke of a bench developer I shared a cube with for a while: “What’s a methodology?” Answer: “A method that went to college.”

The methodology generally carried the name of the inventor, not a historical figure as with PASCAL.

The complexity of the deepest levels of the new operating system, compiler, design and language approaches was daunting for this writer.

My work in the computer trade press afforded me the opportunity to meet Unix co-creator Dennis Ritchie of Bell Labs (at the time he was rolling out the Plan 9 OS) and Java author James Gosling (then of Sun Microsystems). I also met two of the three “UML Amigos”: Grady Booch, then of Rational Software, and Ivar Jacobson, head of Ivar Jacobson International. Interviewing Ivar on one occasion was particularly memorable: rather than speak with me directly, he asked that I interview a software avatar he had just created, an animated figure that spoke for him and represented his thinking on technology issues.

These assignments surely were ‘a tall order for a short guy!’

I offer these notes here in my role as a generalist. While computer history doesn’t repeat, it rhymes. This history is always worth a revisit, especially as clearly new paradigms fly out of coders’ 2024 work cubes. It has been interesting for me to look at the origination of PASCAL, and to learn about Niklaus Wirth, the individual who brought PASCAL into the world.

Pascaline

Links
https://cacm.acm.org/magazines/2021/3/250705-50-years-of-pascal/fulltext
https://www.standardpascaline.org/PascalP.html
https://amturing.acm.org/award_winners/wirth_1025774.cfm
http://PASCAL.hansotten.com/niklaus-wirth/recollections-about-the-development-of-PASCAL/
https://blogs.embarcadero.com/50-years-of-pascal-and-delphi-is-in-power/

Who took my soapbox? A note on media and AI

January 7, 2024 By Jack Vaughan

As 2023 came to its end, a New York Times lawsuit affirmed a general impression that Generative AI and ChatGPT would find some friction on the way to a well-hyped, lead-pipe-cinch and especially glorious future.

For those who have used this software, a recent improvement on existing machine learning interaction, it is not surprising. OpenAI’s ChatGPT, backed by Microsoft, and its main competitor, Google Bard, are breakthroughs. They provide a different level of access to the world’s knowledge.

Instead of pointing the searcher to brief fair-use citations of Web stories à la Google Search, ChatGPT and Bard provide somewhat thoughtful summaries of issues — ones that might serve a junior or middle-level manager quite well when it’s time for yearly performance evaluations.

The new paradigm for Web activity threatens beleaguered publishers. They are not on a roll. The Fourth Estate is now painted as an unwanted gatekeeper of opinion. Publishers that saw an advertising market pulverized by Google Search results now see an AI wunderkind about to drain publishing’s last pennies.

An anticipated slew of AI suits is now spearheaded by the Times, which filed a lawsuit naming OpenAI and Microsoft for copyright infringement. Some go tsk, tsk. Wall Street oddsmakers that enjoyed an AI stock bump in 2023 were quickest to dismiss the Times’ chances versus ChatGPT. OpenAI has said it is in discussions with some publishers, and will work to achieve a beneficial arrangement.

Among the financial community, concern spreads that Generative AI’s magical abilities could be dampened. That is summed up by Danny Cevallos, MSNBC legal analyst, who worries about the impossible obligation to mechanize copyright royalties for AI citations across the globe.

The concern comes despite the multidecade success of Silicon Valley’s Altruistic Surveillance movement, which, web users know, can find you wherever you are. Still, Cevallos highlights the difficulty of, for example, finding and paying a copyright owner in a log cabin somewhere in Alaska.

“That would mean the end of future AI,” he said on CNBC’s Power Lunch. “It could be argued that the Times has to lose for progress to survive.”

We can anticipate that glory-bound Generative AI will find some rocks in its pathway in 2024 — but most will be in the form of stubborn, familiar IT implementation challenges. In the meantime, people that make a living in media will have to work to promote their interests, as other commercial interests chip away under the cover of AI progress. – Jack Vaughan

— 30 —

Is AWS a diminishing AI laggard – or is it right about on time?

December 12, 2023 By Jack Vaughan

Harvard Stadium

AWS is lagging and racing to catch up in Generative AI and Large Language Models (LLMs). Or so an industry meme holds. When a smattering of new COVID isolations ends and the dust settles in the weeks after Amazon’s re:Invent 2023 conference in Las Vegas, that notion may be due for a revision.

Like all its competitors, AWS is working to put Generative AI technology in place – that means latching it on to other application offerings and adapting new tools and schemes for developers.

Among the challenges that now face teams creating Generative AI applications is the handling of vector embeddings. Producing these embeddings is an important step in preparing data for consumption by the Large Language Models (LLMs) that betoken a new era of chatbots. Perhaps as importantly, vector embeddings are also useful in slightly less futuristic applications, such as search, recommendation engines and personalization engines.

When Wall Street wags ask whether AWS is a diminishing AI laggard or peaking at just the right time, they probably don’t devote too much thought to the types of vectors machine learning engines are now churning. But building such “infrastructure” is important on the path to working AI.

AWS put vector techniques front and center in AI and data announcements at re:Invent 2023. A centerpiece of this is Amazon Titan Multimodal Embeddings, just out. The software works to convert images and short text into numerical representations that generative learning models can use. These representations are used to unpack the semantic meaning of data, and to uncover important relations between data points.
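
As a rough illustration of what converting an image and a short caption into one of those numerical representations looks like in practice, the sketch below calls the model through the Bedrock runtime with boto3. The model ID and the request and response field names shown here are assumptions recalled from Bedrock documentation rather than details given in this article, so treat them as placeholders to verify.

# Hedged sketch: request an embedding for an image plus a short caption
# from Amazon Titan Multimodal Embeddings via the Bedrock runtime.
# The model ID and the request/response field names are assumptions to
# check against current Bedrock documentation; the image file is a
# hypothetical placeholder.
import base64
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

with open("product-photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

request = {
    "inputText": "red trail-running shoe, size 10",
    "inputImage": image_b64,
}

response = bedrock.invoke_model(
    modelId="amazon.titan-embed-image-v1",   # assumed model identifier
    body=json.dumps(request),
    contentType="application/json",
    accept="application/json",
)

payload = json.loads(response["body"].read())
embedding = payload["embedding"]             # a list of floats
print(f"{len(embedding)}-dimensional embedding; first values: {embedding[:4]}")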

Putting new-gen AI chatbots aside for the moment, it’s worth mentioning that recommendation and personalization tasks are likely beneficiaries of vector and AI progress. Once the province of Magnificent 7 Class vendors, these application types have become part of more and more organizations’ systems portfolios.

As you may imagine, they add considerable complexity to a developer’s typical day. Here, AWS has set a course to simplify such work for customers.

Before getting to that, a few words about these kinds of embeddings: vector embeddings are numerical representations that LLMs and related embedding models create from words, phrases or blocks of text. The vectors lend themselves to new styles of machine learning that seek to find meaning in data points.
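
In code terms, finding meaning in data points usually reduces to nearest-neighbor search over those vectors. The minimal sketch below assumes an embedding function already exists (the embed() stand-in here is a hypothetical placeholder for a real model call, such as the Titan example above) and ranks a handful of documents by cosine similarity to a query.

# Minimal sketch of semantic search over vector embeddings using cosine
# similarity. embed() is a hypothetical stand-in for a real embedding
# model; its pseudo-random output will not capture real semantics, it
# just lets the sketch run end to end.
import numpy as np

def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=256)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

documents = [
    "How to reset a customer password",
    "Quarterly revenue by product line",
    "Troubleshooting slow database queries",
]
doc_vectors = [(d, embed(d)) for d in documents]

query_vec = embed("why is my SQL query slow?")

# Rank documents by similarity of their vectors to the query vector.
for text, vec in sorted(doc_vectors, key=lambda p: cosine(query_vec, p[1]), reverse=True):
    print(f"{cosine(query_vec, vec):+.3f}  {text}")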

This is useful, but development managers need to find skilled-up programmers and architects to make this leap forward. That is some of the feedback AWS says it’s getting from customers. Enter Swami Sivasubramanian.

Sivasubramanian is vice president of data and AI at AWS. At re:Invent he told attendees: “Our customers use vectors for GenAI applications. They told us they want to use them in their existing databases so that they can eliminate the learning curve in terms of picking up a new programming paradigm, new tools, APIs and SDKs. Importantly, when your vectors and business data are stored in the same place, your applications will run faster and there is no data sync or data movement to worry about.”

Do you want to bring in a vector database to handle this work – adding to your relational databases, document databases, graph databases, and so on? AWS, which has used re:Invent after re:Invent to spotlight such new database offerings, is shifting here to promote “run your vectors in your existing database” rather than bringing in another newfangled database.

So, central to AWS’s take is a push to provide vector data handling within existing Amazon databases rather than in standalone vector databases, although Amazon supports third-party vector database integration as well.

Among many Amazon initiatives Sivasubramanian discussed at re:Invent 2023 were vector support for DocumentDB, DynamoDB, Aurora PostgreSQL, Amazon RDS for PostgreSQL, MemoryDB for Redis, Neptune Analytics, Amazon OpenSearch Serverless, and Amazon Bedrock.
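
For a sense of what “run your vectors in your existing database” can look like, here is a brief sketch against PostgreSQL using the open-source pgvector extension, which is typically the mechanism behind vector support in Aurora PostgreSQL and Amazon RDS for PostgreSQL; naming it here is my gloss rather than a detail from the article, and the connection string, table and toy vectors are illustrative.

# Sketch: storing and querying embeddings inside an existing PostgreSQL
# database with the pgvector extension, rather than adding a separate
# vector database. Connection details, table name and the tiny
# 3-dimensional vectors are illustrative placeholders.
import psycopg2

conn = psycopg2.connect("dbname=shop host=db.example.internal user=app")
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS product_embeddings (
            product_id bigint PRIMARY KEY,
            embedding  vector(3)   -- real models use hundreds of dimensions
        );
    """)
    cur.execute("""
        INSERT INTO product_embeddings (product_id, embedding) VALUES
            (1, '[0.1, 0.9, 0.0]'),
            (2, '[0.8, 0.1, 0.1]')
        ON CONFLICT (product_id) DO NOTHING;
    """)
    # Nearest-neighbor search: '<->' is pgvector's Euclidean distance operator.
    cur.execute("""
        SELECT product_id, embedding <-> '[0.1, 0.8, 0.1]' AS distance
        FROM product_embeddings
        ORDER BY distance
        LIMIT 5;
    """)
    for product_id, distance in cur.fetchall():
        print(product_id, distance)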

The moment sets up a classic soup-to-nuts vendor vs. best-of-breed vendor paradigm. Among the best-of-breed database upstarts are Milvus, Pinecone, Zilliz and others.

Meanwhile, vector support has sounded like a drumroll for database makers of every ilk of late. Here is a small sampling. In September, IBM said it planned to integrate vector database capability into watsonx.data for use in retrieval augmented generation (RAG) use cases. Also in September, Oracle disclosed plans to add semantic search capabilities using AI vectors to Oracle Database 23c. On the heels of re:Invent, NoSQL stalwart MongoDB announced GA for MongoDB Atlas Vector Search. And, prior to re:Invent, Microsoft went public with vector database add-ons for Azure container apps.

Is AWS a diminishing AI laggard – or is it right about on time? No surprise here. The answer is somewhere in between the two extremes, just as it is somewhere between the poles on the soup-to-nuts-to-best-of-breed continuum. It will be interesting to see how the vector database market evolves. – Jack Vaughan

