Progressive Gauge

Media and Research

Jack Vaughan

Observer Monitor Dispatch: Dynatrace, StarTree, More

May 10, 2024 By Jack Vaughan

Kubernetes defense in focus as Dynatrace rolls with Runecast tech

Kubernetes’ quick rise to prominence in cloud computing may have left a few holes in applications’ defenses. That is something Dynatrace looks to address with Kubernetes Security Posture Management (KSPM) software. It’s said to employ observability data to enable quick response and mitigation of risks.

Dynatrace has a lineage when it comes to AI, originally arising out of the movement that placed AI agents on network nodes in order to track activity. KSPM employs Dynatrace’s Davis hypermodal AI, which combines predictive, causal, and generative AI methods. The company said KSPM, thus accoutered, can ably convey immediate context for decision making as threats occur.

The company said KSPM builds on the integration of Runecast cloud-native technology into the Dynatrace platform, following its acquisition of Runecast earlier this year. Runecast technology supports continuous Kubernetes vulnerability scans, security compliance checks based on best practices, and remediation recommendations.
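Dynatrace has not published KSPM’s internals here, but the general shape of a posture check is easy to illustrate. The sketch below uses the official Kubernetes Python client to flag privileged containers, one of the common misconfigurations such scans look for. It illustrates the category of tool, not Dynatrace’s code.

```python
# Illustrative posture check -- not Dynatrace KSPM code. Flags privileged
# containers, which most Kubernetes hardening guides treat as a finding.
from kubernetes import client, config

def find_privileged_containers():
    config.load_kube_config()  # authenticates from the local kubeconfig
    v1 = client.CoreV1Api()
    findings = []
    for pod in v1.list_pod_for_all_namespaces().items:
        for c in pod.spec.containers:
            sc = c.security_context
            if sc is not None and sc.privileged:
                findings.append((pod.metadata.namespace, pod.metadata.name, c.name))
    return findings

if __name__ == "__main__":
    for ns, pod, container in find_privileged_containers():
        print(f"privileged container: {ns}/{pod}/{container}")
```

A real KSPM product runs checks of this kind continuously, scores them against compliance frameworks, and ties findings back to runtime observability data.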

AI continues to find renewed influence in the observability space. Updates are coming quickly; this latest release follows Dynatrace’s January roll-out of AI Observability extensions for large language models (LLMs).

StarTree Cloud extends observability

With a new release, StarTree Cloud gains observability and anomaly detection capabilities, as well as vector search for its underlying Apache Pinot database engine.

StarTree offers these services through a cloud-based database-as-a-service that specifically targets analytics jobs. Like Confluent, StarTree arose out of LinkedIn’s open-source activity during the 2010s.

While Confluent has focused most effectively on data ingestion, StarTree has concentrated on data analytics, based on an implementation of the Pinot OLAP distributed columnar database.

The StarTree product suite is said to serve user-facing applications where a broad user base can query real-time data. The company noted DoorDash as a customer in this regard. The company said it partners with cloud and big data players such as AWS, Google Cloud, Microsoft, Confluent, Databricks and others in customer engagements.
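To make the user-facing analytics idea concrete, here is a minimal sketch of a query against a Pinot broker, using the community pinotdb Python driver. The broker endpoint, table, and column names are placeholders, not details from StarTree’s announcement.

```python
# Minimal sketch of a user-facing, real-time analytics query against
# Apache Pinot via the community pinotdb driver. Endpoint, table, and
# column names are placeholders.
from pinotdb import connect

conn = connect(host="localhost", port=8099, path="/query/sql", scheme="http")
cur = conn.cursor()

# A typical real-time slice: order counts per city over the last hour.
cur.execute("""
    SELECT city, COUNT(*) AS orders
    FROM orders
    WHERE createdAt > ago('PT1H')
    GROUP BY city
    ORDER BY orders DESC
    LIMIT 10
""")
for row in cur:
    print(row)
```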

The observability functionality new to the platform should allow StarTree users, whether developers or system administrators, to troubleshoot issues that arise within their cloud-based applications.

StarTree announced the general availability of StarTree ThirdEye software, offering multidimensional anomaly detection. As well, a write API supporting real-time sync for ELT pipelines such as Debezium, Fivetran, or dbt is now in “private preview,” and integrations with visualization platforms, including Tableau and Grafana, are available now.
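The announcement does not detail ThirdEye’s detection methods, which are multidimensional and considerably more involved than any one-liner. As a baseline illustration of what metric anomaly detection does, here is a simple rolling z-score sketch:

```python
# Baseline illustration of metric anomaly detection -- not ThirdEye's
# algorithm. A point is flagged when it falls more than `threshold`
# standard deviations from the mean of the trailing window.
import statistics

def zscore_anomalies(series, window=20, threshold=3.0):
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mean = statistics.fmean(hist)
        stdev = statistics.pstdev(hist)
        if stdev > 0 and abs(series[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# A steady signal with one spike at index 25.
data = [100.0] * 20 + [101, 99, 100, 102, 98, 180, 100, 99]
print(zscore_anomalies(data))  # -> [25]
```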

Like others, StarTree is bringing nearest-neighbor vector search capabilities to its users, as well as users of the open-source Apache Pinot project.
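Pinot’s SQL surface for vector search varies by release, so rather than guess at syntax, here is a brute-force NumPy sketch of what nearest-neighbor search computes. Production engines use approximate indexes (HNSW and the like) to avoid scanning every vector; this version is for intuition only.

```python
# What nearest-neighbor vector search computes, shown brute force.
# Real engines use approximate indexes rather than a full scan.
import numpy as np

def top_k_cosine(query, vectors, k=3):
    # Normalize so that dot products equal cosine similarity.
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = v @ q
    idx = np.argsort(sims)[::-1][:k]
    return list(zip(idx.tolist(), sims[idx].tolist()))

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 64))  # stand-in document embeddings
query = rng.normal(size=64)
print(top_k_cosine(query, embeddings))    # [(row, similarity), ...]
```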

Besides DoorDash, StarTree cites customers such as Citi, Stripe, Nubank, and Zomato. For its part, in March, Citi announced a strategic investment in StarTree.

At the time, Katya Chupryna, Director of Markets Strategic Investments at Citi, singled out StarTree and the underlying Pinot engine for speed of data ingestion.

“User-facing analytics have seen profound growth in recent years across all industries, accelerating the need for enterprise-ready, real-time data solutions. StarTree and Pinot’s speed of ingestion is unmatched on complex queries over rapidly changing data,” Chupryna said.

Also on tap

Cisco announced a virtual appliance for its AppDynamics On-Premises application observability offering. It’s aimed at users looking for customer-managed observability for on-premises deployments or cloud-based deployments where the customer retains control of all data and associated operations. AI-Powered Detection and Remediation with Cisco’s Cognition Engine is said to speed anomaly detection and root cause analysis … Riverbed announced Riverbed Unified Agent, which allows IT to add SaaS-delivered visibility modules – for example, for end user experience and network monitoring – without adding more agents. Riverbed’s Platform initially launches with approximately 35 pre-built application integrations for third parties including ServiceNow, Dynatrace, AppDynamics and Datadog. A Topology Viewer generates dynamic mapping of connected devices, while Riverbed NPM+, using the Riverbed Unified Agent, is said to overcome network blind spots created by remote work, public cloud, and encrypted architectures such as Zero Trust environments, extending packet visibility to collect decrypted data at every user and server endpoint, including gaps such as encrypted tunnels in Zero Trust architectures. [PG]

Net neutrality is on the books again

April 28, 2024 By Jack Vaughan

[Update: In August 2024, the Sixth Circuit Court of Appeals issued a temporary stay on the implementation of these rules.]

[April 26, 2024] – The stars and planets may seem in their usual places this morning, but change came overnight, as Net Neutrality returned to America. If you blinked, you missed it, and it’s not guaranteed to stick.

Net Neutrality is again on the books, as the Federal Communications Commission on Thursday voted to restore rules requiring telecommunications companies to treat all apps and sites equally.

“It is incredibly important that there be ongoing oversight for the most important network of the 21st century. It’s that simple,” former FCC chief Tom Wheeler said on the Politico Tech podcast on the eve of that vote.

Voted in when Wheeler led the FCC during the Obama era, and voted out during the Trump administration, net neutrality is a technology hot potato in this partisan political climate. Like other Trump-era edicts US President Joe Biden has reversed, this one will be challenged via litigation and legislation going forward.

But a return to the Internet as a regulated utility is important. That means no throttling, no superfast lanes, no favors for golf buddies. As the Three Stooges would say: “None of this and none of that.”

That look at blocking is a look back, and not the point today, Wheeler says.

“To define it in terms of no blocking and no throttling. I mean, that is so much yesterday’s issue. The broader question here is, will there be an ongoing expectation for all of the activities of this really important 21st century network? And will there be flexibility on the part of the FCC to deal with those?”

Wheeler’s caution comes as regulation of any kind is widely derided, even as Artificial Intelligence doomsaying predicts an internet more volatile than the often rocky network it is today.

Wheeler, a former cable industry executive who now teaches at Harvard’s Kennedy School of Government, outlines the issues in “Techlash: Who Makes the Rules in the Digital Gilded Age?” – a recent book that looks for guides in the history of communications.

The premise of the book is that it’s valuable to look to the 19th century Gilded Age as a clue to what’s going on now in communications.

In the 19th century there were great waves of industrialization as steam, trains and electricity gained traction; workers were exploited, farms gave way to cities, and Robber Barons leveraged it all.

Today, we all agree, there have been great waves of technology changing the way we live, and how that is communicated. What we came to call mass communications evolved out of the telegraph, the telephone, radio and television. These all took time to settle….

This is how Wheeler poses it in Techlash:

“While the economic model is still about maximizing revenue, it is no longer about the need for balance and veracity. Like the early ideological media, the new media profits by playing to users’ preferences and prejudices. The difference is that software algorithms organize the information to deliver what each user likes in order to hold the user’s attention to see as many revenue generating ads as possible.”

Technology companies make money “through the most sophisticated and secret content curation ever devised,” Wheeler writes. Let’s take this to include Big Data, algorithms, collaborative filtering, portals, recommendation engines, personalization engines and a parade of machine learning models.

With a next age of AI bubbling in a slew of Large Language Models, that secret curation model has already remade media and society. Having an FCC that works for a neutral network is important as that next age looms. – J Vaughan

Observer Monitor: SolarWinds views observability on multicloud

March 31, 2024 By Jack Vaughan

Today’s observability movement pits newbies fresh out of a few consulting gigs against established performance players with the “burden” of long-time customers. The incumbents have revenue, but they surf choppy waters.

The long-time performance players come to the fray with strong roots in tooling for applications, web monitoring and so on.

They continue to update their offerings in pursuit of observability. The observability software space is heavy on metrics derived from log monitoring and, increasingly, has expanded to include cloud-native development, shift-left security, and multicloud support. This keeps vendors busy.
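The move from logs to metrics is worth making concrete. The sketch below parses timestamped log lines and rolls them up into a per-minute error count, which is the basic transformation underneath log-derived metrics; the log format is invented for illustration.

```python
# The basic move from logs to metrics: parse timestamped log lines,
# then roll them up into a per-minute error count. The log format
# here is invented for illustration.
from collections import Counter
from datetime import datetime

LINES = [
    "2024-03-31T12:00:04Z ERROR payment timeout",
    "2024-03-31T12:00:41Z INFO request ok",
    "2024-03-31T12:01:13Z ERROR db connection reset",
    "2024-03-31T12:01:59Z ERROR db connection reset",
]

def errors_per_minute(lines):
    counts = Counter()
    for line in lines:
        ts, level, _message = line.split(" ", 2)
        if level == "ERROR":
            minute = datetime.fromisoformat(ts.replace("Z", "+00:00"))
            counts[minute.strftime("%H:%M")] += 1
    return counts

print(errors_per_minute(LINES))  # Counter({'12:01': 2, '12:00': 1})
```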

SolarWinds is among the performance incumbents that have embraced logs and gone cloud native. The company is building toward observability from foundations in mostly mid-market network performance and management products and services.

The company boosted its log analytics efforts markedly in 2018, with its acquisition of Loggly, which it has continued to field as an independent entity, even as it has integrated Loggly capabilities into its SolarWinds Observability platform.

Bridging the cloud gap

We spoke recently with Jeff Stewart, vice president for product management at SolarWinds. Under discussion were recent updates to the company’s cloud-native and on-premises observability offerings.

The moves build on the 2022 release of SolarWinds Observability, which takes on application, infrastructure, database, network, log, and user experience observability, in the form of a SaaS platform.

Recent updates include query-oriented database monitoring enhancements, as well as improvements to visual explain-plan software.
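SolarWinds’ visual explain-plan tooling is proprietary, but the raw material underneath is the database’s own plan output. A minimal sketch against PostgreSQL via psycopg2 follows; the connection string and query are placeholders.

```python
# The raw material behind visual explain-plan tools: the database's
# own plan output. PostgreSQL via psycopg2; DSN and query are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=shop user=monitor")  # placeholder DSN
with conn.cursor() as cur:
    cur.execute("EXPLAIN (FORMAT JSON) SELECT * FROM orders WHERE total > 100")
    plan = cur.fetchone()[0][0]["Plan"]  # psycopg2 decodes the JSON column
    # Surface the fields a query-monitoring view typically highlights.
    print(plan["Node Type"], plan["Total Cost"], plan.get("Relation Name"))
conn.close()
```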

The company is building out to the cloud at the same time that some users are reacting to rising cloud costs by more carefully picking what will be cloud bound. They will also judge what will be most closely monitored.

Said Stewart:

Customers are in different camps on their journey to the cloud, and migration of cloud workloads from on-premises to clouds like Azure or AWS. We’ve seen customers that have gone full steam to the cloud, only to figure out that maybe it wasn’t the best idea to move all of their workloads, and that they should have been more selective based on security needs, budget needs or even performance needs, depending on where the application sits. And then there are people that have been very successful with their migration to the cloud. For existing customers that need visibility into different clouds, whether Azure or AWS, we’ve added capabilities in our hybrid cloud observability offering to support them on that journey. But we’ve also enabled them, as they make a decision to go more into the cloud to instantaneously start to send their data up to our SaaS offering.

What has appeared is a visibility gap around networks and security as users enter the realm of multicloud, according to Stewart, who touts SolarWinds’ lineage in network monitoring here. He said:

When applications or workloads are deployed across multicloud, we see some configurations where a part of the application is talking to another part of the application in a different cloud, which becomes very cost prohibitive. So, providing visibility into how traffic is traversing multicloud environments, as well as traffic that’s going from on-premises to the cloud, is a visibility gap that we see and are working to address with our offerings.

Database visibility

Clearly, visibility of database performance is no longer an isolated, on-premises affair. And the database query in the multicloud space can introduce new complexity. The runaway query, a feature of early client-server’s dark side, is taking on a new tenor as hybrid and multicloud use grows wider.
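The classic first probe for a runaway query is to ask the database which statements have been running too long. Here is a sketch against PostgreSQL’s pg_stat_activity view; the connection string and the five-minute threshold are placeholders.

```python
# First probe for runaway queries: ask the database which statements
# have run too long. PostgreSQL's pg_stat_activity via psycopg2;
# the DSN and threshold are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=shop user=monitor")  # placeholder DSN
with conn.cursor() as cur:
    cur.execute("""
        SELECT pid, now() - query_start AS runtime, left(query, 80)
        FROM pg_stat_activity
        WHERE state = 'active'
          AND now() - query_start > interval '5 minutes'
        ORDER BY runtime DESC
    """)
    for pid, runtime, query in cur.fetchall():
        print(f"pid {pid} running for {runtime}: {query}")
conn.close()
```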

Stewart said SolarWinds’ background in database performance positions it to deal with the new distributed computing paradigm, where a variety of databases are in place, in locations that can span the globe.

Now, even high-level executives can view this activity, as observability metrics are rolled up. In e-commerce, where efficient customer-facing applications directly translate into revenue, they find query issues particularly telling.

My Take

Despite clear interest in observability tooling, the complex demands for monitoring modern systems challenge vendors and users alike.

Deeper and wider hybrid cloud environments can cause costs to rapidly escalate, requiring that IT users carefully pick and choose what they monitor.

Like others in the market, SolarWinds faces the challenge of keeping best-of-breed tools fresh for compartmentalized networking and database users, while building out an ever-broader platform of capabilities intended to run across multiclouds. – Jack Vaughan

Multicloud: Evolving technology paradigms create rising complexity

February 13, 2024 By Jack Vaughan

Late last year I had the opportunity to cover Multicloud issues for Muse at SDxCentral, and it was a bit of an eye opener in this regard. We were writing about cloud over 30 years ago, and utility and grid computing before that. But a visit to cloud computing today finds obstacles yet to confront.

The Multicloud milieu today suggests there is still very little in the way of deep integration between clouds, all these years later. It seems another case in the evolution of computing where every new solution holds the seeds of its own obsolescence.

Read Evolving technology paradigms create rising complexity on Medium

PASCAL language originator Niklaus Wirth, 89

January 12, 2024 By Jack Vaughan


Niklaus Wirth, whose 1970 creation of the PASCAL programming language dramatically influenced the development of modern software engineering techniques, died January 1. He was 89.

A long-time professor of Informatics at ETH Zurich, Wirth produced PASCAL primarily as a teaching tool, but its innovative language constructs set the stage for the C, C++ and Java languages that flourished in the late 20th Century, and which still have wide use today. Wirth won the 1984 ACM Turing Award, among other honors.

Wirth named the language after the 17th Century philosopher and mathematician Blaise Pascal, who is often credited as the inventor of the first digital calculator. The PASCAL language was a bridge of sorts, attempting to span the language styles of business computing, then dominated by COBOL, and scientific computing, as seen with FORTRAN.

The field of software engineering was still in its early days when Wirth began his pioneering work, first as a graduate student and then as an assistant professor at Stanford University. At the time, hardware was still evolving from numerical calculators to general-purpose computers.

It was becoming apparent at the time that a language focused solely on numerical processing would encounter obstacles as computing evolved.

It was hoped, Wirth wrote in a paper, that “the undesirable canyon between scientific and commercial programming … could be bridged.” [This and other materials cited in this story appeared in ”Recollections about the Development of PASCAL,” published as part of “History of Programming Languages,” in 1996 by Addison-Wesley.]

Wirth also contributed to development of the ALGOL and MODULA languages, often bringing special focus to development of compilers that translated source software into running machine code. His book, “Algorithms + Data Structures = Programs” is often cited as a keystone text for those interested in the fundamentals of software design.

Education was the primary goal of work on PASCAL, which Wirth undertook after a disappointing experience on a somewhat squabbling committee of experts looking to standardize a new version of ALGOL 60.

In his words, Wirth decided “to pursue my original goal of designing a general-purpose language without the heavy constraints imposed by the necessity of finding a consensus among two dozen experts about each and every little detail.”

The work of Wirth and some co-conspirators built on ALGOL, but also drew implicitly on emerging thinking about structured approaches to software development, such as those outlined by E.W. Dijkstra. Certainly, the early ‘bubble gum and baling wire’ days of computer software were receding as Wirth began his work.

“Structured programming and stepwise refinement marked the beginnings of a methodology of programming, and became a cornerstone in helping program design become a subject of intellectual respectability,” according to Wirth.

The fact that PASCAL was designed as a teaching tool is important – it was constructed in a way that allowed new programmers to learn sound method, while applying their own enhancements.

“Lacking adequate tools, build your own,” he wrote.

And, while innovation was a goal, pragmatism for Wirth was also a high-order requirement.

Wirth’s work on PASCAL started about the time he headed from Stanford to ETH Zurich, with definition of the language achieved in 1970. It was formally announced in “Communications of the ACM” in 1971. Though never the most popular language in terms of numbers, it was very influential, contributing to a movement that emphasized general applications use, strong typing and human-readable code.

PASCAL gained great currency in the early days of personal computers in the United States. A version that became known as Turbo Pascal was created by Anders Hejlsberg, and was licensed and sold by Borland International beginning in 1983. On the wings of Borland ads in Byte magazine, Turbo Pascal became ubiquitous among desktop programmer communities.

The work of pioneers like Wirth gains special resonance today, as a Generative AI paradigm appears poised to automate larger portions of the programming endeavor. Automated code generation is by no means completely new, and the surprises and ‘gotchas’ of the pioneers’ era will no doubt be revisited as the understanding of effective software development processes continues to evolve. Wirth’s words and work merit attention in this regard, and in regard as well to a fuller understanding of software evolution. – J.V.

This Progressive Gauge obituary gets some extra time now, with a few minutes of extra material.

My Take: I came to cover software for embedded systems, electronic design automation and, then, business applications, at a time when structured programming and defined methodologies were in full ascendance.
Methodologies narrowed over many years into a generally accepted path to software modeling. But they tended to flower wildly at first. They started with some philosophic underpinnings, but could just as easily be characterized as something that worked for someone, who thought it could become a product. The downside was that each new methodology might suggest your problem was the methodology you were already using, which was not always the actual case. The joke of a bench developer I shared a cube with for a while was: “What’s a methodology?” Answer: “A method that went to college.”

The methodology generally carried the name of the inventor, not a historical figure as with PASCAL.

The complexity of the deepest levels of the new operating system, compiler, design and language approaches was daunting for this writer.

My work in the computer trade press afforded me the opportunity to meet Unix co-creator Dennis Ritchie of Bell Labs (at the time he was rolling out the Plan 9 OS) and Java author James Gosling (then of Sun Microsystems). I met two of the three “UML Amigos”: Grady Booch, then of Rational Software, and Ivar Jacobson, head of Ivar Jacobson International. Interviewing Ivar on one occasion was particularly memorable. He asked that I interview a software avatar he had just created, an animated figure that spoke and represented his thinking on technology issues, rather than speak with him directly.

These assignments surely were ‘a tall order for a short guy!’

I offer these notes here in my role as a generalist. While computer history doesn’t repeat, it rhymes. This history is always worth a re-visit, especially as clearly-new paradigms fly out of coders’ 2024 work cubes. It has been interesting for me to look at the origination of PASCAL, and to learn about Niklaus Wirth, the individual who brought PASCAL into the world.

[Image: Pascaline]

Links
https://cacm.acm.org/magazines/2021/3/250705-50-years-of-pascal/fulltext
https://www.standardpascaline.org/PascalP.html
https://amturing.acm.org/award_winners/wirth_1025774.cfm
http://PASCAL.hansotten.com/niklaus-wirth/recollections-about-the-development-of-PASCAL/
https://blogs.embarcadero.com/50-years-of-pascal-and-delphi-is-in-power/
