Progressive Gauge


Media and Research

  • Home
  • About
  • Blog
  • Projects, Samples and Items
  • Video
  • Contact Jack

Computing

Kreps, Dorsey, riff on ChatGPT

May 29, 2023 By Jack Vaughan

Stagg Field Nuclear Pile [fragment]

[Boston — May 2023] — “It’s a law that any conversation around technology has to come back to AI within five minutes.”

Well put by Jay Kreps, co-founder and CEO of real-time streaming juggernaut Confluent. Speaking at J.P. Morgan’s Boston Tech Investor event, Kreps knew this was coming. ChatGPT rules the news these days.

Given the daily pounding of 1,000 reporters’ laptops, and given Nvidia’s vault into the highest clouds of valuation, it is no surprise that ChatGPT-style generative AI is the recurring topic. It crowds out all other discussion, just as tech stalwarts at J.P. Morgan’s and others’ tech events expected.

It’s the 600-lb. ChatBot in the room, and it is bigger than big.

Confluent chief on chatbot interaction

Back in the nascent days of social media, the founders of Confluent, then working at LinkedIn, created a distributed commit log that stored streams of records. They called it Kafka and grew it into a fuller data stream processing system. Its intent is to bring to the broader enterprise real-time messaging capabilities akin to those of the Flash Boys of Wall Street.
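For readers who have not touched Kafka, here is a minimal sketch of that commit-log idea in Python, using the open source confluent-kafka client. The broker address, topic name and payloads are illustrative assumptions on my part, not anything drawn from Confluent’s announcements.

from confluent_kafka import Producer, Consumer

# Append a few records to a topic; each produce() adds to the end of the log.
producer = Producer({"bootstrap.servers": "localhost:9092"})
for i in range(3):
    producer.produce("customer-events", key=str(i), value=f"event-{i}")
producer.flush()  # wait for the broker to acknowledge the writes

# Read the log back in order, from the beginning.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-reader",
    "auto.offset.reset": "earliest",  # replay from the start of the log
})
consumer.subscribe(["customer-events"])
while True:
    msg = consumer.poll(1.0)
    if msg is None:
        break  # no more records within the timeout
    if msg.error():
        continue
    print(msg.key(), msg.value())  # records arrive in commit-log order
consumer.close()

The point of the sketch is the shape of the thing: producers append, consumers replay, and the log in between is what stream processing systems build on.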

The company is still in “spend a buck to make a buck” mode. For the quarter ending March 31, Confluent revenue increased 38% to $174.3M, while its net loss widened 35% to $152.6M. Customers include Domino’s, Humana, Lowe’s, Michelin and others. In January it purchased would-be competitor Immerok, a leading contributor to the Apache Flink stream processing project.

Asked at the Boston event about the significance of real-time streaming in “the age of AI,” Kreps said:

It’s really about how a company can take something like a large language model that has a very general model of the world and combine it with information about that company, and about customers, and be able to put those things together to do something for the business.

He gives an example: A large travel company wants an interactive chatbot for customers. The bar a ChatGPT-style bot must clear there, it seems, is not so high. As Kreps said: “The chatbots were always pretty bad. It’s like interacting with like the stupidest person that you’ve ever talked to.”

Improvements needed for chatbots include a real-time view of all the information the company holds about customers and operations.

What do you need to make that work? Well, you need to have the real-time view of all the information about them, their flights, their bookings, their hotel, are they going to make their connection, etcetera. And you need a large language model which can take that information and answer arbitrary questions that the customer might ask. So the architecture for them is actually very simple. They need to put together this real-time view of their customers, what’s happening, where the flights are, what’s delayed, what’s going on. And then they need to be able to call out to a service for the generative AI stuff, feed it this data, feed it the questions from customers, and … integrate that into their service, which is very significant. This is a whole new way of interacting with their customers. And I think that that pattern is very generalizable.
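What that looks like in practice is simple enough to sketch. The following Python is a rough illustration of the pattern Kreps describes, not any vendor’s API; the customer-lookup function and the generative AI call are hypothetical stand-ins.

def current_customer_view(customer_id: str) -> dict:
    # Stand-in for a query against a real-time store fed by event streams
    # (flights, bookings, delays). The values here are made up.
    return {
        "flight": "BA 213",
        "status": "delayed 40 min",
        "connection": "at risk",
        "hotel": "confirmed",
    }

def call_generative_ai(prompt: str) -> str:
    # Stand-in for the call out to an external large language model service.
    return "(model answer would appear here)"

def answer_customer(customer_id: str, question: str) -> str:
    # Join the live view of the customer with the customer's question,
    # then hand both to the model.
    context = current_customer_view(customer_id)
    prompt = (
        "You are a travel assistant. Current customer data: "
        f"{context}\nCustomer question: {question}"
    )
    return call_generative_ai(prompt)

print(answer_customer("cust-42", "Will I make my connection?"))

The design choice worth noting is that the model stays general; it is the continuously updated customer view, not the model, that carries the company-specific knowledge.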

Popping the question: Dorsey

For Jack Dorsey, the question “What about ChatGPT?” is raw meat. He melded SMS and the Web to create Twitter, and now, with a nod to bitcoin and blockchain, has built Block, née Square. The financial services and digital payments company posted revenue for the three months ended April 1 that increased 26% to $4.99B, while net loss decreased a significant 92% to $16.8M. The good news was based on increased use of its Cash App product.

At the J.P. Morgan tech investor conference, Dorsey told attendees that, while hype obviously abounds, true progress rides on use cases.

There’s a ton of hype right now. And I think there’s a lot of companies being started that are going to fail because of that hype. I think the technology industry is very trendy, and very fashionable and jumps from one thing to the next, to the next, to the next. It wasn’t so long ago that we were only talking about Bored Apes and Crypto and NFTs and now we’re talking only about AI and how it’s going to kill us.

There’s always some truth in all these things. I just would caution any company that’s approaching it from a technology perspective, [to] instead use a use case perspective. What is the use case you’re trying to solve? And what technologies can you use to solve it more creatively?

THAT’S THE WAY IT IS — Clearly, panelists and podiumists are preparing to take on ChatGPT questions. At the same time, the clamor of the now will shift to prioritizing generative AI strategically within a host of technology initiatives. ChatGPT may be generalizable — but the proof will not appear overnight. The proof is in the business use case.

Reporter’s Notebook – At MIT Tech Review Future Compute 2023: Navigating the straits of semis

May 9, 2023 By Jack Vaughan

[May 9, 2023] – When the US last year announced new export rules on advanced chips, the role of semiconductors in modern foreign affairs reached a new zenith. The chips have assumed the stature of oil in today’s geopolitics, and depriving China of the chips now seems a strategic objective.

Unease has only grown with the appearance of the ChatGPT AI Large Language Model, which is a chip-hungry, power-guzzling presence ready to take over the world, to hear networks of experts and Cassandras tell it. Just as unsettling are Chinese maneuvers around Taiwan, a crucial center of global chip production.

Such activity formed a partial backdrop for the MIT Technology Review’s recent Future Compute 2023 conference at the Cambridge, Mass. campus. Semiconductor issues were probed in a Q&A session featuring Chris Miller, Tufts University lecturer and author.

Miller said the semiconductor has taken on an outsized role in strategizing on China, and that the focus now is both on economics and defense.

“China spends as much money importing chips each year as importing oil,” he said. “You can’t understand the structure of the world economy without putting semiconductors at the center of your analysis.”

This is increasingly true for economic issues, Miller continued. Semiconductors that drive computers and embedded systems are top of mind when defense ministries and intelligence agencies think about future procurements.

“What they know is that over the past half century one of the key forces that’s transformed the way militaries fight has been computing power,” according to Miller, who traced the developments leading to the present predicament in “Chip War: The Fight for the World’s Most Critical Technology,” a recent noteworthy [Financial Times Book of the Year 2022] look at semiconductor industry history and its ever-shifting role in the larger body politic.

“Chip War” is described by a New York Times reviewer as something of a nonfiction thriller in which ‘pocket-protector men’ at Fairchild Semiconductor and Intel tamed the raw transistor, fashioned the Integrated Circuit, outdid the Soviet Union, and left a war-weary Europe in the dust as they formed what’s now Silicon Valley. Many of those developments bear review as governments and companies take on present complexities.

The complexities extend beyond high-end processors to seemingly more modest products, Miller indicated. Simpler chips that complement the hot processors grow in importance as well.

“The entire electronics supply chain is actually beginning to shift. It’s not only at the chip level, it’s also electronics assembly and simpler components,” Miller said, adding that a reduction in China’s level of server assembly has led to a major increase in Mexico’s market share in that field.

He also pointed to the emergence of new market dynamics as large companies take on the design of their own chips, a trend that could spread to a wider range of companies as US Chips Act R&D funding addresses the need for less expensive chip design processes.

A qubit for your thoughts

Infant quantum computing looms as an adjacent technology where geopolitical ambitions may play out.

China, the US, the EU, and countries such as Australia, Singapore, and Canada now devote research monies to quantum efforts. They stir this new ground at the same time they test the limits of Moore’s Law – the perceived dead end for further large-scale silicon chip integration, which Tufts’ Miller cites as a fundamental challenge facing the chip industry.

However, quantum technology is still raw – quantum researchers, in the main, are still found toiling at the qubit level with lab rigs and signal scopes – that is, the quantum equivalent of the lone-transistor work that preceded development of the Integrated Circuit.

A high point of the Future Compute 2023 agenda for me was a visit to the labs of MIT’s Engineering Quantum Systems Group. Smart people are working hard on this frontier technology. And, with notable exceptions, there is knowledge sharing going on.

But in a conference panel on quantum at the event, the impression emerged that a large-scale working quantum computer will be needed before the international competition reaches a less-sanguine stage, akin to the one advanced CPU, GPU, NPU and network processing chips now experience.

For his part, at Future Compute, Chris Miller hesitated somewhat in responding to an audience question on quantum computing.

“I struggle to say anything that intelligent on quantum computing, both because I’m really not an expert in computing, but also because there’s a chip industry that I can study and I know how to talk about, whereas quantum computing is still a prospective industry,” he said. “We all hope it will materialize but it hasn’t materialized in a practical form.”

My take

Global chip wars must be viewed in the context of a real war underway in Ukraine. It has exposed the pivotal role of new technology in the exercise of war, as well as the vulnerability of the supply chains that feed modern commerce. It’s also pushed diplomacy to the sidelines, narrowing the opportunity for maneuver in the semiconductor straits.

“Mythical Man-Month” Author Frederick Brooks, at 91

December 20, 2022 By Jack Vaughan

[Dec 20, 2022] – Noting here the passing at 91 last month of Frederick Brooks, director of some of IBM’s most important mainframe-era programming projects. He was a key figure in establishing the idea that software projects should be intelligently engineered and organized.

He helped as much as anyone to move the mysterious art of tinkering with computer code toward a profession capable of repeatable results. “The Mythical Man-Month,” his 1975 distillation of years of development management, became a common reference work in many a developer’s desk library.

La Brea Tar Pits – Huntington Library.

Working at IBM in the 1950s and 1960s, and spearheading development of the vaunted IBM System/360, Brooks gave the lie to notions that were bedrock in hardware-software projects, and came up with a few notable inventions as well.

Notably, he is credited with IBM’s decision to settle on an eight-bit byte. This allowed the systems to handle both text and numerals. Strange to think there was a time when machines were dedicated either to text handling or numerical calculation, but it was so!

He oversaw the development of systems that could be offered with an expandable range of processor and memory equipment at different price points, thus ushering in the era of “platform” over “product.”

Brooks studied complex software and projects and found some surprising truths – the most telling: that adding people to a project already behind schedule tends to slow it down further, not speed it up.
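The usual back-of-envelope illustration of why that happens, my gloss rather than Brooks’s own arithmetic, is to count pairwise communication paths, which grow roughly with the square of team size:

def communication_paths(n: int) -> int:
    # Number of distinct pairs of people who may need to coordinate: n(n-1)/2.
    return n * (n - 1) // 2

for team_size in (3, 5, 10, 20):
    print(team_size, "people ->", communication_paths(team_size), "paths")
# 3 -> 3, 5 -> 10, 10 -> 45, 20 -> 190

Each new arrival also has to be brought up to speed by the people already on the project, which is where the late-project slowdown comes from.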

He also saw the dangerous lure technology offers in the form of “The Silver Bullet” that promises a sudden tech- or organizational-style breakthrough.

With these and other observations, Brooks helped build a philosophical underpinning for structured analysis, a school of thinking that held sway in software during an era of big projects marked especially by NASA’s Apollo program.

Ed Yourdon, Ivar Jacobson, Tom DeMarco and others would take Brooks’ work into the 1990s. Like him, they realized it’s not just about “the code” – that the culture of the organization can play a more dominant role.

Brooks paved the path forward with emphasis on requirements gathering. But he foresaw the tar pit that beckons when a team chases schema perfection ahead of actually getting the project going.

He conveyed this without embracing the extreme that says “Fail Fast,” and he did it as always with a measure of humor. It’s right there in the title for one of “The Mythical Man-Month” chapters: “Plan to Throw One Away.”

Some of Brooks’ musings do echo another era. I don’t think we have Man Months anymore – mythical or otherwise. Unquestionably too, team programmers have gained much more responsibility over the years, so Brooks’ emphasis on the manager wears thin. [Heck, a whole era of development sprang from gritty JavaScript developers who found a way around obstacles their managers took for granted as their ‘lot in life.’]

But managers, at the end of the day, bear the greatest responsibility for the software project. Technology acumen is just table stakes. Their communications and organization skills must be stellar, as Brooks indicates when he writes:

The Tower of Babel was perhaps the first engineering fiasco, but it was not the last. Communication and its consequent, organization, are critical for success. The techniques of communication and organization demand from the manager much thought and as much experienced confidence as the software technology itself. – From The Mythical Man-Month.

Is software engineering really a profession? The question will continue to be asked, and Brooks’ work will likely always be part of that discussion.

Coda: My days as a software project manager were brief – about a year. [After which my colleagues welcomed me back to editorial and told me they thought I’d been crazy ever to leave.] What I learned building web sites was that, no matter what you think your problem is, it is probably a project management problem. I owe that to Brooks. As expressed in his thoughtful and very often bemused writings, Brooks’ thinking on the topic informed my and many others’ efforts to ‘ship the darn thing’. – Jack Vaughan, 2022

How well can Nvidia tread the Agglomerverse?

September 25, 2022 By Jack Vaughan

Nvidia has worked hard to emerge from the worlds of graphics cards, gaming, and bitcoin mining to become a potent presence in enterprise AI considerations. It also is poised to be a key vendor in the Metaverse, an AR-imbued but ill-defined repository for the next version of the Web.

More work is in store now as the GPU company – like most companies of any sort – navigates a more difficult economic environment, one where macro winds augur a possible enterprise spending slowdown. Already, Nvidia CEO Jensen Huang has led his crew into spaces others could not imagine.

Graphics Processing Units (GPUs) support applications that demand ultrahigh memory bandwidth. They can churn through neural networks and sundry matrix multiplications like banshees. Huang and company have pursued all their possible uses and created a large portfolio of use cases, even as would-be competitors nip at their heels with more specialized offerings.
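For the curious, a tiny sketch of the kind of arithmetic involved, using PyTorch here as an assumed stand-in for Nvidia’s own libraries; the matrix sizes are arbitrary.

import torch

# A single dense matrix multiply, run on the GPU when one is available.
# A neural-network layer is, at bottom, many operations like this one.
device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b
print(c.shape, "computed on", device)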

Visionary Huang, whom we heard last week in keynotes and press conferences related to Nvidia’s GTC 2022 event, calls Nvidia an “Accelerated Computing Company.” And he has set out to exploit “the Full Accelerated Computing Stack.”

These ambitions take form in a true slew of new offerings – ranging from the Nvidia DLSS 3 deep learning super sampler to GeForce RTX Series GPUs for neural rendering, to Omniverse Cloud Services for building and operating industrial metaverse applications, to the Omniverse Replicator for synthetic data production and the 2,000-TFLOPS Thor SoC. The latter is probably well-described as “a super chip of epic proportions.”

Nvidia was early to see the possibility that AR/VR technology could drive a more interactive world-wide computing environment. The company coined it “the Omniverse” but has now joined others in the “metaverse” quest. For now, the metaverse is a loose agglomeration (the ‘Agglomerverse’?) of such elements as physics simulation, digital twins, and, of course, AI modeling. This puts Nvidia in competition, or what Sam Alpert called coopetition, with a host of other vendors. Hype vastly surpasses reality in today’s metaverse, and the payoff is both unclear and distant.

Meanwhile, Enterprise AI has found a place in data centers, and Nvidia has established a genuine foothold there. Obscured in the rush of GTC 2022 product announcements were less-than-flashy Apache Spark accelerator technology and AI inference announcements that may show up in revenue reports sooner than metaverse cases. Huang, for his part, sees the two technical domains playing off one another.

Be that as it may, in the metaverse and enterprise AI alike, Huang needs boots on the ground. These undertakings need great advances in skilling around big data.

It remains to be proved that corporations are any more ready now to take on enterprise AI and the metaverse with imagination and execution. Can they imagine and execute as well as, or better than, they did with Big Data Hadoop beginning ten years ago?

It’s worth noting that GTC 2022 software tools announcements were as prolific as hardware news, showing the company is seeking ways to smooth the path to such advancements. Nvidia will likely need to take on greater headcount, and forge more mega-partnerships like the one announced with Deloitte last week, if it is going to successfully seed enterprise AI and metaverse apps.

Like most tech stocks, Nvidia’s has been in free fall. But some of its challenges are unique. When US government policy looked to slow down or block the transfer of advanced AI chips to China, Nvidia felt the brunt of it.

Meanwhile, the general rout of cryptocurrency impedes chip sales to crypto miners – and, as some news reports have it, a recent 2.0 update to the Ethereum blockchain takes a new proof-of-stake approach to processing and reduces the general call for GPUs for mining.

At the same time, the gaming card market has gone from famine to glut in the 24-month-plus period following the start of the global COVID pandemic. Moreover, the cost of these ever-bigger and more functional chips goes up-up-up, emptying gamers’ coffers.

Successes in these areas gave Nvidia wiggle room as it pursued enterprise AI. The wiggle room gets smaller just as the metaverse and enterprise AI to-do list gets taller. Among this week’s slew of portfolio additions there are some parts that will find users more quickly than others, and it’s up to Nvidia to suss those out and ensure they prosper. – Jack Vaughan

What’s it take to make #Metaverse real? [asks @deantak ]. In #GTC22 presser, Jensen discusses GDN – that is: a global Graphics Delivery Network – and notes as analog #Akamai Content Delivery Network (CDN). He said: “We have to put a new type of data center around the world.” pic.twitter.com/6Ur8IFwGJ3

— Jack Vaughan (@JackIVaughan) September 21, 2022

Jensen: We have a rich suite of domain-specific application frameworks. Now we need an army of experts to help customers apply these AI frameworks to automate their businesses. [Cue Deloitte soundtrack.] https://t.co/XBGewQGALP

— Jack Vaughan (@JackIVaughan) September 21, 2022

Omniverse Replicator — enables developers to generate physically accurate 3D synthetic data, and build custom synthetic-data generation tools to accelerate the training and accuracy of perception networks. https://t.co/t8HnVWvCcT

— Jack Vaughan (@JackIVaughan) September 20, 2022

Tech segments merge and fork

September 15, 2022 By Jack Vaughan

Source: IDC

The Skeptical Examiner. Tech Industry segments merge and fork in generally obscure ways. That can be driven arbitrarily by the categorization strategies that work for analyst groups like Gartner or IDC, but it’s also driven by the fact that technology buyers don’t live in categories convenient for marketers.

Among vendors’ deflection strategies in interviews is this: “You are comparing apples and oranges.” The implication: They have no competition. No competition, that is, if the world sits in neat compartments.

In the fruit section of any supermarket you will find people grabbing apples, oranges, blueberries, bananas; I’ve never seen anyone grab a cumquat. And tech buying can mirror this wanton buyer promiscuousness.

That occurred to me today while looking at IDC’s Market Glance covering the High Performance and Performance Intensive Computing sectors. The sets and subsets thereof are subjective and various … and often collide.

The cursory viewer may be surprised by the extent to which Nvidia and IBM compete here and there. That says something about IBM’s challenges, which, obviously, come from more directions than just Nvidia.

On the Nvidia side, it tees up a question as to whether or not the chip and tools maker can support multiple efforts successfully, as it looks to break out of the gamer-crypto space, and to thrive in the new vistas of AI.

IBM’s focus on AI, which arguably seeded the wide renewed interest in the area, seems back-burner stuff for now – as it dims down the hype machine that was Watson.

Is ‘AI’ another name for high-performance computing?

I know the Nvidia/IBM angle on this IDC chart (above and below) surprised me. As one wag said: Check with your ophthalmologist before viewing it. – J.V.

 

.@IDC‘s Market Glance for Performance Intensive Computing. The convergence of HPC w/ AI, Big Data, Data Analytics, and Quantum Computing brings consolidation of infrastructure bringing decades of HPC’s best practices into the forefront to achieve optimal price/performance! pic.twitter.com/N6r5c0m2F6

— Matt Eastwood (@matteastwood) September 13, 2022



Copyright © 2025 · Jack Vaughan
