
Progressive Gauge

Media and Research



Kreps, Dorsey, riff on ChatGPT

May 29, 2023 By Jack Vaughan

[Image: Stagg Field Nuclear Pile (fragment)]

[Boston — May 2023] — “It’s a law that any conversation around technology has to come back to AI within five minutes.”

Well put by Jay Kreps, co-founder and CEO of real-time streaming juggernaut Confluent. Speaking at J.P. Morgan’s Boston Tech Investor event, Kreps knew this was coming. ChatGPT rules the news these days.

Given the daily pounding of 1,000 reporters’ laptops, and given Nvidia’s vault into the highest clouds of valuation, it is no surprise that ChatGPT-style generative AI is the recurring topic. It crowds out all other discussion, just as the tech stalwarts at J.P. Morgan’s and others’ tech events expected.

It’s the 600-lb. ChatBot in the room, and it is bigger than big.

Confluent chief on chatbot interaction

Back in the nascent days of social media, the founders of Confluent, then working at LinkedIn, created a distributed commit log that stored streams of records. They called it Kafka and grew it out into a fuller data stream processing system. Its intent is to bring to the broader enterprise real-time messaging capabilities akin to those of the Flash Boys of Wall Street.
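For the record, the commit-log idea is simple enough to sketch. Below is a minimal, illustrative example using the open-source kafka-python client against a local broker; the topic and records are hypothetical, not Confluent’s.

```python
# A minimal sketch of Kafka's core abstraction: an append-only log of
# records that consumers replay independently at their own offsets.
# Assumes a broker at localhost:9092 and `pip install kafka-python`.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Each send appends a record to the end of the topic's log.
producer.send("trades", {"symbol": "CFLT", "price": 35.10})
producer.send("trades", {"symbol": "CFLT", "price": 35.12})
producer.flush()  # block until the records are acknowledged
```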

The company is still in “spend a buck to make a buck” mode. For the quarter ended March 31, Confluent revenue increased 38% to $174.3M, while its net loss widened 35% to $152.6M. Customers include Domino’s, Humana, Lowe’s, Michelin and others. In January it purchased would-be competitor Immerok, a leading contributor to the Apache Flink stream processing project.

What’s the significance of real-time streaming in “the age of AI”? Kreps was asked at the Boston event. He said:

It’s really about how a company can take something like a large language model that has a very general model of the world and combine it with information about that company, and about customers, and be able to put those things together to do something for the business.

He gave an example: A large travel company wants an interactive chatbot for customers. The bar ChatGPT must clear to improve on the status quo, it seems, is not so high. As Kreps said: “The chatbots were always pretty bad. It’s like interacting with like the stupidest person that you’ve ever talked to.”

Improvements needed for chatbots include a real-time view of all the information the company holds about customers and operations.

What do you need to make that work? Well, you need to have the real-time view of all the information about them, their flights, their bookings, their hotel, are they going to make their connection, etcetera. And you need a large language model which can take that information and answer arbitrary questions that the customer might ask. So the architecture for them is actually very simple. They need to put together this real time view of their customers, what’s happening, where the flights are, what’s delayed what’s going on. And then they need to be able to call out to a service for the generative AI stuff, feed it this data, feed it the questions from customers, and … integrate that into their service, which is very significant. This is a whole new way of interacting with their customers. And I think that that pattern is very generalizable.
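In code, the pattern Kreps sketches might look something like the following: a real-time customer view folded together from a stream, plus a call out to a generative AI service. This is a hedged illustration, not the travel company’s architecture; the topic name, fields, and model are assumptions, and the 2023-era OpenAI Python API stands in for “a service for the generative AI stuff.”

```python
# Illustrative only: pair a Kafka-fed, real-time customer view with an
# LLM call, per the pattern described above. Topic/field names invented.
import json
from kafka import KafkaConsumer
import openai  # 2023-era openai package (pre-1.0 API)

openai.api_key = "..."  # your key here

customer_view = {}  # customer_id -> latest known state

def update_view(event):
    """Fold one streamed event (flight, booking, delay...) into the view."""
    state = customer_view.setdefault(event["customer_id"], {})
    state[event["entity"]] = event["payload"]

def answer(customer_id, question):
    """Feed the current view plus the customer's question to the model."""
    context = json.dumps(customer_view.get(customer_id, {}))
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "You are a travel assistant. Customer data: " + context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

# In production the stream consumer and the chatbot front end would run
# concurrently; here we just fold events into the view as they arrive.
consumer = KafkaConsumer(
    "customer-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for msg in consumer:
    update_view(msg.value)
```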

Popping the question: Dorsey

For Jack Dorsey, the question “What about ChatGPT?” is raw meat. He melded SMS and the Web to create Twitter, and now, with a nod to bitcoin and blockchain, has built Block, née Square. The financial services and digital payments company posted revenue for the three months ended April 1 that increased 26% to $4.99B, while its net loss decreased a significant 92% to $16.8M. The good news was based on increased use of its Cash App product.

At the J.P. Morgan tech investor conference, Dorsey told attendees that, while hype obviously abounds, true progress rides on use cases.

There’s a ton of hype right now. And I think there’s a lot of companies being started that are going to fail because of that hype. I think the technology industry is very trendy, and very fashionable and jumps from one thing to the next, to the next, to the next. It wasn’t so long ago that we were only talking about Bored Apes and Crypto and NFTs and now we’re talking only about AI and how it’s going to kill us.

There’s always some truth in all these things. I just would caution any company that’s approaching it from a technology perspective, [to] instead use a use case perspective. What is the use case you’re trying to solve? And what technologies can you use to solve it more creatively?

THAT’S THE WAY IT IS — Clearly, panelists and podiumists are preparing to take on ChatGPT questions. At the same time, the clamor of the now will give way to the harder work of prioritizing generative AI among a host of technology initiatives. The ChatGPT pattern may be generalizable — but the proof will not appear overnight. The proof is in the business use case.

Big embedded player Infineon snags TinyMLer Imagimob

May 24, 2023 By Jack Vaughan

[Published May 24, 2023] – Germany-based chip maker Infineon Technologies AG last week acquired Stockholm-based Imagimob AB, one of the most active players among a slew of startups seeking to bring AI-based machine learning to embedded devices on the edge of the Internet of Things (IoT). Terms were not disclosed.

Imagimob provides end-to-end development tools and cloud-based services intended to bring the much-vaunted capabilities of neural machine learning (ML) models from the cloud data center to edge devices. These devices have small footprints, rigorous memory limits, and strict constraints on power consumption. The aspiration to do a lot with little is summed up in the umbrella term “TinyML.”
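Imagimob’s toolchain is proprietary, but the “do a lot with little” step is easy to get a feel for with open tools. The sketch below uses TensorFlow Lite as a generic stand-in (not Imagimob’s stack) to shrink a toy model, untrained and purely illustrative, into a quantized flatbuffer sized for a microcontroller.

```python
# Generic TinyML-style shrinking: quantize a small model for an edge
# device with tight flash and RAM budgets. Illustrative, not Imagimob's.
import tensorflow as tf

# A deliberately tiny classifier, e.g. over 3-axis accelerometer windows.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 3)),
    tf.keras.layers.Conv1D(8, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(4, activation="softmax"),  # e.g. 4 gesture classes
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # weight quantization
tflite_bytes = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)  # flash-ready flatbuffer
print(len(tflite_bytes), "bytes")  # check footprint against device limits
```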


The edge AI devices that Imagimob seeks to support must also cope with a wide variety of sensor types, including sensors that capture vision, movement, pressure, heat, velocity and other data. Business uses are broad, ranging from surveillance cameras and refrigerator monitors in retail settings to actuators and anomaly detectors in oil industry field equipment, and beyond.

Infineon’s Thomas Rosteck said the purchase is based on his company’s contention that artificial intelligence and machine learning are about to enter every embedded application, enabling new functionalities. In a statement, Rosteck, who is president of Infineon’s Connected Secure Systems division, touted Imagimob’s platform and expertise in developing machine learning for edge devices.

In recent years, Infineon has worked to build out a portfolio of advanced sensors and IoT solutions. This is an area in which software is expected to play a key role. For example, the market for edge AI software is set to grow to €10.0B in 2032, from €738.5M in 2022, for a CAGR of 29.8% over the forecast period, according to Global Edge AI Software Market Research.
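Those figures hang together; a quick back-of-the-envelope check of the implied growth rate:

```python
# Sanity-check the quoted CAGR: EUR 738.5M (2022) to EUR 10.0B (2032).
start, end, years = 738.5e6, 10.0e9, 10
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # -> 29.8%
```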


Founded in 2013, Imagimob drew its tech- and business-side leaders from the mobile applications market. Since inception, its teams have worked on a wide variety of edge AI use cases. These include gunshot and other audio event detection, fall detection, condition monitoring, signal classifiers, safety vests, and more.

Imagimob has been highly active within the TinyML community, centered in part around the TinyML Foundation, which is dedicated to nurturing ultra-low power machine learning. Imagimob software has been demoed in showcases with Synaptics, Syntiant, Texas Instruments, and other edge AI hardware concerns.


Responding to a request for comment, an Infineon spokesperson said the company plans to integrate Imagimob into its organizational structure and that relationships with Imagimob’s customers will continue, including with partners working on competitors’ hardware, “in alignment with the compliance regulations.”

To date, IoT growth has been fitful in the enterprise, as businesses look to move past proof-of-concept projects and achieve return on investment. Potential enterprise application areas, including retail, healthcare, supply chain and other operations, are places where processing data on the edge translates into cost savings versus processing data in the cloud. The need to off-load processing to the edge becomes more acute as data-intensive AI and machine learning capabilities come into play. Imagimob’s efforts to enable AI’s march from the data center to the Internet’s edge are expected to fill out more fully with the backing of the larger chip maker Infineon.

Enterprise IoT has lost some luster in recent years as vendors grapple with a very extensive array of use cases. Intelligence in the form of machine learning makes sense, and so does the rise of TinyML as a next stage in delivering on the wide promise of IoT. But deep resources and breakthroughs on the software development side are required, at the same time that the venture capital markets have become less benign. So, more matches such as Infineon’s and Imagimob’s can be anticipated. – Jack Vaughan

The March of the Language Models

April 17, 2023 By Jack Vaughan

[April 17, 2023] – Had the opportunity to speak with Forrester analyst Rowan Curran recently for a VentureBeat article. Of course, the topic was ChatGPT, generative AI, and large language models.

His counsel was both optimistic and cautionary – a good summation of the bearings IT decision makers should set as they begin yet another tango with a new technology meme.

A handy summarizer-paraphraser tells me that Curran told VentureBeat that it would be a mistake to underestimate the technology, although it is still difficult to critically examine the many potential use cases for generative AI.

Yes, that applies to every technical challenge – every day. And it bears repeating as each new technology whispers or yells that the fundamental rules no longer apply – and yet they do.

Looking back on my conversation with Curran, I find insight in what some would say is obvious. The large language models are … large! And, as Curran told me, because they are large, they cost a lot to compute and train. This reminds us, as others have noted, that the LLM should be viewed like polo or horse racing – as a game for the rich.

Why do we say game for the rich? On one level, the LLM era stacks up as a megacloud builders’ battle, albeit with aspects of the playground grudge match. Microsoft leader Satya Nadella, who had the thankless task of competing with Google on the search front, almost seems to chortle: “This new Bing will make Google come out and dance, and I want people to know that we made them dance.”

For the cloud giants, the business already had aspects of a war of attrition, as they staked data center regions across the globe. The folks at Semianalysis.com have taken a hard stab at estimating a day in the life of an LLM bean counter, and they suggest a “model indicating that ChatGPT costs $694,444 per day to operate in compute hardware costs.” Of course, these are back-of-the-envelope estimates – and the titans that host LLMs will look to engineer savings.
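For scale, annualizing that daily figure (a rough extrapolation of SemiAnalysis’s own estimate, nothing more):

```python
# Rough annualization of the quoted SemiAnalysis compute-cost estimate.
daily_cost = 694_444  # USD per day, compute hardware
print(f"${daily_cost * 365 / 1e6:,.0f}M per year")  # ~ $253M
```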

The new LLM morning summons to mind a technology that consumed much attention not so long ago: Big Data. The magic of Hadoop had a difficult time jumping from the likes of Google, Facebook and Netflix to the broader market. Maybe Big Data should have been named ‘Prodigious Data’ – because that would have offered fairer warning to organizations that had to gather such data, administer it, and come up with clever and profitable use cases.

“What is Big Data good for?” was a common question, even in its heyday. Eventually the answer was “machine learning”.

Much of Big Data remained in the realm of the prototype. In the end, it was a step forward for enterprise analytics. Successes and failures alike came under the banner of prototyping. Clearly, experimentation is where we are now with ChatGPT.

The more interesting future for more people may lie in outcomes with small language models, Forrester’s Curran told me. These will succeed or fail on a use case by use case basis.

As industry observer Benedict Evans writes in “ChatGPT and the Imagenet moment,” ChatGPT feels like a step-change forward in the evolution of machine learning. It falls something short of sentience. There is potential, but there are plenty of questions to answer before its arc can be well gauged.

Read “Forrester: Question generative AI uses before experimentation” – VentureBeat, Feb 24, 2023
https://venturebeat.com/ai/forrester-question-generative-ai-uses-before-experimentation/

Read “ChatGPT and the Imagenet moment” – ben-evans.com, Dec 14, 2022
https://www.ben-evans.com/benedictevans/2022/12/14/ChatGPT-imagenet

Read “The Inference Cost of Search Disruption – Large Language Model Cost Analysis” – Semianalysis.com, Feb 9, 2023
https://www.semianalysis.com/p/the-inference-cost-of-search-disruption

Copyright © 2023 · Jack Vaughan