
Confluent, the champion of Kafka messaging and streaming software, landed a $250 million round of funding last week. Given the economic environment, that was particularly notable. While it has generally been one fast-breaking news day after another on the COVID-19 front, it has been a slow month for venture capital announcements of this kind.
Founded in 2014, Confluent has been led since inception by former members of the LinkedIn development team that created Kafka. The software focuses on the logs associated with collected data, and it turns the incumbent computing paradigm around by placing some processing of data ahead of its storage.
The Kafka software helped enable the kind of Web applications that marked LinkedIn's rise: think "People You May Know" and "Who Viewed My Profile." Former LinkedIn technologists and now Confluent leaders Jay Kreps, Neha Narkhede and Jun Rao forged the new apps on the back of Kafka's open source publish-and-subscribe architecture, in which only interested subscribers are notified when relevant events occur.
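To make the pattern concrete, here is a minimal sketch of publish-and-subscribe over Kafka using the open source confluent-kafka Python client. The broker address, topic name and consumer group are hypothetical placeholders for illustration, not anything drawn from Confluent's or LinkedIn's actual systems.

```python
# Minimal publish-and-subscribe sketch with the confluent-kafka client.
# Broker address, topic and group id are hypothetical placeholders.
from confluent_kafka import Producer, Consumer

# Publisher: appends a profile-view event to a topic (an ordered log).
producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("profile-views", key="user-42", value='{"viewer": "user-7"}')
producer.flush()  # block until the broker acknowledges delivery

# Subscriber: only consumers subscribed to the topic receive its events.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "who-viewed-my-profile",  # hypothetical consumer group
    "auto.offset.reset": "earliest",      # replay the log from the start
})
consumer.subscribe(["profile-views"])

msg = consumer.poll(timeout=10.0)  # fetch the next event, if any
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```

Because the topic is a durable log, many independent consumer groups can read the same events at their own pace, which is what lets one stream feed several applications.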
In recent years, Kafka has become a ubiquitous part of big data infrastructure. If big data is the new oil, Kafka is its pipeline, and at Confluent the former LinkedIn engineers have been working to build on that, aiming to be the go-to tool for data streaming and event processing in automated systems that make decisions in real time.
This funding round implies a $4.5 billion valuation, giving Confluent the bragging rights of a Silicon Valley unicorn. These are particularly special bragging rights, given the uncertain times. A big question that still needs answering is how much profit is left for industry to reap from applications such as "People You May Know" and "Who Viewed My Profile." Kafka event streaming can grow to handle much more than such apps, with fraud detection a notable must-have, but all this will take innovative design and engineering, and there will have to be some kind of pot of gold at the proverbial end of the day.
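As a rough illustration of the kind of real-time decisioning involved, here is a toy fraud check that inspects a Kafka stream as events flow past. The topic name, JSON fields and the flat $1,000 threshold are assumptions made up for the sketch, not a production design.

```python
# Toy real-time fraud check on a Kafka event stream, again with the
# confluent-kafka client. Topic, fields and threshold are illustrative.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "fraud-check",       # hypothetical consumer group
    "auto.offset.reset": "latest",   # inspect only new transactions
})
consumer.subscribe(["payments"])     # hypothetical transactions topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        txn = json.loads(msg.value())
        # The decision is made as the event streams past, before storage.
        if txn.get("amount", 0) > 1000:
            print(f"flagging transaction {txn.get('id')} for review")
finally:
    consumer.close()
```

A real system would replace the threshold with a model or rules engine, but the shape is the same: the value is captured in the moments between the event occurring and the data landing at rest.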
This plush valuation puts a spotlight on Confluent, much as one was cast on Cloudera in 2014. By that time, Cloudera had garnered over $700 million in funding from Intel, an investment that eventually came up a relative cropper for the chipmaker. Perhaps the question today is whether there is more money in tomorrow's big data pipes than there was in yesterday's big data processors.
In its early days, the Confluent tagline claimed to enable "organizations to harness the business value of live data." These days, it calls itself "the event streaming platform pioneer." If it is to succeed, it will have to get a lot of things right. While it is not wrong to call Confluent a pioneer, the notion of event streams going mainstream has been raised before, and it has yet to be tested.
More Analysis
It was under the banner of Complex Event Processing (CEP) that streaming technologies, publish-and-subscribe messaging systems and specialized databases became part of the financial trading systems that changed the face of Wall Street beginning in the 1990s. The tools were also part of a government surveillance apparatus that culled streams of data looking for anomalies that might signal terrorist activity. The people who used these tools most tended to talk very little about them.
Admittedly, with open source and distributed cloud computing, the landscape is far different now than it was then. But companies will still have to weigh the development cost needed to achieve the realest of real time (that is, the lowest of latency) against the value that can be exploited in the data. Are the tech-infused industry disruptors of recent years still the model?
Look at one. Uber used Kafka and collections of other new technologies to log events (location data, service usage, price estimates, trip routes) that could be mined for benefit. Fair to say, little expense was spared in the company's buildout; but while Uber managed to decimate the incumbent taxi industry, it did so without remotely turning a profit.
Meanwhile, the skills to build these systems are hard to come by, because data flow and event processing represent a big change in computer and data architecture. Confluent is quite aware of this. Its focus is on improving tooling for mainstream consumption, and its performance in this regard could well decide how its efforts fare.
It could be challenging, however. Startups that sought to bring CEP to wider use back in the day did not fare so well. Ultimately, Aleri, Apama, StreamBase and others were purchased by more established software vendors for relatively small sums in the wake of the 2008 economic downturn. All this is ably portrayed in Thomas Dinsmore's "Disruptive Analytics" [Apress, 2016].
As Dinsmore estimated in that writing, the risk-reward of streaming analytics was still the outstanding question. That is probably still the case.
He writes: “Managers must distinguish between the costs of streaming data processing, on the one hand, and the benefits of reduced latency.”
Some coverage of CEP that I did for eBizQ a number of years ago touches on this as well.
At that time, Neil Ward-Dutton told me: "If you look at it just from a technology angle, you can look at a lot of crazy ideas." What is required, he said, is to build effective partnerships between people who understand the business and the technology.
Of course, that is the matter at the heart of one of our current era's noisiest buzzwords, "DevOps." The idea is not so new, after all, and so it has been called upon to spawn offspring. Think: ITOps, DataOps, AIOps and, you heard it here first, IoTOps.
Some pieces from the Vaughan portfolio:
http://www.ebizq.net/topics/event_processing/features/13382.html
– Jack Vaughan
