Progressive Gauge

Media and Research

Oracle gets the memo

October 24, 2022 By Jack Vaughan

Ellison updates Oracle Cloud World crowd.

THE SKEPTICAL ENQUIRER – Pride bordering on arrogance is a typical trait of successful enterprise tech companies, and it has definitely been the case with Oracle Corp. Many times the company has met the challenge of a changing computing paradigm with brio, taking its lead from irascible founder Larry Ellison, but the sum and substance of its claims can be argued.

I may be a tea leaf reader here, but I see a subtle shift in Ellison’s irascibility in the minutes of last week’s Oracle Cloud World 2022 event.

Ellison’s lively quotes have long been a reporter’s friend. But it’s hard to relay his cloud computing pitches without adding footnotes. The consensus big cloud players are AWS, Google and Microsoft – while enterprise incumbents like Oracle, Teradata, and IBM are cited as tertiary at best.

Overall, Oracle’s financial growth has been modest in the last decade, while its cloud claims have been bold. It is by no means alone in the creative work it has done to define cloud, the promise of future growth, on its accounting ledger.

But in terms of pitching the everyday database automation advances of Oracle Autonomous Database [it’s enterprise blocking and tackling, right?] as the one true self-driving database for the cloud of the future – well, Oracle has no equal there.

Its latest tactic is to tout its acquisition of health systems giant Cerner as a great opportunity to rapidly modernize Cerner’s systems, move them to the Oracle Cloud, and count them as cloud revenue on the financial report.

As I said, the company still experiences gradual overall gains, so I may be talking style rather than substance when I say they missed some big new tech boats, although they revisit these product line gaps from time to time. I’m talking about databases, development tools and persistence stratagems that all called for less spend and new thinking:

 

  • Oracle missed opportunities to expand its MySQL brand at a time when competitive PostgreSQL database versions were becoming go-to players in the AWS and Microsoft stables for open-source distributed SQL. In September, the company came out with a bigger, better MySQL known as the MySQL HeatWave Database Service.
  • In fact, Oracle played down, ignored or glossed over inexpensive clustered open-source databases – both SQL and NoSQL – despite having the original Berkeley DB NoSQL stalwart in hand.
  • JSON-oriented document database development led by MongoDB is a particular thorn. Oracle, like others, found ways to bring JSON to its RDBMS, but Mongo is still on the upstroke. Oracle last week addressed what it called a mismatch between JSON and the Oracle SQL database in the form of “JSON Relational Duality” – a new developer view supported in Oracle Database 23c. (A small sketch of the mismatch in question appears after this list.)
  • Also, Oracle was slow to support cloud-friendly S3-style object storage. Not surprising, in that the great goal is to place data into an Oracle database. But maybe it doesn’t have to be Oracle Autonomous Database. Last week, Oracle described MySQL HeatWave Lakehouse, which may be a step toward broader support.
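What does that “mismatch” look like in practice? The sketch below is a generic illustration in Python with SQLite, not Oracle’s 23c syntax; the order/customer schema and every field name in it are made up for the example. It stores the same business record once as a single JSON document, the way a MongoDB developer would handle it, and once as normalized tables, then rebuilds the document with a join. That round trip is the work a duality view is meant to take off the developer’s hands.

```python
import json
import sqlite3

# Hypothetical order record as a document database (e.g., MongoDB) would hold it:
# one nested JSON object per business entity.
order_doc = {
    "order_id": 101,
    "customer": {"name": "Acme Corp", "region": "US-East"},
    "items": [
        {"sku": "GPU-01", "qty": 2},
        {"sku": "SSD-17", "qty": 4},
    ],
}

# The same data in normalized relational form: three tables joined by keys.
# SQLite is used here purely to keep the illustration runnable.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE orders    (order_id INTEGER PRIMARY KEY, customer_id INTEGER);
    CREATE TABLE items     (order_id INTEGER, sku TEXT, qty INTEGER);
""")
con.execute("INSERT INTO customers VALUES (1, 'Acme Corp', 'US-East')")
con.execute("INSERT INTO orders VALUES (101, 1)")
con.executemany("INSERT INTO items VALUES (?, ?, ?)",
                [(101, "GPU-01", 2), (101, "SSD-17", 4)])

# Rebuilding the document view of order 101 takes a join plus reassembly in
# application code; a duality view exposes the same rows directly as JSON.
rows = con.execute("""
    SELECT c.name, c.region, i.sku, i.qty
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    JOIN items i     ON i.order_id = o.order_id
    WHERE o.order_id = 101
    ORDER BY i.sku
""").fetchall()

rebuilt = {
    "order_id": 101,
    "customer": {"name": rows[0][0], "region": rows[0][1]},
    "items": [{"sku": sku, "qty": qty} for _, _, sku, qty in rows],
}
assert rebuilt == order_doc
print(json.dumps(rebuilt, indent=2))
```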

This last item, object storage, seems to be getting a bit more of Oracle’s attention as Snowflake Inc. rises on the back of its 3-in-1 distributed cloud data warehouse, data lake and data lakehouse. At Oracle’s yearly confab, Snowflake seemed to have garnered leader Ellison’s grudging admiration.

No small feat, that!

It’s a little hard to capture Ellison-speak on the page. But his presentations move salesfolks – and customers too. This recalls Curt Monash once calling him “one of the great public speakers and showmen of the era.”

What did Ellison tell the conference crowd? Well, besides a lot about how Oracle technology helped deliver Covid-19 vaccines, he spoke about the cloud market. People are moving from single-cloud to multi-cloud architectures, Ellison told the crowd at Oracle Cloud World 2022.

“The fact that this is happening is changing the behavior of technology providers…So the first thing that happened as people use multiple clouds is that service providers started deploying new services in multiple clouds, maybe most famously it’s Snowflake,” he said. Then, in a nod to the new highflier, he added, “And Oracle got the memo.”

“You know, we noticed, and we actually thought that was a good idea,” he said.

Of course, evolution to multi-cloud is a chance for Oracle to take another at-bat in the cloud game. The bad news, as in the past, is that so much of the company’s effort goes toward moving everything into the Oracle database. That is why any shift to emphasize MySQL HeatWave would be notable.

Jack Vaughan is a writer and high-tech industry observer.

~~~~~~~~~~~~~~~~~~

My Back Pages
Data Lake Houses Join Data Lake in Big Data Analytics Race – IoT World Today – 2021
Purveyors of data lake houses and cloud data warehouses are many, and opinions on qualifications vary.

Oracle Cloud IaaS security gets superhero status from Ellison – TechTarget – 2018 [mp3 – Podcast]
It’s incremental!

Updated Oracle Gen 2 Cloud aims to challenge cloud leaders – TechTarget – 2018
The central product for Oracle is the database, and all the related tools that support the continued dependence of customers on the database.

On top of the RDB mountain – ADTmag 2001
Never lose the database sale: It’s tattooed on their foreheads.

Progressive Gauge Recently Noted – Edge, Quantum Computing, More

September 29, 2022 By Jack Vaughan

Here’s a brief video look at some of the advanced technology trends we’ve been watching in top web journals and our own humble Progressive Gauge Blog – analyzing current activity based on experience over 20 years in the computer trade press, now called media. We start off with a discussion of quantum computing, which works by moving subatomic waves/particles.

How well can Nvidia tread the Agglomerverse?

September 25, 2022 By Jack Vaughan

Nvidia has worked hard to emerge from the worlds of graphic cards, gaming, and bitcoin mining to become a potent presence in enterprise AI considerations. It also is poised to play as a key vendor in the Metaverse, an AR-imbued but ill-defined repository for the next version of the Web.

More work is in store now as the GPU company – like most companies of any sort – navigates a more difficult economic environment, one where macro winds augur a possible enterprise spending slowdown. Already, Nvidia CEO Jensen Huang has led his crew into spaces others could not imagine.

Graphics Processing Units (GPUs) support ultrahigh memory bandwidth applications. They can churn through neural networks and sundry matrix multiplications like banshees. Huang and company have pursued all their possible uses and created a large portfolio of use cases, even as would-be competitors nip at their heels with more specialized offerings.
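To put the matrix multiplication point in concrete terms: the core operation inside a neural network layer is a dense matrix multiply, which is exactly the massively parallel, memory-bandwidth-hungry work GPUs are built for. Below is a minimal NumPy sketch of that workload on the CPU; the matrix sizes are arbitrary, chosen only for illustration. On an Nvidia GPU the identical operation is what cuBLAS-backed frameworks such as PyTorch or CuPy accelerate.

```python
import time
import numpy as np

# A toy neural-network "layer": multiply a batch of activations by a weight matrix.
batch, d_in, d_out = 4096, 4096, 4096           # arbitrary illustrative sizes
activations = np.random.rand(batch, d_in).astype(np.float32)
weights = np.random.rand(d_in, d_out).astype(np.float32)

start = time.perf_counter()
outputs = activations @ weights                 # the dense matmul at the heart of deep learning
elapsed = time.perf_counter() - start

# Rough throughput estimate: a matmul performs about 2*m*n*k floating-point operations.
flops = 2 * batch * d_in * d_out
print(f"output {outputs.shape}: {elapsed:.3f} s on CPU, ~{flops / elapsed / 1e9:.1f} GFLOPS")
```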

Visionary Huang, whom we heard last week in keynotes and press conferences related to Nvidia’s GTC 2022 event, calls Nvidia an “Accelerated Computing Company.” And he has set out to exploit “the Full Accelerated Computing Stack.”

These ambitions take form in a true slew of new offerings – ranging from the Nvidia DLSS 3 deep learning super sampling technology to GeForce RTX Series GPUs for neural rendering, Omniverse Cloud Services for building and operating industrial metaverse applications, the Omniverse Replicator for synthetic data production, and the 2,000-TFLOPS Thor SoC. The latter is probably well described as “a super chip of epic proportions.”

Nvidia was early to see the possibility that AR/VR technology could drive a more interactive worldwide computing environment. The company coined the term “Omniverse,” but now it has joined others in the “metaverse” quest. For now, the metaverse is a loose agglomeration (the ‘Agglomerverse’?) of such elements as physics simulation, digital twins, and, of course, AI modeling. This puts Nvidia in competition, or what Sam Alpert called coopetition, with a host of other vendors. Hype vastly surpasses reality in today’s metaverse, and the payoff is both unclear and distant.

Meanwhile, enterprise AI has found a place in data centers, and Nvidia has established a genuine foothold there. Obscured in the rush of GTC 2022 product announcements were less-flashy Apache Spark accelerator technology and AI inference announcements that may show up in revenue reports sooner than metaverse cases will. Huang, for his part, sees the two technical domains playing off one another.

Be that as it may, in the metaverse and enterprise AI alike, Huang needs boots on the ground. These undertakings need great advances in skilling around big data.

It remains to be proved that corporations are any more ready now to take on enterprise AI and the metaverse with imagination and execution. Can they imagine and execute as well as, or better than, they did with Big Data Hadoop beginning ten years ago?

It’s worth noting that GTC 2022 software tools announcements were as plentiful as hardware news, showing the company is seeking ways to smooth the path to such advancements. Nvidia will likely need to take on greater headcount, and forge more mega-partnerships like the one announced with Deloitte last week, if it is going to successfully seed enterprise AI and metaverse apps.

Like most tech stocks, Nvidia’s has been in free fall. But some of its challenges are unique. When US government policy looked to slow down or block the transfer of advanced AI to China, Nvidia felt the brunt of it.

Meanwhile, the general rout of cryptocurrency impedes chip sales to crypto miners – and, as some news reports have it, the recent 2.0 update to the Ethereum blockchain takes a new proof-of-stake approach to processing and reduces the general call for GPUs for mining.

At the same time, the gaming card market has gone from famine to glut in the 24-month-plus period following the start of the global COVID pandemic. Moreover, the cost of these ever-bigger and more functional chips goes up, up, up, emptying gamers’ coffers.

Successes in these areas gave Nvidia wiggle room as it pursued enterprise AI. The wiggle room gets smaller just as the metaverse and enterprise AI to-do list gets taller. Among this week’s slew of portfolio additions there are some parts that will find users more quickly than others, and it’s up to Nvidia to suss those out and ensure they prosper. – Jack Vaughan

What’s it take to make #Metaverse real? [asks @deantak ]. In #GTC22 presser, Jensen discusses GDN – that is: a global Graphics Delivery Network – and notes as analog #Akamai Content Delivery Network (CDN). He said: “We have to put a new type of data center around the world.” pic.twitter.com/6Ur8IFwGJ3

— Jack Vaughan (@JackIVaughan) September 21, 2022

Jensen: We have a rich suite of domain-specific application frameworks. Now we need an army of experts to help customers apply these AI frameworks to automate their businesses. [Cue Deloitte soundtrack.] https://t.co/XBGewQGALP

— Jack Vaughan (@JackIVaughan) September 21, 2022

Omniverse Replicator — enables developers to generate physically accurate 3D synthetic data, and build custom synthetic-data generation tools to accelerate the training and accuracy of perception networks. https://t.co/t8HnVWvCcT

— Jack Vaughan (@JackIVaughan) September 20, 2022

Tech segments merge and fork

September 15, 2022 By Jack Vaughan

Source: IDC

The Skeptical Examiner. Tech Industry segments merge and fork in generally obscure ways. That can be driven arbitrarily by the categorization strategies that work for analyst groups like Gartner or IDC, but it’s also driven by the fact that technology buyers don’t live in categories convenient for marketers.

Among vendors’ deflection strategies in interviews is this: “You are comparing apples and oranges.” The implication: they have no competition. No competition, that is, if the world is in neat compartments.

In the fruit section of any supermarket you will find people grabbing apples, oranges, blueberries, bananas; I’ve never seen anyone grab a kumquat. And tech buying can mirror this wanton buyer promiscuity.

That thought occurs today while looking at IDC’s Market Glance for the High Performance and Performance Intensive Computing sectors. The sets and subsets thereof are subjective and various … and often collide.

The cursory viewer may be surprised by the extent to which Nvidia and IBM compete here and there. That says something about IBM’s challenges, which, obviously, come from more directions than just Nvidia.

On the Nvidia side, it tees up a question as to whether the chip and tools maker can support multiple efforts successfully as it looks to break out of the gamer-crypto space and thrive in the new vistas of AI.

IBM’s focus on AI, which arguably seeded the broad renewed interest in the area, seems back-burner stuff for now, as it dims down the hype machine that was Watson.

Is ‘AI’ another name for high-performance computing?

I know the Nvidia/IBM angle on this IDC chart (above and below) surprised me. As one wag said: Check with your ophthalmologist before viewing it. – J.V.

 

.@IDC‘s Market Glance for Performance Intensive Computing. The convergence of HPC w/ AI, Big Data, Data Analytics, and Quantum Computing brings consolidation of infrastructure bringing decades of HPC’s best practices into the forefront to achieve optimal price/performance! pic.twitter.com/N6r5c0m2F6

— Matt Eastwood (@matteastwood) September 13, 2022

Sandia researcher pursues programming tricks to improve quantum computing’s chances

August 8, 2022 By Jack Vaughan

The Photon Box (1920s)

A grand sampling of today’s quantum movers and shakers shows many come from physics departments, R&D equipment makers, and the like, and that should remain the case for some time. This is one among many clues indicating the quantum computing industry is still struggling to be born. One foot is very much in the labs where quantum physics research has been going on for decades.

A lot of work going on now has to do with moving particles and waves in ways that advance knowledge of the underlying quantum aspects of nature. It’s the stuff of headlines describing breakthroughs in Science and Nature magazines, as well as Nobel prizes. Manipulating things subatomic – it’s still not quite commonplace activity.

A news release from Sandia National Labs provides a glimpse into what goes on as researchers strive to scale up quantum computing beyond the original test beds. From Sandia comes word that the Department of Energy Office of Science has conferred a five-year Early Career Research Program Award grant on Timothy Proctor to improve the quantum computer programs now being devised at the storied research labs. A look at Proctor’s work discloses some of the friction points with which ‘quantumists’ must deal.

Proctor came to Sandia via Leeds University, where he explored a variety of techniques for creating computational quantum gates. A lot of that work has to do with creating operating system software that manages new types of hardware and communications. Error correction, boot order, and other operational traits that were more or less solved some time ago in classical computing are still frontier undertakings for the quantum kind.

These days, he is looking at how commands are arranged and structured and what effect different approaches have on computing accuracy. That use case, in fact, is one of the key ones that fledgling quantum software houses are pursuing. Not surprisingly, comparing results on highly divergent quantum computing types is the first order of business for many who are just now dipping toes in the water.

As part of their public debates on physics in the 1920s, Einstein and Bohr did thought experiments. After all, there was no apparatus to separate, observe and manipulate atomic and subatomic particles. So, they used their minds. Laboratory rigs accomplish those experiments today – but they do not yet reach the scale at which quantum computing dependably moves beyond today’s best classical systems. When will that change? Work like Proctor’s algorithmic efforts will yield clues. – Jack Vaughan

Related
https://newsreleases.sandia.gov/proctor_award/
https://qpl.sandia.gov

 
