MongoDB CEO Ittycheria: AI Has Reached ‘A Crucible Moment’ In Its Development.
In a wide-ranging interview with CRN at last week’s MongoDB.local NYC event, CEO Dev Ittycheria offered his views on why AI development is about to enter a more “transformative” phase, the thinking behind the company’s MongoDB AI Application Program to accelerate AI application development, and the role partners are playing in the AI wave.
MongoDB occupies a critical position in the AI wave now sweeping the IT industry. With MongoDB Atlas, the company’s next-generation cloud database and development platform, along with related products, its strategic alliances with all three cloud hyperscalers, and its extensive base of channel and technology partners, the New York-based company is well-positioned for what comes next in AI.
Last week at the company’s MongoDB.local NYC event, president and CEO Dev Ittycheria, in a keynote speech (pictured), said AI is at a “crucible moment” – an inflection point when pivotal decisions are made that establish long-term trends and determine a technology’s trajectory.
“We’re at a time where every company is facing a crucible moment when it comes to AI. AI can completely reimagine how you run your business, how you engage with your customers, and how you become more nimble and efficient,” Ittycheria said.
[Related: The Coolest Database System Companies Of The 2024 Big Data 100]
Ittycheria compared the current AI wave to previous tectonic shifts in the IT industry including the internet, the iPhone, and cloud computing. Each was marked by early (and sometimes failed) businesses and tech products followed by more transformative technologies. (The CEO even showed a graphic of a newspaper headline from 2000 asking whether the Internet was “just a passing fad.”)
Ittycheria said the real value of AI will be generated by the applications and services that are developed and run on platforms like MongoDB Atlas. “It’s a no-brainer that you’re going to see an explosion in the number of [AI-based] applications built this coming year.”
To that end, at the event, MongoDB launched its MAAP (MongoDB AI Application Program) initiative, which provides a complete technology stack, services and other resources to help businesses develop and deploy applications with advanced generative AI capabilities at scale. MAAP, with MongoDB Atlas at its core, includes reference architectures and technology from a who’s who in the AI space, including all three cloud platform giants, LLM (large language model) developers Cohere and Anthropic, and a number of AI development tool companies including Fireworks.ai, LlamaIndex and Credal.ai.
In addition, MongoDB unveiled a number of new MongoDB Atlas capabilities that make it easier for businesses and organizations to build, deploy and run modern applications – including those with generative AI capabilities.
Those unveilings included the general availability of MongoDB Atlas Stream Processing for developing applications that work with streaming “data in motion,” the public preview of MongoDB Atlas Edge Server for deploying and operating distributed applications in the cloud and at the edge, and the general availability of MongoDB Atlas Search Nodes on AWS and Google Cloud and in preview mode for Microsoft Azure.
MongoDB, meanwhile, has enjoyed robust growth. For its fiscal 2024 (ended Jan. 31, 2024) MongoDB reported total revenue of $1.68 billion, up 27 percent year over year, highlighted by 34-percent growth in MongoDB Atlas revenue.
In the interview at the MongoDB.local NYC event, Ittycheria discussed the coming “transformative” phase of AI development, the thinking behind MAAP, and the role partners are playing in the AI wave. His answers have been lightly edited for clarity and length.
In your keynote speech you said that AI is at a “crucible moment” right now. How so?
I think it’s a pending crucible moment – how people essentially prepare for and leverage [AI] for the future. The point I was trying to make is it hasn’t completely transformed my life – not yet. The analogy I’d use is 1998 to 2000. The internet had not completely transformed my life just yet, but it was soon going to do so. Businesses were starting to leverage the internet, and my personal life was being affected by it. Even today, if internet access goes out, all hell breaks loose.
So why do you think the crucible moment is still pending? We’re not at an AI crucible moment just yet?
I think, to me, it’s back to [the fact] there’s lots of investment being made at the infrastructure layer of AI – building out compute, training models, putting in all the foundational elements for AI. But the first generation of AI apps that we’re seeing are these chatbot tools that do research, summarize information and generate content. They serve some utility, but they are not truly transformative.
One point I was making was that the same thing happened when the first set of apps for the iPhone came out. They were nice utilities. But then you saw how those apps ultimately affected every [person] – how they lived, how they worked, how they collaborated – all delivered through the apps. And so the point I was making is that today we just have some interesting, nice [AI] utilities that serve some function or purpose. But they’re not transformative, as they will be in the future.
As you said in your keynote, as happened with the internet in its early days, is there a danger of overstating or overestimating the impact of AI, either right now or in the future?
Every morning I wake up to something new in AI. Some new company gets funding, some new company is rolling out a new service or capability, or [there is] some new worry about safety, the negative implications of AI – there’s always something coming out. There’s a lot of interest around AI. But I do think, like with any tech adoption that has happened to us, it’s very easy to over-extrapolate in the short term, but underestimate in the long term.
I think the point of my keynote was to show that what happened in 2000, some people considered the internet a fad because in March of 2000 the bubble burst and they said, ‘This is just a lot of hype.’ And it was at that exact moment that the seeds were being planted for a new generation of apps and services that transformed my life.
So the first wave of AI applications that you just mentioned, what would be the easy ones? What would be an example of that?
We see a lot of customers rolling out chatbots for support services. It’s a relatively easy app because you have your own corporate data. We’re rolling out support agents. We have the corporate data, best practices, common errors that customers make, common misconfigurations, common questions. And it’s quite beneficial because instead of requiring a human to respond, you can very quickly get bots to respond with accurate information, and that helps both the customer and us. It helps the customer get the answer quickly, and it helps us because we don’t have to expend a lot of resources and cost to do so.
So those are what I call almost no-brainer kinds of apps that people are using. And you see some people using that for internal use. We have an internal chatbot for our sales and marketing teams if they want to find information about a product, how to handle a question a customer has, or how to put a proposal together. There’s all this institutional knowledge in our company that we can quickly bring to the fingertips of our people. I’m not saying it’s not of value, but it’s not exactly transformative. It’s just a way to become more efficient.
So what is your vision of how AI applications will evolve? What will the next wave of applications look like?
I’m starting to see signs – I’ll give you a good example. We have an automotive company [customer] that, as you can imagine, gets calls from customers when something’s not working – their car isn’t running properly. What they have done is record what engines sound like and correlate engine sounds to the root-cause issue.
Now what they can do [using AI applications] is, when you’re driving one of their cars and having a problem, they’ll ask if you can record the sound of your car’s engine and send that sound file to them. That file will be used to diagnose – with a high probability of [accuracy] – what the problem is. So if your car is making a funny noise, simple analysis might show that your brakes need to be replaced because they squeak, or that your carburetor needs to be repaired or replaced.
So that’s an interesting way, a whole new modality, to essentially troubleshoot. I don’t require the customer to come to my shop, leave the car for a couple days. Now they say, ‘We know what the problem is. We’re going to order the part for you. Show up tomorrow, the day after, and you’ll be in and out very, very quickly.’ That’s an example of something that’s a little bit more transformative than a chatbot.
There’s no question customers are feeling overwhelmed. And I would say a combination of overwhelmed and fearful. They’re overwhelmed because the rate and pace of innovation in AI is so high, it’s almost like there’s a new announcement every week. Meta came out with Meta Llama 3 [April 18] – their benchmarks apparently are as good as OpenAI’s ChatGPT – so some people are saying maybe I should bet on open source. A couple of weeks earlier, Mistral came out with a very low-cost model that was as good as GPT-4. Before that, Anthropic came out with a new model. So the rate and pace of change in AI is very, very high. People are feeling overwhelmed, which means they get paralyzed.
But on the other side of the coin, they’re fearful. They’re fearful because if I don’t do anything, don’t act and try to take advantage of new technology, there are risks that my competition will do so and potentially not just marginalize me but potentially disrupt me. That’s the tension that customers are feeling.
And that’s the whole purpose behind our MAAP program. With the MongoDB AI Application Program, we’ve come out with a validated set of reference architectures for a bunch of use cases, built-in integrations so customers can get started right away, and technical expertise to get you up and running as fast as possible.
But to your point, we’re not locking people into any one ecosystem. We work with any cloud, any LLM, any orchestration tool, any fine-tuning model. The point is, the only way to overcome this is you’ve got to start somewhere. You can’t just sit on the sidelines [and] do nothing. But you also don’t want to put all your eggs in one basket. So our point is, get started, try some projects, learn. And through that experience you’ll find out what’s important to you and what’s most impactful for you. And you’ll gain confidence – or maybe lose confidence, based on the technologies you’re using – and decide, ‘Okay, I want to change this component, maybe this orchestration layer I was using just doesn’t fit me, I’m going to try something else.’ But it allows them to learn without taking an exorbitant amount of risk.
Talk a little bit more about how the MAAP initiative addresses these issues. How will providing all these components in one environment help people get past the fear and the feeling of being overwhelmed?
Well, one thing is, people need databases. So it’s not like your database technology is going to change – you can feel pretty comfortable betting on MongoDB. And if they feel like, “Hey, I’ve started with OpenAI, but I’d rather use [Meta’s] Llama,” that’s easy, because there are APIs to all the large language models. It becomes relatively easy to point your application from one foundation model to another if you’re using a certain set of orchestration layers – tools like LlamaIndex or LangChain. It’s not that difficult. We have integrations with all those tools so that you can decide which one fits the way you want to work.
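The portability Ittycheria describes – keeping application code stable while swapping the model behind it – comes from putting a thin abstraction between the app and the provider, which is essentially what orchestration layers like LangChain and LlamaIndex do. A minimal sketch of the pattern (the class and function names here are illustrative stand-ins, not MongoDB’s or any library’s actual API):

```python
from typing import Protocol


class ChatModel(Protocol):
    """Common interface every backend exposes, in the spirit of an
    orchestration layer that abstracts over LLM providers."""
    def invoke(self, prompt: str) -> str: ...


class OpenAIBackend:
    # Stand-in for a hosted-API model; a real client call would go here.
    def invoke(self, prompt: str) -> str:
        return f"[openai] {prompt}"


class LlamaBackend:
    # Stand-in for an open-source model served elsewhere.
    def invoke(self, prompt: str) -> str:
        return f"[llama] {prompt}"


def answer(model: ChatModel, question: str) -> str:
    # Application code depends only on the interface, never the vendor.
    return model.invoke(question)


# Swapping providers is a one-line change at the call site:
print(answer(OpenAIBackend(), "What is MAAP?"))  # → [openai] What is MAAP?
print(answer(LlamaBackend(), "What is MAAP?"))   # → [llama] What is MAAP?
```

Because `answer` is written against the interface rather than a vendor SDK, changing the model provider touches one line, which is the “don’t put all your eggs in one basket” flexibility the program is pitching.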
And then there are fine-tuning tools, and so on and so forth. You can still use Fireworks.ai for integration and model-hosting capabilities. So it depends on what your model is – some customers want to do everything in house; they want to use open source and may run the models in house. Some customers want to start in the cloud. You can start with Amazon, you can start with Azure. But if you find [that] most of your data is in AWS, maybe you stay on AWS – we give you that flexibility. We’re the only one that partners with all three [cloud hyperscalers].
We’ve written quite a bit about MongoDB’s strategic relationships with the three hyperscalers: AWS, Microsoft Azure and Google Cloud, and how MongoDB works with them even though some of them have competing products. What’s their role here?
We’re very popular in all three clouds. There’s tons of MongoDB usage in all three clouds. [The hyperscalers] are doing it to ensure that their own customers using MongoDB get access to all the MongoDB capabilities. And it’s not just from an integration point of view. It’s the go-to-market – we’re in all the co-sell programs they have, we’re on the first-party consoles. We’re the only ISV featured in all their startup programs. So startups, whether they’re building on AWS, Google [Cloud] or Azure, can leverage MongoDB. Those are examples of the fact that we really try to make it easy for our customers. No matter where they are, they have the freedom to run their workloads anywhere.
What kind of opportunities does the MAAP initiative create for MongoDB’s partner base beyond the specific launch partners, including systems integrators (SIs) and ISVs?
Boutique SIs are part of the program. And the reason we started there is because we want to start with partners who really know MongoDB and really know some of the integrations we’ve built. In terms of people, Accenture and other large SIs have lots of resources, but not all of them are trained on MongoDB. You can expect us to expand to the larger SIs. But we wanted to start out of the gate with people who could really hit the ground running.
I do expect the partnerships to expand over time [and] you will see us adding more partners to the program. There’ll be regional partners by geography, industry-specific partners by vertical, and domain-specific partners for particular use cases or technologies.
The key takeaway I would want to leave with you is that partnerships are a core element of this. We know we can’t do it all on our own.
We wanted to come out of the gate with that, because what customers really appreciate is a perspective and a point of view. What they don’t want is, ‘I can do anything you want. Tell me what you want.’ That’s not very helpful. ‘Give me a starting point, give me a place where I can start and get going.’ And as they get more experience, they’ll start forming their own point of view. We’re trying to help them.
And when we talk to customers, we’re not cloud dependent or partial to any one cloud. We’re not partial to any one LLM. And so customers feel like, ‘Okay, I can trust your feedback because it’s not like you have one set of tricks you’re trying to sell me.’ My point is, it’s not like we have a finite set of solutions and every answer is our own solution. That gives us more credibility with our customers.
In addition to MAAP, there were a series of announcements at the MongoDB event including the general availability of MongoDB Atlas Stream Processing, the general availability of MongoDB Atlas Vector Search integrated with Amazon Bedrock, and collaboration with Google Cloud to optimize Gemini Code Assist for building applications on MongoDB. How do partners benefit from these expanded capabilities and what kind of opportunities do they create for partners?
So what partners can do – because we have a lot of people building applications on top of MongoDB – is use a much more simplified architecture for things like event-based applications with stream processing. It’s all designed around dealing with live data and data in motion as it happens. You have to react to those events and figure out how you respond to them. We make it easy for partners to build these event-driven applications, which to me are the modern apps of the future – you can’t have a static app that can’t deal with new data. For our partners, it enables a much more simplified architecture, because the stream processing and the OLTP engine of MongoDB are essentially one platform. It’s much easier and much more liberating to build on MongoDB than to have to stitch together a bunch of bespoke technologies.
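The event-driven shape he describes reduces to a small pattern: a processor watches a stream of events as they arrive and reacts only to the ones that match a condition. A minimal sketch of that pattern – the `Event` type, feed and handler names are illustrative, not MongoDB’s Atlas Stream Processing API:

```python
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Event:
    kind: str       # e.g. "sensor.reading", "sensor.alert"
    payload: dict


def process(stream: Iterable[Event],
            handlers: dict[str, Callable[[dict], None]]) -> int:
    """Dispatch each live event to the handler registered for its kind,
    mirroring a match-then-act stream-processing pipeline. Returns the
    number of events that triggered a reaction."""
    handled = 0
    for event in stream:
        handler = handlers.get(event.kind)
        if handler:
            handler(event.payload)
            handled += 1
    return handled


# Usage: react to alert events as they arrive, ignore routine readings.
alerts: list[dict] = []
events = [
    Event("sensor.reading", {"temp": 21}),
    Event("sensor.alert", {"temp": 98}),
]
process(events, {"sensor.alert": alerts.append})
print(alerts)  # → [{'temp': 98}]
```

In a real deployment the `stream` would be a continuous feed (a Kafka topic or a change stream) rather than a list, but the application code keeps the same shape: filter, then react.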
It’s the same with Edge Server. We’ve talked about running everything in the cloud, but if you have customers – like a retail customer with stores in a particular geography, country or region – they need to do local processing. Bringing intelligence down to the edge is an important element; you can’t run everything in the cloud. Over time you’re going to start running models on your devices – Apple is already making some noises about how that’s going to happen soon. So you need to be able to work with models that could be on devices, or on edge servers – say, in a store, or in a hospital where you need very quick access. There’s a whole host of new use cases that are going to emerge.
In your keynote you mentioned MongoDB’s investments in a half-dozen “boutique” systems integrators including PeerIslands, Pureinsights and Gravity 9. They are obviously a key part of MAAP. Do you foresee additional investments like this?
One of our board members is a gentleman named Frank D’Souza who was co-founder and CEO of Cognizant [Technology Solutions]. And as you know, Cognizant became a very, very large systems integrator. What he shared with us – it was his advice that we ultimately acted on – was that the challenge the largest SIs have is that they’re so large [and] they’re also so distributed, [that] just because you get one team to understand and apply MongoDB for one or two accounts does not mean the rest of the organization has the capability or the wherewithal to do so.
So his point was, invest in SIs who become experts and know how to build applications on top of MongoDB. And then ultimately, once they get to a reasonable size and scale, someone like an Accenture or an Infosys or TCS may come in and acquire that company. Accenture buys a new company almost every week. That’s the way they inculcate new technologies into their organizations, because it’s too hard to do so organically.
And so that’s part of our strategy: we want to incubate a bunch of systems integrators who become deep experts in MongoDB, whether that’s building new applications on MongoDB or migrating legacy applications onto MongoDB. And we see huge demand for that.
And by the way, we’ve already done a lot of business with Accenture and TCS and Infosys and Cognizant and Capgemini and all that. In fact, I have a call with one of the top leaders of Accenture tomorrow. We don’t want to be in the systems integration business. This is more of a way to help build up a competency that one of the large SIs may find very attractive. And there are analogues: Salesforce did this themselves. They had a tough time getting large SIs to focus on Salesforce technology, so as their business grew, they invested in boutique SIs that ultimately were acquired by the larger SIs.
I know MongoDB has been quite bullish on its financials given its 27-percent revenue growth in fiscal 2024. How is this year looking? Are you seeing any impact of the uncertain economy? And is this AI wave immune to the economy?
I can’t comment too much on the [fiscal 2025 first quarter] quarter because that would be non-public information. A year ago we started seeing signs of the economy slowing down – that’s when the Fed [U.S. Federal Reserve] was raising rates [and] interest rates were spiking. And what we said was, you know, organizations are kind of in a conundrum, they have to figure out how to do more with less, but they also have to invest in AI to really figure out how they can leverage this new technology for a competitive advantage.
And so I think customers are very interested in becoming more confident with AI. The questions they ask are about where to get started: ‘What use cases should I focus on, and what tech stack should I use?’ That being said, they’re being very prudent, because there’s got to be a compelling return on investment for them.