On Tuesday, 13 May 2025, Datadog (NASDAQ:DDOG) participated in the 53rd Annual JPMorgan Global Technology, Media and Communications Conference. CEO Olivier Pomel discussed the company’s robust growth, driven by digital transformation and cloud migration, while addressing challenges such as geopolitical concerns and the complexity of modern systems. The conversation highlighted Datadog’s strategic investments and the role of AI in shaping future growth.
Key Takeaways
- Datadog’s Q1 results show strong growth, with CRPO up 30% and a rise in eight-figure deals.
- AI plays a crucial role in Datadog’s strategy, with significant investments in AI observability products.
- The company sees limited impact from geopolitical issues on cloud adoption.
- Datadog is investing heavily in engineering and sales capacity to capture market opportunities.
- The company is optimistic about AI-driven growth and enterprise adoption of AI technologies.
Financial Results
- Q1 Performance:
- Current Remaining Performance Obligation (CRPO) growth accelerated to 30%.
- Increase in eight-figure deals, from 1 in the previous year to 11 in the current Q1.
- 45% of the Fortune 500 are customers, with a median Annual Recurring Revenue (ARR) of less than $500,000, highlighting growth potential.
- Market Position:
- Datadog is outpacing industry growth and gaining market share.
- Investment Strategy:
- Approximately 30% of revenue goes to engineering investments.
- Sales capacity investments have yielded strong returns and productivity.
Operational Updates
- Cloud Migration:
- Cloud migration remains robust, with AI accelerating this trend.
- Customer Base:
- Over 10 AI-native companies exceed $1 million in ARR for Datadog.
- AI Natives:
- AI natives account for 8.5% of ARR in Q1.
- European & Canadian Data Hosting:
- Minimal impact so far from European and Canadian hesitancy about US cloud providers, given the lack of viable alternatives.
- Sales Capacity:
- Positive results from sales capacity investments made in the latter half of the previous year.
Future Outlook
- AI-Driven Growth:
- Growth among AI-native companies is seen as a leading indicator of future demand from larger enterprises.
- Long-Term Opportunity:
- Opportunities in managing data residency laws across different geographies.
- Enterprise Adoption of AI:
- Enterprises are expected to follow AI natives, starting with third-party models and scaling to homegrown models.
- AI Code Generation:
- Significant potential in AI code generation for monitoring, understanding applications, and ensuring code safety.
Q&A Highlights
- Cloud Activity:
- Datadog focuses less on quarterly fluctuations in hyperscaler revenue, since these are partly driven by supply-side factors such as GPU availability.
- Trade War Impact:
- Minimal impact from trade wars, as Datadog’s solutions are part of transformational investments.
- AI Native Companies:
- Diverse group involved in AI infrastructure, model building, agent development, and consumer applications.
- LLM Observability:
- Datadog’s LLM Observability ensures models are operational, fast, correct, safe, and valuable.
- DeepSeek Impact:
- Enthusiasm for lower-cost AI models could lead to broader adoption.
For more detailed insights, please refer to the full transcript below.
Full transcript - 53rd Annual JPMorgan Global Technology, Media and Communications Conference:
Mark Murphy, Head of Software Research, JPMorgan: Okay. Good afternoon, everyone. I am Mark Murphy, head of software research, with JPMorgan. Great pleasure to be here today with Olivier Pomel, CEO and cofounder of Datadog. So Olivier, I just want to say thank you for taking the time to be with us again, and it’s great to have you here at the conference.
Olivier Pomel, CEO and Co-founder, Datadog: Thank you. Great to be here in Boston.
Mark Murphy, Head of Software Research, JPMorgan: So it’s interesting when you run through and look at the stats. Datadog is one of only four enterprise software companies with at least $3 billion in revenue and growing mid-20s plus. The other three are Palantir, CrowdStrike, which we were just talking about, and Snowflake. So, clearly, something very rare is happening at Datadog. Can you explain in layman’s terms what is the problem that you’re out there solving for customers?
And what do you think is the phenomenon that’s fueling that level of growth and scale?
Olivier Pomel, CEO and Co-founder, Datadog: Yeah. So, as a reminder, we do observability and security for cloud environments, which means we sell to engineering teams primarily. And we help them understand whether their applications, their systems, their AI, all of that is working properly, is performing, is doing what it’s supposed to do for the business, and is secure. So the the bigger trends that make us successful are digital transformation, cloud migration, and now AI transformation. This is just the specific motion or specific trends.
But if you zoom out, really, the problem we solve and how we keep delivering more value to our customers is that we help them make sense of all of the complexity of their systems and their applications. If you replay what happened over the past fifty, sixty years in technology, or in software in particular, there’s been a dramatic increase in productivity. It’s been easier and easier to write applications. There have been more and more components off the shelf that you could combine, whether that’s software libraries, cloud components, SaaS, advanced languages, now AI models. What you see happening pretty much everywhere is an escalation of complexity, a complete explosion of complexity, and the humans are having trouble keeping up with that.
And that’s what we help them with.
Mark Murphy, Head of Software Research, JPMorgan: So, Olivier, you have over 30,000 customers. They’re using Datadog to observe what’s happening all across their own hyperscaler environments and basically to keep it running. So, it gives you quite the vantage point to comment on and really try to understand the trends. And we look back on it, and interestingly, in Q4, while, you know, the stock market was rising, it was a very disappointing quarter across all four of the top hyperscalers. All four of them either missed or guided below in Q4.
And then after that, you had the threat of a trade war coming in. The market was sliding. And now in Q1, the hyperscaler results actually felt a little more stable. When you look specifically at Azure, it accelerated. You know, it accelerated and surprised quite positively.
How are you reading the tea leaves on, you know, cloud activity for 2025 amidst potentially a trade war? Certainly, the headlines are changing every day.
Olivier Pomel, CEO and Co-founder, Datadog: We try not to look too closely at what happens quarter to quarter to the major cloud providers. And one reason for that, at least in relation to what we do at Datadog, is that a lot of those fluctuations and changes in revenue are tied to the supply of GPUs, for example, and things that are, I would say, somewhat decorrelated from market demand and broad trends. So over the longer term all of that evens out, but quarter to quarter it’s quite hard to make sense of it. From our vantage point, though, what we see is that cloud migration is alive and well. It can accelerate a little bit at times, slow down a little bit at times, but it remains around a certain trend line, and we expect it to continue for the very long term.
We also expect it to accelerate and have a longer runway, thanks to the AI transition that is also happening now. Of course, digital transformation and cloud migrations are prerequisites to AI transformation. I think all of our customers and we can talk more about that, but all of our customers, whether they’re AI natives or traditional enterprises, all of them realize that.
Mark Murphy, Head of Software Research, JPMorgan: Okay. So core trends, healthy and resilient. And you’ve been consistent on that on stage here, I think, over the course of several years. Thinking back to March and April, we did hear some feedback that companies in the EU and actually Canada were becoming a little hesitant, right, to put more data into a US cloud provider, Amazon, Azure, Google, etcetera. And there was this comment that they were realizing that they need an Airbus of the cloud, right, to maintain independence, a local cloud provider.
Do you see anything tangible happening there? Do you think the bark is louder than the bite?
Olivier Pomel, CEO and Co-founder, Datadog: We do hear that feedback. So, the last few times I’ve been in Europe, I’ve definitely heard the intent of owning more of the data, hosting more of the data locally, giving more business to local players, as opposed to relying too much on global players, and on the US in particular. Now, that being said, we don’t actually see that much of an impact at this point because there are no viable options outside of the major cloud providers. If you go one step down in technology, it’s quite a bit less performant and quite a bit more expensive. I don’t think it makes sense, even if you have the intent of doing more business locally, to go for alternatives.
Now, if you look at the way the world can evolve over the next few years, I do think there’s going to be more of a motion to at least host more of the data locally and have more local governance. And as far as we’re concerned, look, first of all, we’ll go where our customers go. If they want to run data a certain way, we will be there for them. But I also think it creates an opportunity for a company like ours. Hosting data in many different geographies, jurisdictions, and having a lot of different residency laws to comply with is a huge headache.
It’s a big problem. And I think most companies will need help, and they’ll need our help to manage that. So I think long term, it’s an opportunity.
Mark Murphy, Head of Software Research, JPMorgan: So it’s more talk than action. You do think there could be a trend line there, multi year, but net net opportunity for Datadog?
Olivier Pomel, CEO and Co-founder, Datadog: Yeah, I think the real battlefield, so to speak, in the short term is going to be AI, and the ownership and the control over the AI models. I think that’s going to drive the decisions that are being made for the rest of the data centers. To put it another way, if what you really want is to be able to build and host an AI model, and if for doing that the only available option you have is the existing hyperscalers, you’ll go with existing hyperscalers. You will not wait ten years to do it so you can build your own local equivalent before you can train AI models. Okay.
Mark Murphy, Head of Software Research, JPMorgan: Very helpful perspective. So, I want to think back for a moment, Olivier, because the mood out there was very different three or four weeks ago. Coming into this earnings season, we were saying that investors were too pessimistic and we were emphasizing that really the hard data wasn’t changing as much as the soft data. So, we were saying the mood had changed, but the activity hadn’t really changed. For Datadog, the Q1 results, though, were very solid.
And what stood out was you had the bookings health. You had backlog growth or CRPO, it actually accelerated noticeably. It accelerated 30%. And then eight figure deals, you had done one a year ago in Q1. You did 11 this time.
So what do you think drove strength during Q1? And why did you have so many customers booking that much business given the kind of environment we’ve been in?
Olivier Pomel, CEO and Co-founder, Datadog: Yeah. I mean, I think there are really two reasons. The first one is, we’re still early in cloud migration, and cloud migration is going well. One of the stats we disclosed was that only 45% of the Fortune 500 are customers today, and the median ARR with those Fortune 500 is less than half a million dollars a year. Which tells you that there’s tons of growth to be had with all those customers.
They’re still early in their migration in general. And for them as companies, their cloud spend is a relatively small part not only of their IT budget but also of their top line. Meaning that that’s the part that they’re going to invest in, that’s the part that’s transformative, that’s not the part that is at risk if they’re going to face issues in the short term. So that’s number one, a very healthy market. The second reason we did well is that we’ve invested over the past year, especially in the second half of last year, in building up our sales capacity.
And that sales capacity is coming online and we see great ROI and good productivity there. And even though we’re a leader in observability (I think the market share numbers were just updated this week), we’re also taking share. We’re growing faster than the rest of the industry. Okay.
Mark Murphy, Head of Software Research, JPMorgan: Cloud migration’s doing well. The sales capacity kicked in. You’re structurally gaining share. We had shown data, you know, again, coming into this earnings season, Olivier, that the current environment was actually looking less severe than what we saw during the COVID lockdowns, and it is actually less severe than the onset of what we call the software recession, which was the second half of 2022. You know, we thought investors were expecting this environment to be worse than those prior cycles.
Can you compare and contrast what you’re seeing now? Because you see the day to day consumption, but you’ve also you also see the pipeline looking forward.
Olivier Pomel, CEO and Co-founder, Datadog: Yeah. I mean, if you look at what happened, at least seen from our business, during COVID it actually was fine because that was the explosion of online, etcetera. What was trickier for us was the end of COVID and the flattening of the demand from all of the cloud native companies, basically. Which were the ones that were spending big, that were done with their cloud migration, that were fully at scale in the cloud, that tried to save as much as they could in as short a time as possible. So that was fairly painful, and you saw that in all of the numbers we released at the time.
Today, though, if you look at the current situation, the companies that are growing fastest, the ones that have replaced the cloud natives, are the AI natives, and they’re accelerating. And they’re still fairly early in their runway. And the bulk of our business, the bulk of the demand we see, is these larger enterprises that are still fairly early. And those, as I said before, only spend a small fraction of their OpEx and an even smaller fraction of their top line on the cloud, and that’s really what they’re investing in. So from where we stand, we clearly don’t see the same kind of pressure.
Now obviously, if things take a turn for the much worse, everybody’s gonna try and save money, it’s going to be more difficult for everyone at every single level. But today, in the numbers we have and what we see in consumption or in what we see in the booking size or the willingness of customers to do deals, we don’t see any impact.
Mark Murphy, Head of Software Research, JPMorgan: Have you tried to look at that specifically in the industries that are most heavily tariff impacted? Because, to be fair, we had a wave of pre-announcements from airlines. There were a bunch of retailers that had problems. There were automakers having problems. Is any of that looking like the canary in the coal mine right now in the data?
Olivier Pomel, CEO and Co-founder, Datadog: We don’t, but again, that spend is a fairly small part of their economics at this point. So we actually had a great quarter in Q1 with traditional companies. Q4 also was great for these. We mentioned in our earnings call that one of our best new logo deals was a car manufacturer. We actually signed two car manufacturers that same day, I remember.
We signed a few airlines over the last few quarters and those are growing nicely. But again, the budget we see, the spend on us, the spend that is growing, is part of the transformational investment. That’s not part of the much larger carrying costs they have for their supply chain, their factories, their operations, their aircraft. That, I think, is where they would look to save money.
Mark Murphy, Head of Software Research, JPMorgan: Okay. So it’s more of an insulated pocket for you. I’m gonna talk for a moment about the AI native trend, Olivier. You know, Datadog has just clearly stood out for developing one of the strongest AI tailwinds, you know, really across the entire software landscape. The AI natives reached 8.5% of your ARR in Q1.
So really, Microsoft and Datadog are the two companies that are quantifying a really substantial tailwind at this point. Can you help us understand who are those companies that are driving AI for you? And why do you think Datadog is so linked to it?
Olivier Pomel, CEO and Co-founder, Datadog: Yeah, so it’s pretty much the new incarnation of the cloud natives. I think if you started a company in the past two, three years, you’re very likely an AI native. If AI is not part of your pitch, you probably cannot raise money, is my guess. So that’s mostly newer companies. We do have revenue concentration in that cohort of customers.
We have one customer that’s now our largest customer and that is meaningfully larger than the others in that cohort. But we also see a diversity of emerging winners in that cohort. So we now have more than 10 companies that are AI native that are over $1 million in ARR for us. And those are growing both as businesses and also in terms of their consumption of the cloud in general. And when you look at the makeup of those companies and what it is they do, they cover the gamut of what you need to build in AI.
There are infrastructure companies for AI, there are model builders, there are agent companies, whether they are coding agents or legal agents or other kinds of business agents. And then there are also various applications that don’t necessarily build models themselves, but that are built on top of these models and that generate value based on that for consumers, for example.
Mark Murphy, Head of Software Research, JPMorgan: So if we drill down into that and look at how they’re using it, I think Datadog isn’t really getting involved so much in the training side, right, where they’re building the models. You are involved in the inferencing stage. So, in other words, the product reaches commercialization and then you’re getting involved. Inferencing feels like it’s at a much earlier stage to us, and it’s probably going to have more durable, more explosive growth, because basically you just look around and say, well, there are so many models that are still being built. Would you agree with that?
Olivier Pomel, CEO and Co-founder, Datadog: Yes. And to level set, I think a lot of the training today is still very much a research activity, and it’s largely one-off, homegrown, and I would say a smaller fraction of companies are doing that at scale. Inferencing is where the action is. That’s where you actually have to serve customers and you scale with the demand and you provide value. Typically, even when you’re a model builder and you ship a model and you have customers connect to it, that model just doesn’t live on its own.
There are other layers on top of it. There are databases behind it, authentication systems, firewalls, everything that you would find in a traditional application, and all of that needs to be monitored with it. Also, as these models and AI companies are becoming more and more sophisticated, the models don’t operate in a vacuum anymore. Many of these models get smarter and smarter by becoming agents and by using various tools. Those tools need to be run as well.
So if I take the example of what we build at Datadog, we’re building agents to automate a lot of the work that SREs and engineers are doing. As part of that, there are models, but then those models run queries, they ask for data, they run automation scripts, and all of these different things are applications that need to run. And so what we see is, as more and more AI gets adopted and applications grow, there’s a bigger diversity of components that need to be monitored, and that’s exactly what we do.
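To make that point concrete, here is a minimal, hypothetical sketch in plain Python (not Datadog’s tooling): a toy incident-handling agent in which the model call is just one timed step, and the metric query and automation script around it are recorded the same way. The span helper, step names, and values are assumptions made up purely for illustration.

```python
import time
from contextlib import contextmanager

trace: list[dict] = []  # in a real system these records would be shipped to a monitoring backend


@contextmanager
def span(name: str):
    """Time one step of the agent and keep a record of it."""
    start = time.monotonic()
    try:
        yield
    finally:
        trace.append({"step": name, "seconds": round(time.monotonic() - start, 4)})


def handle_incident(alert: str) -> None:
    """Toy agent loop: the model call is one step among several pieces of infrastructure."""
    with span("query_metrics"):          # the agent asks a metrics store for data
        metrics = {"error_rate": 0.07}
    with span("model_call"):             # stand-in for the LLM proposing a remediation
        suggestion = f"restart the checkout service (error_rate={metrics['error_rate']})"
    with span("run_automation_script"):  # the tool that actually acts on the suggestion
        print(f"[{alert}] applying: {suggestion}")


handle_incident("checkout latency spike")
print(trace)  # every component, not just the model, shows up in what gets monitored
```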
Mark Murphy, Head of Software Research, JPMorgan: Okay. So it’s been an interesting journey, Olivier. For the last year or two, when I will ask institutional investors, what do you think of various AI products or copilots? A lot of people shrug their shoulders and seem a little unimpressed at what had been out there. We started commenting recently that the killer AI product for the buy side has finally arrived.
It is here. That product is called deep research. And the response that we get is very different when we ask about that. People say that it’s amazing. You know, it’ll take a project.
It’ll divide it into subtasks. It will write Python code for you. Basically, it saves people a ton of time. What do you think the advent of these reasoning models is going to do for growth of inferencing?
Olivier Pomel, CEO and Co-founder, Datadog: We think we’ll see acceleration of the growth there. I mean, you’re right that the reasoning models and the improvements of models and their ability to use tools really help deliver more value. We’ve seen that internally. We see that with our customers that are using those products, and we see that with our customers that are producing those products. And so we think there’s going to be, as I said, many more diverse applications to monitor.
Mark Murphy, Head of Software Research, JPMorgan: With these kinds of reasoning models, they have a different level of complexity, they have a different level of compute intensiveness. So, sometimes there are eight GPUs clocking at once. Does that make it more important or more of a challenge to try to observe and monitor those environments?
Olivier Pomel, CEO and Co-founder, Datadog: Well, I would say it depends. If you’re a model builder, maybe you’ve built a lot of that technology already. You build the model, you understand exactly how it runs. Maybe you build some of the technology to understand what happens within the model. But even if you’re a model builder, if your product is say an agent that is going to crawl the web and that is going to use a browser to try and simulate your actions and book flights for you and things like that, which you’ve seen those products and those agents are there and they’re getting better and better every day.
You will need to run infrastructure that is crawling the web. You will need to run infrastructure that is running these agents in sandboxes or these browsers in sandboxes and recording them and storing the images. So you’re going to run a very, very sophisticated and diverse software stack. And in all of that, at the end of the day, maybe 5%, 10%, 20% of your compute is going to be the model itself, but the rest is going to be the rest of the applications that are there to help the model, to feed the model with information, and to help the model do its job.
Mark Murphy, Head of Software Research, JPMorgan: And so, one of the big hurdles that these LLM providers will have is dealing with the bias in the models, the hallucination in the models. They’re trying to deal with the drift in the models. And it has come to our attention that Datadog can help them with that. Can you help us understand, what are the mechanics there? What does that value proposition look like?
Olivier Pomel, CEO and Co-founder, Datadog: Yes, so we have a product that is LLM Observability. And the questions that that product answers are, first of all, the basic stuff: Is my model up? Is it working? Is it fast?
How much is it costing me? So that’s the very basics. And then beyond that, you can ask, is my model correct? Is my model safe? Is it leaking data?
Is it saying what it’s not supposed to say? Is it saying what I’m expecting it to say? And then the last step, which is even harder, and I think that part we’re still building, is: is my model doing what it’s supposed to do for the business? In other words, if I have end users, they interact with a bot, for example, or with a feature that involves some open-ended thinking. What are they doing after that?
Are they buying more? Are they staying longer? Basically, are they doing what I want them to do with the system? So we’re doing all that. I think that’s with the current iteration of those models, and we expect those models to evolve a lot.
We already see a move from our customers using these models in chatbots to customers using them as agents that are always running and don’t necessarily require a human to prompt them. So we’re seeing an evolution there already. But if you zoom out even further and look at what might happen in the future, if more and more of the application is not coded but is emergent and stochastic, a model that is somewhat unpredictable in some ways, we see that there’s a ton of value to be provided by observability, because the value goes from initially packaging and training a model to understanding what it’s actually doing every day, in a real situation with real users, and how it’s changing over time.
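As an illustration of the checklist Pomel describes (is the model up, fast, within budget, and not saying what it shouldn’t), here is a minimal, hypothetical sketch in plain Python. It is not Datadog’s LLM Observability SDK; the call_model stub, the blocklist, and the per-token price are assumptions invented for the example, and the harder layers he mentions, correctness and business outcome, would sit on top of records like this one.

```python
import time


def call_model(prompt: str) -> dict:
    """Stand-in for a real model client; returns a canned response for the example."""
    return {"text": f"echo: {prompt}", "input_tokens": len(prompt.split()), "output_tokens": 5}


BLOCKLIST = {"password", "ssn"}   # crude stand-in for a data-leak / safety check
COST_PER_1K_TOKENS = 0.002        # assumed pricing, illustrative only


def observed_call(prompt: str) -> dict:
    """Wrap one LLM call and record the basics: availability, latency, cost, safety flags."""
    record = {"ok": False, "latency_s": None, "cost_usd": None, "flags": []}
    start = time.monotonic()
    try:
        resp = call_model(prompt)
        record["ok"] = True                                        # is my model up and working?
        record["latency_s"] = round(time.monotonic() - start, 4)   # is it fast?
        tokens = resp["input_tokens"] + resp["output_tokens"]
        record["cost_usd"] = round(tokens / 1000 * COST_PER_1K_TOKENS, 6)  # what is it costing me?
        if any(term in resp["text"].lower() for term in BLOCKLIST):
            record["flags"].append("possible_data_leak")           # is it saying what it shouldn't?
    except Exception as exc:
        record["flags"].append(f"error: {exc}")                    # failures are part of the record too
    return record


print(observed_call("Summarize this support ticket for me"))
```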
Mark Murphy, Head of Software Research, JPMorgan: Okay, so that sounds like it would play into the AI tailwind that you have had, which you disclosed as coming from AI natives, I think, so it’s the big model builders, etcetera. One of the big questions is, when do we think enterprise adoption of AI is going to start kicking in? And how do we want to try to forecast it? Like, in other words, when do you think a big bank, a big retailer, a big pharmaceutical company is going to be done training a model and is going to actually be deploying it, and therefore driving revenue for Datadog?
Olivier Pomel, CEO and Co-founder, Datadog: I think the best way to think about it is to look at what the cloud natives, or rather the AI natives, are doing and see that as the future of what the rest of the market and the big enterprises are going to do. If you caricature, you can think of three steps in AI maturity. The first step is you test applications with third party models. The second step is you scale those applications with third party models. And then the last step is you keep scaling these applications, but now with some homegrown models.
And if you look at the AI native companies, many of them are between step two and step three now. So we see a lot of companies that started by building on top of third party models that are reaching some market fit, that are growing very fast, and then that are trying to or starting to augment these third party models with homegrown models and maybe even replace some of those models with homegrown models in the end. When you look at larger enterprises, we’re between step one and two right now. So they’re between testing, and in some cases they’re starting to scale some of those applications. That’s where we are.
But when you look at the incredible growth of our AI native cohort, we see that really as a sign of the future demand we’re going to see from those enterprise customers.
Mark Murphy, Head of Software Research, JPMorgan: So what’s happening with AI natives will inevitably trickle its way out into the broader landscape of enterprise?
Olivier Pomel, CEO and Co-founder, Datadog: Yes, and you know, it’s very similar. We’ve seen that movie before with cloud migration.
Mark Murphy, Head of Software Research, JPMorgan: For sure.
Olivier Pomel, CEO and Co-founder, Datadog: When we started the company, there was absolutely no interest from enterprises into the cloud. I remember the first time I pitched a bank you might know, I was going to
Mark Murphy, Head of Software Research, JPMorgan: say it sounds familiar.
Olivier Pomel, CEO and Co-founder, Datadog: But that tune changed pretty quickly. I think it came to be understood as not only viable for the large enterprise, but also a big competitive advantage and a true way to transform. I think with AI, everybody understood much faster that it was going to be a competitive advantage. I think there are still questions about the safety of it and how fast a transformation can happen. But to us, there’s no doubt that the larger enterprises are going to follow in the footsteps of the AI natives.
Mark Murphy, Head of Software Research, JPMorgan: Okay. So, AI, Olivier, is also impacting code writing itself very rapidly. And you may have seen the CEO of Anthropic, you know, recently said that in twelve to eighteen months, one hundred percent of all code could be written by AI. And I’m sure there’s a bit of hyperbole in this, as always. But can you speak to how that trend might play out for Datadog?
Because I think, in theory, more code being written more rapidly, more applications being deployed. There’s just more out there that needs monitoring.
Olivier Pomel, CEO and Co-founder, Datadog: Yeah, that’s right. I would say the opportunity is even bigger, because when you think of the whole continuum for delivering value in delivering applications, right now most of the time is still spent conceiving and coding the applications, and then after that, less time is spent bringing it to production and making sure it works right. I think as more and more code can be written faster, and without necessarily the intervention of humans, we have these situations where the humans have all these suggestions, all these lists of things that the machines have produced, and then they’re the ones who need to validate it and make sure it actually works and is secure and everything else. So we think we can do that. We think what becomes valuable, and the problem that is truly valuable in the end, is understanding how that code actually does what it’s supposed to be doing.
Is it helping the business? And also, is it safe? Is it running? How is it changing over time? What happens when all the components that interact with it change over time?
And how does it behave in production environment? So we think it’s a huge opportunity for us.
Mark Murphy, Head of Software Research, JPMorgan: Is AI helping Datadog itself? Is it helping you write code faster? Are there any other AI efficiencies? Is AI handling support tickets for you?
Olivier Pomel, CEO and Co-founder, Datadog: I’ll give you just a quick example on that, just to show the acceleration. When we first started adopting Copilot, for coding specifically, it took us more than a year to get the whole team to adopt Copilot. And the reason for that is that it was fairly helpful in a number of cases, but also fairly disappointing in a number of others.
And as a software company that builds a lot of low-level software, databases, optimization systems, and things that are, I would call, hard engineering problems, we have engineers who are quick to dismiss output that is okay but not great. Right? I mean, yes, it’s giving me something, but it’s not great, so I’m not gonna use it. Now, fast forward a year later, when we started adopting coding agents, it took us just a couple of months to have the whole team pretty much adopt the coding agents.
And the reason for that is that it’s that much better. And everybody, from the enthusiasts to the skeptics, everybody sees the value and starts adopting them much faster. And as part of that, we see more and more of the code that is being written by, or at least influenced by, AI, and we think that that progression is not going to stop.
Mark Murphy, Head of Software Research, JPMorgan: How about the progression of DeepSeek? When DeepSeek kind of dropped onto the landscape, which was back in January, we had hosted a large investor call. And we had a contact saying it’s gonna reduce the cost of inferencing by 90%. And I think one of the questions is, does that cause a flywheel? You know, if the cost of building an AI model comes down, then are you just gonna have a lot more AI models coming out into the marketplace?
Do you see any lasting impact from it?
Olivier Pomel, CEO and Co-founder, Datadog: I mean, we see much more enthusiasm for the models in general. I think there are two impacts. The first one is, yes, if the cost is down by 90%, it means you’re gonna do 10 times more of it. Some things were impractical before because the AI model was too expensive to run, and really, if you wanted it to have a good chance of being right, you needed to run it 100 times. Now you can do that.
Before you could not. So that’s one of the effects. The second effect we’ve seen is that it really was a wake-up call for many companies that had sort of started to believe that you needed $10 billion in investment and 200 researchers to build differentiated AI models. And it turns out that you don’t. It turns out that many companies in their domain can have an impact, can innovate, can build state of the art models.
And we’ve seen many, many more companies start investing and start building those models. I think as a result, we probably are looking at a future where it is less likely that we will see the AI innovation concentrated in one or two players, and that there will be a more robust ecosystem. There will still be leaders, there will still be dominating companies that will capture large parts of markets where massive investments can be brought to bear against a variety of use cases. But I think we’ll also see a lot more specialized vendors and local vendors that will be able to innovate there.
Mark Murphy, Head of Software Research, JPMorgan: So, Olivier, in the remaining several minutes that we have, I do want to ask you about, philosophically, how you view investments, especially into headcount. We publish these statistics every quarter where, you know, we try to look at hiring trends across the software landscape. Basically, for the last two or three years, it’s been very sluggish. After I got off the stage with Microsoft today, Microsoft announced a 3% layoff. And so it feels a little different, where you have growth companies that are very healthy and very strong, and they’re trimming out some of the headcount.
Datadog feels like more than any other company, feels like it’s investing to win. And you took headcount up 25% last year. What are you seeing differently with respect to this overall investment cadence?
Olivier Pomel, CEO and Co-founder, Datadog: I mean, we see that we’re early and there’s so much white space, whether that’s on the product side or on the demand side and the market coverage, that we keep building the engineering teams and we keep building the sales capacity on the go-to-market side. We’re constrained by capacity in both situations. I would say the only limit that we’ve given ourselves is we invest around 30% of our top line in engineering, and we keep that going. And if we get more efficiency with AI, we’ll probably keep investing that because we’ll be able to produce more and we’ll be less constrained on capacity on that side. I think that’s the equation.
Mark Murphy, Head of Software Research, JPMorgan: How do you think about, in the last minute that we’ve got here, you know, we’ve always felt that the best software companies out there, the ones that you want to align with, are going to try to consolidate point products. They’re going to try to put it on a single platform. But the key thing to look for is they’re going to do it organically, right? They’re going to be builders rather than acquirers. And you know, we see that very clearly.
In one of our discussions, someone said there’s growth in every Datadog contract that I see. Culturally, how do you think about preserving this level of organic innovation and kind of avoiding the pitfall of becoming the big, slow-moving company?
Olivier Pomel, CEO and Co-founder, Datadog: Again, we’re building a lot. We have a very humble attitude to customer conversations, in terms of whenever we talk to customers, we’re here to listen to them telling us what the problems are and what works and what doesn’t. We’re not here to tell them how the world should work. And that gives us a lot of insights on what it is we need to build. And the other thing we do is, as a company, we’re fairly disciplined about understanding what’s valuable and what’s not.
And we do that by having fairly transparent pricing in terms of what we charge for, what we don’t. We don’t do heavy bundling. We keep adding features to the same stuff. Whether that works or not, we try to have very clean signals to keep ourselves honest so we know what’s valuable and what’s not. And we want to keep that going as long as we can.
Mark Murphy, Head of Software Research, JPMorgan: Great to see the discipline. Thank you for keeping the Internet running for all of us. And thank you so much for joining us again, Olivier.
Olivier Pomel, CEO and Co-founder, Datadog: Alright. Thank you. I appreciate it.