Elastic at Rosenblatt’s AI Summit: Embracing AI Transformation

Published 12/06/2025, 00:02

On Wednesday, 11 June 2025, Elastic NV (NYSE:ESTC) took center stage at Rosenblatt’s 5th Annual Technology Summit - The Age of AI 2025. The discussion, led by Steve Kearns, General Manager of Search, offered a comprehensive look at Elastic’s strategic focus on AI, highlighting both the opportunities and challenges as the company transitions into an AI-driven entity.

Key Takeaways

  • Elastic is evolving into a Search AI company, emphasizing AI-driven solutions across industries.
  • The company is integrating with major AI platforms and providers like NVIDIA, Anthropic, and OpenAI.
  • Elastic is investing in security, enhancing SIEM and AI features to improve threat detection.
  • The acquisition of Keep Alerting aims to boost workflow automation and security capabilities.
  • Elastic aspires to be the most open platform for AI application development, focusing on agentic AI workflows.

Operational Updates

  • Elastic is positioning itself as a key player in AI by offering high-performance storage, search, and analysis of both structured and unstructured data.
  • The company markets itself in three primary ways: developer tools, observability products, and modern security solutions.
  • Integration with leading language model providers and platforms, such as AWS Bedrock, is underway.
  • Elastic’s acquisition of Keep Alerting enhances its capabilities in workflow automation, particularly in the security domain.
  • The partnership with NVIDIA’s Enterprise AI Factory makes Elastic the default vector database in that reference architecture.

Future Outlook

  • Elastic aims to be the most open platform for building AI-powered applications, with a strong focus on agentic AI.
  • The company envisions automating complex business processes by providing the right context for AI models.
  • Elastic plans to leverage its search and relevance technologies to become a critical context provider for agentic AI workflows.

Q&A Highlights

  • Customer Adoption of GenAI Applications: Elastic reports a mix of customers in production and experimental phases, depending on use case complexity.
  • Elastic’s Value Proposition in Security: The company offers a comprehensive SIEM and threat hunting platform with prebuilt security rules and AI-powered attack discovery.
  • Implementation of Modern SIEM: Elastic provides easy integration with existing systems, supported by robust migration tools.
  • Use of Observability Data: Elastic’s platform allows the use of observability data for generating business insights.

For a deeper dive into Elastic’s strategic directions and insights from the conference, refer to the full transcript below.

Full transcript - Rosenblatt’s 5th Annual Technology Summit - The Age of AI 2025:

Blair Abernethy, Software Analyst, Rosenblatt: Good afternoon, everyone. It’s Blair Abernethy here, software analyst with Rosenblatt. Back with another infrastructure software company, Elastic. With us is Steve Kearns. Steve is the general manager of search at Elastic, joining us today.

Thanks, Steve, for joining us. I think you might be on mute there. Yeah.

Steve Kearns, General Manager of Search, Elastic: Yeah. Alright. Happy to be here. Thanks for having me.

Blair Abernethy, Software Analyst, Rosenblatt: I just wanted to you know, for the audience, if we could just set the context a little bit, for those who might not be, that familiar with Elastic. Just give us a brief overview of the business as it stands today and and, you know, some of the problems or or challenges that that you help customers, to address? And then and then give us a us a little bit on, your background as well, which would be helpful.

Steve Kearns, General Manager of Search, Elastic: Yeah. Yeah. That’s great. Well, I’m Steve Kearns. I am, as you said, the GM of the search business here at Elastic, and I’ve been with Elastic for almost eleven years.

So I’ve watched the company evolve from the very early days and helped to turn us into the company we are today. Most of my background has been in building out the core platforms of Elastic, the core underlying technologies, and then bringing those to market through the different ways that we go to market. So I’ll touch on that a little bit. But almost my entire career has been in the search world, or the information retrieval or information extraction parts of the world. So this is really a natural home for me in a lot of ways.

So at Elastic, you know, as a company, we call ourselves the Search AI company, and that describes the ethos that we bring to building our products and also how we compete, in our own way and from our own perspective, in the different markets that we participate in. And when I think about it, the idea behind search is really about how you provide a high-performance way to store, search, and analyze vast amounts of data, whether that’s structured data or unstructured data. And when you have that data at scale, the data keeps growing, and it’s growing faster. And we believe that if we can help companies make better use of that data, those companies will be more successful, and we can be more successful as well.

And if you think about making use of this kind of data, right, whether that’s log messages, whether that’s, you know, ecommerce information, whether that’s transaction histories and trade information with these fast-moving, real-time data flows, the concepts of search really matter. How do you find the most important transaction? How do you identify fraud within that? How do you bring the right product to somebody who’s looking to purchase something on your ecommerce site? These different areas all share that need to find the right information.

Sometimes that’s a document or a product, or sometimes that’s an anomalous transaction in a flow of transactions. But that idea of working across massive amounts of data is really important. The other part of being that Search AI company is the AI piece. And if you think about what it means to have an AI-ready or AI-powered platform that you’re going to be building on or using, one of the most important parts is: can you get back the right piece of information quickly, with the right kind of filtering and power and security and all of the other things that it takes to build a real production application? How do you do that? And so, you know, when you think about it, this is really what Elastic does.

The heart of everything that we do is a search engine we call Elasticsearch. That’s the core of all of the offerings that we have. It’s our single data store, where we put the vast majority of our investment as a differentiated core technology that we build everything on. And we go to market in three ways. One is we take Elasticsearch and a number of surrounding technologies to market for developers,

to give you the best set of tools to build compelling generative AI or traditional applications. There shouldn’t be a better platform to build a modern application on than Elasticsearch, especially if you have ambitions to make that a more engaging, more interactive kind of an application, or you wanna build GenAI engagements or conversational AI or agentic AI on top. This is the toolkit for developers to build with, and then we use it ourselves. We use that toolkit to build an observability product.

So logs, metrics, traces. It’s a combination of unstructured log messages and structured metrics and traces data, and how do you look across those to identify potential issues happening within your operational environment. Like, if your internal applications are down, your business doesn’t run. And so the speed, the time that it takes to understand that, the efficiency that you can bring with the right platform and the right technology, really makes a difference in the observability space. And that’s especially true when you think about how the environments that people are running their applications in are more complicated with the advent of microservices running on Kubernetes.

Like, the number of parts of a typical application is exploding, and the ability to look across those and to see, you know, the relationships, to see what’s actually going on underneath, is important. And that complexity matters in security too. So the third way that we go to market is as a security product, a modern-day SIEM-plus, above and beyond what you might have traditionally thought of as a SIEM. But it’s the same challenge there. The attack surface for a typical company has never been more complex than it is now, and it’s only increasing.

And, again, how do you look across those different signals, the security events, the audit logs, the network traffic data that’s happening? How do you look across all of these signals to find the real attacks? Each individual piece might not look scary, but when you combine them, wow, you’ve now walked the MITRE ATT&CK framework and identified a really complicated, sophisticated intruder who’s walking around your internal systems and putting your business at risk. And so when we look at it in that way, this idea of taking this powerful core set of search technology, both as a core technology and as a set of principles, we bring that to market for developers, for observability and DevOps engineers, and for security professionals, for CSOs and security teams.

Blair Abernethy, Software Analyst, Rosenblatt: Okay. That’s very helpful. And if we look at your traditional search market, let’s talk about the period before the advent, the explosion, of AI that’s happened in the last number of years, particularly in the last three years. So what would customers use it for? And I guess it’s also important for people to understand that Elasticsearch is an open source product. Right?

Right. Yeah. But, you know, what would be the typical use cases for Elasticsearch before AI came along?

Steve Kearns, General Manager of Search, Elastic: Yeah. Yeah. It’s a great question. I sort of bucket this into a few different areas, but you could call it search-powered applications, or search-based applications, or just applications. And if you were to look at the use cases for building on Elasticsearch, they’re super wide.

From things you might traditionally think of as search, like ecommerce, or even document management systems, legal systems, all of these different kinds of places where I’m very clearly running a search and expecting 10 blue links back that I can go and investigate; classic enterprise search; even many of the applications that you’ll use on your computer or on your phone today, they are these kinds of search-powered applications. There have been a lot of applications that are search powered, though, that you might not think of as search necessarily. So the use cases can span all the way to things like, one of the early and really fun examples is transaction tracking. If you are a bank and you want to understand where all of the time is going when somebody creates a transaction, how does that spread across my systems? How do I understand the flow of information?

How do I visualize who and where and what part of my network is dealing with the most of that? That’s a custom application that you can build on top of a data store or search engine like Elastic. Even things like, if I were to think about logistics, Elastic is really good at geospatial data. And so if you were a logistics company trying to figure out where are all my trucks, what packages are on the trucks, when will they arrive, how do I plan for that, you need the ability to build an application that can take that into account. And so the range of applications that you can build on top of a search engine is pretty wide.

And the reasons that people will pick us: you know, just the raw capability, like, can I do search relevance, can I get good results back, is an obvious one. But there’s also an element of search where every field, every piece of information you put into Elastic, needs to be searchable.

That’s the default in our system, and that’s very different from a traditional relational database, where you would say, I put a bunch of data in, and then I separately define which fields I want to be able to search on or query against or filter on. And then I have to decide one by one, and you’re making that decision late. Then I change the database schema and I wait a while for that to apply and work with the DBA. With a system like Elasticsearch, all the data you put in is searchable. So if you as the developer say, ah, I now wanna filter on this other field I didn’t think of before, or my user has asked me to extend the application to look at it in this new way,

for us, that’s trivial if you’ve built the application on top of a system like Elasticsearch. And so the range of use cases is super wide. The traditional use cases for us are, like: we are a NoSQL database. We are a search engine. We are also a vector database.

We are also a geospatial engine, and we’re also a columnar store for doing your rich analytics very efficiently. So the use cases that people have traditionally used Elastic for have been super wide, and that’s one of the challenges sometimes as we talk about, what are the use cases, what are the scenarios. But, anyway, that gets it started. I’m happy to go into a few more specifics if you’re interested.

Yeah.
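To make the "everything you put in is searchable by default" point concrete, here is a minimal sketch using the official Elasticsearch Python client. The index, fields, and values are hypothetical, and it assumes a local 8.x cluster with security disabled for brevity.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Index a shipment document without declaring a schema up front. Dynamic
# mapping makes every field queryable as soon as it is ingested.
es.index(
    index="shipments",
    id="1",
    document={
        "truck_id": "TRK-42",
        "status": "in_transit",
        "destination": "Toronto",
        "eta_hours": 6,
    },
    refresh=True,  # make the document searchable immediately (fine for a demo)
)

# Later, filter on a field nobody planned for; no schema migration or DBA
# round-trip is needed. Dynamic mapping gave "status" a .keyword sub-field.
resp = es.search(
    index="shipments",
    query={"bool": {"filter": [{"term": {"status.keyword": "in_transit"}}]}},
)
print(resp["hits"]["total"]["value"])
```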

Blair Abernethy, Software Analyst, Rosenblatt: That’s really helpful. I mean, it does show that, prior to the advent of this AI wave that started a couple of years back, you know, there are many use cases that are ongoing for you guys. Let’s talk about AI in terms of what Elastic’s overall strategy is when it comes to AI, and how you’re approaching it, I guess. And you can define it however you want, but to me, it’s such a horizontal technology.

It applies to many things your customers are doing, but it also actually applies to your own business in a lot of different ways.

Steve Kearns, General Manager of Search, Elastic: Oh, absolutely. Yeah. This is one of the fun parts, I think, of being at Elastic: we’re both building these core building blocks for developers, and then we’re using those building blocks to build our observability and our security solutions. And so we actually get to see it from multiple perspectives in terms of what it takes to build the right infrastructure, the right components, and then the right applications using AI. And so for us, maybe I’ll start on the developer platform side. Like, what are the tools?

How do we think about the building blocks that get built? Which of those do we build, and which of those do we partner for? And then I’ll talk a little bit about how we use them in our observability and security solutions. But for the developer platform side of things, you know, we think about vector search and semantic search.

In a lot of cases, this idea is really about getting better results back. If you think about when I used to run a search, I would get 10 blue links back, and as a user, I know what my job is. My job is to look at the links, click the ones I think might have the answer, then go read it and decide if I got the right answer. And if not, keep going. When you start to imagine a system that’s more conversational, now I get just one answer back.

And so the answer had better be right, or I lose trust in that application very quickly. And so the importance of getting the right information back, getting that into the context of the model, and letting the model use that to answer properly is significantly higher. And so when we think about our role in this AI ecosystem, job number one is making sure that you can get the right answers, giving you the tools to get great relevance out of any dataset that you might be working with. And to do that, we think about this in a couple of layers of things that we can build. On the one hand, we need to be the best vector database, because one of the key techniques for getting better search relevance, better results back, is using semantic search: using not just the words themselves that are in the documents and the query, but the meaning behind them.

And so when you hear us and others talk about vector search or semantic search, we’re really trying to say, how do I match not just the terms, but the ideas, the concepts, and the meaning? And to do that, it takes a bunch of layers of technology, and we try to provide as much of that out of the box as we can and give you the flexibility to bring the rest or customize that further on your own. To do vector search, you need to take the text of the query and of the documents and generate the embeddings, or create the vectors, from the meaning of those words, and you need a language model for that. We provide a first-party set of models ourselves. We provide an embedding model that we call ELSER.

It’s a competitive, lightweight, efficient model that, again, is on by default in our cloud. So you can walk up to the system, and in seconds you can be doing semantic search on top of data that you’ve brought to Elasticsearch. It should be that easy. And when you’re using us, especially in cloud, it is that easy.

And so that’s part of the reason that we provide these first-party retrieval models. We have the same thing on what’s called the reranking model. And so it’s just another part of that: how do I get the best results? I start by bringing back the best candidates, then I have an option to rerank them to say, I’ve got 15 candidates, but which one is the best?

Which two are the best? How should I order those coming back? And so we provide these first-party embedding models and reranking models directly from Elastic, available by default in our cloud, and it’s a really nice experience for users.
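As a rough illustration of the flow Kearns describes, the sketch below indexes a document into a semantic_text field and queries it by meaning. It assumes a recent Elasticsearch deployment (8.15 or later) where semantic_text can fall back to the managed ELSER endpoint; on earlier versions you would create an inference endpoint first and reference its inference_id in the mapping. Index and field names are hypothetical.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Map one field as semantic_text: ingest generates sparse ELSER embeddings
# automatically, so there is no separate embedding pipeline to manage.
es.indices.create(
    index="support-articles",
    mappings={"properties": {"body": {"type": "semantic_text"}}},
)

es.index(
    index="support-articles",
    document={"body": "Reset a forgotten password from the account settings page."},
    refresh=True,
)

# Query by meaning rather than exact terms: "can't log in" still matches the
# password-reset article even though the words barely overlap.
resp = es.search(
    index="support-articles",
    query={"semantic": {"field": "body", "query": "I can't log in to my account"}},
)
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_score"])
```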

Blair Abernethy, Software Analyst, Rosenblatt: Are those open source, or are those on your cloud subscription?

Steve Kearns, General Manager of Search, Elastic: Yeah. They are available to run anywhere our products are available, but they are not free to use. These are part of our paid capabilities. And so the ELSER model and the rerank model, just like you can consume them very easily in our cloud, available and on by default, if you are a self-managed customer running and operating the software yourself, no problem. You can very easily, with a single API call, get those installed locally and be up and running very quickly.

Blair Abernethy, Software Analyst, Rosenblatt: Okay. Excellent. Excellent.

Steve Kearns, General Manager of Search, Elastic: The part of this that’s important, I think, for where we play is that we wanna provide that simplified experience. So one way we do that is with these first-party, very competitive models. They’re just available for you, very easy to use. We also wanna make it very easy to test and to work with other systems. And so we have this ambition of being the most open platform to build on.

And so if you choose, you say, hey, your model’s nice, but I’m already familiar with this other embedding model or reranking model. Great. We have integrations with all of the major providers, both on the LLM side of things, so Anthropic, OpenAI, all of the rest, as well as all the places that you would wanna run models. And so if you are on AWS, great.

We’ve got a nice integration with Bedrock. Whether you wanna run embedding models or large language models and connect them to the system, it’s very easy to do that. And then we provide other layers that make building this stuff easier. So there’s something we call the AI Playground: this is a simplified experience to say, hey, I’ve got data in Elastic.

What would it be like if I chatted with it? And what if I tuned and tweaked the relevance models or my query differently? Do I get better results? Do I get worse results? And so we provide this as a right-out-of-the-box capability.

In a minute, you can be chatting with the data inside of Elasticsearch, and you can say, hey, I’d like to see how, I don’t know, Claude 4 Sonnet does versus, you know, the current o3 or o3 Pro. And so you can very easily plug in these different back-end models and say, how does that affect my chat experience? Now let me change my queries.

Let me change my embedding model. And giving you that quick feedback cycle, which as a developer is so important, lets you decide, is this gonna work? And if it’s gonna work, I’ll invest the next amount of time to go to the next step. And so these tools, this simplified getting-started experience for developers, is huge. It really simplifies that onboarding, that testing process, that iteration process that it takes to get good answers and good answer quality. And so I think that’s sort of the answer for us on the developer side.

We wanna provide as much out-of-the-box ease of use as we can, while still letting you open the box and really configure it to the nth degree, because getting the right answer is what determines a successful GenAI-powered application versus one that’s not successful. There’s a lot that goes into that, but if you’re not getting the right information to the model, you’re never gonna get the right answers out, and your users will lose trust very quickly.
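The Playground itself is a product feature, but the loop it lets you experiment with is easy to sketch: retrieve candidate passages from Elasticsearch, hand them to a language model as context, and swap the backing model to compare answer quality. The sketch below uses plain lexical retrieval for brevity (the semantic query from the earlier example could be dropped in instead); the index, field, and model names are placeholders, and the OpenAI client is just one example of a pluggable backend.

```python
from elasticsearch import Elasticsearch
from openai import OpenAI

es = Elasticsearch("http://localhost:9200")
llm = OpenAI()  # reads OPENAI_API_KEY from the environment


def answer(question: str, model: str = "gpt-4o-mini") -> str:
    # 1. Retrieval: pull the top passages for the question from the knowledge base.
    hits = es.search(
        index="kb-articles",
        size=3,
        query={"match": {"content": question}},
    )["hits"]["hits"]
    context = "\n\n".join(h["_source"]["content"] for h in hits)

    # 2. Generation: ask the chosen backend to answer from that context only.
    resp = llm.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content


# Compare backends by re-running the same question against different model ids.
print(answer("How do I reset my password?", model="gpt-4o-mini"))
```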

Blair Abernethy, Software Analyst, Rosenblatt: Are your customers, Steve, would you say they’re still largely in the experimental stage in building their GenAI applications, or, you know, are they moving them to production? Are you seeing more movement to production? And let’s leave agentic AI out of the discussion for a minute, because we’ll come back to that.

Steve Kearns, General Manager of Search, Elastic: It’s a great question. I think every company is on its own maturity journey, is maybe the right way to say that. So we have certain customers, and I know Ash has talked about this in some of our earnings calls, like a leading automotive company that has a number of these generative AI powered applications internally in production already, and they’re building out more of those as they go forward. And it’s great there because they’ve identified a pattern, like a system for: what are the technology layers that we’re gonna use, how do we bring it to production, how do we evaluate its success.

Other companies are much earlier in that cycle, and so we see people along that spectrum. In fact, we’ve got a number of folks, and, again, another one that Ash had mentioned was a sporting goods retailer in North America. They are already using us to power the ecommerce portion of their search with traditional lexical search and a lot of advanced configuration around that. And they’re saying, we wanna bring semantic search or vector search to improve those results a little bit further, because we can see the benefits of that in our early testing. How do we scale it?

How do we bring it out? They’re starting that journey with us, which is very exciting. And this is a common pattern that we’ll see: people will find one use case that’s working for them and invest in that because it’s the biggest, or they’ll find one specific use case that’s manageable and high value where they can prove out a technology architecture, bring that use case to production, and then that becomes a standard that they’re gonna build on and expand from. And so there’s still a lot of learning that happens at each one of the customers that we see, but we’re seeing people progress through that journey, and we’re seeing them find that success. And so we are seeing these applications reach production, but it’s still a long journey for a lot of folks who are building this kind of thing for the first time.

There are a lot of layers, and you have to think about evaluating accuracy differently. It’s not about whether the right answer showed up in the top 10 that you can skim; it’s, is the quality of the answer that the LLM gave correct enough of the time? And it sounds like a subtle difference, but, actually, the mental model for how you test that, how you evaluate it, is a little bit different.

Blair Abernethy, Software Analyst, Rosenblatt: Yeah. You know, interesting. If we shift gears here and talk a little bit more about your solution side of things, security is an important area. And I just saw some of your guys there; you had a big booth at RSA

Steve Kearns, General Manager of Search, Elastic: Yeah.

Blair Abernethy, Software Analyst, Rosenblatt: Last month. And, you know, I think it would be helpful for you to explain Elastic’s value proposition in security. Like, what are you guys doing? You’re not, you know, doing identity management, and you’re not a firewall company or anything like that, but you have a lot of customers using your security solutions now. It would be really helpful to frame that up for us.

Steve Kearns, General Manager of Search, Elastic: Yeah. And maybe I’ll split it into a couple pieces. I’ll tie back some of the AI portions maybe in a moment. But when you think about the security space as a whole, it is very difficult to protect the entire footprint of an organization. There are all kinds of different data involved in that.

So there’s security data that looks a lot like logs. Right? Events from all the different security-related systems in your environment. There are the actual logs from all of the authentication that happens across every application in the environment. And so, you know, a big part of how we got started in security was people saying, hey,

I have a ton of data, and I’m not able to process that in my legacy or traditional SIEM, and I need a way to make sure that I know what’s in this data, that I can use it to help protect my company, protect my data. And so that idea of us as a threat hunting platform is where we got started. And very quickly, we realized there’s so much more that we can do. And so today, we provide an entire SIEM, end to end, with out-of-the-box security rules that are detecting not a signature of what an individual attack looks like, but the things that an attack would do. They would move from system to system.

You would see failed logins. You would see these other actions happening across the system. And so we provide a SIEM detection engine that’s prebuilt with both traditional machine learning, like what are the unusual patterns that we’re seeing from a given user, from a given host, from a given service, and what it’s accessing, and then things like, how would I look across those? And then saying, I’ve got one anomaly here, one anomaly there.

Are they related? If they are, then that’s now a pretty serious alert that we wanna go and surface. And so having this prebuilt security content makes a big difference, and over the years we’ve expanded that capability significantly. In fact, in security, we go all the way to endpoint security. So the actual endpoint collection of security-related events off a laptop or a desktop or a server, and also protecting those endpoints.

And so we have really a fully featured security suite, if you will, in terms of the capabilities that we can provide. And the place where we start to see our advantages multiply is as the amount of data grows, the number of security rules you have to set up grows. As the security rules grow, the number of alerts that get generated grows. And then suddenly, the security teams are overwhelmed with the number of alerts that they have.

Blair Abernethy, Software Analyst, Rosenblatt: Impossible to really know what’s going on. Right?

Steve Kearns, General Manager of Search, Elastic: And you’re limited then by the amount of wall-clock time a human has to scan these things. This is where AI comes in. We have a number of AI-powered features, but the one I’ll mention, because it’s so obvious in some senses how powerful it is: imagine if you can ask the AI to take a look at your alert history, all of these alerts that a human doesn’t have time to go through, and have the AI look at the runbooks that you have for your organization.

Look at your network map that describes what systems do what, you know, where the important ones are, the sensitive ones are, and then use that to actually scan through those thousands of alerts that happen every single day and surface the alert, or the set of alerts, or the string of alerts that matter. We call this feature Attack Discovery, and, using AI built on top of the power of Elasticsearch as this AI-enabled data store, it allows us to surface these attack vectors that you wouldn’t really be able to see looking at any one alert. It automates the work of a lot of people. And, again, this isn’t about taking the work away. It’s about focusing the time of the analysts that you have on the most significant things.

You still have to look through the rest. You still have to be really thoughtful about how you manage those alerts. But if we can surface for you an active, ongoing incident before you would have been able to find it, we’ve now saved that organization significant monetary or reputational risk, because they’re able to get ahead of it more quickly.
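The detection engine Kearns describes is configured in the product, but the behavioral idea, correlating small signals into one serious alert, can be sketched with Elasticsearch's EQL sequence queries. The example below looks for repeated failed logins followed by a success for the same user on the same host; it assumes ECS-shaped authentication events (event.category, event.outcome, @timestamp) in a hypothetical logs-auth-* index, and it is an illustration of the pattern, not one of Elastic's prebuilt rules.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Two failures then a success for the same user on the same host within
# fifteen minutes: individually boring events, suspicious in sequence.
eql = """
sequence by user.name, host.name with maxspan=15m
  [ authentication where event.outcome == "failure" ]
  [ authentication where event.outcome == "failure" ]
  [ authentication where event.outcome == "success" ]
"""

resp = es.eql.search(index="logs-auth-*", query=eql)

for seq in resp["hits"].get("sequences", []):
    first = seq["events"][0]["_source"]
    print(f"possible credential abuse: {first['user']['name']} on {first['host']['name']}")
```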

Blair Abernethy, Software Analyst, Rosenblatt: If a customer is coming to you new and they’re looking for a modern SIEM, are the connectors all built to the CrowdStrikes and Palo Altos and all the major guys of the world? You know, does it take them a year and a half to get this up and running, or how long does it take to get in there and configure it? And I guess, in many cases, the customer might already have, you know, let’s call it a legacy SIEM, or they’ve got something they’ve been using in the past. What’s the barrier to get them to adopt you?

Steve Kearns, General Manager of Search, Elastic: It’s a great question. For, like, a net-new environment, we have quite a lot of integrations already on the shelf with all the major systems that you would expect. If you’ve got an existing endpoint provider, it’s very likely we have integrations for their alerts and their telemetry to be able to come right in. You’ve got, you know, common firewalls, common other ecosystems, common applications, very straightforward to do that, and it’s getting easier and easier to bring in net-new data sources. So if you have a custom application with your own flavor of audit logs, for example, we have an AI-powered ingestion approach now, sort of an AI assistant for ingesting data, and this makes it much easier to bring in a custom data source.

And every company has some custom data sources that are specific just to them, and we’re trying to simplify that process of bringing the data in. If you do have an existing environment, though, like an existing install base or existing SIEM product, it’s likely that you have a whole set of security rules, a whole set of processes and workflows around that. And we’ve been working over the last few years to reduce the number of capability gaps. So we have this new query language called ES|QL, incredibly powerful, supports joining across multiple data sources, really a powerful set of querying capabilities that close a lot of the functional areas that were hard in Elastic historically.

Blair Abernethy, Software Analyst, Rosenblatt: So that’s Elastic’s flavor of SQL?

Steve Kearns, General Manager of Search, Elastic: Exactly. Yeah. The Elasticsearch Query Language, ES|QL. It’s a very powerful pipeline query language. It makes these really complex queries something that you can build piece by piece in a very natural way that matches the way a lot of analysts think. And then we’ve got migration tooling.

And so we actually have the ability to take alerts and more, written in, like, Splunk’s SPL, for example, and translate them into the Elasticsearch Query Language and into our alert and rule system. And so these are, you know, individual capabilities. You can ask for it one-off. You can walk right up to our AI assistant inside the application and say, hey, here’s how I used to run this query.

What does this look like in ES|QL? And we’re continuing to build on that kind of capability going forward.
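For a feel of the pipeline style he is describing, here is a small ES|QL query run through the Python client, building up filter, aggregation, and ranking stages the way an analyst would. The index pattern, fields, and threshold are hypothetical, and it assumes Elasticsearch 8.11 or later, where the ES|QL endpoint is available.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Which users and source IPs have more than 20 failed logins? Each pipe stage
# refines the previous one, which is what makes the query easy to build up
# piece by piece.
query = """
FROM logs-auth-*
| WHERE event.outcome == "failure"
| STATS failures = COUNT(*) BY user.name, source.ip
| WHERE failures > 20
| SORT failures DESC
| LIMIT 10
"""

resp = es.esql.query(query=query)

# Results arrive as column metadata plus rows of values.
cols = [c["name"] for c in resp["columns"]]
for row in resp["values"]:
    print(dict(zip(cols, row)))
```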

Blair Abernethy, Software Analyst, Rosenblatt: Okay. That’s excellent. Actually, just back in May, I think you guys acquired a company called Keep Alerting. Are you familiar with that one?

Steve Kearns, General Manager of Search, Elastic: I am. Yeah.

Blair Abernethy, Software Analyst, Rosenblatt: Yeah. We’re

Steve Kearns, General Manager of Search, Elastic: very excited.

Blair Abernethy, Software Analyst, Rosenblatt: Can you just tell us what that is? It’s open-source AIOps. What is that, and, you know, why did you feel this was a good target for you guys?

Steve Kearns, General Manager of Search, Elastic: Yeah. We love these kinds of technology tuck-ins that are really additive to the capabilities that we have, that pull forward some of the things we were in the process of building anyway with a more mature set of technologies. And so with Keep Alerting, while it’s focused on the AIOps side of the world, what they’ve really built is, think of it as a workflow engine: I’m investigating either a security incident or an operational issue; what are the steps that I wanna take to investigate this further, or to remediate this issue? Again, it’s the same concept for security and observability.

The data and the terminology that we might use are different. In security, you call it SOAR. In observability, you would call it AIOps. But the platform that they built is a powerful, extensible platform that really does plug right into the way we think about taking action on top of alerts or other triggers within the system. So we think about this as giving us a workflow automation engine that will power a lot of our SOAR capabilities in the future, the security orchestration, automation, and response actions, and do the same thing for us in the observability space.

And we’ve been building pieces of this, and we have some of these capabilities in the platform. But the ability to bring in a more mature, more complete, end-to-end kind of capability and extend the platform in that way is wonderful. And I’ll tie it all the way back. One of the benefits of building on this single core platform is that, on the security side, the SOAR use cases are critical for us. This is an area where we get lots of interest from our customers, lots of excitement around Keep and having those capabilities.

But thinking about the search business, I’m really interested, because when you look at an agentic AI workflow, what

Blair Abernethy, Software Analyst, Rosenblatt: is that

Steve Kearns, General Manager of Search, Elastic: really, but a workflow with some of those steps powered by an AI making decisions? It’s not just a true/false or, you know, a structured conditional. It’s: ask the model, run this tool, track the results. If it gives you the right answer, keep going, or bail out. And this idea of having a workflow engine as a core primitive across our platform gives us the opportunity to use that in a couple of different ways.

So, as with any acquisition, it takes time to integrate the technology and to integrate the team. But we’re really excited, very excited, about the team joining us. They’ve been a wonderful addition already.

Blair Abernethy, Software Analyst, Rosenblatt: That’s interesting. So you’re building it in; you will build it right into the core of the platform. And then, I guess, if we start to think about it, agentic AI is all about taking action. Right? It’s evaluating my environment, making a decision, and then taking action on that decision. Well, if you have a workflow engine that can be tied into that, that just helps to amplify that effect.

Right?

Steve Kearns, General Manager of Search, Elastic: That’s right. And it simplifies it. I mean, for us, when we think about some of the agentic ecosystems as well, here also we wanna be the most open platform, and so we’re not gonna force you to use only a tool or only a workflow engine that we provide. We have great integrations and a great partnership with the folks at LangChain and LlamaIndex. We’re working with a lot of the other agentic AI frameworks as well to say, we wanna make sure that if you start there, Elastic is the vector database, the context provider of choice, in that ecosystem.

But if you start with Elastic, we wanna give you the easiest path to success. And so having a workflow engine in the company, in the platform, just gives us another set of tools, another layer, another widening of the platform for developers, to say we can give you a better set of tools, a richer set of tools. You can use ours or bring your own. We’re very open in that way. But if we can make it simpler, we wanna do that.

Blair Abernethy, Software Analyst, Rosenblatt: Let’s take the conversation up a level in terms of agentic AI. So, you know, where does Elastic see itself fitting in this, I’m gonna call it an emerging, rapidly emerging, ecosystem where enterprises wanna build these, you know, the term is agentic, but effectively it’s a situationally aware piece of software that has agency to make decisions based upon things that happen, without human intervention. And so how do you look at where Elastic will fit in that world? And do you go at it just with partners? Just trying to understand how Elastic carves out its space in an agentic world.

Steve Kearns, General Manager of Search, Elastic: Yeah. It’s a great question. I think long term, we’re big believers in the agentic future, and it’s gonna take a bunch of steps and iteration to get there, both in terms of the models and the maturity of the other parts of the ecosystem. But we wanna make sure that as this happens, we play a big role in it.

And it actually goes back to what I mentioned before, this evolution of: when I ask a question and I get blue links, my job is to look at them as the human. When I get one answer back, just a question and answer, like a very simple GenAI kind of an application, if I don’t have the right answer to give the model, the model is not gonna be able to get the correct answer to give back to me. And so the quality of answers, even in the simple cases today, matters in a big way. And now it matters that much more when you take the human out of the loop, because at least if I’m the human, I can say, this doesn’t seem right. If I ask about the holiday policy or something like that and it gives me an answer that talks about Greece, well, I’m based in the US, so it doesn’t matter what the Greece answer is.

I want the US answer. I’ll know that’s wrong. But if you have that in a fully automated workflow, now you start to wonder, is it going to be able to make the right choices? Is it going to have access to the right kind of information? And so I think what we’ll see is this kind of cycle where there’s a lot of excitement.

As we get to production, the long tail, the ability to get the answer right all of the time, really does matter in these kinds of applications. It’s easy to do a proof of concept. It’s easy to find a use case that works perfectly, you know, as a demo. But when you put it in front of the users, or you put it in front of your teams to automate hundreds or thousands or millions of these workflows that happen within companies all the time, the accuracy really matters a lot. And so, when you imagine what it takes to get good accuracy, what do you need?

Well, you need smart models, and there are a lot of great people working on that. We’re not trying to build foundation models, so we’re gonna rely on others, the OpenAIs, the Anthropics, and so on, to build these great reasoning models that are able to participate in the process in a richer way. But they still need to have the right context. And, again, when we look at what it means to provide the right context, the data inside of a business, well, maybe we’ll start somewhere else. If all you’re asking is a world-knowledge question, the models should be able to just answer those questions, because they have great general world knowledge.

But if you’re doing anything interesting, it involves your own business’s information, the stuff that only you have access to, that those external models aren’t trained on. And the most interesting business data is live. It’s changing on a regular basis. It’s changing continually. It’s not a train-it-once-and-run-it-for-a-year or train-it-once-and-run-it-for-a-month situation.

It’s like, no. No. We got a new customer today. I need to know what to tell them, and I need to know their information. So this ability to bring context to those models, we believe will continue to be a critical part.

Whether we still call it RAG a year from now, two years from now, five years from now, I don’t know. But the idea of bringing the information that the model needs, the correct information, to the model, that’s not going to go away. This is going to be important for

Blair Abernethy, Software Analyst, Rosenblatt: a very, very long time, because corporations aren’t gonna want all that data out in the wild anyway. Right? So there’s this proprietariness, this IP nature of my enterprise data.

Steve Kearns, General Manager of Search, Elastic: That’s right. And also the security aspect. One of the things that sometimes gets overlooked amid the excitement around these use cases is, if you and I are inside a company and we both ask the agent the same question, we probably shouldn’t get the same answer if we’re in different departments or if we have different customers. And so if I’m at a financial services institution, if I’m an analyst or something like that supporting high-net-worth individuals, when I ask a question, I don’t want to be able to see information about somebody who’s not my customer. That’s scary.

Right? And so the ability for these systems to have per-user, per-document, per-field-within-a-document level security is absolutely critical. And that’s actually already something that we’re seeing when people are making choices on which vector database to choose. Even in today’s GenAI applications, they’re saying, hey, I need the security to really work, and they’re coming back to us and saying, ah, yes,

this is a thing that Elastic, as a mature data platform, has had for years. This is the kind of system that I’m gonna need to build the rest of this application on top of. And so we’ll start to see, I think, that get more prominence as these use cases get beyond the proof of concept, especially on the agentic side, where you’re unleashing the model. You had better be giving it the right answers, because otherwise, who knows what it will do next. You know?
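To ground the per-user, per-document, per-field point, here is a hedged sketch using Elasticsearch's role API: a role whose document-level security query restricts an advisor to her own clients, and whose field-level security grants only a few fields. The role, index, field names, and credentials are all hypothetical; in practice the role would be mapped to real users through your authentication realm.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200", basic_auth=("elastic", "changeme"))

es.security.put_role(
    name="advisor-alice",
    indices=[
        {
            "names": ["client-docs"],
            "privileges": ["read"],
            # Document-level security: only documents tagged with her advisor id.
            "query": {"term": {"advisor_id": "alice"}},
            # Field-level security: sensitive fields stay hidden even on her docs.
            "field_security": {"grant": ["client_name", "portfolio_summary", "notes"]},
        }
    ],
)

# Any search run under this role, directly or through a RAG/agent layer that
# authenticates as the end user, is automatically filtered to this slice.
```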

Blair Abernethy, Software Analyst, Rosenblatt: So if I have been using Elastic for, let’s just say, observability in my organization, and I’ve indexed all this data, or I’ve pointed it at my corporate data stores and indexed it all, you’re then able to, you have the metadata around that data. Right? That’s right. So now you can control what you surface in a model, or what you surface in, let’s call it, an intelligent application that comes after the data. Right?

Steve Kearns, General Manager of Search, Elastic: That’s right. And I think having that ability to control what data you’re bringing forward makes a huge difference. And one of the fun things, and you sort of highlighted it there naturally, one of the fun things about building and using Elastic, even for something like observability, is that even observability data often has business aspects to it. And so, you know, if I go back to one of my favorite historic use cases, there was a telecoms company in South America that was using us just for observing their cell phone towers.

And one of the fun things that they did is they said, well, we started to use the disconnect data of our telemetry to see where people are typically disconnecting in a geospatial area, and then let’s build a cell tower there, because now we know where people are with our devices. We know where they’re

Blair Abernethy, Software Analyst, Rosenblatt: Where they’re losing signal. Yeah.

Steve Kearns, General Manager of Search, Elastic: Incredibly valuable. And they started by looking at this as an observability problem, just to say how are the towers performing, and they realized that there’s another way to apply that kind of data. And so I think there’s a lot of that kind of innovation that we’re going to continue to see, building on top of, you know, the other kinds of information that people are collecting across their business.
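The disconnect-hotspot idea translates naturally into a geo aggregation: bucket disconnect events onto a grid and rank the busiest cells as candidate tower sites. The sketch below is illustrative only; the index and field names are hypothetical, and it assumes the location field is mapped as a geo_point.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

resp = es.search(
    index="device-telemetry",
    size=0,  # only the aggregation matters, not the individual events
    query={"term": {"event_type": "disconnect"}},
    aggs={
        "hotspots": {
            "geohash_grid": {"field": "location", "precision": 6},  # roughly km-scale cells
        }
    },
)

# The most populated cells are the strongest candidates for new coverage.
for bucket in resp["aggregations"]["hotspots"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```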

Blair Abernethy, Software Analyst, Rosenblatt: Well, that’s something I’ve talked to some of the other observability players about; we were tracking Dynatrace earlier today on this. But, you know, instead of just observing an IT system to see how it’s functioning, actual business operational observability, I guess, are you seeing customers looking at using your platform to go into more business operations?

Steve Kearns, General Manager of Search, Elastic: Yeah. It’s a good use case. It’s a tough one, because every business is shaped differently. And so it doesn’t necessarily look like a standard, repeatable, here-is-exactly-how-you-connect-your-purchasing-department-to-your-accounts-receivable kind of thing. There are certain patterns, I think, that can be replicated there.

But what we’re seeing is people being creative with the data, because Elastic as a platform gives you that power and that flexibility. So I think that’s almost like a value add. You know, we do the core things, start by, you know, observing my infrastructure, and then what else can I use this for? How do I learn more about my customers and their behavior and feed that back into my sales and my promotional machine? And so I think that’s a natural extension, in some senses, of how you would build on Elastic.

It’s not necessarily a separate vertical or something like that that we track in that same way, but the flexibility of us as that data platform makes it much easier, I think, than a system that was much more closed, that was saying, here’s your APM UI, here’s your metrics UI, and so on. This flexibility means you can look across those sources when and where it matters.

Blair Abernethy, Software Analyst, Rosenblatt: We’re coming up on our time. We’ve got four or five minutes here. Steve, I wanna ask you a couple more things. You announced an integration with NVIDIA’s Enterprise AI Factory last month. Can you just tell us a little bit about what you’re doing there?

Steve Kearns, General Manager of Search, Elastic: Yeah. Yeah. It’s a great partnership. We’re so happy with the partnership with NVIDIA. Obviously, in the world of inference, the world of AI, NVIDIA is the place where you wanna be running a lot of those workloads.

And one of the things that NVIDIA has started to do is they’ve said, how do we help to box up, package up, all of the things that you need: from the hardware side, through vendors like Dell and others, through the NVIDIA chips themselves, all the way down to the software layer that makes use of that to build compelling applications. And, you know, we’re really happy to be the vector database inside that reference architecture that they’re providing. And so that kind of partnership, working with a company like NVIDIA to be the default vector database in that reference architecture, is huge, because it’s validation of the way that we partner. It’s validation of the way that we support these kinds of applications in a very rich and powerful way, such that the leading company in the AI universe wants to pull us into that reference architecture. And so I think it’s just another example of where our technology being present and visible with all of our partners is really valuable to us, and I think it will help simplify that process of building for a lot of our customers.

Blair Abernethy, Software Analyst, Rosenblatt: Excellent. Excellent. I’m gonna ask you, in the last two minutes here, to put on your long-term vision hat. Where does agentic AI go for Elastic? Like, what can this thing look like?

Like in a few years?

Steve Kearns, General Manager of Search, Elastic: Yeah. It’s a good question. I think there are a couple of ways to think about this. One of the promises, the hopes and the dreams, of these agentic workflows is that we can start to automate a lot of the high-human-toil processes across businesses, and there are a lot of those inside every single business. And so as the number of those workflows starts to be automated, starts to work, and the technology starts to catch up to really be able to deliver on that, I think it’s a very exciting future, because it means that we can have people focus on higher-value things than a lot of the repetitive motions.

And it means that these things now can be a base that we build from. But, ultimately, those workflows will only succeed, I think, if they have the right context, if they have the right access to the right information in the right way across a business. And so I think our role in that is pretty clear: being that context provider for those workflows. And the heritage of our technology, going all the way back to the early days of search and relevance, to modern search and relevance on top of LLMs, on top of these language models with embedding models and semantic search, gives us the advantage to be a better platform to build these things on top of. And so that’s where I think you’ll see this evolve, and where I think we can play an exciting part in that.

Blair Abernethy, Software Analyst, Rosenblatt: Yeah. No. It’s fascinating and rapidly evolving. So there’s no question about it. Listen, Steve, this has been fantastic.

Thank you very much for taking the time with us this afternoon, and we’ll be watching Elastic for additional really cool innovations over the next couple of years. So thank you.

