Oracle at Oracle AI World 2025 Keynote: Pioneering AI Integration

Published 15/10/2025, 03:02
© Reuters.

On Tuesday, 14 October 2025, Oracle Corporation (NYSE:ORCL) unveiled its strategic vision at the Oracle AI World 2025 Keynote. The conference highlighted Oracle’s commitment to integrating AI across its product stack, showcasing its latest AI advancements while acknowledging both the promise of innovation and the challenges of implementation.

Key Takeaways

  • Oracle launched the AI-native Oracle AI Database 26AI, introducing AI vectors for semantic content representation.
  • The Oracle AI Data Platform, offering end-to-end data analytics and AI capabilities, is now generally available.
  • Oracle’s partners have committed over $1.5 billion towards AI training and development.
  • A focus on open standards and customer data security was emphasized throughout the conference.
  • Oracle aims to lead in AI by re-architecting its products to provide a seamless AI experience across its ecosystem.

Oracle AI Database 26AI

  • Oracle introduced the AI-native Oracle AI Database 26AI, featuring AI vectors for similarity searches.
  • AI agents are integrated for multi-step workflows, enhancing database management and security.
  • Specialized AI chatbot assistance is available for database diagnostics and knowledge management.

AI App Development and Trust Generative Development (GenDev)

  • Trust Generative Development for Enterprise (GenDev) was introduced to build trustworthy AI applications.
  • Utilizes solution-centric languages like SQL and Open Application Specification Language.
  • Implements a sophisticated rules engine to enforce data privacy and restrict LLM inputs.

Oracle AI Lakehouse

  • The Oracle Autonomous AI Lakehouse combines Apache Iceberg’s independence with Oracle AI Database power.
  • AI vector search and rich analytics SQL enhance performance and secure data access.
  • Existing data warehouses are automatically converted into AI lakehouses for seamless integration.

Oracle AI Data Platform

  • The platform offers a comprehensive suite for data analytics and AI, featuring an open lakehouse and unified catalog.
  • An integrated developer workbench supports data integration, engineering, and science.
  • The Agent Hub provides an agentic user experience, facilitating the building of AI agents.

In conclusion, Oracle’s advancements in AI aim to transform enterprise capabilities. For a detailed understanding, refer to the full transcript below.

Full transcript - Oracle AI World 2025 Keynote:

Juan Loaiza, Database Team, Oracle: All right, thanks everyone for joining today. Let’s talk about the AI for Data revolution. It is a revolution. As we know, AI is changing everything. It’s the next big thing in data management, without a doubt. It’s not a revolution any of us can afford to ignore. The quote I like best about AI is this one: AI won’t replace humans, but humans with AI will replace humans without AI. That applies to enterprises also. The key to thriving in this age of AI is to transform yourself and your enterprise into an AI leader and use AI to deliver breakthrough insights, innovations, and productivity before your competitors do. Oracle AI for Data is all about helping you do that. The great news is AI for Data is both easy to learn and easy to use. You’ll see that as I describe things today.

Now, before we jump into AI, I want to give a quick overview of the Oracle AI for Data strategy: what we’re doing, why we’re doing it, what’s different about it, and most importantly, how it’s going to benefit you and your enterprises. I’m going to talk both about what we have today and about a few things that we will have very soon. It’s important to understand that AI is a huge focus for us in the database team. We have over 100 different projects going on in AI. We’re using AI throughout everything. It’s the most rapidly changing field ever, and we’re going to keep putting out AI innovations on a quarterly basis. This is going to be changing in real time. Before I launch into AI, I want to discuss briefly our overall strategy for Oracle Database.

It’s pretty straightforward, which is we design our products for customers that think and act strategically and that understand that gluing things together causes a lot of problems. It creates a lot of costs, it creates a lot of complexity. We also really believe in open standards and we know that you do also. Why? Because open standards are key to creating the competitive ecosystem that benefits everyone. It benefits vendors and customers. We’ve been focusing for many years on a converged architecture and what this means. It’s an open standards based architecture that puts together best of breed support for all the different data types of workloads and makes them all work together seamlessly. For AI, we’re doing the exact same thing. We’re putting it all together with everything else and making it work seamlessly.

That’s the overall strategy, and there are going to be three parts to my talk tonight. First, I’m going to talk about how we’re architecting AI and data together. We believe that architecting AI and data together is going to enable simpler and better results, and I’m going to show that. That’s the first part of my talk. Then I’m going to talk about architecting AI, data, and app dev together to enable faster and better innovations everywhere, innovations that you can trust. We’ll get back to this whole trust topic. Finally, I’m going to talk about architecting AI, data, and open standards together. That’s a big deal for us, and we’re going to be making some announcements there. Again, that’s all about enabling AI insights for all your data because it’s open everywhere. Those are the three big parts.

Let’s start talking about how we architect AI and data together for our four key products. I’m going to start with AI Database, AI App Dev, AI Lakehouse, and AI Data Platform. First up is Oracle AI Database. This is a new thing. Today we launched Oracle AI Database 26AI. You notice two things here. One is it’s Oracle AI Database. Now it’s not Oracle Database, it’s Oracle AI Database. We have a new release number, 26AI. It’s available now, you can download it today, you can start using it today. The idea of Oracle AI Database is that we’ve architected AI and data together to create a next generation AI native database. This is something that we’ve been working intensely on. The two key things that we focus on are the two key AI breakthroughs. One is LLMs, which we all know about.

The other is AI vectors, a new data type in databases, which I’m going to talk about. In addition to that, there are dozens of other AI improvements. This new release, 26AI, is a long-term support release that’s fully compatible, and it replaces Oracle 23AI. After today, there’s no more Oracle 23AI; everything going forward is Oracle 26AI. It’s very easy to adopt: you basically apply the October patch, this month’s patch, to your database and you’re on Oracle AI Database 26AI. It adds AI functionality on top of the previous Oracle 23AI. It’s not changing any of the architecture; it’s just adding AI on top. If you’re running Oracle Database 19c, it also gains AI capabilities, because a lot of the AI tools I talk about today run on top of Oracle Database 19c. Let’s dive into it.

First, I’m going to start with AI vectors. Some of you know this already, but it’s new to a lot of people. That is the key new AI data type in databases. Oracle Database now has a new data type called AI vector. What is an AI vector? An AI vector is a sequence of numbers that represent the semantic content or meaning of a complex object. It could be a document, an image, a video, or a pattern of data. This is a brand new data type enabled by AI. Oracle generates these AI vectors from objects using the AI model of your choice. You hand Oracle a document or an image, we run the AI model, we generate an AI vector for the object. That’s what an AI vector is, but what’s it good for? Oracle AI Database can both store vectors, which is easy.

More importantly, it can quickly find similar objects using AI vectors. Until very recently, the only thing databases could do with things like documents and images was find exact matches: the exact same pixels, the exact same words. If you already know the pixels, there’s no point in searching for something you already have. What you really want for these complex objects is the ability to do a similarity search, to find things that are similar. In that picture you see there of the couches, if you hand it the picture of the gray couch on the left, you don’t want the exact same picture back. You want to find the couch that’s most similar to it if you’re trying to buy something similar.

What you can see there is some couches are more similar, some less similar, and the more similar ones have similar numbers in their vectors. That’s the key attribute of vectors. The more similar the numbers in the vector, the more similar the objects. What we can do in Oracle is find similar objects by just looking at the numbers. We have very fast vector AI indexing so that in milliseconds we can find the closest similarity to any document, image, video, anything like that. That’s the key idea behind AI vectors in the database. That’s the new technology that it brings to databases. Now, if we want to answer business questions with AI, what we need to do is find relevant business data in the database and then use the LLM to answer the questions. To find relevant business data, you need to do two things.

Number one is you need to do traditional database search and you need to do this new AI vector search, and you have to combine them together. That’s the key, and I’m going to be talking about that. I’m going to show you how easy it is to do that now with Oracle AI Database because we have AI architected into the database. Here’s an example where we’ve architected the vectors and the AI to work together. The example here is an employee asks a question. In this case, the question is, does my dental plan cover braces for my 19 year old? That’s a question written in natural language. What we’re going to do is we’re going to take that question and vectorize it, convert it into a vector, and then we’re going to search our documents in the database to find the closest document that matches that question.

Then we’re going to verify, using normal database search, that that benefits document applies to that employee. That’s a simple example of how you combine the new AI vector search to actually find documents that answer a question and then make sure that the document is relevant to that particular user. It’s the combination of the two. Now, how hard is it to do this? It’s very simple. Here’s a simple SQL statement that does exactly that. I’ll read you what it says: select the documents from the benefits documents table where that benefits document applies to this employee. That last line, the order by vector distance, says find the closest matching document. That’s all it takes to answer that question in SQL in Oracle AI Database.
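The combined search just described, a relational filter plus vector-distance ordering, can be sketched in miniature. This is an illustrative toy, not Oracle code: the table, plan names, three-dimensional vectors, and Euclidean metric are all invented for the example.

```python
import math

def euclidean(a, b):
    # Distance between two vectors: smaller means more similar.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy "benefits_docs" table: (doc_id, applies_to_plan, embedding vector)
benefits_docs = [
    ("dental_basic", "PLAN_A", [0.9, 0.1, 0.0]),
    ("dental_ortho", "PLAN_B", [0.8, 0.6, 0.1]),
    ("vision",       "PLAN_B", [0.0, 0.1, 0.9]),
]

def closest_doc(question_vec, employee_plan):
    # WHERE clause in miniature: only documents that apply to this employee.
    candidates = [d for d in benefits_docs if d[1] == employee_plan]
    # ORDER BY vector distance, FETCH FIRST ROW: the nearest document wins.
    return min(candidates, key=lambda d: euclidean(d[2], question_vec))[0]

print(closest_doc([0.7, 0.7, 0.0], "PLAN_B"))  # dental_ortho
```

The point is the shape of the query: the relational predicate narrows the candidates, and the vector distance picks the semantically closest one among them.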

You see that with one line, one extra line of that SQL, we’re able to answer these complex questions that were never possible before. An English language question, finding a document and returning it. Because both the documents and the business data are in the same database, it’s completely consistent, everything is current. We return that answer in milliseconds because of our vector indexing technology. That means it’s good for both OLTP and analytics because something that returns in milliseconds is very good for OLTP. We can search across vectors and all types of data in Oracle. Text, data spatial, JSON, XML, you name it, we’ve made it work together with these new AI vectors. I just talked about finding the relevant document, but what happened to the answer to the question? You could go read the document.

It’s even better if we take the results of searching for that document and pass it to the LLM along with the question, then the LLM can generate an exact answer that you see there in purple at the bottom. This is called retrieval augmented generation or RAG. In Oracle AI Database, we’ve automated that whole process so you can either invoke it through APIs or you can run the whole thing as a single SQL statement. That’s built in, makes it super simple because it’s architected into the Oracle database. Let’s hear what a few customers have to say about using Oracle AI Vector Search.
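The retrieve-then-generate flow just described can be sketched as follows. The `call_llm` function is a hypothetical stand-in for any LLM API, and the documents and vectors are invented; the point is the RAG shape: retrieve the best-matching document, then pass it to the LLM along with the question.

```python
import math

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy document store: name -> embedding vector.
docs = {
    "dental_plan.txt": [0.9, 0.2],
    "vision_plan.txt": [0.1, 0.9],
}

def call_llm(prompt):
    # Hypothetical stand-in for a real LLM call.
    return f"[answer grounded in: {prompt.splitlines()[0]}]"

def rag_answer(question, question_vec):
    # Retrieval: nearest document by vector distance.
    best = min(docs, key=lambda name: distance(docs[name], question_vec))
    # Generation: the retrieved context plus the question go to the LLM.
    prompt = f"context: {best}\nquestion: {question}"
    return call_llm(prompt)

print(rag_answer("Does my plan cover braces?", [0.8, 0.3]))
```

Because the answer is grounded in a retrieved document rather than the model’s memory alone, the response stays tied to the data actually in the store.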

Unidentified speaker: The number one reason we wanted to use Vector on Oracle Database is because it keeps it next to all the other data. It’s converged, and that allows us to add the same security policies and not have to have a whole separate system.

Unidentified speaker: What we find today is that a lot of our customers’ data are distributed across multiple different technologies. Oracle Autonomous Database helps them to automate their entire deployment and also bring AI to the data much quicker.

Unidentified speaker: We have chosen Oracle Autonomous Database because of the ability to use all the different functionality to access data. That means vector search, select AI, geospatial analysis, blockchain table, and so on and so forth.

Unidentified speaker: What we found when we compared Oracle Database Vector search to a lot of the other options out there is that just the sheer scalability seemed larger. We know it scaled higher than limits that I’ve heard about elsewhere. It’s faster, it’s a great solution, it’s cutting edge, it’s scalable, it has the security you need, it just plugs into so much.

Juan Loaiza, Database Team, Oracle: All right, that’s great. Thanks. That’s Oracle AI vector search.

Juan Loaiza, Database Team, Oracle: Now we have an even newer technology that goes beyond the RAG we just talked about, called AI agents. With RAG, you hand a question and some data to the LLM, and the LLM answers the question. With AI agents, the agents can take multi-step workflows: they can plan, they can try multiple approaches to answering the question, they can use tools, they can take actions. That’s the new generation of AI technologies, which we’ve also built into the Oracle database. AI agents are a huge deal across the industry, and we’re integrating with all the different AI agent frameworks that you see listed there, everything from OCI to GCP, Azure, and AWS, plus lots of other things like LangGraph. In addition to that, we’re also architecting AI agents directly into the Oracle database.

We provide frameworks to easily build, deploy, and manage in-database AI agents. These in-database AI agents will be, number one, more secure because your data never leaves the database, number two, faster, and number three, much easier to use. Three examples that we have there: Select AI Agent, Private Agent Factory, and SQLcl MCP Server. There’s kind of three different approaches to AI agents in the Oracle AI Database. Now I’m going to have Chris Rice show us how LLMs can use the Oracle SQLcl MCP Server to easily answer questions about data in an Oracle database.

Unidentified speaker: MCP allows an LLM to answer natural language questions about your database in a very similar way to a SQL expert, but much faster. The LLM first automatically queries the database metadata to find relevant tables and columns. Then the LLM writes and executes SQL via MCP to provide an answer to the user. It’s like having a SQL expert that is ready to assist you at any time. Oracle SQL is now accessible via MCP to enable powerful and simple agentic AI. SQL or SQLcl commands can be invoked by any MCP client such as Cline, Claude, or Copilot. The SQLcl MCP server even works for Oracle Database 19c. Note that both the client and the LLM in use were identified. These two pieces of information are captured and logged into the database audit trail.

This allows for the DBA or administrator to keep track of which LLMs are in use and which clients are in use. Now I’ve further refined it and asked it to summarize the employees by departments. Now I’m going to ask it what are the most expensive departments, which salaries are the highest. Here it had to do a multi-table join to come up with the department names and salaries. Now I’m going to ask it if the location has an impact, which again it writes the query and the query is a multi-table join to come up with the correct answer. Finally, I’m going to ask it to produce the report in a markdown format so that I can save and share that report with anyone that I need to share this information with.
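The metadata-then-query loop the demo walks through can be sketched with sqlite3 standing in for Oracle, and a canned SQL string standing in for the query an LLM would write. The tool names and schema are invented; the point is the two MCP tool calls: one to inspect metadata, one to execute SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE departments (dept_id INTEGER, name TEXT);
    CREATE TABLE employees (emp_id INTEGER, dept_id INTEGER, salary REAL);
    INSERT INTO departments VALUES (1, 'Sales'), (2, 'Engineering');
    INSERT INTO employees VALUES (10, 1, 90000), (11, 2, 150000), (12, 2, 140000);
""")

def tool_list_metadata():
    # Step 1: the LLM asks for table definitions before writing any SQL.
    return conn.execute(
        "SELECT name, sql FROM sqlite_master WHERE type = 'table'"
    ).fetchall()

def tool_run_sql(sql):
    # Step 2: execute the SQL the LLM wrote (a real server would also audit it).
    return conn.execute(sql).fetchall()

metadata = tool_list_metadata()
# An LLM would derive this multi-table join from the metadata; here it is canned.
rows = tool_run_sql("""
    SELECT d.name, SUM(e.salary) AS total
    FROM employees e JOIN departments d ON e.dept_id = d.dept_id
    GROUP BY d.name ORDER BY total DESC
""")
print(rows)  # most expensive department first
```

The "most expensive departments" question from the demo reduces to exactly this kind of join-and-aggregate, which the LLM can only write correctly after the metadata step.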

Juan Loaiza, Database Team, Oracle: All right, so that’s a quick demo of the Oracle MCP Agent. It’s basically a SQL expert on demand. You can download the Oracle MCP Agent today and use it against any database, 19c or any version of Oracle Database that works, and you can download it for free. Use it today. It’s a great way to get started with AI. Actually, I hit the button ahead of time. Back it up, please. There we go. We talked about how AI can answer questions about data in your database, but it’s not limited to data in your database. It can also combine its own knowledge with things that it finds in the public domain, like by doing a web search, to answer questions. Here’s an example. You ask how my ABC product is doing against the market.

The AI agents will look through your database and find how your ABC product is doing. They will look out on the web, find out how competing products are doing, and give you a very detailed answer. It’s amazing that this is even possible, and it’s actually super simple to do. I talked about generating answers using AI and how we built that into the AI database. There’s a lot more that we’ve done with AI. One thing we’ve done is architect AI to help with database management pain points. Just like we have human specialists in different fields, we are also providing specialized AI chatbot assistance to help with different kinds of database management. We’re releasing an AI management assistant, a security assistant, a knowledge assistant, and a diagnostic assistant.

Each of these uses different data and different strategies to get answers to help you manage your databases. It’s another giant area that we’re investing in. In addition to that, we’re also transforming data development using AI. In fact, we’re transforming every step in the data development workflow. I’m going to go quickly through each of these steps to give you an idea of the benefits of this AI. The first step in data development is creating a schema: you’ve got to go create your tables. Now, you can easily generate a schema from a natural language description with the SQL Developer AI Assistant. You just tell it what you want, and it generates the schema.

The next step is a new step: you need to declare the intent and semantics of your schema, of your tables and columns, so that LLMs can understand them. Because if the LLM can’t understand your schema, if all it has to go on is cryptic column names or cryptic comments, then it’s not going to do a good job of doing anything. We’ve added something called annotations to the database. You can basically annotate your tables, your columns, your procedures, and tell the AI what the semantics are. What does this data do? What does it mean? When do you use it? Annotating schemas is a really important thing to do. Those of us who have been around databases for decades know that SQL tuning was the thing we used to do.

The new version of that in the AI world is annotating your data, describing the data to the LLM. Again, Chris Rice is going to demonstrate a tool that we developed that you can use to annotate your database data and schema.

Unidentified speaker: We’re going to annotate the schema in this database, and to get started all we have to do is select the AI enrichment in the tree. It will install the necessary PL/SQL procedures to begin annotation. These procedures have been backported, so they will work across Database 19c and 26AI. We’re going to switch to a database that has already begun annotating and creating AI enrichment. As we can see, the AI enrichment has already begun. There is a group called Human Resources, and there are tables awaiting enrichment. If we open the enrichment dashboard, we’ll see that the schema itself has an enrichment added to say that this is a combination of multiple things from personnel to accounting to sales to logistics. We also get a dashboard of how annotated this schema is. There will also be future functionality to import and export this for portability.

If we look at the group, which is the Human Resources group, we’ll see that the PRS_ tables actually make up the Human Resources schema. We can see here that these columns have a simple description to make the LLM more aware of what the actual values are. The annotations are freeform, so things like aliases or business terms can also be added. These annotations are a powerful and standard way for developers to declare the intent and semantics of the data, so the LLM is more aware of the data structures and the intent of the data. Here we’re making a new group for shipping. These are all our log_ tables, where log_ stands for logistics. By adding an annotation, the LLM won’t confuse log_ with server logs; instead it’ll know it’s for shipping and logistics.

In addition, of course, the Oracle AI Database also automatically provides high-level data usage and information to assist the LLMs.
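The payoff of annotations is easiest to see in the prompt the LLM receives. Here is an illustrative sketch, with annotation storage reduced to a plain dict and all table and column names invented: the same schema rendered with and without its declared semantics.

```python
# Toy schema with cryptic names, like the log_ example from the demo.
schema = {"log_shipments": ["log_id", "dest_cd", "wt_kg"]}

annotations = {
    "log_shipments": "Logistics shipments (log_ = logistics, not server logs)",
    "dest_cd": "Destination country code (ISO 3166-1 alpha-2)",
    "wt_kg": "Shipment weight in kilograms",
}

def schema_prompt(schema, annotations=None):
    # Render the schema as text an LLM would see before writing SQL.
    lines = []
    for table, cols in schema.items():
        note = annotations.get(table, "") if annotations else ""
        lines.append(f"TABLE {table}  -- {note}" if note else f"TABLE {table}")
        for col in cols:
            note = annotations.get(col, "") if annotations else ""
            lines.append(f"  {col}  -- {note}" if note else f"  {col}")
    return "\n".join(lines)

print(schema_prompt(schema))              # cryptic names only
print(schema_prompt(schema, annotations)) # names plus declared intent
```

With only the first rendering, the LLM has to guess what `dest_cd` or `log_` means; with the second, the intent is declared alongside the structure.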

Juan Loaiza, Database Team, Oracle: All right, great. That was creating a schema. Now we’ve annotated the schema; we’ve told the AI what this schema is all about. The next thing is to load data. Now you can easily load data with AI-assisted creation of ETL pipelines. You tell it what pipeline you want, and it’ll create the syntax you need to create that pipeline. Next step, you have to do some testing. We’ve also added AI assistance for generating synthetic test data that closely matches the production data. After that, you’re ready to go. Now we’re ready to use the data. We have a feature called Select AI that lets you access the data using natural language. In fact, it works with a select statement. The select statement, instead of having SQL, has a natural language expression. You can see the example on the right; it says select AI.

That’s a select statement you can hand to the database from any tool: How many employees have not submitted their benefits plan choice? From that natural language question, it’s going to go to the database, figure that out, and return the answer to you. Then you can ask follow-up questions; you can say show counts by department, and it remembers where it was and can add to that question. You can run this from any tool that accepts SQL. This is also really useful for generating SQL statements as a first draft for developers. It does much more than this, even. We’re now querying the data, but there are a lot of different models and a lot of different parameters to these models.

The other thing that’s important to do is to test and optimize the embedding models, the LLMs, and the parameters for your data and your use case. We’ve provided a tool for that also, called Oracle AI Optimizer and Toolkit. That’s the data development workflow. The other important thing to understand is that all this AI technology is engineered and architected into the database. What that means is all of Oracle’s mission-critical capabilities just work seamlessly. Disaster recovery works, transactions work, security, analytics, parallel SQL, everything works. Why? Because it’s all architected together by us. You can just use it, and you don’t have to worry about any of these things. This is technology that we’ve been maturing for decades. It’s very mature technology that you can run in mission-critical databases. Okay, so that was a very brief overview of Oracle AI Database.

As I mentioned, Oracle AI Database has dozens of additional AI capabilities, way more than I have time to talk about today, for example, Exadata AI Smart Scan, GoldenGate distributed AI, and dozens more. That’s the first section of the talk, AI Database. Huge advancements in the AI database are really changing the way databases work. Now let’s move on to AI app dev and talk about what’s going on there. This is what we want the future of AI app dev to look like: you say what you want an app to do, and the LLM generates the app for you. It’s pretty simple. We’ve all seen videos of this happening. It can do this. It can generate apps in seconds with thousands of lines of code, which is orders of magnitude faster than we can do now. What’s the catch?

The catch is these are enterprise apps that we’re talking about, and enterprises need to be able to trust those apps. It generates thousands of lines of code. How do you know you can trust that? How do you know it’s secure, correct, dependable? You can tell the app, hey, make sure this is correct, make sure this is dependable. You can’t trust that either. You can try to put guardrails around it, but that’s kind of trying to fix things after the fact. What we’re focused on at Oracle is designing trust into the core of the architecture so that it just works and you can really trust it. We’ve developed technologies and best practices that maximize both innovation and trust by architecting the whole thing, AI, data, and app dev, together. We call these technologies Trust Generative Development for Enterprise, or GenDev for short.

I’m going to give you a quick walkthrough of what we’re doing in this space. The first thing to know about GenDev is that it’s focused on high-level, solution-centric languages like SQL. SQL is a solution-centric language. Why does that matter? Because enterprises can’t trust thousands of lines of code that no human understands and nobody can evolve, and the code is never static. If you think about it, it’s a lot easier to understand 30 lines of SQL than the thousands of lines of code that do the equivalent of those 30 lines of SQL. That’s why these solution-centric languages are really important, and that’s what we’re focused on in our GenDev methodology: we generate solutions using solution-centric languages like SQL and the open application specification language, which I’m going to talk about later, not so much traditional code. That’s the first step, understandability.

Understandability is key to trusting the generated solution, but by itself is not enough. There are other risks that we have to deal with, things like data-level risks and application-level risks. We have to resolve those risks before we can trust the application. We’re going to start by talking about the data-level risks. There are three big data-level risks that you can have even if you’re using SQL. One is the generated SQL can break data correctness. Why? Because it might violate data consistency or business rules for the data. The second is data evolvability. If the SQL depends on the underlying format of the data, then you’re not going to be able to evolve the underlying format. We have to take care of that. The third and most important is data privacy. We don’t want to allow a user to view another user’s data.

That can’t happen under any circumstances. I’m going to talk about these three things and the approach that we’re taking. This is really important. Generating this trust with AI is super important. How does GenDev address data correctness and evolvability? We do it by combining SQL with these new trusted data APIs to access data. The trusted data APIs use this new technology that I’ve talked about here before called JSON Relational Duality. What that does is it provides a JSON object interface on top of SQL that allows each app to read and write exactly the data it wants. In that little screen you see there, that’s a purchase order. The API would take all the data in the purchase order, everything, and either read it or write it from the database, exactly the data that that particular application wants.

The keys here are that these APIs ensure ACID consistency for all the data that the API accesses. The entire purchase order has ACID consistency. It doesn’t use locks, because if you use locks, you can abuse locks and lock up the whole database. Another key thing it does is allow you to validate business rules on the full business objects. For example, on a full PO, you can define the rules around that and it can validate that on every change, no matter what. If you change it through SQL, you change one row, one column, we can validate the entire business object to make sure it’s valid. The third key is that that data API doesn’t expose the underlying schema. That enables both app independence. You can write different apps without them interfering with each other, and schema evolution. You can evolve the schema under the app.
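The whole-object validation idea just described can be sketched as follows. The purchase-order fields and the business rule are invented for illustration; the point is that however a change arrives, a whole document write or a single-field update, the rule is checked against the complete business object.

```python
# Toy purchase order, modeled as the JSON document a data API would expose.
po = {
    "po_number": 1001,
    "lines": [
        {"item": "desk", "qty": 2, "price": 300.0},
        {"item": "chair", "qty": 4, "price": 120.0},
    ],
    "total": 1080.0,
}

def validate_po(po):
    # Business rule on the full object: stored total must equal the line sum.
    computed = sum(l["qty"] * l["price"] for l in po["lines"])
    if abs(computed - po["total"]) > 1e-9:
        raise ValueError(f"total {po['total']} != line sum {computed}")
    return True

assert validate_po(po)  # the initial document is consistent

# A change to a single field still triggers whole-object validation:
po["lines"][0]["qty"] = 3
try:
    validate_po(po)
except ValueError as e:
    print("rejected:", e)
```

Because the rule is defined on the business object rather than on individual rows, even a one-column change cannot leave the object in an invalid state.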

These technologies are really key to architecting trust into the underlying database so that you don’t have to worry about AI messing it up. The next risk, and probably the most important one, is data privacy. This is a huge risk in the age of AI. Why? Because you can’t expose other users’ private data. You can’t expose people’s medical records, their financial records, or their buying records. You can actually go to jail for doing that. The hard part is that the privacy rules are very complex and specific to the roles of individuals in each company and each industry. A simple example: if you have an employee record, different people in the company can see different aspects of it. The payroll department can see part of the employee record.

The human resource department can see a different part. Managers can see parts of it. There are very complex rules about who can see what, where, when, and how. The database has never understood these rules. This has been the problem. Today these data privacy rules are implemented in applications and enforced by custom application code. This creates two big AI risks. One is, when you’re using AI to generate new apps, how do you get it to enforce the rules? You can tell it, here are the rules. Make sure when you generate the apps, you enforce the rules, but you can’t be sure that it’s going to do it. The second and even bigger risk is that to really gain from AI, you want AI to be able to directly access the data. The trouble with that is, of course, it bypasses these application privacy rules.

That MCP example that we just saw was a good example of that. The AI was going directly against the database and accessing data. How do you know it’s not going to reveal private data? These are two giant risks that have to be resolved in order to make AI useful in a real enterprise. The only way we can fully trust data privacy enforcement is to implement it down at the source, in the database itself. You can’t trust the application, because it can be bypassed, and you can generate applications that don’t follow the rules. What we’ve done now is build into the database a very sophisticated rules engine that can specify exactly which data each end user in a company can access. The database enforces these rules.
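Enforcing rules at the data layer rather than in app code can be sketched like this. The roles, columns, and rules are invented for illustration; the point is that every result passes through a per-role rule before any row or column leaves the "database", so no application (or AI-generated SQL) can bypass it.

```python
# Toy employee table with both public and sensitive columns.
employees = [
    {"emp_id": 1, "name": "Ava", "salary": 90000, "ssn": "xxx-xx-1111", "mgr": "Bo"},
    {"emp_id": 2, "name": "Raj", "salary": 85000, "ssn": "xxx-xx-2222", "mgr": "Bo"},
]

RULES = {
    # role: (row predicate, visible columns)
    "payroll": (lambda row, user: True, {"emp_id", "name", "salary"}),
    "manager": (lambda row, user: row["mgr"] == user, {"emp_id", "name"}),
}

def query(role, user):
    # Every query result is filtered by the role's rule before it is returned:
    # rows the user may not see are dropped, columns are projected away.
    predicate, cols = RULES[role]
    return [
        {k: v for k, v in row.items() if k in cols}
        for row in employees
        if predicate(row, user)
    ]

print(query("payroll", "Pat"))  # salaries visible, SSNs never returned
print(query("manager", "Bo"))   # only Bo's reports, no salary column
```

However the SQL is generated, hand-written, LLM-written, or issued through an agent, the same filter applies, which is the property the talk argues for.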

When an end user uses SQL to access data, or an AI generates SQL to access data for that end user, that end user is only going to see the data they’re supposed to see, and no other data. SQL won’t return columns and rows that the user is not authorized to see. That whole enforcement mechanism is built into the database, where it can’t be bypassed. I think this is absolutely vital for using AI in the future without going to jail for exposing private data. That covers data-level risks. Let’s move on to app-level risks. There are a lot of app-level risks, in three categories. One is that LLMs can be tricked into malicious actions or answers. Another is that LLMs can just misunderstand what you ask for.

You ask for something, and maybe what you said is ambiguous, and it misunderstands it. Another thing, as we all know, is that LLMs can hallucinate: they can make things up, and they can actually make mistakes in generating solutions. All these things are very risky when you’re dealing with enterprise data. There are two kinds of applications. One is internal applications that are used by experts who understand the risks and can deal with them. The other kind of application is one you have to have full trust in, for example, applications that are used by the general public, whom you have no control over and can’t train. Dealing with that kind of application requires limiting how AI can be used in the app. There are two things we have to do. We have to restrict the LLM’s inputs, usage, and results.

We have to validate the reasoning behind the results. That’s the only way we can really trust what the AI generates. This is actually a very sophisticated area; I’m only going to show one example here today because it can get more complicated. I’m going to show you how. We have a combination of two things, one called Trusted Answer Search and another called Apex Interactive Reports, that you can use to deliver trusted answers to natural language questions. You can make sure that the answer you’re getting is correct. Okay? Starting with Trusted Answer Search, the way this works is a user asks a question in natural language, similar to what we saw before. Instead of having the LLM try to answer that question directly, what we do is we use vector search to match that question to pre-created reports.

You might have 100, 200, 500 pre-created reports. We match the question to the nearest pre-generated report. We don’t use the LLM to directly answer the question. You can use LLMs to generate these reports or to help generate these reports, or to guide the answer. Natural language question, use vector search. We’ve matched it to a report. Now Shakiba is going to demonstrate how Apex Interactive Reports provides trusted answers within the real-time report.
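The vector-matching step described above can be sketched as follows; a real deployment would use learned vector embeddings rather than the toy bag-of-words vectors below, and the report names are made up for illustration:

```python
import math
from collections import Counter

# Toy sketch of Trusted Answer Search: instead of letting an LLM answer a
# question directly, match the question to the nearest pre-created report.
REPORTS = {  # hypothetical report catalog: name -> description
    "sales_by_region":   "total sales revenue by region and quarter",
    "headcount_by_dept": "employee headcount by department",
    "churn_risk":        "customers at risk of churn this month",
}

def embed(text):
    """Stand-in 'embedding': a bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def nearest_report(question):
    """Return the pre-created report closest to the user's question."""
    q = embed(question)
    return max(REPORTS, key=lambda name: cosine(q, embed(REPORTS[name])))

print(nearest_report("show me sales revenue for each region"))
```

The LLM never produces the answer itself; it is only used, at most, to help author the report catalog that the search selects from.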

Shakiba, Oracle: Millions of Apex applications rely on interactive reports to help people explore data on their own. They surface trusted information in a way anyone can customize. Now with natural language support, you can simply talk to your reports. Let me show you. Show me all employees that are managers. One question, and the interactive report applies the filters for me. What’s happening here is important. The AI takes my intent and turns it into report settings I can see and change. It’s not generating arbitrary SQL behind the scenes. Everything is applied as transparent filters, so I know I can trust the results. Group these managers by department and show me their team sizes. Now the report organizes them by department with each team size right there. I can even ask imprecise or ambiguous questions like show me high performing managers with a tenure of four years or more.

This is interesting because high performing isn’t a column in the data. The system interprets my intent, a rating of 4 or higher, and applies the right filters for me. I can also customize the appearance and shape of my report. Highlight those in the sales department in green. Just like that, the top sales managers are easy to spot. Now let’s pivot to see how departments are staffed: full time, contract, remote. This gives me a straightforward comparison of staffing across all departments. Now let’s go and visualize this. Show me a chart of full time versus contract employees. Sometimes I may have a question that goes beyond what interactive reports can do with a single filter, pivot, or group by. For example, which departments have the highest average tenure? Within those, how many managers are rated outstanding? That’s more complex. It requires multiple layers of aggregation.

In that case, Apex brings up the analysis assistant. It shows the thinking behind each answer so I can understand exactly what’s happening. The AI still isn’t generating SQL. Instead, it interprets my intent and runs safe, deterministic operations in the background. When possible, it refreshes the report with new chips. This extends the power of interactive reports to answer open ended questions with clarity and trust. From here I can keep going, asking follow ups or diving deeper. This is the next chapter for interactive reports in Apex: natural, conversational, trusted, and the best part, millions of existing reports can be AI enabled just by upgrading.
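A minimal sketch of the pattern described in this demo, where the AI maps user intent onto a whitelist of safe, deterministic report operations instead of emitting free-form SQL. Keyword matching stands in for the LLM here, and the column names are hypothetical:

```python
# Sketch only: the AI may only choose from safe report operations ("chips"),
# never generate arbitrary SQL. Intent parsing is stubbed with keywords.
SAFE_OPERATIONS = {"filter", "group_by", "highlight"}

def interpret(question):
    """Turn a natural-language question into transparent report settings."""
    chips = []
    q = question.lower()
    if "managers" in q:
        chips.append(("filter", "is_manager = 'Y'"))
    if "high performing" in q:  # ambiguous intent resolved to a rating filter
        chips.append(("filter", "rating >= 4"))
    if "by department" in q:
        chips.append(("group_by", "department"))
    # Guardrail: every chip must be one of the whitelisted operations.
    assert all(op in SAFE_OPERATIONS for op, _ in chips)
    return chips

print(interpret("show me high performing managers by department"))
```

Because the output is a visible list of filter chips rather than hidden SQL, the user can inspect and change exactly what was applied, which is where the trust comes from.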

Juan Loaiza, Database Team, Oracle: All right. It’s very impressive. It’s amazing what AI can do now. This is Trusted Answers. You know that these answers are correct. It shows you what it’s doing. It’s limited; we’ve limited the LLM so it can’t do crazy stuff. That’s Trusted Answers. We’re also applying this GenDev methodology of architecting AI data and app dev together to generate full applications. We’re doing that in Apex Application Generator, which is a new generation of Apex. Now, what’s Apex? Apex is Oracle’s very popular low code IDE that allows you to visually create apps. Apex creates mission critical apps that scale to any level, and it’s very popular. We have over 3 million Apex apps in production today, and thousands of new ones are generated every day. It’s also important that Apex is completely free both to develop and run, so there’s no charge for it.

We’ve rearchitected Apex to be an AI native application generator. What we’ve done is rearchitect Apex to use a new solution-centric language, like SQL is. Remember I talked about that. It’s called Open Application Specification Language, and it’s for creating apps. The idea here is that this allows AI to focus on specifying what an app should do rather than how it should implement it. That makes it much more reliable and much more succinct. Like SQL, this open app spec language is orders of magnitude more succinct and understandable than a traditional app dev language. The way this works is you just describe in natural language the app functionality, the pages, the data that you want, the features that you want, and Apex uses the LLM to convert that natural language description into a succinct and understandable open app specification for the app.

Then Apex simply compiles that specification into a runnable app. This generated app will use trusted data APIs, the thing I talked about a little while ago, to ensure that its access to data is correct, evolvable, and secure. That’s how this works. It’s very different from anything else out in the market. This is a very simple example of this open app spec language. It just specifies a simple report: you can see it just says here’s the name of the report, here’s the type of the report, you give it the SQL statement, and it automatically generates a screen for all the data returned by the SQL statement. That’s a trivial example.
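As a hypothetical rendering of that example (this is not the actual Open Application Specification Language syntax), a declarative page spec and a trivial “compiler” might look like:

```python
# Hypothetical illustration of the declarative idea: the spec says WHAT a
# page is (a report over a SQL statement), and a compiler derives the HOW.
# Field names are invented and are not the real spec language.
spec = {
    "name": "Regional Sales",
    "type": "report",
    "source_sql": "SELECT region, SUM(amount) AS total FROM sales GROUP BY region",
}

def compile_page(page_spec):
    """Trivial 'compiler': turn a declarative page spec into render config."""
    assert page_spec["type"] == "report"
    return {
        "title": page_spec["name"],
        "query": page_spec["source_sql"],
        # A real generator would derive widgets from the SQL's result shape.
        "widgets": ["grid"],
    }

print(compile_page(spec))
```

The reliability argument is that the LLM only has to produce this small, checkable specification; the deterministic compiler, not the LLM, produces the running application.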

Okay, so that was a quick overview of AI app dev. What you’ve seen is that the key idea of this GenDev methodology is to enable ultra fast innovation by architecting enterprise level trust into the stack, into the data and application stack, so it really can’t be violated by the LLM. This is an approach that’s unique to us, because we’re really focusing on enterprise capable apps. Okay, so that’s the second section. Now moving on to the next section, which is open AI analytics and our new AI Lakehouse. Now, Oracle’s always been about open standards, and we’re continuing to be about open standards. In fact, we’re bringing open standards and open platforms to the age of AI. Oracle, our AI for data, supports all the leading models and frameworks.

Every LLM, every framework, customers can either call them via APIs or can deploy them as private instances for added security. We’re open to all the models, we’re open to running these things everywhere. In all the public clouds, you know our multi cloud strategy, we can run Oracle database in all the leading clouds. We can run Oracle database on premises, we can run this model called Cloud at Customer where the database runs on premise, but it’s in a cloud model. Because we’ve architected all this technology into the database, you get to choose where you deploy it. Now this advanced AI capability I’ve been talking about, it’s not limited to data in the Oracle database. You can use Oracle AI in any of your data stores using Oracle AI Database’s advanced federated query capability to create an AI proxy database.

An AI proxy database sits in front of your other databases and it performs AI actions. A simple example here is you say select AI what were the top 10 products by sales per region that goes to the AI proxy database. AI proxy database understands these other data sources, on prem sources, cloud sources, gathers the data it needs, gives it to the LLM, the LLM produces answers. This is great for all kinds of environments where the data is not all stored in one database. One thing that it’s really good for is running against older versions. You can put this proxy database, for example, against older databases, older Oracle databases, or older anybody’s databases. That’s the AI proxy database. In addition, for many years we’ve supported sophisticated analytics on data in object store.
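The AI proxy pattern described above can be sketched roughly as follows, with both the backing data sources and the LLM call stubbed out; names and shapes are invented for illustration:

```python
# Conceptual sketch of an AI proxy database: a front end that understands
# several backing data sources, gathers what a question needs, and hands
# the combined data to an LLM. Sources and the LLM call are stubs.
SOURCES = {  # hypothetical federated sources (cloud, on-prem, older DBs)
    "orders_cloud":  [{"product": "A", "region": "EU", "sales": 120}],
    "orders_onprem": [{"product": "B", "region": "US", "sales": 200}],
}

def llm_answer(question, rows):
    """Stub: a real deployment would call a model API with the gathered data."""
    top = max(rows, key=lambda r: r["sales"])
    return f"Top product by sales: {top['product']} ({top['region']})"

def select_ai(question):
    """Federate: pull rows from every source, then let the LLM answer."""
    rows = [row for source in SOURCES.values() for row in source]
    return llm_answer(question, rows)

print(select_ai("what were the top products by sales per region"))
```

The value of the proxy is that the question is asked once, in one place, while the data can live in many systems, including databases too old to have any AI capability of their own.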

We’ve already released things like native Delta Sharing, parallel SQL, and materialized views, all sorts of technologies against data in object stores. Now, this is something we’re announcing at this AI World: we’re embracing Apache Iceberg to provide a truly open lakehouse. For those of you that don’t know what Apache Iceberg is, it’s a standard that defines a common table format and an open catalog for tables stored in object store files. You store your data in an object store file in this format. The whole goal of this is that once data is stored in this common standardized format, any product can access it, both read and write. Iceberg tables can be both read and written by a multitude of databases, by things like Spark engines, and directly by Python programs. It’s a completely open standard where all vendors are interoperable.

Now these Iceberg tables have a lot of benefits. They do things like data versioning, partitioning, basic transactions, and basic schema evolution. These kinds of shared tables have a bunch of drawbacks also: you can’t have indexes, transactions are batch-oriented and can’t span tables, security is rudimentary, and storage is slow. We’re bringing to market a new version of our database, a new product called Oracle Autonomous AI Lakehouse. The idea is to provide the best of both worlds and resolve a lot of these trade-offs. What we’re delivering is the vendor independence of Apache Iceberg, so we can read and write this vendor-independent data with all the power of the Oracle AI Database. In fact, as of today, the 23AI data warehouses that you already have are automatically converted into AI lakehouses. Now they’re enabled to access all this data in a lakehouse in Apache format.

With this autonomous lakehouse, we’re providing all of Oracle’s best-of-breed analytics and SQL on top of all your data lake data. What do you get? You get the AI vector search I talked about earlier for fast indexed semantic search on all the data lake. You get the richest analytics SQL of Oracle, which handles advanced relational but also graph and JSON on the same Iceberg data. You get Exadata performance, because we will automatically cache frequently accessed data lake data in Exadata and then access it with Exadata performance while keeping it completely consistent. You get scale-out data access with a new technology, a pay-per-use serverless data lake accelerator. And you get the secure data access that I talked about earlier, which ensures privacy and provides comprehensive security, governance, and sovereignty.

Now, along with that, you also get a federated catalog of catalogs that comes with it. What that does is it provides unified discovery and access to data across Iceberg and dozens of other data stores. It’s a catalog of catalogs. It finds all your data and catalogs it. What it enables is plug and play access to Iceberg data from Oracle SQL. You just name the Iceberg tables and their catalog in the from clause of a SQL statement, and it finds it and accesses it. You can see the example there where you just put the catalog, schema, and table. This works with any Iceberg catalog, whether it’s our catalog or Databricks catalog or Snowflake’s catalog. No matter who catalogs the data, we can find it, we make it really easy to access, and we can both read and write that data.

Not only can you find that data, you can also load data into Iceberg. We’ve adapted our industry-leading ETL and GoldenGate replication products so that we can easily move data from your existing operational and analytics systems into the completely open Iceberg format in real time. That’s a big deal. In this section, what I’ve talked about and shown is the power and simplicity of Oracle’s open, architecturally converged, everything-works-together approach to AI. You’ve seen that this enables you to deliver trusted AI insights, innovations, productivity, and apps for all your data: data in Oracle, data in other stores, data in the object store, data in data lakes, data in third party databases. One of the unique things is we’re running the same AI across everything.

Whether it’s operational data or data lake data, you get the same AI capabilities so you can standardize and simplify your AI estate. We’re continuing what we announced last year, which is all these AI capabilities that I’ve talked about are included in Oracle Database for free at no additional charge. You have these already. If you have Oracle licenses, no additional charge. You already own everything I’ve talked about. That was AI Lakehouse. Now we’re going to transition to our last topic, which is the Oracle AI Data Platform. To describe that, I’m going to invite Oracle EVP T.K. Anand to the stage to talk about this Oracle AI Data Platform and how it extends the Oracle AI Database that I just talked about by providing an integrated platform to bring all your enterprise data together and use AI to solve real world business problems. Please welcome T.K. Anand.

Please welcome to the stage Oracle’s Executive Vice President of AI Data Platform, T.K. Anand.

T.K. Anand, Oracle: Hi everyone. I hope you’re all enjoying AI World. I’m T.K. Anand. I’m excited to be here and talk to you about the AI Data Platform. You just heard Juan Loaiza talk about the Oracle AI Database and all of the amazing AI innovation that’s going on there. The AI Data Platform builds on top of the database and provides an end-to-end platform for data analytics and AI. All right, let’s get into it. AI is driving the next industrial revolution and we know it’s going to disrupt every industry. It’s just a matter of time, months or years, not decades. The organizations that are going to survive and thrive in the AI era are those that can augment and reinvent every aspect of their business with AI agents.

We’ve all seen what foundation models can do with the massive corpus of public domain data that they’ve been trained on, but they know very little about your organization and your business. The key to achieving AI transformation is to get these models to understand your enterprise data, your business applications, and your workflows. The Oracle AI Data Platform is a comprehensive platform that brings your enterprise data together with industry-leading foundation models to help you build agentic applications and experiences for your business users. Firstly, it provides a data foundation that brings all your enterprise data together under one roof. This includes all your databases and applications, your structured and unstructured data, your historical and real-time data. All of this data is made AI ready through a unified catalog, through semantic enrichment, vectorization, et cetera.

We have an AI platform on top of that data that enables developers to build AI applications and agents. The platform includes foundation models, AI frameworks, and developer tools. We also have agentic experiences for business users to consume these AI solutions. You can think of the AI Data Platform as these two systems that work in concert with each other, getting your data ready for AI and then leveraging AI to transform your business. Next slide. All right, what is the AI Data Platform? Let’s look into the key capabilities. At the core of the AI Data Platform is an open lakehouse. It’s an open lakehouse built on open standards, and there’s a unified catalog that brings all your data and AI assets together to enable integrated security and governance. On top of the data, we offer the best of Oracle and open source data engines.

You already heard Juan talk about the Oracle AI Database, and that’s core to this platform. In addition, we also offer Apache Spark and Flink, which are popular open source engines for working with these data lakes. We have industry leading AI models like OpenAI. We have AI frameworks like LangChain. All of these are built into the platform. On top of this data and AI foundation, we have an integrated developer workbench that supports a variety of data analytics and AI use cases. We have an agentic user experience for business users to work with AI agents using interfaces like chat, data visualizations, business workflows, et cetera. The AI Data Platform is a brand new PaaS service in OCI, but it integrates multiple underlying services together into a cohesive experience. It relies on OCI infrastructure services such as CPU and GPU compute for data processing, model training, and inferencing.

It uses object storage to manage all of your structured and unstructured data in the lakehouse. It leverages Autonomous Database for all the amazing capabilities that Juan talked about, but most notably high performance query processing and data retrieval for AI applications, Vector Store for AI agents that Juan talked about, et cetera. It leverages the OCI Generative AI service for all of the foundation models like OpenAI. It leverages Oracle Analytics for its semantic modeling and data visualization capabilities. Finally, the AI Data Platform comes with a set of built-in services for things like the open source Spark and Flink engines, the Unified Catalog, the developer workbench, the business user experience, et cetera.

One of our goals in creating the AI Data Platform was to ensure that we eliminate the developer effort required to wire all these products and services together, and instead developers can just focus on building the solution. All right, now let’s talk about how you get data ready for AI using the Oracle AI Data Platform. The AI Data Platform comes with an enterprise-grade data lakehouse that brings all your data together under one roof. You can bring data from all your databases and applications into the lakehouse. Like I said, it’s structured and unstructured data, historical and real-time data. All of this is made accessible through a unified catalog. You can ingest data into the lakehouse using a variety of different techniques. You can use batch ETL or ELT pipelines. You can use streaming data pipelines using GoldenGate or Kafka.

You can also just leave your data in the source system and access it live through the catalog via zero-copy data integration. You can implement a medallion architecture on top of the lakehouse, which is a pretty common pattern. For example, all your source data from the source systems ends up populating the bronze layer. The silver layer is populated by transforming, cleansing, and enriching your data. The gold layer typically represents the most curated version of the data that you use for your analytics, your AI applications, and so forth. The AI Data Platform supports open formats, namely Iceberg and Delta Lake, for managing your data in object storage for the bronze and silver layers. The gold layer is typically managed natively in the Autonomous Database for high-performance retrieval within analytics and AI applications.
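The bronze/silver/gold flow described above can be sketched with stand-in transformations; real pipelines would run on Spark, Flink, or inside the database, and the data below is invented:

```python
# Minimal sketch of the medallion pattern: raw bronze rows are cleansed
# into silver, then aggregated into a curated gold layer.
bronze = [  # raw source rows, exactly as ingested
    {"customer": " ana ", "amount": "100"},
    {"customer": "BOB",   "amount": "250"},
    {"customer": " ana ", "amount": "50"},
]

def to_silver(rows):
    """Cleanse and normalize the raw data (trim names, type the amounts)."""
    return [{"customer": r["customer"].strip().title(),
             "amount": int(r["amount"])} for r in rows]

def to_gold(rows):
    """Curate: aggregate to the shape analytics and AI apps consume."""
    totals = {}
    for r in rows:
        totals[r["customer"]] = totals.get(r["customer"], 0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'Ana': 150, 'Bob': 250}
```

In the platform as described, the bronze and silver outputs would land in Iceberg or Delta Lake tables in object storage, while the gold result would live natively in the Autonomous Database.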

The AI Data Platform makes it really simple to manage this medallion architecture thanks to the unified catalog that provides integrated security, governance, and lineage across all layers. Now, on top of this data layer, on top of this lakehouse foundation, the AI Data Platform offers a developer workbench where developers can implement all their solutions in one environment. This includes projects like data integration, data engineering, data science, agent development, et cetera. The workbench has an AI-assisted notebook interface that supports multiple languages including SQL, Python, Scala, and Java. We also have a visual drag-and-drop interface for low-code developers. The workbench is integrated with Git to enable source control, versioning, and team development. You can use the workbench to create jobs that run on all of the supported data engines.

For example, you might create a job that runs on the autonomous database or on a Spark cluster, or you can create jobs that run across multiple of these data engines. Now let’s take a look at the Oracle AI Data Platform’s developer workbench. Here’s a quick demo.

Oracle AI Data Platform unifies AI models, enterprise data, and an intelligent developer experience to turn data into value. Meet Jordan, a data engineer at a leading telecom company tasked with predicting customer churn using customer profiles and customer reviews in her AI Data Platform workspace. Jordan opens a notebook with direct access to her organization’s data catalog, which manages all assets. Following the medallion architecture, she drags and drops the customer reviews file from the Bronze layer’s Object Storage volume to begin exploring the data. With AI Code Assist, she joins customer reviews with detailed customer profiles, bringing together structured and unstructured data into a single data set. Next, with just a drag and drop, Jordan instantly receives sample code that demonstrates how to call an OCI GenAI model. This ready-to-use code gives her a clear starting point.

With a few quick edits, she customizes it to analyze the sentiment of each customer review. She then writes additional code to save the results into a new table in the Silver layer. AI Code Assist can also create code comments and generate full documentation explaining the code’s purpose, logic, and use cases for developers. With profiles and sentiments processed in the Silver layer, Jordan applies a machine learning model from the catalog to predict churn risk. Again, she uses AI Code Assist to run the model and save the output to the Gold layer powered by Oracle’s Autonomous AI Lakehouse. To validate the results, Jordan simply switches the notebook cell’s language to Select AI, pointing directly at the Gold layer. She then drags and drops the table and asks Select AI how many customers are likely to churn.

Autonomous AI Lakehouse generates the query, runs it, and delivers the answer. In just a few steps, Jordan has transformed raw, unstructured data into predictive insights by combining GenAI, custom ML models, and enterprise data. Finally, Jordan schedules the notebook to run automatically when new reviews arrive, with notifications on completion, providing end-to-end orchestration with ease. Oracle AI Data Platform empowers teams to turn data into insights faster, smarter, and at enterprise scale, driving innovation across the business.

Now let’s talk about how you leverage all of that enterprise data in the lakehouse to build AI applications for your business. The Oracle AI Data Platform comes with a comprehensive set of AI models and frameworks that are ready to use out of the box. We have industry-leading foundation models such as OpenAI, Grok, Llama, Cohere, and soon Gemini. We have popular AI frameworks like PyTorch, TensorFlow, LangChain, and LangGraph. They’re also integrated in the platform, and these models and frameworks run on highly optimized CPU and GPU compute shapes in OCI. On top of these AI models and frameworks, the AI Data Platform has an Agent Studio that’s integrated into the developer workbench. The Agent Studio supports a variety of use cases such as getting insights from data, semantic search over unstructured documents, orchestrating workflows, monitoring business processes, raising alerts, etc.

Let’s see a quick demo of the Agent Studio in the AI Data Platform.

Oracle AI Data Platform reimagines how enterprises build and deploy AI applications. It brings together advanced AI models, a modern developer experience, and enterprise-grade data security and governance. Meet Alex, a developer at a soft drinks company. His task? Build an AI agent to monitor new government regulations such as restrictions on packaging or labeling. Inside the Oracle AI Data Platform Workbench, Alex creates an agent flow. He begins by setting the trigger, in this case a daily schedule that executes the agent automatically. Next, Alex adds an agent and starts assembling the tools it needs. First, he connects a RAG tool to the agent and assigns a regulation knowledge base that was built with Oracle AI Database 26AI. This allows the agent to scan newly published regulations, compare them to existing rules, and instantly spot any new proposals.

With Oracle AI Database 26AI vectorizing the content, the agent has precise, reliable access to regulatory knowledge. Then Alex adds an NL2SQL tool. This allows the agent to query the CX and supply chain systems to determine which products could be impacted by the new rules, whether those involve packaging, labeling, ingredients, or other characteristics. To capture the details for auditing and analysis, Alex adds a SQL tool that writes a new entry into the company’s regulations database. Finally, he connects Slack and email outputs so the relevant product and supply chain managers are notified as soon as a regulation is detected. With just a few clicks and minimal coding, the agent is ready for testing from the editor playground. Alex can simulate triggers and inputs, or review outputs and monitor the agent’s behavior, accuracy, and performance step by step.

Once validated, the agent is deployed to a production environment where it runs automatically on schedule. Just a few days later, the agent proves its value when California proposes new limits on plastic bottles. The agent identifies the regulation, logs the details in the database, and sends a Slack alert. The business owner immediately reviews which products are affected and begins planning next steps. Agents enable organizations to act faster, stay compliant, and uncover opportunities. With Oracle AI Data Platform, enterprises gain a foundation for building and deploying trustworthy AI applications that keep them agile, competitive, and resilient.
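The agent flow from this demo can be sketched as a chain of tool calls; every tool below is a stub with a hypothetical interface, standing in for the RAG, NL2SQL, and notification tools the demo describes:

```python
# Sketch of an agent flow: a scheduled trigger runs an agent that chains
# tools (RAG lookup, NL2SQL, notifications). All tools are stubs.
def rag_tool(query):
    """Stub for a knowledge-base search over vectorized regulations."""
    return ["New limit proposed on plastic bottles"]

def nl2sql_tool(question):
    """Stub for NL2SQL: turn a question into a query against business systems."""
    return "SELECT name FROM products WHERE packaging = 'plastic bottle'"

def notify(channel, message):
    """Stub notification output (Slack, email)."""
    return f"[{channel}] {message}"

def regulation_agent():
    """One scheduled run: find new regulations, assess impact, alert owners."""
    alerts = []
    for regulation in rag_tool("new packaging regulations"):
        sql = nl2sql_tool(f"products impacted by: {regulation}")
        alerts.append(notify("slack", f"{regulation} (impact query: {sql})"))
    return alerts

print(regulation_agent())
```

The point of the flow structure is that each step is an inspectable tool call, so the agent’s behavior can be tested in a playground and audited in production rather than treated as a black box.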

The Oracle AI Data Platform has a unified catalog for not just all of your data assets, but also all of your data science and AI models and your AI agents and tools. All of the AI agents and tools that you build in the developer workbench are automatically registered in the catalog, and they support open interop standards like MCP and A2A. Over time, we expect most organizations to have numerous AI agents that run in different platforms and applications. For example, you can have agents running in your SaaS applications like Oracle Fusion Applications or Salesforce. You can have agents running within a productivity suite like Office 365. You might have custom agents that you build in a cloud platform like Azure or AWS, but the Oracle AI Data Platform will let you register these agents into its unified catalog, regardless of where they run.

Why is this interesting? The Oracle AI Data Platform comes with an agentic experience for business users called the Agent Hub. In the same way that BI tools and data warehouses enable business users to have a single pane of glass over their organization’s data, the Agent Hub enables business users to have a single pane of glass over the organization’s agents. For example, the Agent Hub will let business users navigate and search the catalog of agents and have a conversational interface with agents that enables things like getting access to information, gaining insights from data, initiating tasks, chaining tasks into workflows, creating a team of agents to solve a complex problem, etc. Our vision for the Agent Hub is to become a cross-platform and cross-application experience for business users to leverage AI agents to automate their work and improve productivity. Let’s take a look at the Agent Hub.
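The “catalog of agents” idea can be sketched as a small registry that routes a request to every agent whose capabilities match; the names, capabilities, and answers below are invented for illustration:

```python
# Sketch of an agent hub: agents running anywhere register with a catalog,
# and the hub routes a business user's request to the matching agents.
class AgentHub:
    def __init__(self):
        self.catalog = {}

    def register(self, name, capabilities, handler):
        """Register an agent regardless of where it actually runs."""
        self.catalog[name] = {"capabilities": set(capabilities),
                              "handler": handler}

    def ask(self, request, topics):
        """Find every agent covering the topics and consolidate their answers."""
        answers = []
        for name, agent in self.catalog.items():
            if agent["capabilities"] & set(topics):
                answers.append(f"{name}: {agent['handler'](request)}")
        return answers

hub = AgentHub()
hub.register("sales_agent", {"sales"}, lambda q: "forecast looks on target")
hub.register("supply_agent", {"supply"}, lambda q: "overstock under 20%")
print(hub.ask("Can EMEA fulfill next quarter?", ["sales", "supply"]))
```

A cross-system question like the EMEA fulfillment example would fan out to both the sales and supply chain agents, with the hub consolidating their answers into one response.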

Oracle Agent Hub provides business users with a unified experience to interact with all their agent powered applications and tools, delivering a central entry point for workflows, tasks, and activities. Right from the start, users see their top tasks, key notifications, and quick access to frequently used agents, along with curated links to enterprise applications. When a user makes a request, the system automatically identifies the right agents from the catalog and orchestrates them to complete the task. For example, asking to send a compliance training reminder triggers HR agents to check completion status. Once it confirms who hasn’t completed training, it calls an email tool to send the reminder. The system even suggests scheduling a daily reminder until the task is complete and sets it up automatically. Once approved, the value comes alive even more with complex questions a business user might ask.

Will EMEA be able to fulfill sales of our top five products next quarter, while keeping Overstock under 20%? Agent Hub identifies the sales and supply chain agents, pulls together their insights, and delivers a consolidated answer. The response can include narratives, visual data, recommended actions, and links to applications. Because the reasoning behind the answer is transparent, users can review the details, verify the outcome, and build confidence in the system. From there, Agent Hub goes beyond analytics and into action. The user might ask, how can we address the identified shortfall? The system then coordinates across agents to propose a supply chain strategy. It generates an actionable plan that can be approved in one step or provides links into the supply chain application so leaders can review purchase orders and other details before moving forward.

By combining intelligence with execution, Agent Hub helps organizations not only understand their business, but also act on it immediately.

All right, let’s recap. We’ve gone through all of the layers of the AI Data Platform, starting from the data and AI foundation to the developer workbench, the Agent Studio, and finally the Agent Hub experience for business users. I’m happy to announce that the AI Data Platform is now generally available. Many of the product features and capabilities that I’ve shown you today are already available in the product, while some, like the Agent Hub, are still in development and should be released over the next 12 months. Leading up to GA, we worked with several customers who used the product as design partners and gave us valuable feedback. Here are two customers we’d like to share a brief video about: University College Dublin and Clopay.

My name is Colin McMahon. I’m the Executive Director of the UCD Clinical Research Centre. Our mission at the UCD Clinical Research Centre is to translate research into impact for better patient care. The challenge we faced previously was unlocking significant amounts of unstructured and semi-structured data and being able to combine that with open data sets to in this instance deliver a use case around better respiratory care, specifically patients suffering with chronic diseases. We’ve been working with Fortis for several years now. They brought expertise on the Oracle AI Data Platform. They managed the governance and the compliance aspects and were able to prototype a demonstrator within a number of weeks. In the pilot we were able to use fully synthetic data and open data sets to develop a decision support tool that ultimately will be able to scale in a fully governed and compliant way.

The Oracle AI Data Platform has allowed us for the first time to take stock of what’s possible and has helped our clinicians understand the art of the possible. The benefit of the Oracle AI Data Platform is that it provides a governed, fully integrated, and scalable solution. We’re excited about the potential to convert these unstructured data sets into structured data sets that allow us to perform longitudinal analysis, and about the impact that might have on patients, as well as on clinicians, whose time can be better managed by understanding how patients can be cared for in the community using this data.

Clopay is the largest garage door manufacturer in the U.S. We are based out of Ohio. We have multiple manufacturing locations, all within the U.S. We are basically a mass manufacturer of custom products, with millions of SKUs. Every door is made uniquely for our customers. Prior to implementing Oracle AI Data Platform, we had to assemble this information manually, pulling it from OBI into spreadsheets and manipulating those spreadsheets to look at different SKUs, geographies, and pricing to understand what the issue was. Using Oracle AI Data Platform, we are able to predict our dealer churn much more accurately. It’s a more data-based decision. It helps us uncover exactly what is going on, and it’s a great tool. It really drives our bottom line and our predictability; it helps us understand where our dealers are going.

It’s not only looking back, it also gives us a view into the future. Using Oracle AI Data Platform was natural to us. Clopay is an Oracle shop, and for us this is a natural extension to continue to use Oracle tools into AI to help us improve our business performance.

All right. The rapid growth we’ve seen in the Oracle Cloud over the past few years has only been possible due to the support and commitment of our partner community. We’ve been working with a number of our global partners to get them ready for the AI Data Platform launch. I’m happy to note that some of these global partners have committed over $1.5 billion towards comprehensive training and development of industry use cases and solutions. Here’s a brief video showcasing the work that these partners are doing with the AI Data Platform.

Oracle AI Data Platform represents an exciting leap forward for organizations looking to reimagine what’s possible with data and AI. I believe it’s ideal for enterprises seeking to modernize their data and AI capabilities. At Infosys, Oracle’s AI Data Platform is a top strategic priority for investments, for talent development, and for growth. Infosys invested over $140 million in R&D during 2025, as stated in our latest annual report, and plans to make significant investments in Oracle’s AI Data Platform capabilities over the next few years. Today I’m excited to share that we are building several industry use cases that leverage generative and agentic AI. These use cases, built on Oracle’s AI Data Platform, will be part of the Infosys Topaz AI-first offering. We remain dedicated to empowering our customers to achieve transformative outcomes with AI.

For over 30 years, we’ve been committed to driving client outcomes by leveraging the power of data, and now, more recently, AI. I am super excited that Cognizant is one of the strategic launch partners of Oracle’s AI Data Platform. We at Cognizant announced a $1 billion investment in AI last year. Our commitment to shaping the AI and agentic journey for our enterprise clients is steadfast. We see the AI Data Platform as a strategic component of this journey.

Our goal is to train over 1,000 associates within the next 24 months. We’re working with Oracle on joint offerings and use cases in industries as broad as manufacturing, retail, technology, travel, hospitality, and utilities. We also aim to create over 50 industry-specific agentic AI use cases that leverage both the Oracle AI Data Platform and our own agent foundry.

Unidentified speaker: We are thrilled to see Oracle announce its new AI Data Platform, a major milestone in enterprise AI innovation. This advancement will allow Accenture’s clients to unlock the full value of Oracle’s capabilities from day one. Since our $3 billion commitment in 2023, Accenture has been heavily investing in AI technology and upskilling our people so that we’re ahead of the curve on innovation. We’ve been excited to walk this path hand in hand with Oracle, quite frankly, as our partner. Together we are embedding Oracle capabilities into Accenture’s AI Refinery, a modern unified framework built on Oracle Cloud Infrastructure and powered by Nvidia.

Unidentified speaker: For years at KPMG, we’ve been focused on helping our clients make better decisions and automate business processes through the use of data, AI, and analytics. We’re very excited about the release of Oracle’s new AI Data Platform, which allows us to further use data and AI to help our clients make better business decisions and automate business processes. We’re also going to utilize that data in the form of industry analytics, where we’ve worked with our industry experts at KPMG to create a way to very quickly access that information. As we continue to focus on delivering valuable solutions to our clients, we anticipate investments of $200 million over the next three to five years, as well as training an additional 1,600 employees globally.

We are empowering organizations to accelerate success using data and AI, especially in decision-making against the metrics that matter most. We are extremely proud to be a partner of Oracle at the launch of the new Oracle AI Data Platform. As part of our billion-dollar commitment to AI initiatives, Oracle’s AI Data Platform will serve as one of the key cornerstones of our AI strategy. At PwC, we have thousands of consultants across the globe equipped to drive this strategy and deliver with Oracle’s latest AI technologies. Oracle’s AI Data Platform will be a core enabler of our Agent Powered Performance solution, PwC’s flagship IP that combines AI agents, automation, and analytics. Leaders often wonder: why do my AI pilots stall, and why does value take so long?

The answer is usually siloed data, lagging integration, and time wasted managing different tools, which drives up costs and compliance risk. Oracle’s new AI Data Platform changes this by uniting data governance, analytics, and AI in one solution. Now, with LTI Mindtree’s Transistor, you can quickly migrate KPIs, models, and pipelines to AIDP and gain value from day one. Today I’m happy to announce that we are investing over $200 million and training a thousand-plus experts over the next two years to support this journey and unlock value for our customers.

Unidentified speaker: All right, before I wrap up, I want to make an important point about the AI Data Platform and how it relates to our SaaS applications. For all of our major application suites like Fusion, NetSuite, Health, Life Sciences, et cetera, we will offer a tailored version of the AI Data Platform that comes with pre-built integration with the SaaS applications. It includes data pipelines, lakehouses, business semantics, analytics, and of course AI agents. This is a natural evolution of our existing products like Fusion Data Intelligence and Health Data Intelligence. The idea is that you can get started with the AI Data Platform and see immediate value for your business users, while still having the full power of the platform in OCI to extend and customize the solution. All right, that’s about it.

I just want to wrap up by saying the AI Data Platform is a comprehensive and integrated platform for all your data analytics and AI use cases. It’s built on a lakehouse architecture supporting open data formats. The Unified Catalog enables end-to-end security, governance, and lineage. We bring together the best of the Oracle AI Database and industry-leading open source engines like Spark to give you high-performance, scalable processing on your data. We have industry-leading AI models, such as those from OpenAI and Google’s Gemini, already integrated into the platform. Oracle Applications customers can get started with the pre-built AI Data Platform solutions that I just talked about. We have a number of sessions at AI World that get into the technical capabilities of the AI Data Platform. Please check those out. I just want to reiterate what Juan Loaiza said at the start of this keynote.

The AI revolution is here and your organization needs you to step up and be an AI leader. The Oracle AI Database and the AI Data Platform can help you in this transformation journey. Please scan these QR codes to learn more. Thank you.

This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.
