On Tuesday, 10 June 2025, NVIDIA Corporation (NASDAQ:NVDA) participated in the Nasdaq Investor Conference in Partnership with Jefferies. The conference highlighted NVIDIA’s strategic focus on AI growth, the resolution of supply chain issues, and challenges in the Chinese market. The discussion was led by CFO Colette Kress and Jefferies analyst Blayne Curtis.
Key Takeaways
- NVIDIA is capitalizing on sovereign AI initiatives, projecting potential growth in the trillions over several years.
- The Blackwell architecture is a key focus, with supply chain constraints easing.
- Gaming revenue reached record levels despite supply limitations.
- U.S. government restrictions impact NVIDIA’s ability to address the Chinese market.
- Gross margins are expected to improve, reaching the mid-70s by year-end.
Financial Results
- Gross Margin Improvement: NVIDIA expects gross margins to reach the mid-70s by the end of the year, driven by increased volumes and cost efficiencies in the Blackwell architecture.
- Gaming Sector: Gaming hit record levels, though supply constraints remain due to the focus on Blackwell production.
- Networking Business: Networking revenue reached roughly $5 billion, up 64%, with Spectrum-X at about $2 billion.
- China Market: The Chinese market is estimated at $50 billion, but U.S. restrictions limit NVIDIA’s ability to fully engage.
Operational Updates
- Sovereign AI Initiatives: NVIDIA is collaborating with countries worldwide to develop AI infrastructure tailored to their needs.
- Blackwell Architecture: The company is shipping the Blackwell architecture at full data center scale, enhancing performance and efficiency.
- Supply Chain Improvements: NVIDIA is addressing supply chain issues, shipping 1,000 racks per customer weekly.
- Networking and NVLink: NVLink and networking solutions, including InfiniBand and Ethernet, are crucial for maintaining NVIDIA’s competitive edge.
Future Outlook
- Sovereign AI Growth: NVIDIA expects significant growth in sovereign AI, with potential for tens of billions of dollars in the near future and trillions over time.
- Blackwell Focus: The company prioritizes Blackwell production to meet industry demand.
- Expansion Beyond Data Centers: NVIDIA is exploring opportunities in automotive, robotics, and enterprise AI applications.
- Continuous Innovation: The company remains committed to innovating and expanding its platform to address emerging AI needs.
Q&A Highlights
- Sovereign AI Demand: Countries are developing their own AI capabilities, with NVIDIA supporting these efforts.
- Blackwell Demand: Strong growth is anticipated despite challenges in China.
- Networking Strategy: The networking business remains strong, with a high attachment rate to NVIDIA GPUs.
- China Strategy: NVIDIA is in discussions with the U.S. government to navigate restrictions impacting the Chinese market.
For a deeper understanding of NVIDIA’s strategies and insights, refer to the full transcript.
Full transcript - Nasdaq Investor Conference in Partnership with Jefferies:
Janet Harbison, Lead International Equities, Jefferies: Welcome to our 2025 Nasdaq Investor Conference. My name is Janet Harbison, and I lead international equities at Jefferies here in London. We are proud to be partnering with Nasdaq once again on this event, and I can confidently say that this is the best lineup ever. I want to take a moment to thank Jack, Daniel McCart and Andrea Joff from Nasdaq for their incredible partnership and dedication. On the Jefferies side, a huge thank you to Abigail Chartam, Adita Balsam and Tanya Cosler for their tireless work behind the scenes to ensure a seamless experience for all of you.
And of course, thank you to the corporates for making the trip to London. What an exceptional lineup we have. The Nasdaq has long been a bellwether for innovation driven growth. Over the past five years, it has consistently outperformed broader market indices, reflecting the strength and resilience of technology and biotech sectors. This performance underscores Nasdaq’s role not just as a stock exchange, but as a global platform for companies shaping the future from AI and semis to cloud computing and digital health.
A few quick words on Jefferies, as we will be better known to some of you than to others. We're one of the fastest growing investment banks globally: a sixty-year-old firm with a $60 billion balance sheet and almost 7,000 professionals across more than 40 offices in Europe, the Middle East, Asia and, of course, the Americas. We focus exclusively on global markets, investment banking and asset management.
U.S. equities are a key focus, and we're delighted to be welcoming our Jefferies teams from Stockholm, Frankfurt and Paris, as well as multiple U.S. offices, here today. Within equities, clients are often surprised to learn that Jefferies now has the broadest global equity research coverage on the street, covering over 3,500 stocks.
And most recently, we’ve added Latin America, MENA and Canadian research. What truly differentiates Jefferies is the global nature of our business and the depth of collaboration between regions and teams. Please do speak to me or my colleagues if we can help you or your business further, or if you would like to be introduced to other parts of the business. This spirit of global collaboration and insight is at the heart of our upcoming lunchtime panel on semiconductors, where we look forward to hosting our global semis analysts: Blayne Curtis, our U.S. semis analyst; Gennard Menon, Head of European semis; and Edison Lee, Head of Asian semis.
We hope you’ll join us for what promises to be a fascinating discussion. Before I close, one small ask. In 2024, we had record results in Institutional Investor, ranking fifth. Jefferies was again the most improved firm, and we have almost 80 analysts ranked in the U.S. and Europe. We care, and we would be incredibly grateful for your five-star votes in the U.S. survey that is currently running, especially for the tech team attending this conference and helping make it happen.
Many of you in this room have been instrumental in our journey. Thank you for your trust and support. It is now my pleasure to hand over to Colette Kress, CFO of NVIDIA, and Blayne Curtis, Jefferies’ Head of Semiconductor Research. NVIDIA has been a trailblazer in the tech industry, revolutionizing fields such as AI, gaming and data centers. Under Colette’s financial leadership, NVIDIA has achieved remarkable growth and innovation, making it one of the best performing and most exciting tech stocks globally.
Welcome to the Jefferies stage, Colette and Blayne.
Blayne Curtis, Head Semiconductor Research, Jefferies: Alright. Thank you all for joining. I’m Blayne Curtis. Obviously, you know Colette Kress, and I’m very happy to be kicking off the conference with NVIDIA. It’s obviously been an incredible story over the last couple of years, with AI particularly.
I think we want to start on the demand side, because one of the new, interesting drivers is sovereign AI. I think Jensen has talked about it as the next growth driver. In fact, I think at GTC he talked about how sovereigns could maybe be the biggest spenders, he said, among non-CSPs. So maybe just kicking off there. Obviously, you’ve been talking about it for several quarters.
There have been some Middle East announcements, and I think Jensen promised some European ones. You have GTC Paris coming up here. So thank you for joining, and maybe start there.
Colette Kress, CFO, NVIDIA: Yes. Thanks so much for having us here. I’m pleased to be here. It’s been a while since I’ve been here at the conference and been able to speak to so many of the investors. Really appreciate that you all came out for today.
I have a little bit of an opening statement that I have to say. Before we begin, as a reminder, the content of this meeting may contain forward-looking statements, and investors are advised to read our reports filed with the SEC for information related to risks and uncertainties facing our business. Well, I want to talk about some of the things that occurred over the last couple of days. Jensen was here in the U.K., working with the Prime Minister.
The Prime Minister and Jensen together really worked to develop opportunities within the U.K., focusing on that sovereign piece of it. And we will be looking to build out infrastructure here in the U.K., supporting many of the industries that are here, many of the startups that are here, focusing on what they can do for AI.
We know this is an important time to help them, help them in terms of building that AI infrastructure, just to start that fuel that’s going to be necessary for their AI solutions. Now, thinking about that, here in the U.K. there has been a lot of discussion referring to it as the Goldilocks place. And the Goldilocks place was really a common way that we try to think about the importance of the great talent that is here, the great AI talent, the great startups that are here in the U.K., and we couldn’t be more proud to be there.
So we will also be at GTC Paris, that is correct. Shortly after today, we head on over to Paris, where we will also be talking about sovereign in a bit different part of the world, in terms of the EU. So sovereign is a very big piece and a focus of where we are concentrating. Keep in mind, the world of AI has moved probably faster than any other technology across the globe that we’ve seen in history.
From the onset of what we saw with ChatGPT, there was an instantaneous understanding worldwide of how important AI would be for our future. And all countries, all enterprises, all people, all consumers are thinking about how AI would work there. We’re happy to be just a proud partner in so much of that work, in terms of our platform and what we’ve put together. But sovereign is a big piece. We have been in the Middle East, as you indicated, and we ended up speaking with not only Saudi Arabia but also the UAE. I think it led to what you heard: the tens of gigawatts that would be available through many of those nations.
It was an important time because it was the U.S. government together with the Middle East leadership as well. And I think that will be a great start for such an important part, and for what they can do to influence things both from the capital and data center side, with our help from a platform perspective. How large is sovereign?
How large is sovereign is always the question in front of us, but it is going to be a very, very large piece. Look at it from this perspective: every country will need its own ability to have its AI within the country. Rather than using just the standard foundational models, some of which are available in the United States, you’re going to see many of these foundational models begin in a lot of the countries that are here. That’s the ability for you to have your own language, your own culture, your own data that you will likely want to keep inside of that country.
That’s why the sovereign piece is such an important piece for us. It will be just as your GDP would likely be, growing as your GDP does and being a very big part of that. So right now we see probably tens of billions of dollars that will be surfaced. But again, when you look at the size of this, over several, several years it could be approaching close to $1 trillion. So these are key areas, why we’re here, why we’re here in this part of the world and why we’re focusing on a lot of different parts, because sovereign is going to be a big piece.
Blayne Curtis, Head Semiconductor Research, Jefferies: I want to follow up, and you partially answered it, but the question I get a lot is, who’s the ultimate customer? You have a sovereign-funded data center in the Middle East, say. Is the customer going to be Microsoft, and it’s just a regional data center? Or, as I think you answered, will there be specific national efforts, models, data and such? Maybe you can elaborate on that. And then in terms of timing, I get this a lot as well.
We’ve seen some announcements. I’m assuming these are massive data centers, gigawatts, where you probably need to build buildings and then fill them. So maybe you can walk us through a little bit of the timing behind some of these statements.
Colette Kress, CFO, NVIDIA: Yes, we get a lot of discussion, and there are a lot of folks interested in being a part of sovereign. All will be partners within what will be built in terms of sovereign. What is necessary in each country will probably be a little bit different: what participation does the government have in many of the countries? Remember, they’re also very important in terms of the telecom business, or what we need for the Internet. You can imagine that what they are using for AI will also be partly backed by the government.
What we’ll see, though, is not necessarily a standard model; every country will probably do that differently. But the governments are very focused on what they need to do to support the country as a whole, and they have been a very big part of a lot of the fundraising that will be necessary. You then go into who that builder is. The builder can absolutely be the CSPs that you’re seeing, but you also see a new brand surfacing that will also be very important, which we often refer to as the neoclouds. These are regional clouds that will be stood up, that may not be standard like the larger clouds you see, but are really customized and focused more on a private cloud, providing specific data or a specific model for one or two different types of customers.
These may also be what you’ll see in terms of enterprises in these nations, enterprises building AI factories through these neoclouds. So many can contribute to that. Many of the European Union folks had seen supercomputing as an important industry. This can be a focus of moving toward AI, included in the accelerated computing focus that they also had on supercomputing.
So a lot of opportunities for all to join, from that perspective. Now, how soon? What we heard in the Middle East, for example, were some of the important foundational things that lead to these types of builds: the groups that are focusing on where the power that is necessary for these data center complexes will come from and how it will be put together. So those are some of the things that we’re already seeing, each going hand in hand with what we’ll build in AI.
Blayne Curtis, Head Semiconductor Research, Jefferies: So I want to finish up on the demand side. I mean, to begin this year, there were a lot of questions about the sustainability of the level of spend, which you’re going to get when you see that kind of growth. I thought it was interesting that Google talked about serving 480 trillion tokens in a month, which is up 50x. And we’ve heard comments from the CSPs that they don’t have enough GPUs, they can’t serve the inference that they need to.
I’m just curious, from your perspective, maybe wrap in the demand for Blackwell and just overall demand, what is going to be necessary going forward.
Colette Kress, CFO, NVIDIA: Yes, foundational models continue to be trained, but new and advanced models are very, very predominant at this time. What you see is reasoning models taking a significant amount of compute. The three scaling laws are still a big part of it, from the foundational part, and moving into reasoning models, you see that a significant amount of additional compute is necessary. So next come new models and the future of agentic types of models.
Agentic models are essentially doing work for you, not just reasoning and giving you answers. It would be great to see so much of the work that we do today, so much of the manual work, done with some of those agentic models. Blackwell has been engineered specifically for a lot of those reasoning models and particularly for inferencing. Right out of the gate, when we shipped our GB200 NVL72, several of our customers stood it up just to look at the size of the inferencing improvement. The inferencing improvement, as we have now focused on accelerating just about every part of that Blackwell infrastructure, has been key.
That software platform is also very important in terms of influencing the inferencing performance. And as you’ve seen, what they can do in terms of token generation is an X-factor greater than anything that they’ve seen before. We’re seeing folks actually use our Blackwell directly for inferencing, not just for the training up front. Both of these are important factors that are driving them. So many of our customers absolutely see more and more need for more compute as we continue to scale.
So it is not just focused on one industry or one part of the world; each and every industry is growing. So yes, as we recognized in the guidance that we provided for the quarter, we do see strong growth. We see strong growth in terms of Blackwell even against the backdrop of some of the challenges that we’ve had with what we’re able to ship to China.
Blayne Curtis, Head Semiconductor Research, Jefferies: I want to ask you about the China market. Jensen talked about it being a $50 billion market. He’s been quite vocal that he’s against the restrictions that you guys have seen. Obviously, you had the diffusion rules that went away, but that was another area that I think you spoke out against as well: trying to address this demand and be the one who does it versus something homegrown. So maybe you can just talk about that. Post the H20 ban, Jensen made the comment that a cut-down version would maybe not be competitive, and that really you shouldn’t think about you guys addressing China.
There are still rumors that you could cut down a chip and still address it. So maybe you can talk about why China is important, and then what is your plan to address, or not address, that market?
Colette Kress, CFO, NVIDIA: Yes. So in the middle of our first quarter, we received notice from the U.S. government that we would not be able to ship our H20. Now keep in mind, our H20 going to China was the only product of significance for the data center that we were doing there, through a lot of work in terms of what we developed for them and a lot of back and forth with the U.S. government, with continuous approval for them to do so and what we brought to market. And unfortunately, they chose not to allow it to go.
Now that means where we stand is in a situation where we really don’t have anything for that market. We’ve discussed that it wouldn’t be appropriate for us to just start a new chip at this point, because essentially the H20, compared to our Blackwell architecture, was significantly lower in terms of what we were being able to enable in China.
That was about a 25x change from an H20 to what you would receive in terms of a Blackwell. So we knew this takes a discussion with the U.S. government if there is anything new that we want to do. And we know that our work in China is not about us alone, because remember, there is domestic competition in China when you are not able to ship your best there.
So at this time, we are going to continue to work to see what would be possible, what we could do, given that we’ve gone through this now and have had to stop in the middle. That’s not something that we want to do going forward. It’s a big market, though. China is a very, very big market. Just today, or this year, it could probably be about a $50 billion market.
That’s a great opportunity for us to continue to innovate, continue to build the platform from the U.S. to the rest of the world, and we think that’s an important market for us to go after. So again, we’re still in discussions with the U.S. government, and we’ll see.
Blayne Curtis, Head Semiconductor Research, Jefferies: I want to ask you about the supply side, which was another concern entering the year. If you look back at Hopper, it was availability of CoWoS and supply issues more at the chip level. This time around with the GB200, it’s more of a system issue, and it’s not that you ran into one huge problem; it’s probably a lot of little problems in just standing these racks up. So I think the interesting comment that was made on earnings was that you actually shipped 1,000 racks per customer per week, which is obviously a huge number, but I think the point was that you’re starting to catch up. So maybe you can elaborate on just the supply equation. You’ve had a decent Blackwell number in terms of revenue, but people look at these downstream data points and the amount of racks that the ODMs can produce, and it had been quite a low number to start the year.
How is that improving?
Colette Kress, CFO, NVIDIA: Yes. So our Blackwell architecture was a phenomenal decision on an architecture change. What we did is we pretty much shipped to our customers a full data center scale, versus what they had seen in, for example, the Hopper architecture, which was a standard, classic configuration of what we would be selling, which would be a motherboard with about eight GPUs in it. So moving to what we did with Blackwell was about the importance of understanding that each and every part of that data center needed to be accelerated, to focus on continuous performance improvement and the best efficiency, even from a power perspective. So that configuration, as sophisticated as it was, with probably about 1.2 million different components in it, landed with many of our system integrators, our OEMs and ODMs, who work to do pretty much what they would do to build out a data center and get that into market.
So nothing unique about that. It was just the change from what they had received earlier to getting a full data center. All is moving quite seamlessly now. And yes, we are getting them back up to levels where they can move what they had received and get it stacked into the data centers, all racked up, and many of them have already started their work in terms of starting workloads on those systems. We’ve also indicated that what is important for our Blackwell is our next architecture, moving to the 300 series, or Blackwell 300.
It will be pretty much the same architecture, same electronics, same mechanicals. A change in terms of the chip or a change in terms of the memory is probably the only change to that. The customers are well briefed now on how to build out those GB systems, and we’re excited to see that now in the next architecture as well.
Blayne Curtis, Head Semiconductor Research, Jefferies: I was actually going to ask about that, in terms of not only the GB300 but also the next generation, Rubin, on the per-year roadmap; it’s the same rack effectively. So I also want to just ask you, in terms of the concern people had, that maybe there were chips at the ODMs and somehow that wouldn’t clear through. So I mean, I guess, in terms of...
Colette Kress, CFO, NVIDIA: All moving quite well. All moving well. The speed at which they’re moving and what’s going to be important next is getting them on a cadence where they have it all showing up and moving quite quickly. Just as we do with our supply chain, you’re now seeing this to be an important part in terms of standing up racks.
Blayne Curtis, Head Semiconductor Research, Jefferies: And how would you relate it to the H200 transition, which was pretty quick? In just over one quarter, almost all of it switched over. When you think about the transition to the GB300, which you’re sampling now, should it be a similar kind of cadence, given that there’s so much overlap with the platform?
Colette Kress, CFO, NVIDIA: You’re going to see both. You’re still going to see the 200 series and the 300 series ship. Keep in mind, we are still shipping, for example, the Hopper H200. So there is still that continuation. Many of them fill out their data centers for certain workloads.
So you’ll probably see both of them continue over several quarters.
Blayne Curtis, Head Semiconductor Research, Jefferies: I want to ask you about the competitive side, particularly ASICs. I thought it was interesting at Computex that NVLink Fusion allows some permutations with other people’s silicon, whether it’s a CPU or an accelerator. So maybe talk about that strategy, what you’re seeing from ASICs as competitors, and why Fusion, I guess, is the question I get a lot.
Colette Kress, CFO, NVIDIA: Yes. So let’s start with NVLink. NVLink, as you know, is our very important connectivity that has been a part of us for five generations of what we’ve put into market, very important in terms of GPU-to-GPU connections as well as CPU-to-GPU connectivity. For example, in our GB200 NVL72, you have NVLink plus the switching, a very, very important part of the configuration, working with the significant amount of traffic, particularly on the inferencing side. And we had taken the best of breed of what we had seen in our InfiniBand and enabled that now also with our switching and Ethernet as well.
So going back to NVLink and its importance, this is an opportunity for folks to still stay on our platform and get those capabilities. If they want a different CPU, yes, we have our Grace CPU, but another CPU gives them an option, if they want an x86 or otherwise, to still stay connected to our full platform, by having a license to that and working with our networking. It could be the same in terms of ASICs as well. So our opportunity here is to continue to expand the opportunity of our platform, both with NVLink as well as networking.
Blayne Curtis, Head Semiconductor Research, Jefferies: Perfect. And I want to ask you about networking. I mean, I think when you look at your roadmap, it’s not just a GPU roadmap. You have a half-dozen chips there that are all critical in making that system, which I think is the challenge. And we’re going to have a couple of AI days coming up with some of your competitors.
I mean, they’re going to have to answer that equation, how they match the NVL72. So networking was $5 billion, up 64%. Maybe you can talk about the strength you’re seeing. And then also Spectrum-X at $2 billion. So I think we get this question a lot: InfiniBand versus Ethernet, your Ethernet versus others’ Ethernet. What kind of traction are you seeing on the networking side?
Colette Kress, CFO, NVIDIA: Yes, really good. Our networking is doing phenomenally. Just as we discussed, accelerating pretty much every part of that data center is going to be essential for many of these AI workloads. Our networking business continues to expand. You have teams really focusing on how best to integrate with so much of the work that we do in terms of AI.
So we had best-of-breed InfiniBand. But keep in mind, many of your enterprises are on Ethernet, and we created Ethernet for AI, inclusive of Spectrum-X. Spectrum-X has been very important for many of our hyperscalers, and we talked about many of them on our earnings call. What they are seeing is a great solution: Ethernet that would be key for many of the enterprise tenants that they have there, while keeping those key pieces of the traffic that need to be monitored so heavily, plus many other capabilities with that. Yes, it’s reached a very strong level, and we are still also shipping a great amount of InfiniBand as well.
So together with our NVLink, our Ethernet platform as well as InfiniBand, we have a really, really good attach rate in terms of what we’re seeing. The rate at which they are choosing NVIDIA’s networking to attach to our GPUs can be over 70% of what we’re seeing. So it’s moving quite well.
Blayne Curtis, Head Semiconductor Research, Jefferies: You do a great job talking about the strategy. I want to ask a CFO question. So just finishing on the ramp of the GB200, I think the gross margin has been a big focus. And you did guide gross margins up sequentially and talked about the mid-70s by the end of the year, or in the future, maybe; I don’t want to put words in your mouth. Can you talk about what needs to happen to get those gross margins to the mid-70s?
Colette Kress, CFO, NVIDIA: Yes. So we made a pretty big change moving to Blackwell in terms of that configuration and all the different types of components. We were in catch-up mode for a good part of getting Blackwell to market, and we have now gotten to a fairly solid ramp. And that’s going to be able to assist us in improving those gross margins as we get more volume and more that we can work on in terms of the yield and the cost pieces of that, together. Let’s not forget all of those different components that are there to put together.
So we absolutely made progress in Q1. We are guiding for continued progress in Q2. And yes, we do see that path towards the mid-70s before the end of the year.
Blayne Curtis, Head Semiconductor Research, Jefferies: We’re running out of time, and I do want to ask: obviously, the data center and AI are the biggest part of the story, but I wanted to ask about gaming. You saw a great deal of strength there. AMD saw strength as well. I think the question people have is, are we seeing some sort of gaming cycle?
Is it AI that’s driving the demand? I’m just curious, from your perspective, what’s driving the strength in gaming this half?
Colette Kress, CFO, NVIDIA: Yes. Thanks for the question on gaming. Gaming actually hit record levels, okay, record levels in this last quarter. But keep in mind, it’s record levels and we are supply constrained. And so we’ve been working feverishly on getting our Blackwell architecture to market and the volume that we need to serve those customers.
And I think we’re getting stronger and stronger each quarter in terms of that size. What are they excited about? They are excited about gaming. It is still such an important industry, but there are other use cases that you can see in terms of AI, AI with a PC. In the future this is going to be an important part, whether that be for your creatives, your independents, those that are really working, but now you have a great AI PC just as much as you have a great gaming PC.
So there is more growth, and more Blackwell for gaming, to come.
Blayne Curtis, Head Semiconductor Research, Jefferies: In terms of expanding the story out, I want to ask you an open-ended question about where you see the biggest opportunities for AI over the next decade. You hear stories on, obviously, you’ve been in autos for a while. It’s funny, we’re getting like a renaissance of autonomous driving, if they don’t burn all of them in LA. But then you hear talk of humanoid robots, where you might have 10 per person, the biggest market ever. Obviously, it seems futuristic, but it may not be as far away as you think.
Obviously, in the data center there are lots of applications on the R&D side. So maybe you can elaborate on where you see it all. Where are you most excited over a longer horizon?
Colette Kress, CFO, NVIDIA: Yes. There’s a lot of amazing work being done that will really influence so much of the AI work. Starting with what we see right out of the gate: there are so many different software applications at an enterprise level, and infusing AI work within those is absolutely what you see so many of those companies working on. So those are going to be some of the things that you see. As for the focus of other enterprises, not a single enterprise on the planet doesn’t have a call center.
And wouldn’t they just love the ability, using AI, to make that as efficient as possible as well as a great experience for their customers, and to pull that together. More and more agentic work will begin at the enterprises. Agentic work that says, I can get that work done in the hours that I’m not at work, so that I can walk into the office and it has gone through the reasoning, gone through a phase that says, what kind of work should be done? I can actually see that in something such as a finance organization that asks, how do I decide what we need to do in terms of booking accruals and those types of things. So a lot of agentic work can happen in terms of AI.
But you brought up a good area, kind of the next focus. Automotive, AV cars, EV cars, such an important industry. And yes, we have been working ten years to really see a lot of the robotaxis on the road, or the Level 2, Level 3 end market. But it’s also an introduction to another big industry, which is physical AI and/or robotics. Change out some of the things that you see in terms of automotive, and you can see that exact same thing coming through in robotics.
Robotics in terms of the humanoids and the multiple brains, the brains that will be back in the data center and the brains that will actually be inside of the robots, providing that landscape for them to actually do work as well. Manufacturing and industrial AI are very top of mind and very important in this part of the world, on the European side, as well. Those are some of the big things that we’ll see in the future.
Blayne Curtis, Head Semiconductor Research, Jefferies: All right. Well, perfect. We’re already out of time, but thank you for joining. Thank you to everybody for coming as well. Thank you.
Thank you.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.