Arista Networks at Wells Fargo's TMT Summit: AI Drives Growth

Published 18/11/2025, 21:22

On Tuesday, 18 November 2025, Arista Networks (NYSE:ANET) presented at Wells Fargo's 9th Annual TMT Summit, offering a strategic overview that highlighted both opportunities and challenges. The company underscored its strong market position and ambitious growth targets, driven by AI and campus networking initiatives, while addressing concerns over Q4 guidance due to industry-wide component constraints.

Key Takeaways

  • Arista targets 20% growth for the next fiscal year, with significant AI revenue contributions.
  • Deferred revenue surged by 87% in Q3, fueled by AI data center projects.
  • The company aims for $2.75 billion in AI-related revenue in 2026.
  • Component constraints are industry-wide, not specific to Arista.
  • Arista is expanding its customer base in the AI market, targeting 25-40 new clients.

Financial Results

  • Arista's Q4 guidance sparked investor debate, projecting only 2% growth.
  • The company is confident in achieving 20% growth in the next fiscal year.
  • AI-related revenue is expected to reach $1.5 billion in 2025 and $2.75 billion in 2026.
  • Campus networking revenue is targeted at $1.25 billion.
  • Gross margin fluctuations are influenced by customer mix and E&O inventory management, with steady inventory turns between 1.1 and 1.3.

Operational Updates

  • Three out of four initial AI deployments are nearing production, with the fourth transitioning to Ethernet next year.
  • Arista is engaged with 25-40 potential AI customers, including enterprises and sovereign states.
  • New campus networking technologies have been released, featuring identity and VeloCloud integration.
  • Supply chain strategies mitigate component constraints, focusing on multi-source and proactive negotiation.

Future Outlook

  • AI and campus networking are projected as key growth drivers.
  • Arista plans to increase its campus networking market share beyond the current 5%.
  • Strategic initiatives include a land-and-expand approach for new customer acquisition.
  • The company is preparing for growth in scale-up networks by 2027.

Q&A Highlights

  • Deferred revenue is increasingly product-driven, with recognition timelines extending to 18-24 months.
  • Blue box solutions offer Arista's hardware with customer-specific software, seen as a dual-source strategy.
  • Arista emphasizes the distinct value of its blue box solutions over white box alternatives in the competitive landscape.

In conclusion, Arista Networks remains optimistic about its growth trajectory, leveraging strategic initiatives in AI and campus networking. For further details, refer to the full transcript.

Full transcript - Wells Fargo's 9th Annual TMT Summit:

Unidentified speaker, Host: Why don't we go ahead and get started, try and keep us on schedule here. Extremely excited to host a 35-minute discussion with the Arista team. We've got Chantelle Breithaupt, obviously the CFO, and we've got Martin Hull, Vice President, General Manager of Cloud and AI Platforms for Arista. You know, if there's time at the end, I might ask, you know, anybody that has a question, please raise their hand. I'm gonna jump right in.

Chantelle Breithaupt, Executive, Arista: Sure.

Chantelle, thank you for joining us. Martin, always good to see you. You know, I'm gonna just start here because it came up, you know, a lot post your recent earnings. Company put up great results, as expected. The debate seems to be the 2% guide, right, on the Q4. Maybe we could start by just talking a little bit, and Martin, jump into what you're seeing from a supply chain component perspective, how that's maybe affected some of the shaping and the timing of, you know, product availability. Just walk us through the puts and takes around that right now.

Yeah. Sure. Hi. Good morning. Good morning to those in the room and those on the webcast. I think that I would kind of almost decouple those two things, Aaron, and I think it's a great question. Thanks for bringing it up. As regards Q4, FY2025, and FY2026 in general, we don't see any constraint issues on revenue. You know, anything that we see in the industry, we've addressed through our purchase commitments, which we raised to ensure we have the supply we need this year and next year and perhaps beyond. Also, take a look at the fact that we were confident in the 20% guide for next year as early as September at our analyst day. I think that's the earliest you've seen Arista come out with such a bold guide. We're very enthusiastic about that.

I think the constraint in the industry is across the industry. It's not Arista specific. We just wanted to make sure, as Jayshree said in the earnings remarks: hey, just heads up, there are some things here. It could be memory. It could be fab capacity. The revenue guide, you know, reflects the fact that we have a pragmatic style. I would encourage you to think about the growth both in what you see in the P&L and what you see in deferred. You saw deferred revenue grow 87% in Q3. I encourage everyone to look at the two combined. You know, we'll see where we end up in Q4 and how we finish the year and how we guide in February for 2026. We're very excited.

For those who have known Arista and Jayshree for a while, hopefully you heard her enthusiasm in the Q3 earnings call.

Yeah. I would say, you know, that I think it was the first question on the call. She said, "I've never been, you know, felt this good about the growth that we see in front of us, going forward." You know, talk a little bit about the deferred. You know, how do we think, I think, you know, 18 months gets thrown around. That's more of an average. You expanded your product deferred, which, you know, maybe you talk on a total basis. But, you know, product deferred went up another $625 million this last quarter. I think the prior quarter was $687 million. Extremely robust growth in product deferred. You know, that pragmatic approach to guidance, how do we kind of unpack, you know, product deferred vis-à-vis revenue, you know, generation? How is that factored into the calculus when you think about guides?

Yeah. Sure. I appreciate the opportunity to provide further education because Arista's business model is changing in the sense of how deferred plays. You know, when I started in January 2024, and I think if you were to take that time and earlier, deferred was cloud build-out services, your regular kind of maybe 6 to 12 months. As we progress through 2024 and 2025, now we have some of the largest, most complicated data center build-outs for AI centers. Those take a lot to come together. They're based on acceptance criteria. We have moved from 6-12 months to 18-24 months for some of these larger deployments. In the modeling, I would encourage you to think about that time shift difference. You'd lean more towards 18-24.

Given the amount of growth in deferred, it's related to these AI build-outs and the new products. That's how I'd ask you to consider that from a modeling and timing perspective.
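
To make that timing shift concrete, here is a minimal, illustrative sketch in Python of the modeling point, not a statement of Arista's actual revenue-recognition policy: it simply assumes each quarter's product-deferred addition converts to revenue at acceptance after a fixed lag, and the dollar amounts and lag lengths are hypothetical placeholders.

# Toy model of the deferred-revenue timing shift described above.
# Assumption (not Arista's accounting): each quarter's deferred addition is
# recognized in full at acceptance, a fixed number of quarters later.

def recognized_by_quarter(deferred_additions, lag_quarters):
    """Map quarterly deferred additions to the quarter they hit revenue."""
    revenue = [0.0] * (len(deferred_additions) + lag_quarters)
    for quarter, amount in enumerate(deferred_additions):
        revenue[quarter + lag_quarters] += amount
    return revenue

# Hypothetical product-deferred additions per quarter, in $M
additions = [687, 625, 600, 600]

print(recognized_by_quarter(additions, lag_quarters=3))  # ~6-12 month acceptance window
print(recognized_by_quarter(additions, lag_quarters=7))  # ~18-24 month acceptance window

The same deferred balance reaches revenue either way; the longer acceptance window just pushes it further out, which is why the speakers ask investors to read the P&L and deferred growth together.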

And just to be clear, you know, Martin, you've, you know, power availability, shell space, you know, component dynamics. Have you seen any kind of indicators or, you know, dynamics that have changed deployment plans for some of these larger projects?

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: I think when we talk about commissioning a new data center from literally breaking ground, then you're talking about a, you know, more than 12-month horizon on the construction cycle. That can get delayed for permitting or power or who knows where in the architecture and supply of a physical building. That's not on our timeline. Right? The timeline where we get involved is the customer's talking to us about what they want to deploy in that data center and when they want to deploy it. When we get to, I don't know what the phase is, the building's dry, right? It's got a roof on it. It's got power. At that point, we're having in-depth conversations about what architecture, what density, what product choices are gonna go in there. That can vary by plus or minus a quarter.

We're not talking about plus or minus a year at that phase because the building's, you know, up, powered, and they're getting ready for deployment cycles. We can have conversations with the customers about, "We're thinking about this, but what about this?" That variability comes into product choice, that variability maybe comes into exactly when and where. Let's say plus or minus a quarter is kind of that ratio on there.

Yep. And then, you know, everybody's got, you know, different ways to try and kind of work backwards in the math and how we think about the networking opportunity. I think one of the things that's very clear is the networking opportunity is only getting more relevant as these clusters get larger. We go from scale out, scale up, scale across, and, you know, complexity will continue. When we think about, you know, pick your number, gigawatts of deployment, or we think about Lisa Su's comment last week of a trillion dollars of AI, you know, silicon TAM, how do you think about, or how has it evolved your thoughts around, the networking piece of that opportunity?

It's very difficult to pick down from the power or even the spend on the GPU or the accelerator side of it. It's very difficult to kind of go, "Okay. Take that number, divide by 4, and divide by another 1,000." We kind of work from the bottom up, right? If the price of the GPU doubles or halves, you know, that's what we're gonna do. I don't think that changes the value of the network.

Yep.

That divider is the problem. I tend to work from a bottoms-up approach. How many ports? How many blades? How many switches? How many interconnects in a given network architecture? 9,000 nodes? 16,000? 32,000? Okay. We can do that. If I try and work backwards to how much that CapEx is on the compute side of it, I do not know that we got the information on the pricing side of all of that. We certainly do not have information on the cost of the building.

Yep.

Mm-hmm. We tend not to try and divide that down. You guys can all do that. I'm happy for you to try and give me the answers.

Chantelle Breithaupt, Executive, Arista: Mm-hmm.

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: We tend to work from the bottom up in terms of how many ports. 400 gig, 800 gig, going to 1.6. Okay. Then I can do some math. We have a product choice as well. That portfolio of products from our X series to our R series can introduce another level of variability.
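
To show what that bottoms-up port math can look like, here is a rough Python sketch for a generic, non-oversubscribed two-tier leaf/spine back end fabric. The switch radix, the one-fabric-port-per-GPU assumption, and the cluster size are illustrative placeholders, not Arista product specifics or customer figures.

# Rough bottoms-up sizing of a 1:1 (non-oversubscribed) two-tier leaf/spine fabric.
# Assumptions: fixed-radix switches, one fabric port per GPU, half of each
# leaf's ports face GPUs and half face spines.
import math

def size_two_tier_fabric(num_gpus: int, switch_radix: int = 64):
    gpus_per_leaf = switch_radix // 2
    max_gpus = gpus_per_leaf * switch_radix  # two-tier ceiling for this radix
    if num_gpus > max_gpus:
        raise ValueError(
            f"{num_gpus} GPUs exceeds the ~{max_gpus}-GPU two-tier limit; "
            "a higher radix, modular spines, or a third tier would be needed."
        )
    leaves = math.ceil(num_gpus / gpus_per_leaf)
    uplinks = leaves * gpus_per_leaf          # leaf-to-spine fabric links
    spines = math.ceil(uplinks / switch_radix)
    return {
        "leaf_switches": leaves,
        "spine_switches": spines,
        "gpu_facing_ports": num_gpus,
        "fabric_links": uplinks,
    }

# e.g. a hypothetical 2,048-GPU pod on 64-port (say, 64 x 800G) switches
print(size_two_tier_fabric(2_048, switch_radix=64))

Counts like these are what per-port and per-switch pricing gets attached to, which is the bottoms-up direction described here rather than dividing down from GPU CapEx.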

Mm-hmm.

You mentioned scale across. Scale across is kind of a data center interconnect. That incremental revenue is smaller than the total spend on the backend network itself. It is another factor. You cannot know from how much they're spending on the building how much they're gonna need for a data center interconnect. Those two things are independent, really.

Yep.

Chantelle Breithaupt, Executive, Arista: Yeah. The only thing I would add to it, in addition to what Martin's comments were, was that if you, when we were talking maybe two years ago, we would get asked the question of a data center build-out, what percentage are you overall? I think at that time we were using basically high single digit, low double digit, kind of the 9%-11%. Our best estimate, 'cause we're still finishing some of the larger AI centers, is maybe that's moved to, you know, 5%-7% versus the 9%-11%, but still TBD. Just to give a framework versus what we used to use to try to solidify that.

And just to be clear with the audience, obviously the definition of what is AI differs depending on what company you talk to, right? That is just switching. It's not anything else. There's no transceivers. There's no optics. There's no, right? Because that definitely differs from, you know, if it's NVIDIA or some of your other peers that report these AI networking numbers. Correct?

Yeah. I would say that the difference between the 5 to 7 and the 8 would be if optics were in or out, likely.

Okay.

It could be, but everything else we would exclude.

Okay.

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: Yeah. Yeah. When we talk about AI numbers, we're talking about the switches, the routers.

Yeah. Exactly.

The physical devices. We're not counting optics in there.

Okay.

We're definitely not counting NICs 'cause we don't have any.

Yep. Exactly. The company, you know, over these last several quarters has consistently talked about, you know, the four customer deployments, the 100,000 GPUs. Can you just remind us of where we're at on, you know, those deployments and where we're going, you know, maybe, or how we should think about where we're going over the next 12 to 24 months?

Chantelle Breithaupt, Executive, Arista: Yeah.

Pick your horizon.

We're very excited. You know, obviously we have an expansion beyond these four. We talked about these four 'cause they were indicative as we started to get into this AI cycle, what it meant to us as a company. I think that the four are going well. Three of the four are coming into production this year, maybe into January. Basically, you know, we call it December 32nd kind of thing. So very close. The fourth one, which is a transition from InfiniBand to Ethernet, that one's expected and on track for next year at the earliest from the sense of revenue recognition for us. Sorry. We have lots of other AI center opportunities. We talked about basically 25-40 other customers between tier two enterprise, specialty provider kind of customers and NeoClouds, sovereign states.

We're very excited about those conversations as well.

Yep. And that ultimately underpins, right, the $2.75 billion guide for AI front end, back end, you know, for 2026 versus the $1.5 billion, you know, for 2025. How has your, in these customers, how has it evolved in terms of your, you know, breadth of deployment? And any kind of anecdotes that you can share of like, "Hey, you've won these customers, but it's expanded your upsell, if you will, opportunity in these?"

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: There's two axes on that one. That is within the, let's say, top four.

Chantelle Breithaupt, Executive, Arista: Yeah.

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: Within the top four. Could be five. Could be six. Within the top customers, we're having more deployments. They started in the second phase, third phase, and we're keeping rolling. Now, these are customers who, many of them are household names. They're not gonna slow down doing AI deployments. That continues to grow in multiple locations, both in the U.S. and internationally. Then there's the set of next customers. These are large customers in their own right. They're not necessarily in that top tier. For each one of those, there will be a first phase and then typically a second phase and occasionally a third. Then maybe they'll stop because they've achieved their business goals. We are seeing, you know, we've talked about it on the earnings. We're talking about, you know, 15-20 customers in that vanguard.

They're not all NeoClouds. They're not all sovereign wealth funds. There are some AI as a service providers in there. We're seeing that broader base of customers who are putting their first deployments in calendar year 2025 that will go through a pilot to production. Calendar year 2026 is back to second generation or an expansion of that. It is getting broader. The ones that we're in first with are getting deeper.

Yep. So the $1.5 billion to $2.75 billion, would you say that that's majority heavily driven by just your top four hyperscalers, or is that?

Chantelle Breithaupt, Executive, Arista: No. It's across the customer set. You know, it's across the four and the 15-20. And there's possibly pull-ins from, like you were mentioning. There could be campus if there's more.

Sure.

Inference and agentic happening in their own companies. I just wanna take a moment to friendly remind that's revenue recognized, not orders, right? We talk about revenue. We are talking about $1.5-$2.75 billion revenue recognized.

Yeah.

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: Yeah.

Great point. You touched on it, and I think, you know, scale out is where the predominant majority of your business is in the AI. The scale across is like what might have been called DCI not too terribly long ago, has now become scale across. Scale up is still opportunity set, you know, kind of TBD more 2027 than it is 2026. Can you help us maybe how do we think about Arista in the context of the scale up opportunity?

Chantelle Breithaupt, Executive, Arista: I'll start in the sense of.

Yeah.

Positioning, and then Martin can speak, you know, to how it works from a product and technical perspective. At Analyst Day in September, we talked about Arista Networks having a $105 billion TAM growing from the $70 billion we had said just a year before. Very excited by the TAM growth. In that, there is no scale up TAM. This is a net new TAM to that number. Just to position, you know, we're still exploring what that TAM could be, waiting for the Ethernet conversation for this scale up conversation. Maybe Martin, you could talk about some of the things we have to go through to get to that.

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: Yeah. I'm trying to think where to start. You mentioned the scale out network. The scale out network is a multi-tier, typically two-tier infrastructure to allow these thousands to tens of thousands of GPUs to all talk collectively across one single infrastructure. For that, we've got our portfolio of X series and R series products today that address that well. We put them in the two-tier networks, and they're eager to go. Those products are purpose-built for AI, but those are also the same products that customers can use on their front end networks for building the client connected network. When we think about this next generation of scale up, it's a vertically integrated, tightly integrated network architecture. Effectively, it's a new set of products that are built with the customers to physically integrate into their physical infrastructure.

It's gonna take two or three key differentiators there: next generation silicon, next generation products, and then those customers coming to us and working with us on the requirements, the definition, the delivery, and then ultimately the release of those products. That's why it's different to the scale out. We're perfectly positioned for that in that we have access to the technology. We have the best engineers in the industry, I would claim, right? We've proven what we can do with these customers. They are coming to us and asking us to co-partner with them on developing these next generation solutions for scale up.

And I think you've brought up a good point with me in the past. Like, it's important to understand, like, excluding the leading vendor of GPUs, everything else is attached to Ethernet, right? Scale up across the entire fabric architecture, right? So Helios from AMD might be a very well-positioned, you know, rack scale solution for a scale up opportunity or, or pick your other piece of.

Yeah.

Silicon out there.

Yeah. At this point in time, if you're doing a scale up network, you're probably using an NVL generation technology. We've seen at the OCP conference last month.

Yep.

The introduction of the ESUN technology. That is an evolution of the scale up Ethernet. ESUN is Ethernet Scale-Up Network rather than scale up Ethernet. Move the letters around. Get another terminology. ESUN is a multi-vendor effort with customer participation in building a specification that everyone can work to. You do get this choice of products, choice of vendors, so that when you are moving into the late 2026, 2027 generation, you have vendor diversity, supply chain diversity, and de-risking of these technologies. That is where we are heavily engaged. We were one of the leading partners of that ESUN initiative. Tightly engaged with the technology and the customers to make sure when we bring it to market, it is not single vendor technology.

Yep. Yep. That kinda maybe ties or segues a little bit to the blue box narrative. Blue box, white box. I mean, that's, you know, white box has always been a persistent, you know, topic of discussion competitively. You know, maybe help us, you know, one, what have you seen, you know, from a competitive dynamic vis-à-vis white boxes? And what is, you know, what is Arista's blue box strategy? And I'm guessing it does tie back to maybe scale up over time. But, you know, you've been participating. Just walk us through kinda white box, blue box, you know, dynamics for the company.

Chantelle Breithaupt, Executive, Arista: Yeah. We'll tag team this.

Yep.

There's probably a lot of conversation here. Again, we hold the position that nothing has changed from a white box dynamic. From the perspective there, our view is white box can absolutely grow. The whole market's growing, and so there's room for everyone to grow. They grow in some of the customers we're not even in. You know, you take some of them, they're with Amazon and Alphabet, you know, and they've been in that white box even before Arista started. There's going to be growth there, and we recognize that, you know, in our hyperscaler conversations, where white box is usually where you have the ability to have the engineering team to support, you know, SONiC or FBOSS on it. Those are at the larger customers.

White box, I think, apples and oranges compared to blue box and Arista branded. One example I'll give you where blue box is great and where we use it today before I pass it over to Martin is, you know, a lot of our largest customers, they need to and they should have dual source vendor strategies, right? You never wanna place any one large company on one vendor. Where blue box can be very useful is a dual source strategy where it's Arista hardware underneath, and some of the Arista hardware is running EOS, and some of it's running, you know, SONiC or FBOSS. That allows the dual source strategy, but it has the Arista hardware underneath. That gives them the flexibility to say they have dual source, which is great.

That's one example of a use case that's existing and a great application. You know, blue box generally is in our guide. Sometimes we get a question, you know, what's the margin impact from blue box? For us, it's in our guide. It's in our actuals. Nothing's really changing there from a material perspective. Then Martin, maybe you wanna talk about some of the other blue box.

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: Yeah. So an Arista blue box isn't really new. It's something we're talking about now, whereas it's something we've been delivering with some of these large customers for multiple generations of technology. They came to us and asked us to partner more closely on developing a product, a box for them. They wanted their own technology inside, but they also wanted the benefit of all the Arista design, manufacturing, supply chain, diagnostics, the 1-800 number you call for support. When we take all that packaging together from the fundamentals of an engineer designing something, mechanical engineering, thermal, packaging, everything that goes into making the Arista product today is what they want us to tap into. They didn't want to get an off-the-shelf white box. They wanted an Arista product, but with their own code running on top of it for the software stack.

That's something we've been doing for a while. There are many examples out there. We've partnered with a couple of these larger customers, and now we're talking about it more broadly as a way to help the investor community and customers generally understand what it is that differentiates Arista from an off-the-shelf product. Design, manufacturing, supply chain continuity, multi-sourcing, the diagnostic software that we run even on the design level before we run it on the manufacturing line, for the test, the release, the firmware that we're running on some of the FPGAs that are inside these systems is all Arista. It's that Arista value. That blue is the Arista blue, blue capability. It's very difficult to say where does the line start and stop. We've put some documents out there to kind of show where these layers are.

You're not just taking an off-the-shelf Broadcom silicon, putting it on a standard reference design and shipping it. There's a lot of value. We've seen examples where an Arista product compared to a standard one, we have lower power, we have better thermals, we have higher reliability. These are tangible benefits to the customer of taking an Arista product even if they don't run Arista's operating system, EOS. Now, of course, we'd prefer that they do that. The other benefit is if they take this blue box and they run their own network operating system, they can at any point refresh it to run EOS. It gives them some investment protection in terms of that technology. They're deploying it in one role. They wanna redeploy it in a different role, they can start to run EOS on it.

I think that's something that, you know, these are all the value propositions of blue box as against a standard white box.

Yeah. That was very thorough. That was great. Sticking on the competitive dynamics, you know, AI, you've got the largest GPU vendor wanting to participate, obviously, deeply in the networking stack, a full stack, you know, solution, walled garden, you know, approach. You've got, you know, obviously, your biggest competitor on the networking side that's talked about AI. How would you characterize the competitive landscape in these AI fabrics, be it scale out, scale across?

I'll start.

Chantelle Breithaupt, Executive, Arista: Yeah, you go.

Martin Hull, Vice President, General Manager of Cloud and AI Platforms, Arista: On the front end network, the front end network is de facto Ethernet. Nobody would think about putting an InfiniBand technology into a front end network. It's the part of the network where, over the last 10 years or so, we've incrementally taken market share away from who was the number one vendor in the data center. We've done that to the point we've now surpassed them in market share for the front end network. This back end network is very much a net new opportunity for all of us. It's a TAM that's expanding rapidly. Within that, you're gonna get some market share dynamics. We saw two years ago the question about InfiniBand versus Ethernet. I think that question's largely gone away.

Not to say that InfiniBand will ever go to zero, but largely speaking, people have decided that Ethernet is the right technology for the back end network. We are now having a new debate about scale up. We will leave that to one side for now. On that back end network, you have now got a level playing field for Ethernet technologies. Just described how we were successful at the front end of the network. There is no reason to think we cannot be successful at a back end network. Given a level playing field, you take best of breed, you take the software quality, the features, the ability for our engineering team to partner with our customers and make sure we are building the right products at the right time, and then that track record.

You kind of take all that together, and that's a, it's a broad answer rather than saying, "Well, this feature's better than that feature, and we have a product, and they've got a Radix, and we've got a Radix." You know, those are all the detailed answers. You know, I like our chances on a level playing field, and our networking technology, our networking products are best in class.

Yep. That's perfect. Before I go to model stuff, I'm gonna ask this other follow-up to Martin. There's been some debate out there about, you know, one of your M&Ms, your largest, you know, cloud titans. You know, it's gone from a disaggregated scheduled fabric architecture and talked about moving to a non-scheduled fabric architecture. Obviously, they've done stuff with, you know, their Minipack solutions and stuff like that. How does Arista play in a non-scheduled? Maybe walk us through, 'cause I think you guys have, you know, obviously, a switch portfolio that addresses non-scheduled fabrics. You know, what do you think about that?

As I said before, we have a portfolio.

Yep.

We've got the X series with the R series. Both of them are optimized for these large-scale AI deployments. At any point in time, any customer can choose from that portfolio. We don't force them down any path. We've seen in previous generations, these large customers, you can give the names out, have deployed a mixture of the X series and the R series at multiple tiers. For the latest generation, they chose to go with the DSF, which is taking the R series products and basically spreading it out. So disaggregated distributed scheduled fabric. The next generation, for whatever their technical reasons are, I mean, we talked about it at OCP, they're gonna deploy NSF, possibly for timing as well as achieving some scale capabilities. The distributed scheduled fabric, I think they've published results that show it's performed extremely well.

Mm-hmm.

By offering a choice, we can win or we can win. There is no losing in here.

Yep. That's perfect. Now, sticking to maybe some of the model stuff, analyst day, I think it was September 11th. Very thorough. I think one of the things that I took away was that you set a guidance, which you mentioned earlier, was quite strong relative to the way you've set in the past, 20% growth. If I do the numbers right, the $2.75 billion of AI, the $1.25 billion of campus, it really implies that the rest, the non-AI, non-campus business doesn't grow. Why?

Chantelle Breithaupt, Executive, Arista: Yeah. I think that we are very intentional with some of our goals to make sure that communities like yourselves understand what we have line of sight to and what our north stars are. We set a north star for the AI growth, the $2.75 billion in revenue, and the campus $1.25 billion, basically 60%+ growth on both of those targets. We laid those out for the company 'cause we want to be very clear where we are going from a strategic perspective. We also, from a style, never assume 100% of everything is going to work in our guidance. It does not mean we do not want or anticipate the rest to grow. We will see as we get into next year with two quarters of visibility, other things that can add to that number, but we will never assume 100%, give ourselves some optionality.

If we hit both those targets and things continue to grow, we'll see what, you know, how next year progresses.

Yep. That's perfect.

Mm-hmm.

And the reference to two quarters of visibility, that's the enterprise?

Yeah. That's everything basically underneath the hyperscalers.

I gotcha.

Too. Yeah.

Okay. The other question that's come up on the margin, you know, on the model has been, you know, this quarter, I might have gotten the math a little bit wrong, but like the gross margin on the product line takes a little bit of a step down. You know, so immediately when people see that, they go, "What's going on? Why?"

Mm-hmm.

I think the short answer's mix, but maybe I'll let you address that.

The majority of the time, mix, customer mix, is what drives our gross margin conversation. You know, obviously, the hyperscalers have a different volume purchasing power than the enterprise, and everything in between is a mix of those. In a high-mix cloud and AI titan quarter, you're gonna see, you know, a little bit lower gross margin, and in a higher enterprise mix quarter, a little bit higher generally.

Yep.

The other thing, as we tried to articulate at analyst day, that moves through there is our E&O activities. You know, the Arista model, we have long lead times in our supply chain, so we lean in almost a year in advance, and we have educated, you know, procurement, but it's not 100% forecast driven with two quarters of visibility. There will be times when maybe we don't get the mix right, and we do everything we can to mitigate the E&O, but that's the other factor that can come in. This year you saw not a lot of E&O, and so our gross margin has been a bit elevated at that 64-65% range in some of the quarters.

Those are the dynamics, very open and transparent about it, but very excited even to, you know, kinda report those growth rates, the margin rates, and the operating margin rates this year and next year.
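
As a simple illustration of the mix effect being described, the sketch below blends two hypothetical segment gross margins by revenue share; the shares and margin levels are made-up numbers, not Arista disclosures.

# Illustrative only: how customer mix moves blended gross margin.

def blended_gross_margin(mix):
    """mix: list of (revenue_share, gross_margin) pairs; shares sum to 1."""
    assert abs(sum(share for share, _ in mix) - 1.0) < 1e-9
    return sum(share * margin for share, margin in mix)

# Hypothetical (share, margin) pairs: cloud/AI titans first, enterprise second
cloud_heavy_quarter = [(0.65, 0.60), (0.35, 0.68)]
enterprise_heavy_quarter = [(0.45, 0.60), (0.55, 0.68)]

print(f"Cloud/AI-heavy quarter:   {blended_gross_margin(cloud_heavy_quarter):.1%}")
print(f"Enterprise-heavy quarter: {blended_gross_margin(enterprise_heavy_quarter):.1%}")

Shifting weight toward the lower-margin segment pulls the blended figure down even though neither segment's own margin changed, which is the customer-mix dynamic at work.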

Yep. The campus opportunity, you've brought in, you know, the company hired Todd Nightingale to really, it sounds like, drive the campus, right? That stuff, I think it's $700 million-$800 million this year, growing to $1.25 billion. I think you're only 5% market share of the campus market. What's changing there? What gives Arista the opportunity to win? You know, how do we think about, you know, maybe beyond $1.25 billion? 'Cause it is a large market, you know, an $18 billion-$20 billion campus market.

No, absolutely. We're very excited. I think the one thing that I've seen and I'm very proud of at Arista is once we set an intention, we very much try to execute and overachieve on that. You've seen it with the hyperscalers and then data centers specifically, now going into campus. What has allowed us to kinda come out with this declarative state? We've not finalized, but very nearly finalized, the campus portfolio. The VeloCloud acquisition was a great part of that SD-WAN kind of conversation. Very happy with the portfolio.

We think it's ready now to get more than 5% of that market share, as well as the fact that, you know, we have someone like Todd coming in who can spend, you know, more dedicated time, given his background, on how we are going to approach this market through land and expand and new logo acquisition. A lot of time between Todd and I and Martin and the team to talk about what we are gonna do with that. Where do we see this kind of validation? We are winning campus-first deals now that are material, especially for a campus market. We are very excited. Those could be without the data center, so we can win a new logo acquisition in campus and then land and expand over to the data center if we're not already in that position. Super excited about that.

5%, we do see that $20 billion kind of market, you know, TAM perspective. So very excited, and Todd's very much focused, as well as he owns the supply chain of Arista, so he can work on making sure we have the right products and lead times to make sure we win those refreshes as they come due.

Yeah. First part of it is portfolio. Throughout this year, we've incrementally released new technologies, new solutions with identity. Pulling in the VeloCloud has given us more of a complete solution. The second factor is time, right? A lot of these large campus opportunities come around once every five years, once every seven years. It's not for want of trying, but if the customer's not in the buying phase, then there's no opportunity. For the very largest customers, as they come up for a technology refresh, it gives us an opportunity to engage in an RFI and RFP or get into a lab or a qualification exercise and then hopefully be successful. If the customer's not in that phase, you sit on the sidelines and wait.

I think we're identifying that within the campus and the enterprise more broadly, the next two years, there's a lot of refreshes coming up for renewal. As Chantelle said, we are very encouraged by the customers where we are winning first as a campus opportunity, and that gives us the chance to go and talk about other areas of their infrastructure.

And remind me again, I just forget, how many enterprise customers, I think it's like a rounded number, 10,000?

We said 10,000 customers for Arista, 10,000 plus.

Enterprise customers.

We said customers, but you can.

Yeah. It's a small list of hyperscalers. I guess the metric that would be interesting would be is like how many of the, to your point, if you have an enterprise data center footprint, it clearly gives you an opportunity to go into the campus opportunity.

That's right. That's what we're talking about. Land and expand or new logo. This is what Todd and I were working on this year, next year planning, etc. How do we get to that $1.25 billion? Which methodologies?

Okay. And VeloCloud is in that $1.25?

Yes, it will.

Yep. Okay. In the few minutes I've got left, I think I have to ask you about component constraints. I know we touched on it a little bit earlier, but how are you guys mitigating? Is there, you know, is it DDR4 that you're seeing some constraints on? Is it, and I guess, you know, you've got a lot of purchase commitments. I think your inventory plus purchase commitments were like $7 billion. So not concerned about supply. How are you mitigating the price inflation risk? Do you see that at all, you know, in your gross margin?

We have not seen a material amount, and I give kudos to, you know, Todd, Mike Kappus, and his team for being very proactive with a multi-source strategy. We have been mitigating where there is price inflation either through our own activities or through the actual negotiation with our vendors. Not a lot of materiality there, Aaron, to your question. I think from the perspective of the supply chain constraints, I think that was part of what you were asking.

Yep.

We don't see any constraints for 2025 and 2026. We're trying to get ahead of anything that could potentially become a topic. Martin, if there's anything on memory or fab capacity that you wanted to cover?

No, I think there's a recognition in the last few weeks since we put our earnings out that there is a worldwide tightness on some components, right? We're just one company within that. We can't be immune to it, but I don't know that we're impacted. Visibility and then taking the right corrective actions, and that is putting in place volumes, multi-sourcing, and then timeframes. Like, we need this volume over this time. Are you able to support us? Commitments is how we do that.

Yeah. I would say just going back to your starting point, Aaron, the purchase commitment increase that you saw, which was sizable, is more related to demand than it is to buying into component shortage situations, just to be clear.

I think the number's $4.8 billion, I think, was the purchase commitment number, if I'm right.

A little bit higher than that.

A little bit higher. I'm sorry.

Yeah.

It's $7 billion in total, right?

Yeah.

I guess that brings the question I was trying to get to is like, how do you think about managing if it's demand?

Yep.

Right? I'm gonna ask the backwards question of.

Sure.

How do you manage, how do you think about inventory turns?

Mm-hmm.

Right? And thinking about from that perspective.

Yeah. Generally, at least since I've been in the role, January 2024, inventory turns have been between 1.1 and 1.3. We've been fairly steady. I've actually agreed with Todd to see if we can increase our turns. That's the goal. However, we are working through quite a frothy period. Let's say it stays in that range. The purchase commitments are meant to, you know, flush out in a timeframe that does not really impact that inventory turn calculation over time.
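
For reference, the inventory-turns figure being discussed is the standard ratio of annualized cost of goods sold to average inventory; the dollar amounts in the sketch below are hypothetical placeholders, not Arista's reported numbers.

# Standard inventory-turns calculation: annualized COGS over average inventory.

def inventory_turns(annual_cogs: float, beginning_inventory: float,
                    ending_inventory: float) -> float:
    average_inventory = (beginning_inventory + ending_inventory) / 2
    return annual_cogs / average_inventory

# Hypothetical figures: $2.4B annual COGS against roughly $2.0B average inventory
print(round(inventory_turns(2.4e9, 1.9e9, 2.1e9), 2))  # -> 1.2, inside the 1.1-1.3 range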

Okay. In the minute we've got left, I'm just gonna, I'm gonna put an open-ended question out there. When, when you're, when you're speaking, Chantelle, with, with investors or, or Martin, what do you feel like are the, the two or three things that are, are just not fully appreciated in the Arista story?

I think there's, obviously we work with very smart people such as yourselves in the room. I think it's just a shift in the sense of looking at deferred and P&L growth, knowing that that deferred is revenue over a timeframe. I think that's important, especially given that this quarter's the first quarter that product was the majority of deferred. It's the first time I've said that in the prepared remarks. Usually, it's been services, so now product. I think one is the kind of the revenue outlook, you know, just looking at the guide, P&L and deferred. I think the second one is just recognizing the way that Martin very well articulated what blue box is and that white box and Arista branded and blue box are a little bit apples and oranges.

I think the third one is there are some things we can't control, such as announcements by things in the industry where the whole industry's impacted. We will keep producing great product. We will keep our head down and work close with our customers. We'll just continue to execute and show you versus all of these kind of interwoven things with investments and commitments between customers and vendors. That's not the Arista style.

Yeah. Perfect. With that, I think we're right on time. Thank you so much for joining us.

Thank you for your time. Thank you.

