Arteris at Rosenblatt Summit: Navigating AI and Chip Complexity

Published 11/06/2025, 18:24

On Wednesday, 11 June 2025, Arteris (NASDAQ:AIP) presented its strategic initiatives at Rosenblatt’s 5th Annual Technology Summit - The Age of AI 2025. The company highlighted its leadership in network-on-chip (NOC) technology amid growing chip complexity and AI adoption. Despite challenges from trade tensions with China, Arteris is diversifying its revenue streams and focusing on growth in key regions.

Key Takeaways

  • Arteris is leveraging its NOC technology to address rising chip complexity and AI integration.
  • The company is experiencing both opportunities and challenges from the shift to chiplet architectures.
  • Arteris aims to grow its average deal size to $1 million by 2026.
  • Trade tensions with China have impacted revenue, but growth in the US, Japan, and Korea is strong.
  • Arteris expects profitability by mid-2026 by managing operational expenses and focusing on innovation.

Financial Results

  • Arteris operates on a ratable revenue recognition model, with $90 million in remaining performance obligations.
  • The company’s revenue growth is approximately 20% annually, despite a shift in revenue mix.
  • China’s share of bookings has decreased from 50% in 2019 to roughly 15-17% due to trade restrictions, with revenue concentration at about 25% and declining.

Operational Updates

  • Arteris is a leader in NOC technology, essential for complex integrated circuits (ICs).
  • The company is incorporating AI into its system IP design for analytics and verification.
  • Arteris is cautious about AI limitations, emphasizing the need for data verification.

Future Outlook

  • Arteris is targeting a $1 million average deal size by 2026, driven by chip complexity and system IP adoption.
  • The company expects to reach profitability by mid-2026 through strategic expense management.
  • Arteris is expanding in the US, Japan, and Korea, offsetting the decline in Chinese revenue.

Q&A Highlights

  • CFO Nick Hawkins stated, "Complexity is our friend," highlighting the reliance on complex interconnects.
  • CEO Charlie Janik noted, "Chiplets are a major opportunity," emphasizing their role in addressing chip complexity.
  • Hawkins added, "It’s a combination of headwinds and tailwinds," reflecting the mixed market conditions.

Readers are encouraged to refer to the full transcript for more detailed insights.

Full transcript - Rosenblatt’s 5th Annual Technology Summit - The Age of AI 2025:

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Good morning, everyone, and welcome to day two of Rosenblatt Securities’ fifth annual Technology Summit, The Age of AI. My name is Kevin Garrigan, and I’m one of the semiconductor analysts here at Rosenblatt. We’re pleased to have with us Arteris’ CFO, Nick Hawkins, and Arteris’ CEO, Charlie Janik, who will be joining shortly for this fireside chat. We currently have a buy rating on Arteris with a $14 price target. We’re bullish on AIP because of rising SOC design complexity, the shift to chiplets, and growing AI adoption driving demand for outsourced network-on-chip IP, where Arteris is a leader.

Throughout the fireside, we will take questions from the audience. To ask a question, you can click on the quote bubble in the graphic in the top right-hand corner of your screen, and I’ll then read the questions to Charlie and Nick. So with that, thank you, Nick, for joining the conference. Great to see you again.

Nick Hawkins, CFO, Arteris: You too, Kevin. You too.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: So for anyone listening who may not know the Arteris story, I thought we’d just start out with a brief overview of Arteris.

Nick Hawkins, CFO, Arteris: Sure. So we are essentially the creators, the inventors in a way, of what’s called the network-on-chip, the NOC. The NOC is an essential element of a system-on-chip or any complex IC. It’s essentially the communication fabric of a chip, which these days is made up of multiple components, sometimes hundreds. So having a power-efficient, fast, low-heat-generation NOC is super critical, especially in the current climate, with AI demands being hugely power consumptive, for example. The same is true in any mobile device, including automobiles, where power consumption is a key criterion.

You can’t make a complex chip without interconnect, especially as we move into chiplets. So complexity is our friend. We also have another element of the business, which is what we call our SOC integration automation software, or SIA for short. That was really brought to us through two acquisitions: one of a company called Magillem in Paris, France, in 2020, supplemented by a US acquisition of a company called Semifore in 2022. So we have a leading position in that SOC integration automation software which, again, once you’re designed in, is very, very hard to live without.

Some people try to do it internally, but it’s not as effective. So we find that a very successful business unit as well. So, yeah, we’re all about connectivity and communication within the chip, Kevin.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yeah, I appreciate that overview. So I figured we’d start with a few industry topics. There’s a lot going on with SOC design complexity, and again, you mentioned chiplets.

So, on the SOC design complexity side, compared to four or five years ago, what is it about the design process that has become more complex?

Nick Hawkins, CFO, Arteris: So complex chips are really made up of multiple functional blocks. The NOC environment, the necessity of creating it, was an Arteris creation back a decade and a half ago. Below roughly 10 functional blocks, a chip is simple enough that you don’t really need a complex interconnect; you can hardwire the functional blocks together, and that’s how things used to be built. What we created is essentially, think of a Cisco network in an office and miniaturize that. It’s a miniature network, which is why it’s called a network on chip.

So fifteen, twenty years ago, most chips were very small, very low complexity, sub 10 functional blocks. That seems to be the magic number of functional blocks on a chip. Then about eighteen months ago we actually crossed a point where we found a chip, in Japan actually, that had more than 500 functional blocks. That’s getting super complex on a single chip. It’s now got so complex, Kevin, that the size of the die needed to contain all of those functional blocks becomes almost unworkably large.

And so this is where the move to multi-die solutions, chiplets and so on, emanates from: it’s essentially breaking those huge chips with hundreds of functional blocks into smaller chips. But of course, all of those smaller chips need to communicate, and they come together in a single package. They still need to communicate, and that communication, once you get into 3D chips or multi-die, between the layers and intra-layer, becomes even more taxing. And this is in fact what’s driving, and I know this is a theme that you cover very well, the general trend to increasingly outsource NOC design to the commercial market, which is where Arteris plays, Arm plays, and a few other smaller competitors play. It’s not a very well populated space.

We’re certainly the largest player in there after Arm, and Arm is more focused these days, I think, on its own chips, as has been well documented. But they’re still there, of course.
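
To make that scaling point concrete, here is a minimal, purely illustrative sketch (a toy topology, not Arteris tooling) comparing dedicated point-to-point wiring with a shared NOC fabric as the number of functional blocks grows; the roughly 10-block threshold and the 500-block chip are taken from the conversation above:

```python
# Illustrative only: why hardwired point-to-point links stop scaling and a
# shared, routed NoC fabric takes over. The topology below is a toy.

def point_to_point_links(n_blocks: int) -> int:
    """Dedicated wires between every pair of functional blocks."""
    return n_blocks * (n_blocks - 1) // 2

def noc_links(n_blocks: int, blocks_per_router: int = 4) -> int:
    """One port per block plus a simple ring of routers (toy NoC topology)."""
    n_routers = -(-n_blocks // blocks_per_router)  # ceiling division
    return n_blocks + n_routers                    # block ports + ring links

for n in (8, 64, 500):
    print(f"{n:>3} blocks: {point_to_point_links(n):>6} point-to-point links "
          f"vs ~{noc_links(n):>3} NoC links")
```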

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: You know, it’s incredible how complex chips are getting these days. Everyone’s trying to add everything under the sun onto a single piece of silicon, and it’s definitely getting a lot harder, but that helps you guys out.

Nick Hawkins, CFO, Arteris: It does. And as I said, complexity is our friend, because the more complex chips get, the more companies have to rely on complex interconnect. Take any of the big semis, for example: most of them have their own teams, but a lot of them also use the commercial market and are increasingly moving in that direction.

So we see that as a significant tailwind vector for the company, because it’s not only more efficient, it’s more effective to use a commercial solution that is silicon proven and tested on hundreds of chip designs and billions of actual chips. Compare that to even the largest semiconductor company, which might have 30 or 40 designs in a year, whereas we have hundreds, and over the lifetime of the company we’ve had many, many hundreds. So it’s a very exciting time for us, and this is why I say complexity is our friend.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yeah, absolutely. And I have a couple of questions on the shift from insourcing to outsourcing. But just before that, you talked about chiplets, and there may be a lot of people out there who don’t understand what chiplets are, the benefits of chiplets, why there is this transition.

So can you just spend a minute or two going a little more in depth on what chiplets are and why you see the transition to more chiplet-based architectures?

Nick Hawkins, CFO, Arteris: I will give you a sort of novice view, but I think it’s something we also ought to ask Charlie when he’s able to join from his Uber. But basically, this is what I was mentioning earlier on: chiplets are a solution for complexity. As chips get more and more complex, it becomes increasingly difficult to make a single chip perform all the functionality that’s required from the multiple functions.

As you say, the demands on chips are ever increasing. And while the chiplet market is actually still nascent, still in the early stages of development, it’s clearly a developing theme for the world, and you’ll see that market growing. It’s not a huge market right now, but it’s a very important market because it’s very future oriented.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yep. Yep. Yep. Okay. That makes sense.

Nick Hawkins, CFO, Arteris: But Charlie can give us some more science and technology around that answer; mine is a sort of high-level, 40,000-foot view.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yeah. So, as we were discussing, you talked about the shift over time from internal to commercial. How is that transition happening today? Is it faster than you expected previously? Is it kind of on par?

And do you see it accelerating as we move forward?

Nick Hawkins, CFO, Arteris: So, yeah, let me give you some color on that. It is not an overnight process; that’s the way I’ll preface it. If you look back to when I got into the semis world, which was a couple of decades ago, even then EDA was predominantly an in-house solution, and thirty years ago it was almost exclusively in house. And Charlie, as you probably remember, was employee number two at Cadence.

And so he saw it when Cadence was sort of a $30 million to $40 million business, when it was absolutely in its infancy, and 95% of the market was internal and only 5% was third-party commercial EDA solutions. If you roll forward to today, Cadence, Synopsys, and Siemens have the lion’s share of the market. There are very few people who do EDA internally now; it’s just too complex. But that took several decades.

So we are not at the very beginning of that curve. We think that maybe 25% of the market right now is commercial, and 75% is still internal. But it’s shifting. If you go back to pre-Arteris, that was sort of like the EDA market of fifteen, twenty years ago, where commercial was in the low single digits percent. Almost everything was done internally then.

So roll forward to today, it’s at 25%. It never gets to 100%, to be clear. But over the next decade you’ll see that shifting, and one of the big unknowns to us is how fast it shifts. Maybe it gets to 75% commercial, 25% internal; maybe it goes all the way to ninety-ten.

That we don’t know. There are two key limiting factors for the internal market. One is simply the availability of hardware EEs with the right background. There are very few people coming out of universities and colleges around the world now who are specializing in that space; they’re more software-oriented or solutions-oriented engineers.

So there’s a lack of people, some of the people who are in that market are retiring, and there’s little new blood coming in. That is a fundamental issue facing the whole semiconductor world: this limited pool of people who can do the work. So that drives more toward the commercial market. The other driving vector toward the commercial market is simple economics.

I was just talking to somebody two days ago about one of the larger global semis, who you would know very well, but I can’t name them. Even though they’re a customer of ours, they have an interconnect team internally, and it’s about 60 people. If you look at the all-in cost of a typical engineer, they’re really not cheap, especially if they’re in the US.

That’s somewhere between $20 million and $30 million of annual OpEx to support that team. Now imagine how many Arteris licenses you can buy for that amount of money. It’s almost half the size of our revenue; it’s getting onto that level. And that just gives you an idea from one company.

We generally think there’s somewhere between a four and eight times payback, so every dollar that you invest with us gives you $4 to $8 back in your own OpEx savings. And as you know, all of the semis players right now are super focused on cost control. Anybody who has growth limitations or is suffering price erosion or margin erosion, OpEx is a really major focus area for all of these companies, plus they can’t get the people. So both those things drive an increasing push to the commercial market.
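
To put rough numbers on that payback claim, here is a back-of-envelope sketch; the team size and the 4x to 8x range are the figures quoted above, while the per-engineer cost is an assumed placeholder:

```python
# Back-of-envelope version of the in-house vs commercial economics above.
# The all-in cost per engineer is an assumed figure for illustration.

team_size = 60                   # engineers on the internal interconnect team
cost_per_engineer = 400_000      # assumed all-in annual cost per US engineer (USD)
annual_opex = team_size * cost_per_engineer
print(f"Internal team OpEx: ${annual_opex / 1e6:.0f}M per year")

for payback in (4, 8):           # "every dollar ... gives you $4 or $8 back"
    license_spend = annual_opex / payback
    print(f"At {payback}x payback, equivalent annual license spend: "
          f"${license_spend / 1e6:.1f}M")
```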

What’s really interesting is the product we launched at the back end of last year, FlexGen, which is now being monetized and is now in real negotiations with real customers for actual use cases, i.e., people taking licenses. That is a very interesting dynamic, because FlexGen essentially allows customers with less technical knowledge and fewer skills to do the same work that a highly skilled engineer would have done previously in terms of designing that NOC into their complex chip. So FlexGen is actually a huge additive.

It’s solving two industry problems: one is the lack of people, and two is the cost element. I see Charlie has joined now. He’s on mute, so I guess he’s probably waiting for a question so he can come off mute, because I know he’s in transit because his flight was late. So maybe, Kevin, unless you have any follow-on questions on that whole shift dynamic, now would be a good time to ask Charlie the question you had about giving some more background and color on chiplets, what they are.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Absolutely. Hey, Charlie. How’s it going?

Charlie Janik, CEO, Arteris: My humblest apologies, due to a travel disaster.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yeah, no, that’s quite alright. We all have them. So I had asked Nick earlier: one of the big industry movements is more toward chiplets.

So can you just give an overview of chiplets? Why is this transition happening? And when do you see the market really adopting them, maybe 40% or 50% of the market adopting chiplets?

Charlie Janik, CEO, Arteris: So, you know, there are different types of chiplets, right? The homogeneous chiplets, where you basically have single dies becoming too large because of neural processing blocks, those are in production, right?

What’s also in production is heterogeneous chiplets that are made by a single company. This is like the Intel Meteor Lake, the AMD chiplet chips. They’re out there. What people are still trying to figure out is how you get into production things where the chiplets are made by different companies on different processes, right?

And what’s driving that is the need for compute, right? Moore’s law has slowed down substantially, and so people are trying to figure out how to make things that can process at sizes bigger than the reticle sizes you can make on a single die. So there’s a yield issue there. There’s a performance issue.

There’s the fact that a number of different functions on a chip don’t really belong on a leading-edge process. It makes sense to have the CPU chiplet on the latest three-nanometer process, but the analog-digital is perfectly happy on 28 nanometer, for example, right? And it also allows you to mix and match chiplets rather than having to redo an entire die.

So there are a bunch of economic reasons why this is going to happen, but it’s still today a relatively minor portion of the market. But the projects are starting, and obviously it makes the system IP very much more valuable and very much more sophisticated and complex, because now you have to worry not only about on-die communication but about the communication between dies. And so we think that chiplets are a major opportunity for Arteris.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: And are you seeing any specific end markets, or are all end markets eventually going to adopt chiplet architectures?

Charlie Janik, CEO, Arteris: So you’re seeing chiplets in data centers, right? You’re starting to see it in HPC, in servers, infrastructure, even midrange servers.

You’re starting to see it in automotive, where some of the dies are just too big given how many machine learning sections they need to have. There are some projects starting in storage, and there are some AI inference and AI training kinds of designs. So leading-edge stuff.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yeah, okay, that makes sense. So I know we’re still in the very early innings of AI and chiplets and SOC design complexity.

But looking out five, ten years, what do you think will be the next kind of major disruption after AI and chiplets? Or anything that we should look out for?

Charlie Janik, CEO, Arteris: I mean, we’re just scratching the surface of AI, but I think the issue is going to be just the explosion of autonomous systems based on AI. You’re going to have some relatively mundane machines making their own decisions, and you’re going to have some relatively sophisticated machines making their decisions. For example, as you have chiplets, it’s possible that one chiplet out of a six or seven chiplet set

is basically silicon photonics, so you can get very high-performance, high-speed inter-die communication, right? And then the other thing that’s kind of surprising me is the evolution of space applications.

Now there’s just a lot of stuff going on for satellites, satellite communication, exploration vehicles, those kinds of things. So the next frontier is actually going to be space, in some sense. And then down the road, I believe we’re going to see a combination of biology and electronics, where the electronics do the control with biological actuators. So basically combining biology and electronics, which is a little bit farther out, but I don’t think there’s any shortage of major applications coming down the pike.

Nick Hawkins, CFO, Arteris: Yeah. Just to add a little bit of color to that, Kevin, one of the themes we’ve had: we were talking about AI and machine learning even before we IPO’d, as you probably remember, when it wasn’t particularly fashionable. It wasn’t a thing. And we mistakenly referred to it as one of our verticals.

It’s not a vertical at all; it’s actually a horizontal. It goes across every vertical: you have AI in vehicles, you have AI in space, you have AI in the communications sector, in industrial. It’s across every chip area. There are some which will adopt it faster, automotive, for example.

Now if you look at the intensity, and I know this is something you focus a lot on, in terms of how much of the business is actually related to AI and machine learning, we measure that as the proportion of our customers’ design starts which are AI and machine learning related. When we started talking about this back in pre-IPO days, it was a very small, single-digit number. But it was there.

Now it’s half of our customer design starts, and it’s growing; every time you listen to one of our earnings calls, you hear an uptick in that number. And I think that just represents what the industry is doing now and what the world is demanding in terms of the use of AI and machine learning.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yeah. And I think that’s actually a good segue into a couple of the other, more Arteris-focused questions that I had. So are you guys specifically incorporating AI and machine learning to help with system IP design at all? And, again, we said a lot of other companies are incorporating AI in their businesses. What are kind of the negatives to incorporating AI?

Charlie Janik, CEO, Arteris: Well, we’re not ready for any announcements, but I had a demo of an AI feature this morning, right before I got on a plane. So we’re obviously looking at that. I think you basically start off with AI-based analytics to help customers understand what’s actually happening in the design. Next you’re looking at AI features for verification, where you’re trying to have AI-adaptive tests and things like this. And ultimately, some of these things are generated by algorithms; right now, they’re generated by heuristic algorithms.

Eventually, some of these things will be generated by AI algorithms. Now, the downside is that you need a ton of data. And the AI technology, it’s not even really AI technology at this moment if you look at the definition; it is very primitive. And it’s not repeatable or deterministic.

No one really can, at least today, determine how an AI algorithm arrived at an answer. People are working on that, and they will solve that problem for sure over the next number of years, but the answer is that you’ve got to watch AI, because it can give you answers that are completely nonsensical. And when you’re dealing with a $300 million chip project, that’s a very bad thing.

So you’ve got to be very, very careful about how you use AI, and you have to be very judicious to make sure that the answers are correct.
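
A minimal sketch of that “be judicious” point, assuming a hypothetical verification step in which any AI-suggested configuration must pass deterministic checks before it is trusted (the constraint names and limits are invented for illustration, not an Arteris workflow):

```python
# Illustrative only: treat any AI-proposed configuration as untrusted until it
# passes deterministic checks. Constraint names and limits are hypothetical.

def verify_noc_proposal(proposal: dict) -> list[str]:
    """Return deterministic rule violations; an empty list means it passes."""
    violations = []
    if proposal.get("link_width_bits") not in (32, 64, 128, 256):
        violations.append("link width must be a supported power of two")
    if proposal.get("max_latency_cycles", float("inf")) > 40:
        violations.append("worst-case latency exceeds the 40-cycle budget")
    if proposal.get("power_mw", float("inf")) > 250:
        violations.append("estimated power exceeds the 250 mW budget")
    return violations

# A plausible-looking but infeasible suggestion gets caught before it is used.
ai_suggestion = {"link_width_bits": 96, "max_latency_cycles": 35, "power_mw": 180}
print(verify_noc_proposal(ai_suggestion) or "proposal passes deterministic checks")
```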

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yep, that makes sense. And you said something interesting: a $300 million design project. What are kind of the costs of AI chip designs these days?

Charlie Janik, CEO, Arteris: Yeah. I mean, $300 million would be a very big platform, right? That would be something like a smartphone platform or a large ADAS, level three plus platform.

But with mask sets costing, I don’t know, $5 million, $10 million, something like this, these projects are relatively expensive to undertake. And the thing that we’re trying to do is lower the complexity, lower the risk, and lower the cost of doing some of these things. Some of these AI features will be helpful. We’re fighting constantly, because the chips are getting way, way more complex.

We’re introducing features and capabilities and automation that keep that complexity and that cost at bay.

Nick Hawkins, CFO, Arteris: I’m sure, Kevin, you also asked about the risks, the potential downsides, and the headwinds to the use of AI in companies. It’s a great question, and I think you and the audience will be aware that the adoption of AI by the semis world, and by the whole world in fact, is focused right now on the sort of mundane, high-frequency but low-complexity tasks initially, rather than the most complex pieces of software coding, for example. We even see that internally; we are already using AI tools, even in G&A, for example, for efficiency purposes. A great example is the creation of a 10-Q. There are some elements of the 10-Q that are just basically rolling forward and adopting the language of the last 10-Q, shifting this quarter’s numbers into last quarter’s column, and those kinds of things.

That can be done very efficiently with AI. You could then say, okay, go into my NetSuite and pull out all the numbers and populate the current unannounced numbers, the actual numbers for the quarter we’re reporting. But of course, the risk there is that unless you have an entirely internal, fortress approach, which is very tough to do, not using any external feeds, your information can get pre-populated to the world and become available to the world. And this actually happened to one of the big semis, and I won’t mention their name here, but it was well publicized: they actually put their chip design out and asked an AI to help debug the software for this design for their chip.

Unfortunately, that then became public knowledge, and so they completely lost the edge because their design was now completely public. Those are the kinds of dangers. So this is where you’ve got to be very careful and judicious about which parts of the company’s operations don’t really matter. Somebody seeing last quarter’s numbers populated in the prior-period column of a 10-Q, that’s no risk because it’s already public.

Putting this quarter’s numbers out, that’s a risk. And so that would have to be approached with a great deal of caution.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yep. And I’m sure there are going to be a lot more instances like that in the future, and it’s kind of what makes cybersecurity one of the bigger themes as well.

Nick Hawkins, CFO, Arteris: And we take cybersecurity extremely seriously. We’re probably more advanced than most companies of our size in terms of cybersecurity, because it’s a big, hairy risk.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: No doubt. Just switching gears a little bit: given the ongoing trade tensions between the US and China, can you just remind us how much of your top line is driven by China customers, and how are business activities over there right now? And, again, a multi-part question, but there was a recent ban on EDA sales to China that I think has been alleviated for now. Does that, or do tariffs, impact you in any way?

Charlie Janik, CEO, Arteris: Not directly. There hasn’t been any ban on IP, right? So that’s not direct. But the effect of it is on our customers.

If you can’t get EDA tools, you really can’t design, because we’re part of a big ecosystem, right? You’re having trouble getting your design work done.

So the issue is basically that there’s some indirect impact. You know, I keep thinking that we’re going to work this out, because the US economy and the China economy are large, but we just have to roll with the punches. But there’s no direct impact on IP at the moment, or at least as of this morning.

Nick Hawkins, CFO, Arteris: And I think, Charlie, the other point, in terms of the teeth of that, is that most of our top line is derived from French-origin products, and so they’re not actually subject to the US restrictions. So directly, the impact is very muted, in fact zero. But as Charlie said, there is a sort of collateral impact on Chinese chip designers, which they’re kind of working around.

There are some native Chinese EDA solutions, but they do maybe 80% of the job of the US EDA guys. And there’s a switch involved, and it’s not easy.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: And then how much of your top line?

Nick Hawkins, CFO, Arteris: That’s where I was coming to; I had the thought and couldn’t quite get there. So, yeah, it’s a great question. If you roll back to 2019, 50% of our business, our top line, came from China.

Then the US BIS created the entity list, and by about the beginning of ’23 that was down to 30%. Right now it’s closer to high teens percent of our top line in terms of deal creation, so bookings essentially, which we don’t disclose. When you look at the impact on revenue: since that drop around the midpoint of ’23, from around 30% to around 15%, 16%, 17% of total bookings, because we have ratable revenue recognition, our run rate of revenue concentration from China stayed at about 30%, and it has now started to drop as some of those earlier China contracts have reached their end dates and aren’t being replaced at the same pace.

There are a couple of things going on in China that affected that drop. One was access to VC capital, which really completely dried up around that time for China, and the other was the increased focus and increased impact of the entity list. And that’s gradually coming through. So how do we carry on growing at 20% or so, give or take, in top line, just looking at revenue, for example?

And the answer is that there are other geos that are growing faster than the decline in China. That’s a really good fact, and probably the highest focus area in geo growth for us is the US, which is a great thing. But we’re also seeing growth in Japan and in Korea, for example. So right now, as of the first quarter numbers, we had dropped to about 25% of total revenue out of China.

And by the time we get fully lapped out of this decline in China that started in ’23, by the middle of ’26, so maybe the beginning of the third quarter of ’26, we should see that settling at around high teens percent of total revenue, from the current 25%. But all the time, other geos are outpacing that, because people still need chips. People still buy things. They’re still consuming SoCs eventually, whether that’s coming from the US or from Europe or from China or from Japan or Korea, which are the main centers, or Taiwan. The consumption demand is still there.

And that’s why we’re seeing the overall growth rates still holding up.
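
A toy model of how ratable recognition makes the revenue mix lag the bookings mix, using invented booking figures; only the roughly three-year recognition term reflects the discussion above:

```python
# Toy model of ratable recognition: each year's bookings are recognized evenly
# over a 3-year design term, so the revenue mix lags the bookings mix.
# All booking figures are invented; earlier cohorts are omitted for brevity.

TERM = 3
bookings = {  # year -> (China bookings, rest-of-world bookings), $M, hypothetical
    2022: (15, 35),
    2023: (8, 42),   # China bookings step down around mid-'23
    2024: (7, 48),
    2025: (7, 53),
}

for year in sorted(bookings):
    china_b, row_b = bookings[year]
    rec_china = sum(bookings[y][0] / TERM for y in bookings if 0 <= year - y < TERM)
    rec_row = sum(bookings[y][1] / TERM for y in bookings if 0 <= year - y < TERM)
    print(f"{year}: China is {china_b / (china_b + row_b):.0%} of bookings, "
          f"{rec_china / (rec_china + rec_row):.0%} of recognized revenue")
```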

Charlie Janik, CEO, Arteris: Yeah. It’s a combination of headwinds and tailwinds. And the reason you want to be highly diversified, geographically, application-wise, customer-size-wise, is that when things like the China headwinds inevitably happen, you can compensate with tailwinds from things like AI chips and autonomous systems, government investment in Europe, and those kinds of things in the US. So, yeah, it’s just a combination of tailwinds and headwinds.

Nick Hawkins, CFO, Arteris: A great example of that, Kevin, in terms of this diversification benefit we have: just look at the automotive sector. Right now, Chinese EVs are winning the game; they’re doing extremely well. Tesla has had some issues and challenges in terms of its units. Many of the US and European manufacturers have had headwinds.

Some of the Korean companies and the Japanese have been doing very well. We are designed into OEMs in all of those places: in China, in Korea, in Japan, in EMEA, and a lot in the US. So if one geo wins against the others, we don’t mind, because we’re already designed into four OEMs in China, for example, and multiple SOC vendors selling into the Chinese automotive market. So we’re very well positioned, and we have this luxury of not having to back the right horse, because we’re backing all the horses, essentially.

So if one’s winning, then we win. If another one wins, then we win.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: Yeah, the benefits of being diversified, as you said. So we have about five minutes left, and we got an investor question, and one topic that I did want to hit on is the revenue model and pricing trends. So just to start on that, what is the average deal size that you guys are seeing? Where do you see it going?

And are there an increasing number of design starts and deals at higher prices right now?

Nick Hawkins, CFO, Arteris: Charlie, do you want to take that, or would you like me to?

Charlie Janik, CEO, Arteris: No, no, I’ll take that one. The ASPs continue to grow. They continue to grow because the designs are getting more complex.

There’s also the fact that they use more system IP, those kinds of things. So the ASP is growing nicely. I think we’ll be at a million-dollar average by 2026, as we said. Now, on one of the earnings calls, I think you asked a very insightful question: when we go into microcontrollers, isn’t that going to dilute the ASP?

And the answer is yes, it is. But we are essentially trying to capture entire generations of microcontrollers rather than individual microcontroller projects. So the ASP calculation is kind of more for SOCs.

But the pricing is going up nicely, because these chips are just very much more complex than they used to be. Generally, AI adds complexity, chiplets add complexity, and system IP is becoming a very, very valuable category, to the point where I think ultimately it’s the most important IP category after the processor.

Nick Hawkins, CFO, Arteris: And if I can give you some CFO-type responses to that as well, to put some numbers around Charlie’s excellent commentary there. One of our most important bedrock product families is our non-coherent interconnect, the Flex family. FlexNoC is our oldest product in the company, or that family is. If you look at FlexNoC 4, which was really the mainstay when I joined the company back in 2019, that is gradually being replaced by FlexNoC 5 as customers go through their next iterations. And then eventually, many of those customers will adopt FlexGen, which is FlexNoC 5 plus some automation features.

Now, FlexNoC 5 is roughly a 30% list price increase over FlexNoC 4, because it does more. FlexGen is another 30% on top of FlexNoC 5, because it does more. So as the customer base shifts to these later generations of non-coherent interconnect, just as one example, that lifts our ASPs, our sort of project-level pricing. You also had another part to that question; I’m just keeping an eye on the clock here.
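
As a quick worked example of how those two 30% steps compound (normalized placeholder prices, not actual list prices):

```python
# Worked example of the generation-over-generation list-price steps described above.
flexnoc4 = 1.00              # FlexNoC 4, normalized
flexnoc5 = flexnoc4 * 1.30   # "roughly a 30% list price increase" over FlexNoC 4
flexgen = flexnoc5 * 1.30    # "another 30%" on top of FlexNoC 5
print(f"FlexNoC 5 vs FlexNoC 4: +{(flexnoc5 - flexnoc4) * 100:.0f}%")
print(f"FlexGen vs FlexNoC 4: +{(flexgen - flexnoc4) * 100:.0f}%")  # about +69%
```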

In terms of the revenue model, and I think it’s a very important question, when does that then generate non-GAAP profitability? The revenue model is ratable, so essentially a deferred revenue base. As we sign deals, the deal value goes onto the balance sheet and then amortizes into the revenue line over the design term, which on average is typically around three years. We have approximately $90 million of what we call remaining performance obligations, which is essentially deferred revenue: revenue for deals we have already signed that is going to be recognized in the future.

So the path to profitability is really just a question of how quickly that deferred revenue, that RPO, amortizes into the top line, because revenue is a trailing measure. RPO, which is essentially our backlog of revenue, is the leading indicator of growth; ACV plus royalties is the current state of the top line, essentially; and revenue trails that by about a year. So the path to profitability is: when do we catch up?

Because if we’re growing the top line at high teens to 20%, which is what we’ve stated, and we throttle OpEx and spending growth down to half that level, so around 10%, let’s say, just between friends, then you automatically move toward profitability. It’s a math question more than anything else, provided we don’t start losing market share or something tragic like that, which we’re not. So we see the catch-up point, where all this deferred revenue gets to a point where it matches OpEx and cost of revenue, at around the middle of 2026. And so that’s a math question as opposed to a growth question, if that makes any sense.
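
The “math question” can be sketched with a toy projection; only the roughly 20% revenue growth and roughly 10% spending growth come from the conversation, and the starting values are invented for illustration:

```python
# Toy version of the crossover math: ~20% top-line growth vs spending growth
# held to roughly half that rate. Starting values are invented, not guidance.
revenue, costs = 58.0, 66.0   # $M, hypothetical starting points
year = 2024
while revenue < costs:
    revenue *= 1.20           # ~20% revenue growth, as stated
    costs *= 1.10             # OpEx plus cost of revenue growing ~10%
    year += 1
print(f"Break-even year in this toy model: {year}")   # lands around 2026
```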

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: It absolutely does. Yeah, I appreciate all that color. So it looks like we’re just about out of time.

So, Charlie and Nick, thank you very much again for joining us for our conference. We really appreciate it.

Charlie Janik, CEO, Arteris: Okay. You’re very welcome. Thanks, Kevin.

Kevin Garrigan, Semiconductor Analyst, Rosenblatt Securities: See you. Thanks, and thanks to everyone for listening.

This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.
