On Tuesday, 03 June 2025, Rambus Inc. (NASDAQ:RMBS) presented its strategic vision and financial performance at the Baird Global Consumer, Technology & Services Conference 2025. The company highlighted its robust market position and growth strategies, focusing on data centers and DDR5 technology. While Rambus emphasized positive growth prospects, it also acknowledged challenges in the competitive landscape.
Key Takeaways
- Rambus’s three-pronged market approach includes patent licensing, silicon IP, and memory interface chips.
- Data centers contributed over 75% of revenue last year, driven by DDR5 technology.
- The company expects significant growth in its silicon IP and memory interface chip businesses.
- AI is a major driver of increased demand for DDR memory, boosting Rambus’s market opportunities.
- Rambus positions itself as a strategic U.S.-based supplier amid global supply chain concerns.
Financial Results
- Patent Licensing: Generates a stable $200 million to $210 million annually.
- Silicon IP:
- Generated $120 million last year.
- Expected to grow by 10% to 15%.
- Memory Interface Chips:
- Generated $250 million last year.
- DDR5 market share in the early 40% range, targeting 40% to 50% long-term.
Operational Updates
- DDR5 Leadership:
- Established a strong position, with a cycle expected to last five to seven years.
- Companion Chips:
- Targeting a 20% market share with expected revenue contribution in the second half of 2025.
- MRDIMM:
- Sampling customers with positive feedback, anticipating revenue contribution in the second half of 2026.
- Client Opportunities:
- Two chips in the market: client clock driver and LP CAM solution.
- Tariff Impacts:
- No direct impacts; monitoring supply chain tightness.
Future Outlook
- Companion Chips:
- Revenue contribution expected to grow into 2026 and beyond.
- MRDIMM:
- Revenue contribution aligns with Intel and AMD platforms by late 2026.
- DDR6:
- In the definition stage, with Rambus aiming for a leadership role.
- Strategic Advantage:
- Last U.S.-based supplier, offering supply chain security to enterprise customers.
Q&A Highlights
- AI Impact:
- AI servers require significantly more DDR memory, accelerating the shift from DDR4 to DDR5.
- Custom ASIC:
- Benefits from rapid product cycles, providing critical IP blocks.
- CXL Platform:
- Adoption delayed, but Rambus continues to explore opportunities through its silicon IP business.
Rambus remains optimistic about its growth trajectory and strategic positioning. For further details, readers are encouraged to refer to the full transcript.
Full transcript - Baird Global Consumer, Technology & Services Conference 2025:
Tristan Gerra, Senior Semiconductor Analyst, Baird: Okay. Good morning. Well, let’s get started. I’m Tristan Gerra, Senior Semiconductor Analyst at Baird. I would like to introduce Rambus, a leading memory IP supplier.
We’re pleased to have with us today Desmond Lynch, Senior Vice President, Chief Financial Officer, and Matt Jones, Vice President of Strategic Marketing. And with that, let’s get started. Gentlemen, good morning. I think you have a few slides for us before we start the fireside chat.
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: We do. Thank you, Tristan, and good morning, everyone. It’s a pleasure for us to be here today at the conference. We have a few slides today. We’ll walk you through an overview of Rambus and really how we go to market with our solutions.
Before we begin, I’d just like to mention our safe harbor statement about forward looking statements, and I would encourage everyone please to read the documents on file with the SEC as they contain a lot more information on the company than we will discuss today. So jumping in, Rambus has been a pioneer within the semiconductor industry for the last thirty five years. The company was founded based upon foundational memory interface technology, which can be found in all of today’s modern compute systems. The foundation and the bedrock of the company really has been its patent licensing program, which has provided millions of dollars of free cash flow each year, which has enabled us to invest both organically and inorganically into product programs, which our licensees directly benefit from. From an end market perspective, we’ve really positioned the company around the data center with over 75% of our revenue last year coming from the data center end market.
This next slide is a great representation of how we go to market with our solutions as a company. There are three ways that we go to market, which are our patent licensing program, our silicon IP business and lastly, our memory interface chip business. So starting off with the patent licensing program, as I mentioned earlier, this really has been the bedrock and foundation of the company. We have a robust patent portfolio today of about 2,700 patents. And the business here is really operating at the $200 million to $210 million level, with really stable cash generation from this business.
And that’s really a function of the long term agreements that we have in place with our patent licensees. The next area that we go to market is through our Silicon IP business. In Silicon IP, we sell building blocks of IP to our customers, who integrate our solutions into larger ASIC and SoC solutions. The portfolio today is really built around security IP as well as interface controller IP. And the business is operating at scale today, at about $120 million last year, with the expectation of growing the business 10% to 15%.
And we’re really pleased with the diversity that this business offers us, as we have customers ranging from startup companies all the way through large, well-known, leading semiconductor companies. And the last area that we go to market is through our memory interface chip solutions. This is where we sell chips to the memory vendors, who integrate our solutions onto the DIMM module. And we’ve really seen some exceptional growth in this business. We’ve been able to grow the business to about $250 million last year, which is up significantly given our leadership position within DDR5.
And really through our organic investments, we’ve been able to double our market opportunities and we’re very excited about the growth opportunities that the business offers. So three ways to market, a diverse business portfolio and we’re really excited about the opportunities ahead of us Tristan.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Thank you. My first question, and we’ve heard some of that this past earnings season, is that macro trends have been fairly dynamic recently with tariff concerns. We had Intel talk about some advance purchases in some areas, including PC, the past couple of quarters. At a high level, how do you see those trends? Do you see yourself impacted or not by tariffs, directly or indirectly?
And how are you seeing order patterns given those changing dynamics?
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: Yes, it’s a great question. One, we’ve been very pleased with our financial results in Q1 and the guidance we provided for Q2. Based upon what we see just now, we’ve not seen any evidence of pull-ins from our customers. As it relates to Rambus, I would say, based upon the current rules as of today, there is no direct impact of the tariffs for us as a company. We are a fabless semiconductor company with front-end manufacturing partners in Taiwan and back-end manufacturing partners in both Taiwan and Korea.
We ship our chips to the memory vendors, who assemble the modules within APAC. What we continue to watch out for is any indirect impacts of the tariffs, and we’ve really bucketed that into two different areas. Firstly, it would be any supply chain tightness as companies move out of China into other locations. And secondly, demand destruction, depending upon the level of tariffs that we have in place. One important factor within discussions with our customers, and something we really monitor, is the level of inventory that is being held.
And I would describe that as at reasonable levels. I would attribute that to two factors: one, the overhang of DDR4 inventory over the past couple of years, and secondly, the fact that there are three sub-generations of DDR5 in the market. But one area we don’t have great visibility into is the module inventory that our customers are holding. So we’re in that continuous discussion. But overall, we’re very pleased with how we’ve executed on the chip side, and we continue to monitor the impact of the tariffs on our business from here.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Great. We get a lot of questions and interest on product revenue. Obviously, you’ve had very significant growth the past several years. And I think at least some of that was driven by the transition to DDR5, where you have doubled the market share versus DDR4. That transition has been pretty much complete for the past few quarters.
How should we look at the outlook for product revenue for the rest of this year and in the medium term? What are the catalysts, and what’s the level of confidence that you can maintain the growth you’ve shown, which has been fairly impressive, and maybe could you rank those catalysts?
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: Yes. I’d say that’s a great question. We’ve been very pleased with our execution on the DDR5 cycle, where we’ve been able to establish a leadership position. As you mentioned, we’ve been able to almost double our market share. Last year, our share on DDR5 was in the early 40% range, in line with our long-term target of 40% to 50% market share, but up significantly compared to DDR4, where we were around the mid-20% share.
It is important to note that DDR5 still has some legs left within the cycle. It was only about this time last year that the industry crossed over from a bit perspective to DDR5. And if you really look at these DDR cycles, you are looking at a five-to-seven-year cycle. We’re only about two and a half years into that cycle just now. In terms of the other catalysts of product revenue, I would really highlight a couple of areas, which have been our organic growth investments over the past couple of years.
The first area that I would highlight is the companion chips. These are three chips that under DDR4 were on the motherboard and moved onto the module under DDR5: the SPD hub, temperature sensor and power management device. In aggregate, this offers a $600 million market opportunity for Rambus. We do expect to see the revenue contribution in the second half of 2025 with the launch of the next processor platforms, continuing to grow into 2026 and beyond, and really getting towards a targeted market share on the companion chips of around 20% from there.
The next area that I would highlight is the MRDIMM opportunity. In October, Rambus announced the industry’s first MRDIMM solution. This is a solution that multiplexes data across two ranks of DRAM to significantly increase capacity and bandwidth. We have been sampling with our customers and have recently been receiving positive feedback. But in terms of timing of the MRDIMM opportunity, you are probably looking at the second half of 2026 to see that first revenue contribution, in line with the next server platforms from Intel and AMD.
And the last area that I would maybe highlight is the client opportunity. What we’re seeing is some of the data center technology waterfalling down into the client space. This is a small market for us just now, but it’s an important proof point for us going forward. And we do have two chips in the market today, which are our client clock driver as well as our LP CAM solution. So we’re very excited about the future growth opportunities there.
But overall, we’re very pleased with how we’ve executed the DDR5 cycle, and very pleased with our organic investments; I’ve been able to highlight a couple of those areas here today. We’re excited about the growth opportunities on the chip side.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Great. In data center, on the product side, I know you don’t participate directly with some of the Grace platform opportunities, but you have other content opportunities, which net net provide an acceleration relative to the demand that you’re seeing in traditional data center. Maybe you could help us understand how AI is also driving your business from a content standpoint.
Matt Jones, Vice President of Strategic Marketing, Rambus: Yes, it’s a great question, Tristan. Certainly our Silicon IP business, the building blocks that we sell, are very targeted for AI accelerator solutions. So HBM interfaces, GDDR as we see inferencing begin to rise in some of the bespoke accelerator solutions leveraging that memory technology, and then the interconnect of these accelerators to standard compute complexes. We certainly get the question a lot of, hey, we’ve seen HBM going back a couple of years and continuing on, is that a displacement vector in the data center for traditional DDR, you know, with GB200 and the Grace chip sitting next to Blackwell, and to a smaller extent in the Hopper generation as well. You know, the spotlight comes on LP CAM.
But what you’re seeing, with the diversity of computing as we see these heterogeneous models, is not the turning from one technology to the other, but the augmentation of different types of memory for different workloads and different purposes. So when we saw HBM rise and become much talked about, a small amount of that memory sits next to each of the GPUs. It’s very wide, provides a lot of bandwidth, but it has its limitations as well. It complements very well the system memory, as we like to call it, that goes along with the standard compute element. And we’ve seen that AI has been an accelerant for us.
It’s a tailwind in our product business because in every AI server there is some traditional content. The mix varies depending on whether it’s kind of a 4U type of box, a rack GPU or traditional compute, but it’s a much more dense amount of memory. So versus a traditional server, you tend to see two to four times the amount of DDR in a given AI server. The other important thing is that it was a catalyst for the conversion from DDR4-based systems to DDR5-based systems. The performance needed in the traditional compute complex to feed those GPU complexes really drove traditional compute and the memory associated with it, and helped us realize the market share gains that we’ve picked up from DDR4 to DDR5, and the additional chip content that Des talked about in the companion chips. So we see it as a positive for us.
We’re going to continue to see the evolution of memory subsystems, purpose-built subsystems, and from our foundational technology, as Des talked about, from patents through silicon IP and our chips, you know, we’ll work to solve those problems in the interface between compute and memory.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Yes. I think you touched on something very important, that the density of memory is much higher in AI configurations. Is that the case just for standalone AI GPUs like B200? Or is that the case as well for GB200, which is still a very small percentage, but presumably ramping a little bit faster in the second half?
Matt Jones, Vice President of Strategic Marketing, Rambus: It’s across the board. You know, what you see is whether it’s a standalone GPU, whether it be from Nvidia or AMD or others, or you know what Nvidia calls the super chip, the Grace Blackwell. Those are really focused on that training piece of the problem. Keeping the model nearby, or as much of the model nearby, and having that data prepared and staged is very important. The key thing is with the power consumption and the cost of things like GPUs and accelerators, keeping them as efficient as possible is critical.
And so we talked about HBM; the amount of memory capacity to keep portions of the model staged close to those GPUs is very small. So that’s why you see the larger system memory in AI applications. But it is across the board; even the bespoke kind of LPDDR that goes with Grace in a Grace Blackwell doesn’t solve that capacity problem, and we see a continued drive to more capacity density in AI servers.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Great. So it sounds like, as AI continues to expand as a percentage of total data center, you should see probably some acceleration in terms of DIMM count that will help your business on the product side.
Matt Jones, Vice President of Strategic Marketing, Rambus: That’s right. The increase in capacity should drive additional DIMMs per box, if you will.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Also of high interest is obviously custom ASIC, and, you know, in terms of the ramp of AI, we’ve seen companies, including com, successfully announcing new engagements. And I think you’ve talked in the past about how we now have accelerated product cycles. NVIDIA tries to be on a one-year product cycle. There’s a lot of activity with custom ASIC. How do you benefit from that on the custom ASIC side?
Matt Jones, Vice President of Strategic Marketing, Rambus: Yes. The custom silicon side has been a really exciting trend in the market for the companies you mentioned and for our silicon IP business. For customers similar to the ones you mentioned, you know, the acceleration you talked about, Tristan, of one-year product cycles is really the cadence that the custom silicon business is aspiring to as they work through the pipeline of initial developments. We sell the building blocks that help them speed that time to market. So those standard interfaces like HBM4, which we’ve talked about, are driving our business to scale.
As inference accelerators continue to rise and there’s more diversity of memory, LPDDR and GDDR controllers start to become more of the key interfaces there. But then there’s the connectivity: PCI Express and CXL for chip-to-chip connectivity. And then importantly, as we’re seeing these accelerated compute models, it’s a heterogeneous compute environment. Data is moving from parts of the system to other parts of the system, from chip to chip, and securing that data becomes critical. So our security IP for hardware roots of trust for secure booting of systems, and then IPsec and MACsec for protecting that data as it’s in motion within a box or across the data center, are also key elements that help give rise to the custom silicon business and benefit from it.
Tristan Gerra, Senior Semiconductor Analyst, Baird: So it sounds like the companies providing custom ASIC design would be significant customers and there is more demand for more IP blocks over time in addition to the unit ramp.
Matt Jones, Vice President of Strategic Marketing, Rambus: They certainly have made it more exciting without commenting specifically on any of them Tristan, yes.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Great. Just going back to the product revenue side of things, Desmond, you talked about the companion chips ramping. You didn’t say Granite Rapids, but basically you’re implying a second half ramp. You’ve said also it will take a few years for companion chips to kind of get to that targeted market share that you’ve talked about. How should we look at companion chips at the end of this year once that ramp has happened?
Are you on that particular platform at your market share target medium term, but then you’re waiting for other CPUs to get to that share? Or how should we look at where this ramp gets you by the end of this year on the companion chip side?
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: Yes, it’s a good question. I would say in terms of our sort of DDR5 rollout, it’s been very strategic. One was to make sure that we win on the RCD and get to the market share gains that we’ve talked about. But in terms of the companion chips, it was just April of last year that we completed our chipset solution with the launch of our power management solutions. And what we’ve been doing in the past year here is really going through the qualification phase with the customers.
And we’ve been very pleased with how that has taken place. What we’ve talked about is a ramp in the second half of the year, continuing to grow into 2026 and beyond, and getting towards that targeted market share of around 20% that I talked about. To get to that 20% market share, you are looking at the subsequent processor platforms from Intel and AMD, as well as MRDIMM, which will also have these companion chip opportunities. But we’re very excited now to have the complete chipset out in the market, and we’ll see this revenue contribution in the back half of this year continuing to grow from here onwards.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Just from a longer-term standpoint, obviously, your content is associated with x86. If we see other architectures ramping, obviously, you have content around GPU platforms, not directly, but around GPU platforms. Are you kind of agnostic in terms of CPU architectures? Or how could that make your business evolve over time?
Matt Jones, Vice President of Strategic Marketing, Rambus: Yes, it’s a good question, Tristan. We get that certainly a lot with the increasing traction and certainly press around ARM based CPUs. And we talked a bit about custom silicon. That’s not just accelerators, that’s also processors. You know what we’ve seen is that yes, AMD or Intel, you know historically agnostic in ARM on the processing side of things certainly continues that trend.
We’ve seen, you know, the commercial introductions of ARM-based solutions not only adopt but drive DDR5 utilization as their memory of choice. We continue to look forward as we see other emerging compute models, as I talked about, for memory subsystems as we go. But yes, we’re agnostic on the CPU choice.
Tristan Gerra, Senior Semiconductor Analyst, Baird: We also get some questions about the CXL platform that’s been pushed out. What’s the landscape there and what are the opportunities there for Rambus?
Matt Jones, Vice President of Strategic Marketing, Rambus: Yes, so there’s two real opportunities that we see there historically. One, our silicon IP business, we sell the controller core that manages that CXL connectivity. We sell it to a broad range of customers and applications using that CXL interconnect standard. So we have good insight to where things are going. I think as an industry we’ve seen that we all see CXL being pushed to the right from where we saw it maybe two years ago.
AI kind of happened, and so a lot of the data center activity needed to absorb and capitalize on AI, and so CXL-attached memory as a way to augment memory capacity and bandwidth in systems might have taken a bit of a backseat in terms of priority. But also, as an emerging standard, we’ve seen a bit of fragmentation there specifically. We’d expected the first application to be CXL-based chips that would control large blocks of memory to add memory capacity to servers, so you wouldn’t need to just continue to stack servers, and be inefficient in your CPU, if you had memory-intensive applications. But as the hyperscalers, who were going to be the leading edge of this adoption as the industry saw it, adopted it, we saw very different usage models come about.
Cold page storage from one of the hyperscalers has a very different set of needs than maybe search algorithms or, you know, some of the other things that would be used there. So we’ve seen a fragmentation, Tristan, and really a need for that market to coalesce into a standard usage. And then a lot of things happened along the way in terms of AI and other effects that have pushed it out to the right a little bit.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Okay. Thanks for the feedback. Just going back to something more near term and topical: given the macro trends right now, and given that you have some competitors that are foreign-based, are you benefiting at all from your U.S.-based position?
And how do you look at those trends near term? Or what customer feedback are you getting in that regard?
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: Yes. We are in a small ecosystem: three customers and three primary suppliers into the market. Our competition in this area is Montage, the Chinese supplier, as well as Renesas, the Japanese conglomerate, in this market. What I would say is that none of the companies have been impacted by the current trade restrictions. But the fact that Rambus is the last U.S.-based supplier into this market, we do believe, is a strategic advantage in the long term. We’ve certainly heard comments from the enterprise customers as well as the CSPs about the security of the supply chain. So I think overall, we do think in the long term it is a strategic advantage for the company, being the last U.S.-based supplier.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Great. Then on the licensing front: you have a stream of licensing revenue, but you also have a percentage that is growing, and you mentioned HBM. Could we maybe dive into that a little bit more? I don’t know if you really quantify this, but what is the percentage of your licensing IP that is actually not fixed or kind of flat over time, but is growing with new contracts? And what are the key drivers, HBM and anything else you may want to mention?
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: Yes. I would say I’ll go back to the three ways that we go to market sort of Tristan. On the patent licensing business, I talked about 200,000,000 to $210,000,000 That’s been relatively stable at that sort of level given the long term agreements in place. In terms of the sort of growth area on the IP side, I’d maybe point you towards the Silicon IP business. Last year this was about $120,000,000 growing 10% to 15% would be our sort of expectation on this sort of model.
And in this area, this is where you have some of the key building blocks of IP. On the security side, as we move towards heterogeneous compute environments, the importance of securing data at rest and data in motion is a good trend for us. And then on the interface controller side, we have a lot of key IP here, such as HBM and CXL, as Matt talked about, as well as GDDR and PCIe. So that’s the growth area on the silicon IP side as it relates to the business.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Great. Any questions from the audience? Any opportunity we haven’t talked about or anything you wanted to emphasize whether it’s in terms of growth opportunity going forward or things that you think we should be aware of for the rest of this year?
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: Yes. Again, I think as we’ve talked about today, we’ve been very pleased with the progression of transitioning the business from a patent licensing business to a semiconductor product solutions company. What we’ve touched upon today has been some of the product growth areas. These have all been organic areas that we’ve been investing in, and we’ve been able to double our market opportunity over the past couple of years. And I think we have a steady roadmap here, which will take us through to around 2030 on the DDR5 cycle.
If you look beyond 2030, I think you’re starting to look into the timeframe of DDR6 coming into the market. DDR6 is still in the definition stage within JEDEC. But again, if you look beyond that, we’re looking at a five-to-seven-year cycle. So again, we play a leadership role in that committee definition within JEDEC, and we have a roadmap that will extend into the mid-2030s. As we’ve talked about today, some of the client opportunities remain very interesting for us going forward.
We’re at the very high end of this client space. But we’re very excited about the growth opportunities ahead of us on the chip side. And we have a very strong roadmap, which we’ll continue to execute on from here.
Tristan Gerra, Senior Semiconductor Analyst, Baird: Great. Gentlemen, thank you very much for presenting with us. Thank you. And we’ll have a breakout session at the Rockefeller Foyer at the mezzanine level. Thank you.
Desmond Lynch, Senior Vice President, Chief Financial Officer, Rambus: Thank you.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.