Meta at Morgan Stanley Conference: AI Investment and Strategic Vision

Published 06/03/2025, 20:12
© Reuters.

On Wednesday, March 5, 2025, Meta Platforms Inc. (NASDAQ: META) presented at the Morgan Stanley Technology, Media & Telecom Conference, offering insights into its ambitious AI initiatives and infrastructure investments. Chris Cox, Meta’s Chief Product Officer, discussed the company’s substantial $60 billion to $65 billion investment in AI infrastructure, while also highlighting the challenges of navigating AI’s hype cycle and the necessity for patience in achieving long-term goals.

Key Takeaways

  • Meta plans to invest $60 billion to $65 billion in AI infrastructure this year.
  • Llama, Meta’s open-source AI model, has over 800 million downloads.
  • Meta AI boasts 700 million monthly users across its platforms.
  • AI-driven recommendations have significantly increased user engagement.
  • Meta is developing new technologies, including smart glasses and an EMG wristband.

AI Research and Infrastructure

  • Meta’s FAIR team, celebrating its 10th anniversary, continues to contribute to open-source AI, including the development of PyTorch.
  • A two-gigawatt data center, equivalent in size to Manhattan, is being designed for AI training purposes.
  • Llama, Meta’s foundation AI model, has achieved over 800 million downloads, demonstrating its widespread adoption.

Llama and Open Source

  • Meta’s commitment to open-source development attracts top talent and supports safety, geopolitical, and auditability goals.
  • Over 4 million advertisers are leveraging GenAI tools for image creation and text generation, enhancing content adaptability.

Meta AI and Use Cases

  • Meta AI aids users in understanding cultural context and news on platforms like Facebook and Instagram.
  • The AI model offers personalized search experiences and engages users in ongoing conversations.
  • A notable example includes Cox’s son using Meta AI for Dungeons & Dragons-related queries.

Monetization and Business Messaging

  • Meta prioritizes product functionality over immediate monetization, although there is potential in business messaging and advertising.
  • Hundreds of millions of businesses utilize WhatsApp for customer communication, with a beta program allowing Llama to assist in customer interactions.
  • Business messaging is particularly popular in Brazil, Mexico, and India.

AI-Driven Recommendations

  • AI-driven content delivery accounted for 30% of Facebook posts and 50% of Instagram content last year.
  • These recommendations have fueled the growth of Reels and increased user engagement, with a focus on content likely to be shared.

Wearables and Future Technology

  • Meta is expanding the availability of Ray-Ban Meta smart glasses, which enjoy strong user retention.
  • The company is also developing an EMG wristband for gesture-based input in AR/VR environments.
  • Custom silicon (MTIA) is used for inference in recommendations, reducing operational costs.

Challenges and Opportunities

  • Meta recognizes the organic evolution of AI use cases within communities as an underappreciated opportunity.
  • Cox emphasizes the importance of patience amidst the AI hype cycle, especially in areas requiring nuanced human understanding.

For further details, readers are encouraged to refer to the full transcript of the conference call.

Full transcript - Morgan Stanley Technology, Media & Telecom Conference:

Unidentified speaker, Interviewer, Morgan Stanley: All right. Good morning, everyone. Welcome to Day three of the Morgan Stanley 2025 TMT Conference. We’re thrilled to have Chris Cox with us from Meta. It’s good to see you.

Welcome to your first keynote presentation here.

Chris Cox, Chief Product Officer, Meta: Yes. It’s a keynote. This is a keynote. I did not prepare a keynote. We’ll call it a fireside keynote. I’ll take it.

Unidentified speaker, Interviewer, Morgan Stanley: We’re really happy to have you. Thank you so much.

Chris Cox, Chief Product Officer, Meta: Yes, thanks for having me. Nice to be here.

Unidentified speaker, Interviewer, Morgan Stanley: I have to do some disclosures first. Please note that all important disclosures, including personal holdings disclosures and Morgan Stanley disclosures, appear on the Morgan Stanley public website at www.morganstanley.com/researchdisclosures. They are also available at the registration desk. Some of the statements made today by Meta may be considered forward-looking. These statements involve a number of risks and uncertainties that could cause actual results to differ materially.

Any forward-looking statements made today by the company are based on assumptions as of today, and Meta undertakes no obligation to update them. Please refer to Meta’s Form 10-K filed with the SEC for a discussion of the risk factors that may impact actual results. So just by way of background, Chris, you’re the Chief Product Officer at Meta. You lead Facebook, Instagram, WhatsApp, Messenger, Threads, the GenAI work, Meta AI Research, and a lot of the privacy work at Meta. I believe you were one of the first 20 people, the first 20 engineers, working at Meta.

That’s right, number 13. Number 13, lucky number 13. You’re in charge of a lot of teams and you have a great perspective of everything that’s been changing at the company. So thank you so much for coming in. Yes, thanks for having me.

So maybe let’s start with sort of what is changing in the AI research and what you’re focused on. It’s just so interesting over the last, call it, 27 to 28 months since we had the GPT launch. There’s been a lot of new innovation, a lot of changes and almost pivots of what happens next, what are we focused on. So I would be curious to sort of ask you, when you look internally at Meta, what has changed in how you are approaching AI research, the teams that are working on GPU-enabled capabilities? What has changed over the last two-plus years?

Chris Cox, Chief Product Officer, Meta: Sure. So I mean, just going back to, I guess, GPT, we’ve had for a while now a fundamental AI research team called FAIR. This team just recently celebrated its 10th anniversary. This is a team that we set up a long time ago to think about doing next-generation research on where AI was going and to do it openly. So this team has contributed a lot to the open source community going back to things like PyTorch.

If you’re an AI developer, there’s a really good chance for the past decade you’ve been using work that was released by our FAIR team. Then as you know, for ads and for Reels, just for ranking and recommendations, there’s just been a ton of AI work we’ve been doing for a long time to run the core business and to make the product good. Past three years, we’ve built up what is now among the leading industry teams on GenAI. So we took a lot of the folks with language backgrounds who were working on LLMs, a lot of the experts inside of the company as well as from outside of the company, to seed the GenAI team, which is now the thing I’m spending almost all of my time on, building out and getting the infrastructure really dialed in. We’ve announced we’re spending $60 billion to $65 billion this year.

A lot of that is on AI infrastructure. Mark shared this picture of a data center, a two-gigawatt data center we’re designing now, which is about the size of Manhattan. So I’m still wrapping my head around that. It’s quite the map, it really is. So we’re sort of building the plumbing we need to be able to train at the frontier.

We’re also building the Llama series. So Llama is our, I’m sure you’ve heard of Llama, but Llama is our foundation model. Llama 3 is the most recent. We released that last April at the frontier with open weights. So we’ve had this sort of dedication to wanting to continue to be open in the way we do research and publish weights.

This then becomes beloved by developers if we do it well. We’ve had over 800 million downloads of the Llama series now, which means a million a day. These are the weights of a large language model; it’s not a consumer piece of software being downloaded. And if you use Zoom, if you use Shopify, if you use Spotify, you’re using instances of Llama. Tons of startups and developers are using Llama.

In the scientific community, folks like the Mayo Clinic are using Llama. There’s just all this huge range and diversity of use cases, which accrues back to our understanding of how to build a frontier model that’s usable across a huge range of domains. And most importantly for us, by us, which gets to Meta AI. So Meta AI is now the most used AI assistant on a monthly basis. We’ve got 700 million monthly users.

It’s embedded into all of our apps, not in Threads, but Facebook, Instagram, WhatsApp and Messenger. WhatsApp being sort of the most commonly used entry point for Meta AI because it’s a messaging app and both voice and chat are the modalities that people expect there. And also if you’re in India or Mexico or Indonesia, there’s a really good chance that WhatsApp is the place you’re the most comfortable having a conversation, whether it’s with text or voice. So Meta AI, our goal has been to sort of build the most freely available intelligent personal assistant that understands where you’re at as a consumer, understands what you know about, what you care about, your reading level, sort of which sources matter to you, how to talk to you about whatever it is that you’re sort of thinking about or working through. And to be able to have that conversation over the span of years versus these sort of single-shot, one-player experiences you kind of have today.

So that’s where we’re going with Meta AI. There’s been a huge amount of work there, just getting that to scale over the past year, and there’s a lot of work happening this year on deepening the richness of the experience, the personality, the personalization, the voice interface, etcetera. And then the last piece I’ll say is just the Ray-Ban Metas. So we have now what we believe is the best AI device on the market, which is the Ray-Ban Metas.

They allow you to talk to them. You can ask questions about the world in front of you. This, it turns out, is an incredibly powerful experience when people are having it. So I would say those are the main pillars of what we’re doing, and then there’s a ton underneath the surface.

Unidentified speaker, Interviewer, Morgan Stanley: That’s good color, right. There’s a lot I want to sort of drill into there. Let me ask sort of one bigger picture question about philosophy on infrastructure build, and how you arrive at $65 billion as the right number. You need a data center the size of Manhattan. There’s all this discussion about the way to win in pre-training is to have the most parameters, the most compute, the most data, and then you have post-training, and now it’s like, I don’t know, maybe not, maybe you can do less upfront and more on the back end with post-training and tuning.

What is Meta’s sort of view on that right now?

Chris Cox, Chief Product Officer, Meta: Well, we’re in a unique situation as Meta, which is that we have a lot of teams that need GPUs. So just putting training of LLMs aside, we have all of this ranking and recommendations for ads, all of this ranking and recommendations for Reels, both in Facebook and Instagram. There’s plenty of teams waiting for GPUs. Right. We have all the fundamental AI research that I talked about that’s not just pre-training of LLMs.

So we are just investing in having really world-class data centers with the best GPUs, the best networking, co-located with the data that they need, to make sure that we are able to be positioned however the cards end up being played on how important is pre-training versus post-training. It turns out the science on both of those is evolving pretty quickly: on data mix, video is starting to get used, synthetic data is getting used a lot more. The architecture is changing, so mixture of experts is sort of the novel architecture that all the frontier models are using. How do you do that effectively while that’s still sort of getting figured out? And then in post-training, you’re obviously starting to see a lot more done not just with humans in the loop, but with other novel approaches.

So for us, it makes sense to invest in part because we want to be able to adjust.

Unidentified speaker, Interviewer, Morgan Stanley: Right.

Chris Cox, Chief Product Officer, Meta: But also because we know that we have plenty of uses for these things regardless of how much the pre training requires.

Unidentified speaker, Interviewer, Morgan Stanley: Got it. Yep. So a lot of uses for GPUs at Meta. Maybe let me dig into a couple of these then. So let’s talk about sort of some of the biggest Llama opportunities off of your core platforms.

When you sort of step back and you say, we want to have the leading open source models with Llama available for all developers. What are the benefits to Meta from that other than the core platform? How do you think about new use cases, ways in which you can sort of drive more digitization of the globe with Llama?

Chris Cox, Chief Product Officer, Meta: Well, so first, I mean, there’s the heritage: a lot of the company was built on open source. So the company I joined, like, you were building everything on open source software. It was Linux, it was Apache, it was MySQL, it was PHP. There’s a certain heritage and soul that comes with open source that’s really powerful for a lot of the best researchers and engineers. They want to work on it.

And so part of what we get that’s not showing up in the code or the product is just you attract a lot of talent who believe that doing this in the open, for safety reasons, for geopolitical reasons, for auditability reasons, is just the right way to do it. So that’s kind of why we’ve stood behind open source, or open weights, here from the beginning: it brings you the type of people that frankly are the type of people who built the company and the type of people that we want to work with. If you look at how it’s being used, I mean, you get lots of people doing distillation in interesting ways. You get lots of contributions back to the science, specific to Llama, but also to the problem in general. When you’re doing stuff out in the open, you start to see in each of these niche areas, similar to what you saw with Linux, you start to see developments around the edges that can accrue back to us, all the way down to the level of, like, silicon optimization.

So those are some of the things you see. I’d say the last piece is businesses using GenAI, which has been fascinating. I mean, we’ve got, I think, 4 million advertisers using these tools for the simple act of, like, creating an image or writing a piece of text. And if you think about the problem of, like, I don’t have a marketing team. There’s hundreds of millions of businesses. I don’t have a marketing team. I don’t know how to translate this message into all of the languages of the people who might want my product or service. This image could be different depending on the audience. Like, these are fascinating, important problems, right, for the big idea that every business could reach its customers wherever they are in the world.

That’s a huge idea. We until now haven’t really had a tool. We could help your piece of content reach the right people, but we weren’t going to help you write the right content or create the right image and then have it register in the right way in rural India, in Lagos, Nigeria, in Paris. You know what I’m saying? Yes.

So that to me is a really powerful big idea. Yep. That’s not about a chatbot; that applies not just for advertisers, but also for content creators, like anybody creating a video for TikTok or for Reels or for YouTube or for whatever. The idea that it could appear in the right language just magically for free, including with things like lip syncing, not just in audio, but in video. That to me is a really big idea.

Yep. Especially for us because we’re in the business of helping billions of people get personalized content experiences. But it’s sort of adjacent to the core LLM sort of thing that we’ve seen so far, which is the chatbot interface that I think is really important.

Unidentified speaker, Interviewer, Morgan Stanley: And that’s like you go back to the beginning of digital advertising, it was just it was more personalized. It was better targeting. And now if we actually have ads that are more personalized, potentially even one-to-one, conversion rates should go up. The relevancy of an ad, for the same set of golf clubs, we may see a very different ad because they know that we have different interests, and lo and behold, conversion goes up. So you add a lot of value for your users and for your advertisers from that.

When you think about sort of the advances in Llama over the generations, I know this year we’re sort of hoping, expecting Llama 4. If you want to announce the date, now is the time to do it. LlamaCon is ahead. From a change and advancement perspective, what are you most excited about with Llama 4, and what new capabilities could be unlocked?

Chris Cox, Chief Product Officer, Meta: Yes. We’re working on Llama 4. No date forthcoming. We’ve finished pre-training the smaller model. The main thing, first, just from an intelligence perspective: we’re trying to pack basically the intelligence of the large Llama 3 series down into really small models, which can then be used with low latency, for low cost, on devices, on a single host.

So basically getting a lot of the intelligence down into the smaller form factor, that’s one of the most important things we can do. The second is just the basis of what’s expected in a new model today, which is reasoning. Agentic use cases, just meaning tool use, ability to use a browser, ability to use other tools. And then the third piece is being an omni model. So basically interacting with image and voice natively.

So rather than translating voice into text, sending text to the LLM, getting text out, turning that back into speech, having speech be native. This is a big deal, right? I believe it’s a huge deal for the interface, the product. The idea that you can talk to the Internet and just ask it anything. I think we are still wrapping our heads around how powerful that is.

Yes.

Unidentified speaker, Interviewer, Morgan Stanley: And I think one of the consumer-facing products we’ve seen already has been Meta AI, which takes advantage of, certainly, Llama 3, 700 million users now. Maybe so let’s talk about Meta AI a little bit. Can you give us some examples of how those current users are using it? What are they asking, what are they searching for? What are some of the recurring use cases you’re seeing in Meta AI so far?

Chris Cox, Chief Product Officer, Meta: Sure. Well, so in the context of Facebook and Instagram, a lot of it is like I want to learn more about this thing I’m looking at. Could be the Oscars. Yep. It could be a news story.

It could be a meme that you don’t totally understand and are afraid to ask your friend. AI is really good as, like, just something you can ask anything about anything. And because in Facebook and Instagram especially you’re discovering a lot of what’s happening today, a lot of it is cultural context. A lot of it is related to news, broadly defined.

A lot of the queries are just learning more about what you’re seeing. We have started to use Meta AI in search results in Facebook and Instagram to summarize results. If you’re trying to wrap your head around what happened on the red carpet, you can just quickly see that on top of all the images and videos that it’s describing. So that’s happening a lot. I would describe that behavior as kind of search 2.0, taking something you would maybe have gone to a search engine for before, but now wrapping it with a ton of rich context and also coming to you with the context of what you’ve already seen and looked at.

So that’s happening. It’s powerful. It’s big. It’s something everybody understands because of the product. I’d say the second one is a little more like an ongoing conversation with an assistant.

And I think you see this across LLMs is the idea that you’re constantly thinking about one or two things. You know, I’m a marathoner, me personally, so I’m like constantly like nerding out about running. Mhmm. I play piano, so I’m like constantly trying to figure out what I wanna learn next. And those are questions that I can get like a pretty basic response if I just ask the Internet.

But if I have an ongoing conversation about it, it turns out that an assistant can learn in a rich way how to talk to me about my journey as a runner, right, or whatever it is. My son is a 10 year old. He’s really into Dungeons and Dragons. Yeah. So as we are as 10 year olds, right?

Or 40 year olds. Or four year olds. I never played, but apparently people still play. Yeah. So one of my favorite things to do with him when I’m driving him to school is I’ll open Meta AI and I’ll turn on voice mode, and I’ll just be like, alright, you can ask it anything.

So you can start with, like, what happened in the news today? Oh, well, this thing happened in Congress. Oh, well, how does that normally go in Congress? And it’ll just, like, start answering, and it’s like, wow. He’s in the back like, well, yeah, it makes sense that you can do this. Right.

And I’m just like, this is the same. You can just have an interactive conversation about anything, including the news today. Yep. But when his eyes really lit up is when it’s like, alright, talk to it about D&D. And it’s like, alright, well, I’m a level four Druid.

Right. Alright. 99% of humans you tell that to are going to be like, I have no idea. Of course, Meta AI is like, oh, you’re a level four Druid, like which subclass did you choose? Right.

And it’s like, well, here’s the subclass. So then it’s like, alright, you can take a deep topic that is very niche and get into a very specific long conversation. That to me is a really powerful idea for the product. And to get back to your question, those are the kind of capabilities we’re going to unlock as we bring voice online to be very native and also as you get the sort of personalization factor happening for everybody.

Unidentified speaker, Interviewer, Morgan Stanley: How do you think about it? I know, philosophically, one of the objectives is to build new use cases and drive utility and then over the long term drive monetization. But monetization is never a front burner. Search does open the door for potential monetization, I would argue. So walk us through how you think about the potential of monetizing Search.

You’re a big runner. You’re going through your Instagram feed, you see reels of people running marathons. Couldn’t you have Meta AI ask you, want to buy a new pair of shoes, want to plan a marathon trip? There are monetizable questions it could ask you. Where are you on that philosophically?

Chris Cox, Chief Product Officer, Meta: So yes, you could ask all of that. The hard problem would be how to have it not be annoying, right? But that’s the kind of thing we’re looking at. It’s like, what is the right way to have an ongoing conversation with a person about what they care about at the right time. We would start with what’s the right way to do that before we get to how to monetize it.

Yep. It’s not a priority right now to, like, have people having conversations with Meta AI that we can monetize. Yep. The priority is make the product work for everyone. But as I said before, we’re having a ton of success already with advertisers using tools to create ads.

We have a beta for businesses using basically Llama as a delegate, so that a business can delegate an AI for the person to chat with in WhatsApp or Messenger. So we have hundreds of millions of businesses using WhatsApp to chat with customers. They don’t want to deal with every single customer service request. They’re answering the same question hundreds of times, so to be able to automate that is a huge deal, obviously. So that would be the next most obvious place we would look to, to say that’s a big business.

Unidentified speaker, Interviewer, Morgan Stanley: Got it. Okay. The other thing is to sort of step back to your earlier point about the amount of opportunities for GPUs at the company. Last year, you talked about how 30% of the posts people see on Facebook are delivered through the AI recommendation engines, and 50% of the content people are seeing on Instagram is AI recommended. That’s leading to increases in time spent, increases in click-through rates, we think.

Where are you in that journey now of just sort of taking your leading corpus of data and running it through GPU enabled machine learning to drive more engagement and monetization opportunities?

Chris Cox, Chief Product Officer, Meta: I mean, we’ve been doing that for a while. The whole Reels journey was, how do we get into position as a contender for the best in the world, alongside ByteDance, at being able to solve this needle-in-the-haystack problem of, here’s 100 million videos, for each person show them exactly the right one, right? Hard problem. When you do it right, it is magic. When you don’t do it well, people bounce and it’s like, this thing is lame, right?

Turns out, one of the nice properties about it is it’s not just engaging, but it creates sharing and social behavior just automatically when you get it right. And so we’ve been looking for a while, cool, let’s make sure Reels is growing. Let’s make sure we’re getting in line to be best in the world at the tech, the tech and product. Even more, even a sharper curve in the right direction is how much are people sharing this content. So are we doing a better job of showing people content that they end up talking about with their friends?

We know this to be true. If you see a funny SNL clip, if you see a poignant story that touches you about something you care about, whatever it is, you talk about it, period. We talk about even the news or I’m using the Oscars example because it’s recent. We talk about it through the lens of like the moments that resonated with us that stuck with us. And so those are getting shared around and that’s part of what’s making Facebook and Instagram work in a modern world, which is people want to discover video and they want to talk about it.

Turns out that’s what those apps do. Threads is a fascinating opportunity, like how do you do this well for text? Right, we all see the proof of concept for short form video. It’s hours a day between us and TikTok and YouTube. For text, it’s a fascinating opportunity.

In China, you had this app called Toutiao, which was the predecessor to TikTok before the Musical.ly acquisition. It was a text-based and link-based aggregator of content using sort of modern techniques, so GPUs, sparse models, etcetera. And it was a really popular product in China. We’ve never really seen that play out in the West. We have X, we have Threads, you have Bluesky, you have other options for sort of consuming text-based real-time information. We’ve never really seen the full power of an amazing recommender system plugged into that, which I think is a really big opportunity for that whole space.

Unidentified speaker, Interviewer, Morgan Stanley: Got it. That’s helpful. And then I’ll watch the Threads growth to see if you continue to scale that business up from a user perspective. You mentioned business messaging earlier. You’ve disclosed business messaging on WhatsApp as the main driver in other revenue, which is now about a $2 billion run rate.

Walk us through sort of the vision for how you think about multimodal models unlocking new business messaging and business agent capabilities and where are you making the most progress there from an industry or geographic perspective?

Chris Cox, Chief Product Officer, Meta: Well, if you drive around in, like, Brazil, if you just drive around, you’ll see that WhatsApp is, like, the way. It’s just the way businesses are asking you to communicate with them. Mexico is like this, India is like this. These are countries where the order in which people came onto the Internet versus onto WhatsApp was inverted, or at least happened simultaneously. So you have really different dynamics.

Like I said, hundreds of millions of businesses use the WhatsApp SMB app, which is an app. So this is a massive app that we operate, which is just for small businesses to use WhatsApp to communicate with their constituents. You go to, like, the Delhi train station and you walk up to the guy selling chai, he’s using WhatsApp to tell people when he’s open, to tell people when he has a special, etcetera, going all the way up to banks. Banks in Brazil use WhatsApp as a way to help people do basic bank services instead of, like, downloading an app and each bank building an app for doing all the things. So it’s a huge thing.

This is sometimes lost on Americans, which is why I give these examples. If you’re asking about the business agent idea and how multimodal can help, a few things come to mind. One is, like, sending images of, like, your receipt. I mean, how much time do we spend typing in things from an image we’re looking at into a computer? A lot.

The idea of sending those images and just having it understand all of them automatically is a huge deal from a friction perspective. Audio, the idea of talking to any business in any language and having it understand and talk back to you in a way that’s coherent with its own values and services is an insanely powerful idea. Those aren’t coming next week, but I think like the product opportunity is pretty obvious for us.

Unidentified speaker, Interviewer, Morgan Stanley: Yes. And how let me ask you one about wearables then. You mentioned the Ray Bans earlier. We also have Orion somewhere on the burner to come. So how do you think about the next innovation on the Ray Bans set of glasses and the set of devices that you’re most excited about to sort of get us closer to that agent where I can interact with businesses or people?

Chris Cox, Chief Product Officer, Meta: Sure. I mean, some of the basics are just, like, more types of models and styles. If you’re an athlete, you’re probably not running around in a pair of Wayfarers. Or if you are, that’s cool. And similarly, if you’re, like, a diva, maybe you’re wearing some higher-end glasses.

So there’s this whole question for us of, like, how do you enable this on a bunch of different styles. Because it’s eyewear, style matters, like, an immense amount. So that’s the thing. And then there’s specific software for each of these types of use cases. So, like, the kind of things you want to do if you’re a runner are a little bit different.

Yes. What else can I say about that? I mean, a lot of what we’re discovering is that the product is, like, super retentive, which in product land is: four months after the day you bought the thing, did you use it that day? It’s kind of like the ultimate measure of does the product matter to you, did you use it on the day, you know, four months after day one? And then we see with the Ray-Bans really, really, really healthy retention, not just for the glasses themselves, but also for the various use cases, like making calls.

People, it turns out like making calls without earbuds in. Right. Sounds obvious, but it’s like, oh, this is actually really refreshing.

Unidentified speaker, Interviewer, Morgan Stanley: Yeah.

Chris Cox, Chief Product Officer, Meta: Listening to music is similar, not having the occlusion in the ear. And then there’s talking to the glasses about whatever is in front of you. It takes a little bit to get used to, but soon you understand that you can just ask about what you’re looking at and get answers. Travel becomes really cool when you’re in a foreign country and you’re like, what does that say? What does that say? What did that person just say? These are powerful use cases.

So these are the things we’re pushing forward. I mean, we’re still in the zone of just making sure this is available everywhere and that everybody who wants it can get it, right? That’s a high-class problem versus creating desire for the product. You’re starting to see a lot of love for the product.

Unidentified speaker, Interviewer, Morgan Stanley: Talk a little bit about the wristband technology. I know it’s with Orion now, but I think this is one of the underappreciated parts of what you’re working on: the way you’re tapping into some underutilized nerves so that you can send messages without actually texting. Just walk us through that.

Chris Cox, Chief Product Officer, Meta: Well, they’re utilized nerves. I mean, they’re your nerves. It’s an EMG wristband. So you’re basically observing the electrical activity underneath the skin in order to understand gesture. And the reason you want to do that is, in the future, when this is how you’re doing computing, you’re not going to want to wave your hands around while you’re walking around to use your computer.

I think. I hope. And so it’s like, well, what’s the ideal input for a computer that you’re wearing in front of your eyes? The first and most obvious one is voice: you just want to be able to talk to it. But a lot of the time, you don’t want to talk to your computer.

If you were using one right now, in front of everybody here, it would be really awkward. And so you get to: I want a really subtle but reliable gesture, like using my fingertips. And you want to do it in a way that is as subtle as possible but 100% repeatable, so you’re not mistaking one gesture for another, because that would be as frustrating as a mouse going to the right half the time.

Right. So that’s what we’ve built. So the EMG wristband allows you to just use really, really subtle gestures with your fingertips and control a cursor on the screen.

Unidentified speaker, Interviewer, Morgan Stanley: Got it. Hopefully we see more from that in the next few years. I want to also ask you about custom silicon and what you’re working on with MTIA. You’ve talked on prior conference calls, including the most recent one, about ways in which you’re taking workloads on the core platform and moving them over to custom silicon.

So what have you learned from that so far? Where are you on that journey of running more of your core platform through MTIA? And ultimately, do you do your own training through it as well, or is that a longer-term vision?

Chris Cox, Chief Product Officer, Meta: Not yet, but we’re working on it. So far, what we’ve announced and built and developed at scale is inference for recommendations. Artemis is the name. This allows us to take the workload when we’re delivering an ad, or the workload when we’re delivering a short-form video, and put that on the custom silicon that we designed ourselves.

TSMC still makes it, but we’re doing the design. That brings the cost down, which is good. We’re working on how we would do training for recommender systems, and then eventually how we think about training and inference for GenAI. Those are the four categories of silicon workloads that we think about. So far it’s kind of a crawl, walk, run situation, but the first generation has been a big success.

Unidentified speaker, Interviewer, Morgan Stanley: Okay. I think for Alphabet it took them eight to ten years on the TPU front. Is that the right way to think about this? Is this a decade-long process, or should it be faster now that we’ve had more generations of innovation?

Chris Cox, Chief Product Officer, Meta: Well, silicon design is something you have to go get good at, and depending on how much of the apple you want to bite off, it takes years to get good at it. We know a lot about inference for ranking and recommendations because we just do so much of it; if you were to measure it, we’re probably doing a huge share of that computation on earth each day.

And so that gives us a unique advantage in developing the chips that are going to be good at that. I can’t speak for the rest of the industry generally on that.

Unidentified speaker, Interviewer, Morgan Stanley: I want to end with one big-picture question, because it has been a wild two-plus years since the beginning of the GenAI cycle. You are involved in so much research and so much leading-edge work on what could come next. As you sit and look at all the teams working on GenAI, what in your mind is the most underappreciated opportunity, the one that’s not written about or talked about, that you’re excited about when you go home and talk to your wife and kids? And then what is the challenge that you think people are missing and don’t talk about?

Chris Cox, Chief Product Officer, Meta: I mean, I told this story about my son. To me, it’s kind of like when the Internet was happening at the beginning: it took people a while to wrap their heads around how big a deal it was. It took all of us a while. And every once in a while you get a product category where you see the inklings and it’s like, this is completely insane.

And yet, no one knows how to use it. In those instances, you learn a lot by just watching the ways usage evolves organically inside of various communities. It’s a really fascinating way to do product development. Social media was like this.

Facebook was successful enough that we were able to watch it: what does Facebook look like among older people? What does it look like in a high school? What does it look like in Colombia? We didn’t know the answer to that, but it turns out the answer starts to discover itself if you’re paying attention.

We had this really funny moment with a huge spike in Meta AI usage in India over Valentine’s Day, like, what is going on? Why is India using Meta AI ravenously on Valentine’s Day? Well, it turns out people in India had discovered that you can say "imagine me," which is the feature we launched last year. We’d promoted it and explained it, and people were like, cool. It allows you to create an image with yourself in it. Pretty simple.

We’re kind of the only one that does it. But people started putting Imagine Me in a Valentine’s Day card and then sending it to their friends. It went absolutely nuts. At one moment in time, everybody discovered this tool. And then after Valentine’s Day passed, the usage persisted.

It’s like a whole market discovered a use case at the same time, and now it moves on. I think a lot of the development is going to happen that way, which is, you can’t just tell everybody, this is a chatbot that understands everything, here you go. You’re a paralegal, what should I do with it? Talk to other paralegals, right? I think that’s going to be a bit of how the development happens here.

I find that to be somewhat misunderstood and also incredibly important to not just building intelligent things, but actually building things that people understand how to use and want to use every day.

Unidentified speaker, Interviewer, Morgan Stanley: And the challenge?

Chris Cox, Chief Product Officer, Meta: I think we’re going to need to be patient. There’s a lot of hype; it’s a hype cycle. We’ve all seen it before: this thing is the greatest thing ever, it’s coming next year.

I remember I was part of the Stanford AI Lab when we built the self-driving car that won the DARPA Grand Challenge in 2005. This was a car that could drive itself across Death Valley, like a hundred miles, fully automated. Sebastian Thrun was leading the robotics team.

This was 2005, and everybody was like, self-driving cars are here, they’re coming, it’s going to be three to five years. And then three to five years pass, and it’s still three to five years; everything is three to five years away. People need to be patient with technology and have some amount of dispassion: some things are going to happen really fast, and some things that seem easy will take a long time, because edge cases are crazy hard, especially when you’re dealing with something like a car driving itself. But even when we talk about intelligence, there are so many pieces of what we’re saying that are very subtle, that are very deeply human, and are not just about doing really well at math.

And I think that surface area around intelligence is going to be the stuff that takes a lot longer. And that’s where I think the patience is going to be important.

Unidentified speaker, Interviewer, Morgan Stanley: All right. Well, Chris, it was a great conversation. Thank you so much. We look forward to seeing what you do in the next five, ten, twenty years.

Chris Cox, Chief Product Officer, Meta: Thanks a lot.

This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.
