Investing.com -- Meta Platforms announced it is expanding access to its Llama AI models to more key US allies for defense and national security applications.
The company is extending access to France, Germany, Italy, Japan, and South Korea, as well as to NATO and European Union institutions. This follows access previously granted to US government agencies and the Five Eyes security partners: Australia, Canada, New Zealand, and the UK.
Meta highlighted that Llama is particularly suitable for sensitive security applications because, as an open-source platform, it can be securely downloaded and deployed without routing sensitive data through third parties. Government agencies can fine-tune the models on their own national security data and deploy them in secure environments.
The company stated that Llama has already been used to develop advanced AI tools for the US military and national security agencies. One example is Meta’s work with the Army’s Combined Arms Support Command on a project using AI and augmented reality to speed up equipment repairs.
To deliver Llama-based solutions to these allies, Meta is partnering with companies including Accenture, Amazon Web Services, AMD, Anduril, Lockheed Martin, Microsoft, and Palantir.
Meta is also supporting US national security through its partnership with Anduril to develop wearable products for US soldiers, described as "the largest effort of its kind" to equip troops with enhanced perception capabilities.
The company emphasized that the widespread adoption of open source models like Llama is essential for maintaining US and allied AI leadership, aligning with the US government’s AI Action Plan for America.
Meta stated it is taking a gradual approach to extending Llama access for defense purposes and will consider adding more countries in the future in consultation with the US government.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.