Meta and Arm: AI on Smartphones

Making AI more accessible and efficient with enhanced small language models (SLMs).

Meta is expanding its AI vision through a major new partnership with chipmaker Arm. The two companies are teaming up to develop AI models tailored specifically for smartphones and other devices.

The collaboration was announced at Meta Connect 2024 by Meta CEO Mark Zuckerberg, who said the company's focus would be on making AI more efficient by working on small language models (SLMs). These models would enable fast on-device processing and edge computing, reducing AI inference latency and making interaction with AI assistants far more seamless.

Focus on Smaller Language Models

The new AI models Meta is building are intended for small devices such as smartphones and tablets, where traditional large language models cannot run effectively. Meta's Llama 3.2 1B and 3B models are the ideal candidates for this. Arm's role in the partnership is crucial: the models are being optimized for Arm processors so they can run efficiently across a wide range of on-device workloads.
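
For context, models of this size are compact enough to be loaded and run locally with off-the-shelf tooling. Below is a minimal sketch using the Hugging Face transformers library; the model ID, dtype, prompt, and generation settings are illustrative (the Llama 3.2 weights are gated and require accepting Meta's license on Hugging Face), and the Arm-specific optimizations from the partnership are not represented here.

```python
# Minimal sketch: running a small Llama model locally with Hugging Face transformers.
# Model ID and settings are illustrative; Llama 3.2 weights are gated and require
# accepting Meta's license on Hugging Face before downloading.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B-Instruct"  # ~1B parameters, small enough for edge-class hardware

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce the memory footprint
    device_map="auto",          # falls back to CPU if no GPU is available
)

# Chat-style prompt formatted with the model's own chat template.
messages = [{"role": "user", "content": "Summarize why on-device inference reduces latency."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Running the entire pipeline locally like this, rather than calling a hosted model, is what removes the network round trip and keeps inference latency low on the device itself.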

Improving the User Experience

Ragavan Srinivasan, Meta's VP of product management for generative AI, spoke at length about improving the user experience. The company's vision is for more intuitive and responsive AI models, which could mean AI assistants that can make calls or take pictures with little prompting.

Plans for the Collaboration

Meta and Arm hope to develop AI tools that not only raise the bar for user experience but also help developers create new, creative applications for intelligent devices. The partnership is another step toward Meta's broader mission of AI innovation. Meta also unveiled its first Orion AR glasses prototype at Connect 2024.
