Meta AI, Meta’s AI-powered assistant across Facebook, Instagram, Messenger, and the web, can now speak in more languages and create stylized selfies. And, starting today, Meta AI users can route questions to Meta’s newest flagship AI model, Llama 3.1 405B, which the company says can handle more complex queries than the previous model underpinning Meta AI.
The question is whether the enhancements will be enough to improve the overall Meta AI experience, which many reviewers, including TechCrunch’s Devin Coldewey, found incredibly underwhelming at launch. The early iterations of Meta AI struggled with facts, numbers, and web search, often failing to complete basic tasks like looking up recipes and airfares.
Llama 3.1 405B could make a difference. Meta claims that the new model is particularly adept at math and coding, making it well-suited for helping with math homework, explaining scientific concepts, debugging code, and so on.
However, there’s a catch. Meta AI users have to manually switch to Llama 3.1 405B in order to use it, and they’re limited to a certain number of queries each week before Meta AI automatically switches over to a less-capable model (Llama 3.1 70B).
Meta’s labeling the Llama 3.1 405B integration as a “preview” for the time being.
Generative selfies
Llama 3.1 405B isn’t the only new generative AI model in Meta AI. Another, which powers the assistant’s new selfie feature, arrives alongside it.
The model, called Imagine Yourself, creates images based on a photo of a person and a prompt like “Imagine me surfing” or “Imagine me on a beach vacation.” Available in beta, Imagine Yourself can be invoked in Meta AI by typing “Imagine me” followed by anything that isn’t NSFW.
Meta didn’t say which data was used to train Imagine Yourself, but the company’s terms of use make it clear that public posts and images on its platforms are fair game. That policy — and the convoluted opt-out process — hasn’t sat well with all users.
Joining Imagine Yourself in Meta AI are new editing tools that let users add, remove, or change objects in images with prompts like “Change the cat to a corgi.” Starting next month, Meta AI will get an “Edit with AI” button that pulls up additional fine-tuning options. And in the coming days, Meta AI users will start seeing new shortcuts for sharing Meta AI-generated images to feeds, stories, and comments across Meta apps, Meta says.
New languages and Quest support
Meta AI is also replacing the Meta Quest VR headset’s Voice Commands feature, with the rollout scheduled for next month in the U.S. and Canada in “experimental mode.” With passthrough enabled, users will be able to ask Meta AI questions about things in their physical surroundings, Meta says — for example, “Look and tell me what kind of top would complete this outfit” while holding up a pair of shorts.
As of today, Meta AI is available in 22 countries, Meta says — new additions include Argentina, Chile, Colombia, Ecuador, Mexico, Peru, and Cameroon. The assistant now supports French, German, Hindi (including Romanized script), Italian, Portuguese, and Spanish, and Meta promises that more languages are on the way.