ChatGPT kicked off the AI chatbot race, and Meta is determined to win it. The company is weaving its AI assistant into Facebook, Instagram, and WhatsApp, and Llama 3, its next major AI model, has arrived.
To that end, the Meta AI assistant, first unveiled last September, is being built into the search boxes of Facebook, Instagram, WhatsApp, and Messenger. It will also start appearing directly in the main Facebook feed. You can still chat with it in the messaging inboxes of Meta's apps, and for the first time it is now available via a standalone website at Meta.ai.
For Meta's assistant to have any chance of competing with ChatGPT, the underlying model needs to be at least as good. That's why Meta is also announcing Llama 3, the next major release of its open-source foundation model. Meta says Llama 3 beats rival models in its class on key benchmarks and is better overall at tasks like coding. Two smaller Llama 3 models are available now, both in the Meta AI assistant and to outside developers, while a much larger, multimodal version will arrive in the coming months.
CEO Mark Zuckerberg says the goal is for Meta AI to be “the most intelligent AI assistant that people can freely use across the world.” “We essentially feel like we are there with Llama 3,” he says.
Meta AI is the only chatbot that blends real-time search results from both Google and Bing; Meta decides which search engine to use based on the prompt. Its image generation has also been upgraded: high-resolution images now appear automatically as you type, and it can produce animations, which are effectively GIFs. And a Perplexity-inspired panel of suggested prompts greets you when you first open a chat window; the goal, says Ahmad Al-Dahle, Meta’s head of generative AI, is to “demystify what a general-purpose chatbot can do.”
Until now, Meta AI was available only in the United States. It’s now rolling out in English to Australia, Canada, Ghana, Jamaica, Malawi, New Zealand, Nigeria, Pakistan, Singapore, South Africa, Uganda, Zambia, and Zimbabwe, with more countries and languages to come. The rollout still falls well short of Zuckerberg’s vision of a truly global AI assistant, but it puts Meta AI closer to eventually reaching the company’s more than 3 billion daily users.
There’s a comparison to be made with Stories and Reels, two era-defining social media formats that were pioneered by upstarts (Snapchat and TikTok, respectively) and then added to Meta’s apps in ways that made them even more ubiquitous.
Some would call that shameless copying. But Zuckerberg sees Meta’s scale and its ability to adapt quickly to trends as its competitive edge. And he’s playing the same game with Meta AI: flooding the market while making big bets on foundation models.
“When considering the primary AI assistants that people use today, I don’t think many people really think about Meta AI,” he acknowledges. “But we believe this is the moment to start genuinely introducing it to a lot of people, and I expect it to be quite a significant product.”
“Take on everything in the world.”
Meta is releasing two Llama 3 models today for outside developers to use freely: an 8-billion-parameter model and a 70-billion-parameter one, both of which will be available across the major cloud providers. (Roughly speaking, a model’s parameters dictate its complexity and its capacity to learn from its training data.)
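For developers who want to try the openly released weights, a minimal sketch along these lines should work with the Hugging Face transformers library, assuming the smaller model is hosted under an identifier like meta-llama/Meta-Llama-3-8B-Instruct (the exact model ID is an assumption here, and access is gated behind accepting Meta’s license on the model page):

# Minimal sketch (not Meta's official example): loading the openly released
# 8B-parameter Llama 3 weights with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # assumed hub identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Ask the model a simple question and print its reply.
prompt = "Explain what a parameter is in a large language model."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))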
Llama 3 is a good illustration of how quickly these AI models are scaling. The largest version of Llama 2, released last year, had 70 billion parameters, while the coming large version of Llama 3 will have around 400 billion, according to Zuckerberg. Llama 2 was trained on 2 trillion tokens (essentially the words, or units of basic meaning, that make up a model’s training data); the large version of Llama 3 is trained on nearly 15 trillion. (OpenAI has yet to publicly disclose the number of parameters or tokens in GPT-4.)
A major goal for Llama 3 was reducing false refusals, the times a model claims it can’t respond to a prompt that is actually harmless. Zuckerberg gives the example of asking it to make a “killer margarita.” Another is an example I gave him in an interview last year, back when Meta AI wouldn’t tell me how to break up with someone.
Meta hasn’t yet decided whether it will publicly release the 400-billion-parameter version of Llama 3, since the model is still being trained. Zuckerberg downplays the possibility that it won’t be open source because of safety concerns.
Nothing that he or others in the field are working on in the coming year, in his view, is really in the ballpark of those kinds of risks. “Therefore, I think we can make it open source.”
Before the most advanced version of Llama 3 ships, Zuckerberg says to expect more iterative updates to the smaller models, such as longer context windows and more multimodality. He’s vague on exactly how that multimodality will work, but it doesn’t sound like generating video in the vein of OpenAI’s Sora is in the cards yet. Meta wants its assistant to become more personalized over time, which could eventually include generating images in your own likeness.
Meta gets cagey about the details of the data used to train Llama 3. The full training dataset is seven times larger than Llama 2’s, with four times more code. No Meta user data was used, despite Zuckerberg’s previous boast that the company’s corpus is larger than all of Common Crawl. Instead, Llama 3 draws on a mix of “public” internet data and AI-generated, synthetic data. Yes, AI is being used to build AI.
Even as Llama 3 puts Meta back atop the open-source leaderboard, AI models are evolving so quickly that it’s hard to say how long that will last. OpenAI is reportedly working on GPT-5, which could leapfrog the rest of the industry once again. When I ask Zuckerberg about this, he says Meta is already thinking about Llama 4 and 5. To him, it’s a marathon, not a sprint.
“Our objective at this point is not to outperform the open-source models,” he says. “It’s to compete with everything out there and to be the best AI in the world.”