LLaMa 2 Insights

Meta, in partnership with Microsoft, is releasing LLaMA 2 – its large language model, trained on 40% more data than its predecessor – as open source.

The announcement was made during the Microsoft Inspire event, where both companies spoke of a “growing” partnership. The open-sourced LLaMa 2 is already available via the Azure platform, as well as Amazon Web Services, Hugging Face, and other providers.

Qualcomm, in turn, announced that it is collaborating with Meta to bring LLaMa to laptops, phones, and headsets starting in 2024, enabling AI-based applications that run on-device without cloud services.

According to Meta, LLaMa 2 was trained on 40% more data than LLaMa 1. The company’s VP of AI, Ahmad Al-Dahle, says two kinds of data were used: publicly available data from the internet and a fine-tuning dataset built from tester feedback. Meta reportedly did not use its own users’ data in LLaMa 2 and excluded data from sites known to contain large amounts of personal information.

The large language model also reportedly “outperforms” other open LLMs, such as Falcon and MPT, on reasoning, coding, proficiency, and knowledge benchmarks.

Meta says it received over 100,000 requests from researchers to use the first model, but the open-sourced LLaMa 2 is likely to see much wider reach. The model is available in three sizes: 7B, 13B, and 70B parameters.

“We believe that an open approach is the right one for the development of today’s artificial intelligence models, especially in the generative space, where the technology is rapidly evolving. An entire generation of developers and researchers can stress-test these models, quickly finding and solving problems,”

reads a statement from Meta.

Meta first released its LLaMa model in February as an open package that members of the AI research community could request access to. However, a week after the company began accepting requests, a torrent of the language model appeared on 4chan and later spread to other social networks.
