Meta AI, formerly Facebook AI Research, open-sources OPT-66B, its largest unrestricted open-source model to date

Meta AI, formerly known as Facebook AI Research (FAIR) before Facebook renamed itself Meta Platforms to focus on the metaverse, has recently open-sourced its OPT-66B machine learning model. This comes after Meta AI made OPT-175B, a 175-billion-parameter model, available to the wider AI research community last month.

Open Pre-trained Transformers (OPT) is a family of NLP models trained on billions of tokens of text obtained from the internet. The family spans models ranging from 125 million parameters in OPT-125M to 175 billion parameters in OPT-175B. In addition to open-sourcing the models, Meta AI also released the logbooks used for training all of its baselines, from the 125-million-parameter OPT-125M through the 66-billion-parameter OPT-66B.
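
For readers who want to experiment with the smaller checkpoints, the sketch below shows how one of the OPT models could be loaded for text generation. It assumes the weights are available through the Hugging Face transformers library under names such as "facebook/opt-125m"; that distribution channel and checkpoint name are assumptions, not something the article itself specifies.

```python
# Minimal sketch: load a small OPT checkpoint and generate text.
# Assumes the weights are published on the Hugging Face Hub as "facebook/opt-125m";
# larger variants would follow the same pattern.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Open Pre-trained Transformers are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```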

There has been some concern in Artificial Intelligence and Deep Learning circles about the misuse or malicious use of Deep Learning models, which has kept notable models such as OpenAI's GPT-2 and GPT-3 from being fully open-sourced. This is despite OpenAI having been founded on the idea of collaborating with other institutions and researchers by making its patents and research open to the public.

With the open-sourcing of its models, Meta AI has set the bar high for organizations like OpenAI that work on Artificial Intelligence and Deep Learning but are reluctant to open-source their models, training data, and methodology, or to provide access to their models for research and exploration.

Information regarding all models except OPT-175B can be found in the GitHub repository. The largest OPT model, consisting of 175 billion parameters, is not yet open-sourced, but access to it can be requested by filling out a Google form.
