OpenAI announced on Tuesday the release of two open-weight AI models, a surprising move that returns the company to its early roots. The two new models, GPT-oss-120b and GPT-oss-20b, are freely available to download from the Hugging Face platform, and OpenAI says they deliver state-of-the-art performance among open models on several benchmarks.
This announcement represents a significant shift for OpenAI, which has moved away from open source in recent years to focus on closed, commercial models. The GPT-oss release aims to provide a powerful open-source alternative for developers and researchers, while also promoting transparency and collaboration in the AI field.
The larger GPT-oss-120b model can run on a single high-end Nvidia GPU, while the smaller GPT-oss-20b can run on a consumer laptop with 16 GB of RAM. This range of hardware requirements puts the models within reach of a wide spectrum of users.
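To make the accessibility point concrete, here is a minimal sketch of how a developer might run the smaller model locally with the Hugging Face transformers library; the repository name "openai/gpt-oss-20b" is an assumption about how the weights are published, so check the hub listing for the exact identifier.

```python
# Illustrative sketch: running the smaller open model locally with the
# Hugging Face transformers library. The model ID "openai/gpt-oss-20b" is an
# assumption about how the weights are published on the hub.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # assumed Hugging Face model ID
    device_map="auto",           # spread weights across available GPU/CPU memory
)

prompt = "Explain the Mixture-of-Experts architecture in two sentences."
result = generator(prompt, max_new_tokens=128)
print(result[0]["generated_text"])
```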
According to OpenAI, the GPT-oss models can hand complex requests off to cloud-based AI models for execution: when a task is beyond what GPT-oss can handle locally, such as processing images, developers can connect it to OpenAI's more powerful closed-source models.
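One way to picture that hand-off, purely as an illustrative pattern rather than OpenAI's actual mechanism, is a small router that keeps text-only requests on the local open model and forwards anything involving images to a hosted model via the OpenAI API; the helper names and the choice of hosted model below are hypothetical.

```python
# Illustrative routing sketch, not OpenAI's actual hand-off mechanism.
# Text-only requests stay on the local open-weight model; requests that need
# image understanding are forwarded to a hosted model through the OpenAI API.
# The helpers (needs_vision, run_local) and the hosted model choice are hypothetical.
from openai import OpenAI

cloud = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def needs_vision(request: dict) -> bool:
    """Hypothetical check: does the request include image inputs?"""
    return bool(request.get("images"))


def run_local(prompt: str) -> str:
    """Placeholder for local GPT-oss inference (e.g. via transformers or vLLM)."""
    raise NotImplementedError("wire this up to your local GPT-oss runtime")


def answer(request: dict) -> str:
    if needs_vision(request):
        # Hand the task off to a closed, hosted multimodal model.
        response = cloud.chat.completions.create(
            model="gpt-4o",  # assumed choice of hosted model
            messages=[{"role": "user", "content": request["prompt"]}],
        )
        return response.choices[0].message.content
    return run_local(request["prompt"])
```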
This announcement comes at a time when OpenAI is facing increasing competition from Chinese AI labs such as DeepSeek, Alibaba's Tongyi (Qwen) and Moonshot AI, which have released several world-leading open source models. In addition, OpenAI is facing pressure from the US government to increase open source sharing to promote AI technologies that reflect American values.
By releasing GPT-oss, OpenAI is seeking to win the support of both the developer community and policymakers. OpenAI CEO Sam Altman said the company's mission is to ensure that artificial general intelligence (AGI) benefits all of humanity, and that it is excited to see open-source AI built on an American, democratic foundation.
The GPT-oss models have posted promising results in benchmark tests, outperforming other open-source models such as DeepSeek R1 on competitive-programming benchmarks. However, they still lag behind OpenAI's closed-source models such as o3 and o4-mini on some tasks.
One important caveat is that the GPT-oss models show a higher hallucination rate than OpenAI's latest closed models, meaning they are more prone to producing inaccurate or fabricated information.
The GPT-oss models were trained with a process similar to the one used for OpenAI's closed-source models. They are based on a Mixture-of-Experts (MoE) architecture, which improves efficiency by activating only a fraction of the model's parameters for any given token. Both GPT-oss-120b and GPT-oss-20b are released under the Apache 2.0 license, a permissive open-source license that allows companies to use the models in commercial applications without seeking permission or paying fees.
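The efficiency claim behind Mixture-of-Experts is that a router selects only a few expert sub-networks per token, so most parameters stay idle on any given forward pass. The toy sketch below illustrates that idea with made-up sizes; it is not the gpt-oss configuration.

```python
# Toy Mixture-of-Experts layer: a router scores the experts for each token and
# only the top-k experts actually run, so most parameters stay idle per token.
# Sizes and top_k below are made up for illustration, not the gpt-oss config.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyMoE(nn.Module):
    def __init__(self, d_model: int = 64, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # produces one score per expert
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, d_model)
        scores = self.router(x)                           # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)    # keep only the top-k experts
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                   # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out


tokens = torch.randn(5, 64)
print(TinyMoE()(tokens).shape)  # torch.Size([5, 64]); only 2 of 8 experts ran per token
```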
In conclusion, OpenAI's release of GPT-oss represents an important step towards increasing transparency and collaboration in the field of AI. While there are some limitations, these open-source models provide developers and researchers with powerful tools to explore and develop new AI technologies. Understanding the nuances between closed and open source development can allow researchers to better leverage AI models. It is important to assess the risk and rewards of each decision.