US Non-Profit Challenges China's AI Dominance With New Open Models
- tech360.tv
A United States non-profit organisation has released "fully open" artificial intelligence models, aiming to challenge Chinese dominance in the open-source AI sector. The Allen Institute for AI, known as Ai2, released its models with the full training data and pipelines available for public inspection.

This approach goes beyond the common practice of Chinese open-source AI developers, who typically make only the model weights available for viewing and modification.
Ai2 stated that increased transparency could enhance user trust, especially as AI systems are increasingly used for critical services. Its flagship 32-billion-parameter Olmo 3-Think model reportedly narrowed the performance gap with comparable Chinese models, such as Qwen3-32B from Alibaba Group Holding's Alibaba Cloud, despite training on approximately six times fewer tokens.
Ai2's models, which range from 7 billion to 32 billion parameters, are small enough to run locally on consumer-grade hardware. Chinese open models have seen broad global adoption due to their lower cost compared to proprietary US systems.
Many US start-ups currently use Chinese models, which have become a default choice in the US AI industry; Stanford University computer science classes rely heavily on Qwen models.
Chinese models have drawn criticism for potential security risks and for producing censored outputs. The Trump administration previously called for the US to develop open models based on American values.
Ai2 described Chinese offerings as "open-weights" rather than "open-source". The distinction arises because Chinese companies typically release only the model weights, not the full training data or source code.
In contrast, Ai2 reported that its new Olmo 3 model was pretrained on a 9.3-trillion-token corpus that specifically excluded paywalled content.
The organisation also published a "model flow" detailing the entire creation process of the system. In China, Hangzhou-based start-up DeepSeek has also advocated for increased transparency.

DeepSeek published a peer-reviewed article in the British journal Nature outlining the training process for its R1 model.
Ai2 research scientist Nathan Lambert stated on X that the Olmo 3 32B base model might be the most impactful artefact in the new suite. He noted Alibaba has not open-sourced its comparable Qwen3 32B base model.
Lambert suggested competitive reasons for Alibaba's decision. Developers commonly customise base models for specific uses, enabling local data storage and processing instead of reliance on the cloud.
Lambert wrote that enterprises consider geographic origins when seeking trusted AI deployments with sensitive data. Ai2 was founded in 2014 by Microsoft co-founder Paul Allen.
The Seattle-based organisation performs foundational AI research in open models and robotics. Lambert also launched the ATOM Project, which stands for "American Truly Open Models", to promote US-developed open models.
The initiative responds to Chinese dominance in the sector. Separately, San Francisco start-up Deep Cogito released Cogito v2.1, described as the best open-weight large language model from a US company, though it was built on a DeepSeek base model.
Ai2 released "fully open" Olmo AI models, including training data and pipelines.
This initiative aims to challenge China's dominance in the open-source AI sector.
Ai2's models offer greater transparency and can run on consumer-grade hardware.
Source: SCMP