While DeepSeek-R1 operates with 671 billion parameters, QwQ-32B achieves comparable performance with a much smaller footprint ...
Alibaba’s QwQ-32B is a 32-billion-parameter AI model designed for mathematical reasoning and coding. Unlike massive models, it ...
Alibaba is positioned to dominate China's AI market with its groundbreaking, highly efficient QwQ-32B model, surpassing ...
Alibaba (BABA) is stepping up its efforts in the AI race. The company has launched an upgraded version of its AI assistant ...
Alibaba developed QwQ-32B through two training stages. The first stage focused on teaching the model math and coding ...
Alibaba released and open-sourced its new reasoning model, QwQ-32B, featuring 32 billion parameters. Despite being ...
This remarkable outcome underscores the effectiveness of RL when applied to robust foundation models pre-trained on extensive ...
QwQ-32B, an AI model rivaling OpenAI and DeepSeek with 98% lower compute costs. A game-changer in AI efficiency, boosting Alibaba’s ...
After the launch, Alibaba's shares rose over 8% in Hong Kong, which also helped lift an index of Chinese tech stocks by about ...
These reasoning models were designed to offer an open-source alternative to the likes of OpenAI's o1 series. QwQ-32B is a 32-billion-parameter model developed by scaling reinforcement learning ...
Alibaba launched a new reasoning model comparable to DeepSeek's R1, pledged increased support for AI in China, and committed ...