Alibaba Unveils Ambitious AI Strategy at 2025 Yunqi Conference

Alibaba's CEO outlines a bold vision for AI, introducing new models and strategies aimed at establishing a leading position in the AI cloud market.

Introduction

At the end of September, a light rain fell in Hangzhou, but the AI fervor at Yunqi Town made it feel like summer had not yet faded.

On September 24, the 2025 Yunqi Conference was held as scheduled. Alibaba Group CEO and Chairman of Alibaba Cloud Intelligence, Wu Yongming, delivered a speech titled “The Path to Super Artificial Intelligence.”

This was Wu’s first appearance at the Yunqi Conference after more than a year at the helm of Alibaba Cloud. He stated that “the greatest imagination of generative AI is not to create one or two new super apps on a mobile screen, but to take over the digital world and change the physical world.”

A Year of Progress

If this statement was more of a vision a year ago, it has now transformed into a more concrete roadmap and aggressive actions.

At this year’s Yunqi Conference, Alibaba Cloud unveiled a plethora of new products. Among them was the flagship model Qwen3-Max, which is currently the most powerful model in the Alibaba Tongyi model family, outperforming GPT-5 and Claude Opus 4, ranking among the top three globally on LMArena.

In addition to the flagship model, Alibaba launched six more new models: the next-generation foundational model architecture Qwen3-Next, the programming model Qwen3-Coder, the visual understanding model Qwen3-VL, the multimodal model Qwen3-Omni, the visual foundational model Wan2.5-preview, and the speech model Tongyi Bailin.


More noteworthy were Wu Yongming’s two bold new assertions.

The Future of Operating Systems

He made a definitive statement: large models are the next generation of operating systems. Large models will subsume software, allowing anyone to create an effectively unlimited number of applications using natural language. In the future, almost all software that interacts with the computing world may be generated by agents running on large models, rather than supplied by traditional commercial software.

As a result, Alibaba Cloud has been rebuilding its entire stack, from underlying computing power to infrastructure and cloud services, to align with the changes brought by large models.

The Rise of Super AI Cloud

The second assertion builds on this logic: the Super AI Cloud is the next-generation computer. Drawing a parallel with the stages of computer development, natural language is the programming language of the AI era, agents are the new software, and context is the new memory; large models will serve as the middleware through which users, software, and AI computing resources interact, becoming the operating system of the AI era.

Alibaba Cloud’s goal is to establish a “Super AI Cloud” to provide a global intelligent computing network.

In February, Alibaba announced a three-year, 380 billion yuan plan for AI infrastructure construction. Wu Yongming added a new target today: by 2032, the energy consumption of Alibaba Cloud's global data centers will be ten times its 2022 level, to meet the arrival of the ASI era.

Alibaba Cloud also proposed a new development strategy and goal for AI: not the commonly discussed AGI (Artificial General Intelligence), but a further step towards ASI (Artificial Super Intelligence).

Wu Yongming explained the three stages to reach super artificial intelligence:

  1. Intelligent Emergence: AI learns from humans, acquiring generalized intelligence through the collection of global knowledge, gradually developing reasoning abilities.
  2. Autonomous Action: AI masters tool usage and programming capabilities to assist humans, which is the current stage of the industry.
  3. Self-Iteration: AI connects with the physical world’s complete raw data for autonomous learning, ultimately able to “surpass humans.”

In 2025, the global large model field is progressing amid challenges. After OpenAI launched GPT-5, its performance fell short of market expectations, drawing criticism that model innovation had stalled. Meanwhile, Meta and OpenAI are making ever more aggressive capital investments; no one wants to miss this wave of technological revolution.

Alibaba’s Commitment

Now, Alibaba Cloud is proving through action that it not only intends to invest but to invest aggressively.

The market has responded positively to Alibaba Cloud's new strategy. Today, Alibaba's Hong Kong-listed shares surged, rising more than 9% intraday to their highest level since October 2021.

New Model Launches

Before the Yunqi Conference, Lin Junyang, head of the Qwen model team at Alibaba, teased on Twitter that they would launch more than six new products, none of which would be “small items.”

When the models were officially released, the count exceeded even that teaser, making for a generous launch. Alibaba Cloud CTO Zhou Jingren raced through his slides at the conference, rushing his points but still running over time.

Alibaba Cloud launched a total of seven new models, each with significant improvements in scale and performance:

  • Qwen3-Max: Flagship model with a pre-training data volume of 36 trillion tokens and over a trillion parameters, significantly enhancing coding and agent tool invocation capabilities.
  • Qwen3-Next: Next-generation model architecture and series. The model has 80 billion total parameters with only 3 billion activated, yet delivers performance comparable to the 235-billion-parameter flagship Qwen3; its training cost is more than 90% lower than that of the dense Qwen3-32B.
  • Qwen3-VL (Visual Understanding): Capable of accurately interpreting images and charts, with a breakthrough "visual programming" ability: it can convert visual design drafts directly into front-end code and operate mobile devices and computers, advancing from merely "seeing" to understanding and execution.
  • Qwen3-Coder (Code Model): Significantly improves generation speed, code quality, and security, making it easier to complete complex tasks from code completion and bug fixing to generating complete projects with one click.
  • Qwen3-Omni: A native multimodal model that can “hear, speak, see, and write”; it interacts naturally like chatting with a person, understanding audio and video while maintaining capabilities in text and images, suitable for use in AI applications for vehicles, glasses, and mobile phones.
  • Tongyi Wanxiang Wan2.5-preview: A new visual foundational model with capabilities for generating video from text, images from text, and image editing, capable of generating matching human voices, sound effects, and music BGM.
  • Tongyi Bailin: A new family of speech models, including speech recognition and synthesis sub-models. For example, Fun-CosyVoice can provide hundreds of preset voice styles for applications in customer service, sales, live e-commerce, consumer electronics, audiobooks, and children’s entertainment.
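The "80 billion total, 3 billion activated" split described for Qwen3-Next is characteristic of a sparse Mixture-of-Experts (MoE) design, in which each token is routed to only a few of the many expert sub-networks. The toy sketch below illustrates the parameter arithmetic only; the expert counts, per-expert sizes, and shared-parameter figure are invented for illustration and are not Qwen3-Next's actual configuration.

```python
# Toy sketch of sparse MoE parameter accounting. All numbers below are
# hypothetical; they merely show how a model can have ~80B total parameters
# while activating only ~3B per token.

def moe_param_counts(n_experts: int, params_per_expert: int,
                     experts_per_token: int, shared_params: int):
    """Return (total, active) parameter counts for a sparse MoE model.

    total  = shared parameters + every expert's parameters
    active = shared parameters + only the experts routed per token
    """
    total = shared_params + n_experts * params_per_expert
    active = shared_params + experts_per_token * params_per_expert
    return total, active

# Hypothetical configuration: 128 experts of 0.6B parameters each,
# 2 experts routed per token, 1.8B shared (attention/embedding) parameters.
total, active = moe_param_counts(n_experts=128,
                                 params_per_expert=600_000_000,
                                 experts_per_token=2,
                                 shared_params=1_800_000_000)
print(f"total: {total / 1e9:.1f}B, active: {active / 1e9:.1f}B")
```

Because compute per token scales with the active parameters rather than the total, such a design can plausibly cut training cost dramatically relative to a dense model of similar quality, which is consistent with the 90% cost-reduction claim above.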


Alibaba Cloud does not rely solely on static benchmarks to demonstrate model capabilities. In blind tests, its flagship model Qwen3-Max already ranks third on LMArena's Chatbot Arena leaderboard.

Following the global AI industry explosion driven by DeepSeek, a domestic open-source model competition has ignited, contrasting sharply with last year’s closed-door approaches.

Both domestically and internationally, this year has seen a round of open-source model battles, with nearly every company still investing in models stepping up its open-source efforts. Alibaba stands out as the most aggressive among domestic giants in pursuing the open-source route.

This stems from Alibaba being one of the first companies in China to open-source models and build a model ecosystem. These investments have now yielded tangible returns, motivating Alibaba to make even more aggressive investments.

DeepSeek and Qwen are among the few models that have gained global recognition. After the open-source surge initiated by DeepSeek, Qwen has once again attracted attention in the global AI community, entering a new phase of growth.

As of now, Alibaba Tongyi has open-sourced more than 300 models, covering a full range of sizes and all modalities: language, coding, image, speech, and video.

Globally, Tongyi’s large models are the leading open-source models, with downloads exceeding 600 million and over 170,000 derivative models.


Agent Development Framework

In addition to models, Alibaba Cloud released a new agent development framework, ModelStudio-ADK. Because agents can autonomously plan and invoke models, they drive up computational consumption: Alibaba Cloud disclosed that, with model capabilities continually improving and agent applications exploding, daily model invocations on its Bailian platform have grown 15-fold over the past year.

Investments in model open-sourcing not only accelerate model iteration but have also translated into revenue in the cloud. Alibaba has begun to establish a commercial closed loop for the AI era—its latest quarterly report shows that Alibaba Cloud’s quarterly revenue surged 26% year-on-year, with AI-related revenue achieving triple-digit growth for eight consecutive quarters.

According to a report from the international market research firm Omdia, the AI cloud market in China is expected to reach 22.3 billion yuan in the first half of 2025, with Alibaba Cloud holding a 35.8% market share, ranking first, surpassing the combined share of the second to fourth places.

Competing in the LLM Era

In 2024, after OpenAI released Sora and GPT-5's development appeared to stall, debate over technical routes briefly dampened sentiment across the global large model field.

However, this sentiment has largely dissipated. Just days before the Yunqi Conference, NVIDIA announced a $100 billion investment in OpenAI. Wu Yongming predicted at the conference that global AI investments will exceed $4 trillion in the next five years.

Alibaba Cloud CTO Zhou Jingren admitted in a media interview after the conference that there are now very few major disagreements regarding technical routes in the industry. Almost all companies globally are aggressively investing in AI competition and rapidly releasing models. The question now is how each vendor approaches this.

"Model competition today is essentially competition between systems," Zhou Jingren said. "Model innovation is not about holding back one major breakthrough; it is complementary to the underlying infrastructure and the cloud."

Understanding ‘System’

How should "system" be understood? It points, above all, to a strategic choice in AI.

After DeepSeek changed the global AI narrative, all major companies are increasing their investments in AI, from underlying computing power to cloud computing and open-source efforts.

The divergence in AI routes among major companies forms an interesting contrast. Take Tencent's recent ecosystem conference as an example: Tencent focuses more on scenarios and on B-end and C-end deployment, applying AI to its own businesses first before turning outward. ByteDance, by contrast, resembles iOS, taking a tightly integrated approach from models to applications, but it tends to keep its better versions closed-source first and open-sources at a slower pace.

2023 was a pivotal year for Alibaba Cloud. After Wu Yongming took over as CEO, he proposed an "AI-driven, public cloud-first" strategy.

Since then, Alibaba Cloud has completed several key tasks: returning to the public cloud, cutting low-profit projects, and investing heavily in AI, not only externally investing in AI startups but also significantly investing in self-developed models, open-source efforts, and infrastructure reconstruction.

Alibaba Cloud’s current trajectory is closer to Google’s. From the underlying computing infrastructure to cloud computing and then to the upper-level models, both Alibaba and Google adopt a full-stack self-research strategy, ensuring that each layer is internationally competitive.

The ASI Alibaba proposed today is not a new term. In March of this year, Google DeepMind disclosed its "AGI Six-Level Roadmap," which maps closely onto Alibaba's three-stage ASI roadmap: the third ASI stage, "surpassing humans," is quite similar to DeepMind's defined AGI Level 6.


Aggressively investing in AI also stems from the inseparable relationship between AI and cloud computing. Today, Alibaba Cloud even announced a new positioning as a “full-stack AI service provider.” “Tokens are the electricity of the future AI world,” Wu Yongming stated.

Undoubtedly, we are still in the early stages of the AI era. Currently, the volume of model calls accounts for a very small portion of enterprise cloud consumption, but the trend is crucial.

In a post-conference interview, Xu Dong, General Manager of Alibaba Cloud's Tongyi Large Model Business, told the media that a year ago most large model calls came from offline tasks like data labeling; a year later, online task calls have grown dozens of times over, with enterprises across industries embedding large models into their production processes. This shows that large models are rapidly bringing incremental growth to the cloud market.

For the past 16 years, Alibaba Cloud has explained its market value as providing the "water and electricity" of the digital world. That framing is consistent with its current call to be the "Android of the LLM era": fundamentally the same ecological niche.

Whether proposing a new roadmap or a new positioning, Alibaba needs to find its home court in the AI era and secure a leading position before the application market explodes; this goal has never been clearer.
