Token becomes buzzword at ZGC Forum, seen as key to AI development and competition


token Photo: VCG

Token, a key concept in the artificial intelligence (AI) field that has recently been standardized in Chinese as “ciyuan,” has become one of the most talked-about buzzwords among entrepreneurs and experts at the 2026 Zhongguancun Forum (ZGC Forum).

While some sci-tech startup entrepreneurs describe it as the “digital fuel” of the AI field, a growing consensus is emerging that tokens are a critical resource underpinning the rapid development of AI and future competition in the field.

The growing focus on tokens reflects a broader trend: AI is accelerating its integration into a wide range of industries in China, moving from professional circles into mass adoption. This transition is naturally expanding both research and commercial frontiers, Ma Qingyang from LLVISION told the Global Times on Thursday.

Showcased at the forum, LLVISION’s AI-powered translation glasses, capable of real-time translation in more than 140 languages, attracted attention from attendees. Behind such innovative applications lies the intensive use of large language models.

In simple terms, tokens are the basic units of text that AI models process, and the standard unit for metering their usage. Ma likened them to fuel for a gasoline-powered car or to data usage for internet access: essential resources that determine how AI systems operate. As AI applications become more widespread, demand for tokens has grown rapidly.
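To make the idea concrete, here is a minimal, illustrative sketch of how text is broken into countable units. Real models use learned subword tokenizers (such as BPE), so actual token counts differ; the splitting rule below is a stand-in, not any vendor's tokenizer.

```python
import re

def toy_tokenize(text: str) -> list[str]:
    # Split into word-like runs and individual punctuation marks.
    # Production tokenizers use learned subword vocabularies instead.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenize("AI glasses translate speech in real time.")
print(len(tokens), tokens)
# → 8 ['AI', 'glasses', 'translate', 'speech', 'in', 'real', 'time', '.']
```

Because usage is billed and capacity-planned in these units, a sentence's token count, not its character count, is what "fuels" an AI request.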

By March this year, the average daily usage of tokens in China exceeded 140 trillion, an increase of over 1,000 times compared to 100 billion at the beginning of 2024, according to China Central Television.

Ding Zhixin, a representative from a digital technology company based in Guangzhou, told the Global Times that, based on publicly available data, token usage has seen explosive growth.

“This reflects how AI is entering everyday life and empowering various sectors,” he said. Looking ahead, Ding said that as AI evolves from relatively basic stages toward more advanced intelligence, reliance on such computational resources will only deepen. Emerging technologies such as brain-computer interfaces could further drive demand to levels far beyond today.

Industry data presented at the forum also underscores this trend. A staff member from ByteDance introduced the company’s Doubao large language model and Volcano Engine platform for businesses, saying that daily average token consumption had exceeded 100 trillion as of Thursday, marking a 60 percent surge from 63 trillion at the end of 2025. This sharp increase reflects the rapid scaling of AI applications.

Aslan Tarik, CTO of Beijing Tangta Technology Co, who mainly focuses on AI and AI robotics, said that tokens have effectively become a universal unit for pricing and value exchange in AI services. With the rise of AI agents such as Openclaw that can run tasks continuously, token consumption is no longer tied to single user queries but to sustained operations. This shift is expected to make tokens even more central to competition in the AI sector.
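The pricing point can be sketched in a few lines. Providers typically quote separate per-million-token rates for input and output; the rates and daily volumes below are placeholders for illustration, not real figures for Doubao, Volcano Engine, or any other service.

```python
def token_cost(input_tokens: int, output_tokens: int,
               price_in_per_m: float, price_out_per_m: float) -> float:
    # Cost = tokens consumed x per-million-token rate, for each direction.
    return (input_tokens * price_in_per_m +
            output_tokens * price_out_per_m) / 1_000_000

# A hypothetical always-on agent: 2M input and 0.5M output tokens per day,
# at assumed rates of 0.8 and 2.0 (currency units) per million tokens.
daily = token_cost(2_000_000, 500_000, 0.8, 2.0)
print(round(daily, 2))
# → 2.6
```

For a continuously running agent, this cost accrues around the clock rather than per query, which is why sustained token consumption, not query volume, becomes the competitive metric.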

From a research perspective, Dai Xin, a professor at Chongqing University, said that tokens represent valuable computational resources closely linked to computing power. Reducing computational costs while improving research efficiency, he said, will be a key challenge moving forward.


