Google has removed over 31,000 China-linked YouTube channels for coordinated influence operations in the past seven months

Google has released its latest TAG Bulletin report, which provides an overview of all the coordinated influence operations that its Threat Analysis Group detected and removed across its apps in January 2022.

And for the most part, it seems straightforward enough – 3 YouTube channels terminated in connection with efforts to criticize Sudanese President Omar al-Bashir, an AdSense account linked to influence operations in Turkey, and 42 YouTube channels and 2 Ads accounts terminated as part of an investigation into coordinated influence operations linked to Iraq.

But then there is this:

“We have shut down 4,361 YouTube channels as part of our ongoing investigation into coordinated China-related influence operations. These channels primarily uploaded spammy content in Chinese about music, entertainment and lifestyle. A very small subset uploaded content in Chinese and English about China and US foreign affairs, and these findings are consistent with our previous reports.”

Sounds like a lot, doesn’t it? Over 4,300 YouTube channels – not just individual videos – in a single month is a lot of content.

But as Google notes, this is actually in line with previous TAG reports.

Looking back over previous TAG updates, Google removed:

  • 5,460 YouTube channels in December, also connected to coordinated influence operations linked to China
  • 15,368 Chinese YouTube channels in November
  • 3,311 YouTube channels in October
  • 1,217 in September
  • 1,196 in August
  • 850 in July

All of these removals stem from the same ongoing investigation into coordinated China-related influence operations, and all carry the same description as above, relating to “spammy content” around entertainment, with some notes on US-China affairs. That’s over 31,000 YouTube channels removed in the last seven months.

So what’s going on? What exactly are these Chinese influence operations trying to achieve, and are they gaining any traction from this massive YouTube push?

It seems, based on Google’s description, that the primary goal of this effort is to first build an in-app audience with engaging, light-hearted content, before then using that reach to seed pro-China sentiment among wider audiences.

This then allows the CCP, and/or related groups, to potentially influence public opinion through subtle means, gently nudging those viewers toward a more positive view of China’s activities.

This has generally been China’s MO with its information operations – on the Quora question and answer platform, for example, there are many instances of people asking questions about China, only to see overwhelmingly positive responses from users.

That seems to be the modus operandi here too, with China-linked groups seeking to build audiences on YouTube in order to establish distribution and broadcast channels for pro-China propaganda.

But it’s certainly a big push, and it’s interesting to note how much China has ramped up its activity over time. This likely suggests that these operators see YouTube as a valuable vector for influence, further underscoring the importance of social platforms taking proactive and decisive action to stop these campaigns before they can gain traction.

We’ve asked Google for more information about the specifics of these banned channels, and we’ll update this post if and when we receive a response.

You can read Google’s latest TAG report here.

Raymond T. Helms