Study finds TikTok manipulates content to favorably promote Chinese government
An analysis of TikTok's algorithm found “compelling and strong circumstantial evidence” that the app's content is controlled by the Chinese government, but the study said the evidence is “not definitive proof of state orchestration.”
TikTok's algorithm promotes content that casts the Chinese government in a favorable light in order to sway users' views, according to a new study from the Network Contagion Research Institute (NCRI).
The study builds on a previous report from December that found the platform likely promotes pro-China content, as the app faces bipartisan criticism over national security concerns. The app already faces a potential ban in the United States if its Chinese parent company, ByteDance, does not divest its stake in the platform on time.
FBI Director Christopher Wray previously said it would be "difficult to detect" manipulation by a foreign government, according to The Hill.
The content analysis, however, did find that pro-China videos, such as content about traveling to the country, were boosted in search results, including videos posted by state-backed accounts and influencers. Content unfavorable to the Chinese government, such as videos about the mistreatment of the Uyghur people, was suppressed.
The study also surveyed roughly 1,200 Americans and found that the platform was effective at shifting opinion in favor of the Chinese government: frequent TikTok users were about 50 percent more likely to hold a positive view of the Chinese government than those who were not regularly on the app.
“NCRI assesses that the CCP is deploying algorithmic manipulation in combination with prolific information operations to impact user beliefs and behaviors on a massive scale and that these efforts prove highly successful on TikTok in particular,” the authors wrote. “These findings underscore the urgent need for transparent regulation of social media algorithms, or even the creation of a public trust funded by the platforms themselves to safeguard democratic values and free will.”
Misty Severi is an evening news reporter for Just The News. You can follow her on X for more coverage.