Go-to platform for youth

Media watchdog: TikTok teems with misinformation

05:36 AM September 16, 2022

The TikTok app logo is seen in this illustration taken on August 22, 2022. REUTERS

SAN FRANCISCO — TikTok is serving up misinformation to users searching for news about politics, climate change, Covid-19, the war in Ukraine and more, according to a report released Wednesday.

Toxicity and false claims are a “significant threat” at TikTok, which is becoming a go-to online venue for young people to search for information, according to a study by NewsGuard, a media watchdog.


NewsGuard describes itself as a “journalism and technology tool” that rates the credibility of websites and online information.


“Even when TikTok’s search results yielded little to no misinformation, the results were often more polarizing than Google’s,” NewsGuard said of its findings.

NewsGuard in September analyzed the top 20 results from 27 TikTok searches on news topics, finding that 19.5 percent of the videos suggested contained false or misleading claims, the report stated.


Researchers said that they compared TikTok and Google results from searches for information about school shootings, abortion, Covid-19, US elections, Russia’s war on Ukraine and other news.


False or misleading claims in results included conspiracy theories promoted by QAnon and supposed home recipes for hydroxychloroquine, a prescription drug used to treat malaria and lupus, according to NewsGuard.


Company response

TikTok said the methodology used in the analysis was flawed and that fighting misinformation is a priority for the company.

“Our Community Guidelines make clear that we do not allow harmful misinformation, including medical misinformation, and we will remove it from the platform,” a TikTok spokesperson said in response to an AFP inquiry.


“We partner with credible voices to elevate authoritative content on topics related to public health, and partner with independent fact-checkers who help us to assess the accuracy of content.”

While testifying on Wednesday at a Senate hearing on social media’s impact on national security, former Twitter senior vice president of engineering Alex Roetter said that the Chinese government is an investor in TikTok parent company ByteDance, which has incentives to maximize profit and user engagement.

Chinese, US kids

“The TikTok algorithm pushes educational science, engineering and math content on Chinese youth while pushing a feed containing twerking videos, misinformation, and other destructive content to US children,” Roetter told senators.

Social media companies stand to benefit from attention-grabbing online content despite harmful effects it may have on society, Roetter said in opening remarks.

“Our terms of service and community guidelines are built to help ensure our vision of a safe and authentic experience,” TikTok chief operating officer Vanessa Pappas said at the hearing.


“Our policies have zero tolerance for disinformation, violent extremism and hateful behavior.”

