
Anyone who has ever fallen into a YouTube conspiracy theory wormhole just got thrown a lifeline.

YouTube announced Tuesday that it would begin rolling out "fact check panels" in the U.S. What does that mean? The company said in a blog post that if a user searches YouTube for a topic that fact checkers have debunked, a shaded box will appear at the top of the search results identifying the claim and rating it as "false."

The feature rolled out in India and Brazil last year, but the post said the spread of coronavirus misinformation underscored the urgent need to make it more widespread.

Just when the panels will show up is a bit tricky. YouTube explains that a general search won't trigger a fact check — though YouTube will surface what it calls "news from authoritative sources" in "Breaking News" and "Top News" shelves. But if you search for a more specific claim that is a subject of misinformation and has been flagged and debunked by fact checkers, the panel will appear. Here's how YouTube explains it:

There are a few factors that determine whether a fact check information panel will appear for any given search. Most important, there must be a relevant fact check article available from an eligible publisher. And in order to match a viewer’s needs with the information we provide, fact checks will only show when people search for a specific claim.

The fact checks come from members of the International Fact-Checking Network, which includes organizations like PolitiFact and FactCheck.org. When fact checkers tag an article with ClaimReview markup — the standard behind the Claim Review Project — Google (and other search engines and social platforms) can ingest the fact check and surface it alongside the incorrect information.
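For the curious, that tagging is just structured data embedded in the fact checker's article page using schema.org's ClaimReview vocabulary. Below is a minimal sketch in Python that assembles a hypothetical ClaimReview record; the publisher name, URL, and claim text are invented for illustration, and real publishers typically emit this as JSON-LD inside the article's HTML rather than via a script.

```python
import json

# A minimal, hypothetical ClaimReview record using schema.org vocabulary.
# Field names (@type, claimReviewed, reviewRating, etc.) come from the
# schema.org ClaimReview type; the values below are invented examples.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "datePublished": "2020-04-28",
    "url": "https://example-factchecker.org/fact-checks/example-claim",
    "claimReviewed": "Example claim circulating on video platforms.",
    "author": {
        "@type": "Organization",
        "name": "Example Fact Checker",  # hypothetical eligible publisher
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,
        "bestRating": 5,
        "alternateName": "False",  # the label a panel could surface
    },
}

# Publishers embed data like this as JSON-LD in a <script> tag on the
# fact-check article page, where search and video platforms can crawl it.
print(json.dumps(claim_review, indent=2))
```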

Like other content platforms, YouTube has had to navigate how to deal with misinformation and conspiracy theories — deciding whether to ban such material, downrank it, fact check it, or just leave it alone. In January 2019, the company said it would stop surfacing "borderline content" in its recommendation algorithms.

The fact check tool will roll out over time, and eventually to other countries. For now, it's another tool in the arsenal against misinformation. Even with the panels, make sure you and your loved ones know how to spot coronavirus misinformation when you see it.

Topics: COVID-19