Computer error promoted problematic content on Facebook

Content identified as misleading or problematic was mistakenly promoted in Facebook users’ feeds in recent months because of a computer bug that took six months to fix, according to The Verge.

The article published Thursday “greatly exaggerates the scale of an error that ultimately had no significant or long-term impact on problematic content,” responded Joe Osborne, a spokesperson for Meta, Facebook’s parent company.

According to The Verge, the group’s engineers wrote an internal report citing a “major ranking failure” of content.

They noted in October that News Feed algorithms were more widely disseminating certain content that had been identified as suspicious by external media outlets belonging to “third-party fact-checking,” a verification program developed by Facebook.

“Unable to find the cause of the problem, engineers watched the surge subside after a few weeks, then resurface again and again until the ranking issue was fixed on March 11,” the article details.

But according to Joe Osborne, the incident affected only “very few views”.

“The vast majority of News Feed content cannot be demoted,” he explained, adding that other mechanisms designed to avoid exposing users to so-called “harmful” content remained in place, such as “other demotions, fact-check warnings, and removals.”

AFP participates in “third-party fact-checking” in more than 80 countries and in 24 languages. Through this program, launched in December 2016, Facebook pays more than 80 media outlets around the world, general-interest or specialized, to use their “fact-checks” on its platform as well as on WhatsApp and Instagram.

If a piece of information is rated false or misleading by one of these outlets, users are less likely to see it in their News Feed.


And if they see it or try to share it, the platform suggests they read the verification article. Those who have already shared the information receive a notification redirecting them to the article. No posts are deleted. Participating media outlets retain complete freedom in choosing and handling their topics.