TORONTO — Experts say the community notes program with which Meta will replace its current fact-checking system has several pitfalls that could allow misinformation to spread.
Richard Lachman says the model Facebook and Instagram will shift to tends to work more slowly than human fact checkers because it relies on platform users spotting potential misinformation and appending a note describing why it's wrong.
The associate professor at Toronto Metropolitan University's RTA School of Media says other users then have the ability to vote on whether they agree with the note.
If the platform deems people from multiple parts of the political spectrum all agree on the note, Lachman says it will then be served to more users.
Lachman says this process can be lengthy, and by the time it really gets moving, people have likely moved on to discussing other issues and won't revisit that post.
Kaitlynn Mendes, the Canada Research Chair in inequality and gender, finds reducing content moderators "very worrying" and fears the move will increase the amount of harmful, hateful, violent, racist, sexist, homophobic and transphobic content out there.
This report by The Canadian Press was first published Jan. 7, 2025.
Tara Deschamps, The Canadian Press