This chapter contributes to our understanding of the organic and informal user correction practices emerging in WhatsApp groups in Namibia, South Africa, and Zimbabwe. This is important in a context where formal infrastructures for correcting and debunking dis/misinformation have been dominated by top-down initiatives, including platform-centric content moderation practices and professional fact-checking processes. Unlike social platforms such as Twitter and Facebook, which can moderate content and take down offending posts, WhatsApp's end-to-end encrypted (E2EE) infrastructure makes the same approach impossible: only the users involved in a conversation have access to the content shared, shielding false and abusive content from being detected or removed. As Kuru et al. (2022) opine, the privacy of end-to-end encryption provides a highly closed communication space, posing a different set of challenges for misinformation detection and intervention than those posed by more open social media such as Facebook and Twitter. In this regard, false and misleading information on WhatsApp constitutes "a distinctive problem" (Kuru et al. 2022; Melo et al. 2020). As Reis et al. (2020, 2) observe, "the end-to-end encrypted (E2EE) structure of WhatsApp creates a very different scenario" in which content moderation and fact-checking at scale are not possible. Fact-checking WhatsApp groups, which have been flagged as the major distributors of mis- and disinformation, is equally difficult.
