As Artificial Intelligence (AI) technologies continue to advance, protecting human autonomy and promoting ethical decision-making are essential to fostering trust and accountability. Human agency (the capacity of individuals to make informed decisions) should be actively preserved and reinforced by AI systems. This paper examines strategies for designing AI systems that uphold fundamental rights, strengthen human agency, and embed effective human oversight mechanisms. It discusses key oversight models, including Human-in-Command (HIC), Human-in-the-Loop (HITL), and Human-on-the-Loop (HOTL), and proposes a risk-based framework to guide the implementation of these mechanisms. By linking the level of AI model risk to the appropriate form of human oversight, the paper underscores the critical role of human involvement in the responsible deployment of AI, balancing technological innovation with the protection of individual values and rights. In doing so, it aims to ensure that AI technologies are used responsibly, safeguarding individual autonomy while maximizing societal benefits.
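A minimal sketch of the paper's risk-based idea, assuming a simple numeric risk score and hypothetical tier names and thresholds (none of which are specified in the abstract), might route an assessed model to one of the three oversight mechanisms as follows:

from enum import Enum

class Oversight(Enum):
    HOTL = "Human-on-the-Loop"   # human monitors outcomes and can intervene
    HITL = "Human-in-the-Loop"   # human approves each consequential decision
    HIC = "Human-in-Command"     # human retains full control; AI only advises

def select_oversight(risk_score: float) -> Oversight:
    """Map an assessed model risk score (0.0-1.0, hypothetical scale)
    to an oversight mechanism; higher risk demands deeper human involvement."""
    if risk_score < 0.3:         # low risk: monitoring with override suffices
        return Oversight.HOTL
    if risk_score < 0.7:         # medium risk: human validates each decision
        return Oversight.HITL
    return Oversight.HIC         # high risk: human stays fully in command

# Example: a model assessed at 0.65 would require human-in-the-loop review.
print(select_oversight(0.65).value)

The thresholds and the risk-scoring procedure here are placeholders; the sketch only illustrates the abstract's central point that higher assessed risk should map to deeper human involvement.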
Category: Research
In this category:
1. Peer-Reviewed:
Research Papers
Chapters
Conference Proceedings
2. Pre-Prints. These are standard in some fields but are not always peer-reviewed.
-

Generative AI and the Future of News: Examining AI’s Agency, Power, and Authority
This special issue interrogates how artificial intelligence (AI), particularly generative AI (GenAI), is reshaping journalism at a moment of profound uncertainty for the profession. The rapid rise of GenAI technologies, especially following the release of tools such as ChatGPT, has intensified longstanding tensions between economic precarity, technological innovation, and journalistic values. Across diverse contexts in the Global North and South, the articles examine how AI is heralded as a source of efficiency, personalization, and newsroom survival, even as it is feared as a destabilizing force that threatens jobs, erodes professional norms, and concentrates power in the hands of technology corporations.
-

Dis/Misinformation, WhatsApp Groups, and Informal Fact-Checking Practices in Namibia
This chapter contributes to our understanding of organic and informal user correction practices emerging in WhatsApp groups in Namibia, South Africa, and Zimbabwe. This is important in a context where formal infrastructures for correcting and debunking dis/misinformation have been dominated by top-down initiatives, including platform-centric content moderation practices and professional fact-checking processes. Unlike social platforms such as Twitter and Facebook, which can perform content moderation and hence take down offending content, the end-to-end encrypted (E2EE) infrastructure of WhatsApp creates a very different scenario in which the same approach is not possible: only the users involved in a conversation have access to the content shared, shielding false and abusive content from detection or removal. As Kuru et al. (2022) opine, the privacy of end-to-end encryption provides a highly closed communication space, posing a different set of challenges for misinformation detection and intervention than more open social media such as Facebook and Twitter. In this regard, false and misleading information on WhatsApp constitutes “a distinctive problem” (Kuru et al. 2022; Melo et al. 2020). As Reis et al. (2020, 2) observe, “the end-to-end encrypted (E2EE) structure of WhatsApp creates a very different scenario” where content moderation and fact-checking at scale are not possible. Fact-checking WhatsApp groups, which have been flagged as major distributors of mis- and disinformation, is equally difficult.
-

Shifting the Gaze? Photojournalism Practices in the Age of Artificial Intelligence
In this article, we explore the impact of artificial intelligence (AI) technologies on photojournalism in the less-researched contexts of Botswana and Zimbabwe. We aim to understand how AI technologies, which are proliferating across aspects of news production, are affecting one of journalism’s respected and enduring trades: photojournalism. We answer the question: In what ways are AI-driven technologies impacting photojournalism practices? Furthermore, we investigate how photojournalists perceive their roles and the ethical considerations that come to the fore as AI begins to technically influence photojournalism. We deploy an eclectic analytical framework consisting of critical technology theory, disruptive innovation theory, and Baudrillard’s concept of simulation to theorise how AI technologies affect photojournalism in Botswana and Zimbabwe. Data were collected using in-depth interviews with practising photojournalists and …
-

The Philanthrocapitalism of Google News Initiative in Africa, Latin America, and the Middle East – Empirical Reflections
In recent years, media organizations globally have increasingly benefited from financial support from digital platforms. In 2018, Google launched the Google News Initiative (GNI) Innovation Challenge aimed at bolstering journalism by encouraging innovation in media organizations. This study, conducted through 36 in-depth interviews with GNI beneficiaries in Africa, Latin America, and the Middle East, reveals that despite its narrative of enhancing technological innovation for the media’s future, this scheme inadvertently fosters dependence and extends the philanthrocapitalism concept to the media industry on a global scale. Employing a theory-building approach, our research underscores the emergence of a new form of ‘philanthrocapitalism’ that prompts critical questions about the dependency of media organizations on big tech and the motives of these tech giants in their evolving relationship with such institutions. We also demonstrate that the GNI Innovation Challenge, while ostensibly promoting sustainable business models through technological innovation, poses challenges for organizations striving to sustain and develop these projects. The path to sustainability proposed by the GNI is found to be indirect and difficult for organizations to navigate, hindering their adoption of new technologies. Additionally, the study highlights the creation of a dependency syndrome among news organizations, driven by the perception that embracing GNI initiatives is crucial for survival in the digital age. Ultimately, the research contributes valuable insights to the understanding of these issues, aiming to raise awareness among relevant stakeholders and conceptualize philanthrocapitalism through a new lens.