The Battle for Truth
Hamelink, Facebook, and the Algorithmic War on Information
Introduction
In the early days of social media, platforms like Facebook were heralded as revolutionary tools for democratic participation, giving individuals direct access to a wealth of perspectives beyond the traditional gatekeepers of the press. Yet, the very structures that were supposed to democratise information have been systematically exploited. From the Cambridge Analytica data scandal to the Internet Research Agency’s (IRA) misinformation campaigns, Facebook has become a case study in how algorithmic systems can be manipulated to distort reality, shape elections, and erode public trust.
Cees Hamelink’s 1976 article, “An Alternative to News”, provides a valuable lens through which to examine these developments. Hamelink was deeply concerned with media control and the need for alternative, participatory news networks. In light of the exploitative strategies used by political actors on Facebook, his work remains strikingly relevant. The battle for truth is no longer just a philosophical concern—it is an ongoing struggle in the algorithmic age.
The Promise and Peril of Digital Media
When Facebook emerged as a dominant information platform, it was positioned as an alternative to traditional news media. Unlike television, newspapers, or radio—where gatekeepers controlled editorial decisions—social media allowed anyone to participate in the production and dissemination of information. The optimism was clear: a more democratic, decentralised information ecosystem could empower individuals and communities.
However, this vision failed to account for the incentive structures embedded within social media platforms. Facebook’s engagement-driven ranking system—optimised to keep users scrolling—prioritised emotionally charged, polarising content. Rather than serving as a marketplace of ideas, the platform became an amplifier of manipulation and disinformation.
Hamelink’s Warnings in “An Alternative to News”
In “An Alternative to News”, Hamelink argued that mainstream news outlets were controlled by economic and political elites, shaping narratives to serve institutional interests. He saw the need for grassroots, decentralised media networks that would empower individuals to challenge dominant ideologies. Hamelink also emphasised the importance of information literacy, warning that without critical engagement, audiences would remain passive consumers rather than active participants in news production.
This argument anticipated many of the structural weaknesses in digital media today. Facebook and other platforms promised to democratise news, but in practice, they reinforced many of the same problems Hamelink critiqued—albeit in new, algorithmically driven ways.
How Cambridge Analytica and the Internet Research Agency Exploited Facebook
The Cambridge Analytica scandal exposed how psychographic profiling could be weaponised for political gain. By harvesting the data of tens of millions of Facebook users—without their explicit consent—Cambridge Analytica built detailed psychological profiles that were then used to deliver highly targeted political advertising. These ads were not neutral messages; they were designed to exploit personal fears, biases, and anxieties, shifting public opinion through psychological manipulation.
At the same time, the Russian-backed Internet Research Agency (IRA) was using Facebook to run large-scale misinformation campaigns. Fake accounts and pages—masquerading as legitimate political groups—were used to stoke division, suppress voter turnout, and shape political discourse. The IRA’s strategy relied on Facebook’s own ranking system: by producing emotionally provocative content, their posts were prioritised by Facebook’s engagement algorithm, reaching millions without significant scrutiny.
The Failure of Algorithmic News Feeds
The fundamental flaw in Facebook’s design is its reliance on engagement as a proxy for quality. Unlike traditional journalism, where editorial decisions are made by human professionals with ethical guidelines, Facebook’s News Feed is driven by algorithms that reward clicks, shares, and comments—regardless of accuracy or social harm.
This dynamic creates an incentive for disinformation:
- Content that triggers outrage or fear spreads faster than fact-checked journalism.
- Echo chambers emerge, reinforcing users’ biases rather than exposing them to diverse perspectives.
- The ability to run “dark ads”—political advertisements shown only to specific demographics—reduces transparency, allowing manipulation to go unchecked.
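The incentive structure described above can be illustrated with a toy sketch. This is not Facebook's actual ranking model; the weights, the posts, and the `accuracy` field are all invented for illustration. The point is only that when a score is built purely from engagement signals, accuracy never enters the calculation:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int
    shares: int
    comments: int
    accuracy: float  # hypothetical editorial accuracy score (0.0-1.0), unused by the ranker

def engagement_score(post: Post) -> float:
    # Engagement-only ranking: shares and comments are weighted more
    # heavily than clicks, and accuracy plays no role at all.
    return post.clicks + 2 * post.shares + 3 * post.comments

feed = [
    Post("Fact-checked policy analysis", clicks=120, shares=10, comments=5, accuracy=0.95),
    Post("Outrage-bait conspiracy claim", clicks=300, shares=90, comments=150, accuracy=0.10),
]

# Sort the feed by engagement alone, highest first.
ranked = sorted(feed, key=engagement_score, reverse=True)
print([p.title for p in ranked])
```

Under this scoring, the low-accuracy, high-outrage post tops the feed, because every signal the ranker sees (clicks, shares, comments) is one that provocative content generates in abundance.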
While Facebook has since introduced measures to combat misinformation, these responses have been reactive rather than proactive, often coming only after scandals or public outcry.
Reclaiming Digital Information Spaces
Hamelink’s work offers a critical perspective on the future of digital information spaces. If Facebook and similar platforms have failed to provide a meaningful alternative to institutionalised news, what comes next? A few key directions emerge:
- Platform Reform and Algorithmic Accountability
  - Social media companies must be held accountable for how their ranking systems prioritise engagement over truth.
  - Greater transparency is needed in how algorithms function and which actors are able to manipulate them.
- Strengthening Information Literacy
  - Schools and universities must embed critical media literacy into curricula.
  - Citizens need tools to identify misinformation, understand bias, and challenge manipulative content.
- Decentralised News Ecosystems
  - Hamelink’s call for alternative media networks remains relevant—platforms that prioritise public interest over profit.
  - Open-source, federated networks (such as Mastodon) may offer a path forward, reducing corporate control over information dissemination.
Conclusion
The battle between algorithmic control and participatory, ethical media is ongoing. While social media has brought new possibilities for engagement, it has also introduced new vulnerabilities to manipulation, disinformation, and corporate control. Hamelink’s vision of democratised, community-driven information systems remains an urgent, as-yet-unrealised challenge.
To move forward, we must rethink how digital platforms prioritise, distribute, and amplify information. Without meaningful reform, the structures that enabled Cambridge Analytica and the Internet Research Agency will continue to shape public discourse, undermining democracy in the process.
References
- Hamelink, C. (1976). An Alternative to News. Journal of Communication, 26(4), 120-125.
- Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian.
- Marwick, A., & Lewis, R. (2017). Media Manipulation and Disinformation Online. Data & Society.
- Vaidhyanathan, S. (2018). Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. Oxford University Press.
- Facebook Transparency Reports: https://transparency.facebook.com/