In the digital age, where immediacy takes precedence over reflection, it is more comfortable to blame algorithms than our own inability to discern. But let’s do some self-criticism: is the problem really artificial intelligence, or our lack of rigor in consuming and sharing information? The recent experience of the elections in the United States offers an x-ray of a system that amplifies hate speech and fake news not thanks to AI, but to our complacency and the prevailing lack of oversight.
In the last United States elections, the recommendation algorithms of platforms such as X were manipulated not primarily by AI-generated content, but by the “amplification effect.” An army of bots disguised as real users flooded the platform with fake likes, retweets and shares, tricking the algorithms into pushing certain speeches or messages into the feeds of real people. This not only encouraged misinformation but normalized it, giving it the appearance of consensus.
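The mechanics are easy to see in miniature. The toy simulation below (purely illustrative; real ranking systems weigh far more signals than a single engagement score) ranks posts by likes and shows how a burst of fake bot likes can push a low-traction piece of disinformation to the top of a feed:

```python
import random

def rank_feed(posts, scores):
    """Rank posts by engagement score, highest first."""
    return sorted(posts, key=lambda p: scores[p], reverse=True)

# Ten ordinary posts, each with organic engagement between 50 and 100.
random.seed(42)
posts = [f"post_{i}" for i in range(10)]
scores = {p: random.randint(50, 100) for p in posts}

# One piece of disinformation with almost no organic traction...
posts.append("disinfo")
scores["disinfo"] = 10

organic_top = rank_feed(posts, scores)[0]  # a normal post wins

# ...until 500 bot accounts each add one fake like.
scores["disinfo"] += 500

boosted_top = rank_feed(posts, scores)[0]  # now the disinformation leads the feed
print(organic_top, boosted_top)
```

The numbers are invented, but the lesson is the one the article describes: an engagement-driven ranker cannot distinguish 500 coordinated fake likes from 500 genuine ones.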
An uncomfortable question thus arises: why do we continue to trust these platforms as legitimate sources of information? One possible answer: it is simply easier to give a like than to check a source. The danger lies in the fact that this dynamic not only distorts the truth but also polarizes, inflames passions and legitimizes hate speech.
One of the biggest problems in the fight against disinformation is restricted access to the APIs (Application Programming Interfaces) of the main digital platforms. These interfaces could be key tools for researchers and developers to monitor and curb the spread of fake news. Yet technology companies are increasingly limiting their use. The official excuse? Data protection. The real reason? Maintaining absolute control of their ecosystems and narratives.
This lack of access is not an unavoidable rule. Open Gateway, for example, proposes programmable networks that give mobile applications access to global network functions, promoting openness and collaboration in areas such as security and operational efficiency.
This model contrasts with the opacity of the established platforms, which prefer to limit access to their APIs to avoid accountability and to perpetuate a system that monetizes misinformation and hate.
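To make the stakes concrete: with open API access to public account metadata, even a simple heuristic could help researchers flag coordinated amplification. The sketch below is a hypothetical illustration only; the `Account` fields and thresholds are assumptions for the example, not any platform’s real API or a calibrated detector:

```python
from dataclasses import dataclass

@dataclass
class Account:
    # Hypothetical public metadata a research API might expose.
    age_days: int         # account age in days
    posts_per_day: float  # average posting rate
    followers: int
    following: int

def bot_likeness(acct: Account) -> float:
    """Crude 0..1 score: very young, hyperactive accounts that follow far
    more users than follow them back look bot-like. The thresholds are
    illustrative, not empirically calibrated."""
    score = 0.0
    if acct.age_days < 30:
        score += 0.4
    if acct.posts_per_day > 50:
        score += 0.4
    if acct.following > 10 * max(acct.followers, 1):
        score += 0.2
    return score

suspect = Account(age_days=5, posts_per_day=200, followers=3, following=400)
regular = Account(age_days=2000, posts_per_day=2, followers=300, following=250)
print(bot_likeness(suspect), bot_likeness(regular))
```

The point is not that detection is trivial, but that heuristics like this only work if the metadata is accessible, which is exactly what closed APIs prevent.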
Another case in point was Anthropic’s recent release of new API capabilities for its Claude 3.5 model, which demonstrates that technology has the potential to be part of the solution. However, as long as large platforms prioritize financial profit over the common good, we will remain trapped in a cycle of negativity and misinformation. The inconvenient truth is that misinformation sells, and hate hooks.
Meanwhile, states lag behind. Although the European Union has taken crucial steps with its AI Act, designed to prevent bias and privacy violations, most countries lack a regulatory framework that requires transparency from platforms. In the United States, laissez-faire has left technology companies to design their own rules. The result? A digital ecosystem where polarization and radicalization are commonplace.
Yuval Noah Harari put it clearly: the real challenge of our time is not just combating misinformation, but “investing in the truth.” For Harari, this means allocating both public and private resources to initiatives that strengthen quality journalism, media education and technological tools that promote transparency and access to verified information. Without this investment, the ground will remain fertile for those who seek to divide and confuse.
In our country, the panorama is no less worrying. According to DataReportal, in 2024 some 68.2% of Argentines used social networks, equivalent to more than 31 million people. Of them, 77.1% access at least one platform regularly. YouTube and Facebook lead in users, but X and other platforms are not far behind.
The problem lies not only in the level of penetration, but in the fact that a large part of the population is informed exclusively through social networks, where algorithms decide what is “relevant.” This creates an ecosystem where fake news spreads with alarming speed, polarizing society and eroding trust in institutions.
It is easy to blame the algorithms, but the real responsibility lies with human actors: technology companies that prioritize profits over truth, states that do not regulate, and users who do not question. The solution, in my humble opinion, could begin with the following three axes:
- Effective regulation: governments must demand transparency in algorithms and access to APIs.
- Media education: teaching the population to consume information critically is essential to counteract misinformation.
- Investment in the truth: as Harari proposes, it is necessary to allocate resources to strengthen journalism, education and technologies that promote transparency and fact-checking.
Until we address these root problems, we will remain complicit victims of a system that prioritizes clicks over truth and polarization over dialogue. The fight against disinformation is not just a technological battle but an ethical imperative that involves us all and compromises our future and our capacity for dialogue.
Interview between Time.news Editor and Dr. Maria Thompson, Misinformation Expert
Time.news Editor: Welcome, Dr. Thompson! Thank you for joining us today. With the profound challenges of misinformation and fake news amplifying in today’s digital age, it’s essential to dive deeper into these issues. Your recent analysis of the implications during the last US elections has stirred quite a conversation. Would you mind summarizing what you found?
Dr. Maria Thompson: Thank you for having me! The key takeaway from my analysis is that the amplification of misinformation during the last elections wasn’t solely due to the algorithms themselves, but rather our collective complacency and lack of critical engagement with the content we consume. While artificial intelligence plays a role, it’s primarily the booming activity of bots and users misrepresenting themselves that fuels the misinformation machine.
Editor: Interesting outlook. You mentioned the influence of social media platforms like X and the “amplification effect.” Could you expand on how this mechanism works?
Dr. Thompson: Certainly! The “amplification effect” refers to how certain messages gain visibility not through genuine user engagement, but through a network of bots that generate fake likes, retweets, and shares. This creates an illusion of consensus and popularity around misinformation. As a result, actual users are more likely to encounter these distorted narratives in their feeds, leading to increased polarization and acceptance of hate speech.
Editor: That’s unsettling. Given the current landscape, why do you think users still trust these platforms as legitimate sources of information?
Dr. Thompson: It’s much easier to engage with content passively, with a like or a share, than to take the time to verify its source. This convenience lulls users into a false sense of security about the information they’re interacting with, despite the evident risks. Furthermore, the engagement algorithms are designed to prioritize emotionally charged content, which often exacerbates this problem.
Editor: You also touched on the role of journalism in combating misinformation. Can you explain why robust journalism is crucial for democracy in this context?
Dr. Thompson: Absolutely. Professional journalism serves as a gatekeeper to ensure that accurate and well-researched information reaches the public. It is a foundational pillar of democracy. However, when misinformation flourishes unchecked, it undermines informed decision-making and allows authoritarian perspectives to gain ground. Journalists must be supported to tackle these challenges head-on and hold platforms accountable.
Editor: Speaking of accountability, you raised an interesting point about restricted access to APIs from major digital platforms. How does this limitation exacerbate the issue of misinformation?
Dr. Thompson: The restricted access to APIs hinders researchers and developers from monitoring trends in misinformation effectively. While companies often cite data protection as the reason for limiting access, the underlying issue is their desire to maintain control over their ecosystems and narratives. Without openness, it becomes increasingly difficult to hold these platforms accountable for the roles they play in spreading misinformation.
Editor: It seems like a tightrope walk between privacy and accountability. You mentioned an intriguing alternative: the concept of “Open Gateway.” Can you elaborate on that?
Dr. Thompson: Open Gateway is a model that envisions programmable networks offering open access to functionalities for mobile applications. This would enhance collaboration and transparency in areas such as information security and operational efficiency. Unlike conventional platforms that keep their APIs closed, this approach promotes openness, facilitating a more robust collective effort against misinformation.
Editor: That’s a hopeful perspective amidst these challenges. In closing, Dr. Thompson, what would you suggest as the most critical step individuals can take to combat misinformation in their daily media consumption?
Dr. Thompson: The most vital step is to engage critically with what we consume. This means taking the time to verify sources, understanding the context of information, and being skeptical of narratives that seem too good, or too outrageous, to be true. Only by fostering a more astute approach to information sharing can we begin to challenge the tide of misinformation in our society.
Editor: Thank you for your insights, Dr. Thompson. It’s clear that we all have a role to play in addressing these pressing issues, and your expert perspective is invaluable for our readers.
Dr. Thompson: Thank you for having me! It’s crucial we keep having these discussions, as awareness is the first step toward creating change.