As a scientist and healthcare provider, I’ve long valued the ability to connect with colleagues, share research, and engage in public discourse. For years, X (formerly Twitter) felt like a place where that could happen. But over time, I’ve grown increasingly concerned about its ethical, algorithmic, and societal harms—especially for those of us who rely on accurate information and meaningful dialogue in our work.
After much reflection, I’ve decided to step away from X and move toward platforms more aligned with values like transparency, integrity, and collaboration. I’m now publishing and engaging on Bluesky and Mastodon—two decentralized platforms that offer a healthier alternative for evidence-based communities.
In this post, I’ll share the reasoning behind my decision, introduce key concepts like democratic algorithmic governance, and offer a glimpse of what a more ethical digital future might look like.

Ethical Concerns: Who Controls Our Information?
X’s acquisition by Elon Musk raised fundamental questions about the concentration of power in digital communication. When the infrastructure of public discourse is owned by a single individual—especially one with strong ideological leanings—democratic values are at risk.
As noted in the 2025 Global Risks Report, misinformation and disinformation are among the greatest threats to global stability. Platforms like X, which prioritise virality and profit over accuracy, accelerate this trend. For those of us working in health, science, and education, the consequences are deeply concerning: evidence gets drowned out, public trust erodes, and vulnerable communities are exposed to harm.
Bluesky: Transparency and User Control
Bluesky offers a compelling alternative. Built on the open-source AT Protocol, it allows users to choose or even create their own algorithms. This transparency gives back agency to users and weakens the monopolistic grip of opaque systems.
Bluesky isn’t perfect—it’s still in development, and the AT Protocol is currently stewarded by a company—but it offers a framework that is far more aligned with scientific values: open systems, reproducibility, and peer critique. It’s no surprise that many in our community are already making the switch, as a recent article in Nature reports.
Mastodon: Full Decentralization in Action
If Bluesky is an open protocol with centralized oversight, Mastodon takes things further. It’s part of the Fediverse, a network of independently-run servers (“instances”) that communicate through open standards.
There’s no central company. Each server has its own moderation policy and culture. Crucially, Mastodon doesn’t push content using engagement-driven algorithms—your feed is chronological, not manipulated. It’s a quieter, more community-oriented space, especially valuable for those disillusioned with the noise and hostility of mainstream platforms.
Algorithmic Manipulation vs. Democratic Governance
One of the most dangerous shifts in social media has been the rise of opaque algorithmic curation. These algorithms decide what we see, reward outrage and division, and often bury reliable information in favour of sensationalism. Le Nguyen Hoang’s book La Dictature des Algorithmes shaped my understanding of this issue.
But what if algorithms could be shaped democratically?
This is where the work of Audrey Tang, Taiwan’s digital minister, offers an inspiring model. Tang has shown how digital platforms can support consensus-building rather than conflict. In Taiwan, tools like Pol.is are used to map public opinion—not by amplifying the loudest voices, but by highlighting areas of broad agreement. In an interview, Tang explains how transparent, participatory systems can foster civil dialogue even on controversial issues. It’s a vision of what technology could be: a tool for democracy rather than division.
The Tournesol Project: A Real-World Example
Another powerful initiative that shaped my thinking is the Tournesol Project, co-founded by Le Nguyen Hoang, a researcher in AI safety and science communication. Tournesol is a collaborative platform where users rate the quality of YouTube videos based on shared public-interest values—such as trustworthiness, fairness, and informativeness.
These ratings are then used to recommend content, not to drive clicks or ads. Unlike the black-box algorithms of mainstream platforms, Tournesol’s logic is open, auditable, and guided by human judgment. You can explore it here: tournesol.app
Projects like Tournesol show that algorithmic governance doesn’t have to be dystopian or corporate-controlled. With the right design, it can be collective, transparent, and accountable.

Social Harms: Why It’s No Longer Safe to Stay
Beyond the technical issues, X has become an increasingly unsafe and toxic environment, particularly for women, minorities, and marginalized communities. Harassment, hate speech, and trolling are rampant, and moderation has only worsened, allowing these problems to persist.
For healthcare providers and researchers, this toxicity has direct consequences. Public engagement is a vital part of our roles, but when a platform becomes hostile or amplifies misinformation, it jeopardizes both the message and the messenger. The proliferation of misinformation also undermines public trust in science and medicine, a trend that is deeply concerning in the context of global health crises.
Why I’m Moving—and Where You Can Find Me
This isn’t just about leaving a broken platform. It’s about choosing better ones. Spaces where integrity, care, and critical thinking are still possible.
You can now find me on Bluesky and Mastodon, where I’ll continue sharing reflections on physiotherapy, science, ethics, and technology in spaces that better reflect my values.
A Call for Change
Leaving X wasn’t easy: I will lose my followers there, and many of them won’t make the move. But it feels necessary.
We have a responsibility—as scientists, health professionals, educators, and citizens—to think critically about the systems we participate in. Where we post is not neutral. Platforms shape discourse, trust, and public understanding.
By supporting open protocols, democratic governance, and ethical tech like Mastodon, Bluesky, and Tournesol, we can help create a healthier digital future.
If this resonates with you, I encourage you to explore these alternatives. Follow the work of Audrey Tang, check out the Tournesol Project, and try out platforms designed to support—not sabotage—public dialogue.
Let’s lead by example, together.
References
- 2025 Global Risks Report – World Economic Forum
- How Technology Can Strengthen Democracy: An Interview with Audrey Tang – The Economist
- Why Scientists Are Moving to Bluesky – Nature
- La Dictature des Algorithmes by Le Nguyen Hoang – Science4All
- Digital Democracy Is Within Reach – Center for Humane Technology Podcast