I joined my first startup in 2012, and as my career has moved between more startups and full-time jobs, I have come to believe that founding a startup is like marriage: you gain experience if you marry young, but you get a stable marriage when you marry of age.
Now that I am reaching a defining point in my career, at the intersection of technology, media, human rights, and the incoming tsunami of AI, I see that startups blending AI and human rights often wrestle with the gap between bold visions and everyday challenges.
Founders convey the vision and engineers craft the tools to apply it to reality, but success comes from mastering the details. These range from creating the legal frameworks and shaping the ethical standards to recruiting the right talent and securing funding. If any of these elements fails, the entire idea grinds to a halt.
Between September 2025 and February 2026, I stepped in as the Interim Head of Operations and Strategy for Anmat, a media intelligence project that uses machine learning to track media bias in the Global South. My job was to take a promising technical concept and build it into an organisation ready for investment. I concentrated on creating stability and putting the right systems in place.
As this phase ends and I hand it over to the permanent team, I want to share what I learned about building institutional resilience from the start.
Operationalising AI Ethics
Saying an organisation is ethical is easy. Putting ethics into practice is much harder.
We used natural language processing to detect erasure in headlines, which is a serious responsibility. I worked on the first version of the Anmat Data Style Guide, shifting the organisation’s voice from activist to empirical and focusing on evidence instead of opinion.
Rather than calling an outlet biased, we built a framework to report, for example, that our model found a 2.2-times increase in passive language when describing Subject A compared to Subject B. Framing findings this way keeps communication clear between data scientists and storytellers, which is essential in digital rights work, and even more so in any kind of remote work.
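To make the idea concrete, here is a minimal sketch of how such a ratio could be computed. This is not Anmat's pipeline: the passive-voice detector is a crude regex heuristic (a form of "to be" followed by a word ending in -ed/-en), the headlines are invented placeholders, and a real system would use a proper dependency parser.

```python
import re

# Crude passive-voice heuristic: "to be" + a word ending in -ed/-en.
# A production system would use a syntactic parser instead.
PASSIVE = re.compile(r"\b(is|are|was|were|been|being|be)\s+\w+(ed|en)\b",
                     re.IGNORECASE)

def passive_rate(headlines):
    """Fraction of headlines containing at least one passive construction."""
    hits = sum(1 for h in headlines if PASSIVE.search(h))
    return hits / len(headlines)

# Invented example headlines about two hypothetical subjects.
subject_a = [
    "Protesters were dispersed by police",
    "Homes were destroyed in the strike",
    "Aid was blocked at the border",
    "Residents flee the city",
]
subject_b = [
    "Army strikes targets",
    "Forces advance on the city",
    "Officials were criticised",
    "Minister announces plan",
]

rate_a = passive_rate(subject_a)
rate_b = passive_rate(subject_b)
# The reportable, empirical claim: "N times more passive language
# when describing Subject A than Subject B."
ratio = rate_a / rate_b
```

The point of the design is that the output is a number a storyteller can quote and a data scientist can defend, rather than a judgement about intent.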
Building Systems While Operating
Beyond strategy, we had to set up the core systems and make sure ongoing costs were manageable. We prepared the organisation for its seed round and submitted applications to secure long-term funding.
The exit is a key part of any interim role. We completed a data pipeline handover so that the next Technical Lead receives a documented system rather than a black box.
What’s Next
My work with Anmat focused on getting the project started quickly. By February, the project was operational and had a clear governance plan, and I am now leaving the initiative in the capable hands of the founder, Hager Hesham, an award-winning data journalist with whom I greatly enjoyed working. I strongly believe in Anmat’s potential to make media analysis more accessible in the MENA region and the Global South.
For organisations working in AI governance, digital rights, or open knowledge, do not wait for stability before building systems. Build first to stabilise.