
I’m very pleased to have contributed a chapter on the gender AI gap to the volume “AI Economy”.
The goal was to show why artificial intelligence is not neutral and why gender inequality is a central economic and political issue.
In the book, my contribution was the only one to address the relationship between AI and gender explicitly, organically, and politically, going beyond a simple mention of the problem.
In the days surrounding the volume’s release, several articles published on Tgcom24 and Money.it relaunched the debate on the AI economy, focusing on the effects of artificial intelligence on businesses, productivity, and the transformation of capitalism. A useful and necessary debate. But also, once again, partial.
What continues to be missing, or remains in the background, in my opinion, is a clear acknowledgement: artificial intelligence is not neutral. It is a social technology, trained on historical data, designed within existing power structures, and governed by actors who do not represent society as a whole. For this reason, the AI gender gap is not an “ethical” or “accessory” issue, but a central economic and political question. And I thank Jacopo Paoletti, the author, for involving me in this work, which has the hallmarks of a collaborative effort, with numerous contributions from diverse fields.
My work for years has focused precisely on this issue: demonstrating how gender inequalities permeate the entire AI cycle, from training to careers, from governance to concrete applications. Women remain underrepresented in technical and decision-making roles, and this absence is not neutral. It means that the systems increasingly used to hire, evaluate, classify, and distribute opportunities are built on a partial vision of the world.
In the book, I wanted to clarify an often simplified point: it’s not enough to ask who programs AI, even though it’s crucial to increase the presence of women in STEM disciplines. The deeper question concerns which data, which imagery, and which bodies are considered “standard” by algorithms, and which remain marginal or invisible.
This has very concrete consequences. The data show that AI’s impact on work will not be gender-neutral. The jobs most exposed to transformation and automation, particularly clerical and administrative jobs, are currently held largely by women. Without political and cultural intervention, AI risks amplifying occupational segregation rather than overcoming it.
There is also a paradox worth highlighting. AI and gender equality have both firmly entered the public discourse. But while artificial intelligence advances at an accelerated pace, gender equality is proceeding extremely slowly. Inserting AI into an already deeply unbalanced context, without challenging it, means automating existing inequalities.
In my contribution to AI Economy, I chose to take a clear position: AI can become a tool for transformation only if we address the gender AI gap as a structural issue, not as a peripheral concern. In this sense, transfeminisms offer a fundamental critical key, because they question power, reject binarism, and introduce an intersectional perspective capable of interpreting the complexity of the present.
Regulating artificial intelligence is necessary, but not sufficient. We need to rethink the development model that produces it, the economic priorities that guide it, and the criteria by which we define value, competence, and productivity. Without this shift, the AI economy risks becoming an economy of exclusion.
But, at the same time, it can become an opportunity to redraw the rules of the game.
My contribution stems from this conviction, developed through research, advocacy, and critical analysis: there is no innovation without social justice, and the AI gender gap is one of the crucial issues for the future of work and technological democracy.
