What do feminists think about technology?

What does the future of feminist AI look like?

Data and the systems that work with it have become indispensable tools for challenging those in power and bringing about social change in communities. Yet marginalized communities are often excluded from the design processes that decide how the data should be used (and abused).
 
Data is needed to understand, plan and measure projects. Not only do movements rely on data; so do the technologies that organize digital space, and the same is increasingly true of the algorithms that shape our lives. Data is generated, processed and evaluated under unequal power relations and therefore reproduces the oppressive and discriminatory norms of today's societies, whether colonialism, racism, classism, patriarchy or other structures.
 
To achieve real data justice, the ever-growing volumes of data must be put to use through inclusive data practices, and the potential for digital violence in data production must be exposed: the potential for abuse, for epistemic violence and for algorithmic discrimination.
 
In her book Race After Technology, Ruha Benjamin describes how technology, behind a veil of historical prejudice, bias and inequality, is camouflaged as objective in order to control Black and dark-skinned bodies. As an example, she cites the Beauty AI initiative, the first beauty contest judged by robots. Although it relied on "the most advanced machine learning technology", almost all of the 44 winners across the various categories were, predictably, fair-skinned. Benjamin draws the following conclusion: "Beauty lies in the trained eye of the algorithm." This is a rather trivial example, but Benjamin also describes far more harmful applications of algorithms in areas such as policing, incarceration and social welfare.

Countries in Africa are rapidly adopting new technologies, but at what cost? | Photo: Slim Emcee / Unsplash

African feminism, technology and the role of art

Feminist scholars argue that when studying technology and gender relations, care must be taken not to treat technology and gender as separate domains, but as "co-produced". Neither exists in a vacuum: technology is shaped both by the environment in which it is used and by the social structures in which gender relations develop. Moreover, technology and gender are constantly changing within this co-productive relationship, a process that does not end with the production of a technology but extends to its design and use and to the content it conveys. There is therefore a dynamic interaction between technology and gender.

African feminists understand and accept their responsibility to strive for societies in which justice and equality prevail. Feminist and women's rights movements in Africa differ significantly from the mainstream movements of the West and the Global North. An examination of feminist thinking on the African continent reveals a multitude of feminisms that are locally rooted yet share many similarities. Linking intersectionality with other feminist theoretical frameworks shows how laws, traditions, violence, rituals, customs, education, language and work are used to make both digital and offline spaces patriarchal.
 
Yet in African feminism, art and love are viewed as radical, transformative acts. As Minna Salami explains in her work 7 Key Issues in African Feminist Thought: "African feminist thinking is based on the idea that love and justice on the one hand and revolution and change on the other complement each other. At the heart of this thinking is healing, reconciliation and the assurance that the language of African women, thanks to its global position, is the language that can transform our society into a community where sexual, ethnic, spiritual, psychological and social equality will be guaranteed." Art can reach beyond technical jargon and put AI governance at the center of attention.
 
In 2019 my organization Pollicy received a Creative Media Award from Mozilla, the maker of the Firefox browser. Mozilla sponsors art projects and advocacy initiatives that shed light on the role AI plays in spreading misinformation. We developed a "choose your own adventure" game that lets players step into the role of one of three characters from East Africa, make decisions for them, and trace and investigate misinformation and fake news. Mozilla has since doubled down on these efforts and recently launched the Creative Media Awards 2020, which recognize technologists, artists and media producers who make visible how AI interacts with online media and truth, and who illuminate its impact on Black experiences.

Pollicy has developed a web-based game called "Choose Your Own Fake News" | Copyright: @neemascribbles

Data fairness and transparency

 
Governments across Africa are rapidly adopting novel technologies. These platforms are often bought from foreign providers under opaque procurement agreements, with little or no transparency about how the algorithms work or how they will be used in the future. In her work Algorithmic Colonization of Africa, Abeba Birhane describes numerous examples of how the rapid spread of AI interventions across Africa parallels the exploitation of the colonial era, including the use of facial recognition software in Zimbabwe and Uganda, predatory micro-lending in Kenya, and the discriminatory biometric identification systems being set up across the continent.
 
There is a growing consensus that data must better serve the interests of women and marginalized groups and counter the backlash against feminist demands. This is particularly important for how governments and development partners respond to the COVID-19 pandemic, as it will determine how medical, social and psychosocial support for women and other vulnerable groups is designed and delivered. It is imperative for feminist organizations and women's groups in Africa to promote a critical discourse about AI systems, so that they engage proactively rather than merely react, and to create gender-inclusive frameworks for overseeing the implementation of AI that guarantee welfare and security.

Neema Iyer | © Neema Iyer
 
Feminist movements have a great opportunity to work at the intersections of gender, technology, research and art to shed light on the growing influence of AI on everyday life. Questions of AI regulation are often left to technologists, but we need feminist contributions so that we can build, together, an inclusive and ethical digital future in which all groups are involved and the discriminatory and harmful practices traditionally built into software are overcome. We need to be able to ask the right questions and think critically about what society could look like under different models for dealing with AI and data, today, tomorrow and far into the future.

More from Neema Iyer on the future of creative AI can be found here.