A portrait of FATIMATA SAWADOGO #thisnamedpersondoesnotexist #DALLE

This Fatimata SAWADOGO does not exist

We’ve used the DALL-E text-to-image AI to generate portraits based on names shared by a large number of people. This is the second blog post in our series tagged #thisnamedpersondoesnotexist, exploring how text-to-image AIs interpret personal names. We believe this project can illustrate the complexity of personal name interpretation, at the crossroads of ethnography, sociology, sociolinguistics, geography, history and, more recently, machine learning.

Fatimata SAWADOGO is a Burkinabè name shared by hundreds of people in Burkina Faso, mostly in the Centre-Nord and Nord regions of the country.

Here are portraits generated by DALL-E from the prompt “A portrait of FATIMATA SAWADOGO”.

[Four portraits generated by DALL-E — “A portrait of FATIMATA SAWADOGO” #thisnamedpersondoesnotexist #DALLE]

This Adama SAWADOGO does not exist

We’ve also used the prompt “A portrait of Adama SAWADOGO” to generate a male face; interestingly, the AI chose a different style altogether.

[Four portraits generated by DALL-E — “A portrait of ADAMA SAWADOGO” #thisnamedpersondoesnotexist #DALLE]

Machine Learning Fairness and AI Ethics implications

This is quite impressive, isn’t it? But if a “black box” AI can interpret a name so well, it can also lead to disastrous effects when machine learning is applied to decision making, with potentially discriminatory outcomes. At NamSor, we recognize the incredible complexity of biases in machine learning (gender bias, racial bias, bias based on an individual’s country of origin, and others). We believe the effect of ML on real people’s lives should be evaluated, to be fair and provide equal opportunity to all. Statistical fairness evaluation tools, such as the University of Chicago’s open-source tool Aequitas, or diversity indexes applied to bio-cultural diversity (Shannon index, Simpson index), could be used to define AI fairness quality labels.
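To make the diversity-index idea concrete, here is a minimal sketch of how the Shannon and Simpson indexes could be computed over a set of model predictions. The category labels and sample counts below are purely hypothetical, invented for illustration; they are not NamSor or Aequitas output.

```python
from collections import Counter
from math import log

def shannon_index(labels):
    """Shannon diversity index: H = -sum(p_i * ln(p_i)) over category proportions p_i."""
    counts = Counter(labels)
    total = sum(counts.values())
    return -sum((c / total) * log(c / total) for c in counts.values())

def simpson_index(labels):
    """Simpson diversity index (Gini-Simpson form): D = 1 - sum(p_i ** 2)."""
    counts = Counter(labels)
    total = sum(counts.values())
    return 1 - sum((c / total) ** 2 for c in counts.values())

# Hypothetical predicted-origin labels for a sample of 100 names
predictions = ["Burkina Faso"] * 60 + ["Mali"] * 25 + ["Senegal"] * 15

print(round(shannon_index(predictions), 3))
print(round(simpson_index(predictions), 3))
```

A higher value on either index means the model’s outputs are spread more evenly across categories; comparing the index of a model’s predictions against that of a reference population is one simple way such a fairness label could be operationalized.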

This original article was also freely adapted in Spanish.

About NamSor

NamSor™ Applied Onomastics is a European vendor of sociolinguistics software (NamSor sorts names). NamSor’s mission is to help understand international flows of money, ideas and people. We proudly support Gender Gap Grader.
