It’s a non-binary world

The fact that 55.1% of the world’s population has access to the internet (Internet World Stats, 2018) and can freely express themselves has changed the experience of being human. Along with that interconnectivity, the boundaries between genders, nationalities and races have become increasingly blurred, to the point of being practically non-existent. Are we more diverse now than ever, or were we always, and it is simply easier to see now? The more connected we are, the more we can see our own diversity and complexity.

Along with this realisation, however, there should be a big disclaimer. That is absolutely not to say that the experiences of people identified as minorities don’t count and should be dismissed: quite the opposite. More than ever, it is necessary to listen and to acknowledge what oppression feels like, through the direct speech of those who suffer it.

We now have first-hand access to stories of people whom, in the past, we could only relate to through movies, songs and books. The difference is that, most of the time, those behind such works are, ironically, representatives of the majority: white, cis, northern-hemisphere-born males. Even today, works that intend to tell the story of, for example, a black woman and her experiences are mainly written by white men, because such women still don’t have a strong enough voice and place in our society.

That raises a big problem of inaccurate representation. Even I, a white woman, can’t possibly infer how a black woman experiences something, so how could a white man?

The problem with automation

So far, algorithms have shown themselves to be utterly ignorant when it comes to recognising the colourful spectrum of the experience of being human. In her article, Costanza-Chock shows how embarrassing and humiliating it can be for a transgender woman to go through airport security control. The A.I. can’t interpret the fact that the passenger’s declared gender and the body scanner data don’t quite match, so her existence is “flagged” by the machine as a threat. She then has to go through a very humiliating process of questioning and body search. That is just one example.

“At each stage of this interaction, airport security technology, databases, algorithms, risk assessment, and practices are all designed based on the assumption that there are only two genders, and that gender presentation will conform with so-called ‘biological sex.’ Anyone whose body doesn’t fall within an acceptable range of ‘deviance’ from a normative binary body type is flagged as ‘risky’ and subject to a heightened and disproportionate burden of the harms.” (Costanza-Chock, 2018)
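To make the flaw concrete, here is a minimal sketch, assuming a toy Python model with made-up names (BINARY_GENDERS, flag_passenger, declared_gender, scan_classification) rather than the real screening software, of the kind of binary decision rule the quote describes: any body that doesn’t match a two-valued record is treated as risk by design.

```python
# Illustrative sketch only: hypothetical names, not the actual airport
# security system. It shows how a binary gender assumption turns
# non-conforming passengers into "anomalies" by design.

BINARY_GENDERS = {"male", "female"}  # the flawed assumption: only two options exist

def flag_passenger(declared_gender: str, scan_classification: str) -> bool:
    """Return True if the passenger is flagged for extra screening."""
    # Anyone whose record falls outside the binary is flagged before
    # any comparison with the scanner even happens.
    if declared_gender not in BINARY_GENDERS:
        return True
    # Any mismatch between the record and the scanner's normative body
    # model is read as "risk", regardless of who the person actually is.
    return declared_gender != scan_classification

# A transgender woman whose scan is matched against the "wrong" template
# is flagged, even though nothing about her is a threat.
print(flag_passenger("female", "male"))  # True -> "flagged", extra search follows
```

The point of the sketch is that the harm is not a bug in the comparison; it is the two-option data model itself.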

The questions that arise from this subject are many. First of all, why do these algorithms exist and whom do they benefit? Is it for practicality and making life easier? If so, whose life is becoming easier with it?

Another interesting concept presented in the article is Crenshaw’s concept of intersectionality, which explains how social injustice is often the combined product of two or more factors underlying someone’s identity. It is to say, for example, that black women suffer not only from sexism but also from racism, and the two merge together to shape an experience of discrimination more intense than if she were ‘just’ a woman.

As designers and communicators of ideas, we are largely responsible, directly or indirectly, for shaping society’s views. So it is about time we paid attention to these concepts, especially if we are not affected by them. And not only to the concepts: we must also listen to how different people experience the world if we want to design smoother experiences for everyone.

At this point I ask myself whose lives are becoming easier with the design of algorithms that recognise and classify humans for any purpose. Designers’? Companies’? I can certainly say that there are a lot of people who are not happy with their experiences.
I wonder whether it is even possible to create algorithms that can recognise human complexity. And do they really need to exist? Are we in such a rush that we cannot even pay attention to the people around us? How can we move forward with the development of A.I. without running people over with automation?


REFERENCES

Costanza-Chock, S., 2018. Design Justice, A.I., and Escape from the Matrix of Domination. Journal of Design and Science. [Online] Available at: https://jods.mitpress.mit.edu/pub/costanza-chock [Accessed 30 October 2018].

Internet World Stats, 2018. Internet Usage Statistics: The Internet Big Picture, World Internet Users and 2018 Population Stats. [Online] Available at: https://www.internetworldstats.com/stats.htm [Accessed 29 October 2018].
