Mix-Up Inspires Saudi Designer to Create ‘Garden of Men’

Published June 26th, 2021 - 06:36 GMT
Sara Khalid

“Garden of Men” was inspired by a previous project of Alya’s in which she ran different photos through an AI model that generates automated captions.

One image was of her father and other male family members standing together wearing their shemagh (headscarves), but the AI recognized them as a group of women. We can’t really tell what went wrong, but the system probably mistook the shemagh for hair.

That mix-up inspired my collaboration with Alya. We decided to test the AI’s ability (or inability) to understand Saudi dress. The idea was to examine whether the AI model would keep misconstruing men wearing the shemagh. To do that, we fed multiple photos into the model, and all the captions came out equally weird. One photo in particular intrigued us the most: a group of young men posing together in a stadium. The caption described them as “flower-filled vases.” Alya and I immediately chose this photo as the basis of our “Garden of Men” project.

We found the contrast of the men being mistaken for flowers interesting. That the AI system inaccurately saw masculinity as a typically delicate, feminine object was particularly intriguing. We decided to take the contrast a step further and masked flowers onto the men’s faces. We then ran the edited image through a different AI generator called “Deep Dream,” which is supposed to add a dreamy touch to photos. In a way, we wanted to parody this mix-up. It’s like we insisted we were submitting to the view of the machine, but what we were really trying to do was to mock the system’s confusion.

“Garden of Men” was an attempt on our part to test the archival notion of AI technology, and I believe that the audience understood the main message behind it. We always speak of how biased this technology can be when it comes to race or gender, but we wanted to reveal how culturally biased it can be too; how it can simply fail to recognize certain cultural objects, like the shemagh, abaya or niqab.

This article has been adapted from its original source.


Copyright: Arab News © 2021 All rights reserved.
