Technical Workshop
Deconstructing the Algorithmic Gaze: A Visual Analysis with Midjourney
Critically examine how AI image generators represent the Middle East and its people, uncovering embedded biases and Orientalist tropes inherited from their training data.
This workshop turns you into a researcher of AI bias. Using Midjourney and other image generation tools, we will systematically prompt these systems to produce images of the Middle East — cities, people, historical events, everyday life — and then critically analyze the results. What visual vocabularies do these systems default to? Where do the tropes come from? What is absent?
We will work through a structured protocol: generating images with identical prompts across different tools, documenting patterns of representation, comparing AI-generated images to real photographs, and tracing the visual genealogy of specific tropes back to Orientalist painting and colonial photography. This is not an art workshop — it is an exercise in excavating the assumptions embedded in training data.
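By way of illustration, the sketch below shows (in Python) how the prompt matrix and documentation step of that protocol might be organized. The subjects, framings, tool names, and log fields are hypothetical examples chosen for this sketch, not part of the workshop materials; the images themselves are generated by hand in each tool's own interface and annotated afterward.

```python
import csv
import itertools
from datetime import datetime
from pathlib import Path

# Hypothetical prompt matrix: identical prompts issued to several tools.
# Subjects, framings, and tool names here are illustrative placeholders.
SUBJECTS = ["a street market in Cairo", "a family dinner in Beirut", "daily life in Amman"]
FRAMINGS = ["photograph", "editorial illustration"]
TOOLS = ["Midjourney", "DALL-E 3", "Stable Diffusion"]

LOG_PATH = Path("generation_log.csv")
FIELDS = ["timestamp", "tool", "prompt", "image_file", "observed_tropes", "absences", "notes"]


def build_prompts():
    """Yield every subject/framing combination so each tool receives identical prompts."""
    for subject, framing in itertools.product(SUBJECTS, FRAMINGS):
        yield f"{framing} of {subject}"


def log_generation(tool, prompt, image_file, observed_tropes="", absences="", notes=""):
    """Append one generated image and its annotations to the shared CSV log."""
    new_file = not LOG_PATH.exists()
    with LOG_PATH.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "tool": tool,
            "prompt": prompt,
            "image_file": image_file,
            "observed_tropes": observed_tropes,
            "absences": absences,
            "notes": notes,
        })


if __name__ == "__main__":
    # Print the full prompt matrix; generate the images manually in each tool,
    # then record each result with log_generation() once you have annotated it.
    for tool in TOOLS:
        for prompt in build_prompts():
            print(f"[{tool}] {prompt}")
```

Holding the prompts identical and keeping a single shared log format is what makes the later steps possible: cross-tool comparison, pattern counting, and tracing recurring tropes back to their visual sources.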
The workshop connects directly to Crawford and Paglen’s “Excavating AI” and Kotliar’s “Data Orientalism,” grounding those theoretical frameworks in hands-on experimentation. You will leave with a portfolio of generated images annotated with your critical analysis — material that can inform your final research project and, more broadly, your ability to see the ideological work that AI image systems perform.