Corina Enache & Anna Aris

Failing to work with me: the implications of creating rigid algorithms, and an argument for porous boundaries between algorithms, the human and the practice they affect

From October 2018 to March 2019, our team of anthropologists completed an ethnographic research project at a large airline company on digitization and change. Our aim was to better understand how the mechanics, who carry out engineering and maintenance on the airplanes, learn to work with new digital technologies such as the Boeing 787; to what extent the algorithmic design of these technologies reflects the needs of their practice; and how using such algorithms affects their craftsmanship.

Throughout our fieldwork, we observed that such algorithms carry within them assumptions about the work that reflect neither the social reality of the practice nor the central role of the ‘human factor’ in both the development of technologies and the practice in the hangars itself. These assumptions lead to disconnects between software, hardware and the algorithms that inform them, and they prompt us to reconsider the ‘human factor’ and craftsmanship as essential drivers of the functioning of the hangars. Crucial here is the translation (or: abstraction) from the ‘chaotic’ human world into the binary digits of algorithm development. If that translation is not done correctly - for example, when the social needs of the practice in the hangars are misunderstood or undervalued by the developer who builds the algorithm - the practice itself is negatively affected in three ways. First, the rigidity of the algorithms that mechanics are expected to use undermines their ability to do their work; second, that rigidity prevents mechanics from contributing to the organisation of the work; and third, it prevents the algorithm itself from ‘learning’ and staying in sync with the mechanics and the practice.

If the human factor is not regarded as immanent to the (development of the) practice in the hangars, the problem of rigidity not only reinforces the disconnect between mechanics, technologies and the needs of the practice; it also makes the problem itself invisible. This vicious cycle can only be broken by a more nuanced understanding of technology and its algorithms as something that not only affects, but is also brought about by, human and cultural needs - and, with that, a more porous boundary between algorithm development and humans, one that allows more flexibility for human intervention and, in turn, a better learning process for both the algorithm and the human.

Our talk will also include a contribution from Alexei Sharpanskykh, Assistant Professor in Air Transport at the Faculty of Aerospace Engineering of TU Delft.


Corina Enache

Corina Enache is a hybrid of two worlds: an ex-corporate professional in global innovation and an MSc in cultural anthropology. She has been working as an applied anthropologist in the technology space for the last four years, having led assignments for companies such as KLM, Wintec, PwC and Harmoney. She is also the co-host of a global podcast project – The Human Show – where she interviews social scientists and industry professionals about their work in the technology space. Lastly, she is the co-organiser of The Amsterdam Interbuilding Applied Anthropology Meetup group, built in partnership with the University of Amsterdam.


Anna Aris

Anna Aris recently obtained her MSc degree in Applied Anthropology at the University of Amsterdam. She has been working on an ethnographic research project at KLM. Her research interests centre on material culture: in particular, the cultural constructions that arise through human interaction with everyday objects such as technological artefacts, and the ways people relate to their respective environments. Together with Corina, Anna co-organises The Amsterdam Interbuilding Applied Anthropology Meetup group.