From the panoptic eyes in the sky of satellites and drones to the intimacy of smart homes and phones, the pervasive nature of machine vision makes algorithmic subjugation inescapable. How does machine vision recognise a domestic item such as a desk lamp? This project analyses the algorithms of commercial object recognition software (such as those used by Tesla and Facebook) by reverse-engineering their process. Object recognition software can act as an agent in potential life-and-death situations, and this project questions its reliability.


When a machine sees an object, the recognition system first reduces noise to extract the essence of the object. Recognition takes place when that essence resembles an archetype in the system's training database. However, who decides how the training data is selected, categorised and confirmed? How easily does a machine fail to distinguish its surroundings because of minor distortions and external factors? And how much agency is assigned to such systems in critical decision-making moments?
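The recognition loop described above can be sketched in a few lines: reduce noise, then label the input with the nearest stored archetype. This is a deliberately minimal illustration of the principle, not any vendor's actual pipeline; the function names, the toy one-dimensional "images", and the archetype values are all hypothetical.

```python
def denoise(signal, window=3):
    """Smooth a 1-D signal with a simple moving average (noise reduction)."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def distance(a, b):
    """Euclidean distance between two equal-length signals."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def recognise(signal, archetypes):
    """Label the input with the archetype closest to its denoised 'essence'."""
    essence = denoise(signal)
    return min(archetypes, key=lambda label: distance(essence, archetypes[label]))

# Toy "trained database": one archetype per category (values are illustrative).
archetypes = {
    "desk lamp": [0.0, 1.0, 1.0, 0.0, 0.0, 1.0],
    "chair":     [1.0, 0.0, 0.0, 1.0, 1.0, 0.0],
}

# A noisy observation of a desk lamp: small distortions still resolve to the
# right label, but larger ones can flip it -- the fragility the text points at.
noisy_lamp = [0.1, 0.9, 1.1, 0.0, -0.1, 0.9]
print(recognise(noisy_lamp, archetypes))  # prints "desk lamp"
```

Even in this toy version, the decisive questions sit outside the code: who chose the archetypes, and how much distortion the distance threshold tolerates before the label flips.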

The physical object demonstrates how an archetype of a Western desk lamp consists of various merged lamps, before noise reduction morphs the archetype into a shape unrecognisable to the human eye. The archetypal object derives from what Harun Farocki would call an "operational image" that is made for machine legibility. These operational images are brought to concrete reality through the 3D-printed desk lamp.


2017–2018, moving images and object (26 x 12 x 26 cm)


Featured on DomusWeb