Abstract
This talk examines the concept of history inherent in so-called 'foundation models,' focusing on OpenAI’s CLIP, a large-scale multimodal model that projects text and image data into a common embedding space. CLIP enables not only the automated labeling of images but also the automated production of images from labels (as one component of the DALL-E 2 composite model). Starting from Walter Benjamin’s concept of history, I argue that the spatialization of the past that occurs in models like CLIP invalidates the past’s potential to become history: to be productively reframed in a moment of crisis, to be at once similar and dissimilar to the present.
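
For readers unfamiliar with how such a common embedding space is put to work, the following minimal sketch (my illustration, not part of the talk itself) shows zero-shot image labeling with CLIP via the Hugging Face transformers library; the checkpoint name, candidate labels, and image path are illustrative assumptions.

```python
# Minimal sketch of CLIP-style zero-shot labeling: the image and each
# candidate label are projected into the same embedding space, and the
# label whose embedding is most similar to the image embedding wins.
# Checkpoint, labels, and image path are illustrative assumptions.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("example.jpg")  # any local image
labels = ["a photograph of a ruin", "a photograph of a new building"]

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds the (scaled) similarities between the image
# embedding and each text embedding in the shared space.
probs = outputs.logits_per_image.softmax(dim=-1)
for label, p in zip(labels, probs[0].tolist()):
    print(f"{label}: {p:.3f}")
```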