Isaiah
An AI curation tool that finds connections across thousands of artworks no human could hold in their head.
The idea
While studying in New York, I spent a lot of time in the city's museums. The Met, MoMA, the Guggenheim. I loved being in those rooms, but I rarely read the wall texts, and I had no real grasp of why the curators chose to put certain works together (especially with the modern art pieces). One afternoon at MoMA, I sat and watched, trying to read the room. Most visitors were doing the same thing I did. Walking through, glancing, moving on. A few stopped to read. Almost nobody seemed to be picking up on the curatorial thinking behind the show.
Curators put enormous effort into building exhibitions that tell a story, but that story mostly lives in their heads and on wall labels that people skip. I wanted to close that distance. What if AI could understand what a visitor is curious about and walk them through an exhibition built around that curiosity?
Isaiah is a tool that does that. You tell it what you're interested in, it asks follow-up questions to understand your intent, then it pulls together an exhibition from thousands of artworks, with explanations for why each piece belongs. A tool that curators can use to augment their expertise, and that lets audiences build their own exhibitions regardless of how much they know about art.
What I found along the way
I started by looking at previous experiments in AI curation. Studies such as the Duke Nasher Museum's AI-curated exhibition and IBM's Pinacoteca de São Paulo project showcased the possibilities and limitations of using AI in art curation. These experiments demonstrated AI's potential to identify patterns and streamline the curatorial process but also highlighted its inability to match the emotional and cultural depth of human curators.
The work that really got me going was Laura Herman's The Algorithmic Pedestal, a study that used Instagram's algorithm to curate an exhibition. Her study compared AI-curated results with those of a human curator, revealing that the human curator created more cohesive and meaningful narratives. However, Herman's research relied on the same limited dataset for both approaches, which constrained the AI's potential. This led me to explore a different direction: leveraging AI's capacity to process and analyze vast datasets. While human curators are naturally limited by memory and the scope of their research, AI can uncover hidden thematic connections and recall overlooked works from expansive collections.
Isaiah builds on these insights, aiming to bridge the gap between human curation and AI's capabilities. It explores how AI can complement curatorial expertise through methodologies like embeddings and multimodal models, balancing data-driven efficiency with the emotional and cultural nuance necessary to craft compelling and inclusive exhibitions.

Laura Herman, The Algorithmic Pedestal: A Practice-Based Study of Algorithmic and Artistic Curation, Oxford Internet Institute
Prototyping
Early prototypes were rough. I tried the MoMA API, the Met Collection API, Google Arts & Culture. Rich metadata, but not enough context. Titles and tags alone don't tell you why two paintings belong in the same room.
I went through several versions. Simple tag matching first, which was too shallow. Then clustering and embeddings, which got closer. I eventually landed on a semantic search approach using Artpedia, a dataset with actual detailed descriptions that gave the model more to work with. I also added curatorial questions generated through the OpenAI API, so the tool could have a back-and-forth with the user to figure out what kind of exhibition they were after.
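The core of the semantic search step can be sketched in a few lines. This is a toy illustration, not Isaiah's actual code: the three-dimensional vectors below stand in for real description embeddings (which in practice would come from a sentence-embedding model, e.g. via Transformers.js), and all names are illustrative.

```javascript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank artworks by how close their description embedding is to the
// user's query embedding, highest similarity first.
function rankArtworks(queryEmbedding, artworks) {
  return artworks
    .map((art) => ({
      ...art,
      score: cosineSimilarity(queryEmbedding, art.embedding),
    }))
    .sort((a, b) => b.score - a.score);
}

// Toy 3-dimensional "embeddings" for illustration only.
const query = [1, 0, 0];
const ranked = rankArtworks(query, [
  { title: "A", embedding: [0, 1, 0] },
  { title: "B", embedding: [0.9, 0.1, 0] },
]);
console.log(ranked[0].title); // → "B", the vector closest to the query
```

The same scoring works whatever model produces the vectors, which is what made it easy to swap datasets and embedding models between prototypes.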
The final version uses the OpenAI API and Transformers.js embeddings to analyze text and visual data together. It selects artworks based on relevance, narrative arc, diversity, and how well pieces complement each other.
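The source doesn't spell out the selection algorithm, but one standard way to trade relevance off against diversity is maximal marginal relevance (MMR): greedily pick the candidate most similar to the query and least similar to what's already chosen. The sketch below is an assumption about how such a step could look, not Isaiah's actual implementation.

```javascript
// Helpers: dot product and cosine similarity over plain arrays.
function dot(a, b) { return a.reduce((s, x, i) => s + x * b[i], 0); }
function cosine(a, b) {
  return dot(a, b) / (Math.sqrt(dot(a, a)) * Math.sqrt(dot(b, b)));
}

// Greedy MMR selection: each step takes the candidate that is most
// relevant to the query but least redundant with the works already
// chosen. lambda = 1 means pure relevance; lower values favor diversity.
function selectExhibition(queryEmb, candidates, k, lambda = 0.7) {
  const chosen = [];
  const pool = [...candidates];
  while (chosen.length < k && pool.length > 0) {
    let bestIdx = 0, bestScore = -Infinity;
    for (let i = 0; i < pool.length; i++) {
      const relevance = cosine(queryEmb, pool[i].embedding);
      const redundancy = chosen.length
        ? Math.max(...chosen.map((c) => cosine(pool[i].embedding, c.embedding)))
        : 0;
      const score = lambda * relevance - (1 - lambda) * redundancy;
      if (score > bestScore) { bestScore = score; bestIdx = i; }
    }
    chosen.push(pool.splice(bestIdx, 1)[0]);
  }
  return chosen;
}
```

A narrative arc or complementarity term could be folded into the same score; the point is that the whole trade-off reduces to one tunable objective per pick.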

Early prototype versions: workflow development



I went through many iterations to refine the curation logic, work around dataset limitations, and improve thematic coherence
Talking to real curators
I interviewed curators from the Met, the Brooklyn Museum, and several independents. Their biggest feedback was about the dataset. Meaningful curation needs rich, contextual descriptions, not just titles and dates. That's what pushed me toward Artpedia and the Met's detailed collection records. One curator told me something that stuck: "The best curation makes you feel like the artworks were always meant to be in the same room." That became the bar I was trying to hit.

Final version workflow

User interface frame design for the final version exhibition

Exhibition spatial layout design
Exhibited in the IMA gallery
The exhibition
I rebuilt Isaiah as a show version and exhibited it in the IMA gallery, with projection mapping displaying the artworks around the UI frame. It went from running in a browser to something audiences could physically walk into, each creating an exhibition that was theirs alone.
Isaiah changed how I think about AI. Not as a replacement for human judgment, but as a collaborator that handles what humans can't, like searching across ten thousand works at once, so people can focus on what they're actually good at. Building Isaiah also gave me a real understanding of how RAG pipelines and LLMs work in practice, and that's shaped everything I've built since.
Credits
Software / Code / Design
Mickey Oh
Advisors
David Rios
Daniel Shiffman
J. H. Moon
Gottfried Haider