Poster
in
Workshop: Machine Learning for Remote Sensing (ML4RS)
GIMI: A Geographical Generalizable Image-to-Image Search Engine with Location-explicit Contrastive Embedding
Hao Li · Jiapan Wang · Balthasar Teuscher · Peng Luo · Danfeng Hong · Gengchen Mai · Martin Werner
Querying and localizing objects of interest within massive, multi-modal big geospatial data (BGD) is fundamental to spatial data science and Earth system science (ESS). However, effectively and efficiently searching an extensive collection of geospatial data (e.g., global satellite imagery) for patterns of interest is challenging, often requiring domain-specific prior knowledge (i.e., training labels) and intensive computational resources. To address this challenge, we introduce GIMI, a geographically generalizable image-to-image neural search engine that extends the cluster hypothesis from information retrieval theory - closely associated documents tend to be relevant to the same requests - to geospatial data. We explicitly integrate geo-location information into the contrastive learning of image embeddings via a general distance-penalized triplet loss. On this basis, GIMI is designed to support a wide range of search queries, including embedding-based similarity search and spatially constrained nearest-neighbor search. As a case study, we select the task of post-disaster damaged-building search to demonstrate the general idea behind GIMI and evaluate its performance in a critical real-world search scenario. Experiments show that GIMI achieves promising search performance, with respect to both accuracy and efficiency, in selected areas affected by the 2023 Kahramanmaraş Earthquake in Turkey.
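The abstract does not specify the exact form of the distance-penalized triplet loss, but one plausible reading is a standard triplet objective whose margin is modulated by the geographic distances between the anchor and the positive/negative samples. The sketch below is a hypothetical NumPy illustration under that assumption; the function name, the penalty weight `alpha`, and the margin formulation are all illustrative, not the authors' implementation.

```python
import numpy as np

def geo_triplet_loss(anchor, positive, negative,
                     geo_dist_pos, geo_dist_neg,
                     margin=1.0, alpha=0.1):
    """Illustrative distance-penalized triplet loss (assumed form).

    anchor, positive, negative: (B, D) embedding arrays.
    geo_dist_pos, geo_dist_neg: (B,) geographic distances (e.g., km)
        from the anchor location to the positive/negative locations.
    The geographic term enlarges the margin when the negative lies
    much farther from the anchor than the positive, encouraging the
    embedding space to respect spatial proximity.
    """
    # Squared Euclidean distances in embedding space
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)
    # Geography-aware margin (hypothetical penalty form)
    geo_margin = margin + alpha * np.maximum(geo_dist_neg - geo_dist_pos, 0.0)
    # Hinge over the batch
    return np.maximum(d_pos - d_neg + geo_margin, 0.0).mean()
```

In practice such a loss would be written in a deep learning framework (e.g., PyTorch) so gradients flow into the image encoder; the NumPy version above only shows the arithmetic.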