Who Goes There? Approaches to Mapping Facial Appearance Diversity

Zachary Bessinger, Chris Stauffer, and Nathan Jacobs



Geotagged imagery, from satellite, aerial, and ground-level cameras, provides a rich record of how the appearance of scenes and objects differs across the globe. Modern web-based mapping software makes it easy to see how different places around the world look, from both satellite and ground-level views. Unfortunately, interfaces for exploring how the appearance of objects depends on geographic location are quite limited. In this work, we focus on a particularly common object, the human face, and propose learning generative models that relate facial appearance and geographic location. We train these models using a novel dataset of geotagged face imagery we constructed for this task. We present qualitative and quantitative results demonstrating that these models capture meaningful trends in appearance. We also describe a framework for constructing a web-based visualization that captures the geospatial distribution of human facial appearance.
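To make the idea of location-conditioned appearance modeling concrete, here is a toy sketch (not the paper's actual model or dataset) of one simple way to relate appearance features to geographic location: bin geotagged feature vectors into coarse latitude/longitude cells and fit a Gaussian per cell. All data and the `cell_index` helper below are synthetic illustrations.

```python
import numpy as np

# Toy sketch, NOT the paper's method: estimate p(appearance | location)
# by binning geotagged "face feature" vectors into coarse geographic
# cells and fitting a per-cell Gaussian (mean + diagonal variance).

rng = np.random.default_rng(0)

# Synthetic stand-in data: (lat, lon) locations and 4-D appearance features
# with an artificial latitude-correlated trend.
locs = rng.uniform([-90.0, -180.0], [90.0, 180.0], size=(1000, 2))
feats = rng.normal(size=(1000, 4)) + locs[:, :1] / 90.0

def cell_index(locs, cell_deg=30.0):
    """Map (lat, lon) rows to integer grid-cell ids (hypothetical helper)."""
    rows = np.floor((locs[:, 0] + 90.0) / cell_deg).astype(int)
    cols = np.floor((locs[:, 1] + 180.0) / cell_deg).astype(int)
    n_cols = int(360.0 / cell_deg)
    return rows * n_cols + cols

cells = cell_index(locs)
model = {}
for c in np.unique(cells):
    sel = feats[cells == c]
    # Mean and variance summarize appearance within this geographic cell.
    model[c] = (sel.mean(axis=0), sel.var(axis=0) + 1e-6)

# Query the expected appearance at the location of the first sample.
query = locs[:1]
mean, var = model[cell_index(query)[0]]
```

A map-based interface can then color each cell by (or render faces from) its fitted distribution; richer conditional generative models replace the per-cell Gaussian in practice.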

Interactive Map

Click on the map below to explore our web interface!


Paper (pdf) Poster (pdf) Dataset


@inproceedings{bessinger2016whogoesthere,
  title = {Who Goes There? Approaches to Mapping Facial Appearance Diversity},
  author = {Bessinger, Zachary and Stauffer, Chris and Jacobs, Nathan},
  booktitle = {ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems (ACM SIGSPATIAL)},
  year = {2016},
  organization = {ACM}
}