SpaceNet 4: Off-Nadir Buildings

The Problem

Can you help us automate mapping from off-nadir imagery? In this challenge, competitors were tasked with finding automated methods for extracting map-ready building footprints from high-resolution satellite imagery captured at high off-nadir angles. In many disaster scenarios, the first post-event imagery comes from a more off-nadir viewpoint than is used in standard mapping use cases. The ability to use higher off-nadir imagery allows more flexibility in acquiring and using satellite imagery after a disaster. Moving toward accurate, fully automated extraction of building footprints will help bring innovation to computer vision methodologies applied to high-resolution satellite imagery, and ultimately help create better maps where they are needed most.

The main purpose of this challenge was to extract building footprints from increasingly off-nadir satellite images. The created polygons were compared to ground truth, and the quality of the solutions was measured using the SpaceNet metric.

Read more about the Challenge winners from our blog!

The Data – Over 120,000 building footprints over 665 sq km of Atlanta, GA, with 27 associated WV-2 images.

This dataset contains 27 8-Band WorldView-2 images taken over Atlanta, GA on December 22nd, 2009. They range in off-nadir angle from 7 degrees to 54 degrees.

For the competition, the 27 images are broken into 3 segments based on their off-nadir angle:

  • Nadir: 0–25 degrees
  • Off-nadir: 26–40 degrees
  • Very off-nadir: 40–55 degrees
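These angle buckets can be captured in a small helper; a minimal sketch (the function name is illustrative, not part of the dataset tooling):

```python
def nadir_segment(angle_deg: float) -> str:
    """Bucket an off-nadir angle (in degrees) into the challenge's three segments."""
    if angle_deg <= 25:
        return "Nadir"
    if angle_deg <= 40:
        return "Off-Nadir"
    return "Very Off-Nadir"
```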

The entire set of images was then tiled into 450m x 450m tiles.

See the labeling guide and schema for details about the creation of the dataset.

Catalog

The data is hosted on AWS as a Public Dataset. It is free to download, but an AWS account is required.
aws s3 ls s3://spacenet-dataset/spacenet/SN4_buildings/

Sample Data

2 Samples from each Off-Nadir Image – Off-Nadir Imagery Samples

To download processed 450mx450m tiles of AOI_6_Atlanta (728.8 MB) with associated building footprints:

aws s3 cp s3://spacenet-dataset/spacenet/SN4_buildings/tarballs/summaryData.tar.gz .

Training Data

SpaceNet Off-Nadir Training Base Directory:

aws s3 ls s3://spacenet-dataset/spacenet/SN4_buildings/tarballs/train/

SpaceNet Off-Nadir Building Footprint Extraction Training Data Labels (15 MB)

aws s3 cp s3://spacenet-dataset/spacenet/SN4_buildings/tarballs/train/geojson.tar.gz .

SpaceNet Off-Nadir Building Footprint Extraction Training Data Imagery (186 GB)

To download processed 450mx450m tiles of AOI 6 Atlanta:

Each of the 27 collects forms a separate .tar.gz labeled “Atlanta_nadir{nadir-angle}_catid_{catid}.tar.gz”. Each .tar.gz is ~7 GB.

aws s3 cp s3://spacenet-dataset/spacenet/SN4_buildings/tarballs/train/ . --exclude "*geojson.tar.gz" --recursive
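Given the naming pattern above, the expected tarball name for a collect can be derived from its nadir angle and catalog ID; a sketch, assuming the angle appears as a rounded integer as in the published filenames:

```python
def tarball_name(nadir_angle: int, catid: str) -> str:
    """Build the expected training tarball name for one collect."""
    return f"Atlanta_nadir{nadir_angle}_catid_{catid}.tar.gz"

# e.g. the most nadir collect in the table below:
name = tarball_name(7, "1030010003D22F00")
```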

Testing Data

AOI 6 Atlanta – Building Footprint Extraction Testing Data

To download processed 450mx450m tiles of AOI 6 Atlanta (5.8 GB):

aws s3 cp s3://spacenet-dataset/spacenet/SN4_buildings/tarballs/SN4_buildings_AOI_6_Atlanta_test_public.tar.gz .

The Metric

In the SpaceNet Off-Nadir Building Extraction Challenge, the metric for ranking entries is the SpaceNet Metric. This metric is an F1 score based on the intersection over union (IoU) of two building footprints, with a match threshold of 0.5.

The F1 score is calculated by totaling the true positives, false positives, and false negatives for each nadir segment, computing an F1 score per segment, and then averaging the segment scores.

F1-Score Total = mean(F1-Score-Nadir, F1-Score-Off-Nadir, F1-Score-Very-Off-Nadir)
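The averaging described above can be sketched as follows (function names and counts are illustrative, not real challenge results):

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 from aggregate true positive / false positive / false negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

def spacenet_score(per_segment_counts: dict) -> float:
    """Mean of the per-segment F1 scores (Nadir, Off-Nadir, Very Off-Nadir)."""
    scores = [f1_score(*counts) for counts in per_segment_counts.values()]
    return sum(scores) / len(scores)
```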

Collection Details

| # | Catalog ID | Pan Resolution (m) | Off-Nadir Angle (deg) | Target Azimuth (deg) | Category |
|---|---|---|---|---|---|
| 1 | 1030010003D22F00 | 0.48 | 7.8 | 118.4 | Nadir |
| 2 | 10300100023BC100 | 0.49 | 8.3 | 78.4 | Nadir |
| 3 | 103001000399300 | 0.49 | 10.5 | 148.6 | Nadir |
| 4 | 1030010003CAF100 | 0.48 | 10.6 | 57.6 | Nadir |
| 5 | 1030010002B7D800 | 0.49 | 13.9 | 162 | Nadir |
| 6 | 10300100039AB000 | 0.49 | 14.8 | 43 | Nadir |
| 7 | 1030010002649200 | 0.52 | 16.9 | 168.7 | Nadir |
| 8 | 1030010003C92000 | 0.52 | 19.3 | 35.1 | Nadir |
| 9 | 1030010003127500 | 0.54 | 21.3 | 174.7 | Nadir |
| 10 | 103001000352C200 | 0.54 | 23.5 | 30.7 | Nadir |
| 11 | 103001000307D800 | 0.57 | 25.4 | 178.4 | Nadir |
| 12 | 1030010003472200 | 0.58 | 27.4 | 27.7 | Off-Nadir |
| 13 | 1030010003315300 | 0.61 | 29.1 | 181 | Off-Nadir |
| 14 | 10300100036D5200 | 0.62 | 31 | 25.5 | Off-Nadir |
| 15 | 103001000392F600 | 0.65 | 32.5 | 182.8 | Off-Nadir |
| 16 | 1030010003697400 | 0.68 | 34 | 23.8 | Off-Nadir |
| 17 | 1030010003895500 | 0.74 | 37 | 22.6 | Off-Nadir |
| 18 | 1030010003832800 | 0.8 | 39.6 | 21.5 | Off-Nadir |
| 19 | 10300100035D1B00 | 0.87 | 42 | 20.7 | Very Off-Nadir |
| 20 | 1030010003CCD700 | 0.95 | 44.2 | 20 | Very Off-Nadir |
| 21 | 1030010003713C00 | 1.03 | 46.1 | 19.5 | Very Off-Nadir |
| 22 | 10300100033C5200 | 1.13 | 47.8 | 19 | Very Off-Nadir |
| 23 | 1030010003492700 | 1.23 | 49.3 | 18.5 | Very Off-Nadir |
| 24 | 10300100039E6200 | 1.36 | 50.9 | 18 | Very Off-Nadir |
| 25 | 1030010003BDDC00 | 1.48 | 52.2 | 17.7 | Very Off-Nadir |
| 26 | 1030010003CD4300 | 1.63 | 53.4 | 17.4 | Very Off-Nadir |
| 27 | 1030010003193D00 | 1.67 | 54 | 17.4 | Very Off-Nadir |

Citation Instructions

Weir, N., Lindenbaum, D., Bastidas, A., Etten, A.V., McPherson, S., Shermeyer, J., Vijay, V.K., & Tang, H. (2019). SpaceNet MVOI: A Multi-View Overhead Imagery Dataset. 2019 IEEE/CVF International Conference on Computer Vision (ICCV), 992-1001.

License

SpaceNet 3: Road Network Detection

The Problem

The commercialization of the geospatial industry has led to an explosive amount of data being collected to characterize our changing planet. One area for innovation is the application of computer vision and deep learning to extract information from satellite imagery at scale. CosmiQ Works, Radiant Solutions and NVIDIA have partnered to release the SpaceNet data set to the public to enable developers and data scientists to work with this data.

Today, map features such as roads, building footprints, and points of interest are primarily created through manual techniques. We believe that advancing automated feature extraction techniques will serve important downstream uses of map data including humanitarian and disaster response, as observed by the need to map road networks during the response to recent flooding in Bangladesh and Hurricane Maria in Puerto Rico. Furthermore, we think that solving this challenge is an important stepping stone to unleashing the power of advanced computer vision algorithms applied to a variety of remote sensing data applications in both the public and private sector.

The Data – Over 8,000 km of roads across the four SpaceNet Areas of Interest.

See the labeling guide and schema for details about the creation of the dataset.

| AOI | Area of Raster (sq km) | Road Centerlines (LineString) |
|---|---|---|
| AOI_2_Vegas | 216 | 3685 km |
| AOI_3_Paris | 1,030 | 425 km |
| AOI_4_Shanghai | 1,000 | 3537 km |
| AOI_5_Khartoum | 765 | 1030 km |

Road Type Breakdown (km of Road)

| Road Type | Vegas | Paris | Shanghai | Khartoum | Total |
|---|---|---|---|---|---|
| Motorway | 115 | 9 | 102 | 13 | 240 |
| Primary | 365 | 14 | 192 | 98 | 669 |
| Secondary | 417 | 58 | 501 | 66 | 1042 |
| Tertiary | 3 | 11 | 34 | 68 | 115 |
| Residential | 1646 | 232 | 939 | 485 | 3301 |
| Unclassified | 1138 | 95 | 1751 | 165 | 3149 |
| Cart track | 2 | 6 | 19 | 135 | 162 |
| Total | 3685 | 425 | 3537.9 | 1030 | 8677 |
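Road centerlines like those tallied above are usually converted into a routable graph before training or scoring. A minimal stdlib sketch, assuming each centerline is a sequence of coordinate tuples (real pipelines read the GeoJSON with a geospatial library):

```python
from collections import defaultdict
from math import hypot

def build_road_graph(centerlines):
    """Turn LineString-like coordinate sequences into an undirected weighted graph.

    Returns {node: {neighbor: segment_length}}, with nodes keyed by coordinate.
    """
    graph = defaultdict(dict)
    for line in centerlines:
        for (x1, y1), (x2, y2) in zip(line, line[1:]):
            length = hypot(x2 - x1, y2 - y1)
            graph[(x1, y1)][(x2, y2)] = length
            graph[(x2, y2)][(x1, y1)] = length
    return dict(graph)
```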

Catalog

The data is hosted on AWS as a Public Dataset. It is free to download, but an AWS account is required.
aws s3 ls s3://spacenet-dataset/spacenet/SN3_roads/

Sample Data

10 Samples from each AOI – Road Network Extraction Sample

To download processed 400mx400m tiles of AOI 2 (728.8 MB) with associated road centerlines for training do the following:

aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_sample.tar.gz .

Training Data

AOI 2 – Vegas – Road Network Extraction Training

To download processed 400mx400m tiles of AOI 2 (25 GB) with associated road centerlines for training do the following:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_2_Vegas.tar.gz .

aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_2_Vegas_geojson_roads_speed.tar.gz .

AOI 3 – Paris – Road Network Extraction Training

To download processed 400mx400m tiles of AOI 3 (5.6 GB) with associated road centerlines for training do the following:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_3_Paris.tar.gz .

aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_3_Paris_geojson_roads_speed.tar.gz .

AOI 4 – Shanghai – Road Network Extraction Training

To download processed 400mx400m tiles of AOI 4 (25 GB) with associated road centerlines for training do the following:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_4_Shanghai.tar.gz .

aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_4_Shanghai_geojson_roads_speed.tar.gz .

AOI 5 – Khartoum – Road Network Extraction Training

To download processed 400mx400m tiles of AOI 5 (25 GB) with associated road centerlines for training do the following:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_5_Khartoum.tar.gz .

aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_train_AOI_5_Khartoum_geojson_roads_speed.tar.gz .

Testing Data

AOI 2 – Vegas – Road Network Extraction Testing

To download processed 400mx400m tiles of AOI 2 (8.1 GB) for testing do:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_test_public_AOI_2_Vegas.tar.gz .

AOI 3 – Paris – Road Network Extraction Testing

To download processed 400mx400m tiles of AOI 3 (1.9 GB) for testing do:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_test_public_AOI_3_Paris.tar.gz .

AOI 4 – Shanghai – Road Network Extraction Testing

To download processed 400mx400m tiles of AOI 4 (8.1 GB) for testing do:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_test_public_AOI_4_Shanghai.tar.gz .

AOI 5 – Khartoum – Road Network Extraction Testing

To download processed 400mx400m tiles of AOI 5 (8.1 GB) for testing do:


aws s3 cp s3://spacenet-dataset/spacenet/SN3_roads/tarballs/SN3_roads_test_public_AOI_5_Khartoum.tar.gz .

Citation Instructions

If you are using data from SpaceNet in a paper, please use the following citation:

Van Etten, A., Lindenbaum, D., & Bacastow, T.M. (2018). SpaceNet: A Remote Sensing Dataset and Challenge Series. ArXiv, abs/1807.01232.

Metric

In the SpaceNet Roads Challenge, the metric for ranking entries is the APLS metric. This metric is based on graph theory and emphasizes the creation of a valid road network.

The current version of the metric is open sourced on GitHub: Average Path Length Similarity (APLS) metric. For more information, read the SpaceNet Road Detection and Routing Challenge Series, Part 1 and Part 2, written by Adam Van Etten at The DownlinQ.
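A drastically simplified, stdlib-only illustration of the idea behind APLS — comparing shortest-path lengths between corresponding node pairs in the ground-truth and proposal graphs — might look like this (the real open-sourced metric also handles node snapping, symmetric evaluation, and control-point injection):

```python
import heapq

def shortest_path_length(graph, src, dst):
    """Dijkstra over {node: {neighbor: length}}; returns None if unreachable."""
    dist = {src: 0.0}
    queue = [(0.0, src)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(queue, (nd, nbr))
    return None

def path_length_similarity(gt, proposal, node_pairs):
    """Mean of 1 - min(1, |L_gt - L_prop| / L_gt) over node pairs, APLS-style."""
    penalties = []
    for a, b in node_pairs:
        l_gt = shortest_path_length(gt, a, b)
        if l_gt is None:
            continue  # pair not routable in ground truth; skip it
        l_prop = shortest_path_length(proposal, a, b)
        penalty = 1.0 if l_prop is None else min(1.0, abs(l_gt - l_prop) / l_gt)
        penalties.append(penalty)
    return 1.0 - sum(penalties) / len(penalties)
```

A proposal that reproduces every ground-truth route scores 1.0; a missing connection incurs the maximum penalty for every pair it disconnects.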


License

SpaceNet 2: Building Detection v2

The Problem

The commercialization of the geospatial industry has led to an explosive amount of data being collected to characterize our changing planet. One area for innovation is the application of computer vision and deep learning to extract information from satellite imagery at scale. CosmiQ Works, Radiant Solutions and NVIDIA have partnered to release the SpaceNet data set to the public to enable developers and data scientists to work with this data.

Today, map features such as roads, building footprints, and points of interest are primarily created through manual techniques. We believe that advancing automated feature extraction techniques will serve important downstream uses of map data including humanitarian and disaster response, as observed by the need to map road networks during the response to recent flooding in Bangladesh and Hurricane Maria in Puerto Rico. Furthermore, we think that solving this challenge is an important stepping stone to unleashing the power of advanced computer vision algorithms applied to a variety of remote sensing data applications in both the public and private sector.

The Data – Over 300,000 building footprints across the four SpaceNet Areas of Interest (AOIs 2–5).

| AOI | Area of Raster (sq km) | Building Labels (Polygons) |
|---|---|---|
| AOI_2_Vegas | 216 | 151,367 |
| AOI_3_Paris | 1,030 | 23,816 |
| AOI_4_Shanghai | 1,000 | 92,015 |
| AOI_5_Khartoum | 765 | 35,503 |

Catalog

The data is hosted on AWS as a Public Dataset. It is free to download, but an AWS account is required.
aws s3 ls s3://spacenet-dataset/spacenet/SN2_buildings/

The Metric

In the SpaceNet Challenge, the metric for ranking entries is based on the Jaccard Index, also called the Intersection-over-Union (IoU). For more information, read the full article on The DownlinQ.
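For axis-aligned boxes the IoU computation reduces to a few lines; actual footprints are polygons (typically handled with a geometry library such as Shapely), but this sketch shows the quantity being thresholded:

```python
def rect_iou(a, b):
    """IoU of two axis-aligned rectangles given as (xmin, ymin, xmax, ymax)."""
    inter_w = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    inter_h = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = inter_w * inter_h
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

A proposed footprint counts as a true positive when its IoU with an unmatched ground-truth footprint meets the 0.5 threshold.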

Labeling Guidelines

For more information about the labeling guidelines, please view the SpaceNet Buildings Dataset Labeling Guide

Sample Data

10 Samples from each AOI – Building Footprint Extraction Samples

To download processed tiles of AOI 2 (728.8 MB) with associated building footprints for training do the following:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/train/tarballs/SN2_buildings_train_sample.tar.gz .

Training Data

AOI 2 – Vegas – Building Extraction Training

To download processed 200mx200m tiles of AOI 2 (23 GB) with associated building footprints for training do the following:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/SN2_buildings_train_AOI_2_Vegas.tar.gz .

AOI 3 – Paris – Building Extraction Training

To download processed 200mx200m tiles of AOI 3 (5.3 GB) with associated building footprints do the following:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/SN2_buildings_train_AOI_3_Paris.tar.gz .

AOI 4 – Shanghai – Building Extraction Training

To download processed 200mx200m tiles of AOI 4 (23.4 GB) with associated building footprints do the following:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/SN2_buildings_train_AOI_4_Shanghai.tar.gz .

AOI 5 – Khartoum – Building Extraction Training

To download processed 200mx200m tiles of AOI 5 (4.7 GB) with associated building footprints do the following:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/SN2_buildings_train_AOI_5_Khartoum.tar.gz .

Testing Data

AOI 2 – Vegas – Building Extraction Testing

To download processed 200mx200m tiles of AOI 2 (7.9 GB) for testing do:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/AOI_2_Vegas_Test_public.tar.gz .

AOI 3 – Paris – Building Extraction Testing

To download processed 200mx200m tiles of AOI 3 (1.9 GB) for testing do:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/AOI_3_Paris_Test_public.tar.gz .

AOI 4 – Shanghai – Building Extraction Testing

To download processed 200mx200m tiles of AOI 4 (7.7 GB) for testing do:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/AOI_4_Shanghai_Test_public.tar.gz .

AOI 5 – Khartoum – Building Extraction Testing

To download processed 200mx200m tiles of AOI 5 (1.6 GB) for testing do:

aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/AOI_5_Khartoum_Test_public.tar.gz .

Citation Instructions

If you are using data from SpaceNet in a paper, please use the following citation:

Van Etten, A., Lindenbaum, D., & Bacastow, T.M. (2018). SpaceNet: A Remote Sensing Dataset and Challenge Series. ArXiv, abs/1807.01232.

License

SpaceNet 1: Building Detection v1

The Problem

The commercialization of the geospatial industry has led to an explosive amount of data being collected to characterize our changing planet. One area for innovation is the application of computer vision and deep learning to extract information from satellite imagery at scale. CosmiQ Works, Radiant Solutions and NVIDIA have partnered to release the SpaceNet data set to the public to enable developers and data scientists to work with this data.

Today, map features such as roads, building footprints, and points of interest are primarily created through manual techniques. We believe that advancing automated feature extraction techniques will serve important downstream uses of map data including humanitarian and disaster response, as observed by the need to map road networks during the response to recent flooding in Bangladesh and Hurricane Maria in Puerto Rico. Furthermore, we think that solving this challenge is an important stepping stone to unleashing the power of advanced computer vision algorithms applied to a variety of remote sensing data applications in both the public and private sector.

The Data – Over 380,000 building footprints over 2,544 sq km of Rio de Janeiro (AOI 1).

| AOI | Area of Raster (sq km) | Building Labels (Polygons) |
|---|---|---|
| AOI_1_Rio | 2,544 | 382,534 |

Catalog

The data is hosted on AWS as a Public Dataset. It is free to download, but an AWS account is required.
aws s3 ls s3://spacenet-dataset/spacenet/SN1_buildings/

Training Data

AOI 1 – Rio – Building Extraction Training

To download processed 200mx200m tiles of AOI 1 (23 GB) with associated building footprints for training do the following:

aws s3 cp s3://spacenet-dataset/spacenet/SN1_buildings/tarballs/SN1_buildings_train_AOI_1_Rio_3band.tar.gz .

aws s3 cp s3://spacenet-dataset/spacenet/SN1_buildings/tarballs/SN1_buildings_train_AOI_1_Rio_8band.tar.gz .

aws s3 cp s3://spacenet-dataset/spacenet/SN1_buildings/tarballs/SN1_buildings_train_AOI_1_Rio_geojson_buildings.tar.gz .

Testing Data

AOI 1 – Rio – Building Extraction Testing

To download processed 200mx200m tiles of AOI 1 (7.9 GB) for testing do:

aws s3 cp s3://spacenet-dataset/spacenet/SN1_buildings/tarballs/SN1_buildings_test_AOI_1_Rio_3band.tar.gz .

aws s3 cp s3://spacenet-dataset/spacenet/SN1_buildings/tarballs/SN1_buildings_test_AOI_1_Rio_8band.tar.gz .

Citation Instructions

If you are using data from SpaceNet in a paper, please use the following citation:

Van Etten, A., Lindenbaum, D., & Bacastow, T.M. (2018). SpaceNet: A Remote Sensing Dataset and Challenge Series. ArXiv, abs/1807.01232.

License

The USSOCOM Urban 3D Challenge

High-resolution satellite imagery is changing our understanding of the world around us, as well as the way we as humans interact with our planet. However, raw images do little more than pique our interest unless we can superimpose a layer that actually identifies real objects. Reliable labeling of building footprints based on satellite imagery is one of the first and most challenging steps in producing accurate 3D models. While automated algorithms continue to improve, significant manual effort is still required to ensure geospatial accuracy and acceptable quality. Improved automation is required to enable more rapid response to major world events, including humanitarian assistance and disaster response. 3D height data can help improve automated building footprint detection performance, and capabilities for providing this data on a global scale are now emerging. In this challenge, contestants used 2D and 3D imagery generated from commercial satellite imagery along with state-of-the-art machine learning techniques to provide high-quality, automated building footprint detection performance over large areas.

This challenge published a large-scale dataset containing 2D orthorectified RGB imagery and 3D Digital Surface Models and Digital Terrain Models generated from commercial satellite imagery, covering over 360 sq km of terrain and containing roughly 157,000 annotated building footprints. All imagery products are provided at 50 cm ground sample distance (GSD). This unique 2D/3D large-scale dataset provides researchers an opportunity to utilize machine learning techniques to further improve state-of-the-art performance.
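One common way to use the paired rasters is to derive a normalized DSM (above-ground height) by subtracting the terrain model from the surface model. A small numpy sketch with made-up elevation values (real rasters would be read with a library such as rasterio):

```python
import numpy as np

# Toy 2x2 elevation grids standing in for real DSM/DTM rasters (meters).
dsm = np.array([[310.0, 312.5], [309.0, 311.0]])  # surface, includes structures
dtm = np.array([[309.0, 309.5], [309.0, 309.5]])  # bare-earth terrain
ndsm = dsm - dtm                  # per-pixel above-ground height
building_hint = ndsm > 2.0        # e.g. flag pixels taller than 2 m
```

Thresholding the nDSM this way is only a crude hint; the challenge solutions combined height with the RGB imagery in learned models.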

SpaceNet is hosting the Urban 3D Challenge dataset in the SpaceNet repository to ensure easy access to the data.

Related Websites

For more information about the Urban 3D Challenge competition, please visit the Urban 3D Challenge contest website.

Additional information about obtaining the open source algorithms or visualization tools can be found on the Urban 3D Challenge GitHub website.

More information can also be found at the JHU Applied Physics Laboratory Public Geospatial Data and Software website.

Reference Requirement

Please reference the following when publishing results using this data:

  1. H. Goldberg, M. Brown, and S. Wang, A Benchmark for Building Footprint Classification Using Orthorectified RGB Imagery and Digital Surface Models from Commercial Satellites, 46th Annual IEEE Applied Imagery Pattern Recognition Workshop, Washington, D.C., 2017.

BibTex:

@inproceedings{Urban3D2017,
  title={A Benchmark for Building Footprint Classification Using Orthorectified RGB Imagery and Digital Surface Models from Commercial Satellites},
  author={Goldberg, Hirsh and Brown, Myron and Wang, Sean},
  booktitle={Proceedings of IEEE Applied Imagery Pattern Recognition Workshop 2017},
  year={2017}
}

Alternatively, this dataset may be referenced as, “USSOCOM Urban 3D Challenge Benchmark Dataset”.

Please also cite SpaceNet as follows:

SpaceNet on Amazon Web Services (AWS). “Datasets.” The SpaceNet Catalog. Last modified January 4, 2018.
Accessed on [Insert Date]. https://spacenetchallenge.github.io/datasets/datasetHomePage.html.

Catalog

aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/00-Orig_Source_Data/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/01-Provisional_Train/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/02-Provisional_Test/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/03-Sequestered_Test/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/04-Unused_Data/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/AOI_polygons/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/Pretrained_Models/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/LICENSE.txt

Software code for this dataset:

Available open source solutions from contest winners:

For more information on how to install and run these solutions, see the Urban 3D Challenge Github README.

The pre-trained models associated with each of the winning solutions are provided alongside the data, as described in the ‘Catalog’ section of this README.

Additional open source software for formatting new datasets to be used by these algorithms can be found here.

Published Papers

  1. H. Goldberg, S. Wang, M. Brown, and G. Christie. Urban 3D Challenge: Building Footprint Detection Using Orthorectified Imagery and Digital Surface Models from Commercial Satellites. In Proceedings SPIE Defense and Commercial Sensing: Geospatial Informatics and Motion Imagery Analytics VIII, Orlando, Florida, USA, 2018.

Dependencies

The AWS Command Line Interface (CLI) must be installed with an active AWS account. Configure the AWS CLI using ‘aws configure’.

Questions and Comments

For questions and comments about the dataset or the open source software, please contact pubgeo(at)jhuapl(dot)edu.

License

Urban 3D Challenge Dataset License

The IARPA Functional Map of the World (fMoW) Challenge

Intelligence analysts, policy makers, and first responders around the world rely on geospatial land use data to inform crucial decisions about global defense and humanitarian activities. Historically, analysts have manually identified and classified geospatial information by comparing and analyzing satellite images, but that process is time-consuming and insufficient to support disaster response. The fMoW Challenge sought to foster breakthroughs in the automated analysis of overhead imagery by harnessing the collective power of the global data science and machine learning communities, empowering stakeholders to bolster their capabilities through computer vision automation. The challenge published one of the largest publicly available satellite-image datasets to date, with more than one million points of interest from around the world. The dataset also contains other elements such as temporal views, multispectral imagery, and satellite-specific metadata that researchers can exploit to build novel algorithms capable of classifying facility, building, and land use types.

SpaceNet is hosting the fMoW dataset in the SpaceNet repository to ensure easy access to the data.

Related Websites

For more information about the IARPA competition, please visit the fMoW Challenge website.

Additional information about obtaining the data, baseline algorithm, or visualization tools can be found on the fMoW GitHub website.

Reference Requirement

Please reference the following when publishing results using this data:

  1. G. Christie, N. Fendley, J. Wilson, and R. Mukherjee. Functional Map of the World. In CVPR, 2018.

BibTeX:

@inproceedings{fmow2018,
  title={Functional Map of the World},
  author={Christie, Gordon and Fendley, Neil and Wilson, James and Mukherjee, Ryan},
  booktitle={CVPR},
  year={2018}
}

Alternatively, this dataset may be referenced as, “IARPA’s Functional Map of the World Dataset.”

Please also cite SpaceNet as follows:

SpaceNet on Amazon Web Services (AWS). “Datasets.” The SpaceNet Catalog. Last modified January 4, 2018.
Accessed on [Insert Date]. https://spacenetchallenge.github.io/datasets/datasetHomePage.html.

Catalog

aws s3 ls s3://spacenet-dataset/Hosted-Datasets/fmow/fmow-full
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/fmow/fmow-rgb

Software code for this dataset:

Available solutions from contest winners:

Published Papers

Published results:

  1. R. Minetto, M.P. Segundo, and S. Sarkar. Hydra: An Ensemble of Convolutional Neural Networks for Geospatial Land Classification. arXiv preprint arXiv:1802.03518, 2018.
  2. G. Christie, N. Fendley, J. Wilson, and R. Mukherjee. Functional Map of the World. In CVPR, 2018. arXiv preprint arXiv:1711.07846, 2017.
  3. M. Pritt and G. Chern. Satellite Image Classification with Deep Learning. 46th Annual IEEE Applied Imagery Pattern Recognition Workshop, Washington, D.C., 2017.

Dependencies

The AWS Command Line Interface (CLI) must be installed with an active AWS account. Configure the AWS CLI using ‘aws configure’.

License

fMoW License