Hosted Datasets

CORE3D

Creation of Operationally Realistic 3D Environment (CORE3D)

IARPA has publicly released DigitalGlobe satellite imagery for the Creation of Operationally Realistic 3D Environment (CORE3D) program to enable performer teams to crowdsource manual labeling efforts and to promote public research that aligns well with the CORE3D program’s objectives.

SpaceNet is hosting the CORE3D public dataset in the SpaceNet repository to ensure easy access to the data.

Reference Requirement

Please reference the following when reporting results using any of this data:

  • M. Brown, H. Goldberg, K. Foster, A. Leichtman, S. Wang, S. Hagstrom, M. Bosch, and S. Almes, Large-Scale Public Lidar and Satellite Image Data Set for Urban Semantic Labeling, in Proc. SPIE Laser Radar Technology and Applications XXII, 2018.
  • Commercial satellite imagery in the CORE3D public dataset was provided courtesy of DigitalGlobe.
  • Dataset was created for the IARPA CORE3D program: https://www.iarpa.gov/index.php/research-programs/core3d.
  • SpaceNet on Amazon Web Services (AWS). Datasets. The SpaceNet Catalog. Last modified October 15, 2018. Accessed on [Insert Date]. https://spacenetchallenge.github.io/datasets/datasetHomePage.html.

Catalog

```commandline
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/CORE3D-Public-Data/
```
  • One WorldView-2 PAN and MSI image for each of Jacksonville, FL; Tampa, FL; Richmond, VA; and Omaha, NE
  • Tiled WorldView-2 data sets including ground truth building labels for comparison with the USSOCOM Urban 3D Challenge
  • 26 WorldView-3 PAN and MSI images over Jacksonville, FL
  • 43 WorldView-3 PAN and MSI images over Omaha, NE
  • 35 WorldView-3 PAN and MSI images over UCSD, CA
  • 44 WorldView-2 PAN and MSI images over UCSD, CA
  • See the referenced SPIE paper for information about where to find corresponding lidar and other complementary data sets for each location
  • For images over San Fernando, Argentina for the IARPA Multi-View Stereo 3D Mapping Challenge, see https://spacenet.ai/iarpa-multi-view-stereo-3d-mapping/
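
To copy the CORE3D data locally rather than just list it, the catalog prefix can be downloaded recursively. A minimal sketch, assuming the AWS CLI is configured (the local destination directory is an illustrative choice):

```commandline
# Recursively download the CORE3D public data (a large transfer).
# ./CORE3D-Public-Data/ is an illustrative local destination.
aws s3 cp s3://spacenet-dataset/Hosted-Datasets/CORE3D-Public-Data/ ./CORE3D-Public-Data/ --recursive
```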

Dependencies

The AWS Command Line Interface (CLI) must be installed, and an active AWS account is required. Configure the AWS CLI by running `aws configure`.
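
As a minimal sketch (pip is shown as one installation option; methods vary by platform):

```commandline
# One way to install the AWS CLI; see the AWS documentation for alternatives
pip install awscli

# Prompts interactively for an access key ID, secret access key, default region, and output format
aws configure
```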

Questions and Comments

For questions and comments about the dataset or the open source software, please contact pubgeo(at)jhuapl(dot)edu.

The USSOCOM Urban 3D Challenge

High-resolution satellite imagery is changing our understanding of the world around us, as well as the way we as humans interact with our planet. However, raw images do little more than pique our interest unless we can superimpose a layer that actually identifies real objects. Reliable labeling of building footprints based on satellite imagery is one of the first and most challenging steps in producing accurate 3D models. While automated algorithms continue to improve, significant manual effort is still required to ensure geospatial accuracy and acceptable quality. Improved automation is required to enable more rapid responses to major world events such as humanitarian assistance and disaster response efforts. 3D height data can help improve automated building footprint detection performance, and capabilities for providing this data on a global scale are now emerging. In this challenge, contestants used 2D and 3D imagery generated from commercial satellite imagery along with state-of-the-art machine learning techniques to provide high-quality, automated building footprint detection over large areas.

This challenge published a large-scale dataset containing 2D orthorectified RGB imagery and 3D Digital Surface Models and Digital Terrain Models generated from commercial satellite imagery, covering over 360 km² of terrain and containing roughly 157,000 annotated building footprints. All imagery products are provided at 50 cm ground sample distance (GSD). This unique 2D/3D large-scale dataset provides researchers an opportunity to utilize machine learning techniques to further improve on state-of-the-art performance.

SpaceNet is hosting the Urban 3D Challenge dataset in the SpaceNet repository to ensure easy access to the data.

Related Websites

For more information about the Urban 3D Challenge competition, please visit the Urban 3D Challenge contest website.

Additional information about obtaining the open source algorithms or visualization tools can be found on the Urban 3D Challenge GitHub website.

More information can also be found at the JHU Applied Physics Laboratory Public Geospatial Data and Software website.

Reference Requirement

Please reference the following when publishing results using this data:

  1. H. Goldberg, M. Brown, and S. Wang, A Benchmark for Building Footprint Classification Using Orthorectified RGB Imagery and Digital Surface Models from Commercial Satellites, 46th Annual IEEE Applied Imagery Pattern Recognition Workshop, Washington, D.C., 2017.

BibTeX:

@inproceedings{Urban3D2017,
  title={A Benchmark for Building Footprint Classification Using Orthorectified RGB Imagery and Digital Surface Models from Commercial Satellites},
  author={Goldberg, Hirsh and Brown, Myron and Wang, Sean},
  booktitle={Proceedings of IEEE Applied Imagery Pattern Recognition Workshop 2017},
  year={2017}
}

Alternatively, this dataset may be referenced as, “USSOCOM Urban 3D Challenge Benchmark Dataset”.

Please also cite SpaceNet as follows:

SpaceNet on Amazon Web Services (AWS). Datasets. The SpaceNet Catalog. Last modified January 4, 2018. Accessed on [Insert Date]. https://spacenetchallenge.github.io/datasets/datasetHomePage.html.

Catalog

```commandline
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/00-Orig_Source_Data/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/01-Provisional_Train/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/02-Provisional_Test/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/03-Sequestered_Test/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/04-Unused_Data/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/AOI_polygons/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/Pretrained_Models/
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/LICENSE.txt
```
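
To mirror one of these prefixes locally, `aws s3 sync` can be used. A minimal sketch, with an illustrative local path:

```commandline
# Sync the provisional training split to a local directory (illustrative destination path)
aws s3 sync s3://spacenet-dataset/Hosted-Datasets/Urban_3D_Challenge/01-Provisional_Train/ ./Urban_3D_Train/
```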

Software code for this dataset:

Available open source solutions from contest winners:

For more information on how to install and run these solutions, see the Urban 3D Challenge GitHub README.

The pre-trained models associated with each of the winning solutions are provided alongside the data, as described in the ‘Catalog’ section of this README.

Additional open source software for formatting new datasets to be used by these algorithms can be found here.

Published Papers

  1. H. Goldberg, S. Wang, M. Brown, and G. Christie. Urban 3D Challenge: Building Footprint Detection Using Orthorectified Imagery and Digital Surface Models from Commercial Satellites. In Proceedings SPIE Defense and Commercial Sensing: Geospatial Informatics and Motion Imagery Analytics VIII, Orlando, Florida, USA, 2018.

Dependencies

The AWS Command Line Interface (CLI) must be installed, and an active AWS account is required. Configure the AWS CLI by running `aws configure`.

Questions and Comments

For questions and comments about the dataset or the open source software, please contact pubgeo(at)jhuapl(dot)edu.

License

Urban 3D Challenge Dataset License

IARPA Functional Map of the World (fMoW)

The IARPA Functional Map of the World (fMoW) Challenge

Intelligence analysts, policy makers, and first responders around the world rely on geospatial land use data to inform crucial decisions about global defense and humanitarian activities. Historically, analysts have manually identified and classified geospatial information by comparing and analyzing satellite images, but that process is time-consuming and insufficient to support disaster response. The fMoW Challenge sought to foster breakthroughs in the automated analysis of overhead imagery by harnessing the collective power of the global data science and machine learning communities, empowering stakeholders to bolster their capabilities through computer vision automation. The challenge published one of the largest publicly available satellite-image datasets to date, with more than one million points of interest from around the world. The dataset also contains other elements, such as temporal views, multispectral imagery, and satellite-specific metadata, that researchers can exploit to build novel algorithms capable of classifying facility, building, and land use types.

SpaceNet is hosting the fMoW dataset in the SpaceNet repository to ensure easy access to the data.

Related Websites

For more information about the IARPA competition, please visit the fMoW Challenge website.

Additional information about obtaining the data, baseline algorithm, or visualization tools can be found on the fMoW GitHub website.

Reference Requirement

Please reference the following when publishing results using this data:

  1. G. Christie, N. Fendley, J. Wilson, and R. Mukherjee. Functional Map of the World. In CVPR, 2018.

BibTeX:

@inproceedings{fmow2018,
  title={Functional Map of the World},
  author={Christie, Gordon and Fendley, Neil and Wilson, James and Mukherjee, Ryan},
  booktitle={CVPR},
  year={2018}
}

Alternatively, this dataset may be referenced as, “IARPA’s Functional Map of the World Dataset.”

Please also cite SpaceNet as follows:

SpaceNet on Amazon Web Services (AWS). Datasets. The SpaceNet Catalog. Last modified January 4, 2018. Accessed on [Insert Date]. https://spacenetchallenge.github.io/datasets/datasetHomePage.html.

Catalog

```commandline
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/fmow/fmow-full
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/fmow/fmow-rgb
```

Software code for this dataset:

Available solutions from contest winners:

Published Papers

Published results:

  1. R. Minetto, M. P. Segundo, and S. Sarkar. Hydra: An Ensemble of Convolutional Neural Networks for Geospatial Land Classification. arXiv preprint arXiv:1802.03518, 2018.
  2. G. Christie, N. Fendley, J. Wilson, and R. Mukherjee. Functional Map of the World. In CVPR, 2018. arXiv preprint arXiv:1711.07846, 2017.
  3. M. Pritt and G. Chern. Satellite Image Classification with Deep Learning. 46th Annual IEEE Applied Imagery Pattern Recognition Workshop, Washington, D.C., 2017.

Dependencies

The AWS Command Line Interface (CLI) must be installed, and an active AWS account is required. Configure the AWS CLI by running `aws configure`.

License

fMoW License

IARPA Multi-View Stereo 3D Mapping

IARPA Multi-View Stereo 3D Mapping Challenge

The availability of public multiple view stereo (MVS) benchmark datasets has been instrumental in enabling research to advance the state of the art in the field and to apply and customize methods to real-world problems. In this work, we provide a public benchmark data set for multiple view stereo applied to 3D outdoor scene mapping using commercial satellite imagery.

This data set includes DigitalGlobe WorldView-3 panchromatic and multispectral images of a 100 square kilometer area near San Fernando, Argentina. We also provide 20 cm airborne lidar ground truth data for a 20 square kilometer subset of this area and performance analysis software to assess accuracy and completeness metrics. Commercial satellite imagery is provided courtesy of DigitalGlobe, and ground truth lidar is provided courtesy of IARPA.

This data supported the IARPA Multi-View Stereo 3D Mapping Challenge and is now publicly available with no restrictions to support continued research. JHU/APL does not plan to maintain an online benchmark leaderboard, but we welcome your feedback, would love to hear about what you’re doing with the data, and will include your published results on this page.

SpaceNet is hosting the Multi-View Stereo 3D Mapping dataset in the SpaceNet repository to ensure easy access to the data.

Competition Websites

For more information about the IARPA competition, please visit the Multi-View Stereo 3D Mapping Challenge website.

For more information about the MVS benchmark, please visit the JHU/APL competition webpage.

Catalog

```commandline
aws s3 ls s3://spacenet-dataset/Hosted-Datasets/mvs_dataset
```

The catalog contains the following packages:

  • Updated metric analysis software with examples from contest winners
  • Challenge data package with instructions, cropped TIFF images, ground truth, image cropping software, and metric scoring software (1.2 GB)
  • JHU/APL example MVS solution (451 MB)
  • NITF panchromatic, multispectral, and short-wave infrared DigitalGlobe WorldView-3 satellite images (72.1 GB)
  • LAZ lidar point clouds with SBET (2.2 GB)
  • Spectral image calibration software (84 MB)

Software code for this dataset:

Available solutions from contest winners:

Published Papers

Published results:

  1. G. Facciolo, C. de Franchis, E. Meinhardt-Llopis, “Automatic 3D Reconstruction from Multi-Date Satellite Images,” IEEE International Conference on Computer Vision and Pattern Recognition, EARTHVISION Workshop, 2017.
  2. R. Qin, “Automated 3D recovery from very high resolution multi-view images,” ASPRS 2017 Annual Conference, 2017.

Dependencies

The AWS Command Line Interface (CLI) must be installed, and an active AWS account is required. Configure the AWS CLI by running `aws configure`.