JPL Mars Yard Database
Description
The JPL Mars Yard Database is built for understanding terrain types from multiple sensors, such as RGB and IR cameras. It contains two datasets: (1) Semantic Dataset 1, for understanding terrain types from RGB and IR imagery, and (2) Virtual Sensor Dataset 1, for deriving RGB-to-IR mapping models. Both datasets were collected at the JPL Mars Yard on Nov. 17th, 2017, with an RGB camera (FLIR Grasshopper 5M) and a thermal camera (FLIR AX65). Images were captured every hour from 10 am until 5 pm (sunset), for a total of 8 capture sessions. Each session contains about 52 images, taken from varying camera positions. The datasets include challenging images with shadows, reflections due to the Sun, and direct sunlight into the cameras.
Files
MD5 | Size |
---|---|
md5:7369cc489f04ab4de995becad50ff01f | 5.4 kB |
md5:c399ceeb88cc9760e803ad4715efafa5 | 2.7 kB |
md5:04e64a574ab1194e18e2867debf38a3e | 2.7 MB |
md5:f0e2bddcf186a1a5afba58fbf430d7ef | 130.0 MB |
md5:0bab7ba6f59a2422f74a78c189890018 | 9.7 kB |
md5:12fd778d164ec7627208ed105c29075b | 978.6 MB |
md5:3777d1c69f740e074886cd6ad68c7f18 | 2.7 kB |
Series information
JPL MARS YARD DATABASE, SEMANTIC DATASET 1

Semantic Dataset 1 contains RGB, IR, and annotation images. We manually annotated all images into 7 categories: unlabeled, sand, soil, rocks, bedrock, rocky terrain, and ballast. In the annotation images, pink, brown, green, light blue, purple, and red correspond to sand, soil, rocks, bedrock, rocky terrain, and ballast, respectively. The dataset was first introduced in the MIPR 2019 paper, "TU-Net and TDeepLab: Deep Learning-based Terrain Classification Robust to Illumination Changes, Combining Visible and Thermal Imagery" [1]. The lists of training, validation, and test data are provided in the section below. In [1], we evaluated the proposed approach under 3 different settings: (Exp. 1) train, evaluate, and test with the 17:00 data, which has no influence of the Sun; (Exp. 2) train, evaluate, and test with all data from 10:00 to 17:00; (Exp. 3) train and evaluate with data from 14:00 to 17:00, with two tests: (i) test with data from 10:00 to 13:00 and (ii) test with data from 14:00 to 17:00.

JPL MARS YARD DATABASE, VIRTUAL SENSOR DATASET 1

Virtual Sensor Dataset 1 contains RGB and IR images. The lists of training, validation, and test data are provided below. The dataset was first introduced in the PBVS 2019 paper, "MU-Net: Deep Learning-based Thermal IR Image Estimation from RGB Image" [2].

CITATION

If you make use of the JPL Mars Yard Database, Semantic Dataset 1 in any form, please cite the following paper:

[1] Y. Iwashita, K. Nakashima, A. Stoica, R. Kurazume, "TU-Net and TDeepLab: Deep Learning-based Terrain Classification Robust to Illumination Changes, Combining Visible and Thermal Imagery", IEEE International Conference on Multimedia Information Processing and Retrieval (MIPR 2019), San Jose, California, USA, Mar. 28-30, 2019.

If you make use of the JPL Mars Yard Database, Virtual Sensor Dataset 1 in any form, please cite the following paper:

[2] Y. Iwashita, K. Nakashima, S. Rafol, A. Stoica, R. Kurazume, "MU-Net: Deep Learning-based Thermal IR Image Estimation from RGB Image", IEEE Workshop on Perception Beyond the Visible Spectrum (PBVS), Long Beach, CA, USA, Jun. 16, 2019.

TIPS ON IMPLEMENTATION

Each pixel in the annotation images stores a numeric ID that corresponds to a terrain type:

0: __unlabeled__
1: sand
2: soil
3: ballast
4: rock
5: bedrock
6: rocky terrain

The ID is packed across the R, G, and B channels (little-endian) and can be decoded in Python as follows:

```python
import numpy as np
from PIL import Image

# Recover the packed terrain-type IDs: id = R | (G << 8) | (B << 16)
encoded = np.array(Image.open(path_to_label))
label = np.bitwise_or(
    np.bitwise_or(
        encoded[:, :, 0].astype(np.uint32),
        encoded[:, :, 1].astype(np.uint32) << 8),
    encoded[:, :, 2].astype(np.uint32) << 16)
```
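Once the annotation image is decoded into a 2-D `label` array of terrain-type IDs, a per-class pixel histogram is a quick sanity check (e.g., inspecting class imbalance before training). The sketch below is illustrative, not part of the dataset distribution; `CLASS_NAMES` follows the numeric mapping above, and the tiny synthetic `label` array stands in for a decoded annotation image:

```python
import numpy as np

# Numeric mapping used in the annotation images (IDs 0-6).
CLASS_NAMES = ["unlabeled", "sand", "soil", "ballast",
               "rock", "bedrock", "rocky terrain"]

def class_histogram(label):
    """Count pixels per terrain-type ID in a decoded label map."""
    counts = np.bincount(label.ravel(), minlength=len(CLASS_NAMES))
    return dict(zip(CLASS_NAMES, counts.tolist()))

# Example on a tiny synthetic label map (2 x 3 pixels).
label = np.array([[0, 1, 1],
                  [4, 4, 4]], dtype=np.uint32)
hist = class_histogram(label)
# hist["sand"] == 2, hist["rock"] == 3
```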
Additional details
- CALTECHDATA_ID
- 1332