Some years ago I read the Mars trilogy by Kim Stanley Robinson. I remember having a very good time reading the books. Fast-forward to a few weeks ago, when I read a post by Kin Lane, API guru, about a set of APIs published by NASA to open access to (some of, obviously) their data. Finally, last week I went to the movies to watch The Martian, which I greatly enjoyed and which gave me the final push to build something related to, or inspired by, Mars.
The NASA open data effort provides a well-documented set of APIs:
- Astronomy Picture of the Day, to retrieve stunning space images, with optional conceptual explanation by experts. Pretty cool, but they already mention building a bot as a use case, so nothing innovative in here for the moment.
- Asteroids - NeoWs (Near Earth Object Web Service), to explore how close asteroids come to Earth. This one looks more promising, given there is an `is_potentially_hazardous_asteroid` field in the responses!
- Earth, to grab images of the Earth's surface taken by NASA satellites. This API provides two endpoints:
  - imagery: returns images from NASA satellites for the given lat/long position, as close as possible to the given date.
  - assets: lists the images and dates available on the imagery endpoint for the given lat/long position.
- Mars Rover Photos, to retrieve images taken by the different rovers that have been exploring Mars.
- Patents, to access patents by NASA with the ability to search by keyword too.
- Sounds, to obtain sounds retrieved from space.
The NASA API requires an API key, and one can be obtained for free just by registering a developer account. However, they provide a demo key, `DEMO_KEY`, that can be used to explore the API, and even later on if your intended use of their API is not heavy. Responses from the API are plain JSON, which is very convenient.
For the moment I’ve focused on the photos from the Mars rovers, as I wanted to develop something with more visual results. There are three rovers available: Curiosity, Opportunity, and Spirit. Each of them is equipped with a set of cameras that fulfill different exploration and research functions. Moreover, each rover has been taking pictures on Mars for a given period of time. Summing up, the three parameters I consider when making an API call to retrieve rover images are:
- the sol, representing the Martian “day” since the rover's landing. Each rover attaches to its JSON definition the maximum sol value, so I can just select a random value from 0 to the maximum sol.
- the rover name of the selected rover, so one of Curiosity, Opportunity or Spirit.
- the camera of the rover to retrieve pictures from. Depending on availability for each rover, it can be one of:
  - FHAZ: Front Hazard Avoidance Camera
  - RHAZ: Rear Hazard Avoidance Camera
  - MAST: Mast Camera
  - CHEMCAM: Chemistry and Camera Complex
  - MAHLI: Mars Hand Lens Imager
  - MARDI: Mars Descent Imager
  - NAVCAM: Navigation Camera
  - PANCAM: Panoramic Camera
  - MINITES: Miniature Thermal Emission Spectrometer (Mini-TES)
Having these photographic resources, the next step was to decide how to use them. One idea I had in mind was to upload pictures as if they were memories of the rover, similar to Polaroid pictures, possibly attaching comments or jokes from the rover on Mars. Another idea was, given that the retrieved images are greyscale, to add more color and make the pictures appear more alien. So I decided to build two twitterbots instead of just one.
The most relevant Python modules I’ve used are:
- requests: a simpler, more user-friendly module to make HTTP requests, useful for consuming external APIs.
- Pillow: a fork of the original Python Image Library (PIL), to process the images obtained from the NASA Martian Rovers Photos API.
To be able to store the NASA API key, I’ve modified (commit 1 and commit 2) my credentials_pickler script so it can store additional key-value pairs for a twitterbot account. Then a simple request to the API looks like this:
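A minimal sketch of such a request with the requests module, against the Mars Rover Photos endpoint documented by NASA (the helper names here are mine, and `DEMO_KEY` is the public demo key):

```python
import requests

# Mars Rover Photos endpoint, parameterized by rover name
ROVER_PHOTOS_URL = "https://api.nasa.gov/mars-photos/api/v1/rovers/{rover}/photos"

def build_request(rover, sol, camera, api_key="DEMO_KEY"):
    """Return the (url, params) pair for a Mars Rover Photos query."""
    url = ROVER_PHOTOS_URL.format(rover=rover.lower())
    params = {"sol": sol, "camera": camera, "api_key": api_key}
    return url, params

def get_photos(rover, sol, camera, api_key="DEMO_KEY"):
    """Fetch the list of photos for a given rover, sol and camera."""
    url, params = build_request(rover, sol, camera, api_key)
    response = requests.get(url, params=params)
    response.raise_for_status()
    # Responses are plain JSON; each entry in "photos" carries an
    # "img_src" URL pointing at the actual image file.
    return response.json()["photos"]
```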
To obtain an original image, as explained above, I randomly select one of the available rovers, one of the available cameras for that rover (filtered to a subset of cameras that provide somewhat cool images), and a random sol from 0 to the maximum sol available for the selected rover:
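In sketch form, that selection could look like the following (the per-rover camera subsets below are illustrative, not the exact filter I settled on):

```python
import random

# Cameras per rover, restricted to a subset that tends to give
# nicer-looking images (an illustrative choice; adjust to taste)
ROVER_CAMERAS = {
    "curiosity": ["FHAZ", "RHAZ", "MAST", "NAVCAM"],
    "opportunity": ["FHAZ", "RHAZ", "NAVCAM", "PANCAM"],
    "spirit": ["FHAZ", "RHAZ", "NAVCAM", "PANCAM"],
}

def random_photo_query(max_sols):
    """Pick a random rover, one of its cameras and a sol in [0, max_sol].

    max_sols maps each rover name to its maximum sol, as reported in the
    rover's JSON definition returned by the API.
    """
    rover = random.choice(list(ROVER_CAMERAS))
    camera = random.choice(ROVER_CAMERAS[rover])
    sol = random.randint(0, max_sols[rover])
    return rover, camera, sol
```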
For the Polaroid-like picture, Pillow provides basic image operations like ImageOps.expand and Image.crop for adding borders and cropping, respectively. So I used a series of operations to obtain the typical white border of a Polaroid picture:
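One way to sketch that framing, assuming a thin border all around plus the wider band at the bottom (the exact border widths here are placeholder values):

```python
from PIL import Image, ImageOps

def polaroidify(image):
    """Add a Polaroid-style white frame to an RGB image:
    a thin border all around, plus the characteristic
    thicker white band at the bottom."""
    # Thin white border around the whole picture
    framed = ImageOps.expand(image, border=20, fill="white")
    # Extend the canvas downwards for the wide bottom band
    w, h = framed.size
    polaroid = Image.new("RGB", (w, h + 80), "white")
    polaroid.paste(framed, (0, 0))
    return polaroid
```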
For some reason (I have an open question on StackOverflow, if someone knows how to help), most of the time the big white border is painted black, so I had to implement a mechanism to detect whether the border is white or not, and either keep working on the modified image or just keep the original image, respectively:
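A simple heuristic along those lines, assuming it is enough to sample a few pixels along the top edge and check that they are bright (the threshold and sampling density are assumptions of this sketch):

```python
from PIL import Image

def border_is_white(image, threshold=200, samples=20):
    """Heuristic check that the added border actually came out white.

    Samples pixels along the top edge and reports whether their average
    channel value is bright enough; if not, the caller can fall back to
    the unmodified original image.
    """
    width = image.size[0]
    step = max(1, width // samples)
    pixels = [image.getpixel((x, 1)) for x in range(0, width, step)]
    # Average the RGB channels of every sampled pixel
    avg = sum(sum(p[:3]) / 3 for p in pixels) / len(pixels)
    return avg >= threshold
```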
Finally, to complete the twitterbot, it tweets which rover took the picture and on what date, plus a closing sentence, trying to make a fun remark or engage with different hashtags or Twitter users:
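The text composition can be sketched like this (the closing sentences below are made-up examples in the spirit of the bot, not the actual list):

```python
import random

# Illustrative closing sentences; the real bot draws from its own list
CLOSINGS = [
    "Is this water?",
    "Miss you, Earth. #nostalgia",
    "Another quiet day at the office.",
]

def compose_tweet(rover, earth_date):
    """Build the tweet text: rover name, Earth date of the photo,
    and a random closing sentence."""
    closing = random.choice(CLOSINGS)
    return "{} on {}. {}".format(rover.capitalize(), earth_date, closing)
```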
All in all, this twitterbot produces tweets like this:
> Opportunity on 2004-03-23. Is this water? pic.twitter.com/yZ5jwRgdvp
> Nostalgic Rover (@nostalgic_rover), November 13, 2015
You can check the full profile for the Nostalgic_Rover on Twitter.
For the colorized picture, Pillow provides a handy `ImageOps.colorize` method, which transforms a greyscale image by applying a color gradient between the two colors provided as parameters. So, trying to create more contrasting images, I chose the following approach, using the HSL color representation:
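A sketch of that approach, relying on Pillow accepting `hsl(h, s%, l%)` color strings (the function name is mine):

```python
import random
from PIL import Image, ImageOps

def trippy_colorize(image):
    """Colorize a greyscale image with two opposed random hues.

    The hues sit 180 degrees apart on the color wheel; saturation stays
    between 25% and 75%; color1 (mapped to the dark end of the gradient)
    gets a lightness in the lower half and color2 (mapped to the light
    end) a lightness in the upper half.
    """
    hue = random.randint(0, 360)
    saturation = random.randint(25, 75)
    color1 = "hsl({}, {}%, {}%)".format(hue, saturation,
                                        random.randint(0, 50))
    color2 = "hsl({}, {}%, {}%)".format((hue + 180) % 360, saturation,
                                        random.randint(50, 100))
    grey = image.convert("L")  # colorize expects a single-band image
    return ImageOps.colorize(grey, black=color1, white=color2)
```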
In this code, `color1` will be applied to the darkest parts of the image, and `color2` to the lightest parts. With `hue = random.randint(0, 360)` and `(hue + 180) % 360` I get a random hue for `color1` and the opposed hue for `color2`. Then I pick a random saturation between 25% and 75% (to avoid too-neutral colors, which could end up too similar, or too-saturated colors, which could reduce the relative contrast between them). Finally, for `color1` I pick a random lightness in the “darkest” half of the spectrum, and for `color2` a random lightness in the “lightest” half.
The second twitterbot generates tweets like this:
You can check the full profile for the Tripping_Rover on Twitter.
As usual, the source code of these twitterbots is on GitHub.
P.S. During the writing of this post, I saw a couple of tweets with interesting material related to Mars, one fiction, one reality: