The Process
Background
Astrum is an artistic project from Inner Space in collaboration with Astrobin, one of the biggest online astrophotography communities.
We created 110 short audio-visuals from photos of the Messier Catalog, in a half-manual, half-generative way. An example:
We are also running an NFT sale to support the community and the photographers. For more details on the project, check out the home page.
We wrote this article to provide a technical description of our process. We hope you enjoy it!
The Idea
As you have seen on the main page of the project, each piece of the collection is a rotating astrophoto framed inside what should look like a camera objective.
Meanwhile, an ambient sound plays in the background, composed of an atmospheric pad and a "blip" sequence generated from the star field.
In this section we want to briefly describe the process for the sound generation. More technical details about the whole pipeline can be found directly in the GitHub repository.
The main idea was to have a sound component generated from the specific image itself. We then decided to see if we could "play" the star field or, to put it differently, use the pattern of stars and planets to create a sound.
After a few small prototypes, we realized that the most "spacey" effect would come from simple blips, acting as deep-space radio signals.
The Blips
In order to have a pattern to start from, the photo of the astral object needs to be turned into playable "numbers". We used Python and OpenCV to achieve this.
Let's see the flow with an example, M6:

Ideally, we would have a low-frequency beeping sequence where there are fewer stars and a more intense one where the field gets denser, all of this fitting within the 20-second length of the video. In order to do this, we needed to identify the positions of the stars in the image and then distribute them into their respective slots of an imaginary sequencer, playing the stars from left to right.
The first step is to declutter the star field from the noise and "bring forward" the brightest parts of the image. This is done in two steps: first the image is converted to grayscale, then a binary threshold is applied:

import cv2

# convert to grayscale, then keep only the pixels brighter than the threshold
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
ret, thresh_img = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
Result:

This version of the image can now be used with the OpenCV findContours function to identify the boundaries of the stars.

From this, it is now possible to know the center position and the area of each identified star.
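A minimal sketch of this step (assuming OpenCV 4's findContours signature, with illustrative variable names) looks like this:

# find the outlines of the bright blobs in the thresholded image
contours, hierarchy = cv2.findContours(thresh_img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

stars = []
for contour in contours:
    moments = cv2.moments(contour)
    if moments["m00"] == 0:
        continue  # skip degenerate contours with no area
    cx = moments["m10"] / moments["m00"]  # x coordinate of the star center
    cy = moments["m01"] / moments["m00"]  # y coordinate of the star center
    stars.append((cx, cy, cv2.contourArea(contour)))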
We have now turned an image into a list of values. The step from this to generating the sound sequence is relatively simple.
We used a pre-synthesized blip as a starting sound and pitched it up or down to create lower- and higher-pitched blips (we used the very nice pippi library for this). This both makes the blips of the bigger stars "deeper" and creates some variation between images.
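As a rough sketch of how the pieces fit together (not our exact script: it assumes pippi's basic dsp.read, speed and dub helpers, and the scaling factor for the pitch is only illustrative), the sequencing goes roughly like this:

from pippi import dsp

LENGTH = 20  # each video is 20 seconds long

height, width = gray.shape           # image size, from the grayscale image above
blip = dsp.read("blip.wav")          # the pre-synthesized blip (example filename)
out = dsp.buffer(length=LENGTH)      # empty 20-second buffer

for cx, cy, area in stars:
    onset = (cx / width) * LENGTH            # horizontal position -> time slot, left to right
    factor = 1.0 / (1.0 + area / 100.0)      # bigger stars -> slower playback -> deeper blip
    out.dub(blip.speed(factor), onset)       # mix the pitched blip into its slot

out.write("blips.wav")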
Pads
Given that the pure blipping alone is not really immersive, we decided to add some atmospheric elements to the composition. For this, we resorted to good old Ableton and some nice atmospheric pads that we selected (and edited) from our collection. This part is completely manual: one pad is played for each object. M1 and M110 have their own dedicated pads, playing in C and B respectively (the first and last notes of the scale in syllabic notation).
The rest of the astral objects have been split into groups of 12 elements, and each group has been assigned a different pad. Within each group, the 12 notes of an altered chromatic scale have been played, thus covering the whole collection of 110 elements.
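Just to make the bookkeeping concrete (the actual pads were selected and played by hand in Ableton, and the exact note order per group is an illustrative choice here), the assignment can be laid out like this:

# M1 and M110 have dedicated pads in C and B; the remaining 108 objects
# form 9 groups of 12, each group backed by its own pad and walking
# through the 12 notes of an (altered) chromatic scale.
CHROMATIC = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

for m in range(2, 110):           # M2 .. M109
    idx = m - 2                   # 0-based position among the remaining objects
    group = idx // 12             # which pad backs this object (0..8)
    note = CHROMATIC[idx % 12]    # note within the group
    print(f"M{m}: pad {group + 1}, note {note}")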
Check out the final high-quality result on OpenSea.
Wrap up
More work has been done to make sure we could easily regenerate all videos and sounds in case we found better pictures or ran into copyright issues. We extensively used ffmpeg for the video composition and for the mixing/effects. We will probably describe the whole pipeline in a later post. Meanwhile, you can find the sounds and more details inside the GitHub repository. We hope you found this article interesting. Feel free to drop an email at
inner [dot] space [at] posteo [dot] de
for clarifications and suggestions, or just follow/write to us on Twitter.
Live long and prosper,
Inner Space