animate-earth

Overview

This repository is an experiment in applying optical-flow-based motion interpolation to processed RGB satellite imagery from Himawari-8’s AHI imager, with the aim of regularly producing high-quality smoothed videos from these images and sharing them with the world. The final products can be found on this YouTube channel, which is updated nearly daily.

Acknowledgements

I didn’t invent any of this stuff; I’m just putting the pieces together. The images come from the JMA’s Himawari-8 satellite, processed to true color by Steve Miller and Dan Lindsey at NOAA/CIRA/RAMMB(/CSU/NESDIS/USA…). Motion interpolation is accomplished with the open source Butterflow tool by dthpham, which uses OpenCV’s implementation of an algorithm devised by Gunnar Farnebäck.

Purpose

The primary purpose of this project is to share the awe-inspiring images of Earth created by the Himawari-8 AHI and the NOAA/CIRA/RAMMB team; as such, these videos are optimized for aesthetic beauty rather than scientific utility. However, the project also aims to encourage discussion and experimentation around the wider use of motion interpolation in scientific visualization and image processing, so feel free to open a ticket in the "Issues" section if you have questions or ideas about other ways this technique could be applied. The videos on YouTube are licensed freely and can be used for any purpose; to obtain high-resolution original files, contact me at [email protected]. Attribution is appreciated but not required.

Process

The process implemented by these scripts, described in detail in the Scripts section below, is essentially:

  1. scrape.js - Retrieves new high-resolution true color Himawari-8 images from the CSU/RAMMB webpage and saves them locally.

  2. main.js - The main image processing pipeline, which does the following:

    1. Splits the list of images into day-long sessions.

    2. Makes cropped copies of the full-sized images for different regions & sessions.

    3. Combines the cropped images into low-framerate video files.

    4. Parses image timestamps to identify gaps between images that are longer than the expected 10-minute interval.

    5. Generates and runs a command for Butterflow to produce a high-framerate, motion-interpolated video that corrects for imaging gaps by interpolating additional frames from the low-framerate video (a rough sketch of this idea follows the list).

    6. Compresses and saves the video as a local .mp4 file.

  3. youtube.js - Automatically updates the project’s YouTube channel by creating playlists for each product and uploading newly created videos via the YouTube Data API (a second sketch below illustrates this step).
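
To make steps 4 and 5 concrete, here is a rough JavaScript sketch of the idea (not the actual main.js code). It assumes frame timestamps in milliseconds and a hypothetical 8 FPS low-framerate video, and it slows the subregion around each gap so extra frames are interpolated there; check butterflow -h for the exact subregion syntax supported by your version:

// Hypothetical sketch, not the real pipeline: detect imaging gaps and build a
// Butterflow command that interpolates extra frames across them.
const EXPECTED_INTERVAL = 10 * 60 * 1000; // expected Himawari-8 cadence: 10 minutes
const SOURCE_FPS = 8;                     // assumed frame rate of the low-framerate video

// return gaps as frame indices into the low-framerate video
function findGaps(timestamps) {
  const gaps = [];
  for (let i = 1; i < timestamps.length; i++) {
    const delta = timestamps[i] - timestamps[i - 1];
    if (delta > EXPECTED_INTERVAL * 1.5) {
      gaps.push({
        startFrame: i - 1,
        endFrame: i,
        missing: Math.round(delta / EXPECTED_INTERVAL) - 1 // frames that were never imaged
      });
    }
  }
  return gaps;
}

// build a butterflow invocation: each gap becomes a subregion slowed down in
// proportion to the number of missing frames, so more frames are synthesized there
function butterflowCommand(gaps, inFile, outFile, targetFps = 60) {
  const subregions = gaps.map(g => {
    const a = (g.startFrame / SOURCE_FPS).toFixed(2); // region start, in seconds
    const b = (g.endFrame / SOURCE_FPS).toFixed(2);   // region end, in seconds
    const spd = (1 / (g.missing + 1)).toFixed(3);     // e.g. 2 missing frames -> spd=0.333
    return `a=${a},b=${b},spd=${spd}`;
  });
  const regions = subregions.length ? subregions.join(':') : 'full,spd=1';
  return `butterflow -s ${regions} -r ${targetFps} -o ${outFile} ${inFile}`;
}

// example: frames at 00:00, 00:10 and 00:40 UTC -> one 30-minute gap (2 missing frames)
const timestamps = [0, 600000, 2400000];
console.log(butterflowCommand(findGaps(timestamps), 'session.mp4', 'session-interp.mp4'));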
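
Similarly, here is a hedged sketch of the upload performed in step 3, assuming the googleapis Node client (the real youtube.js may use a different client); the auth argument must be an OAuth2 client already authorized with the youtube.upload scope:

// Hypothetical sketch of uploading a finished video and adding it to a
// product playlist via the YouTube Data API (googleapis Node client assumed).
const fs = require('fs');
const { google } = require('googleapis');

async function uploadVideo(auth, filePath, title, playlistId) {
  const youtube = google.youtube({ version: 'v3', auth });

  // upload the rendered .mp4
  const { data: video } = await youtube.videos.insert({
    part: ['snippet', 'status'],
    requestBody: {
      snippet: { title, description: 'Motion-interpolated Himawari-8 imagery' },
      status: { privacyStatus: 'public' }
    },
    media: { body: fs.createReadStream(filePath) }
  });

  // add the new video to the product's playlist
  await youtube.playlistItems.insert({
    part: ['snippet'],
    requestBody: {
      snippet: {
        playlistId,
        resourceId: { kind: 'youtube#video', videoId: video.id }
      }
    }
  });
  return video.id;
}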

Setup Guide

Butterflow

The project’s main dependency is Butterflow. Butterflow supports Mac OS X, Ubuntu 15.04 (Vivid Vervet), Debian 8.2 (Jessie), and Arch Linux.

Mac OS X

The current version of Butterflow seems to be most stable, reliable, and easiest to set up on OS X, provided you have a recent Mac that supports OpenCL. An older version can be installed with Homebrew via this one-liner; however, at the time of writing, the version of butterflow published on Homebrew contains a bug that leaves uninterpolated gaps between video segments. For this reason, I recommend installing from a more recent version of the source, unless you only plan to use butterflow without these scripts, for simple interpolation of existing videos that have no missing frames to account for.

To install from source on OS X, just follow these instructions. You will need to install Homebrew first if you don’t already have it. Note that the last few commands in that list must be re-entered any time you open a new terminal session or reboot, in order to use the source version instead of the global version:

cd butterflow
# activate the project's virtualenv
source bin/activate
# expose Homebrew's Python packages (e.g. OpenCV) inside the virtualenv
echo "$(brew --prefix)/lib/python2.7/site-packages" > lib/python2.7/site-packages/butterflow.pth
# install butterflow in development mode so it runs from this source checkout
python setup.py develop

Ubuntu Linux

I have also installed Butterflow from source on Ubuntu 15.04; it may not work on older versions due to their lack of FFmpeg support. At the time of writing, I am seeing degraded performance on Ubuntu with an NVIDIA GPU (roughly 20x slower than a comparable Mac), which others may or may not experience; I’m working with the developer to debug this. The Ubuntu install instructions are here. If you’re using an NVIDIA graphics card, you may also have to install these extra dependencies and perform these steps.

Other Linuxes

dthpham (Butterflow developer) also provides instructions for installation on Debian and Arch Linux, which I have not tried. Other Linux distributions may also work similarly; the main dependencies to install are Python, OpenCV, OpenCL and FFmpeg.

Playing with Butterflow

Once you have installed Butterflow, make sure it’s working by running butterflow --version and butterflow -d. Now you can slow down an existing video and interpolate it to 60 FPS by running a command like:

butterflow -s full,spd=0.5 -r 60 -o output.mp4 input.mp4

If you don’t have an mp4 lying around, you can make one from an animated gif using ffmpeg (which is already installed as a Butterflow dependency):

ffmpeg -i input.gif output.mp4

More example Butterflow usage is shown on the wiki, and additional command parameters can be discovered by running butterflow -h.

Project Scripts

The scripts require a bit of cleanup on my part, and therefore probably should not be used yet unless you know what you are doing. I will clean them up shortly and put some more instructions here when I do.

If you are the adventurous type, you can clone this repo and install dependencies by running:

git clone https://github.com/dandelany/animate-earth.git
cd animate-earth
npm install
npm install -g babel

The useful scripts are located in the pipeline directory and can be run with babel-node:

cd pipeline
babel-node <filename>.js

config.js

scrape.js

main.js

youtube.js

Results

Discussion & Improvements

OPTFLOW_USE_INITIAL_FLOW + machine learning model?

Appendix: useful commands

# interpolate video with butterflow
butterflow -s full,spd=0.5 -r 60 -o output.mp4 input.mp4

# make video from frames
ffmpeg -framerate 30 -pattern_type glob -i './*.jpg' -c:v libx264 -r 30 -pix_fmt yuv420p output.mp4

# extract frames from video
ffmpeg -i input.mp4 -r 30 -f image2 img/f%03d.png

# get info about video
ffprobe input.mp4

# small animated gif from video
ffmpeg -i 30-interp.mp4 -pix_fmt rgb24 -s 320x240 output.gif

# video from animated gif
ffmpeg -i input.gif output.mp4

# side by side video comparison of two videos
# A left side vs B left side
ffmpeg -i inputA.mp4 -i inputB.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS[bg]; [1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w/2" output.mp4

# A right side vs B right side
ffmpeg -i inputA.mp4 -i inputB.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS[l]; [1:v]setpts=PTS-STARTPTS[r]; [l]crop=iw/2:ih:iw/2:0[l]; [r][l]overlay=0" output.mp4

# A left side vs B right side
ffmpeg -i inputA.mp4 -i inputB.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS[l]; [1:v]setpts=PTS-STARTPTS[r]; [l]crop=iw/2:ih:0:0[l]; [r][l]overlay=0" output.mp4
