
1 Introduction to FFmpeg

FFmpeg is open-source software, available for Linux, Windows and macOS. It's a very powerful command line tool and has no graphical user
interface.
Main project website: www.ffmpeg.org
Download site for the Windows builds: https://www.gyan.dev/ffmpeg/builds/
Alternative download site for Windows builds: https://github.com/BtbN/FFmpeg-Builds/releases

How to install the Windows build:


Download the file "ffmpeg-git-essentials.7z", open the archive, open the "bin" folder and copy ffmpeg.exe, ffprobe.exe and ffplay.exe to a new folder, for
example to c:\ffmpeg\ That's all.
In rare cases, if you need some special libraries (for example lensfun), you might need to download the "ffmpeg-git-full" version instead. But you won't
need it for most examples in this book.

ffmpeg.exe is the main program, a very powerful tool for manipulating videos.


ffprobe.exe is a program for viewing the properties of videos, pictures or audio files. It's useful for troubleshooting.
ffplay.exe is a video player. In most cases we don't need it if we use the VLC player instead.
It's also a good idea to save the file doc\ffmpeg-all.html somewhere on your own computer. This file contains (almost) the full documentation for
FFmpeg. The most important chapters are "Audio Filters" and "Video Filters".

In addition to the official documentation, there are also two wikis available:
• https://trac.ffmpeg.org/wiki This wiki contains much useful information. All available pages are listed here:
https://trac.ffmpeg.org/wiki/TitleIndex
• https://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide This wiki is very incomplete and outdated.

How to search in the FFmpeg-user archives:
The archives are located at http://ffmpeg.org/pipermail/ffmpeg-user/
You can use a search engine and search, for example, for: setpts site:http://ffmpeg.org/pipermail/ffmpeg-user/

Other websites with FFmpeg examples:


https://hhsprings.bitbucket.io/docs/programming/examples/ffmpeg/index.html

Most examples in this document were tested with Windows 7, and beginning in March 2020 I also used Windows 10.

What are the pros and cons of FFmpeg, compared to other programs for video manipulation?
• Very powerful capabilities.
• It's an active project, updates come almost daily.
• Conversion from almost all formats to almost all other formats.
• In most cases, there are no restrictions for video size (width * height), except for extremely large sizes.
• There is a mailing list where you can ask questions in English. Before you ask, make sure that the problem is reproducible in the latest FFmpeg
version. Always include the complete FFmpeg console output, because it contains much useful information.
• FFmpeg is a command line program and has no graphical user interface. At first glance this sounds like a big drawback, but it's a good idea to
have all commands in a batch file, because later you can easily modify any step in the workflow. Just modify the batch file
and execute it again.
• You will need some time for learning FFmpeg.
• Unfortunately the documentation is the weak point of the project, and many times I wished that the documentation contained more information
and especially more examples.1
• It's always a good idea to begin with a working example, and then modify it step by step. I hope that the examples in this book are a good starting
point for you.

1 Why is FFmpeg's official documentation so incomplete? I think documentation has the lowest possible priority for the developers, and most of those
users who could write better documentation (including me) are unable or unwilling to work with Git, which is the only way to make any changes to the
documentation.

• Unfortunately it's very complicated to compile FFmpeg for Windows, which makes it almost impossible to make your own changes to the code. I
haven't yet figured out how that works, even after years of using FFmpeg. Compiling for Linux is easier.
• FFmpeg and DaVinci Resolve complement each other. It's best if you know and use both of them.

1.1 What can be done with FFmpeg?

• Convert a video, picture or sound from one format to another.


• Make a (timelapse) video from many pictures.
• Make many pictures from a video.
• Cut segments from a video, for example remove the beginning or the end.
• Add or remove audio, or modify the audio volume.
• Change the video size (width x height).
• Enlarge parts of the video or cut away the borders, for example make a rectangular video square.
• Change the speed, timelapse or slow motion.
• Rotate, mirror or flip.
• Add texts to a video.
• Correct brightness, contrast, gamma, saturation, color temperature, also with look-up tables.
• Masking, for example superimpose a circular mask over a video.
• Fade-in, fade-out and crossfade for video and audio.
• Morphing, for example curved texts for fulldome projection, or simulation of curved spacetime near black holes.
• Stabilize shaky videos.
• Deflicker, for reducing brightness steps in timelapse.
• Change the video compression, to make the video smaller.
• and many, many more interesting things...

1.2 If FFmpeg has no graphical user interface, how do we use it?

There are three possible ways:


1. Open a console window: All Programs / Accessories / Command Prompt (in German: Alle Programme / Zubehör / Eingabeaufforderung).
Another way to open the console window is to press WIN+R and then enter "cmd".
2. In the Windows File Explorer, in the address bar, you can type cmd and press enter to get a command prompt in the directory you are currently
examining.
3. But the above ways aren't recommended, because in many cases the command lines are quite long and you don't want to type the same
command line over and over again. The recommended way is to write a batch file which contains the FFmpeg command line:
• A batch file has the extension *.bat and can be created and modified with any text editor. When you save a batch file with Notepad, make sure that
you choose "all files" and save it as *.bat. Don't choose "text files", because then the extension would be *.bat.txt. (Hint: Configure the Explorer
so that all file extensions are visible!)
• You can edit a batch file by right clicking on it, and then choose "Edit".
• You can execute a batch file by double clicking on the icon or filename.
• Once you've created a batch file, you can place either it, or a shortcut to it, on your Windows desktop. Then you can drag-and-drop one or more
media files (depending on how you've designed it) onto the icon for processing by the batch file.
• It's recommended to begin with a working example, and then modify it step by step. Make small steps and always make a test run. If it fails, go
back to the last working version.
• The % character has a special meaning inside a batch file. If you need a % character in the FFmpeg command line, you must replace it in the
batch file by two %% characters.
• It's recommended to insert the command "pause" at the end of the batch file. This means the batch file waits for a keypress. Without this
command, the console window would close immediately when FFmpeg has finished, and you wouldn't see if there were any error messages.
• With the command "set" you can define variables in the batch file.
• With the command "rem" you can insert comments, so that you later understand how the batch file works. Comments can also begin with :: in the
same line as a command. Everything from :: to the end of the line is a comment.
• If the command line becomes too long, you can insert a ^ character at the end of the line and continue in the next line.
• How to copy and paste the content of the console window: Right-click on the title bar of the Command Prompt window, Edit -> Select All, then
Edit -> Copy, then paste the content with Ctrl-V somewhere else.
• If you don't want to write the full path to FFmpeg in each batch file, you should add the path to the PATH system variable. This article
describes how to do this: https://www.computerhope.com/issues/ch000549.htm

The following was copied from the above link:


For Windows 7:

1. From the desktop, right-click the Computer icon and select Properties. If you don't have a Computer icon on your desktop, click Start, right-click
the Computer option in the Start menu, and select Properties.
2. Click the Advanced System Settings link in the left column.
3. In the System Properties window, click the Advanced tab, then click the Environment Variables button near the bottom of that tab.
4. In the Environment Variables window, highlight the Path variable in the System variables section and click the Edit button. Add
or modify the path lines with the paths you want the computer to access. Each directory is separated with a semicolon.

For Windows 10:

1. From the desktop, right-click the very bottom-left corner of the screen to get the "Power User Task Menu".
2. From the Power User Task Menu, click System.
3. In the Settings window, scroll down to the Related settings section and click the System info link.
4. In the System window, click the Advanced system settings link in the left navigation pane.
5. In the System Properties window, click the Advanced tab, then click the Environment Variables button near the bottom of that tab.
6. In the Environment Variables window, highlight the Path variable in the System variables section and click the Edit button. Add
or modify the path lines with the paths you want the computer to access. Each directory is separated with a semicolon [...].

1.3 The first example
This is a simple batch file:
rem A simple batch file for making a video from many pictures

c:\ffmpeg\ffmpeg -framerate 5 -start_number 3551 -i IMG_%%4d.jpg -i birds.mp3 ^
 -shortest -codec:v mpeg4 -q:v 3 out.mp4

pause :: wait for a keypress

What's the meaning of the parts?


rem A simple ... This is a comment
c:\ffmpeg\ffmpeg This is the path to ffmpeg.exe
-framerate 5 This defines how fast the pictures are read in, in this case 5 pictures per second.
-start_number 3551 This is the number of the first picture, in this case 3551
-i IMG_%%4d.jpg This is the filename of the input pictures. The term %%4d stands for a 4-digit number. The filename of the first picture is
IMG_3551.jpg and the number will be increased by 1, until no more picture is found. For 3-digit numbers you would write %%3d
instead. The double %% characters are only required in batch files, because the % character must be escaped. If you type the
command line directly in the console window, then you must use a single % character instead.
-i birds.mp3 This is the second input file, in this case an audio file.
^ If the command line becomes too long in the batch file, you can break it with the ^ character and continue in the next line.
FFmpeg will get the whole line without the ^ character.
-shortest This option means that the length of the output video is determined by the shortest of the two input files.
-codec:v mpeg4 This option means that a MPEG4 video will be produced.
-q:v 3 This is an option for the quality of the output video. 1 is best quality, 3 is normal, 31 is strongest compression.
out.mp4 Filename of the output video
pause This command waits for a keypress, so that you have a chance to see any error messages before the console window closes.
:: wait for ... Everything right of :: is a comment until the end of the line.
Important: Options are always written before the file they refer to.
The options "-framerate 5" and "-start_number 3551" refer to the first input file "IMG_%%4d.jpg". Use "IMG_%%04d.jpg" for 0-padded numbers.
The second input file "birds.mp3" doesn't have any options in this case.
The options "-shortest -codec:v mpeg4 -q:v 3" refer to the output video "out.mp4".

1.4 Using variables
Using variables is much better programming style. This batch file has exactly the same function as the first example:

rem A simple batch file for making a video from many pictures

set "FF=c:\ffmpeg\ffmpeg" :: Path to ffmpeg.exe


set "FR=5" :: Framerate for reaing in the pictures (Frames per second)
set "SN=3551" :: Number of the first picture
set "IN=IMG_%%4d.jpg" :: Filename of the pictures
set "AUDIO=birds.mp3" :: Audio filename
set "QU=3" :: MP4 Quality, 1 ist best quality, 3 is normal, 31 is strongest compression
set "OUT=out.mp4" :: Output filemane

%FF% -framerate %FR% -start_number %SN% -i %IN% -i %AUDIO% -shortest -codec:v mpeg4 -q:v %QU% %OUT%

pause :: wait for a keypress

This is much clearer, because each variable is written in a new line and has its own comment.
It's recommended to use capital letters for the variables, so that you can easily distinguish them from command line options.
All variable names are allowed, but don't use special characters like ÄÖÜ.
You can copy a batch file and save it under a new name for a new project. Then you only need to set the variables so that they fit the new project. There
is no need to modify (or even understand) the command line.
Why are the variable definitions written in " " quotation marks? This is only necessary if you want to add a comment in the same line. Without comments,
the quotation marks are unnecessary.

2 FFmpeg in detail

2.1 Convert from one video format to another video format

Some examples for format conversion:

rem Convert any input format to any output format


ffmpeg -i anyinput.xxx anyoutput.xxx

rem Convert MP4 to mov


ffmpeg -i in.mp4 -acodec copy -vcodec copy -f mov out.mov

rem Convert mov to MP4


ffmpeg -i in.mov -acodec copy -vcodec copy out.mp4

rem Convert mov to MP4 using h265 compression, default preset is medium, default crf is 28
ffmpeg -i in.mov -c:v libx265 -preset slow -crf 25 -acodec copy out.mp4

2.2 Change the container format

If you only want to change the container format from MKV to MP4, it's not necessary to re-encode the video and audio streams. These commands are very
fast:
ffmpeg -i in.mkv -vcodec copy -acodec copy out.mp4
or
ffmpeg -i in.mkv -c:v copy -c:a copy out.mp4

2.3 Fit timelapse length to music length

How to give a timelapse video exactly the same length as the music?
We don't want to cut off the end of the music, and we don't want to hear silence at the end of the timelapse video.
The solution is to adjust the framerate, so that the length of the timelapse becomes equal to the music length.
Framerate = Number_of_images / Time_in_seconds
In this example we have 30 images and the music is 20 seconds long, so that the framerate must be 1.5.
rem A simple batch file for combining many images to a video

set "FR=1.5" :: Framerate for reading in the images (frames per second)
set "RATE=30" :: Output framerate
set "SN=3551" :: Number of the first image
set "IN=IMG_%%4d.jpg" :: Filename of the images
set "AUDIO=birds.mp3" :: Audio filename
set "QU=3" :: MP4 Quality, 1 is best Quality, 3 is normal, 31 is strongest compression
set "OUT=out.mp4" :: Output file

ffmpeg -framerate %FR% -start_number %SN% -i %IN% -i %AUDIO% -r %RATE% -shortest -codec:v mpeg4 -q:v %QU% %OUT%

pause :: Wait for a keypress

In this example we have two different framerates, which have different purpose:
• -framerate %FR% this is the framerate for reading in the images
• -r %RATE% this is the framerate of the output video.
These two framerates are totally independent from each other, and can be different. If the images are read in slower than the output framerate, FFmpeg
will automatically duplicate images. If the images are read in faster, then FFmpeg will automatically skip images.

2.4 Timelapse or slideshow from many images, with crossfading

rem Make a timelapse or slideshow from many images, with crossfading

set "RATE=30" :: Output framerate


set "SN=3551" :: Number of first image
set "IN=IMG_%%4d.jpg" :: Filename of the images
set "W=2000" :: Image width
set "H=1500" :: Image height
set "QU=3" :: MP4 Quality, 1 is best Quality, 3 is normal, 31 is strongest compression
set "OUT=out.mp4" :: Output file
:: A is the duration how long each image is shown (without crossfading), here 1.0 sec
:: B is the duration of the crossfade, here 0.5 sec
set "C=3" :: set C = (A+B)/B (you must calculate this integer manually)
set "D=2" :: set D = 1/B (you must calculate this floating point value manually)

ffmpeg -start_number %SN% -i %IN% ^
 -vf zoompan=d=%C%:fps=%D%:s=%W%x%H%,framerate=fps=%RATE%:interp_start=0:interp_end=255:scene=100 ^
 -codec:v mpeg4 -q:v %QU% %OUT%

pause :: Wait for a keypress

Inside the video filter chain (beginning with -vf) we have two filters in this example, which are applied one after the other. The first is "zoompan" and the
second is "framerate".
You must calculate the variables C and D manually, because no expressions are allowed inside the "zoompan" filter.

Detailed explanations for this filter chain:


-vf zoompan=d=%C%:fps=%D%:s=%W%x%H%,framerate=%RATE%:interp_start=0:interp_end=255:scene=100

In this filter chain two video filters are applied consecutively, separated by a comma (,).
1. "zoompan", with the parameters "d" , "fps" and "s"
2. "framerate", with the parameters "fps", "interp_start", "interp_end", and "scene"

https://www.ffmpeg.org/ffmpeg-all.html#zoompan
The zoompan filter is here not used for zooming in, but for duplicating the frames and passing them to the next filter with a certain framerate.
"d" specifies how often each frame is repeated.
"fps" is the output framerate of this filter.
"s" is the size of the output frames. It must be specified in most cases, because the default is 1280x720.

https://www.ffmpeg.org/ffmpeg-all.html#framerate
The framerate filter can calculate intermediate images between consecutive images. This is not motion interpolation but a crossfade.
"fps" is the output framerate. It's not required to write the parameter name explicitly; you could also write framerate=fps=%RATE%:...
The remaining three parameters "interp_start", "interp_end", and "scene" specify when interpolation is active and when not. With the values that I
used (0, 255, 100), interpolation is always active.

These two filters together produce a video in which each image is shown for a certain duration, followed by a crossfade to the next image, which also has
a certain duration. Both durations can be chosen freely; these are the values A and B in the comments. From these values you must manually calculate
the variables C and D, which are used in the command line. I haven't yet found a way to make this calculation automatically. It's possible to make
calculations in the batch file, but this works only with integer precision.
If you omit the zoompan filter and use only the framerate filter, the next crossfade would immediately follow when the previous one has ended. In other
words: you always have a crossfade, and there is no time where an image is shown without crossfade. That's why we use the trick with the zoompan
filter. It's still the case that one crossfade follows immediately on the previous one, but now we have crossfades between identical images,
because the images were duplicated by the zoompan filter. A crossfade between identical images isn't visible, of course.

How to repeat the frames so that the video length fits the music length: add -loop 1 and -shortest, as sketched below.
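
A minimal sketch of where these options go, assuming an additional audio input "birds.mp3" as in chapter 1.3 (the audio filename is an assumption;
-loop 1 must be written before the image input, -shortest before the output file):

ffmpeg -loop 1 -start_number %SN% -i %IN% -i birds.mp3 ^
 -vf zoompan=d=%C%:fps=%D%:s=%W%x%H%,framerate=fps=%RATE%:interp_start=0:interp_end=255:scene=100 ^
 -codec:v mpeg4 -q:v %QU% -shortest %OUT%

pause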

2.5 Slideshow with different durations

ffmpeg -i img%%4d.jpg -vf zoompan=d=25+'50*eq(in,3)'+'100*eq(in,5)' out.mp4

pause

In this example each image is shown for one second (25 frames), except the 4th image which is shown 3 seconds (25+50 frames) and the 6th image which is
shown 5 seconds (25+100 frames). Please note that the image numbering starts with 0, if not specified differently with "-start_number".
Please note that it might also be useful to specify the size of the output frames with the "s" option, because the default size is 1280x720.

It's also possible to do the same thing with the concat demuxer. Make a text file with this content:
file '/path/to/dog.png'
duration 5
file '/path/to/cat.png'
duration 1
file '/path/to/rat.png'
duration 3
file '/path/to/tapeworm.png'
duration 2
file '/path/to/tapeworm.png'
Note: The last image has to be specified twice, the second time without any duration.
Then use this command line:
ffmpeg -f concat -i input.txt -vsync vfr -pix_fmt yuv420p output.mp4

pause

See also: https://trac.ffmpeg.org/wiki/Slideshow

2.6 Slideshow with scrolling images

Images scrolling from left to right:


set "IN=test%%3d.jpg" :: Input images
set "N=6" :: Number of images
set "SX=400" :: X Size
set "SY=300" :: Y Size
set "T=5" :: Time in seconds for scrolling from one image to the next image
set "FPS=30" :: Output framerate
set "OUT=out.mp4" :: Output filename

rem Make some test images

ffmpeg -f lavfi -i testsrc2=size=%SX%x%SY%:duration=%N%:rate=1 -start_number 0 -y test%%3d.jpg

rem Make a scrolling slideshow

ffmpeg -framerate 1/%T% -start_number 1 -i %IN% -framerate 1/%T% -start_number 0 -i %IN% -filter_complex [0][1]hstack,fps=%FPS%,crop=w=iw/2:x='iw/2*(1-mod(t,%T%)/%T%)' -y %OUT%

pause

Images scrolling from right to left:


ffmpeg -framerate 1/%T% -start_number 0 -i %IN% -framerate 1/%T% -start_number 1 -i %IN% -filter_complex [0][1]hstack,fps=%FPS%,crop=w=iw/2:x='iw/2*mod(t,%T%)/%T%' -y %OUT%

pause

Images scrolling from top to bottom:
ffmpeg -framerate 1/%T% -start_number 1 -i %IN% -framerate 1/%T% -start_number 0 -i %IN% -filter_complex [0][1]vstack,fps=%FPS%,crop=h=ih/2:y='ih/2*(1-mod(t,%T%)/%T%)' -y %OUT%

pause

Images scrolling from bottom to top:


ffmpeg -framerate 1/%T% -start_number 0 -i %IN% -framerate 1/%T% -start_number 1 -i %IN% -filter_complex [0][1]vstack,fps=%FPS%,crop=h=ih/2:y='ih/2*mod(t,%T%)/%T%' -y %OUT%

pause

This is similar, but now showing two images simultaneously side by side. The width of the output video is twice the width of the input images:
set "IN=test%%3d.jpg" :: Input images
set "N=6" :: Number of images
set "SX=400" :: X Size
set "SY=300" :: Y Size
set "T=5" :: Time in seconds for scrolling from one image to the next image
set /a "D=%T%*(%N%-2)" :: Total duration in seconds
set "FPS=30" :: Output framerate
set "OUT=out.mp4" :: Output filename

rem Make some test images

ffmpeg -f lavfi -i testsrc2=size=%SX%x%SY%:duration=%N%:rate=1 -start_number 0 -y test%%3d.jpg

rem Make a scrolling slideshow

ffmpeg -framerate 1/%T% -start_number 2 -i %IN% -framerate 1/%T% -start_number 1 -i %IN% -framerate 1/%T% -start_number 0 -i %IN% -filter_complex [0][1][2]hstack=inputs=3,fps=%FPS%,crop=w=2*iw/3:x='iw/3*(1-mod(t,%T%)/%T%)' -t %D% -y %OUT%

pause

Note: "set /a" is a Windows batch command and calculates a variable (in this case: the total duration of the output video). Only integer arithmetic is
possible, no floating point. This is necessary in this batch file, because the "-t" option doesn't accept expressions, and using the "trim" filter as a

26
workaround is also impossible, because it doen's accept expressions.

Same thing as before, but scrolling from right to left:


ffmpeg -framerate 1/%T% -start_number 0 -i %IN% -framerate 1/%T% -start_number 1 -i %IN% -framerate 1/%T% -start_number 2 -i %IN% -filter_complex [0][1][2]hstack=inputs=3,fps=%FPS%,crop=w=2*iw/3:x='iw/3*mod(t,%T%)/%T%' -t %D% -y %OUT%

pause

Same thing as before, but scrolling from top to bottom:


ffmpeg -framerate 1/%T% -start_number 2 -i %IN% -framerate 1/%T% -start_number 1 -i %IN% -framerate 1/%T% -start_number 0 -i %IN% -filter_complex [0][1][2]vstack=inputs=3,fps=%FPS%,crop=h=2*ih/3:y='ih/3*(1-mod(t,%T%)/%T%)' -t %D% -y %OUT%

pause

Same thing as before, but scrolling from bottom to top:


ffmpeg -framerate 1/%T% -start_number 0 -i %IN% -framerate 1/%T% -start_number 1 -i %IN% -framerate 1/%T% -start_number 2 -i %IN% -filter_complex [0][1][2]vstack=inputs=3,fps=%FPS%,crop=h=2*ih/3:y='ih/3*mod(t,%T%)/%T%' -t %D% -y %OUT%

pause

2.7 Extract many images from a video

rem Extract many images from a video

ffmpeg -i in.mp4 -vf fps=0.2 -y image%%4d.jpg

pause :: Wait for a keypress

This batch file reads the file in.mp4 and produces images with the filenames
image0000.jpg, image0001.jpg, image0002.jpg, and so on.
-vf fps=0.2 this specifies that images are extracted with a framerate of 0.2, which means one image every 5 seconds.
Omit this option if you want to extract all frames.
-y this option means that FFmpeg will overwrite any output files that already exist with the same filename, without asking. If you omit this option,
FFmpeg will ask before overwriting a file.

This example extracts every n-th frame from a video, beginning with the 0th frame:
set "IN=video.mp4" :: Input video
set "STEP=10" :: Step width
set "OUT=image%%4d.jpg" :: Output images filename

ffmpeg -i %IN% -vf framestep=%STEP% -y %OUT%

pause

2.8 Extract the n-th frame from a video

rem Make a test video

ffmpeg -f lavfi -i testsrc2=size=vga -t 3 -y test.mov

rem Extract the N_th frame

set "N=10" :: Frame number to be extracted (assuming the first frame is number 1)

ffmpeg -i test.mov -vf select='eq(n,%N%-1)' -frames 1 -y out.png

pause

Note: The frame numbers begin with 0; that's why you must compare n with 9 if you want the 10th frame.

2.9 Extract the first and last frame from a video

rem Extract the first frame


ffmpeg -i in.mp4 -frames 1 -y first_frame.jpg

rem Extract the last frame


ffmpeg -sseof -0.2 -i in.mp4 -update 1 -y last_frame.jpg

pause

2.10 Modify brightness, contrast, saturation, gamma and hue

rem Modify brightness, contrast, saturation, gamma and hue

set "INPUT=PanoView.mp4" :: Input video


set "OUTPUT=out.mp4" :: Output video
set "CONTRAST=1.0" :: Contrast in range -1000 to 1000, normal is 1.0
set "BRIGHT=0.0" :: Brightness in range -1.0 bis 1.0, normal is 0.0
set "SATUR=1.2" :: Saturation in range 0.0 bis 3.0, normal is 1.0
set "GAMMA=1.0" :: Gamma in range 0.1 to 10.0, normal is 1.0
set "HUE=20" :: Color correction (hue), negative shifts towards red and positive towards blue, normal is 0
:: Typical values are in the -30...+30 range
set "QU=3" :: MP4 Quality, 1 is best Quality, 3 is normal, 31 is strongest compression

ffmpeg -i %INPUT% -vf hue=h=%HUE%,eq=contrast=%CONTRAST%:brightness=%BRIGHT%:saturation=%SATUR%:gamma=%GAMMA% ^
 -q:v %QU% -codec:v mpeg4 %OUTPUT%

pause

-vf is the option for a video filter chain. There are many different filters; see the chapter "Video Filters" in the FFmpeg documentation.
In this case we use two filters, which are separated by a comma (,).
• The first filter is "hue" and makes a rotation of the color circle.
• The second filter is "eq" and adjusts contrast, brightness, saturation and gamma.
From a mathematical point of view these functions work as follows:
• Contrast is a multiplication by a constant. Note that contrast scales the distance of a pixel's value from the median value, i.e.
128 for an 8-bit input. So, if a pixel channel has a value of 100, then a contrast of 3 results in a value of 128 + 3*(100-128) = 44.
• Brightness is the addition of a constant.
• Saturation is difficult to describe mathematically. Setting saturation to 0 would produce a black and white video.
Warning: De-saturating a video with eq=saturation=0 produces a greenish tint in the output video. It's better to use hue=s=0.
• Gamma is a nonlinear distortion of the transfer function. When you increase the gamma value, details in dark areas become more visible.
It doesn't matter in which order you write the parameters in the command line. They are always applied in the order contrast, brightness, gamma.

It's also possible to make a video monochrome by setting the saturation to 0, for example with the "eq" filter.

ffmpeg -i in.mp4 -lavfi eq=saturation=0 -y out.mp4

pause

This is a very simple method for converting a YUV color video to a monochrome video. The Y plane is extracted:
ffmpeg -i in.mp4 -lavfi extractplanes=y -y out.mp4

pause

2.24 Colorize a monochrome image

rem Make a monochrome image:

ffmpeg -f lavfi -i testsrc2,format=gray -frames 1 -y test.png

rem Colorize this image with orange color:

ffmpeg -i test.png -vf geq=r='1.0*p(X,Y)':g='0.5*p(X,Y)':b='0.0*p(X,Y)' -frames 1 -y out.png

pause

The same thing can also be done with the "colorchannelmixer" filter:
ffmpeg -i test.png -vf colorchannelmixer=rr=1.0:gr=0.5:br=0:gg=0:bb=0 -frames 1 -y out.png

pause

2.34 Colormap filter

The colormap filter can be used for correcting the colors, so that they fit to the colors of the ColorChecker:
rem Make the reference image:

ffmpeg -f lavfi -i colorchart=preset=reference -frames 1 -y reference.png

rem Now edit the reference image in image processing software. Change brightness,
rem contrast, gamma, hue and saturation.
rem Draw some additional lines in the image, but keep the centers of the patches unaffected.
rem Save the image as "modified.png"

rem Now try to reconstruct the original colors from the modified image:

ffmpeg -i modified.png -i modified.png -i reference.png -lavfi colormap=nb_patches=24 -frames 1 -y out.png

rem Show the difference between the two images. It should be a uniform gray surface,
rem if all colors have been reconstructed successfully. The additional lines must be visible.

ffmpeg -i out.png -i reference.png -lavfi blend=all_mode=grainextract -y diff.png

pause

Note: It's possible that "colormap" fails without an error message, for example if two colors in the "source" image are identical while the corresponding
colors in the target image are different. It's impossible to map one source color to two different target colors. In this case the output of the "colormap"
filter is a copy of the first input.

Note: The second (source) and third (target) input of the colormap filter must have the same size. The first input can have any size.

Note: The "colormap" filter picks only the central pixels from the patches. There is no averaging done in this filter. That means some preprocessing is
required for images of real-world colorcheckers, especially geometric transformation (for example with "perspective" filter) and averaging with a blurring
filter. See next example.

Note: The default value of the "patch_size" option is 64x64, which is also the default patch size in the colorchart source. So it's not necessary to specify
it.

Note: The option "nb_patches" is the number of patches that are used. By default it equals the number of available patches. If "nb_patches" is smaller
than the number of available patches, then the patches are used line-wise from left to right, beginning in the top left corner.

This is a real-world example for color mapping. Geometric transformation is applied to the image of the ColorChecker, and then the image is scaled to the
same size as the reference image. It is also blurred to improve color accuracy. Finally the colors of the input image are mapped to the correct output
colors.
rem Take a picture of the ColorChecker and measure the corner coordinates

set "X0=1656" :: Top left corner (dark skin)


set "Y0=691"
set "X1=4116" :: Top right corner (bluish green)
set "Y1=1226"
set "X2=1269" :: Bottom left corner (white)
set "Y2=2316"
set "X3=3796" :: Bottom right corner (black)
set "Y3=2870"

rem Apply geometric transformation, scale to 384x256 and use avgblur for averaging:

ffmpeg -i image.jpg -lavfi format=argb,perspective=x0=%X0%:y0=%Y0%:x1=%X1%:y1=%Y1%:x2=%X2%:y2=%Y2%:x3=%X3%:y3=%Y3%,scale=384x256,avgblur=10 -y source_colors.png

rem Make the reference image (this is the perfect ColorChecker image):

ffmpeg -f lavfi -i colorchart=preset=reference -frames 1 -y reference.png

rem Now reconstruct the true colors of the input image:

ffmpeg -i image.jpg -i source_colors.png -i reference.png -lavfi colormap=nb_patches=24 -frames 1 -y out.png

rem Calculate the difference between output and input:

ffmpeg -i out.png -i image.jpg -lavfi blend=all_mode=grainextract -y diff.png

pause

[Figures: the input image with wrong colors caused by a fluorescent lamp; the extracted "source_colors.png" image; the corrected output image; and the "reference.png" image which contains the target colors.]

Note: It's possible to speed up filtering by using the "haldclutsrc" source and "haldclut" filter. First apply the "colormap" filter to a "haldclutsrc" image,
and then filter the image or video with the "haldclut" filter.
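
This is a minimal, untested sketch of that approach. It assumes the files "source_colors.png" and "reference.png" from the example above; the
filenames "clut.png" and "video.mp4" and the Hald CLUT level 8 are assumptions:

rem Apply the colormap filter once to an identity Hald CLUT image:

ffmpeg -f lavfi -i haldclutsrc=8 -i source_colors.png -i reference.png -lavfi colormap=nb_patches=24 -frames 1 -y clut.png

rem Then apply the CLUT to any image or video with the same color distortion:

ffmpeg -i video.mp4 -i clut.png -lavfi haldclut -y out.mp4

pause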

2.37 Amplify filter

The "amplify" filter amplifies differences between adjacent frames. Good for motion detection, but it's also sensitive to noise.

2.38 Sharpen or blur images

Images or videos can be blurred or sharpened with the "unsharp" filter:


ffmpeg -i 7z7a1256.jpg -vf unsharp=la=2 -y out.jpg

pause

These are the parameters of the "unsharp" filter:

Parameter Default value Description


lx 5 Luma matrix horizontal size, it must be an odd integer between 3 and 23.
ly 5 Luma matrix vertical size, it must be an odd integer between 3 and 23.
la 1 Luma effect strength, it must be a floating point number, reasonable values are between -1.5 and 1.5.
Negative values will blur the input video, while positive values will sharpen it, a value of zero will disable the effect.
cx 5 Chroma matrix horizontal size, it must be an odd integer between 3 and 23.
cy 5 Chroma matrix vertical size, it must be an odd integer between 3 and 23.
ca 0 Chroma effect strength, it must be a floating point number, reasonable values are between -1.5 and 1.5.
Negative values will blur the input video, while positive values will sharpen it, a value of zero will disable the effect.

For blurring you can also use these filters:
• "dblur" for directional blur (any directions are possible)
• "gblur" for gaussian blur (circular, elliptical, horizontal or vertical)
• "avgblur" for average blur (horizontal and vertical box size can be set independently, but not smaller than 3x3)
• "convolution" filter (size from 3x3 to 7x7)
• "fftfilt" filter
• "removegrain" filter with mode=20 (only 3x3 averaging)
• "sab" filter (Shape Adaptive Blur)
• "smartblur" filter (This filter doesn't impact the outlines)
• "yaepblur" filter ("yet another edge preserving blur filter")

Brightness distribution (resulting from a single white pixel in the center):

Pixels from center: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19


dblur=angle=0:radius=5 25 20 17 13 11 9 8 5 4 4 3 2 2 2 1 1 1 1 1 0
gblur=sigma=5:sigmaV=0 36 26 19 15 11 8 6 4 3 2 2 1 1 1 1 0 0 0 0 0
avgblur=sizeX=5:sizeY=1 6 6 6 6 6 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0

Command line for testing with a single white pixel:


ffmpeg -f lavfi -i color=black:s=50x50 -vf drawbox=w=1:h=1:x=25:y=25:color=white,avgblur=sizeX=5:sizeY=1 -frames 1 -y out.png

pause

For variable blur you can use the "varblur" filter:
rem Create a circular mask:

ffmpeg -f lavfi -i nullsrc=size=vga -lavfi format=gray8,geq='25*lt(hypot(X-W/2,Y-H/2),200)' -frames 1 -y mask.png

rem Apply variable blurring:

ffmpeg -f lavfi -i testsrc2=size=vga -i mask.png -lavfi [0]format=gbrp[a];[a][1]varblur=max_r=25 -t 10 -y out.mp4

pause

2.40 Extract a time segment from a video

When you have a fisheye camera pointing upwards, it's unavoidable that you are visible in the video at the beginning and the end, because you must
start and stop the camera. That means we must cut off the beginning and the end.
rem Extract a time segment from a video

set "INPUT=PanoView.mp4" :: Input video


set "OUTPUT=out.mp4" :: Output video
set "START=2.0" :: Start time in seconds
set "LENGTH=3.0" :: Length of the segment in seconds

ffmpeg -ss %START% -t %LENGTH% -i %INPUT% -c copy %OUTPUT%

pause

The arguments for -ss and -t can also be specified in hours, minutes and seconds:
1:20 = 1 minute, 20 seconds
1:10:30 = 1 hour, 10 minutes, 30 seconds

Instead of the length, it's also possible to specify the end time with the -to option, for example (a sketch with assumed filenames and times):
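ffmpeg -i input.mp4 -ss 1:20 -to 1:50 -c copy output.mp4

pause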

If you want to save the output video with exactly the same quality as the input video (without re-encoding), then use the -c copy option. In this case it
makes no sense to specify the output video quality.
ffmpeg -ss 5 -i input.mov -t 10 -c copy output.mov

pause

The same thing can also be done with the "trim" filter.
For more information about seeking, see also https://trac.ffmpeg.org/wiki/Seeking

Note: If -ss is written before the input file, the cut will be at the nearest keyframe, but not at the exact time. However, if -ss is written after the input file,
the cut will be at the exact time, but there may be an empty part at the beginning of the file, until the next keyframe. The sketch below illustrates the
difference (the filenames are assumptions):
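
rem Fast seeking (-ss before the input): cuts at the nearest keyframe

ffmpeg -ss 10 -i input.mp4 -t 5 -c copy output1.mp4

rem Accurate seeking (-ss after the input): slower, because the frames are decoded

ffmpeg -i input.mp4 -ss 10 -t 5 output2.mp4

pause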

2.41 Remove a segment from a video

rem Make a 20s test video:


ffmpeg -f lavfi -i testsrc2 -t 20 -y test.mp4

rem Remove the segment from t>5 to t<10


ffmpeg -i test.mp4 -vf select='between(t,0,5)+between(t,10,20)',setpts=N/(FRAME_RATE*TB) -y out.mp4

rem This is (almost) the same, remove the segment from t=5 to t=10:
ffmpeg -i test.mp4 -vf select='not(between(t,5,10))',setpts=N/(FRAME_RATE*TB) -y out.mp4

pause

2.42 Trim filter

Drop everything except the second minute of input:


ffmpeg -i in.mp4 -vf trim=60:120 out.mp4

pause

Keep only the first second:


ffmpeg -i in.mp4 -vf trim=duration=1 out.mp4

pause

See also https://transang.me/practical-ffmpeg-commands-to-manipulate-a-video/

2.43 Tpad filter, add a few seconds black at the beginning or end

Method 1, using the "tpad" filter:


set "IN=my_video.mp4" :: Input video
set "DUR=3" :: Duration in seconds
set "OUT=out.mp4" :: Output video

ffmpeg -i %IN% -vf tpad=start_duration=%DUR% %OUT%

pause

The "tpad" filter inserts frames at the beginning or at the end of a video. These frames contain either a uniform color or a copy of the first or last frame.
The default color is black.
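
For example, to freeze the last frame for 3 seconds instead of adding black, a minimal sketch uses "stop_mode=clone" (the filenames are assumptions):

ffmpeg -i my_video.mp4 -vf tpad=stop_duration=3:stop_mode=clone -y out.mp4

pause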

Method 2, using the concat filter:


set "IN=my_video.mp4" :: Input video
set "DUR=3" :: Duration in seconds
set "OUT=out.mp4" :: Output video

ffmpeg -i %IN% -an -filter_complex "color=black:duration=%DUR%[black];[black][0:0]concat=n=2:v=1:a=0[v]" -map [v] %OUT%

pause

2.44 Extract the last 30 seconds of a video

When I make real-time videos of meteors, I let the Panasonic LUMIX GH5S camera record continuously. When I see a meteor, I say into the soundtrack in
which part of the sky I've seen it, and after about 10 seconds I press the REC button to stop the recording and immediately start a new recording. That
means after downloading the videos to the computer, meteors are always at the end of the videos. There is no need to watch the videos in full length
(that would be boring). This batch file extracts the last 30 seconds of the video which is dragged and dropped onto it, and for the output filename the string
"P1" is replaced by "CUT" (e.g. P1000336.MOV becomes CUT000336.MOV). It's lossless because the "-c copy" option is used.
set INPUT=%1
set OUTPUT=%INPUT:P1=CUT%

ffmpeg -sseof -30 -i %INPUT% -c copy %OUTPUT%

pause

This batch file (for Windows 7) does the same thing for all P1*.MOV files in the current folder:
for %%f in (P1*.MOV) do call :for_body %%f
goto :the_end

:for_body
set INPUT=%1
set OUTPUT=%INPUT:P1=CUT%
ffmpeg -sseof -30 -i %INPUT% -c copy -y %OUTPUT%
exit /b

:the_end

pause

2.45 Fade-in and fade-out

Fade-in and fade-out for a video of known length (only for video, not for audio). Here the times are expressed in frames:
ffmpeg -i input.mp4 -vf 'fade=in:0:30,fade=out:9650:30' output.mp4

pause

Fade-in and fade-out of a video of known length (both video and audio). Here the times are in seconds:
ffmpeg -i input.mp4 -vf 'fade=in:st=0:d=1,fade=out:st=32:d=1' -af 'afade=in:st=0:d=1,afade=out:st=32:d=1' output.mp4

pause

This is a workaround for fade in/out a video with unknown duration:


ffmpeg -i input.mp4 -sseof -1 -copyts -i input.mp4 -filter_complex "[1]fade=out:0:30[t];[0][t]overlay,fade=in:0:30[v]; anullsrc,atrim=0:2[at];[0][at]acrossfade=d=1,afade=d=1[a]" -map "[v]" -map "[a]" -c:v libx264 -crf 22 -preset veryfast -shortest output.mp4

pause

The trick is to feed the same input twice. From the second input only the last second is used. The timestamps are preserved. A fade-out is applied to the
short second input, and then both files are combined with overlay. For audio a 2 seconds dummy with silence is created, and then crossfaded with the
input audio. The -shortest option cuts the output to the same length as the input.

Another workaround for making fade-in and fade-out for audio of unknown length:
ffmpeg -i input.mp4 -filter_complex "afade=d=0.5, areverse, afade=d=0.5, areverse" output.mp4

pause

The same thing does also work for video, but keep in mind that you need a lot of memory for the reverse filter:
ffmpeg -i input.mp4 -filter_complex "fade=d=0.5, reverse, fade=d=0.5, reverse" output.mp4

pause

Another option is to use acrossfade with a silent track, but this doesn't work for video, because "acrossfade" is audio-only (for video, see the "xfade" filter in the next chapter):
ffmpeg -i input.mp4 -filter_complex "aevalsrc=0:d=0.6 [a_silence]; [0:a:0] [a_silence] acrossfade=d=0.6" output.mp4

pause

Afade curves are shown on this wiki page: https://trac.ffmpeg.org/wiki/AfadeCurves

2.46 Crossfading

The different types of xfade crossfades are shown on this wiki page:
https://trac.ffmpeg.org/wiki/Xfade

Both inputs must have a constant framerate and the same resolution, pixel format, framerate and timebase.
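
A minimal sketch for crossfading two clips with a 1-second "fade" transition (the filenames and times are assumptions; "offset" is the time in the first
input where the transition begins):

ffmpeg -i in1.mp4 -i in2.mp4 -filter_complex xfade=transition=fade:duration=1:offset=4 -y out.mp4

pause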

2.47 Crop a video

Cropping means to cut off the borders, and in the next step you can also set the size (width * height) of the output video:

rem Crop and set the output size

set "INPUT=PanoView.mp4" :: Input video


set "OUTPUT=out.mp4" :: Output video
set "CROP=1224:1224:0:0" :: Specify the visible part: Width, height, left edge, top edge
set "SIZE=800x800" :: Width and height of the output video (can be smaller or greater than the input video)
:: Keep the width/height ratio constant, otherwise the video looks distorted,
:: for example a circle would become an ellipse.
set "QU=3" :: MP4 Quality, 1 is best Quality, 3 is normal, 31 is strongest compression

ffmpeg -i %INPUT% -vf crop=%CROP% -s %SIZE% -q:v %QU% -codec:v mpeg4 %OUTPUT%

pause

In the crop filter you can use the variables "iw" and "ih", which are the width and height of the input video.
If the 3rd and 4th parameters (coordinates of the top left corner) aren't specified, the crop will automatically be centered.

crop=ih:ih makes a centered square crop, useful for fulldome videos


crop=iw/2:ih:0 returns the left half of the input video
crop=iw/2:ih:iw/2 returns the right half of the input video
crop=iw/4:ih/4 strong enlargement by a factor 4 in the center of the video

The "pad" filter does the opposite thing, it adds paddings with a uniform color to the video. See next chapter.

2.50 Changing the speed: slow motion and timelapse

rem Changing the speed (slow motion or timelapse)

set "INPUT=PanoView.mp4" :: Input video


set "OUTPUT=out.mp4" :: Output video
set "RATE=30" :: Output framerate
set "SPEED=3.0" :: Speed factor, smaller than 1 = timelapse, 1 = real time, greater than 1 = slow motion
set "QU=3" :: MP4 Quality, 1 is best Quality, 3 is normal, 31 is strongest compression

ffmpeg -i %INPUT% -vf setpts=%SPEED%*PTS -r %RATE% -q:v %QU% -codec:v mpeg4 -an -y %OUTPUT%

pause

In this example the settings for "RATE" and "SPEED" are totally independent of each other. FFmpeg will automatically skip or duplicate frames if
required.
Example: If both input and output framerate are 30, and if SPEED = 3, then each frame will automatically be duplicated 2 times, so that we see it 3 times in
the output video. If SPEED = 0.5, then every second frame is skipped.
In this example the slow motion or timelapse effect affects only video, not audio. It makes sense to disable the audio channel with the -an option.
The "setpts" filter is described in the "Multimedia Filters" section in the FFmpeg documentation.
The timebase (TB in the setpts filter) is expressed in seconds [s].
The framerate (FR in the setpts filter) is expressed in 1/seconds [s^-1].
In many cases the timebase is the reciprocal of the framerate, but this isn't always the case.

Some more examples:
setpts=0.5*PTS Double speed
setpts=2.0*PTS Half speed
setpts=PTS+x/(FR*TB) or tpad=x Delay by x frames (assuming the framerate is constant)
setpts=PTS+x/TB or tpad=x/framerate Delay by x seconds
setpts=PTS-STARTPTS Start counting PTS from zero
setpts=N/(FRAME_RATE*TB) Set new presentation timestamps, independent of input PTS
asetpts=N/(SR*TB) The same for audio (SR is the audio sample rate)
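
For example, this sketch resets the timestamps of video and audio so that playback starts at zero (the filenames are assumptions):

ffmpeg -i in.mp4 -vf setpts=PTS-STARTPTS -af asetpts=PTS-STARTPTS -y out.mp4

pause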

See also these Wiki pages:


https://trac.ffmpeg.org/wiki/How%20to%20speed%20up%20/%20slow%20down%20a%20video
https://trac.ffmpeg.org/wiki/ChangingFrameRate

2.51 Slow motion or timelapse only for a segment of the video

See the comments for explanation.


set "IN=7Z7A2089.mov" :: Input Video
set "T1=5" :: Start time T1
set "T2=8.5" :: Time T2 when slow motion begins
set "T3=9.7" :: Time T3 when slow motion ends
set "T4=11" :: End time T4
set "SPEED=5" :: Speed factor, smaller than 1 = timelapse, greater than 1 = slow motion
set "FR=30" :: Output framerate
set "OUT=out.mp4" :: Output video

ffmpeg -i %IN% -filter_complex "[0:v]trim=%T1%:%T2%,setpts=PTS-STARTPTS[v1];[0:v]trim=%T2%:%T3%,setpts=%SPEED%*(PTS-


STARTPTS)[v2];[0:v]trim=%T3%:%T4%,setpts=PTS-STARTPTS[v3];[v1][v2][v3]concat=n=3:v=1" -an -r %FR% -q:v 2 -y out.mp4

pause

2.52 Time Remapping

This is an example for a gradual ramp into and out of slow motion:
ffmpeg -f lavfi -i testsrc2=size=vga:duration=10:rate=20 -lavfi "^
[0]trim=0.0:3.2,setpts=(PTS-STARTPTS)[1];^
[0]trim=3.2:3.6,setpts=(PTS-STARTPTS)/0.80[2];^
[0]trim=3.6:4.0,setpts=(PTS-STARTPTS)/0.60[3];^
[0]trim=4.0:6.0,setpts=(PTS-STARTPTS)/0.40[4];^
[0]trim=6.0:6.4,setpts=(PTS-STARTPTS)/0.60[5];^
[0]trim=6.4:6.8,setpts=(PTS-STARTPTS)/0.80[6];^
[0]trim=6.8:10.0,setpts=(PTS-STARTPTS)[7];^
[1][2][3][4][5][6][7]concat=n=7:v=1" -y out.mp4

pause

This is an example for a 10s input video where the framerate changes linearly from 20 to 10:
ffmpeg -f lavfi -i testsrc2=size=vga:duration=10:rate=20 -lavfi "
[0]trim=0:1,setpts=(PTS-STARTPTS)/0.975[1];
[0]trim=1:2,setpts=(PTS-STARTPTS)/0.925[2];
[0]trim=2:3,setpts=(PTS-STARTPTS)/0.875[3];
[0]trim=3:4,setpts=(PTS-STARTPTS)/0.825[4];
[0]trim=4:5,setpts=(PTS-STARTPTS)/0.775[5];
[0]trim=5:6,setpts=(PTS-STARTPTS)/0.725[6];
[0]trim=6:7,setpts=(PTS-STARTPTS)/0.675[7];
[0]trim=7:8,setpts=(PTS-STARTPTS)/0.625[8];
[0]trim=8:9,setpts=(PTS-STARTPTS)/0.575[9];
[0]trim=9:10,setpts=(PTS-STARTPTS)/0.525[10];[1][2][3][4][5][6][7][8][9][10]concat=n=10:v=1" -y out.mp4

pause
The length of the output video is 13.65s

Use the following example carefully, as I'm not 100% convinced that the approach is correct. It is based on a posting from Nicolas George on the
FFmpeg-user mailing list, September 23, 2019. In the first equation it's unclear whether t is the time in the input video or in the output video.
rem > So, to compute the timestamp of a frame with variable speed:
rem >
rem > * Express your frame rate as a complete formula: t → v
rem >
rem > * Integrate it: t → f.
rem >
rem > * Find the reciprocal: f → t.
rem
rem Let's assume we have a 10s video and the framerate changes linearly from 20 at the beginning to 10 at the end:
rem v = 20 - t v(0) = 20 v(10) = 10
rem
rem After integrating we get: f = 20 * t - 0.5 * t^2
rem
rem The inverse function is: t = 20 - sqrt(400 - 2 * f)

rem Create a test video with framerate=20 and length=10s:


ffmpeg -f lavfi -i testsrc2=size=vga:duration=10:rate=20 -y test.mp4

rem Apply the time remapping:


ffmpeg -i test.mp4 -lavfi setpts='(20-sqrt(400-2*N))/TB' -y out.mp4

pause

The resulting video gets slower towards the end (too slow, in fact), and the length is 18.95s, which seems to be wrong. With a constant framerate of 20
the length is 10s, with a constant framerate of 10 the length is 20s, and if the framerate changes from 20 to 10, the length should be about 15s. I don't fully
understand what's going on here.

Note: It's much easier to do time remapping in DaVinci Resolve.

Keywords for searching: "Time remapping", "Time ramp", "Slow motion ramp", "Speed ramp"

2.59 Combine multiple videos with concat demuxer

The concat demuxer combines several videos without re-encoding. It's very fast.
rem Final cut with concat demuxer

ffmpeg -f concat -i concat_list.txt -c copy -y MyVideo.mp4

pause

You simply write all existing scenes into a text file (here: concat_list.txt), which looks like this:
file text1.mp4 :: 10 Title: A year in the woods
file text2.mp4 :: 10 When and where
file Videos/scene20.mp4 :: 12 Live video in the wood
# This is a comment
file text22.mp4 :: 10 In 15 months...
file Videos/scene22.mp4 :: 52 Live video, camera
file text98.mp4 :: 10 the end

To the right of the double colons are optional comments (e.g. the length of the scenes and a short description). Comments can also begin with #.
This method, however, requires that all scenes have
• the same size (width and height)
• the same pixel format
• the same video codec
• the same framerate
• the same audio codec
• the same number of audio tracks (take care when you use a camera which writes only a mono soundtrack)
• the same audio sample rate

If one of these conditions isn't met, an error message is issued. You can then look at the properties of the files with FFprobe or Exiftool to find out where
the files differ, for example with the FFprobe command sketched below.
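
This is a sketch of such an FFprobe command (the list of entries is an assumption; adjust it as needed):

ffprobe -v error -show_entries stream=codec_name,width,height,pix_fmt,r_frame_rate,sample_rate,channels input.mp4

pause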

How to create a concat_list file which contains all *.mp4 files from a folder:

if exist concat_list.txt del concat_list.txt

(for %%G in (*.mp4) do @echo file '%%G') >> concat_list.txt

pause

See also here: https://trac.ffmpeg.org/wiki/Concatenate

2.60 Combine multiple videos with concat filter

In this example the concat filter is used for input videos of the same size and no audio.
Each -ss and -t pair specifies the start time and length of the next input file. You can remove these options if you want to use the full videos.
The value n=3 passed to the concat filter must match the number of input files.
This filter re-encodes the videos, so the process is slow, but you can also specify the encoding quality.
set "I1=my_video1.mp4" :: Input video 1
set "S1=0" :: Set start time 1
set "L1=4" :: Set length 1
set "I2=my_video2.mp4" :: Input video 2
set "S2=3" :: Set start time 2
set "L2=3" :: Set length 2
set "I3=my_video3.mp4" :: Input video 3
set "S3=6" :: Set start time 3
set "L3=2" :: Set length 3
set "OUT=out.mp4" :: Output video

ffmpeg -ss %S1% -t %L1% -i %I1% -ss %S2% -t %L2% -i %I2% -ss %S3% -t %L3% -i %I3% -lavfi "concat=n=3:v=1:a=0" -an %OUT%

pause

See also here: https://trac.ffmpeg.org/wiki/Concatenate

Note: Cutting the input videos to the required section can also be done with the "trim" filter.

The opposite of the "concat" filter is the "segment" filter, which splits a video into several streams.

2.61 The "fps" filter

This filter is described in detail on Jim DeLaHunt's website: http://blog.jdlh.com/en/2020/04/30/ffmpeg-fps-documented/
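
A minimal usage sketch (the filenames are assumptions): this converts a video to 25 frames per second, duplicating or dropping frames as required:

ffmpeg -i in.mp4 -vf fps=25 -y out.mp4

pause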

2.62 Split a video in multiple segments

A video can be split in multiple segments with the segment muxer. All segments will have the same length, except the last one.
set "IN=my_video.mov" :: Input video
set "L=10" :: Segment length in seconds
seu "OUT=out%%2d.mov" :: Output filename

ffmpeg -i %IN% -f segment -segment_time %L% -c copy %OUT%

pause
Note: The duration of the segments can also be expressed as minutes:seconds

This batch file extracts a segment with known start and end frame numbers:
set "start=100" :: First frame number
set "end=200" :: Last frame number

set /a startms=%start%*1001/30 :: This calculation is for framerate 30000/1001 = 29.97
set /a endms=(%end%+1)*1001/30 :: Note that in the batch file only integer arithmetic is possible!
:: It's important to do the multiplication first and then the division

ffmpeg -i in.mp4 -ss %startms%ms -to %endms%ms -c copy -y out.mp4

pause

Note: The above command line with "-c copy" works only for intraframe codecs, meaning that all frames are I-frames. For interframe codecs you must
remove "-c copy", but then the video will be re-encoded and the process is much slower.

I found the following hint here: http://ffmpeg.org/pipermail/ffmpeg-user/2023-July/056637.html


Try this "template" to generate .ts segment files:
ffmpeg -i <source> -y -codec copy -bsf:v h264_mp4toannexb -f segment -segment_time <seconds> -segment_time_delta 0.05 %03d.ts
MPEG-TS container is concatenable directly using "cat" from linux, after that you could use ffmpeg to change the container, .ts to .mov or .mp4 without
reencoding (-codec copy). May be this doesn't generate those extra keyframes, the option -segment_time_delta 0.05 has something to do with the
boundaries where ffmpeg will make a segment, i remember it was better this way, but i don't remember exactly why.

2.65 Stack videos side by side (or on top of each other)

set "IN1=left.mp4"
set "IN2=right.mp4"
set "OUT=out.mp4"
rem use "hstack" for horizontal stacking and "vstack" for vertical stacking

ffmpeg -i %IN1% -i %IN2% -filter_complex hstack -an -shortest -c:v mpeg4 -y %OUT%

pause

Note: If the videos have different width or height, use the "xstack" filter instead.

2.66 Horizontal and vertical flipping

This can be done with the "hflip" and "vflip" filters, as sketched below.
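
A minimal sketch (the filenames are assumptions):

rem Mirror horizontally:

ffmpeg -i in.mp4 -vf hflip -y out.mp4

rem Flip vertically:

ffmpeg -i in.mp4 -vf vflip -y out.mp4

pause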

2.67 Stereo3d filter

The stereo3d filter can be used for splitting a square (fisheye) image into two halves (top and bottom) and stacking the halves side by side (left and right).
In this example a 4096x4096 fisheye image is split into two halves and the output size is 8192x2048:
rem Test pattern from http://www.paulbourke.net/dome/testpattern/4096.png

ffmpeg -i 4096.png -vf stereo3d=abl:sbsl -y out1.png

pause
Note: If left/right images are swapped, use "sbsr" instead of "sbsl".

2.68 Stack four videos to a 2x2 mosaic

set "IN1=topleft.mp4"
set "IN2=topright.mp4"
set "IN3=bottomleft.mp4"
set "IN4=bottomright.mp4"
set "OUT=mosaic.mp4"

ffmpeg -i %IN1% -i %IN2% -i %IN3% -i %IN4% -filter_complex [0:v][1:v]hstack[t];[2:v][3:v]hstack[b];[t][b]vstack -an ^
 -shortest -c:v mpeg4 -q:v 1 -y %OUT%

pause

Other method using xstack:


set "IN1=topleft.mp4"
set "IN2=topright.mp4"
set "IN3=bottomleft.mp4"
set "IN4=bottomright.mp4"
set "OUT=mosaic.mp4"

ffmpeg -i %IN1% -i %IN2% -i %IN3% -i %IN4% -filter_complex "xstack=inputs=4:layout=0_0|0_h0|w0_0|w0_h0" -shortest %OUT%

pause

Display 4 inputs in a vertical 1x4 grid; note that the input videos may have different widths (vstack can't handle this case).

ffmpeg -i %IN1% -i %IN2% -i %IN3% -i %IN4% -filter_complex "xstack=inputs=4:layout=0_0|0_h0|0_h0+h1|0_h0+h1+h2" %OUT%

pause

2.79 Vignetting

Vignetting at the edge of the image can be compensated with the "vignette" filter. "mode=backward" makes the corners brighter and "mode=forward"
makes them darker. The value must be set so that the corners are neither too bright nor too dark.
Example:
ffmpeg -i input.png -vf vignette=a=0.5:mode=backward:x0=(w-1)/2:y0=(h-1)/2 -y out.png

pause

Note: The "a" value is clipped to the [0...pi/2] range.


Note: The default values for x0, y0 are w/2 and h/2. With these values, the vignetting effect isn't exactly centered in the frame. It's offset by 0.5 pixels.
That's why you should always use x0=(w-1)/2 and y0=(h-1)/2.

This is the formula for forward mode: output = input / cos^4(a * dist / dist_max)
This is the formula for backward mode: output = input * cos^4(a * dist / dist_max)
where a = "angle" option, dist = distance from the pixel to (x0,y0), dist_max = half of the image diagonal

angle           Corner pixel in forward mode (input = 1)   Corner pixel in backward mode (input = 1)
PI/16           1.08                                       0.925
PI/8            1.37                                       0.723
PI/5 (default)  2.33                                       0.428
PI/4            4                                          0.25
PI/2            infinite                                   0

2.80 Subtracting a darkframe

Noise, hot pixels and amplifier glow in a low-light video can be reduced by subtracting a darkframe. Make a dark video with the same settings and at the same temperature as your main video. The only difference is that you put the cap on the lens. Then you can average many (up to 1024) frames from the dark video and save the darkframe losslessly as a 16-bit PNG:
set "DARKVID=Dark.mov" :: Dark video

ffmpeg -i %DARKVID% -vf "tmix=128,format=rgb48" -frames 1 -y dark.png

pause

Now you can subtract this darkframe from all frames of your video:
set "IN=meteor.mov" :: Input video
set "OUT=meteor-dark.mp4" :: Output video

ffmpeg -i %IN% -i dark.png -filter_complex "format=rgb48[a];[a][1]blend=all_mode=subtract" -y %OUT%

pause

2.151 Video stabilization

Videos can be stabilized in a one-pass process with "deshake" filter or (better) in a two-pass process with "vidstabdetect" and "vidstabtransform" filters.
set "IN=C1000650.MOV" :: Input video
set "OUT=C0650_stab.MOV" :: Output video

rem Stabilize the video

ffmpeg -i %IN% -vf vidstabdetect -y dummy.mov


del dummy.mov
ffmpeg -i %IN% -vf vidstabtransform -y %OUT%

pause
Note: The vidstabdetect filter writes the stabilization data by default to the file "transforms.trf", and the vidstabtransform filter reads by default from this file.
In the above example the output of the filter chain is encoded to the file "dummy.mov", which is deleted in the next line because it's not required. This can be simplified by encoding to the null device:
ffmpeg -i %IN% -vf vidstabdetect -an -f null -
Note: However, if the option show=1 is used, then the output video contains the detected stabilization vectors and should not be deleted.

This is the same thing, but with 10-bit DNxHD (Digital Nonlinear Extensible High Definition) codec for importing in the free DaVinci Resolve version:
set "IN=C1000645.MOV" :: Input video
set "OUT=C0645_stab.MOV" :: Output video

rem Stabilize the video

ffmpeg -i %IN% -vf vidstabdetect -an -f null -


ffmpeg -i %IN% -vf vidstabtransform -map_metadata 0 -pix_fmt yuv422p10le -c:v dnxhd -profile:v 4 -c:a pcm_s24le -color_range pc -movflags write_colr -y %OUT%

pause

Two notes from Andrei B.:
• The "vidstab" filter has the drawback that it gets confused by rolling shutter from CMOS sensors.
• A very good (and free) tool that can do much better is VirtualDub with the Deshaker 3.0 filter. This filter has a rolling shutter factor input and can greatly improve on reducing the wobbliness of a stabilized video. Its documentation includes instructions on how to measure your camera's rolling shutter factor.

Some notes from Steffen Richter:


• "shakiness" option seems to have no effect.
• "tripod" may lead to strong z rotations, if no relevant movements are detected.
• "relative": Different from the documentation, the default seems to be "1", which also makes sense.
• "zoomspeed": Values between 0.01 and 0.1 are useful.

This is a test for comparing the "deshake" filter with the "vidstabdetect/vidstabtransform" filters:
rem deshake
ffmpeg -i test.mov -lavfi "split[a][b];[a]deshake=rx=64:ry=64:edge=0[c];[b][c]hstack" -y deshake.mov

rem vidstabdetect
ffmpeg -i test.mov -lavfi vidstabdetect=shakiness=10:show=1 -y dummy.mov

rem vidstabtransform with optzoom=0 (which means no zoom, so that borders are visible)
ffmpeg -i test.mov -lavfi "split[a][b];[a]vidstabtransform=smoothing=50:crop=black:optzoom=0[c];[b][c]hstack" -y vidstab.mov

rem vidstabtransform with optzoom=1 (which means optimized zoom, so that no borders are visible)
ffmpeg -i test.mov -lavfi "split[a][b];[a]vidstabtransform=smoothing=50:crop=black:optzoom=1[c];[b][c]hstack" -y vidstab_zoom.mov

pause

By comparing the results, I found that the two-stage solution with "vidstabdetect/vidstabtransform" gives much better results than the one-stage
"deshake" solution.
In "vidstabtransform" it's possible to set the "smoothing" parameter which defines the number of frames (2 * smoothing + 1) for low-pass filtering of the
camera movements. The default "smoothing" value is 10 (which means 21 frames), but i found it useful to use higher values between 20 and 50.
"vidstabtransform" does correct x and y translations and also rotations.
The option "optzoom=1" does automatically choose a suitable zoom factor, so that there are no no-data areas at the borders visible.

Note: I think there is a bug in vidstabtransform: the option crop=black doesn't work as described. The no-data borders are filled with the colors of the edge of the image; black borders appear only when the image shift becomes very large. But that doesn't matter, because with "optzoom=1" (which is the default) the no-data borders are cropped away anyway.
See also: http://oioiiooixiii.blogspot.com/2016/09/ffmpeg-video-stabilisation-using.html

2.152 Remove linear drift

This is an example for removing a linear drift from a video. The first and the last image are extracted and the x,y coordinates of an object are measured in these two images. The difference between the coordinates is the motion vector. Then the "crop" filter is used for cropping a window out of the input video, where the top left coordinates of the crop window are a linear function of time. The input size is 2048x2048, and the output size is reduced to 1856x1856 (which is the input size minus the larger of the two motion vectors).
rem Extract the first frame
ffmpeg -i M31-STATT-2020.mov -frames 1 -y first_frame.jpg

rem Extract the last frame


ffmpeg -sseof -0.2 -i M31-STATT-2020.mov -update 1 -y last_frame.jpg

rem Coordinates of object in first frame: 1026, 1091


rem Coordinates of object in last frame: 1131, 1282
rem Motion vector: +105, +191
rem Duration: 17.23s

ffmpeg -i M31-STATT-2020.mov -lavfi crop=x='105/17.23*t':y='191/17.23*t':w=1856:h=1856 -y out1.mp4

pause

2.157 Noise reduction

FFmpeg has several filters for video noise reduction (denoising):


atadenoise: Apply an Adaptive Temporal Averaging Denoiser to the video input. Very fast, temporal only with no motion compensation; LGPL. Example: atadenoise=0a=0.2:1a=0.2:2a=0.2:0b=0.3:1b=0.3:2b=0.3

bm3d: Denoise frames using the Block-Matching 3D algorithm. Very very slow, currently implemented as spatial only; the algorithm is considered one of the state-of-the-art denoisers; LGPL.

chromanr: Reduce chrominance noise. This filter calculates the absolute difference of the Y components (contrary to the official documentation!) of the current pixel and a neighbour pixel from a rectangular neighbourhood. Absolute differences are also calculated for the U and V components. A neighbour pixel is used for averaging if the sum of all three absolute differences is lower than the threshold. Only the U and V components are averaged; the Y component remains unchanged. With the "stepw" and "steph" options it's possible to use only a subset of the neighbour pixels for averaging.

dctdnoiz: Denoise frames using 2D DCT (frequency domain filtering). Very very slow, spatial only, blurs too much; LGPL.

fftdenoiz: Denoise frames using 3D FFT (frequency domain filtering). Slow, spatial and limited temporal, using the Fast Fourier Transform; may introduce ringing with bad settings; LGPL.

hqdn3d: A high precision/quality 3D denoise filter. It aims to reduce image noise, producing smooth images and making still images really still. It should enhance compressibility. Fast, both spatial and temporal; basically a lowpass that destroys high frequencies; blurs with extreme settings; GPL. Example: hqdn3d=4:4:9:9

nlmeans: Denoise frames using the Non-Local Means algorithm. Very slow, currently implemented as spatial only; the algorithm is considered one of the state-of-the-art denoisers; LGPL.

owdenoise: Apply an Overcomplete Wavelet denoiser. Very very very slow, spatial only, wavelet; GPL. Example: owdenoise=ls=25

removegrain: Spatial denoiser for progressive video. Fast, spatial only, limited use case.

vaguedenoiser: Apply a wavelet-based denoiser. Slow, spatial only, pretty good; LGPL.

tmix: Noise reduction by averaging up to 1024 successive frames. Not suitable for moving objects. Example: tmix=frames=20

tmedian: Noise reduction by calculating the median out of up to 127 successive frames. Example: tmedian=radius=20

Special thanks to Paul B Mahol, who posted most of these notes in the FFmpeg-user list on October 27, 2019.

2.162 Automatic format conversions

Automatic format conversions can be disabled by the option "-noauto_conversion_filters".


Use "-v verbose" for checking where FFmpeg did auto-insert format conversions. Search for the green lines in the console listing.
You can also add "graphmonitor=f=format". This output is shown as text overlaid to the video.

If you are using the "-noauto_conversion_filters" option, you must manually insert the required conversions in the filter chain.
Example without "-noauto_conversion_filters":
ffmpeg -v verbose -f lavfi -i testsrc2=s=svga:d=5,format=yuv422p10le -vf lut3d="VLog_to_V709.cube" -pix_fmt yuv422p10le -c:v h264 -y out.mov

pause

In this example you can see in the console listing that FFmpeg auto-inserted two format conversions: before the lut3d filter from yuv422p10le to rgb48le, and after the lut3d filter from rgb48le to yuv422p10le.
The same example with "-noauto_conversion_filters":
ffmpeg -v verbose -f lavfi -i testsrc2=s=svga:d=5,format=yuv422p10le -vf scale,format=rgb48le,lut3d="VLog_to_V709.cube",scale -noauto_conversion_filters -pix_fmt yuv422p10le -c:v h264 -y out.mov

pause

As you can see, there are also two "scale" filters required. It's hard to understand why. In this case the second format conversion can be omitted because
it's redundant with the following "-pix_fmt" option.
The following explanation was written by Gyan Doshi in the ffmpeg user list on September 14, 2020:
"Each filter presents a list of input formats they can work with and a list of output formats they can directly generate. The framework inspects adjacent
filters and sets a compatible common format for the outputs and inputs when possible. If not, it sets one of the available output formats for the preceding
filter and one from input formats for the following filter and inserts a scale filter to convert between those. This process is format negotiation. The format
filter doesn't carry out the conversion itself - it inserts scale which in turn invokes libswscale. scale without any args defaults to the source W and H. But
for pixel formats, its output format is constrained by the following format filter. That triggers a format conversion by libswscale."

For more details about the scale filter see also: https://trac.ffmpeg.org/wiki/Scaling

Some important notes from the above website:


• When going from BGR (not RGB) to yuv420p the conversion is broken (off-by-one). Use -vf scale=flags=accurate_rnd to fix that.
• yuvjxxxp pixel formats are deprecated. Yet for x265 the workaround was implemented, but not for jpeg2000 and AV1. For those -vf
scale=out_range=pc should be used.
• Conversion from YCbCr limited to RGB 16 bit is broken; use zscale instead of swscale
• Limited range RGB is not supported at all.
• Dither can be turned off using -vf scale=sws_dither=none
• One should always remember that YCbCr 4:4:4 8 bit is not enough to preserve RGB 8 bit, YCbCr 4:4:4 10 bit is required.
• The default for matrix in untagged input and output is always limited BT.601

2.166 Video Codecs

-c:v mpeg4        This is the older MP4 codec, which is poorly documented.
                  See also https://trac.ffmpeg.org/wiki/Encode/MPEG-4
-c:v libxvid      This MP4 codec is using the external library "libxvid". Search for "libxvid" in the documentation.
                  See also http://www.ffmpeg.org/ffmpeg-codecs.html#libxvid
-c:v libx264      Newer H264 codec with better compression than mpeg4, but it's possible that the videos don't play on older computers. It's also possible to create lossless videos, as described in the next link:
                  See also https://trac.ffmpeg.org/wiki/Encode/H.264
-c:v h264         ???
-c:v h264_nvenc   This is a hardware accelerated version of the h264 codec. It accepts the undocumented -cq option (constrained quality).
-c:v libx264rgb   The libx264rgb encoder is the same as libx264, except it accepts packed RGB pixel formats as input instead of YUV.
                  See also http://www.ffmpeg.org/ffmpeg-codecs.html#libx264_002c-libx264rgb
-c:v libx265      H265 (= HEVC) is newer than H264; it has better compression than H264 and about the same quality. It's possible that the videos don't play on older computers.
                  See also https://trac.ffmpeg.org/wiki/Encode/H.265
                  See also http://www.ffmpeg.org/ffmpeg-codecs.html#libx265
-c:v hevc         = H265
-c:v hevc_nvenc   This is a hardware accelerated codec. It accepts the undocumented -cq option (constrained quality).
-c:v prores_ks    Apple ProRes encoder, example:
                  ffmpeg -i input.MOV -vcodec prores_ks -pix_fmt yuva444p10le -profile:v 4444 -bits_per_mb 8000 -s 1920x1080 out.mov
                  See also http://www.ffmpeg.org/ffmpeg-codecs.html#ProRes
                  See also https://trac.ffmpeg.org/wiki/Encode/VFX
-c:v dnxhd        This codec is suitable for converting 10-bit videos from the GH5S camera into a format that's readable by the free DaVinci Resolve software. There isn't much documentation available for this codec and its options. Example:
                  ffmpeg -i input.mov -map_metadata 0 -pix_fmt yuv422p10le -c:v dnxhd -profile:v 4 -c:a pcm_s24le -color_range pc -movflags write_colr out.mov
-c:v rawvideo     This means the output format is uncompressed raw video. It's good for lossless intermediate video files. Example:
                  ffmpeg -i in.mp4 -c:v rawvideo -f rawvideo -an -y out.raw (This file can't contain audio)
                  The drawback of this format is that you have to know the framerate, size and pixel format when you read such a file. This problem can be avoided by using the *.nut file format. This format can also contain audio. Example:
                  ffmpeg -i in.mp4 -c:v rawvideo -y out.nut
-c:v ffv1         FFV1 is a lossless video codec which comes in two versions, 1 and 3. Version 3 is selected with "-c:v ffv1 -level 3".
                  See also https://trac.ffmpeg.org/wiki/Encode/FFV1

For the "mpeg4" and "libxvid" codecs you can select a video quality level with -q:v n , where n is a number from 1-31, with 1 being highest
quality/largest filesize and 31 being the lowest quality/smallest filesize. This is a variable bit rate mode.
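A minimal example (a sketch; the filenames and the quality level are placeholders):

ffmpeg -i in.mp4 -c:v mpeg4 -q:v 3 -y out.mp4

pause
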
The constant rate factor (CRF) can be set with the -crf parameter. Use this mode if you want to keep the best quality and don't care about the file size. The CRF range is 0-51 for 8-bit x264 and 0-63 for 10-bit. 0 is lossless (for 8-bit, but not for 10-bit), 23 is the default, and 51 is the worst possible quality.
For lossless 10-bit, -qp 0 is required.
This is explained in more detail here: https://www.pixeltools.com/rate_control_paper.html
Use a preset with the -preset parameter. Possible options are ultrafast, superfast, veryfast, faster, fast, medium (this is the default), slow, slower and
veryslow. A preset is a collection of options that will provide a certain encoding speed to compression ratio. A slower preset will provide better
compression. Use the slowest preset that you have patience for.
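For example, a typical libx264 command line combining both options might look like this (a sketch; the CRF value and preset are only reasonable starting points):

ffmpeg -i in.mp4 -c:v libx264 -crf 18 -preset slow -y out.mp4

pause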

Some interesting details about video quality settings (found here: http://ffmpeg.org/pipermail/ffmpeg-user/2023-April/056330.html ):
qscale (-q:v) was used by old codecs up until MPEG-4 ASP (stuff like DivX, Xvid etc.) to set variable bitrate with a number between 1 and 31, where the highest number was the lowest quality. It has been superseded by -crf in modern codecs like H264 and newer.
-crf tries to keep the same quality of the image by changing the bitrate as needed. With -crf also the lower the number, the better the quality. Most modern codecs interpret -crf 1 as lossless coding. So -crf keeps a constant quality, but in doing so can use more or less bandwidth depending on the source material.
If you are constrained by bandwidth you might better use -cq (constrained quality) or VBV, if available for your codec, if keeping quality is desired up until a certain limit.

The -tune parameter can be set to these options:
film use for high quality movie content; lowers deblocking
animation good for cartoons; uses higher deblocking and more reference frames
grain preserves the grain structure in old, grainy film material
stillimage good for slideshow-like content
fastdecode allows faster decoding by disabling certain filters
zerolatency good for fast encoding and low-latency streaming

List all possible internal presets and tunes:


ffmpeg -hide_banner -f lavfi -i nullsrc -c:v libx264 -preset help -f mp4 -

pause

For lossless codecs, see also FFV1 and VP9.

2.170 Video codecs with alpha channel

These are a few examples for exporting videos with alpha channel:
rem PNG images (lossless compression, output pixel format is rgba)
ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=0.2 -pix_fmt rgba -y test_png_8bit_%%5d.png

rem PNG images (lossless compression, output pixel format is rgba64be)


ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=0.2 -pix_fmt rgba64be -y test_png_16bit_%%5d.png

rem Apple ProRes (in all four cases the output pixel format is yuva444p12le)
ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt rgba -c:v prores_ks -y test_prores1.mov
ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt rgba64le -c:v prores_ks -y test_prores2.mov
ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt yuva444p10le -c:v prores_ks -y test_prores3.mov
ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt yuva444p12le -c:v prores_ks -y test_prores4.mov

rem Rawvideo (uncompressed, output pixel format is yuva444p10le)


ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt yuva444p10le -c:v rawvideo -y test_rawvideo1.nut

rem Rawvideo (uncompressed, output pixel format is yuva444p12le)


ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt yuva444p12le -c:v rawvideo -y test_rawvideo2.nut

rem Rawvideo (uncompressed, output pixel format is rgba)


ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt rgba -c:v rawvideo -y test_rawvideo3.nut

rem Rawvideo (uncompressed, output pixel format is rgba64le)


ffmpeg -f lavfi -i testsrc2=s=1920x1080:d=4 -pix_fmt rgba64le -c:v rawvideo -y test_rawvideo4.nut

pause

Note: The *.nut format is unique for FFmpeg.

Note: alpha=0 means transparent, alpha=255 means opaque

See also https://www.digitalrebellion.com/blog/posts/list_of_video_formats_supporting_alpha_channels

2.173 Metadata

Global metadata can be saved in a text file as follows:


ffmpeg -i input.mp4 -f ffmetadata metadata.txt

pause

If you also need the metadata from the video and audio streams (which may contain more information), use this command line:
ffmpeg -i input.mp4 -c copy -map_metadata 0 -map_metadata:s:v 0:s:v -map_metadata:s:a 0:s:a -f ffmetadata metadata.txt

pause

The metadata can be re-inserted into a video as follows:


ffmpeg -i input.mp4 -i metadata.txt -map_metadata 1 -codec copy output.mp4

pause

Write metadata "title" to mp4 video without re-encoding:


ffmpeg -i input.mp4 -metadata title="This is the Title" -acodec copy -codec copy -copyts output.mp4

pause

Unfortunately FFmpeg can't insert the metadata that is required for a spherical 360° video.
This website describes which metadata tags are actually written to the output files: https://wiki.multimedia.cx/index.php/FFmpeg_Metadata
There is also a list on this page: https://trac.ffmpeg.org/wiki/FilteringGuide
FFmpeg can't write EXIF metadata to *.jpg images (February 2021).

Note: By default, "-filter_complex" or "-lavfi" don't copy the metadata from the input to the output.
It's also not possible to copy the metadata from the input to the output with "-map_metadata 0".
Unconfirmed: Add "-map_metadata 0" after "-map 0"?

The metadata map must be explicitly specified. For example, this would be -map_metadata:s:v:0 0:s:v:0. For videos with multiple streams however, all
output streams (regardless of whether or not they are filtered) would have to be specified this way to preserve all stream metadata (since a single map
disables all mappings). (Source: https://trac.ffmpeg.org/ticket/9649)
How to copy the creation date of the input file to the output file: https://stackoverflow.com/questions/54456493/ffmpeg-keep-original-file-date

2.174 Video filters "copy" and "null"

These filters are only for testing, for example when you want to disable part of a filter chain.
The "null" filter does really nothing, the output is the same as the input.
The "copy" filter copies the old frame and deletes the old one. The output is the same as with the "null" filter.
For more details about "null" filter see also: https://trac.ffmpeg.org/wiki/Null
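For example, if you want to temporarily disable the "hflip" filter in a chain, you can replace it by "null" so that the rest of the command line stays unchanged (a sketch with placeholder filenames):

rem Original filter chain:
ffmpeg -i in.mp4 -vf "hflip,scale=1280:720" -y out.mp4

rem Flipping disabled, scaling unchanged:
ffmpeg -i in.mp4 -vf "null,scale=1280:720" -y out.mp4

pause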

2.175 Re-numbering images

Cameras normally create images with 4-digit numbers. When the counter (in Canon cameras) overflows, the number changes from 9999 to 0001. That means the number 0000 is missing and the numbers aren't continuously increasing, as is expected by FFmpeg. This problem can be solved with this sequence of console commands:
ren IMG_1*.* IMG_2*.*
ren IMG_0*.* IMG_1*.*
ren IMG_9*.* IMG_0*.*

The first two commands add 1000 to those numbers that begin with 0 or 1. The last command subtracts 9000 from those numbers that begin with 9.

Note: A list of variables that can be printed with "-show_entries" can be found in the file "ffprobe.xsd" which is in the source tree under doc/ffprobe.xsd

Note: Tracking of objects works better in DaVinci Resolve.

2.204 Detect black frames and replace them by previous frame

See https://video.stackexchange.com/q/23589/
See "blackframe" and "blackdetect" filters.

2.205 Image formats

FFmpeg supports these image formats. This list is incomplete; a longer (but also incomplete) list can be found at http://www.ffmpeg.org/general.html

Format   compressed?  lossless?                    16-bit?            Notes
BMP      no           yes                          no                 very big file size
DNG      yes          yes                          yes                "Adobe Digital Negative" format, recommended for saving RAW images from cameras. Use the Adobe DNG converter to make these files. Warning: FFmpeg's DNG decoder doesn't work correctly with most images.
FITS     yes          yes                          yes                Flexible Image Transport System, a popular image format in astronomy
GIF      yes          yes                          no                 this is obsolete, use PNG instead
JPG      yes          no                           no                 recommended for 8-bit images if small file size is required
JXL      yes          yes or no                    yes                JPEG XL format, this is the successor of JPG, see https://en.wikipedia.org/wiki/JPEG_XL
PAM      no           ?                            ?                  "Portable Arbitrary Map"
PFM      no           yes                          no, 32-bit float   "Portable Float Map", pixel format is "grayf32"
PGM      no           yes                          yes                "Portable Graymap", these files are required for the remap filter. FFmpeg can read binary PGM (P5) files and ASCII PGM (P2) files, but for output only binary PGM (P5) is supported. PGM files contain values in the range [0..65535]. Negative values aren't possible, but FFmpeg gives no warning if a negative number is found in a P2 file.
PHM      no           yes, but limited precision   yes, 16-bit float  "Portable Half Float Map" (1 sign bit, 5 exponent bits, 10 fraction bits), see https://en.wikipedia.org/wiki/Half-precision_floating-point_format
PGMYUV   no           yes                          ?                  This is an FFmpeg variant of the binary PGM format
PNG      yes          yes                          yes                recommended for lossless saving of 8-bit or 16-bit images, can be forced by -c:v png
PPM      no           yes                          yes                "Portable Pixmap", FFmpeg can read binary PPM (P6) or ASCII PPM (P3) files, but for output only binary PPM (P6) is supported.
TGA      yes          yes                          no                 this is obsolete, use PNG instead
TIFF     no           yes                          yes

2.223 Capture video from the desktop or from a window

Capture the entire desktop without audio:


set "FR=10" :: Framerate

ffmpeg -f gdigrab -framerate %FR% -i desktop -y out.mp4

pause

Capture the entire desktop with audio:


Go to the Windows device manager and choose "Audio Devices" (this can be found in "Sound settings" --> "Manage sound devices"). Select the "Recording" tab. Enable the "Stereomix" device. Click on "Properties". Under the "Listen" tab, check the box "Listen to this device". Under the "Levels" tab, disable the speaker.
Then run this command to find out the name of the "Stereomix" device:
ffmpeg -list_devices true -f dshow -i dummy

pause

Enter the name in the following command line, in this case "Stereomix (Realtek(R) Audio)". If the audio volume is too low, you can increase it with the
"volume" filter.
ffmpeg -f gdigrab -i desktop -f dshow -i audio="Stereomix (Realtek(R) Audio)" -af volume=3 -y grabbed_video.mp4

pause

Or alternatively with Nvidia driver:


ffmpeg -f gdigrab -i desktop -f dshow -i audio="Stereomix (Realtek(R) Audio)" -af volume=3 -c:v h264_nvenc -y grabbed_video.mp4

pause

Note: This doesn't always work well. There might be missing frames in the captured video. Suggested workaround: Use "OBS Studio" for capturing the
desktop or a window, with or without audio.
Capture a region of the desktop:
set "SHOW=1" :: 0 = do not show the border
:: 1 = show the border of the region on the desktop
set "FR=10" :: Framerate
set "SIZE=500x300" :: Size of the region
set "X=20" :: Left edge of region
set "Y=50" :: Top edge of region

ffmpeg -f gdigrab -show_region %SHOW% -framerate %FR% -video_size %SIZE% -offset_x %X% -offset_y %Y% -i desktop -y out.mp4

pause

Capture a window:
set "TITLE=*new 1 - Notepad++" :: Name of the window
set "FR=10" :: Framerate

ffmpeg -f gdigrab -framerate %FR% -i title="%TITLE%" -y out.mp4

pause
The title is the text in the title line of the window. It's not the name of the process in the task manager. A problem is that the title may change dynamically.
For example, the title of an editor changes as soon as you begin to enter something (a * is inserted at the beginning).
See also: https://trac.ffmpeg.org/wiki/Capture/Desktop

Capture the desktop with "ddagrab" filter, this works via Desktop Duplication API:
ffmpeg -f lavfi -i ddagrab -c:v h264_nvenc -cq 18 -v 40 -t 10 -y out.mp4

pause

Note: This example doesn't work on my notebook computer with Nvidia RTX 2060 graphics.
Error message: "OpenEncodeSessionEx failed: no encode device (1): (no details)", see also https://trac.ffmpeg.org/ticket/10205

3 Audio processing with FFmpeg

3.1 Audio codecs

Codec   FFmpeg option        Remarks
AAC     -c:a aac             Lossy codec, see https://en.wikipedia.org/wiki/Advanced_Audio_Coding and https://trac.ffmpeg.org/wiki/Encode/AAC
MP3     -codec:a libmp3lame  MP3 encoding guide (VBR, CBR and ABR encoding): https://trac.ffmpeg.org/wiki/Encode/MP3
LPCM    -c:a pcm_s16le       Linear pulse-code modulation, this is a lossless codec, see https://en.wikipedia.org/wiki/Pulse-code_modulation

Audio encoders FFmpeg can use, container formats, recommended minimum bitrates to use: https://trac.ffmpeg.org/wiki/Encode/HighQualityAudio
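For example, converting a WAV file to AAC (a sketch; the bitrate is just a typical value):

ffmpeg -i in.wav -c:a aac -b:a 192k -y out.m4a

pause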

3.2 Concat two or more audio files

ffmpeg -i sound1.wav -i sound2.wav -lavfi [0][1]concat=n=2:v=0:a=1 -y sound.wav

pause

Note: It's important to set v=0 because the default value is 1. If you forget this setting, you would get the error message
"Stream specifier ' ' in filtergraph description [0][1]concat=n=2:a=1 matches no streams. "
because FFmpeg doesn't find the video streams.

3.3 Combine multiple audio files with crossfadings
rem Combine multiple audio files with crossfadings

set "FILE1=Sound_Day.wav" :: First audio filename


set "V1=1.0" :: Volume
set "S1=0" :: Start time
set "L1=14" :: Length
set "FILE2=Sound_Night.wav" :: Second audio filename
set "V2=0.2" :: Volume
set "S2=20" :: Start time
set "L2=55" :: Length
set "FILE3=Sound_Day.wav" :: Third audio filename
set "V3=1.0" :: Volume
set "S3=20" :: Start time
set "L3=30" :: Length
set "DUR=5" :: Crossfade duration
set "OUT=sound.mp3" :: Output audio filename

ffmpeg -ss %S1% -i %FILE1% -t %L1% -af volume=%V1% -y s1.wav


ffmpeg -ss %S2% -i %FILE2% -t %L2% -af volume=%V2% -y s2.wav
ffmpeg -ss %S3% -i %FILE3% -t %L3% -af volume=%V3% -y s3.wav
ffmpeg -i s1.wav -i s2.wav -filter_complex acrossfade=d=%DUR% -y s12.wav
ffmpeg -i s12.wav -i s3.wav -filter_complex acrossfade=d=%DUR% -y %OUT%

pause

In this example three audio files are concatenated with crossfadings. For each file the volume, start time and length can be specified.
At first three temporary files are created, then the first two are combined, and in the last step the third file is added.
There is no quality loss because *.wav is an uncompressed audio format.

3.4 Change audio volume

See https://trac.ffmpeg.org/wiki/AudioVolume
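Two minimal sketches with placeholder filenames; the volume can be given as a factor or in dB:

ffmpeg -i in.wav -af volume=0.5 -y quieter.wav
ffmpeg -i in.wav -af volume=6dB -y louder.wav

pause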

3.9 Replace a segment of the audio stream by silence

set "B=10" :: Time where silence begins


set "E=10" :: Time where silence ends

ffmpeg -i in.mp4 -c:v copy -af "volume=enable='between(t,%B%,%E%)':volume=0" out.mp4

pause

3.10 Replace a segment of the audio stream by another stream

In this example the first input stream is passed through to the output, except in the 4s to 6s interval where the second input stream is used:
ffmpeg -lavfi sine=500:d=10 -y audio1.wav
ffmpeg -lavfi sine=2000:d=10 -y audio2.wav

ffmpeg -i audio1.wav -i audio2.wav -lavfi "[0]volume='1-between(t,4,6)':eval=frame[a];[1]volume='between(t,4,6)':eval=frame[b];[a][b]amix" -y out.wav

pause

The same thing can also be done with asendcmd and amix:
ffmpeg -lavfi sine=500:d=10 -y audio1.wav
ffmpeg -lavfi sine=2000:d=10 -y audio2.wav

ffmpeg -i audio1.wav -i audio2.wav -lavfi "asendcmd='4 amix weights '\\\'0 1\\\',asendcmd='6 amix weights '\\\'1
0\\\',amix=weights='1 0'" -y out.wav

pause

The drawback of this method is that escaping is required, which is really hard to understand.
See also http://www.ffmpeg.org/ffmpeg-all.html#Quoting-and-escaping
See also http://www.ffmpeg.org/ffmpeg-all.html#Notes-on-filtergraph-escaping
Note: It's possible to avoid filtergraph escaping if you use "-filter_complex_script".

The same thing can also be simplified with asendcmd and astreamselect:
ffmpeg -lavfi sine=500:d=10 -y audio1.wav
ffmpeg -lavfi sine=2000:d=10 -y audio2.wav

ffmpeg -i audio1.wav -i audio2.wav -lavfi asendcmd="4 astreamselect map 1",asendcmd="6 astreamselect map 0",astreamselect=map=0 -y out.wav

pause

3.11 Add an audio stream to a video, if no audio stream exists

In this example a silent stereo audio stream is added to a video, if (and only if) the video has no audio stream. Otherwise the audio stream remains
unchanged:
rem Make a test video without audio stream:
ffmpeg -f lavfi -i testsrc2=size=vga -t 6 -y none.mp4

rem Make a test video with mono audio stream:


ffmpeg -f lavfi -i testsrc2=size=vga -f lavfi -i sine=1000 -t 6 -y mono.mp4

rem Make a test video with stereo audio stream:


ffmpeg -f lavfi -i testsrc2=size=vga -f lavfi -i sine=1000 -t 6 -ac 2 -y stereo.mp4

ffmpeg -i none.mp4 -f lavfi -i anullsrc=cl=stereo -shortest -y test1.mp4


ffmpeg -i mono.mp4 -f lavfi -i anullsrc=cl=stereo -shortest -y test2.mp4
ffmpeg -i stereo.mp4 -f lavfi -i anullsrc=cl=stereo -shortest -y test3.mp4

pause
In this example test1.mp4 will have a silent stereo audio stream, test2.mp4 will have the original mono audio stream and test3.mp4 will have the original
stereo audio stream.

This example is similar, but the output audio stream is forced to be mono in all three cases:
ffmpeg -i none.mp4 -f lavfi -i anullsrc -shortest -ac 1 -y test1.mp4
ffmpeg -i mono.mp4 -f lavfi -i anullsrc -shortest -ac 1 -y test2.mp4
ffmpeg -i stereo.mp4 -f lavfi -i anullsrc -shortest -ac 1 -y test3.mp4

This example is similar, but the output audio stream is forced to be stereo in all three cases:
ffmpeg -i none.mp4 -f lavfi -i anullsrc -shortest -ac 2 -y test1.mp4
ffmpeg -i mono.mp4 -f lavfi -i anullsrc -shortest -ac 2 -y test2.mp4
ffmpeg -i stereo.mp4 -f lavfi -i anullsrc -shortest -ac 2 -y test3.mp4

How does it work?


If "map" is not specified, FFmpeg selects a single audio stream from among the inputs with the highest channel count. If there are two or more streams
with same number of channels, it selects the stream with the lowest index. anullsrc here has one channel, so it will be passed over except when the
source video has an audio stream.
See also: https://stackoverflow.com/questions/37862432/ffmpeg-output-silent-audio-track-if-source-has-no-audio-or-audio-is-shorter-th

3.12 Stereo --> mix into one mono channel

Both channels of the stereo stream will be downmixed into one mono channel:
ffmpeg -i stereo.wav -ac 1 mono.wav

pause

3.13 Check if both stereo channels are equal

In this example the difference between the left and right stereo channel is calculated and written to a mono file. If the result is silence, then both input
channels are equal. The input can be a video or an audio file.
ffmpeg -i input.mp4 -af "aeval=val(0)-val(1)" mono.wav

pause

3.14 Check if two mono inputs are equal

In this example the difference between the two mono audio channels is calculated and written to a mono file. If the result is silence, then both input
channels are equal.
ffmpeg -i input1.wav -i input2.wav -lavfi [0][1]amerge,aeval=val(0)-val(1) -y mono.wav

pause

3.15 Extract one mono channel from stereo

ffmpeg -i stereo.wav -filter_complex "[0:a]channelsplit=channel_layout=stereo:channels=FR[right]" -map "[right]" front_right.wav

pause

If you only want the left channel use FL instead of FR.


See ffmpeg -layouts for a list of channel layouts.
If you are working with a video file, you can use "-map 0:0 -c:v copy" to preserve the video stream.

3.16 Stereo --> two mono channels

ffmpeg -i stereo.wav -filter_complex "[0:a]channelsplit=channel_layout=stereo[left][right]" -map "[left]" left.wav -map "[right]" right.wav

pause

This command line does the same thing:


ffmpeg -i stereo.wav -map_channel 0.0.0 left.wav -map_channel 0.0.1 right.wav

pause

3.17 Use only one channel of a stereo signal

Use the left channel of the stereo input signal for both channels of the stereo output signal:
rem Make a 5 seconds test sound, left channel 500 Hz sine, right channel 2000 Hz sine:

ffmpeg -f lavfi -i sine=500 -f lavfi -i sine=2000 -lavfi [0][1]join -t 5 -y test.wav

rem Use the left channel of the stereo input signal for both channels of the stereo output signal:

ffmpeg -i test.wav -af "channelmap=0|0" -y out.wav

pause

Note: If you want to use the right channel of the input signal, use "channelmap=1|1" instead.

3.18 Mono --> stereo

Of course both stereo channels will be identical.


ffmpeg -i input.wav -ac 2 output.wav

pause

Other method for the same thing:


ffmpeg -i input.wav -af "channelmap=0|0" output.wav

pause

3.19 Two mono channels --> stereo

ffmpeg -i left.mp3 -i right.mp3 -filter_complex "[0:a][1:a]join=inputs=2:channel_layout=stereo[a]" -map "[a]" output.mp3

pause

3.20 Mix two stereo channels to one stereo channel

ffmpeg -i input1.wav -i input2.wav -filter_complex "[0:a][1:a]amerge=inputs=2,pan=stereo|c0<c0+c2|c1<c1+c3[a]" -map "[a]" output.mp3

pause

Or use this command line, the output may be different:


ffmpeg -i input1.wav -i input2.wav -filter_complex "[0:a][1:a]amerge=inputs=2[a]" -map "[a]" -ac 2 output.mp3

pause

3.21 Create a file with multiple audio streams

In this example one video stream and two audio streams are mapped to the output file. The first is the unchanged input stream "-map 1:a" and the second is the modified stream with higher volume "-map [a]".
ffmpeg -f lavfi -i testsrc2 -f lavfi -i "sine=1k:b=2,channelmap=0|0" -lavfi "[1]volume=3[a]" -map 0:v -map 1:a -map [a] -t 20 -y out.mkv

pause

In FFplay you can toggle between the streams with the "a" key.
In the VLC player you can toggle between the streams with the "b" key.

3.22 How to choose the correct audio volume level

Normally music is normalized to the maximum value (+-32767 for 16-bit). That means the loudest part uses the maximum possible values, just without clipping. You can use the music for your video as-is, or you can make it quieter. If you make it louder, it may be clipped.
Things are totally different when you make your own sound recordings, for example nature sounds.
As the first step, I recommend to calibrate the volume knob of your amplifier. To do this, play several videos from different sources (not your own selfmade videos), and adjust the volume knob so that all videos sound just right; in other words: adjust the volume knob as you would like to hear these videos in the planetarium. To make sure that the frequency response is acceptable, use good 3-way speakers. Leave the volume knob in this position and don't change it.
Now you can adjust the volume of your own video so that it also sounds great in the planetarium. This ensures that you can play all videos (your own and other videos) one after the other. You don't want to touch the volume knob during a presentation!

3.23 Remove low frequencies (wind noise) from an audio track

rem Audio high pass filtering and volume adjustment

set "IN=sound.wav" :: Input soundtrack


set "AS=20" :: Start time
set "LEN=60" :: Length
set "HP=500" :: Cut-off frequency of the high pass filter
set "VOL=10" :: Volume factor
set "OUT=out.mp3" :: Output soundtrack

ffmpeg -ss %AS% -i %IN% -af highpass=f=%HP%,highpass=f=%HP%,highpass=f=%HP%,volume=%VOL% -t %LEN% -y %OUT%

pause

The high pass filter attenuates low frequencies by 12 dB per octave. At the specified cut-off frequency, the filter has 3dB attenuation. In this example, the
same filter is used three times in a row, resulting in 36dB per octave.

3.29 Passing the FFmpeg output to FFplay

This batch file passes the video and audio output of FFmpeg to FFplay
ffmpeg -i in.mp4 (insert some filters here) -f nut - | ffplay -

pause

Note: The "|" character is the piping operator from the batch file. It's not part of the FFmpeg command.
Note: The last "-" character after "ffplay" is an undocumented option of FFplay and means "stdin".

3.30 Record sound and pass the output to FFplay


This batch file records sound from the computer's microphone (or audio input) and passes the output to FFplay
ffmpeg -f dshow -sample_rate 44100 -sample_size 16 -channels 2 -i "audio=Microphone (SoundMAX Integrated" (insert some filters here) -f wav - | ffplay -

pause

3.33 Extract the audio from a video

ffmpeg -i video.mp4 -vn audio.mp3

3.34 Split a video into audio-only and video-only

Audio and video are saved in individual files.


ffmpeg -i input.mp4 -vcodec mpeg2video output_video.m2v -acodec copy output_audio.mp3

3.35 Synchronize audio with video

If you have a video with out-of-sync audio, you can synchronize it as follows. In this example a 0.5 seconds delay is added to the audio stream:
ffmpeg -i input.mp4 -itsoffset 0.5 -i input.mp4 -map 0:0 -map 1:1 -acodec copy -vcodec copy output.mp4

For more info about "-itsoffset", see also: https://trac.ffmpeg.org/wiki/UnderstandingItsoffset


See also the "compensationdelay" filter for delaying audio.
Note: Synchronizing audio with video is very easy with DaVinci Resolve.

3.52 Create an alternating left/right stereo sound

This batch file creates a sound file with this sequence:


Frequency F1 on left channel and silence on right channel for duration P/2, then silence on left channel and frequency F2 on right channel for duration
P/2, then repeat.
set "P=0.5" :: Duration of one cycle in seconds
set "F1=1000" :: Frequency for left channel
set "F2=2000" :: Frequency for right channel
set "T=10" :: Duration in seconds

ffmpeg -f lavfi -i sine=%F1% -f lavfi -i sine=%F2% -filter_complex "[0]volume='lt(mod(t,%P%),%P%/2)':eval=frame[a];[1]volume='gte(mod(t,%P%),%P%/2)':eval=frame[b];[a][b]join=inputs=2:channel_layout=stereo" -t %T% -y out.wav

rem Alternatively you can also use this command line:

ffmpeg -f lavfi -i sine=%F1% -f lavfi -i sine=%F2% -filter_complex "[0]volume=0:enable='lt(mod(t,%P%),%P%/2)'[a];[1]volume=0:enable='gte(mod(t,%P%),%P%/2)'[b];[a][b]join=inputs=2:channel_layout=stereo" -t %T% -y out.wav

pause

3.53 The "avsynctest" source

It's unclear why [out1] is required. If [out0] is used instead, or no output label at all, it gives an error message.

ffmpeg -f lavfi -i avsynctest=d=10[out1] out.mov

pause

4 FFprobe
How to examine a video file with FFprobe without having to write the name of the video into a batch file each time?
It's very simple, just create this batch file once and put it on your desktop:
ffprobe %1
pause

Now you can simply drag the video you want to examine with the mouse onto the icon of this batch file, and you will immediately see the result without
having pressed a single key. The parameter %1 causes the file name to be passed to FFprobe.

See also: https://trac.ffmpeg.org/wiki/FFprobeTips

By the way, it's also possible to let FFmpeg examine a file.


To see whether FFmpeg recognizes the file as something:
ffmpeg -i myfile.xxx
pause

To see whether FFmpeg can decode the file:


ffmpeg -i myfile.xxx -f null -
pause

This is an example for writing the "noise floor count" of an audio file to a CSV log file:
ffprobe -f lavfi -i amovie=in.mp3,astats=metadata=1 -show_entries tags=lavfi.astats.Overall.Noise_floor_count -of csv=p=0 1> log.csv
pause

4.1 Count the number of frames

ffprobe -v error -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 input.mp4

pause

Get the total number of frames of a video: https://stackoverflow.com/questions/2017843/fetch-frame-count-with-ffmpeg

4.2 Find the keyframe timestamps

ffprobe -select_streams V:0 -show_frames -skip_frame nokey -show_entries frame=best_effort_timestamp_time input.mp4

pause

5 FFplay

Keyboard commands while playing:


Key                         Notes
q, ESC                      Quit
f, left mouse double-click  Toggle full screen (for videos larger than the screen this works only if the size is specified smaller than the screen size with the -x and -y options; can be combined with the -fs option for starting in full-screen mode)
p, SPACE                    Pause
s                           Step to the next frame. Pause if the stream is not already paused, step to the next video frame, and pause.
m                           Toggle mute.
9, 0                        Decrease and increase volume respectively.
/, *                        Decrease and increase volume respectively.
a                           Cycle audio channel in the current program.
v                           Cycle video channel.
t                           Cycle subtitle channel in the current program.
c                           Cycle program.
w                           Cycle video filters or show modes.
left/right                  Seek backward/forward 10 seconds.
down/up                     Seek backward/forward 1 minute.
page down/page up           Seek to the previous/next chapter, or if there are no chapters seek backward/forward 10 minutes.
right mouse click           Seek to percentage in file corresponding to fraction of width.

This is a batch file that you can put on your desktop, and then play a video simply by drag-and-drop:
ffplay %1 -autoexit

Note: Contrary to FFmpeg, FFplay doesn't need "-i" before the input.

List of the most important FFplay options:


-                 This is an undocumented option, it's a placeholder for stdin.
-video_size       Set frame size (WxH or abbreviation). Note: This is different from FFmpeg, where the option is -s. The "-video_size" option is used to manually tell FFplay the size for videos that do not contain a header (such as raw video). It is not used to resize videos; for resizing use either the -x and -y options or the "scale" filter.
-fs               Start in fullscreen mode
-x                Force displayed width
-y                Force displayed height
-left             Set the x position for the left of the window (default is a centered window).
-top              Set the y position for the top of the window (default is a centered window).
-an               Disable audio
-vn               Disable video
-sn               Disable subtitles
-ss pos           Seek to pos
-t                Duration of video (Note: -loop has higher priority than the -t option)
-loop number      Loops movie playback <number> times. 0 means forever. (Note: -loop has higher priority than the -t option)
-nodisp           Disable graphical display
-noborder         Borderless window
-alwaysontop      Window always on top
-f fmt            Force format
-vf filtergraph   Create the filtergraph specified by filtergraph and use it to filter the video stream. Note: FFplay doesn't allow -filter_complex
-af filtergraph   filtergraph is a description of the filtergraph to apply to the input audio
-autoexit         Exit when video is done playing
-exitonkeydown    Exit if any key is pressed
-exitonmousedown  Exit if any mouse button is pressed
-report           Always generate a log file
-ast              Audio stream specifier
-vst              Video stream specifier
-sst              Subtitle stream specifier

Note: FFplay supports only one input. It doesn't support "filter_complex".

This is a batch file for playing audio files by drag-and-drop (without video output):
ffplay %1 -nodisp -autoexit

This batch file is for showing images (drag-and-drop) for infinite duration:
ffplay -loop 0 %1

This batch file is for showing images (drag-and-drop) for 5 seconds:


ffplay -autoexit -loop 125 %1
Note: The default framerate is 25.

The same thing can also be done with "loop" and "trim" filters. In this case the multiplication by 25 isn't required:
ffplay -vf loop=-1:1,trim=duration=5 -autoexit %1

Infinitely repeat the first 2 seconds of the video:


ffplay -loop 0 -t 2 %1
Note: -loop has the higher priority, -t affects the play interval of the media. This is different from FFmpeg, where the -t option has the higher priority.

Repeat the first 2 seconds of the video 5 times:


ffplay -loop 5 -t 2 %1

Show the first frame of a video (without audio), or show an image. This can be used for comparing colors:
ffplay -vf select='eq(n,0)' -x 500 -an %1

The drawback of the previous command is that it decodes and evaluates all frames. This can be avoided with this command:
ffplay -vf trim=end_frame=1,tpad=stop=-1:stop_mode=clone -x 500 -an %1

Show the n_th frame of a video (without audio):
set "N=5" :: Number of frame, beginning with 0

ffplay -vf select='eq(n,%N%)' -an %1

Show UHD 4K spherical test pattern on extended desktop (UHD 4K beamer):


c:\ffmpeg\ffplay 2160.png -left 1920 -top 0

pause

5.1 Wrong colors in FFplay

Warning: FFplay has huge problems to show videos with the correct colors, especially if the pixel format is yuv420p, as can be demonstrated with this
example:
ffmpeg -f lavfi -i color=black:s=26x6 -lavfi geq=r='clip(64*mod(X,5),0,255)':g='clip(64*Y,0,255)':b='clip(64*trunc(X/5),0,255)',crop=25:5:0:0,format=rgb24,datascope=s=750x180:mode=color2:format=dec,scale='2*iw':'2*ih' -frames 1 -y test.png

ffmpeg -loop 1 -i test.png -vf zscale=r=full:p=709:t=709:m=709:rin=full:pin=709:tin=709:min=709,format=yuv420p -crf 0 -vcodec libx264 -t 20 -y out1.mp4

ffmpeg -loop 1 -i test.png -vf zscale=r=full:p=709:t=709:m=709:rin=full:pin=709:tin=709:min=709,format=yuv444p -crf 0 -vcodec libx264 -t 20 -y out2.mp4

pause

There was one problem to solve in DaVinci Resolve before this worked:
The problem was that the timecode in the subtitle files began with 01:00:00:00 (which could mean either "1 hour" or historically "Reel 1").
This can be corrected in Preferences --> User --> Editing --> New Timeline Settings --> Start Timecode, where you can enter 00:00:00:00.
To change the starting timecode of an already existing timeline, right-click on the timeline in the media pool, select Timelines --> Starting Timecode... and set it to 00:00:00:00.
SMPTE timecode is presented in hour:minute:second:frame format. (Source: https://en.wikipedia.org/wiki/SMPTE_timecode )

If the timecode is 01:aa:bb:cc then the time in seconds can be calculated as (60 * aa) + bb + (cc / fps). This is only valid if fps is an integer.
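For example, the timecode 01:02:10:12 at 25 fps corresponds to (60 * 2) + 10 + (12 / 25) = 130.48 seconds.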

6.84 Supported Codecs

Transcodings can be made under File --> Media_Management


See also: https://documents.blackmagicdesign.com/SupportNotes/DaVinci_Resolve_18_Supported_Codec_List.pdf

What H.264 and H.265 Hardware Decoding is Supported in DaVinci Resolve Studio?
https://www.pugetsystems.com/labs/articles/What-H-264-and-H-265-Hardware-Decoding-is-Supported-in-DaVinci-Resolve-Studio-2122/

6.85 Convert *.mkv videos for DaVinci Resolve

DaVinci Resolve can't import *.mkv videos, but you can convert them losslessly and very quickly to *.mp4 with FFmpeg:
ffmpeg -i in.mkv -c:v copy -c:a copy -y out.mp4

pause

6.92 Miscellaneous unsorted things

View --> Safe Area This overlays a mask for showing the safe area that's always visible on a TV screen

Playback → Proxy Mode Off / Half Resolution / Quarter Resolution

*.svg Scalable Vector Graphics

7 Shotcut

Shotcut is a free, open-source video editor.


https://www.shotcut.org/

It does have a 360° stabilizer.

The software OpenShot https://www.openshot.org/ seems to be similar, but Shotcut might be the better of the two.

8 Exiftool

With this tool you can show all EXIF data that are contained in pictures or videos. https://exiftool.org/
Usage is very simple if you create this batch file once and put it on your desktop:
exiftool %1

pause

Now you can simply drag the picture or video you want to examine with the mouse onto the icon of this batch file, and you will immediately see the result
without having pressed a single key. The parameter %1 causes the file name to be passed to Exiftool.

Exiftool can be combined with the batch command "findstr", if you want to filter only a few lines from the large Exiftool output:
@echo off

exiftool %1 | findstr /C:"File Name" /C:"File Size" /C:"Duration" /C:"Image Width" /C:"Image Height" /C:"Video Frame Rate" /C:"Exposure Time" /C:"F Number" /C:"Exposure Program" /C:"ISO" /C:"Photo Style" /B /C:"Noise Reduction" /C:"Contrast " /C:"Saturation" /C:"Sharpness" /C:"Avg Bitrate" /C:"Track Create Date"

pause

"findstr" is in detail explained here: https://ss64.com/nt/findstr.html

The option -u means "extract unknown tags" and option -H means "Show tag ID numbers in hexadecimal":
exiftool -u -H %1

pause

Exiftool does also list "Internal Serial Number" and "Lens Serial Number", however in both cases the listed numbers don't agree with the serial numbers
printed on my GH5S and Leica DG 12-60mm f/2.8-4.0.

Example for extracting GPS metadata from a video:
exiftool -p gpx.fmt -ee input.mp4 > output.gpx

-p FMTFILE Print output in specified format


-ee Extract information from embedded files

Example for setting a metadata tag:


exiftool -ProjectionType="equirectangular" out.jpg

Note: The original file is renamed as "out.jpg_original" and the new file is saved as "out.jpg".

10 Gimp

How to inspect pixel values:


Use the "Color picker" tool and tick the box "Use info window"

11 Faststone Image Viewer

This is a free image viewer that can automatically refresh the screen when the image file is overwritten by a new image.
https://www.faststone.org/FSViewerDetail.htm

12 Adobe DNG converter

This is a tool for converting RAW images from many different cameras to the DNG format, which uses lossless compression.
Note: You can only specify the folder and not the individual images. It's normal that you don't see the images. Make sure that you click on "Convert" and not on "Extract".

13 Batch files (DOS, Windows 7, 10 and 11)

Some useful links for writing batch files:


https://en.wikibooks.org/wiki/Windows_Batch_Scripting (english)
https://de.wikibooks.org/wiki/Batch-Programmierung/_Druckversion (german)
https://ss64.com/nt/
https://ss64.com/nt/syntax.html

13.1 Wildcards in filenames

* any sequence of one or more characters


? a single character other than a period "."

When a command-line argument contains a filename, a special syntax can be used to get various information about this file:
Syntax  Result                                                 Example for F:\Meteors_2019\CUT00380.MOV
%1      The argument as passed                                 CUT00380.MOV
%~1     %1 with no enclosing quotation marks                   CUT00380.MOV
%~f1    Full path with a drive letter                          F:\Meteors_2019\CUT00380.MOV
%~d1    Drive letter                                           F:
%~p1    Drive-less path with the trailing backslash            \Meteors_2019\
%~n1    For a file, the file name without path and extension;  CUT00380
        for a folder, the folder name
%~x1    File name extension including the period               .MOV
The same syntax applies to single-letter variables created by the FOR command.

Change the extension of a filename in a batch file:
set OLDFILENAME=%1
set NEWFILENAME=%OLDFILENAME:MOV=MP4%
pause
Please note that all instances of "MOV" will be replaced by "MP4". This fails if "MOV" is part of the path or filename, as in "MOVEMENT.MOV"
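A safer variant uses the filename syntax from chapter 13.1, so that only the extension is changed (a sketch):

set "NEWFILENAME=%~dpn1.MP4"
echo %NEWFILENAME%
pause

Here %~dpn1 expands to the drive, path and name of the first argument, without the old extension.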

13.2 Create beeps in a batch file

@echo #
@timeout 1
@echo #
@timeout 1
@echo #

In this example the # stands for the non-printable character (ASCII code 7), which you can't enter with Notepad.
You can type any other character instead and later use a hex editor to replace it by 0x07.

Another way for creating the ASCII 7 is to type this command line at the command prompt:

echo @echo ^G>test33.bat

where ^G means typing CTRL G

This is an endless loop for beeping every 10 seconds, without any output on the screen (except a line feed):
:beep
@echo #
@timeout 10 > nul
@goto :beep

13.3 Loop over all files in a directory

This example does loop over all img*.jpg files in the current directory and creates img*.png images.
for %%f in (img*.jpg) do call :for_body %%f
goto :the_end

:for_body
ffmpeg -i %1 -y %~n1.png
exit /b

:the_end
pause

Note: "goto :the_end" can be replaced by "goto :eof" which is a predefined label at the end of the file. In this case it's unnecessary to write ":eof" at the
end.
See also: http://www.trytoprogram.com/batch-file-for-loop/

13.4 Create short text files or append text to a file

echo Hello !> test.txt


echo This is the 2nd line>> test.txt

Note: The first line (with ">") creates a new file or overwrites an existing file. The second line (with ">>") appends text to an already existing file.

13.5 Calculate variables in a batch file

It's possible to calculate variables, but only integer arithmetic is supported:


set "A=5"
set /a "B=2*%A%"

13.6 if conditions

if %MODE%==1 echo test1


pause
This doesn't work because the variable "MODE" is undefined. The left side of the comparison is empty. The batch file will immediately exit in the first line
without any error message.
To avoid this problem, you can add two dots to each side of the comparison (thanks to Dan Bridges for this idea):
if %MODE%.==1. echo test2
pause
In this case the left side isn't empty. You won't see the "test2" echo because the left and right sides aren't equal. The batch file won't exit in the first line
and will wait for a keypress in the second line.

13.7 Start a new process

This is an example for starting two new processes:


start ffplay -noborder -x 640 -y 480 -left 0 -top 200 udp://239.0.0.1:1234
start ffplay -noborder -x 640 -y 480 -left 640 -top 200 udp://239.0.0.1:1234

pause

13.8 Redirection and Piping in Batch Files

See https://learn.microsoft.com/en-us/previous-versions/windows/it-pro/windows-xp/bb490982(v=technet.10)

13.9 Command line length

In a batch file the length of a command line is limited to 8191 characters.


In C# ProcessStartInfo class the limit is 32766 characters.
If you need a longer command line, use filter_complex_script.
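A minimal sketch: save the filtergraph in a text file, for example "filters.txt", and reference it with "-filter_complex_script":

ffmpeg -i in.mp4 -filter_complex_script filters.txt -y out.mp4

pause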

14 Batch files (Unix, Linux)

Unix batch files (shell scripts) have a different syntax than DOS/Windows batch files. For converting a batch file from DOS to Unix, see this website:
https://tldp.org/LDP/abs/html/dosbatch.html

15 Regular Expressions

The syntax is explained here for Python: https://docs.python.org/3/library/re.html

See also this Wikipedia article: https://en.wikipedia.org/wiki/Regular_expression
Regular expressions reference: https://www.regular-expressions.info/reference.html
Online tool for testing regular expressions: https://regex101.com/

Expression Meaning
(?i) switch to case insensitive (If this is not specified, regular expressions are case sensitive)
. matches any character
? means zero or one of the preceding character
.? matches zero or one of any character
.* matches zero or more of any character
.+ matches one or more of any character
{0,1} means zero or one of the preceding character, it's the same as ?
{0,} means zero or more of the preceding character, it's the same as *
{1,} means one or more of the preceding character, it's the same as +
\s matches all whitespace characters (which includes [ \t\n\r\f\v])
^ matches the start of string (line)
$ matches the end of string (line)
(?i)^.*regards.*http.*$ matches if "regards" is followed by "http" in the same line, with any number of any characters before, between and
after the patterns
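
A quick way to test a pattern from the Windows command prompt, assuming Python is installed (the sample text is made up):

python -c "import re; print(re.search(r'(?i)^.*regards.*http.*$', 'Kind regards, see http://example.com'))"

This prints a match object if the pattern matches, or "None" if it doesn't.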

16 VLC Player

https://www.videolan.org/vlc/
Documentation: https://wiki.videolan.org/Documentation:User_Guide/ or https://wiki.videolan.org/Documentation:Documentation/

This is a subset of VLC's keyboard hotkeys:


Key Notes
F Toggle fullscreen
ESC Leave fullscreen/close dialogue
space Play/pause
E Next frame
+ Faster
- Slower
= Normal rate
] Faster (fine)
[ Slower (fine)
S Stop
T Position/time
Ctrl + Q Quit
M Mute
B Cycle audio track
V Cycle subtitle track
Shift + s Make a snapshot picture

My notebook doesn't have enough computing power to play 4K videos (400 Mbit/s from a Panasonic GH5S) smoothly with the VLC player. This batch file halves the width and height and pipes the video to VLC. Simply drag and drop the video onto the batch file's icon. The parameter %1 passes the file name to FFmpeg. The option "-b:v 10M" sets the video bitrate to 10 Mbit/s.
ffmpeg -i %1 -vf scale=w=iw/2:h=ih/2 -b:v 10M -f mpegts - | "C:\Program Files\VideoLAN\VLC\vlc.exe" -

Note: VLC also plays 360° videos, if they contain the required metadata.
Note: The "-" after "-f mpegts" makes FFmpeg write to stdout, and the "-" at the end of the command line makes VLC read from stdin.

16.1 Wrong colors in VLC Player

Warning: VLC Player has huge problems showing videos with the correct colors, especially if the pixel format is yuv420p, as can be demonstrated with this example:
ffmpeg -f lavfi -i color=black:s=26x6 -lavfi
geq=r='clip(64*mod(X,5),0,255)':g='clip(64*Y,0,255)':b='clip(64*trunc(X/5),0,255)',crop=25:5:0:0,format=rgb24,datascope=
s=750x180:mode=color2:format=dec,scale='2*iw':'2*ih' -frames 1 -y test.png

ffmpeg -loop 1 -i test.png -vf zscale=r=full:p=709:t=709:m=709:rin=full:pin=709:tin=709:min=709,format=yuv420p -crf 0 -vcodec libx264 -t 20 -y out1.mp4

ffmpeg -loop 1 -i test.png -vf zscale=r=full:p=709:t=709:m=709:rin=full:pin=709:tin=709:min=709,format=yuv444p -crf 0 -vcodec libx264 -t 20 -y out2.mp4

pause

16.2 How to show fullscreen video on the extended desktop

Tools --> Preferences --> Video --> Tick "Fullscreen" and select the "Fullscreen Video Device".

16.3 Show live video from a webcam (and make screenshots)

• Right-click --> Open Capture Device


• Select the camera at "Video Device name"
• Set "Audio Device Name" to "none", other wise you might get an acoustic feedback
• Under "Advanced Options" you can set more details, for example aspect ratio 16:9
• Set "Video Size" for example to 1280x720
• Now when you click on "Play" you will see the live video
• You can make screenshots with SHIFT-S
• The images can be found in your own folder --> Pictures
• The file format, folder location and filename can be set as follows:
• Right-click, Tools --> Preferences, click on "Video"

Unfortunately I don't know how all these settings can be saved.

17 MPV

MPV is a video player and can be downloaded here: https://mpv.io/installation/


You must install two files: mpv.exe and mpv.com (the .com file is a console wrapper, so that mpv's text output appears when it's started from the command prompt).

How to show filter curves:


mpv av://lavfi:aevalsrc="not(mod(n\,32768)):d=50" -lavfi-complex
"[aid1]lowpass,asplit[ao],showfreqs=overlap=0:win_size=32768:win_func=rect:ascale=log,format=rgb0[vo]"

MPV supports the same filters as FFmpeg. This is an example for translating a FFmpeg command with a complex filterchain to MPV:
rem FFmpeg:
ffmpeg -i 7Z7A2027.jpg -filter_complex
"split[1][2];[1]hue=h=0:s=1:b=-2[3];[2][3]hstack" -y out.jpg

rem This is the same thing with MPV:


mpv 7Z7A2027.jpg --keep-open=yes
--lavfi-complex="[vid1]split[1][2];[1]hue=h=0:s=1:b=-2[3];[2][3]hstack,scale=iw/2:ih/2[vo]"

Notes:
• Don't use "-i" before the input file.
• "--keep-open=yes" means that mpv doesn't close shortly after showing the output image.
• "-filter_complex" must be replaced by "--lavfi-complex="
• The input pad in the filter chain must be called [vid1]. You can't omit it as in FFmpeg.
• The output pad in the filter chain must be called [vo]. You can't omit it as in FFmpeg.

