FFmpeg Explorer is a tool to help you explore FFmpeg filters.

ffmpeg -i input.mkv -i audio.mp3 -c copy output.mkv

ffmpeg -i input.mp4 -b:a 320K -vn output-audio.mp3

to stream-copy instead of re-encoding, use the options '-c:a copy' and '-c:v copy'
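
e.g. to pull the audio out without re-encoding (the output extension must match the source codec; .m4a here assumes the audio is AAC):

ffmpeg -i input.mp4 -vn -c:a copy output-audio.m4a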

transcode from 240 to 30 fps w/out dropping frames

ffmpeg -i IMG_0707\ \(203fps\).MOV -filter "setpts=6.0*PTS" -r 30 -an output.mp4 (note -an removes audio)

…all files in a folder

for file in `/bin/ls -1`; do ffmpeg -i "$file" -filter "setpts=6.0*PTS" -r 30 -an "$file.30fps.mp4"; done

…timelapsed (1 frame every 4 seconds)

ffmpeg -i IMG_4566.MOV -r 0.25 IMG_4566_seq/output_%05d.png

ffmpeg -i %05d.jpg -c:v libx265 -r 25 out.mkv
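
if the sequence should be read in at a specific rate, set the input frame rate too (otherwise the image2 demuxer assumes 25 fps); this reuses the png sequence extracted above:

ffmpeg -framerate 25 -i IMG_4566_seq/output_%05d.png -c:v libx265 out.mkv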

create a list of files (e.g. named files.txt)
file 'dust_and_shadow_28min.mp4'
file 'dust_and_shadow_black.mp4'

then…
ffmpeg -f concat -i files.txt -c copy dust_and_shadow_30min.mp4

alternatively, partial automation…
ls *.mp4 | while read each; do echo "file '$each'" >> files.txt; done
ffmpeg -f concat -i files.txt -c copy 000-qualia-RN101.mkv

also → https://stackoverflow.com/questions/7333232/how-to-concatenate-two-mp4-files-using-ffmpeg
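
the concat demuxer only works when all clips share the same codecs and parameters; if they don't, the concat filter (which re-encodes) is the usual fallback, e.g. for two hypothetical clips:

ffmpeg -i a.mp4 -i b.mp4 -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" out.mp4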

10.122 minterpolate / interpolate missing frames…

ffmpeg -i IMG_0707\ \(203fps\).MOV -filter "setpts=9.0*PTS,minterpolate='fps=300'" -r 30 -an IMG_0707\ \(30fps\).MOV

ffmpeg -i IMG_0707\ \(203fps\).MOV -filter "minterpolate='fps=600',setpts=10.0*PTS" -r 30 -an IMG_0707\ \(minterp600-30fps\).mp4

ffmpeg -i IMG_0707\ \(203fps\).MOV -filter "minterpolate='fps=600:mi_mode=mci:mc_mode=aobmc:scd_threshold=0.1',setpts=9.0*PTS" -r 30 -an IMG_0707\ \(minterp00-aobmc\).mp4

  • 10.164 sab - Apply Shape Adaptive Blur
  • 10.168 selectivecolor
  • 10.175 shuffleframes (…and then interpolate; see the example after this list)
  • 10.133 ocv - Apply a video transform using libopencv
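
shuffleframes takes a space-separated list of frame indices; the ffmpeg docs' own example swaps the second and third frame of every group of three:

ffmpeg -i INPUT -vf "shuffleframes=0 2 1" OUTPUT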

ffmpeg -i INPUT -f lavfi -i nullsrc=s=hd720,lutrgb=128:128:128 -f lavfi -i nullsrc=s=hd720,geq='r=128+30*sin(2*PI*X/400+T):g=128+30*sin(2*PI*X/400+T):b=128+30*sin(2*PI*X/400+T)' -lavfi '[0][1][2]displace' OUTPUT

ffmpeg -i "IMG_0707 (8m03).mp4" -i IMG_0705.MOV -i IMG_0706.MOV -lavfi '[0][1][2]displace' displaced.mp4

same idea with mirrored edges: … -lavfi '[0][1][2]displace=edge=mirror' overlooked-forest-disp.mp4

  • 10.138 palettegen - Generate one palette for a whole video stream.
  • 10.139 paletteuse - Use a palette to downsample an input video stream.
  • split toning → normalize w. black & whitepoints set to color (e.g. "normalize=blackpt=red:whitept=cyan" )
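
as a full command (INPUT/OUTPUT are placeholders):

ffmpeg -i INPUT -vf "normalize=blackpt=red:whitept=cyan" OUTPUT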

1-step: ffmpeg -i Gomorra.S04E04.mp4 -vf subtitles=Gomorra.S04E04.srt Gomorra.S04E04.hardsubs.mp4

2-step: ffmpeg -i Gomorra.S04E04.srt Gomorra.S04E04.ass && ffmpeg -i Gomorra.S04E04.mp4 -vf ass=Gomorra.S04E04.ass Gomorra.S04E04.hardsubs.mp4

see also → https://stackoverflow.com/questions/8672809/use-ffmpeg-to-add-text-subtitles and/or https://trac.ffmpeg.org/wiki/HowToBurnSubtitlesIntoVideo
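
the first link above is about soft subtitles (kept as a separate, toggleable stream instead of burned in); for an mp4 container that is roughly:

ffmpeg -i Gomorra.S04E04.mp4 -i Gomorra.S04E04.srt -c copy -c:s mov_text Gomorra.S04E04.softsubs.mp4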

metadata only

ffmpeg -i input.m4v -map_metadata 0 -metadata:s:v rotate="90" -codec copy output.m4v
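
to check whether the tag was written (this reads the old-style rotate tag; newer files may carry a display matrix instead):

ffprobe -v error -select_streams v:0 -show_entries stream_tags=rotate -of default=noprint_wrappers=1 output.m4v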

pixelwise

ffmpeg -i in.mp4 -vf "transpose=1" out.mp4

For the transpose parameter you can pass:
0 = 90° counterclockwise and vertical flip (default)
1 = 90° clockwise
2 = 90° counterclockwise
3 = 90° clockwise and vertical flip
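
there is no 180° value; chain two transposes instead (or use "hflip,vflip"):

ffmpeg -i in.mp4 -vf "transpose=1,transpose=1" out.mp4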

ffmpeg -flags2 +export_mvs -i input.mp4 -vf codecview=mv=pf+bf+bb output.mp4

ffmpeg -debug vis_mb_type -i input.mp4 output.mp4

Example Stabilization Commands (via https://scottlinux.com/2016/09/17/video-stabilization-using-vidstab-and-ffmpeg-on-linux/ and/or https://gist.github.com/maxogden/43219d6dcb9006042849 )

https://ffmpeg.org/ffmpeg-filters.html#vidstabdetect-1

Option A: Stabilize a video using default settings for a quick fix:
ffmpeg -i input.mp4 -vf vidstabtransform,unsharp=5:5:0.8:3:3:0.4 output.mp4
Option B (better):
First let FFmpeg analyze the video (no changes are made; this step just writes a file called transforms.trf)
1. Analyze with default values:
ffmpeg -i input.mp4 -vf vidstabdetect -f null -
Or analyze a very shaky video, on a scale of 1-10:
ffmpeg -i input.mp4 -vf vidstabdetect=shakiness=10:accuracy=15 -f null -
2. Next, use that generated file transform.trf to help better stabilize the video:
ffmpeg -i input.mp4 -vf vidstabtransform=smoothing=30:input="transforms.trf" output.mp4
Done.
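
the unsharp pass from Option A can be tacked onto step 2 as well:

ffmpeg -i input.mp4 -vf vidstabtransform=smoothing=30:input="transforms.trf",unsharp=5:5:0.8:3:3:0.4 output.mp4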

Make a side by side video with:

ffmpeg -i video1.mp4 -i video2.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; [1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w" side_by_side.mp4

etc

set VID "sourcevideo.m4v"
set VIDST "$VID-stabilized.mov"; ffmpeg -i $VID -vf vidstabdetect -f null -; ffmpeg -i $VID -vf vidstabtransform=smoothing=5 $VIDST; ffmpeg -i $VID -i $VIDST -filter_complex "[0:v:0]pad=iw*2:ih[bg]; [bg][1:v:0]overlay=w" "$VID-merged.mov"

side-by-side (horizontally stacked)

ffmpeg -i clip.mov -i clip2.mov -filter_complex hstack merged.mov

vertically stacked

ffmpeg -i clip.mov -i clip2.mov -filter_complex vstack merged.mov
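
hstack needs inputs of the same height (vstack the same width); scale first if they differ, e.g.:

ffmpeg -i clip.mov -i clip2.mov -filter_complex "[0:v]scale=-2:720[l];[1:v]scale=-2:720[r];[l][r]hstack" merged.mov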

  • 38.11 blend, tblend (time blend) filter takes two consecutive frames from one single stream, and outputs the result obtained by blending the new frame on top of the old frame.
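
e.g. averaging neighbouring frames and then dropping every other one (a common way to smooth a 60 fps clip down to 30 fps):

ffmpeg -i INPUT -vf tblend=all_mode=average,framestep=2 OUTPUT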