FFmpeg Overlay Size

How to scale overlay image in ffmpeg

In the filter chain, you must first scale the image separately and then perform the overlay. Just prepend your filterchain with [1:v]scale=320:240 [ovrl], and reference the scaled result in the overlay inputs as [0:v][ovrl].
The final command line (split across multiple lines for readability):

ffmpeg -i video.mp4 -i image.jpg -b:v 1M \
-filter_complex "[1:v]scale=320:240 [ovrl], \
[0:v][ovrl]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2, \
drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSerif-Bold.ttf: \
text='Test Text': x=1: y=1: fontsize=30" output.mp4

However, if your video is anamorphic (storage aspect ratio (SAR) is different from display aspect ratio (DAR), used mainly in TV broadcasts), then the video is resized (stretched) upon playback. Of course, the overlaid image is then stretched as well, as it is part of the video.

For example, PAL SD broadcast (stored in 720x576 pixels, SAR=5:4) is usually displayed using 16:9 DAR, thus would be resized upon playback to 1024x576 to keep DAR. So if you overlay 320x240 image on such video, its display size would then be 455x240 and it would look stretched.

If you require that the aspect ratio of your overlay image (4:3) is kept, you need to take into account the SAR and DAR of your video and calculate the correct dimensions to resize the image for the overlay. If you know the SAR and DAR of your video, you can use this formula to calculate the correct width to resize your overlay image (assuming the height stays at 240): width=320*SAR/DAR
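For the PAL example above (SAR = 720:576 = 5:4, DAR = 16:9) this gives width = 320 * (5/4) / (16/9) = 225. As a hedged sketch, the first command would then scale the image to 225x240 before the overlay (the PAL input filename is a placeholder):

# assumes a 720x576 source displayed at 16:9; recompute 225 for your own SAR/DAR
ffmpeg -i pal_video.mpg -i image.jpg -b:v 1M \
-filter_complex "[1:v]scale=225:240 [ovrl], \
[0:v][ovrl]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" output.mp4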

How can I resize an overlay image with ffmpeg?

You have already found all the parts. Let's bring them together:

ffmpeg -i "$videoPath" -i "$overlayPath" -filter_complex "[1:v]scale=360:360[z];[0:v][z]overlay[out]" -map "[out]" -map "0:a" "$outputPath"

Explanation:
We're executing two filters here within the "filter_complex" parameter, separated by a semicolon ";".

First we scale the second video input ([1:v]) to a new resolution and store the output under the label "z" (you can use any name here).
Second we bring the first input video ([0:v]) and the overlay ([z]) together and store the output under the label "out".

Now it's time to tell ffmpeg what it should pack into our output file:
-map "[out]" (for the video)
-map "0:a" (for the audio of the first input file)

FFMPEG - How to set overlay video's size to input size

First scale using scale2ref, then change opacity.

ffmpeg -i input.mkv -i overlay.mov -filter_complex "[1:v][0:v]scale2ref[zork][video];[zork]format=argb,lutrgb=a=val*0.5[zork];[video][zork]overlay" -pix_fmt yuv420p -preset ultrafast -c:a copy output.mkv

Resize overlay video

With filter_complex you can combine filters and create named outputs.
Try something like this:

ffmpeg -i image.png -i input.mp4 -filter_complex "[1:v]scale=320:240[scaled_v];[0:v][scaled_v]overlay=10:10:enable='between(t,0,38)'" -pix_fmt yuv420p -c:a copy output.mp4

Just change the resolution to what you need instead of 320:240.

If you need the image overlaid on top of the video, swap [1:v] and [0:v], as in the sketch below.
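For reference, a hedged sketch of that swapped variant, keeping image.png as the first input and input.mp4 as the second, so the scaled image sits on top of the video:

# same position and enable window as above; the image is now the overlaid stream
ffmpeg -i image.png -i input.mp4 -filter_complex "[0:v]scale=320:240[scaled_img];[1:v][scaled_img]overlay=10:10:enable='between(t,0,38)'" -pix_fmt yuv420p -c:a copy output.mp4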

FFMPEG : overlay image on video and retain size

Skip the scale2ref, i.e. drop the [1:v][v0]scale2ref[v1][v0] stage from a command like the one below, so the image keeps its original size.

"-i", video.mp4, "-i", image.mp4, "-filter_complex", "[0:v]pad=iw:2*trunc(iw*16/9/2):(ow-iw)/2:(oh-ih)/2[v0];[1:v][v0]scale2ref[v1][v0];[v0][v1]overlay=x=(W-w)/2:y=(H-h)/2[v]", "-map", "[v]", "-q:v", "2", directoryToStore + "/" + ImageName + ".jpeg"

how to define overlay video position and size ffmpeg electronJS?

The overlay filter only specifies where to place the overlaid video, not its size. So, if it is not at the desired size, you need to pre-scale it explicitly. Try

"[0:v]scale=1920:-4,pad=0:1080:0:(oh-ih)/2[vid];
[1:v]scale=w:h[ovly];
[vid][ovly]overlay=x:y"

Substitute w, h, x, and y with the desired width, height, and (x, y) upper-left placement of the overlay video.
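A hedged example with concrete values filled in (the 640x360 size, the 1260:700 position, and the filenames are assumptions):

# 640x360 overlay placed near the bottom-right of a 1920x1080 canvas (example values only)
ffmpeg -i main.mp4 -i small.mp4 -filter_complex \
"[0:v]scale=1920:-4,pad=0:1080:0:(oh-ih)/2[vid]; \
[1:v]scale=640:360[ovly]; \
[vid][ovly]overlay=1260:700" \
-c:a copy output.mp4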

FFMPEG - How to resize an image overlay?

Instead of [v][3]overlay=20:500[z] you would use [3]scale=360:360[3v];[v][3v]overlay=20:500[z]. Your GIF should be square-shaped to begin with, to avoid distorting it.
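In isolation, a minimal two-input sketch of that pattern (filenames, the GIF loop handling, and the pad names are assumptions; only the scale-then-overlay step comes from the answer):

# -ignore_loop 0 lets the GIF keep looping; shortest=1 ends the overlay with the main video
ffmpeg -i video.mp4 -ignore_loop 0 -i square.gif -filter_complex \
"[1:v]scale=360:360[gif];[0:v][gif]overlay=20:500:shortest=1" \
-c:a copy output.mp4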

FFmpeg overlay one video on another, sound mixing, scale the overlaid video and position it in the bottom-right corner

Add the scale and amix filters. Example to make camera.mp4 half size and mix the sound from both inputs:

ffmpeg -i screen.mp4 -i camera.mp4 -filter_complex "[1:v]scale=iw/2:-1[cam];[0:v][cam]overlay=main_w-overlay_w-5:main_h-overlay_h-5;[0:a][1:a]amix" output.mp4

ffmpeg image overlay and scaling

Use

ffmpeg -i bg.jpeg -i banana.png -filter_complex "[0]scale=W:H[b];[1]scale=W:H[w];[b][w]overlay=5:H-h-5[v]" -map "[v]" out.png
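A hedged version of the same command with concrete sizes substituted (the 1280x720 background and 200x200 overlay dimensions are assumptions); overlay=5:H-h-5 places the image 5 pixels in from the bottom-left corner:

# example sizes only; replace 1280:720 and 200:200 as needed
ffmpeg -i bg.jpeg -i banana.png -filter_complex \
"[0]scale=1280:720[b];[1]scale=200:200[w];[b][w]overlay=5:H-h-5[v]" \
-map "[v]" out.png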

