Pipe Raw OpenCV Images to FFmpeg

Pipe raw OpenCV images to FFmpeg

It took a bunch of fiddling, but I figured it out using the FFmpeg rawvideo demuxer:

python capture.py | ffmpeg -f rawvideo -pixel_format bgr24 -video_size 640x480 -framerate 30 -i - foo.avi

Since there is no header in raw video specifying the assumed video parameters, the user must specify them in order to be able to decode the data correctly:

  • -framerate Set input video frame rate. Default value is 25.
  • -pixel_format Set the input video pixel format. Default value is yuv420p.
  • -video_size Set the input video size. There is no default, so this value must be specified explicitly.
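
For reference, capture.py above is just a script that writes raw BGR frames to stdout. A minimal sketch, assuming a webcam on device 0 and forcing the 640x480 size promised to FFmpeg (both are assumptions to match the command above), could look like this:

# capture.py - minimal sketch: grab frames from the default camera and write
# raw BGR24 bytes to stdout, where ffmpeg reads them through the pipe.
import sys
import cv2

cap = cv2.VideoCapture(0)                      # device index 0 is an assumption

while True:
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.resize(frame, (640, 480))      # force the size declared by -video_size
    sys.stdout.buffer.write(frame.tobytes())   # raw BGR24 bytes, no header

cap.release()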

And here's a little something extra for the power users: the same thing, but using VLC to stream the live output to the web in Flash format:

python capture.py | cvlc --demux=rawvideo --rawvid-fps=30 --rawvid-width=320 --rawvid-height=240  --rawvid-chroma=RV24 - --sout "#transcode{vcodec=h264,vb=200,fps=30,width=320,height=240}:std{access=http{mime=video/x-flv},mux=ffmpeg{mux=flv},dst=:8081/stream.flv}"

Edit:
Create a webm stream using ffmpeg and ffserver

python capture.py | ffmpeg -f rawvideo -pixel_format rgb24 -video_size 640x480 -framerate 25 -i - http://localhost:8090/feed1.ffm
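
The feed1.ffm URL assumes an ffserver instance is already running with a matching feed and a WebM stream defined. A rough ffserver.conf sketch for that setup (the port, sizes, and bitrates are assumptions; note that ffserver was removed from FFmpeg in version 4.0, so this only applies to older builds):

HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 10
MaxClients 10
MaxBandwidth 10000

<Feed feed1.ffm>
  File /tmp/feed1.ffm
  FileMaxSize 200K
  ACL allow 127.0.0.1
</Feed>

<Stream live.webm>
  Feed feed1.ffm
  Format webm
  VideoCodec libvpx
  VideoSize 640x480
  VideoFrameRate 25
  VideoBitRate 512
  NoAudio
</Stream>

Start it with ffserver -f ffserver.conf before running the pipeline above, then open http://localhost:8090/live.webm to watch the stream.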

Pipe opencv images to ffmpeg using python

I had a similar problem once. I opened an issue on GitHub; it turns out it may be a platform issue.

Related to your question, you can also pipe OpenCV images to FFmpeg. Here's some sample code:

# This script copies the video frame by frame
import cv2
import subprocess as sp

input_file = 'input_file_name.mp4'
output_file = 'output_file_name.mp4'

cap = cv2.VideoCapture(input_file)
ret, frame = cap.read()                     # read one frame to get the dimensions
height, width, ch = frame.shape
cap.set(cv2.CAP_PROP_POS_FRAMES, 0)         # rewind so the probed frame is not dropped from the output

ffmpeg = 'ffmpeg'                           # or a full path to the executable
dimension = '{}x{}'.format(width, height)
f_format = 'bgr24'                          # remember OpenCV uses bgr format
fps = str(cap.get(cv2.CAP_PROP_FPS))

command = [ffmpeg,
           '-y',
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-s', dimension,
           '-pix_fmt', f_format,
           '-r', fps,
           '-i', '-',
           '-an',
           '-vcodec', 'mpeg4',
           '-b:v', '5000k',
           output_file]

# Discard FFmpeg's log output so the unread stderr pipe cannot fill up and block.
proc = sp.Popen(command, stdin=sp.PIPE, stderr=sp.DEVNULL)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    proc.stdin.write(frame.tobytes())       # tostring() is deprecated; tobytes() gives the raw BGR bytes

cap.release()
proc.stdin.close()
proc.wait()

Stream images from python openCV with ffmpeg

Here is a reproducible sample - hopefully you can copy, paste, and execute it, but nothing is promised...

The example performs the following stages:

  • Create 10 synthetic JPEG images in the ./test_dataset folder, to be used as input.
  • Execute an FFplay sub-process as the RTSP listener.

    When using the TCP protocol we should start the TCP server first (FFplay is used as a TCP server in our case).

    We also need the receiver process, because without it, the FFmpeg streamer process halts after the first frame.
  • Execute an FFmpeg sub-process for RTSP streaming.

    Cyclically read the JPEG images into a NumPy array (in BGR color format), and write each array as a raw video frame to the stdin pipe.

    Note: It is more efficient to write raw video frames than to encode each frame to PNG (as used by your reference sample).

Here is the code:

import cv2
#import time
import subprocess as sp
import glob
import os

img_width = 1280
img_height = 720

test_path = './test_dataset'  # Folder with synthetic sample images.

os.makedirs(test_path, exist_ok=True)  # Create folder for input images.

os.chdir(test_path)

ffmpeg_cmd = 'ffmpeg'  # May use full path like: 'c:\\FFmpeg\\bin\\ffmpeg.exe'
ffplay_cmd = 'ffplay'  # May use full path like: 'c:\\FFmpeg\\bin\\ffplay.exe'

# Create 10 synthetic JPEG images for testing (image0001.jpg, image0002.jpg, ..., image0010.jpg).
sp.run([ffmpeg_cmd, '-y', '-f', 'lavfi', '-i', f'testsrc=size={img_width}x{img_height}:rate=1:duration=10', 'image%04d.jpg'])

img_list = glob.glob("*.jpg")
img_list_len = len(img_list)
img_index = 0

fps = 5

rtsp_server = 'rtsp://localhost:31415/live.stream'

# You will need to start the server up first, before the sending client (when using TCP). See: https://trac.ffmpeg.org/wiki/StreamingGuide#Pointtopointstreaming
ffplay_process = sp.Popen([ffplay_cmd, '-rtsp_flags', 'listen', rtsp_server])  # Use FFplay sub-process for receiving the RTSP video.

command = [ffmpeg_cmd,
           '-re',
           '-f', 'rawvideo',  # Apply raw video as input - it's more efficient than encoding each frame to PNG
           '-s', f'{img_width}x{img_height}',
           '-pixel_format', 'bgr24',
           '-r', f'{fps}',
           '-i', '-',
           '-pix_fmt', 'yuv420p',
           '-c:v', 'libx264',
           '-bufsize', '64M',
           '-maxrate', '4M',
           '-rtsp_transport', 'tcp',
           '-f', 'rtsp',
           #'-muxdelay', '0.1',
           rtsp_server]

process = sp.Popen(command, stdin=sp.PIPE)  # Execute FFmpeg sub-process for RTSP streaming

while True:
    current_img = cv2.imread(img_list[img_index])  # Read a JPEG image to NumPy array (in BGR color format) - assume the resolution is correct.
    img_index = (img_index+1) % img_list_len  # Cyclically repeat images

    process.stdin.write(current_img.tobytes())  # Write raw frame to stdin pipe.

    cv2.imshow('current_img', current_img)  # Show image for testing

    # time.sleep(1/FPS)
    key = cv2.waitKey(int(round(1000/fps)))  # We need to call cv2.waitKey after cv2.imshow

    if key == 27:  # Press Esc for exit
        break

process.stdin.close()  # Close stdin pipe
process.wait()  # Wait for FFmpeg sub-process to finish
ffplay_process.kill()  # Forcefully close FFplay sub-process
cv2.destroyAllWindows()  # Close OpenCV window

Is it possible to send FFmpeg images by using a pipe?

"I want to send images as input to FFmpeg... I believe that FFmpeg could receive image from a pipe, does anyone know how this can be done?"

Yes, it's possible to send FFmpeg images by using a pipe. Use the standardInput to send frames. The frame data must be uncompressed pixel values (e.g. 24-bit RGB format) in a byte array that holds enough bytes (width x height x 3) to write a full frame.

Normally (in Command or Terminal window) you set input and output as:

ffmpeg -i inputvid.mp4 outputvid.mp4

But for pipes you must first specify the incoming input's width/height, frame rate, etc. Then also add the incoming input filename as -i - (by using a blank -, FFmpeg watches the standardInput connection for incoming raw pixel data).

You must put your frame data into some Bitmap object and send the bitmap values as a byte array. Each send will be encoded as a new video frame. Example pseudo-code:

public function makeVideoFrame ( frame_BMP:Bitmap ) : void
{
    //# Encodes the byte array of a Bitmap object as an FFmpeg video frame
    if ( myProcess.running == true )
    {
        Frame_Bytes = frame_BMP.getBytes(); //# read pixel values to a byte array
        myProcess.standardInput.writeBytes(Frame_Bytes); //# send data to FFmpeg to encode a new frame

        Frame_Bytes.clear(); //# empty byte array for re-use with next frame
    }
}

Anytime you update your bitmap with new pixel information, you can write that as a new frame by sending that bitmap as the input parameter to the above function, e.g. makeVideoFrame(my_new_frame_BMP);.

Your pipe's Process must start with these arguments:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - ....etc

Where...

  • -f rawvideo -pix_fmt argb means accept uncompressed RGB data.

  • -s 800x600 and -r 25 are the example input width & height; -r sets the frame rate, meaning FFmpeg must encode this many images per one second of output video.

The full setup looks like this:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_vid.h264

If you get blocky video output, try setting two output files...

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_tempData.h264 out_vid.h264

This will output a test h264 video file which you can later put inside an MP4 container.
The audio track -i someTrack.mp3 is optional.

-i myH264vid.h264 -i someTrack.mp3 outputVid.mp4
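
For readers not working in ActionScript, here is a rough Python equivalent of the same pipe setup. This is only a sketch: the 800x600 size, 25 fps, and argb pixel format follow the argument list above, while the synthetic test frames and the out_vid.h264 output name are placeholders.

# Sketch: write uncompressed ARGB frames to FFmpeg's stdin, mirroring the
# '-f rawvideo -pix_fmt argb -s 800x600 -r 25 -i -' argument list shown above.
import subprocess as sp
import numpy as np

width, height, fps = 800, 600, 25

cmd = ['ffmpeg', '-y',
       '-f', 'rawvideo', '-pix_fmt', 'argb', '-s', f'{width}x{height}', '-r', str(fps),
       '-i', '-',
       '-c:v', 'libx264', '-profile:v', 'baseline', '-level:v', '3',
       '-an', 'out_vid.h264']

proc = sp.Popen(cmd, stdin=sp.PIPE)

for i in range(fps * 5):                            # five seconds of synthetic frames
    frame = np.zeros((height, width, 4), np.uint8)  # byte order per pixel: A, R, G, B
    frame[:, :, 0] = 255                            # opaque alpha
    frame[:, :, 1] = i % 256                        # simple animated red ramp as a test pattern
    proc.stdin.write(frame.tobytes())               # exactly width*height*4 bytes per frame

proc.stdin.close()
proc.wait()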

How to send OpenCV videos to ffmpeg

Same question... same answer...
I hope this answer will help you.

In this program I make a screen copy (RGB data) at 20 fps and send the images to FFmpeg. I don't use a pipe but a socket. I use this command line:

ffmpeg -f rawvideo -pixel_format rgb24  -video_size 640x480 -i  "tcp://127.0.0.1:2345" -codec:v libx264 -pix_fmt yuv420p Video.mp4

to run FFmpeg, and then send the data to port 2345 using a socket:

sock->Write(b.GetData(), nb);

I don't encode the frames; they are sent as raw data.
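
A rough Python equivalent of this socket approach is sketched below, under a few assumptions: a webcam replaces the screen grab, the sender runs for a fixed number of frames, and since FFmpeg opens tcp://127.0.0.1:2345 as a client, the script listens on that port before launching it.

# Sketch: serve raw RGB24 frames over TCP; ffmpeg connects to 127.0.0.1:2345
# as a client and encodes whatever bytes arrive on the socket.
import socket
import subprocess as sp
import cv2

width, height = 640, 480

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('127.0.0.1', 2345))
server.listen(1)                                   # listen before ffmpeg tries to connect

ffmpeg = sp.Popen(['ffmpeg', '-y', '-f', 'rawvideo', '-pixel_format', 'rgb24',
                   '-video_size', f'{width}x{height}',
                   '-i', 'tcp://127.0.0.1:2345',
                   '-codec:v', 'libx264', '-pix_fmt', 'yuv420p', 'Video.mp4'])

conn, _ = server.accept()                          # ffmpeg connects here

cap = cv2.VideoCapture(0)                          # webcam instead of a screen grab (assumption)
try:
    for _ in range(200):                           # send a fixed number of frames for the sketch
        ret, frame = cap.read()
        if not ret:
            break
        frame = cv2.resize(frame, (width, height))
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # the command expects rgb24; OpenCV delivers BGR
        conn.sendall(rgb.tobytes())
finally:
    cap.release()
    conn.close()                                   # EOF on the socket lets ffmpeg finalize Video.mp4
    ffmpeg.wait()
    server.close()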


