OpenCV Read JPEG Image from Buffer

OpenCV read JPEG image from buffer

I have decompressed the JPEG image with libjpeg, following the standard procedure described in the libjpeg API documentation under 'Decompression details'.

After decompressing the data you can use it to construct the cv::Mat. Mind you, the decompressed image is in RGB order, whereas OpenCV uses BGR, so a cvtColor() operation with the CV_RGB2BGR conversion code is needed, as sketched below.
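
For illustration only, here is a minimal sketch of that last step. It assumes the decompressed scanlines have already been copied into one tightly packed RGB buffer; rgb_data, width and height are placeholder names, not something from the original answer.

#include <opencv2/opencv.hpp>

// Wrap an already-decompressed, tightly packed RGB buffer in a cv::Mat
// and convert it to OpenCV's BGR channel order.
cv::Mat rgbBufferToBgrMat(unsigned char* rgb_data, int width, int height)
{
    // This constructor does not copy the pixels; it only wraps rgb_data.
    cv::Mat rgb(height, width, CV_8UC3, rgb_data);

    cv::Mat bgr;
    cv::cvtColor(rgb, bgr, cv::COLOR_RGB2BGR); // CV_RGB2BGR in the old constant names
    return bgr; // bgr owns its own copy of the pixels, so rgb_data may be freed afterwards
}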

Python convert JPEG bytes to image array

To answer your initial question, you could do

import requests
from PIL import Image
import cv2
import numpy as np

url = "https://www.cleverfiles.com/howto/wp-content/uploads/2018/03/minion.jpg"
frame = Image.open(requests.get(url, stream=True, allow_redirects=True).raw)
cv2.imshow("dfsdf", np.asarray(frame)[:,:,::-1])

But as per Mark's comment and this answer, you can skip PIL altogether:

import requests
import cv2
import numpy as np

url = "https://www.cleverfiles.com/howto/wp-content/uploads/2018/03/minion.jpg"

# Fetch JPEG data
d = requests.get(url)
# Decode in-memory, compressed JPEG into Numpy array
frame = cv2.imdecode(np.frombuffer(d.content, np.uint8), cv2.IMREAD_COLOR)
cv2.imshow("dfsdf", frame)
cv2.waitKey(0)

How to read an image from an in-memory buffer (StringIO) or from a URL with the OpenCV Python library

To create an OpenCV image object from an in-memory buffer (StringIO), we can use the OpenCV API imdecode; see the code below. (Note that this is Python 2 code: urllib2 and cStringIO were replaced by urllib.request and io in Python 3.)

import cv2
import numpy as np
from urllib2 import urlopen
from cStringIO import StringIO

def create_opencv_image_from_stringio(img_stream, cv2_img_flag=0):
    img_stream.seek(0)
    img_array = np.asarray(bytearray(img_stream.read()), dtype=np.uint8)
    return cv2.imdecode(img_array, cv2_img_flag)

def create_opencv_image_from_url(url, cv2_img_flag=0):
    request = urlopen(url)
    img_array = np.asarray(bytearray(request.read()), dtype=np.uint8)
    return cv2.imdecode(img_array, cv2_img_flag)

How to use OpenCV in Python 3 to read a file from a file buffer

Here is a solution:

import io, requests, cv2, numpy as np

url = "https://images.pexels.com/photos/236047/pexels-photo-236047.jpeg"
img_stream = io.BytesIO(requests.get(url).content)
img = cv2.imdecode(np.frombuffer(img_stream.read(), np.uint8), 1)

cv2.imshow("img", img)
cv2.waitKey(0)

How to process JPEG binary data in OpenCV?

As @Micka already pointed out, you should use cv::imdecode.

You can use it with your FILE*. If you're using C++, you may want to use fstreams instead. You can also rely directly on OpenCV's own capability to read files.

The code below shows these options for reading files. Code for writing is similar (I can add it if you need it).

Remember that if you want to write the binary stream back out, you should use imencode; a sketch of that is shown after the code below.

#include <opencv2/opencv.hpp>
#include <fstream>
#include <stdio.h>

using namespace std;
using namespace cv;

int main()
{
    ////////////////////////////////
    // Method 1: using FILE*
    ////////////////////////////////

    FILE* read_image = fopen("path_to_image", "rb");
    if (read_image == NULL)
    {
        printf("Image Not Found\n");
        return -1;
    }

    // Get the file size in bytes
    fseek(read_image, 0, SEEK_END);
    int fileLen = ftell(read_image);
    fseek(read_image, 0, SEEK_SET);

    // Read the whole file into memory
    unsigned char* pre_image = (unsigned char*)malloc(fileLen);
    size_t data = fread(pre_image, 1, fileLen, read_image);

    // Print and verify the values
    printf("File Size %d\n", fileLen);
    printf("Read bytes %zu\n", data);

    fclose(read_image);

    // Decode the in-memory JPEG buffer into a cv::Mat
    vector<unsigned char> buffer(pre_image, pre_image + data);
    Mat img = imdecode(buffer, IMREAD_ANYCOLOR);
    free(pre_image);

    ////////////////////////////////
    //// Method 2: using fstreams
    ////////////////////////////////

    //ifstream ifs("path_to_image", iostream::binary);
    //filebuf* pbuf = ifs.rdbuf();
    //size_t size = pbuf->pubseekoff(0, ifs.end, ifs.in);
    //pbuf->pubseekpos(0, ifs.in);
    //vector<char> buffer(size);
    //pbuf->sgetn(buffer.data(), size);
    //ifs.close();
    //Mat img = imdecode(buffer, IMREAD_ANYCOLOR);

    ////////////////////////////////
    //// Method 3: using imread
    ////////////////////////////////

    //Mat img = imread("path_to_image", IMREAD_ANYCOLOR);

    // Work with img as you want

    imshow("img", img);
    waitKey();

    return 0;
}
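
For completeness, here is a rough sketch of the writing direction mentioned above: encoding the Mat back into an in-memory JPEG with imencode and dumping the bytes to disk. The output file name and the JPEG quality value are arbitrary examples, not something stated in the original answer.

#include <opencv2/opencv.hpp>
#include <fstream>
#include <vector>

using namespace std;
using namespace cv;

void write_jpeg_buffer(const Mat& img)
{
    // Encode the image into an in-memory JPEG buffer
    vector<unsigned char> jpeg_buffer;
    vector<int> params = { IMWRITE_JPEG_QUALITY, 90 };  // quality 90 is just an example
    imencode(".jpg", img, jpeg_buffer, params);

    // Write the compressed bytes to disk (or hand them to a socket, etc.)
    ofstream ofs("output.jpg", ios::binary);
    ofs.write(reinterpret_cast<const char*>(jpeg_buffer.data()), jpeg_buffer.size());
}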

Reading raw RGB input, converting to an OpenCV object, then converting to JPEG bytes - without PIL

I figured this out. Further improvements are welcome. I was initially unsure about the shape and contents of the imencode return value; it is a tuple of a boolean success flag and a 1-D uint8 array holding the encoded bytes.

See below my working numpy/opencv alternative to the PIL approach above:

Read in the raw 24-bit RGB image from the buffer and convert it to a numpy array (opencvFr) for OpenCV:

# Interpret the raw buffer as flat bytes, then shape it to (height, width, channels)
FlatNp = np.frombuffer(buf, dtype=np.uint8, count=-1)
opencvFr = np.resize(FlatNp, (576, 768, 3))

Encoding the edited numpy image array (opencvFr) as JPEG bytes:

tmp = cv2.imencode('.jpg', opencvFr)  # returns a (success flag, encoded 1-D uint8 array) tuple
self.frame = tmp[1].tobytes()         # bytes containing the compressed JPEG image

(I am experimenting with this for a streaming camera application)

OpenCV: how to put a JPEG image buffer into cvShowImage

Looks like you forgot to set widthStep. These are the relevant IplImage fields; a sketch of setting them follows below:

int  widthStep;     // size of aligned image row in bytes
int  imageSize;     // image data size in bytes = height*widthStep

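For illustration, here is a rough sketch with the legacy C API showing how a decoded pixel buffer can be attached to an IplImage header so that widthStep gets filled in by cvSetData. The function and variable names are placeholders, not code from the original question.

#include <opencv2/core/core_c.h>
#include <opencv2/highgui/highgui_c.h>

// decoded_pixels points to tightly packed BGR data of the given size (placeholder names)
void show_decoded_buffer(unsigned char* decoded_pixels, int width, int height)
{
    // Create a header only (no pixel allocation) and attach the existing buffer;
    // the third argument to cvSetData becomes widthStep.
    IplImage* img = cvCreateImageHeader(cvSize(width, height), IPL_DEPTH_8U, 3);
    cvSetData(img, decoded_pixels, width * 3);

    cvShowImage("decoded", img);
    cvWaitKey(0);

    cvReleaseImageHeader(&img);  // releases the header only, not decoded_pixels
}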
