MediaCodec and Camera: colorspaces don't match

I solved it by swapping the U and V byte planes myself at the application level, using a simple function:

public byte[] swapYV12toI420(byte[] yv12bytes, int width, int height) {
    byte[] i420bytes = new byte[yv12bytes.length];
    int ySize = width * height;
    int chromaSize = (width / 2) * (height / 2);
    // The Y plane is identical in both formats.
    System.arraycopy(yv12bytes, 0, i420bytes, 0, ySize);
    // YV12 stores V before U; I420 stores U before V, so swap the two chroma planes.
    System.arraycopy(yv12bytes, ySize + chromaSize, i420bytes, ySize, chromaSize);
    System.arraycopy(yv12bytes, ySize, i420bytes, ySize + chromaSize, chromaSize);
    return i420bytes;
}
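As a sanity check, the swap can be exercised on a tiny frame in plain Java, with each plane filled with a distinct value so the reordering is visible. The class name and the toy frame values below are illustrative, not from the original answer.

```java
import java.util.Arrays;

public class SwapDemo {
    // Same swap as above: YV12 (Y, V, U) -> I420 (Y, U, V).
    static byte[] swapYV12toI420(byte[] yv12bytes, int width, int height) {
        byte[] i420bytes = new byte[yv12bytes.length];
        int ySize = width * height;
        int chromaSize = (width / 2) * (height / 2);
        System.arraycopy(yv12bytes, 0, i420bytes, 0, ySize);
        System.arraycopy(yv12bytes, ySize + chromaSize, i420bytes, ySize, chromaSize);
        System.arraycopy(yv12bytes, ySize, i420bytes, ySize + chromaSize, chromaSize);
        return i420bytes;
    }

    public static void main(String[] args) {
        // 4x2 frame: 8 Y bytes (1s), then the 2-byte V plane (5s), then the 2-byte U plane (9s).
        byte[] yv12 = {1, 1, 1, 1, 1, 1, 1, 1, 5, 5, 9, 9};
        byte[] i420 = swapYV12toI420(yv12, 4, 2);
        // After the swap, the U plane (9s) precedes the V plane (5s).
        System.out.println(Arrays.toString(i420)); // prints [1, 1, 1, 1, 1, 1, 1, 1, 9, 9, 5, 5]
    }
}
```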

Android MediaCodec video wrong color and playing too fast

The YUV formats used by Camera output and MediaCodec input have their U/V planes swapped.

If you are able to move the data through a Surface you can avoid this issue; however, you lose the ability to examine the YUV data. An example of recording to a .mp4 file from a camera can be found on bigflake.

Some details about the colorspaces and how to swap them are in this answer.

There is no timestamp information in the raw H.264 elementary stream. You need to pass the timestamp through the decoder to MediaMuxer or whatever you're using to create your final output. If you don't, the player will just pick a rate, or possibly play the frames as fast as it can.
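In practice the per-frame timestamp is usually generated from the frame index and frame rate, passed to MediaCodec.queueInputBuffer(), and later carried in the BufferInfo given to MediaMuxer.writeSampleData(). A minimal sketch of that arithmetic (the helper and class names are mine, not from the original answer):

```java
public class TimestampDemo {
    // Presentation time in microseconds for a given frame at a fixed frame rate.
    // This is the value you would hand to queueInputBuffer() so the muxer
    // receives real timestamps instead of zeros.
    static long computePresentationTimeUs(long frameIndex, int frameRate) {
        return frameIndex * 1_000_000L / frameRate;
    }

    public static void main(String[] args) {
        // At 30 fps, frame 30 lands at exactly one second.
        System.out.println(computePresentationTimeUs(30, 30)); // prints 1000000
        System.out.println(computePresentationTimeUs(1, 30));  // prints 33333
    }
}
```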

How to use Android MediaCodec encode Camera data(YUV420sp)

I have solved the problem, as follows:

private synchronized void encode(byte[] data)
{
    // Re-fetch the buffer arrays on every call -- this was the fix.
    inputBuffers = mMediaCodec.getInputBuffers();
    outputBuffers = mMediaCodec.getOutputBuffers();

    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    Log.i(TAG, "inputBufferIndex-->" + inputBufferIndex);
    //......

Next, you will find that your encoded video's color is not right; for more information, see MediaCodec and Camera: colorspaces don't match above.

Android 4.1 MediaCodec supported resolutions

  1. Try the CamcorderProfile API. Configurations exactly matching one of the profiles it returns are likely to work.

  2. Clearly in your case there is some combination of resolution and other parameters that makes it not work. I wonder if it is the framerate? 25 is a bit odd; try 29.97 or 30, or maybe 15. I also wonder if it is the bitrate? It is much too low for the resolution; try 500 kbit/s.

Buffering Surface input to MediaCodec

You really don't want to copy the data at all. Allocating storage for and copying a large chunk of data can take long enough to kill your frame rate. This generally rules out byte[] and ByteBuffer[] solutions, even if you didn't have to do a U/V plane swap.

The most efficient way to move data through the system is with a Surface. The trick is that a Surface isn't a buffer, it's an interface to a queue of buffers. The buffers are passed around by reference; when you unlockCanvasAndPost() you're actually placing the current buffer onto a queue for the consumer, which is often in a different process.

There is no public mechanism for creating a new buffer and adding it to the set used by the queue, or for extracting buffers from the queue, so you can't implement a DIY buffering scheme on the side. There's no public interface to change the number of buffers in the pool.

It'd be useful to know what it is that's causing the hiccups. The Android tool for analyzing such issues is systrace, available in Android 4.1+ (docs, example, bigflake example). If you can identify the source of the CPU load, or determine that it's not CPU but rather some bit of code getting tangled up, you'll likely have a solution that's much easier than adding more buffers to Surface.


