Decoding Raw H264 Stream in Android

How to play raw h264 produced by MediaCodec encoder?

You should be able to play back a raw H264 stream (as you wrote, other raw .264 files play back just fine with VLC or ffplay), but you are missing the parameter sets. These are passed in two different ways, and you happen to be missing both. First, they are returned in the MediaFormat when you get MediaCodec.INFO_OUTPUT_FORMAT_CHANGED (which you don't handle; you just log a message about it). Second, they are returned in a buffer with MediaCodec.BUFFER_FLAG_CODEC_CONFIG set (which you ignore by setting the size to 0). The simplest solution here is to remove the special-case handling of MediaCodec.BUFFER_FLAG_CODEC_CONFIG, and it should all work just fine.
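
Roughly, the encoder drain loop with that special case removed would look something like the sketch below. It assumes `encoder` is your MediaCodec encoder and `out` is the FileOutputStream for the raw .h264 file; handling of the negative status codes is left out for brevity:

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int index = encoder.dequeueOutputBuffer(info, 10000);
    while (index >= 0) {
        ByteBuffer encoded = encoder.getOutputBuffer(index);
        encoded.position(info.offset);
        encoded.limit(info.offset + info.size);
        byte[] chunk = new byte[info.size];
        encoded.get(chunk);
        // No special case for BUFFER_FLAG_CODEC_CONFIG: that buffer carries the
        // SPS/PPS, and writing it like any other buffer puts the parameter sets
        // at the front of the stream where a decoder expects them.
        out.write(chunk);
        encoder.releaseOutputBuffer(index, false);
        index = encoder.dequeueOutputBuffer(info, 10000);
    }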

The code you based yours on does it this way in order to exercise all the different approaches; in the case you copied from, the parameter sets were carried in the MediaFormat from MediaCodec.INFO_OUTPUT_FORMAT_CHANGED. If you wanted to use that approach with a raw H264 bytestream, you could write the byte buffers with the keys csd-0 and csd-1 from that MediaFormat into the stream and keep ignoring the buffers with MediaCodec.BUFFER_FLAG_CODEC_CONFIG set.
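
For reference, a rough sketch of that alternative (again assuming `encoder` and `out` from above; on typical devices the csd buffers from an AVC encoder already include the 00 00 00 01 start codes):

    // In the output loop, when dequeueOutputBuffer() returns INFO_OUTPUT_FORMAT_CHANGED:
    MediaFormat format = encoder.getOutputFormat();
    ByteBuffer sps = format.getByteBuffer("csd-0");   // SPS NAL unit
    ByteBuffer pps = format.getByteBuffer("csd-1");   // PPS NAL unit
    byte[] spsBytes = new byte[sps.remaining()];
    byte[] ppsBytes = new byte[pps.remaining()];
    sps.get(spsBytes);
    pps.get(ppsBytes);
    out.write(spsBytes);   // written once, at the very start of the .h264 file
    out.write(ppsBytes);
    // ...and in this variant you keep dropping the buffers flagged
    // MediaCodec.BUFFER_FLAG_CODEC_CONFIG, since the stream already has the parameter sets.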

Decoding raw h264 stream with MediaCodec results in black surface

So the only problem was that dequeueOutputBuffer() may still return -3, a.k.a. MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED, which is marked as deprecated. Very nice. By not handling this return value, or more specifically by passing that constant straight into getOutputBuffer() as if it were a buffer index, the codec throws an error -> black screen.
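
For anyone hitting the same thing, the output loop has to treat the negative status codes as status codes, not buffer indices. A minimal sketch (assuming `decoder` is the MediaCodec rendering to a Surface):

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int index = decoder.dequeueOutputBuffer(info, 10000);
    if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
        // no output available yet, try again later
    } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat newFormat = decoder.getOutputFormat();  // e.g. width/height, color format
    } else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        // deprecated, but can still show up; nothing to do when using getOutputBuffer(int)
    } else if (index >= 0) {
        // a real buffer index: render the frame to the Surface and release it
        decoder.releaseOutputBuffer(index, true /* render */);
    }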

Edit:
Oh, and apparently the whole NAL stuff isn't needed either, even though the API documentation states that the SPS and PPS NALUs have to be provided before start. I marked the part that can be left out in my question.

Decoding h264 ByteStream on Android

I see a few problems...

(1) You're attempting to replace a buffer in the input buffer array. MediaCodec doesn't work like this -- the framework provides the buffers, and you copy the data into them. The idea is that, by allowing the framework to do the allocation, it can potentially avoid copying the data later on.

You need to get the array of input buffers from decoder.getInputBuffers(), and use those. Make sure to clear() the ByteBuffer to reset the position and limit each time.
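
Something along these lines (a sketch; `decoder` is the configured MediaCodec, `packet` is your byte[] of input data, and `presentationTimeUs` is its timestamp):

    ByteBuffer[] inputBuffers = decoder.getInputBuffers();
    int inputIndex = decoder.dequeueInputBuffer(10000);
    if (inputIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputIndex];
        inputBuffer.clear();                  // reset position and limit before reuse
        inputBuffer.put(packet);              // copy your data into the framework's buffer
        decoder.queueInputBuffer(inputIndex, 0, packet.length, presentationTimeUs, 0);
    }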

(2) You're writing a single packet of data and expecting a frame of output data. In practice, you may need to supply multiple buffers of data before the first frame is generated. See this post for an example. In some profiles the encoder is allowed to reorder frames, so even after the decoder starts going you can't just feed a frame and wait for decoded data to pop out the other side.

(3) The AVC decoder needs the SPS/PPS data, which you can provide via a buffer with the BUFFER_FLAG_CODEC_CONFIG flag set, or by adding the data with "csd-0" / "csd-1" keys to the MediaFormat using MediaFormat#setByteBuffer(). Examples of both approaches can be found in EncodeDecodeTest.
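
The MediaFormat route looks roughly like this (assuming `sps` and `pps` are byte arrays holding the raw NAL units, start codes included, and `surface` is your output Surface):

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
    format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));   // SPS
    format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));   // PPS
    decoder.configure(format, surface, null, 0);
    decoder.start();
    // Alternatively, skip csd-0/csd-1 and queue the SPS/PPS as the very first
    // input buffer with MediaCodec.BUFFER_FLAG_CODEC_CONFIG set.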

There are a number of AVC decoding examples on bigflake, but the data source is the MediaCodec encoder, so they generally get point #3 for free.

This posting may be useful for you.

For displaying the frames, you can see different approaches in Grafika (which generally works with .mp4 files, so the encode/decode implementation there isn't as relevant).


