Encoding H.264 from camera with Android MediaCodec

For your fast-playback / frame-rate issue, there is nothing you have to fix in the encoder itself. Since this is a streaming solution, the receiving side either has to be told the frame rate in advance or has to receive timestamps with each frame; neither is part of an H.264 elementary stream. So you either agree on a pre-determined frame rate, pass it out of band (e.g. via SDP), or use an existing protocol such as RTSP. In the latter case the timestamps travel with the stream, typically as part of RTP; the client then depayloads the RTP stream and plays it back. That is how elementary-stream streaming works. In short: either fix your frame rate (if you have a fixed-rate encoder) or send timestamps.
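
If your encoder takes buffer input rather than a Surface, the per-frame timestamps come from whatever you pass as presentationTimeUs when queuing frames. A minimal sketch, assuming a fixed 30 fps and an already-configured MediaCodec named encoder (both names are assumptions, not from the original code):

    // Sketch only: encoder, yuvFrame and frameIndex are assumed to exist.
    long ptsUs = frameIndex * 1_000_000L / 30;   // microseconds since stream start
    int inIndex = encoder.dequeueInputBuffer(10_000);
    if (inIndex >= 0) {
        ByteBuffer in = encoder.getInputBuffer(inIndex); // API 21+
        in.clear();
        in.put(yuvFrame);  // one raw camera frame in the encoder's color format
        encoder.queueInputBuffer(inIndex, 0, yuvFrame.length, ptsUs, 0);
        frameIndex++;
    }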

Local PC playback will be fast because the player has no way to know the fps. You can control the playback rate on the PC by giving an fps parameter before the input, e.g.

ffplay -fps 30 in.264

(on newer FFmpeg builds the equivalent option is -framerate).

As for the file not being playable: does it contain SPS and PPS NAL units? You should also have NAL start codes enabled, i.e. Annex-B format. I don't know much about Android, but this is a requirement for any H.264 elementary stream that is dumped to a file without a container and played back later.
If Android's default output target is MP4, the Annex-B headers are likely switched off by default, so look for a switch to enable them. Or, if you are getting the data frame by frame, just add the start codes yourself.
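
If you drain the encoder yourself, a minimal sketch of dumping a playable Annex-B stream, assuming an already-running MediaCodec called encoder and an open FileOutputStream (both hypothetical names): Android's H.264 encoders emit Annex-B NAL units, and the SPS/PPS arrive once in an output buffer flagged BUFFER_FLAG_CODEC_CONFIG, so writing every output buffer to the file in order is enough.

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outIndex = encoder.dequeueOutputBuffer(info, 10_000);
    if (outIndex >= 0) {
        ByteBuffer out = encoder.getOutputBuffer(outIndex);
        byte[] chunk = new byte[info.size];
        out.position(info.offset);
        out.get(chunk);
        // The CODEC_CONFIG buffer holds SPS/PPS; for a raw dump it is
        // written to the file just like every ordinary frame buffer.
        fileOutputStream.write(chunk);
        encoder.releaseOutputBuffer(outIndex, false);
    }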

As for the color format: I would guess the default should work, so try not setting it.
If that fails, try 4:2:2 planar or UYVY/VYUY interleaved formats; cameras are usually one of those (not necessarily, but these are the ones I have encountered most often).
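
Rather than guessing, you can also ask the device which raw input formats its AVC encoders actually accept. A short sketch using MediaCodecList:

    // List the raw color formats every H.264 (video/avc) encoder on the
    // device accepts, instead of guessing a default.
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo ci : list.getCodecInfos()) {
        if (!ci.isEncoder()) continue;
        for (String type : ci.getSupportedTypes()) {
            if (!type.equalsIgnoreCase("video/avc")) continue;
            MediaCodecInfo.CodecCapabilities caps = ci.getCapabilitiesForType(type);
            for (int fmt : caps.colorFormats) {
                Log.i("Codecs", ci.getName() + " accepts color format " + fmt);
            }
        }
    }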

Android Camera2 pipeline: How do I encode H.264 units using MediaCodec from an input Surface?

You don't need MediaExtractor - that's for processing a complete container file and splitting out its various streams and other components.

The MediaCodec receives the raw image buffers from the camera directly, and will output encoded buffers. If you want to save a standard video file, you'll need to feed those encoded ByteBuffers into a MediaMuxer instance. If you're just sending the encoded buffers elsewhere for decode (like for a video chat application), you can just feed the ByteBuffers to a MediaCodec at your destination.
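
A minimal sketch of that drain loop, assuming an encoder already configured with CONFIGURE_FLAG_ENCODE whose input Surface is fed by the Camera2 session (the output path and the stop condition are placeholders):

    // MediaMuxer's constructor throws IOException; handle it in real code.
    MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
            MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int track = -1;
    boolean muxing = false;
    while (!done) {                                  // 'done' is set elsewhere
        int index = encoder.dequeueOutputBuffer(info, 10_000);
        if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // The output format carries csd-0/csd-1 (SPS/PPS);
            // the track must be added before any samples are written.
            track = muxer.addTrack(encoder.getOutputFormat());
            muxer.start();
            muxing = true;
        } else if (index >= 0) {
            if (muxing && (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
                muxer.writeSampleData(track, encoder.getOutputBuffer(index), info);
            }
            encoder.releaseOutputBuffer(index, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
        }
    }
    muxer.stop();
    muxer.release();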

I can't speak to whether all your parameters to MediaCodec are correct, but I don't see anything obviously wrong.

Android MediaCodec: encode H.264 and decode on different platforms (Android, iOS, Web)

Yes, the input color type shouldn't matter. Even if you use COLOR_FormatSurface, the output is still encoded as normal YUV 4:2:0 video, just as if you had used one of the other, non-Surface color formats.

If you manually chose a more uncommon color format like YUV 4:4:4 (when not using Surface input), the encoder could either downsample it to YUV 4:2:0 (which all decoders support) or keep it as is and encode one of the more uncommon variants of H.264, which not all decoders may be able to handle.
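
For reference, a sketch of configuring a Surface-input AVC encoder (the resolution and bitrate values are placeholders); the output is ordinary 4:2:0 H.264 either way:

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc"); // throws IOException
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = encoder.createInputSurface(); // hand this to the camera session
    encoder.start();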


