Stream image from Android with FFMPEG

Question:

I’m currently receiving images from an external source as a byte array, and I would like to send them as raw video via FFmpeg to a stream URL, where I have an RTSP server that receives RTSP streams (a similar unanswered question). However, I haven’t worked with FFmpeg in Java, so I can’t find an example of how to do it. I have a callback that copies the image bytes to a byte array as follows:

public class MainActivity extends Activity {
    final String rtmp_url = "rtmp://192.168.0.12:1935/live/test";
    private int PREVIEW_WIDTH = 384;
    private int PREVIEW_HEIGHT = 292;
    private String TAG = "MainActivity";
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    final String command[] = {ffmpeg,
                    "-y",  //Add "-re" for simulated realtime streaming.
                    "-f", "rawvideo",
                    "-vcodec", "rawvideo",
                    "-pix_fmt", "bgr24",
                    "-s", (Integer.toString(PREVIEW_WIDTH) + "x" + Integer.toString(PREVIEW_HEIGHT)),
                    "-r", "10",
                    "-i", "pipe:",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    rtmp_url};

    private UVCCamera mUVCCamera;

public void handleStartPreview(Object surface) throws InterruptedException, IOException {
    Log.e(TAG, "handleStartPreview:mUVCCamera" + mUVCCamera + " mIsPreviewing:");
    if (mUVCCamera == null) return;
    Log.e(TAG, "handleStartPreview2 ");
    try {
        mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, 0, UVCCamera.DEFAULT_BANDWIDTH, 0);
        Log.e(TAG, "handleStartPreview3 mWidth: " + mWidth + "mHeight:" + mHeight);
    } catch (IllegalArgumentException e) {
        try {
            // fallback to YUV mode
            mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, UVCCamera.DEFAULT_PREVIEW_MODE, UVCCamera.DEFAULT_BANDWIDTH, 0);
            Log.e(TAG, "handleStartPreview4");
        } catch (IllegalArgumentException e1) {
            callOnError(e1);
            return;
        }
    }
    Log.e(TAG, "handleStartPreview: startPreview1");
    int result = mUVCCamera.startPreview();
    mUVCCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_RGBX);
    mUVCCamera.startCapture();
    Toast.makeText(MainActivity.this,"Camera Started",Toast.LENGTH_SHORT).show();
    ProcessBuilder pb = new ProcessBuilder(command);
    pb.redirectErrorStream(true);
    Process process = pb.start();
    BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
    OutputStream writer = process.getOutputStream();
    byte img[] = new byte[192*108*3];
    for (int i = 0; i < 10; i++)
    {
        for (int y = 0; y < 108; y++)
        {
            for (int x = 0; x < 192; x++)
            {
                byte r = (byte)((x * y + i) % 255);
                byte g = (byte)((x * y + i*10) % 255);
                byte b = (byte)((x * y + i*20) % 255);
                img[(y*192 + x)*3] = b;
                img[(y*192 + x)*3+1] = g;
                img[(y*192 + x)*3+2] = r;
            }
        }

        writer.write(img);
    }

    writer.close();
    String line;
    while ((line = reader.readLine()) != null)
    {
        System.out.println(line);
    }

    process.waitFor();
}
public static void buildRawFrame(Mat img, int i)
{
    int p = img.cols() / 60;
    img.setTo(new Scalar(60, 60, 60));
    String text = Integer.toString(i+1);
    int font = Imgproc.FONT_HERSHEY_SIMPLEX;
    Point pos = new Point(img.cols()/2-p*10*(text.length()), img.rows()/2+p*10);
    Imgproc.putText(img, text, pos, font, p, new Scalar(255, 30, 30), p*2);  //Blue number
}

Additionally: Android Camera Capture using FFmpeg

uses FFmpeg to capture frames from the native Android camera, but instead of pushing them via RTMP it generates a video file as output. However, it does not explain how the image data was passed to FFmpeg.

frameData is my byte array, and I’d like to know how I can write the necessary FFmpeg commands using ProcessBuilder to send an image via RTSP to a given URL.

An example of what I am trying to do: in Python 3 I can do it easily like this:

import cv2
import numpy as np
import socket
import sys
import pickle
import struct
import subprocess

fps = 25
width = 224
height = 224
rtmp_url = 'rtmp://192.168.0.13:1935/live/test'

command = ['ffmpeg',
           '-y',
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(width, height),
           '-r', str(fps),
           '-i', '-',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           '-f', 'flv',
           rtmp_url]

p = subprocess.Popen(command, stdin=subprocess.PIPE)

while True:
    frame = np.random.randint(255, size=(224, 224, 3))
    frame = frame.astype(np.uint8)
    p.stdin.write(frame.tobytes())

I would like to do the same thing in Android.

Update: I can reproduce @Rotem’s answer in NetBeans; however, on Android I get a NullPointerException when trying to execute pb.start():

    Process: com.infiRay.XthermMini, PID: 32089
    java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
        at com.infiRay.XthermMini.MainActivity.handleStartPreview(MainActivity.java:512)
        at com.infiRay.XthermMini.MainActivity.startPreview(MainActivity.java:563)
        at com.infiRay.XthermMini.MainActivity.access$1000(MainActivity.java:49)
        at com.infiRay.XthermMini.MainActivity$3.onConnect(MainActivity.java:316)
        at com.serenegiant.usb.USBMonitor$3.run(USBMonitor.java:620)
        at android.os.Handler.handleCallback(Handler.java:938)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loopOnce(Looper.java:226)
        at android.os.Looper.loop(Looper.java:313)
        at android.os.HandlerThread.run(HandlerThread.java:67)
2022-06-02 11:47:20.300 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.308 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.312 32089-32089/com.infiRay.XthermMini E/MainActivity: onPause:
2022-06-02 11:47:20.314 32089-32581/com.infiRay.XthermMini I/Process: Sending signal. PID: 32089 SIG: 9
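
A possible cause (an assumption, not confirmed by the question): ProcessBuilder.start() is documented to throw a NullPointerException if any element of the command list is null, which would happen here if Loader.load(...) did not resolve an ffmpeg executable on the device. A minimal guard sketch:

    // Hypothetical guard: ProcessBuilder.start() throws NullPointerException
    // when an element of the command list is null, so verify the ffmpeg path
    // resolved before starting the sub-process.
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    if (ffmpeg == null) {
        Log.e(TAG, "No ffmpeg executable could be resolved on this device");
        return;
    }
    ProcessBuilder pb = new ProcessBuilder(command);
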
Asked By: xnok


Answers:

Here is a Java implementation that resembles the Python code:

The example writes raw video frames (byte arrays) to stdin pipe of FFmpeg sub-process:

 _____________             ___________                  ________ 
| JAVA byte   |           |           |                |        |
| Array       |   stdin   | FFmpeg    |                | Output |
| BGR (format)| --------> | process   | -------------> | stream |
|_____________| raw frame |___________| encoded video  |________|

Main stages:

  • Initialize FFmpeg command arguments:

     final String command[] = {"ffmpeg", "-f", "rawvideo", ...}
    
  • Create ProcessBuilder that executes FFmpeg as a sub-process:

     ProcessBuilder pb = new ProcessBuilder(command);
    
  • Redirect stderr into stdout (required for reading FFmpeg’s log messages; if the output pipe is never drained, the sub-process eventually blocks):

     pb.redirectErrorStream(true);
    
  • Start FFmpeg sub-process, and create BufferedReader:

     Process process = pb.start();
     BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
    
  • Create OutputStream for writing to stdin pipe of FFmpeg sub-process:

     OutputStream writer = process.getOutputStream();
    
  • Write raw video frames to stdin pipe of FFmpeg sub-process in a loop:

     byte img[] = new byte[width*height*3];
    
     for (int i = 0; i < n_frames; i++)
     {
         //Fill img with pixel data
         ...
         writer.write(img);
     }
    
  • Close stdin, read and print stderr content, and wait for sub-process to finish:

     writer.close();
    
     String line;
     while ((line = reader.readLine()) != null)
     {
         System.out.println(line);
     }        
    
     process.waitFor();
    

Code sample:
The following code sample writes 10 raw video frames with size 192×108 to FFmpeg.
Instead of streaming to RTMP, we write the result to a test.flv file (for testing).
The example uses hard coded strings and numbers (for simplicity).

Note:
The code sample assumes the FFmpeg executable is on the execution path.
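
On Android there is no system-wide ffmpeg binary on the path, so it has to be located explicitly. One option (the approach the question’s code already uses) is to resolve a bundled executable with the JavaCPP Loader; a minimal sketch, assuming the org.bytedeco:ffmpeg-platform artifact is available:

    // Sketch: resolve a bundled ffmpeg executable via the JavaCPP presets and
    // use its absolute path as the first command element instead of "ffmpeg".
    String ffmpegPath = org.bytedeco.javacpp.Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    final String command[] = {ffmpegPath, "-y", "-f", "rawvideo", /* ... */};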

package myproject;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;

public class FfmpegVideoWriter {
    public static void main(String[] args) throws IOException, InterruptedException {
        final String rtmp_url = "test.flv"; //Set output file (instead of output URL) for testing.
        
        final String command[] = {"ffmpeg",
                                  "-y",  //Add "-re" for simulated readtime streaming.
                                  "-f", "rawvideo",
                                  "-vcodec", "rawvideo",
                                  "-pix_fmt", "bgr24",
                                  "-s", "192x108",
                                  "-r", "10",
                                  "-i", "pipe:",
                                  "-c:v", "libx264",
                                  "-pix_fmt", "yuv420p",
                                  "-preset", "ultrafast",
                                  "-f", "flv",
                                  rtmp_url};
        
        //https://stackoverflow.com/questions/5483830/process-waitfor-never-returns
        ProcessBuilder pb = new ProcessBuilder(command);    //Create ProcessBuilder
        pb.redirectErrorStream(true); //Redirect stderr
        Process process = pb.start();               
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        
        //Create OutputStream for writing to stdin pipe of FFmpeg sub-process.
        OutputStream writer = process.getOutputStream();
        
        byte img[] = new byte[192*108*3];   //Dummy image 
        
        //Write 10 video frames to stdin pipe of FFmpeg sub-process
        for (int i = 0; i < 10; i++)
        {
            //Fill image with some arbitrary pixel values
            for (int y = 0; y < 108; y++)
            {
                for (int x = 0; x < 192; x++)
                {
                    //Arbitrary RGB values:
                    byte r = (byte)((x * y + i) % 255); //Red component
                    byte g = (byte)((x * y + i*10) % 255); //Green component
                    byte b = (byte)((x * y + i*20) % 255); //Blue component
                    img[(y*192 + x)*3] = b; 
                    img[(y*192 + x)*3+1] = g;
                    img[(y*192 + x)*3+2] = r;
                }
            }
            
            writer.write(img);  //Write img to FFmpeg
        }
        
        writer.close();  //Close stdin pipe.

        //Read and print stderr content
        //Note: there may be cases when FFmpeg keeps printing messages, so it may not be the best solution to empty the buffer only at the end.
        //We may consider adding an argument `-loglevel error` for reducing verbosity.
        String line;
        while ((line = reader.readLine()) != null)
        {
            System.out.println(line);
        }        
       
        process.waitFor();
    }
}
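
As the comment in the code notes, draining the output only after writing all frames can stall if FFmpeg keeps logging. An alternative (my suggestion, not part of the original answer) is to drain the merged output on a background thread while frames are being written; a minimal sketch, reusing the process variable from the sample (in that case, drop the read loop at the end and just call process.waitFor()):

    // Drain FFmpeg's merged stdout/stderr concurrently so the sub-process
    // never blocks on a full pipe while frames are still being written.
    Thread logDrainer = new Thread(() -> {
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
    logDrainer.start();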

The code was tested on my PC (Windows 10), and I am not sure it’s going to work on Android…

The above sample is simplistic and generic; in your case, you may use the rgba pixel format and write FrameData inside the onFrame method, as sketched below.
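
A hypothetical sketch of that adaptation (it presumes UVCCamera’s PIXEL_FORMAT_RGBX frames, an FFmpeg command with -pix_fmt rgba and -s matching the preview size, and the writer stream created above):

    // Hypothetical adaptation: copy each RGBX frame delivered by the UVCCamera
    // callback and push it into FFmpeg's stdin pipe. RGBX is 4 bytes per pixel,
    // so the command must use "-pix_fmt rgba" and the matching frame size.
    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @Override
        public void onFrame(ByteBuffer frame) {
            byte[] frameData = new byte[frame.remaining()];
            frame.get(frameData);
            try {
                writer.write(frameData); // writer = process.getOutputStream()
            } catch (IOException e) {
                Log.e(TAG, "Failed to write frame to FFmpeg", e);
            }
        }
    };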

Sample video frame ("arbitrary pixel values"):


Update:

The following code sample uses OpenCV’s Java bindings to write Mat data to FFmpeg:

package myproject;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;

import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import org.opencv.core.Point;
import org.opencv.imgproc.Imgproc;

public class FfmpegVideoWriter {
    static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }
    
    //Build synthetic "raw BGR" image for testing
    public static void buildRawFrame(Mat img, int i)
    {
        int p = img.cols() / 60;    //Used as font size factor.
        img.setTo(new Scalar(60, 60, 60));  //Fill image with dark gray color
        String text = Integer.toString(i+1);
        int font = Imgproc.FONT_HERSHEY_SIMPLEX;
        Point pos = new Point(img.cols()/2-p*10*(text.length()), img.rows()/2+p*10);
        Imgproc.putText(img, text, pos, font, p, new Scalar(255, 30, 30), p*2);  //Blue number
    }
    
    public static void main(String[] args) throws IOException, InterruptedException {
        final int cols = 192;
        final int rows = 108;
        
        final String rtmp_url = "test.flv"; //Set output file (instead of output URL) for testing.
        
        final String command[] = {"ffmpeg",
                                  "-y",  //Add "-re" for simulated readtime streaming.
                                  "-f", "rawvideo",
                                  "-vcodec", "rawvideo",
                                  "-pix_fmt", "bgr24",
                                  "-s", (Integer.toString(cols) + "x" + Integer.toString(rows)),
                                  "-r", "10",
                                  "-i", "pipe:",
                                  "-c:v", "libx264",
                                  "-pix_fmt", "yuv420p",
                                  "-preset", "ultrafast",
                                  "-f", "flv",
                                  rtmp_url};
        
        //https://stackoverflow.com/questions/5483830/process-waitfor-never-returns
        ProcessBuilder pb = new ProcessBuilder(command);    //Create ProcessBuilder
        pb.redirectErrorStream(true); //Redirect stderr
        Process process = pb.start();               
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        
        //Create OutputStream for writing to stdin pipe of FFmpeg sub-process.
        OutputStream writer = process.getOutputStream();
        
        //Dummy image (BGR pixel format).
        Mat img = new Mat(rows, cols, CvType.CV_8UC3, Scalar.all(0));
        
        byte buffer[] = new byte[cols*rows*3]; //Byte array for storing img data    
        
        //Write 10 video frames to stdin pipe of FFmpeg sub-process
        for (int i = 0; i < 10; i++)
        {
            buildRawFrame(img, i); //Build image with blue frame counter.
                       
            img.get(0, 0, buffer); //Copy img data to buffer (not sure if this is the best solution).  
            
            writer.write(buffer); //Write buffer (raw video frame as byte array) to FFmpeg
        }
        
        writer.close(); //Close stdin pipe.

        //Read and print stderr content
        String line;
        while ((line = reader.readLine()) != null)
        {
            System.out.println(line);
        }        
       
        process.waitFor();
    }
}

Sample output frame:

Answered By: Rotem

I found using ffmpeg-kit for Android slightly more convenient than invoking the ffmpeg binary using ProcessBuilder.

The simplest way to pass arbitrary data (i.e. images as byte[]) to FFmpeg is to make use of named pipes.

The pipe is created in {app_data}/cache/pipes/fk_pipe_1 and can be accessed like any Unix file with a FileOutputStream.

Note that the ffmpeg command is only executed once data can be read from the pipe. More info can be found in the ffmpeg-kit wiki.

In some cases it may be easier to access the camera directly; this is supported for API level 24 and higher.
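
For example, a sketch based on ffmpeg-kit’s android_camera input device (the device index 0:0, the pixel format, and the outputPath variable are assumptions on my part):

    // Hypothetical sketch: let FFmpeg read directly from the device camera
    // (requires API level 24+), recording five seconds of video to outputPath.
    FFmpegKit.executeAsync("-f android_camera -i 0:0 -r 30 -pixel_format bgr0 "
            + "-t 00:00:05 " + outputPath, session -> {
        Log.d(TAG, "Camera capture finished, rc=" + session.getReturnCode());
    });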

String rtmp_url = "rtp://127.0.0.1:9000";

String pipe1 = FFmpegKitConfig.registerNewFFmpegPipe(context);

FFmpegKit.executeAsync("-re -f rawvideo -pixel_format bgr24 -video_size 640x480 -i " + pipe1 + " -f rtp_mpegts" + " " + rtmp_url, new FFmpegSessionCompleteCallback() {

    @Override
    public void apply(FFmpegSession session) {
        SessionState state = session.getState();
        ReturnCode returnCode = session.getReturnCode();
        // CALLED WHEN SESSION IS EXECUTED
        Log.d(TAG, String.format("FFmpeg process exited with state %s and rc %s.%s", state, returnCode, session.getFailStackTrace()));
    }
}, new LogCallback() {

    @Override
    public void apply(com.arthenica.ffmpegkit.Log log) {
        // CALLED WHEN SESSION PRINTS LOGS
    }
}, new StatisticsCallback() {

    @Override
    public void apply(Statistics statistics) {
        // CALLED WHEN SESSION GENERATES STATISTICS
    }
});

byte img[] = new byte[640*480*3];   // dummy image
FileOutputStream out = new FileOutputStream(pipe1);
try {
    for (int i=0; i<100; i++) { // write 100 empty frames
        out.write(img);
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    out.close();
}
Answered By: volzo