Tuesday, March 3, 2020

Capturing frames using the DJI UX SDK

There are situations in which it is necessary to capture frames in order to run some analysis or processing on them.

DJI provides examples that perform this task, such as the Android Video Stream Decoding Sample.

In practice, many developers prefer to use the UX SDK, since it speeds up development with a set of useful widgets.

The basic example of UX SDK usage is the Android-UXSDKDemo. I recommend downloading it and installing it on your phone to get familiar with how the original demo works.

Here I will present a way to extract frames starting from this example.

My environment:

Android Studio 3.3
Build #AI-182.5107.16.33.5199772, built on December 25, 2018
JRE: 1.8.0_152-release-1248-b01 amd64
JVM: OpenJDK 64-Bit Server VM by JetBrains s.r.o
Windows 10 10.0

Complete code on GitHub:
Without OpenCV: https://github.com/Marchanjo/UXSDKDemo-CaptureFrames
With OpenCV: https://github.com/Marchanjo/UXSDKDemo-OpenCV

Step 1 - Add OpenCV

a)
In Android Studio: File > New > Import Module, selecting:
C:\opencv-3.4.3\OpenCV-android-sdk\sdk\java

b)
Add to the app's build.gradle:
implementation project(':openCVLibrary343')
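
For reference, a minimal sketch of how the app's dependencies block might end up after this step (the DJI artifact names and versions below are assumptions based on a typical UXSDKDemo setup):

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'com.dji:dji-uxsdk:4.11'          //version is an assumption
    compileOnly 'com.dji:dji-sdk-provided:4.11'      //version is an assumption
    implementation project(':openCVLibrary343')
}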


c)
Add the native libraries.
From: C:\opencv-3.4.3\OpenCV-android-sdk\sdk\native\libs
To: C:\Archanjo\GitHub\UXSDKDemo-OpenCV\UXSDKDemo\app\src\main
renaming the libs folder to jniLibs
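
After copying, the project should contain roughly this structure (file names follow the OpenCV 3.4.x convention; keep only the ABIs you need):

app/src/main/jniLibs/
    arm64-v8a/libopencv_java3.so
    armeabi-v7a/libopencv_java3.so
    x86/libopencv_java3.so
    x86_64/libopencv_java3.so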

d) Adjust the build.gradle of the openCVLibrary343 module:

    compileSdkVersion 28
    buildToolsVersion "28.0.3"

    defaultConfig {
        minSdkVersion 8
        targetSdkVersion 28
        ...
    }

e) In the AndroidManifest.xml of openCVLibrary343,
remove the minSdkVersion line.
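
In OpenCV 3.4.x this is a uses-sdk element, something like the line below (the exact attributes may differ in your copy):

<uses-sdk android:minSdkVersion="8" android:targetSdkVersion="21" />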


Step 2 - Layout adjustments


We will create a TextureView, which must not be used at the same time as the FPVWidget: both would be consuming the video frames, and the extra processing disrupts playback. Since we will not be using the FPVWidget, the camera controls no longer make sense and can be removed as well.

Remove the conflicting widgets.

In activity_main.xml, comment out:

dji.ux.widget.FPVWidget




Add a TextureView to the RelativeLayout (in the same section where dji.ux.widget.FPVWidget was commented out):

    <TextureView
        android:id="@+id/livestream_preview"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_centerInParent="true"
        android:layout_gravity="center"
        android:alpha="50"
        android:visibility="visible"/>

       


Step 3 - Add the class VideoDecodingApplication.java with this content:

package com.dji.uxsdkdemo;

import android.app.Application;
import android.content.Context;

import dji.sdk.base.BaseProduct;
import dji.sdk.sdkmanager.DJISDKManager;

public class VideoDecodingApplication extends Application {

    private static BaseProduct mProduct;

    public static synchronized BaseProduct getProductInstance() {
        if (null == mProduct) {
            mProduct = DJISDKManager.getInstance().getProduct();
        }
        return mProduct;
    }

    public static synchronized void updateProduct(BaseProduct product) {
        mProduct = product;
    }

    @Override
    protected void attachBaseContext(Context base) {
        super.attachBaseContext(base);
        //Required by the DJI Mobile SDK: loads the SDK's classes before anything else runs
        com.secneo.sdk.Helper.install(VideoDecodingApplication.this);
    }
}
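
Note that an Application subclass only takes effect if it is registered in the app's AndroidManifest.xml. If the demo already declares its own Application class, either merge this code into it or point android:name at the new class; a sketch of the relevant attribute (the other attributes stay unchanged):

<application
    android:name="com.dji.uxsdkdemo.VideoDecodingApplication"
    ... >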

Step 4 - Add the class CaptureFrame.java with this content:



package com.dji.uxsdkdemo;

import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.SurfaceTexture;
import android.graphics.YuvImage;
import android.os.Environment;
import android.util.Log;
import android.view.TextureView;
import android.view.View;
import android.widget.ImageButton;
import android.widget.Toast;

import org.opencv.core.Mat;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

import dji.common.camera.SettingsDefinitions;
import dji.common.error.DJIError;
import dji.common.product.Model;
import dji.common.util.CommonCallbacks;
import dji.sdk.base.BaseProduct;
import dji.sdk.camera.Camera;
import dji.sdk.camera.VideoFeeder;
import dji.sdk.codec.DJICodecManager;
import dji.thirdparty.afinal.core.AsyncTask;

import static org.opencv.core.CvType.CV_8UC1;
import static org.opencv.core.CvType.CV_8UC4;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.CvType;
import org.opencv.core.Size;

import static org.opencv.imgproc.Imgproc.cvtColor;

public class CaptureFrame {
    private static final String TAG = MainActivity.class.getName();
    private DJICodecManager mCodecManager;//Marcelo
    private VideoFeeder.VideoFeed standardVideoFeeder;//Marcelo
    protected VideoFeeder.VideoDataListener mReceivedVideoDataListener = null;//Marcelo
    private Camera mDroneCamera;//Marcelo
    private TextureView videostreamPreviewTtView;//Marcelo
    private int videoViewWidth;//Marcelo
    private int videoViewHeight;//Marcelo
    private ImageButton screenShot;//Marcelo
    private int  count;//Marcelo
    private Context appContext;

    public CaptureFrame(Context appContext, TextureView videostreamPreviewTtView) {
        this.appContext = appContext;
        this.videostreamPreviewTtView = videostreamPreviewTtView;
        videostreamPreviewTtView.setVisibility(View.VISIBLE);
        openCVStart();
    }


    public CaptureFrame(Context appContext,ImageButton screenShot, TextureView videostreamPreviewTtView) {
        this.appContext = appContext;
        this.screenShot = screenShot;
        screenShot.setSelected(false);
        screenShot.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                handleYUVClick();//Captures 1 frame out of every 30
                //handleYUVClickSingleFrame();//Captures a single frame only
            }
        });

        this.videostreamPreviewTtView = videostreamPreviewTtView;
        videostreamPreviewTtView.setVisibility(View.VISIBLE);
        openCVStart();
    }

    public void openCVStart() {
        // Create the callback here rather than in a field initializer, so that
        // appContext has already been set by the constructor.
        mLoaderCallback = new BaseLoaderCallback(appContext) {
            @Override
            public void onManagerConnected(int status) {
                switch (status) {
                    case LoaderCallbackInterface.SUCCESS: {
                        Log.i(TAG, "OpenCV loaded successfully");
                    }
                    break;
                    default: {
                        super.onManagerConnected(status);
                    }
                    break;
                }
            }
        };

        if (!OpenCVLoader.initDebug()) {
            Log.d(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
            OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_4_0, appContext, mLoaderCallback);
        } else {
            Log.d(TAG, "OpenCV library found inside package. Using it!");
            mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
        }
    }

    private BaseLoaderCallback mLoaderCallback;

    public void onPause() {
        if (mDroneCamera != null) {
            if (VideoFeeder.getInstance().getPrimaryVideoFeed() != null) {
                VideoFeeder.getInstance().getPrimaryVideoFeed().removeVideoDataListener(mReceivedVideoDataListener);
            }
            if (standardVideoFeeder != null) {
                standardVideoFeeder.removeVideoDataListener(mReceivedVideoDataListener);
            }
        }
    }

    public void onDestroy() {
        if (mCodecManager != null) {
            mCodecManager.cleanSurface();
            mCodecManager.destroyCodec();
        }
    }

    public void onResume() {
        initSurfaceOrTextureView();
        notifyStatusChange();
    }


    private void showToast(String s) {
        Toast.makeText(videostreamPreviewTtView.getContext(), s, Toast.LENGTH_SHORT).show();
    }

    private long lastupdate;

    private void notifyStatusChange() {
        final BaseProduct product = VideoDecodingApplication.getProductInstance();
        Log.d(TAG, "notifyStatusChange: " + (product == null ? "Disconnect" : (product.getModel() == null ? "null model" : product.getModel().name())));

        if (product != null && product.isConnected() && product.getModel() != null) {
            showToast(product.getModel().name() + " Connected ");
        } else {
            showToast("Disconnected");
        }

        // The callback for receiving the raw H264 video data for camera live view
        mReceivedVideoDataListener = new VideoFeeder.VideoDataListener() {

            @Override
            public void onReceive(byte[] videoBuffer, int size) {
                if (System.currentTimeMillis() - lastupdate > 1000) {
                    Log.d(TAG, "camera recv video data size: " + size);
                    lastupdate = System.currentTimeMillis();
                }

                if (mCodecManager != null) {
                    mCodecManager.sendDataToDecoder(videoBuffer, size);

                }

            }
        };

        if (null == product || !product.isConnected()) {
            mDroneCamera = null;
            showToast("Disconnected");
        } else {
            if (!product.getModel().equals(Model.UNKNOWN_AIRCRAFT)) {
                mDroneCamera = product.getCamera();
                mDroneCamera.setMode(SettingsDefinitions.CameraMode.SHOOT_PHOTO, new CommonCallbacks.CompletionCallback() {
                    @Override
                    public void onResult(DJIError djiError) {
                        if (djiError != null) {
                            showToast("can't change mode of camera, error:" + djiError.getDescription());
                        }
                    }
                });


                if (VideoFeeder.getInstance().getPrimaryVideoFeed() != null) {
                    VideoFeeder.getInstance().getPrimaryVideoFeed().addVideoDataListener(mReceivedVideoDataListener);
                }

            }
        }
    }

    private void initSurfaceOrTextureView() {//Marcelo
        initPreviewerTextureView();
    }

    /**
     * Init a texture view for the codec manager, so that the raw video data
     * coming from the camera can be received and decoded
     */
    private void initPreviewerTextureView() {
        videostreamPreviewTtView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
            @Override
            public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
                Log.d(TAG, "real onSurfaceTextureAvailable");
                videoViewWidth = width;
                videoViewHeight = height;
                Log.d(TAG, "real onSurfaceTextureAvailable: width " + videoViewWidth + " height " + videoViewHeight);
                if (mCodecManager == null) {
                    mCodecManager = new DJICodecManager(videostreamPreviewTtView.getContext(), surface, width, height);
                }
            }

            @Override
            public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
                videoViewWidth = width;
                videoViewHeight = height;
                Log.d(TAG, "real onSurfaceTextureAvailable2: width " + videoViewWidth + " height " + videoViewHeight);
            }

            @Override
            public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
                if (mCodecManager != null) {
                    mCodecManager.cleanSurface();
                }
                return false;
            }

            @Override
            public void onSurfaceTextureUpdated(SurfaceTexture surface) {

            }
        });
    }


 //Captures 1 frame out of every 30 frames - works OK
    private void handleYUVClick() {
        if (screenShot.isSelected()) {
            showToast("Stop Capturing Frames ");
            screenShot.setImageResource(R.drawable.ic_burst_mode);//the two icons used here are drawables you must add to the project
//            screenShot.setText("Screen Shot");
            screenShot.setSelected(false);
            mCodecManager.enabledYuvData(false);
            mCodecManager.setYuvDataCallback(null);
        } else {//Start capturing frames
            showToast("Capturing Frames ");
            screenShot.setImageResource(R.drawable.ic_action_playback_stop);
//            screenShot.setText("Live Stream");
            screenShot.setSelected(true);
            mCodecManager.enabledYuvData(true);
            mCodecManager.setYuvDataCallback(new DJICodecManager.YuvDataCallback() {
                @Override
                public void onYuvDataReceived(final ByteBuffer yuvFrame, int dataSize, final int width, final int height) {
                    //In this demo, we test the YUV data by saving it into JPG files.
                    //DJILog.d(TAG, "onYuvDataReceived " + dataSize);
                    if (count++ % 30 == 0 && yuvFrame != null) {
                        final byte[] bytes = new byte[dataSize];
                        yuvFrame.get(bytes);
                        Log.i(TAG, "SaveFrame: " + count);
                        AsyncTask.execute(new Runnable() {
                            @Override
                            public void run() {
                                saveYuvDataToJPEG(bytes, width, height);
                            }
                        });
                    }
                }
            });
        }
    }

//Captures a single frame
    public void handleYUVClickSingleFrame() {
            showToast("Frame Captured");
            mCodecManager.enabledYuvData(true);
        Log.i(TAG, "SaveFrame01");
            mCodecManager.setYuvDataCallback(new DJICodecManager.YuvDataCallback() {
                @Override
                public void onYuvDataReceived(final ByteBuffer yuvFrame, int dataSize, final int width, final int height) {
                    if (count++ == 30 && yuvFrame != null){
                        Log.i(TAG, "SaveFrame02");
                        final byte[] bytes = new byte[dataSize];
                        Log.i(TAG, "SaveFrame03");
                        yuvFrame.get(bytes);
                        Log.i(TAG, "SaveFrame04");
                        saveYuvDataToJPEG(bytes, width, height);
                        Log.i(TAG, "SaveFrame05"); //ele demora entre 1 e 2 e demora mais entre o 5 e o 6 e parece que falha na segunda captura

                        mCodecManager.enabledYuvData(false);
                        Log.i(TAG, "SaveFrame06");
                        mCodecManager.setYuvDataCallback(null);
                        Log.i(TAG, "SaveFrame07");
                    }


                }
            });

    }






    private void saveYuvDataToJPEG(byte[] yuvFrame, int width, int height) {
        if (yuvFrame.length < width * height) {
            //DJILog.d(TAG, "yuvFrame size is too small " + yuvFrame.length);
            return;
        }

        byte[] y = new byte[width * height];
        byte[] u = new byte[width * height / 4];
        byte[] v = new byte[width * height / 4];
        byte[] nu = new byte[width * height / 4];
        byte[] nv = new byte[width * height / 4];
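
        // Layout assumed here (as in DJI's Video Stream Decoding sample): a Y
        // plane of width*height bytes followed by interleaved chroma. The
        // chroma bytes are split into u and v, re-tiled into nu/nv, and then
        // reassembled below as a standard NV21 buffer.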

        System.arraycopy(yuvFrame, 0, y, 0, y.length);
        for (int i = 0; i < u.length; i++) {
            v[i] = yuvFrame[y.length + 2 * i];
            u[i] = yuvFrame[y.length + 2 * i + 1];
        }
        int uvWidth = width / 2;
        int uvHeight = height / 2;
        for (int j = 0; j < uvWidth / 2; j++) {
            for (int i = 0; i < uvHeight / 2; i++) {
                byte uSample1 = u[i * uvWidth + j];
                byte uSample2 = u[i * uvWidth + j + uvWidth / 2];
                byte vSample1 = v[(i + uvHeight / 2) * uvWidth + j];
                byte vSample2 = v[(i + uvHeight / 2) * uvWidth + j + uvWidth / 2];
                nu[2 * (i * uvWidth + j)] = uSample1;
                nu[2 * (i * uvWidth + j) + 1] = uSample1;
                nu[2 * (i * uvWidth + j) + uvWidth] = uSample2;
                nu[2 * (i * uvWidth + j) + 1 + uvWidth] = uSample2;
                nv[2 * (i * uvWidth + j)] = vSample1;
                nv[2 * (i * uvWidth + j) + 1] = vSample1;
                nv[2 * (i * uvWidth + j) + uvWidth] = vSample2;
                nv[2 * (i * uvWidth + j) + 1 + uvWidth] = vSample2;
            }
        }
        //nv21: reassemble as the Y plane followed by interleaved V/U
        byte[] bytes = new byte[yuvFrame.length];
        System.arraycopy(y, 0, bytes, 0, y.length);
        for (int i = 0; i < u.length; i++) {
            bytes[y.length + (i * 2)] = nv[i];
            bytes[y.length + (i * 2) + 1] = nu[i];
        }

        //OpenCV (Marcelo)
        Mat myuv = new Mat(height + height / 2, width, CV_8UC1);//single-channel Mat holding the NV21 data: height*3/2 rows

        myuv.put(0, 0, bytes);//load the matrix

        Mat picBGR = new Mat(height, width, CV_8UC4);

        cvtColor(myuv, picBGR, Imgproc.COLOR_YUV2BGRA_NV21);

        Mat mOut = new Mat(picBGR.height(), picBGR.width(), CvType.CV_8UC4);
        Mat mIntermediate = new Mat(picBGR.height(), picBGR.width(), CvType.CV_8UC4);

        final String path = Environment.getExternalStorageDirectory() + "/DJI_ScreenShot" + "/ScreenShot_" + System.currentTimeMillis() + "_OpenCV.jpg";
        Log.i(TAG, "OpenCV path: " + path);

        Imgproc.blur(picBGR, mIntermediate, new Size(3, 3));//3x3 box blur before edge detection
        Imgproc.Canny(mIntermediate, mOut, 80, 100);//Canny edge detection with thresholds 80/100

        Imgcodecs.imwrite(path, mOut);
        //showImg(mOut);
        //end of the OpenCV section

        Log.i(TAG, "SaveFrame 04a");
        screenShot(bytes, Environment.getExternalStorageDirectory() + "/DJI_ScreenShot", width, height);
        Log.i(TAG, "SaveFrame 04b");
    }
/* Did not work:
    private void showImg(Mat img) {
        Bitmap bm = Bitmap.createBitmap(img.cols(), img.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(img, bm);
        // Hiding the preview views or displaying the Bitmap in an ImageView
        // from this point caused problems; it may be a conflict with the
        // TextureView, or with where this call happens.
        //imageView.setImageBitmap(bm);
    }*/

    /**
     * Save the buffered data into a JPG image file
     */
    private void screenShot(byte[] buf, String shotDir, int width, int height) {
        File dir = new File(shotDir);
        if (!dir.exists() || !dir.isDirectory()) {
            dir.mkdirs();
        }
        YuvImage yuvImage = new YuvImage(buf,
                ImageFormat.NV21,
                width,
                height,
                null);
        OutputStream outputFile;
        final String path = dir + "/ScreenShot_" + System.currentTimeMillis() + ".jpg";
        try {
            outputFile = new FileOutputStream(new File(path));
        } catch (FileNotFoundException e) {
            Log.e(TAG, "test screenShot: new bitmap output file error: " + e);
            return;
        }
        if (outputFile != null) {
            yuvImage.compressToJpeg(new Rect(0,
                    0,
                    width,
                    height), 100, outputFile);
            //Log.e(TAG, "Ori path: " + path);
        }
        try {
            outputFile.close();
        } catch (IOException e) {
            Log.e(TAG, "test screenShot: compress yuv image error: " + e);
            e.printStackTrace();
        }
    }
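
    // Note: writing to external storage requires the WRITE_EXTERNAL_STORAGE
    // permission (granted at runtime on API 23+). The UXSDKDemo requests a set
    // of permissions at startup; if no files appear, check that this one was
    // granted.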


}


Step 5 - Add the calls in MainActivity.java


private CaptureFrame frameAccess;

inside onCreate:

frameAccess = new CaptureFrame(this, (TextureView) findViewById(R.id.livestream_preview));
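
In context, the start of onCreate would look roughly like this (a sketch; the rest of the demo's onCreate stays as it is):

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    //... existing UXSDKDemo initialization ...
    frameAccess = new CaptureFrame(this, (TextureView) findViewById(R.id.livestream_preview));
}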

@Override
    protected void onPause() {//Marcelo
        frameAccess.onPause();
        super.onPause();
    }

    @Override
    protected void onDestroy() { //Marcelo
        frameAccess.onDestroy();
        super.onDestroy();
    }

    @Override
    protected void onResume() {//Marcelo
        super.onResume();
        frameAccess.onResume();//after super.onResume
    }

Step 6 - Capture a frame


frameAccess.handleYUVClickSingleFrame();//Captures a single frame
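
Alternatively, use the second CaptureFrame constructor, which wires an ImageButton to the capture-1-in-30 mode (handleYUVClick); a sketch, assuming you add an ImageButton with id screen_shot to the layout (the id and the two drawables referenced in handleYUVClick are up to you to provide):

frameAccess = new CaptureFrame(this,
        (ImageButton) findViewById(R.id.screen_shot),//hypothetical id
        (TextureView) findViewById(R.id.livestream_preview));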


