This article describes how to deal with incorrect boundingBox coordinates. Hopefully it is a useful reference for anyone facing the same problem.

Problem description

I am developing an object detection app for Android (Java) using Google ML Kit and CameraX. I am also using a TensorFlow model, which can be found here. My problem is that the coordinates of my boundingBox are slightly misaligned, as shown in the image below. Please ignore the fact that it is detected as a shovel; my question is currently focused on the graphic overlay drawn on screen over the captured image.

Here is the class used to draw the GraphicOverlay:

DrawGraphic.java:

public class DrawGraphic extends View {

    Paint borderPaint, textPaint;
    Rect rect;
    String text;

    ImageProxy imageProxy;
    PreviewView previewView;


    public DrawGraphic(Context context, Rect rect, String text, ImageProxy imageProxy, PreviewView previewView) {
        super(context);
        this.rect = rect;
        this.text = text;
        this.imageProxy = imageProxy;
        this.previewView = previewView;

        borderPaint = new Paint();
        borderPaint.setColor(Color.WHITE);
        borderPaint.setStrokeWidth(10f);
        borderPaint.setStyle(Paint.Style.STROKE);

        textPaint = new Paint();
        textPaint.setColor(Color.WHITE);
        textPaint.setStrokeWidth(50f);
        textPaint.setTextSize(32f);
        textPaint.setStyle(Paint.Style.FILL);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        // Apply the image-to-view mapping once, then draw in image coordinates
        canvas.concat(getMappingMatrix(imageProxy, previewView));
        canvas.drawText(text, rect.centerX(), rect.centerY(), textPaint);
        canvas.drawRect(rect, borderPaint);
    }

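    // Maps coordinates from the ImageProxy's crop rect onto the PreviewView,
    // compensating for the rotation CameraX reports for this frame.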
    Matrix getMappingMatrix(ImageProxy imageProxy, PreviewView previewView) {
        Rect cropRect = imageProxy.getCropRect();
        int rotationDegrees = imageProxy.getImageInfo().getRotationDegrees();
        Matrix matrix = new Matrix();

        float[] source = {
                cropRect.left,
                cropRect.top,
                cropRect.right,
                cropRect.top,
                cropRect.right,
                cropRect.bottom,
                cropRect.left,
                cropRect.bottom
        };

        float[] destination = {
                0f,
                0f,
                previewView.getWidth(),
                0f,
                previewView.getWidth(),
                previewView.getHeight(),
                0f,
                previewView.getHeight()
        };

        int vertexSize = 2;

        int shiftOffset = rotationDegrees / 90 * vertexSize;
        float[] tempArray = destination.clone();
        for (int toIndex = 0; toIndex < source.length; toIndex++) {
            int fromIndex = (toIndex + shiftOffset) % source.length;
            destination[toIndex] = tempArray[fromIndex];
        }
        matrix.setPolyToPoly(source, 0, destination, 0, 4);
        return matrix;
    }
}

MainActivity.java:

public class MainActivity extends AppCompatActivity {

    private static final int PERMISSIONS_REQUEST = 1;

    private static final String PERMISSION_CAMERA = Manifest.permission.CAMERA;

    public static final Size DESIRED_PREVIEW_SIZE = new Size(640, 480);

    private PreviewView previewView;

    ActivityMainBinding binding;

    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        binding = ActivityMainBinding.inflate(getLayoutInflater());
        setContentView(binding.getRoot());

        previewView = findViewById(R.id.previewView);

        if (hasPermission()) {
            // Start CameraX
            startCamera();
        } else {
            requestPermission();
        }
    }

    @SuppressLint("UnsafeOptInUsageError")
    private void startCamera() {
        ListenableFuture<ProcessCameraProvider> cameraProviderFuture = ProcessCameraProvider.getInstance(this);

        cameraProviderFuture.addListener(() -> {
            // Camera provider is now guaranteed to be available
            try {
                ProcessCameraProvider cameraProvider = cameraProviderFuture.get();

                // Set up the view finder use case to display camera preview
                Preview preview = new Preview.Builder().build();

                // Choose the camera by requiring a lens facing
                CameraSelector cameraSelector = new CameraSelector.Builder()
                        .requireLensFacing(CameraSelector.LENS_FACING_BACK)
                        .build();

                // Image Analysis
                ImageAnalysis imageAnalysis =
                        new ImageAnalysis.Builder()
                                .setTargetResolution(DESIRED_PREVIEW_SIZE)
                                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                                .build();

                imageAnalysis.setAnalyzer(ContextCompat.getMainExecutor(this), imageProxy -> {
                    // Define rotation Degrees of the imageProxy
                    int rotationDegrees = imageProxy.getImageInfo().getRotationDegrees();
                    Log.v("ImageAnalysis_degrees", String.valueOf(rotationDegrees));

                    @SuppressLint("UnsafeExperimentalUsageError") Image mediaImage = imageProxy.getImage();
                    if (mediaImage != null) {
                        InputImage image =
                                InputImage.fromMediaImage(mediaImage, imageProxy.getImageInfo().getRotationDegrees());
                        //Pass image to an ML Kit Vision API
                        //...

                        LocalModel localModel =
                                new LocalModel.Builder()
                                        .setAssetFilePath("mobilenet_v1_0.75_192_quantized_1_metadata_1.tflite")
                                        .build();

                        CustomObjectDetectorOptions customObjectDetectorOptions =
                                new CustomObjectDetectorOptions.Builder(localModel)
                                        .setDetectorMode(CustomObjectDetectorOptions.STREAM_MODE)
                                        .enableClassification()
                                        .setClassificationConfidenceThreshold(0.5f)
                                        .setMaxPerObjectLabelCount(3)
                                        .build();

                        ObjectDetector objectDetector =
                                ObjectDetection.getClient(customObjectDetectorOptions);

                        objectDetector.process(image)
                                .addOnSuccessListener(detectedObjects -> {
                                    getObjectResults(detectedObjects, imageProxy);
                                    Log.d("TAG", "onSuccess" + detectedObjects.size());
                                    for (DetectedObject detectedObject : detectedObjects) {
                                        Rect boundingBox = detectedObject.getBoundingBox();

                                        Integer trackingId = detectedObject.getTrackingId();
                                        for (DetectedObject.Label label : detectedObject.getLabels()) {
                                            String text = label.getText();
                                            int index = label.getIndex();
                                            float confidence = label.getConfidence();
                                        }
                                    }
                                })
                                .addOnFailureListener(e -> Log.e("TAG", e.getLocalizedMessage()))
                                .addOnCompleteListener(result -> imageProxy.close());
                    }

                });

                // Connect the preview use case to the previewView
                preview.setSurfaceProvider(
                        previewView.getSurfaceProvider());

                // Attach use cases to the camera with the same lifecycle owner
                if (cameraProvider != null) {
                    Camera camera = cameraProvider.bindToLifecycle(
                            this,
                            cameraSelector,
                            imageAnalysis,
                            preview);
                }

            } catch (ExecutionException | InterruptedException e) {
                e.printStackTrace();
            }


        }, ContextCompat.getMainExecutor(this));
    }

    private void getObjectResults(List<DetectedObject> detectedObjects, ImageProxy imageProxy) {
        for (DetectedObject object : detectedObjects) {
            if (binding.parentlayout.getChildCount() > 1) {
                binding.parentlayout.removeViewAt(1);
            }
            Rect rect = object.getBoundingBox();
            String text = "Undefined";
            if (object.getLabels().size() != 0) {
                text = object.getLabels().get(0).getText();
            }

            DrawGraphic drawGraphic = new DrawGraphic(this, rect, text, imageProxy, previewView);
            binding.parentlayout.addView(drawGraphic);
        }
    }

    private boolean hasPermission() {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            return checkSelfPermission(PERMISSION_CAMERA) == PackageManager.PERMISSION_GRANTED;
        } else {
            return true;
        }
    }

    private void requestPermission() {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            if (shouldShowRequestPermissionRationale(PERMISSION_CAMERA)) {
                Toast.makeText(
                        this,
                        "Camera permission is required for this demo",
                        Toast.LENGTH_LONG)
                        .show();
            }
            requestPermissions(new String[]{PERMISSION_CAMERA}, PERMISSIONS_REQUEST);
        }
    }

    @Override
    public void onRequestPermissionsResult(
            final int requestCode, final String[] permissions, final int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (requestCode == PERMISSIONS_REQUEST) {
            if (allPermissionsGranted(grantResults)) {
                // Start CameraX
                startCamera();
            } else {
                requestPermission();
            }
        }
    }

    private static boolean allPermissionsGranted(final int[] grantResults) {
        for (int result : grantResults) {
            if (result != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }
        return true;
    }
}

All of this leads to my question: why is the boundingBox slightly off? Any further information needed to clarify this question will be provided on request.

Recommended answer

As mentioned in the model description:

Image data: a ByteBuffer of size 192 x 192 x 3 x PIXEL_DEPTH, where PIXEL_DEPTH is 4 for a float model and 1 for a quantized model.

Make sure the media.Image you pass in has the same resolution. Feeding the model different image data can lead to incorrect bounding boxes and detections, and is most likely also why it is detected as a shovel in the first place.
You can either configure ImageAnalysis to deliver images at this resolution, or resize the image yourself before passing it to the model as input.
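
Below is a minimal sketch of those two options. It is not code from the original post: the 192 x 192 size comes from the model description quoted above, and bitmapFromImageProxy(...) is a hypothetical helper standing in for whatever media.Image-to-Bitmap conversion you already use.

// Option 1: hint CameraX to deliver analysis frames near the model's input size.
// CameraX treats the target resolution as a hint, so a resize may still be needed.
ImageAnalysis imageAnalysis =
        new ImageAnalysis.Builder()
                .setTargetResolution(new Size(192, 192))
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .build();

// Option 2: resize the frame yourself before handing it to ML Kit.
Bitmap frame = bitmapFromImageProxy(imageProxy);  // hypothetical conversion helper
Bitmap modelInput = Bitmap.createScaledBitmap(frame, 192, 192, true);
InputImage image = InputImage.fromBitmap(
        modelInput, imageProxy.getImageInfo().getRotationDegrees());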

Keep in mind that the output bounding box will be relative to the 192 x 192 image. You then need to convert those coordinates into your PreviewView's coordinates. There are many ways to do this, but you can use this.
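
Below is a minimal sketch of such a conversion, assuming the detector input was 192 x 192 and ignoring rotation and aspect-ratio cropping (which the linked solution handles more completely); mapToPreview is an illustrative name, not part of the original code.

// Scales a bounding box from the 192 x 192 model-input space into
// PreviewView coordinates. Rotation and letterboxing are deliberately
// ignored here to keep the idea visible.
Rect mapToPreview(Rect box, PreviewView previewView) {
    float scaleX = previewView.getWidth() / 192f;
    float scaleY = previewView.getHeight() / 192f;
    return new Rect(
            Math.round(box.left * scaleX),
            Math.round(box.top * scaleY),
            Math.round(box.right * scaleX),
            Math.round(box.bottom * scaleY));
}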

That concludes this article on fixing incorrect boundingBox coordinates. We hope the recommended answer helps.
