Tuesday 30 April 2019

EXC_BAD_ACCESS w/ OpenCV `cv::aruco::detectMarkers()` on iOS

  • OpenCV: 4.1.0 (w/ 'contrib' extensions)
  • Swift: 5
  • iOS: 12.2

I am trying to run OpenCV's cv::aruco::detectMarkers() on every frame from an iPhone camera. This works, but after a minute or so it crashes with: Thread 8: EXC_BAD_ACCESS (code=1, address=0x10dea0000)

I've included what I think are the two most relevant pieces of the app, the UIViewController extension and the Objective-C wrapper, and in each I've marked with a comment the line where the exception is thrown.

This doesn't appear to me to be a concurrency issue, since the detection should be running synchronously on the main thread.
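For reference, the capture-session setup isn't shown below, so the following is only a rough sketch of the usual wiring (the class and queue names are illustrative, not lifted from my project). The point is that captureOutput(_:didOutput:from:) is invoked on whichever DispatchQueue is passed to setSampleBufferDelegate(_:queue:), so that call is what decides which thread the frame handling actually runs on.

import AVFoundation

final class CaptureSetupSketch: NSObject {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    // the delegate callback fires on whichever queue is passed to
    // setSampleBufferDelegate(_:queue:) below
    let sampleQueue = DispatchQueue(label: "camera.sample.queue") // illustrative label

    func configure(delegate: AVCaptureVideoDataOutputSampleBufferDelegate) {
        session.beginConfiguration()

        if let device = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: device),
           session.canAddInput(input) {
            session.addInput(input)
        }

        // captureOutput(_:didOutput:from:) will be called on sampleQueue,
        // not automatically on the main thread
        videoOutput.setSampleBufferDelegate(delegate, queue: sampleQueue)
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }

        session.commitConfiguration()
    }
}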

Here is the result of `thread backtrace`:

* thread #8, queue = 'com.apple.root.default-qos', stop reason = EXC_BAD_ACCESS (code=1, address=0x10dea0000)
  * frame #0: 0x000000010505c700 Camera`cv::pointSetBoundingRect(cv::Mat const&) + 432
    frame #1: 0x000000010505c8c0 Camera`cvBoundingRect + 236
    frame #2: 0x0000000104fdf168 Camera`cvFindNextContour + 4348
    frame #3: 0x0000000104fe00fc Camera`cvFindContours_Impl(void*, CvMemStorage*, CvSeq**, int, int, int, CvPoint, int) + 1008
    frame #4: 0x0000000104fe118c Camera`cv::findContours(cv::_InputArray const&, cv::_OutputArray const&, cv::_OutputArray const&, int, int, cv::Point_<int>) + 972
    frame #5: 0x0000000104fe1bb0 Camera`cv::findContours(cv::_InputArray const&, cv::_OutputArray const&, int, int, cv::Point_<int>) + 96
    frame #6: 0x000000010507df68 Camera`cv::aruco::DetectInitialCandidatesParallel::operator()(cv::Range const&) const + 2056
    frame #7: 0x0000000104f8e068 Camera`(anonymous namespace)::ParallelLoopBodyWrapper::operator()(cv::Range const&) const + 248
    frame #8: 0x0000000104f8df5c Camera`(anonymous namespace)::block_function(void*, unsigned long) + 32
    frame #9: 0x0000000105318824 libdispatch.dylib`_dispatch_client_callout2 + 20

This is how I set up the AVCaptureVideoDataOutputSampleBufferDelegate, which receives every frame as a CMSampleBuffer, converts it to a UIImage, and sends that UIImage to OpenCV for ArUco marker detection.

extension ViewController : AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(
        _ output: AVCaptureOutput,
        didOutput sampleBuffer: CMSampleBuffer,
        from connection: AVCaptureConnection) {

        // convert the frame's CMSampleBuffer into a UIImage
        let image : UIImage = self.sample_buffer_to_uiimage(sampleBuffer: sampleBuffer)

        // call out to opencv wrapper, which eventually blows up
        let annotated_image : UIImage = OpenCVWrapper.drawMarkers(image)

        // display the annotated frame
        self.imageView.image = annotated_image

    }
    func sample_buffer_to_uiimage(sampleBuffer:CMSampleBuffer) -> UIImage
    {
        // CMSampleBuffer -> CVPixelBuffer -> CIImage -> CGImage -> UIImage
        let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let cimage : CIImage  = CIImage(cvPixelBuffer: imageBuffer)
        let context:CIContext = CIContext.init(options: nil)
        let cgImage:CGImage   = context.createCGImage(cimage, from: cimage.extent)!
        let image:UIImage     = UIImage.init(cgImage: cgImage)
        return image
    }
}
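As an aside on the conversion above: CIContext is relatively expensive to construct, so creating a fresh one for every frame in sample_buffer_to_uiimage adds a lot of per-frame churn. A sketch of the same conversion that reuses a single context (the free-function form and names are illustrative) would be:

import AVFoundation
import CoreImage
import UIKit

// one CIContext reused across frames instead of one per call
private let sharedCIContext = CIContext(options: nil)

func sampleBufferToUIImage(_ sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = sharedCIContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}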

This is how I set up the Objective-C OpenCV wrapper method:

+(UIImage *) drawMarkers:(UIImage *)image {

    cv::Mat colorImageRGBA;
    cv::Mat colorImage;
    cv::Mat grayImage;

    // convert the incoming UIImage into a 4-channel RGBA Mat
    UIImageToMat(image, colorImageRGBA);

    // grayscale copy for detection, 3-channel RGB copy for drawing the overlay
    cvtColor(colorImageRGBA, grayImage, cv::COLOR_BGR2GRAY);
    cvtColor(colorImageRGBA, colorImage, cv::COLOR_RGBA2RGB);

    cv::Ptr<cv::aruco::Dictionary> dictionary = cv::aruco::getPredefinedDictionary(cv::aruco::DICT_6X6_250);

    std::vector<int> markerIds;
    std::vector<std::vector<cv::Point2f>> markerCorners;

    // this is the line that blows up
    cv::aruco::detectMarkers(grayImage, dictionary, markerCorners, markerIds);

    // outline and label any detected markers on the color image
    if (markerIds.size() > 0) {
        cv::aruco::drawDetectedMarkers(colorImage, markerCorners, markerIds);
    }

    return MatToUIImage(colorImage);
}



