
Bug fix: corrected a bug in HOGDescriptor::detectMultiScale() #891

Merged
opencv-pushbot merged 3 commits into opencv:2.4 from NCBee:2.4 on Jul 31, 2013

Conversation

NCBee (Contributor) commented May 16, 2013

HOGDescriptor::detectMultiScale() did not group weights along with ROIs. A new protected method was added to the HOGDescriptor structure to handle weight grouping, instead of using the generic groupRectangles() function in cascadedetect.cpp. A class definition had to be moved out of cascadedetect.cpp into cascadedetect.hpp to accommodate the change.

Fixes issue #3021 (http://code.opencv.org/issues/3021)

@ghost ghost assigned geexie May 17, 2013
alekcac (Contributor) commented May 17, 2013

Marina, could you please review this pull request? More details are available in the issue on code.opencv.org.

geexie (Contributor) commented May 19, 2013

@NCBee, why can't we use the groupRectangles() overload that supports weighting (objdetect.hpp, line 331)?

NCBee (Contributor, Author) commented May 19, 2013

@cuda-geek I spent a couple of hours reading through the (undocumented) groupRectangles code, and at first it seemed very specific to the CascadeClassifier class. However, after experimenting with it tonight, I found a work-around: if you pass an initialized dummy vector to groupRectangles(), it gets the job done:

    // inside hog.cpp's void HOGDescriptor::detectMultiScale(...) const method definition
    // ...
    if ( useMeanshiftGrouping )
    {
        groupRectangles_meanshift(foundLocations, foundWeights, foundScales, finalThreshold, winSize);
    }
    else
    {
        // Dummy per-rectangle int weights required by this overload;
        // the real SVM scores travel in foundWeights (levelWeights).
        std::vector<int> weights;
        weights.resize(foundLocations.size(), INT_MAX);
        groupRectangles(foundLocations, (int)finalThreshold, 0.2, &weights, &foundWeights);
    }

This is a hack, and we are not using the function the way it was intended to be used (possibly by CascadeClassifier). However, if you think this hack does the job, feel free to close this pull request, and either you or I can add the extra two lines inside hog.cpp to fix the problem.

Note: IMO the location of the groupRectangles() functions inside cascadedetect.cpp is problematic, both logically and for future development, if groupRectangles() is going to be used by many algorithms outside of cascadedetect.cpp. My groupROI was an attempt to bring sanity to hog.cpp by making it independent of cascadedetect.cpp.

NCBee (Contributor, Author) commented May 19, 2013

By the way, groupRectangles with the signature declared on line 331 of objdetect.hpp does not work, simply because its weights parameter is a vector<int> while the SVM distances are a vector<double>. I used the signature on line 333, with a dummy variable for the int weights.

NCBee (Contributor, Author) commented May 20, 2013

I simplified the solution with a new groupRectangles overloaded function defined in cascadedetect.cpp (just like other groupRectangles() functions).

apavlenko (Contributor) commented
@NCBee Looks like the 'Objdetect_HOGDetector.regression' test fails now; could you fix it?
(http://build.opencv.org/builders/precommit_windows/builds/3608/steps/opencv_test_objdetect/logs/stdio)

NCBee (Contributor, Author) commented May 21, 2013

@apavlenko I noticed that earlier. The detector now seems to pick up more ROIs over the threshold. Given how simple the fix is, I think it is the original regression test that needs to change: it may have been built using incorrect weight information for each detected ROI. To be more specific, consider one image out of the few in the regression set. It is possible that the test was built using grouped ROIs with vector<Rect>::size() == 2 for that image, but used the first 2 weights stored in a vector<double> of size 16, i.e. the un-grouped weight vector, which obviously does not hold the correct weights.

However, since this is a regression test we need to be very careful when making changes. I have downloaded opencv_extra and will run the tests locally to see why we get a regression error. Given my schedule it may take a week or two.

…ngles was added to HOGDescriptor to take care of ROI and weight grouping
NCBee (Contributor, Author) commented May 24, 2013

After going through the test cases and following the logic in the groupRectangles( vector<Rect>& rectList, int groupThreshold, double eps, vector<int>* weights, vector<double>* levelWeights ) function in cascadedetect.cpp, I don't think the short method of grouping weights with ROIs using the current functions works. Therefore I uploaded a slightly modified version of my previous solution, in which a public method called groupRectangles was added to the HOGDescriptor class to take care of both ROIs and weights.

apavlenko (Contributor) commented
Marina, could you review the updated version?

ghost commented Jul 15, 2013

Marina,

The pull request is very old. Could you please review the patch?

geexie (Contributor) commented Jul 15, 2013

👍 It can be merged.

apavlenko pushed a commit that referenced this pull request Jul 31, 2013
opencv-pushbot merged commit e2d6a3a into opencv:2.4 on Jul 31, 2013
NCBee deleted the 2.4 branch on August 5, 2013 at 17:37
SpecLad mentioned this pull request on Aug 6, 2013
savuor pushed a commit to nickyu-zhu/opencv that referenced this pull request Oct 27, 2023
* add test data for LSTM w/ activations

* dnn: put lstm_cntk_tanh testdata in opencv_extra

Co-authored-by: Alexander Alekhin <alexander.a.alekhin@gmail.com>