
Conversation

@BlGene (Contributor) commented Feb 22, 2015

This PR extends the Accuracy layer to behave more like the SoftmaxWithLoss layer: it allows the bottom blobs to have a width and height, and adds the ignore_label option. The original functionality of the layer is unchanged.
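For context, a minimal prototxt sketch of how the extended layer might be wired up for dense per-pixel scoring. The blob names and the ignore_label value are hypothetical, and accuracy_param is assumed to mirror the ignore_label handling of SoftmaxWithLoss:

    layer {
      name: "accuracy"
      type: "Accuracy"
      bottom: "score"   # predictions: N x C x H x W
      bottom: "label"   # ground truth: N x 1 x H x W
      top: "accuracy"
      accuracy_param {
        ignore_label: 255  # hypothetical value; spatial positions with this label are skipped
      }
    }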

@BlGene (Contributor, Author) commented on src/caffe/proto/caffe.proto:

@@ -30,6 +30,10 @@ message Datum {
   optional bool encoded = 7 [default = false];
 }

+message DatumVector {

This is added because it allows one to define databases that store several Datums under one key.
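The quoted hunk cuts off after the message declaration. As a loose sketch only (the field name and numbering are guesses, not the PR's actual definition), the container presumably looks something like:

    message DatumVector {
      // Hypothetical sketch: lets one database key hold several Datums.
      repeated Datum datums = 1;
    }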

@sguada (Contributor) replied:

Leave that for another PR, where it is used.

@shelhamer (Member) replied:

Seconding @sguada. Note also that there is a BlobProtoVector that could be used for packing multiple pieces of data too.
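For reference, BlobProtoVector is already defined in caffe.proto as a simple repeated wrapper, which is why it can pack multiple pieces of data into one entry:

    message BlobProtoVector {
      repeated BlobProto blobs = 1;
    }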


@BlGene force-pushed the fix_accuracy_layer branch 2 times, most recently from c249d14 to 415e668, on February 22, 2015 at 20:53
@BlGene (Contributor, Author) commented Feb 22, 2015

The Travis test timed out, but AFAIK it builds and tests fine.

@BlGene force-pushed the fix_accuracy_layer branch from 415e668 to 8916348 on February 23, 2015 at 07:50
@BlGene (Contributor, Author) commented Feb 23, 2015

@sguada @shelhamer I concede; DatumVector is out.

philkr added a commit to philkr/caffe that referenced this pull request Feb 25, 2015
@philkr (Contributor) commented Mar 2, 2015

Nice PR. I'm currently using it and it works as advertised.
One small remark: since the layer now operates on larger blobs, it might make sense to also have a GPU implementation similar to SoftmaxWithLoss.

@shelhamer (Member) commented:

Thanks for the accuracy layer extension. This needs a rebase after the generalization to N-D blobs in #1970.

@BlGene force-pushed the fix_accuracy_layer branch 5 times, most recently from 24ed8c8 to 6e20ca1, on March 9, 2015 at 17:01
@BlGene (Contributor, Author) commented Mar 9, 2015

@shelhamer @jeffdonahue Hi guys, I updated the PR to include the changes from #2076.

@philkr Thanks. I don't have a GPU implementation planned ATM, but I will push one as soon as I have one.

@BlGene force-pushed the fix_accuracy_layer branch from 6e20ca1 to 1440731 on March 9, 2015 at 17:36
Merge branch 'master' of https://github.com/BVLC/caffe into fix_accuracy_layer
@jeffdonahue (Contributor) commented:

Thanks @BlGene, in general please rebase your code rather than merging. I squashed and cherry-picked your changes in #2076.

@jeffdonahue closed this Mar 9, 2015
@BlGene deleted the fix_accuracy_layer branch on March 10, 2015 at 11:30