Expand a bit the presentation of examples #10799
Conversation
> This folder contains actively maintained examples of use of 🤗 Transformers organized along NLP tasks. If you are looking for an example that used to be in this folder, it may have moved to our [research projects](https://github.com/huggingface/transformers/tree/master/examples/research_projects) subfolder (which contains frozen snapshots of research projects).
What about `examples/legacy`? It seems that we have two of these now.
> ## PyTorch script: fine-tuning on SWAG
> `run_swag` allows you to fine-tune any model from our [hub](https://huggingface.co/models) on the SWAG dataset or your own csv/jsonlines files as long as they are structured the same way. To make it work on another dataset, you will need to tweak the `preprocess_function` inside the script.
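For illustration, a SWAG-style jsonlines row looks roughly like this (field names such as `sent1`, `sent2`, `ending0`–`ending3`, and `label` follow the SWAG dataset; treat this as a sketch of the expected structure, not the exact schema):

```json
{"sent1": "The man opens the fridge.", "sent2": "He", "ending0": "takes out a carton of milk.", "ending1": "starts the car.", "ending2": "closes the curtains.", "ending3": "answers the phone.", "label": 0}
```

Each line is one example: a context sentence, the start of a continuation, four candidate endings, and the index of the correct one.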
It can't be any model, no? Any model trained on text classification, perhaps?
It's pretty much any model, yes, as long as it has a `ForMultipleChoice` head.
I found only 18:
src/transformers/utils/dummy_pt_objects.py:class AlbertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class AutoModelForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class BertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class CamembertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class ConvBertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class DistilBertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class ElectraForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class FlaubertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class FunnelForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class IBertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class LongformerForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class MobileBertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class MPNetForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class RobertaForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class SqueezeBertForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class XLMForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class XLMRobertaForMultipleChoice:
src/transformers/utils/dummy_pt_objects.py:class XLNetForMultipleChoice:
Maybe it could just say: "any model, as long as it has a `ForMultipleChoice` head", and the same for the other one.
multiple-choice is missing from the tasks on the left side of the hub UI.
I tried: https://huggingface.co/models?pipeline_tag=multiple-choice
Will adapt with this, and will also make a nice table showing which models have which head for our documentation one of these days. Once it's there we can link to it.
Great, and for QA it's:
grep -Ir ForQuestionAnswering src | grep -v TF | grep dummy | perl -lne 'm|class (.*):| && print($1)'
AlbertForQuestionAnswering
AutoModelForQuestionAnswering
BartForQuestionAnswering
BertForQuestionAnswering
CamembertForQuestionAnswering
ConvBertForQuestionAnswering
DebertaForQuestionAnswering
DebertaV2ForQuestionAnswering
DistilBertForQuestionAnswering
ElectraForQuestionAnswering
FlaubertForQuestionAnswering
FlaubertForQuestionAnsweringSimple
FunnelForQuestionAnswering
IBertForQuestionAnswering
LEDForQuestionAnswering
LongformerForQuestionAnswering
LxmertForQuestionAnswering
MBartForQuestionAnswering
MobileBertForQuestionAnswering
MPNetForQuestionAnswering
ReformerForQuestionAnswering
RobertaForQuestionAnswering
SqueezeBertForQuestionAnswering
TapasForQuestionAnswering
XLMForQuestionAnswering
XLMForQuestionAnsweringSimple
XLMRobertaForQuestionAnswering
XLNetForQuestionAnswering
XLNetForQuestionAnsweringSimple
It'd be super-handy to link directly to suitable datasets and models for each example; maybe this could be an easy good first issue.
The first may be helpful, but the second is not necessarily: it shows the models that have been fine-tuned on a SQuAD dataset, not the models that can be fine-tuned on it. There is no way to filter all the models that have an architecture containing a question-answering head as far as I know, which is what we would want to show.
Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
Would this be at least in the right direction? https://huggingface.co/models?pipeline_tag=question-answering
Mmm, those seem to be models fine-tuned on a question-answering task, not all models with a QuestionAnswering arch available (for instance, you should see all BERT checkpoints, all DistilBERT checkpoints, etc.).
OK, then it won't work. It'd be really awesome if in the future we had a way to filter models by architecture, and sub-architecture in this case, that is, without the model-specific part of the class name.
LysandreJik left a comment:
This is great, thanks for updating the notes @sgugger. I think this should help clarify things.
* Expand a bit the presentation of examples
* Apply suggestions from code review
* Address review comments

Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
What does this PR do?
This PR adds a bit more information to the examples READMEs (the main one and the per-example ones), copying some information from the main philosophy doc and expanding it a bit, to make sure all users know what we want for the examples.