
Conversation

@iedmrc (Owner) commented Nov 16, 2019

No description provided.

stefan-it and others added 30 commits November 11, 2019 16:18
Custom schedulers are currently implemented by wrapping PyTorch's LambdaLR
class and passing a method of the wrapping class to the __init__
function of LambdaLR. This approach is inappropriate for two
reasons:

1. there is no need to define a class when it only defines an
__init__() method;
2. instantiating the parent class by passing it a method of the child class
creates a cyclical reference, which leads to memory leaks. See issues #1742 and #1134.

In this commit we replace the wrapper classes with functions that
instantiate `LambdaLR` with a custom learning rate function. We use a
closure to capture the schedule's parameters. We also rename a few
variables within the functions to make the behaviour explicit and remove
docstrings that are no longer necessary.
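For illustration, here is a minimal sketch of the closure-based approach described above, assuming a simple linear-warmup-then-constant schedule; the function name and the exact schedule are illustrative, not necessarily the ones used in the repository.

```python
from torch.optim import Optimizer
from torch.optim.lr_scheduler import LambdaLR


def get_constant_schedule_with_warmup(optimizer: Optimizer,
                                      num_warmup_steps: int,
                                      last_epoch: int = -1) -> LambdaLR:
    """Linearly warm the learning rate up over `num_warmup_steps`, then keep it constant.

    The schedule parameter is captured in a closure, so there is no subclass of
    LambdaLR and no bound method of a child class passed to the parent __init__,
    i.e. no cyclical reference.
    """

    def lr_lambda(current_step: int) -> float:
        if current_step < num_warmup_steps:
            return float(current_step) / float(max(1, num_warmup_steps))
        return 1.0

    return LambdaLR(optimizer, lr_lambda, last_epoch=last_epoch)


# Example usage (hypothetical model and optimizer):
# optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
# scheduler = get_constant_schedule_with_warmup(optimizer, num_warmup_steps=100)
# ... then, per training step: optimizer.step(); scheduler.step()
```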
Token indices sequence length is longer than the specified maximum sequence length for this model
replace LambdaLR scheduler wrappers by function
…cement

sum() is replaced by itertools.chain.from_iterable() (see the sketch after this commit list)
Fix special tokens addition in decoder #1807
fix multi-gpu eval in torch examples
…cation

DistilBERT for token classification
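As referenced in the sum()-replacement commit above, the change amounts to flattening a list of lists with itertools.chain.from_iterable() instead of sum(); a minimal sketch with illustrative data:

```python
import itertools

nested = [[101, 2023], [2003, 1037], [102]]  # e.g. lists of token ids

# sum() works, but re-copies the growing accumulator on every addition,
# which is quadratic in the total number of elements:
flat_slow = sum(nested, [])

# itertools.chain.from_iterable() yields the same elements lazily, in linear time:
flat_fast = list(itertools.chain.from_iterable(nested))

assert flat_slow == flat_fast == [101, 2023, 2003, 1037, 102]
```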
@iedmrc merged commit 5162acc into iedmrc:master on Nov 16, 2019