Bump torch to 1.8.0 #2442

Merged
himkt merged 9 commits into optuna:master from hvy:bump-torch
Mar 17, 2021

Conversation

@hvy
Member

@hvy hvy commented Mar 5, 2021

Motivation

Try bumping to latest torch releases.

Description of the changes

See above.

@hvy hvy added the installation Installation and dependency. label Mar 5, 2021
@hvy
Copy link
Copy Markdown
Member Author

hvy commented Mar 5, 2021

The Build Docker Image job fails due to conflicting dependencies:

#10 55.23 The conflict is caused by:
#10 55.23     optuna[checking,doctest,document,example,testing] 2.6.0.dev0 depends on torchvision==0.9.0+cpu
#10 55.23     allennlp 2.1.0 depends on torchvision<0.9.0 and >=0.8.1
#10 55.23     optuna[checking,doctest,document,example,testing] 2.6.0.dev0 depends on torchvision==0.9.0+cpu
#10 55.23     allennlp 2.0.1 depends on torchvision<0.9.0 and >=0.8.1
#10 55.23     optuna[checking,doctest,document,example,testing] 2.6.0.dev0 depends on torchvision==0.9.0+cpu
#10 55.23     allennlp 2.0.0 depends on torchvision<0.9.0 and >=0.8.1

https://github.com/optuna/optuna/pull/2442/checks?check_run_id=2035915426
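The conflict in the log above can be reproduced in isolation: the local version `0.9.0+cpu` pinned by Optuna's extras sorts above its base `0.9.0` under PEP 440, so no release can satisfy AllenNLP's `<0.9.0` bound at the same time. A minimal sketch, assuming the third-party `packaging` library (the PEP 440 implementation that pip itself vendors) is available:

```python
# Hedged sketch: why pip's resolver reports the conflict above.
# Requires the third-party "packaging" library (pip install packaging).
from packaging.specifiers import SpecifierSet
from packaging.version import Version

optuna_pin = Version("0.9.0+cpu")                # torchvision==0.9.0+cpu from the extras
allennlp_range = SpecifierSet(">=0.8.1,<0.9.0")  # AllenNLP 2.x requirement

# A local version such as 0.9.0+cpu orders *above* its base 0.9.0,
# so it falls outside the exclusive upper bound <0.9.0.
print(optuna_pin in allennlp_range)  # False: no torchvision satisfies both
```

Since the two requirement sets share no satisfiable torchvision version, the resolver backtracks through every AllenNLP release (2.1.0, 2.0.1, 2.0.0) and then gives up, which is exactly the repeated pattern in the log.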

@hvy
Member Author

hvy commented Mar 5, 2021

Related ongoing PR #2434.

@hvy
Member Author

hvy commented Mar 5, 2021

The tutorial that used to require the urllib hotfix now seems to work without it.

@hvy hvy mentioned this pull request Mar 5, 2021
@himkt
Member

himkt commented Mar 7, 2021

[Just a note]

I think AllenNLP will support torch v1.8.0 and torchvision v0.9.0 from the next release.
(ref. allenai/allennlp@7f60990)

@hvy hvy linked an issue Mar 10, 2021 that may be closed by this pull request
@hvy hvy force-pushed the bump-torch branch 2 times, most recently from 050b51f to 72d9533 on March 11, 2021 10:52
@hvy
Member Author

hvy commented Mar 11, 2021

Currently blocked by the following error.

Traceback (most recent call last):
  File "examples/pytorch/pytorch_ignite_simple.py", line 133, in <module>
    study.optimize(objective, n_trials=100, timeout=600)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/optuna/study.py", line 394, in optimize
    show_progress_bar=show_progress_bar,
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/optuna/_optimize.py", line 76, in _optimize
    progress_bar=progress_bar,
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/optuna/_optimize.py", line 163, in _optimize_sequential
    trial = _run_trial(study, func, catch)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/optuna/_optimize.py", line 268, in _run_trial
    raise func_err
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/optuna/_optimize.py", line 217, in _run_trial
    value_or_values = func(trial)
  File "examples/pytorch/pytorch_ignite_simple.py", line 106, in objective
    train_loader, val_loader = get_data_loaders(TRAIN_BATCH_SIZE, VAL_BATCH_SIZE)
  File "examples/pytorch/pytorch_ignite_simple.py", line 75, in get_data_loaders
    train_data = MNIST(download=True, root=".", transform=data_transform, train=True)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/torchvision/datasets/mnist.py", line 79, in __init__
    self.download()
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/torchvision/datasets/mnist.py", line 146, in download
    download_and_extract_archive(url, download_root=self.raw_folder, filename=filename, md5=md5)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/torchvision/datasets/utils.py", line 314, in download_and_extract_archive
    download_url(url, download_root, filename, md5)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/torchvision/datasets/utils.py", line 140, in download_url
    raise e
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/torchvision/datasets/utils.py", line 132, in download_url
    _urlretrieve(url, fpath)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/torchvision/datasets/utils.py", line 29, in _urlretrieve
    with urllib.request.urlopen(urllib.request.Request(url, headers={"User-Agent": USER_AGENT})) as response:
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/urllib/request.py", line 641, in http_response
    'http', request, response, code, msg, hdrs)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/urllib/request.py", line 503, in _call_chain
    result = func(*args)
  File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 503: Service Unavailable

Comment on lines -26 to -32
import urllib.request

# TODO(crcrpar): Remove the below three lines once everything is ok.
# Register a global custom opener to avoid HTTP Error 403: Forbidden when downloading MNIST.
opener = urllib.request.build_opener()
opener.addheaders = [("User-agent", "Mozilla/5.0")]
urllib.request.install_opener(opener)
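As the `_urlretrieve` frame in the traceback above shows, torchvision 0.9.0 now attaches its own User-Agent to each download request, which is why this process-wide opener became a candidate for removal. A minimal sketch of that per-request alternative; `build_request`, `fetch`, and the URL are illustrative names, not part of the example file:

```python
# Hedged sketch of a per-request User-Agent, the pattern torchvision 0.9.0
# uses internally, as an alternative to mutating global state with
# urllib.request.install_opener(). Names here are hypothetical.
import urllib.request


def build_request(url: str, user_agent: str = "Mozilla/5.0") -> urllib.request.Request:
    # Attach the header to this one request only; no global opener needed.
    return urllib.request.Request(url, headers={"User-Agent": user_agent})


def fetch(url: str, user_agent: str = "Mozilla/5.0") -> bytes:
    with urllib.request.urlopen(build_request(url, user_agent)) as response:
        return response.read()
```

Scoping the header to a single request avoids surprising other libraries in the same process that also use `urllib`, which is a known drawback of `install_opener`.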


Member


I'm not sure we can remove the User-Agent from the HTTP requests that download MNIST in the non-PyTorch examples, including Chainer. Maybe we should discuss this in #2469.

Member Author


Thanks, you're absolutely right. I've dropped the removal of these UA configurations.

@codecov-io

Codecov Report

Merging #2442 (3874574) into master (590d4aa) will decrease coverage by 1.58%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #2442      +/-   ##
==========================================
- Coverage   91.36%   89.78%   -1.59%     
==========================================
  Files         135      135              
  Lines       11308    11308              
==========================================
- Hits        10332    10153     -179     
- Misses        976     1155     +179     
Impacted Files                       Coverage Δ
optuna/integration/allennlp.py       0.00% <0.00%> (-88.12%) ⬇️
optuna/storages/_cached_storage.py   96.74% <0.00%> (-0.33%) ⬇️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 590d4aa...3874574. Read the comment docs.

@hvy hvy marked this pull request as ready for review March 12, 2021 03:45
@hvy
Member Author

hvy commented Mar 12, 2021

PTAL.

@HideakiImamura
Member

@toshihikoyanase @himkt Could you review this PR?

Member

@HideakiImamura HideakiImamura left a comment


LGTM. I think it makes sense to run the non-AllenNLP tests first and the AllenNLP tests afterwards; the two can be merged back into one job after the next AllenNLP release.

Member

@himkt himkt left a comment


Let me merge this PR as we have a consensus that we should bump torch ASAP.

@himkt himkt added this to the v2.7.0 milestone Mar 17, 2021
@himkt himkt merged commit d0aa2e9 into optuna:master Mar 17, 2021
@hvy hvy deleted the bump-torch branch March 18, 2021 01:17
@hvy hvy mentioned this pull request Mar 18, 2021

Labels

installation Installation and dependency.


Development

Successfully merging this pull request may close these issues.

Bump PyTorch version to 1.8

5 participants