This repository was archived by the owner on Nov 3, 2023. It is now read-only.
Hi,

I'm using ray-lightning 0.2.0 with PyTorch Lightning (PL) 1.5.9 to run hyperparameter optimization in parallel.

The following script never finishes: the training workers keep running even after my test metrics are printed (so the test function itself has completed). If I remove `trainer.test()`, the script finishes normally.

I have tested the same code with ray-lightning 0.1.1 and PL 1.4.9, and it works fine there.
```python
import pytorch_lightning as pl
from ray import tune
from ray_lightning import RayPlugin
from ray_lightning.tune import TuneReportCallback, get_tune_resources


def train(config):
    metrics = {"loss": "val_loss"}
    callbacks = [TuneReportCallback(metrics, on="validation_end")]
    model = md.MyModel(...)
    datamodule = ...
    trainer = pl.Trainer(
        max_epochs=max_epochs,
        logger=logger,
        callbacks=callbacks,
        plugins=[RayPlugin(num_workers=2)])
    trainer.fit(model, datamodule)
    trainer.test(model, datamodule)  # OK without this line


if __name__ == '__main__':
    analysis = tune.run(
        train,
        metric="loss",
        mode="min",
        config=training_config,
        num_samples=1,
        resources_per_trial=get_tune_resources(num_workers=2),
        name="tune_mnist")
    print("\n\nBest hyperparameters found were: ", analysis.best_config)
```