Conversation
85b77ab to
1f6f644
fix README.md file name casing
1f6f644 to
8d585bb
[](https://python.org/downloads)
[](https://pypi.org/project/torch_sim_atomistic)
[](https://github.com/torchsim/torch-sim/actions/workflows/test.yml)
[](https://codecov.io/gh/torchsim/torch-sim)
https://codecov.io/gh/torchsim/torch-sim is a broken link currently
tried adding the codecov app to this new org. seems to have worked, but it's not showing up for me in the web UI (yet)...
  if step % 10 == 0:
-     hybrid_state = swap_step(hybrid_state, kT=torch.tensor(kT), generator=generator)
+     hybrid_state = ts.swap_mc_step(
+         model=model, state=hybrid_state, kT=kT, seed=42 + step
+     )
why not keep passing the same generator? reseeding every step is worse for entropy than continuing to draw from the same PRNG sequence
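The concern above can be illustrated with a small, self-contained sketch. This uses the stdlib `random` module as a stand-in for `torch.Generator`; `reseeded_draws` and `single_generator_draws` are hypothetical names for illustration, not torch-sim API:

```python
import random

def reseeded_draws(n_steps):
    # Mimics the `seed=42 + step` pattern: a fresh PRNG is built every
    # step, and only the first output of each freshly seeded sequence
    # is ever consumed.
    return [random.Random(42 + step).random() for step in range(n_steps)]

def single_generator_draws(n_steps):
    # One long-lived generator: each step continues the same well-mixed
    # PRNG sequence, which is the usual recommendation for simulations.
    gen = random.Random(42)
    return [gen.random() for _ in range(n_steps)]

print(reseeded_draws(3))
print(single_generator_draws(3))
```

Both variants are reproducible, but the reseeded one ties the stream's statistical quality to how well nearby seeds decorrelate, while the single generator relies on the PRNG's own sequence guarantees.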
  # %% Create individual filenames for each system
- filenames = [f"batch_traj_{i}.h5" for i in range(len(systems))]
+ filenames = [f"tmp/batch_traj_{i}.h5" for i in range(len(systems))]
nit: Do you want to auto-clean up at the end of the tutorial and delete tmp?
better not, some people (like myself) may have other stuff in their tmp/ folder which they want to keep
could write to a subfolder in tmp/ and clean that up instead. then the only minor concern is that people might want to manually inspect the files and be annoyed if they need to comment out the removal code and rerun the tutorial
I think it's fine letting people clean up their own tutorials
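The compromise floated above (a dedicated subfolder that is safe to delete) could look something like the sketch below. `tempfile.mkdtemp` guarantees a fresh directory, so cleanup cannot touch a user's other files; the file count and naming are assumptions mirroring the tutorial, not its actual code:

```python
import shutil
import tempfile
from pathlib import Path

# Write trajectories into a dedicated, freshly created subdirectory so
# that deleting it afterwards cannot remove unrelated files in tmp/.
out_dir = Path(tempfile.mkdtemp(prefix="batch_traj_"))
n_systems = 4  # placeholder for len(systems) in the tutorial
filenames = [out_dir / f"batch_traj_{i}.h5" for i in range(n_systems)]
for path in filenames:
    path.touch()  # stands in for the actual trajectory writing

# Optional cleanup; comment this out to inspect the files manually.
shutil.rmtree(out_dir)
```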
tests/test_integrators.py
Outdated
  temperatures = []
  for _step in range(n_steps):
-     state = update_fn(state=state)
+     state = ts.npt_langevin_update(
feels like the refactor likely introduces a bug here: shouldn't we be passing a generator to this update function?
i don't follow. you mean we need to externally seed?
torch-sim/torch_sim/integrators/npt.py
Line 152 in 49be107
I would think (not sure, haven't and won't check) that before the refactor, the init function that returned the update function created a generator in the factory, and that generator drove the PRNG sequence inside the update. Here it's unseeded, which is probably fine for now, but would be better to fix alongside something like #51
That would have been the right way to do it but not what we did. Fixing the seeds doesn't really work on most integrators beyond init. Agree it can be addressed later.
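For readers following along, here is a minimal sketch of the two designs being contrasted, with the stdlib `random` module standing in for `torch.Generator`; all names here are hypothetical, not torch-sim's actual API:

```python
import random

def make_update_fn(seed=None):
    # Factory style (the old design): init builds the generator once and
    # the returned update function closes over it, so every call
    # advances one seeded PRNG sequence.
    gen = random.Random(seed)
    def update_fn(state):
        return state + gen.gauss(0.0, 1.0)
    return update_fn

def module_level_update(state, generator=None):
    # Module-scope style (the new design): the caller owns the generator
    # and threads it through explicitly; when none is given, the call
    # falls back to a fresh unseeded stream, as discussed above.
    gen = generator if generator is not None else random.Random()
    return state + gen.gauss(0.0, 1.0)
```

With the module-scope style, reproducibility is recovered by creating one generator outside the loop and passing it to every call.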
for #211, no; for #127, a few minor changes to bring it in line with the new API (was still using
…id_swap_tutorial.py
Lots of bug fixes, type fixes, and moving integrator + optimizer functions into module scope (for easier typing and an easier-to-reason-about API, since there are no more function closures over models). This PR includes the fairchem v2 PR #238 and #127 by @stefanbringuier.
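The "no more function closures over models" point can be sketched as a before/after; `ToyModel`, `init_integrator`, and `integrator_update` are illustrative stand-ins, not torch-sim code:

```python
class ToyModel:
    """Stand-in for an interatomic potential model."""
    def forces(self, positions):
        return [-x for x in positions]  # toy restoring force

# Before: a factory returns an update closure that captures `model`,
# which is harder to type-annotate and hides state in the closure.
def init_integrator(model, dt=0.01):
    def update_fn(positions):
        f = model.forces(positions)
        return [x + dt * fx for x, fx in zip(positions, f)]
    return update_fn

# After: a module-scope function takes `model` explicitly, so the full
# signature is visible at the call site and easy to annotate.
def integrator_update(model, positions, dt=0.01):
    f = model.forces(positions)
    return [x + dt * fx for x, fx in zip(positions, f)]
```

Both produce identical updates; the module-scope form just makes every dependency an explicit parameter.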