
Update transformers requirement from >=4.52.4 to >=4.57.6#8

Open
dependabot[bot] wants to merge 1 commit into main from dependabot/pip/transformers-gte-4.57.6

Conversation


@dependabot commented on behalf of GitHub on May 1, 2026

Updates the requirements on transformers to permit the latest version.

Release notes

Sourced from transformers' releases.

Patch release v4.57.6

What's Changed

Another fix for Qwen-VL models, which previously prevented the associated model type from loading correctly; it works together with huggingface/transformers#41808 from the previous patch release.

Full Changelog: huggingface/transformers@v4.57.5...v4.57.6

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [transformers](https://github.com/huggingface/transformers) to permit the latest version.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v4.52.4...v4.57.6)

---
updated-dependencies:
- dependency-name: transformers
  dependency-version: 4.57.6
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
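The effect of the bump is simply a higher lower bound on the version specifier. A quick stdlib-only sanity check (illustrative, not part of the PR):

```python
def parse(version: str) -> tuple:
    """Parse a plain numeric version like '4.57.6' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

old_floor = parse("4.52.4")
new_floor = parse("4.57.6")

# The new floor admits the latest patch release and excludes the old minimum.
assert new_floor > old_floor
assert parse("4.57.6") >= new_floor
assert not parse("4.52.4") >= new_floor
```

Note this naive tuple comparison only handles plain numeric versions; real specifier checks should use a PEP 440-aware library such as `packaging`.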
@dependabot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update python code) labels on May 1, 2026
FortPercent pushed a commit that referenced this pull request May 2, 2026
…ing null

`context_null_path` is required for working CFG (otherwise
cond + scale*(cond - 0) = (1+scale)*cond and reward variance collapses to
zero — silent grad_norm=0 across every previous smoke).

- Replace 3 legacy preprocessors + 2 launcher shells with one tool:
    data_preprocess/prepare_wan_data.py
  Accepts .txt (one prompt per line) or .json (list of {caption, ...}).
  Loads the umT5 encoder lazily; per-row encode is skipped when
  `context_path` already points at an existing .npy; the negative-prompt
  encode is skipped when `context_null.npy` already exists. Re-running is
  safe and a no-op if everything is already in place.
- Default negative prompt = Wan official Chinese template (overridable via
  `--negative_prompt`).
- Dataset (`teleboost/utils/dataset/_dancegrpo_rl_dataset.py`): swap the
  silent `null_context = torch.zeros_like(context)` fallback for a
  KeyError that names the fix command. The fallback was a documented
  training-killer pretending to be a "smoke convenience".
- README: new "Prepare data" section before "Run a smoke", updated table to
  include all six 14B Wan22 smoke variants we ship; remove the
  "*_wxe.sh removed" sentence (no longer relevant after cleanup).
- docs/install_from_scratch.md: rewrite the data-prep section accordingly,
  rephrase gotcha #8 to describe the fail-fast behavior.
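
The two behaviors above (re-run-safe skipping and the fail-fast lookup) might look roughly like this; function names and the error message are illustrative, not the actual prepare_wan_data.py or dataset code:

```python
from pathlib import Path

def encode_if_missing(path: Path, encode_fn) -> bytes:
    """Skip the (expensive) encode when the output file already exists,
    so re-running the tool is a no-op for rows that are already done."""
    if path.exists():
        return path.read_bytes()
    data = encode_fn()
    path.write_bytes(data)
    return data

def get_null_context_path(row: dict) -> str:
    """Fail fast with an actionable error instead of silently
    substituting a zero tensor for the missing null context."""
    if "context_null_path" not in row:
        raise KeyError(
            "context_null_path missing; run data_preprocess/prepare_wan_data.py "
            "to generate context_null.npy"
        )
    return row["context_null_path"]
```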

Removed:
- data_preprocess/preprocess_wan_data.py
- data_preprocess/preprocess_wan_embeddings.py
- data_preprocess/preprocess_wan_embeddings_fromlist.py
- data_preprocess/preprocess_wan_rl_embeddings{,_1p3B}.sh

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
FortPercent pushed a commit that referenced this pull request May 7, 2026
Full smoke coverage results:
  #7  14B + HPSv2           PASS (33s/step cold, 26s warm)
  #8  1.3B + HPSv2          PASS (703s cold incl. inductor compile, 3.6s warm)
  #9  1.3B + Qwen-VL-7B     PASS (31s cold, 6.7s warm; rewards=44→19.75, real values)
  #11 1.3B + Joint 4-reward PASS (80s cold, 12s warm; 4 worker init OK)

New dependency: pip install decord (used by the videophy reward)
FortPercent pushed a commit that referenced this pull request May 7, 2026
