
hotfix: add 9.0a to README and installation doc #2112

Merged

yzh119 merged 1 commit into flashinfer-ai:main from yzh119:install-9.0a on Nov 20, 2025

Conversation


@yzh119 yzh119 commented Nov 19, 2025

📌 Description

9.0a was accidentally removed from the installation documentation in some recent PRs.

🔍 Related Issues

🚀 Pull Request Checklist

Thank you for contributing to FlashInfer! Before we review your pull request, please make sure the following items are complete.

✅ Pre-commit Checks

  • I have installed pre-commit by running pip install pre-commit (or used your preferred method).
  • I have installed the hooks with pre-commit install.
  • I have run the hooks manually with pre-commit run --all-files and fixed any reported issues.

If you are unsure about how to set up pre-commit, see the pre-commit documentation.

🧪 Tests

  • Tests have been added or updated as needed.
  • All tests are passing (unittest, etc.).

Reviewer Notes

Summary by CodeRabbit

Release Notes

  • New Features

    • Customizable Attention support
    • CUDAGraph and torch.compile compatibility
    • High-level LLM-specific operators
  • Documentation

    • Expanded installation and setup guidance with improved build instructions
    • Enhanced package options and adoption references
    • Added support for additional CUDA architectures (9.0a, 12.0f)
    • Faster offline initialization option documented

@yzh119 yzh119 enabled auto-merge (squash) November 19, 2025 10:30

coderabbitai Bot commented Nov 19, 2025

Walkthrough

Documentation updates including new feature highlights (Customizable Attention, CUDAGraph/torch.compile compatibility, high-level LLM operators), expanded adoption references, enhanced installation guidance, and CUDA architecture support additions (9.0a, 12.0f) across README and installation docs.

Changes

Cohort / File(s): Documentation Updates (README.md, docs/installation.rst)
Summary: Added feature highlights and usage notes; expanded the News and Adoption sections; updated the CUDA architecture list to include 9.0a and 12.0f in build instructions; enhanced package installation guidance with faster offline initialization notes and improved command examples; minor formatting adjustments.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

  • Documentation-only changes with no code logic modifications
  • Additions are descriptive text, feature highlights, and CUDA configuration updates
  • No API changes or exported entity alterations

Poem

🐰 New features shine in the README bright,
CUDA arches expanded—9.0a takes flight!
Installation guides now sparkle anew,
With adoption tales and compiler tricks too,
The docs hop forward, hooray for the view! 🎉

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)

  • Title check: Passed. The title accurately describes the main change: restoring CUDA architecture 9.0a to both README and installation documentation after accidental removal.
  • Description check: Passed. The description explains the purpose (restoring 9.0a), includes pre-commit checklist completion, and confirms tests are passing. The Related Issues section is empty but non-critical.
  • Docstring Coverage: Passed. No functions were found in the changed files, so the docstring coverage check was skipped.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b9964cc and 7e5f86e.

📒 Files selected for processing (2)
  • README.md (6 hunks)
  • docs/installation.rst (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Deploy Docs
🔇 Additional comments (3)
docs/installation.rst (1)

95-95: CUDA architecture list correctly restored.

The 9.0a architecture is properly added to FLASHINFER_CUDA_ARCH_LIST in ascending numerical order between 8.9 and 10.0a. This aligns with the README.md change and restores the inadvertently removed architecture support.

README.md (2)

99-99: CUDA architecture list correctly restored with consistent formatting.

The 9.0a addition matches the docs/installation.rst change and maintains proper ascending order in the export statement. Format is correct for bash shell commands.


23-30: Scope extends well beyond the PR title and objective.

The PR is titled "hotfix: add 9.0a" but includes substantial documentation additions: new core features (lines 23-30), expanded News section (lines 35-40), clarified Package Options (lines 54-68), updated Adoption section (lines 183-192), and verification output details (lines 130-133). While these changes appear legitimate and beneficial, please confirm whether they are intentional as part of this hotfix or represent unintended scope creep from a previous merge/rebase.

Also applies to: 35-40, 54-68, 81-84, 130-133, 183-192



@gemini-code-assist

Summary of Changes

Hello @yzh119, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request provides a hotfix to correct an accidental omission in the project's documentation. It restores the 9.0a CUDA architecture version to the FLASHINFER_CUDA_ARCH_LIST environment variable examples, ensuring that users have accurate and complete information when building flashinfer-jit-cache from source. This change prevents potential issues related to unsupported or missing architecture configurations during the installation process.
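
The restored list from this PR can be exercised with a small shell sketch that sets the variable and checks that 9.0a survived; the build command in the trailing comment is illustrative only, not taken from this PR:

```shell
# Architecture list as restored by this PR; 9.0a targets Hopper (H100)
# with architecture-specific features enabled.
export FLASHINFER_CUDA_ARCH_LIST="7.5 8.0 8.9 9.0a 10.0a 10.3a 11.0a 12.0f"

# Sanity check: fail loudly if 9.0a is ever dropped from the list again.
case "$FLASHINFER_CUDA_ARCH_LIST" in
  *"9.0a"*) echo "9.0a present" ;;
  *)        echo "9.0a missing" >&2; exit 1 ;;
esac

# A source build would then pick the list up from the environment,
# e.g. (hypothetical invocation):
#   pip install -v --no-build-isolation .
```

Running the check before kicking off a long source build catches a truncated list early instead of after compilation finishes.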

Highlights

  • Documentation Correction: The 9.0a CUDA architecture version has been re-added to the FLASHINFER_CUDA_ARCH_LIST environment variable examples in both the README.md and docs/installation.rst files.
  • Formatting Improvements: Minor whitespace adjustments, primarily adding blank lines, have been made in the README.md file to improve readability.


@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request correctly adds 9.0a to the FLASHINFER_CUDA_ARCH_LIST in both README.md and docs/installation.rst, fixing an accidental removal. The changes are correct and address the issue described. I have one suggestion regarding commit hygiene to improve reviewability in the future, but the PR is good to merge.

Comment thread: README.md

```diff
-export FLASHINFER_CUDA_ARCH_LIST="7.5 8.0 8.9 10.0a 10.3a 11.0a 12.0f"
+export FLASHINFER_CUDA_ARCH_LIST="7.5 8.0 8.9 9.0a 10.0a 10.3a 11.0a 12.0f"
```
Severity: medium

The addition of 9.0a is correct. However, this PR mixes this logical change with numerous formatting adjustments (adding/removing blank lines) throughout the file, which were likely introduced by an auto-formatter. This makes the diff noisy and obscures the primary purpose of the hotfix. For future changes, it would be best to separate stylistic/formatting changes into a separate commit from functional or documentation content changes to improve reviewability.

@yzh119 yzh119 merged commit 049e8db into flashinfer-ai:main Nov 20, 2025
4 checks passed
BingooYang pushed a commit to BingooYang/flashinfer that referenced this pull request Mar 13, 2026
