
Conversation

@MekkCyber (Contributor)

Reverts #41563

Reverting to check what's happening

@SunMarc (Member) commented Oct 14, 2025

ImportError while loading conftest '/admin/home/marc/transformers/conftest.py'.
conftest.py:27: in <module>
    from transformers.testing_utils import (
src/transformers/testing_utils.py:53: in <module>
    from transformers import Trainer
src/transformers/utils/import_utils.py:1899: in __getattr__
    module = self._get_module(self._class_to_module[name])
src/transformers/utils/import_utils.py:1929: in _get_module
    raise e
src/transformers/utils/import_utils.py:1927: in _get_module
    return importlib.import_module("." + module_name, self.__name__)
src/transformers/trainer.py:41: in <module>
    from .integrations import (
src/transformers/utils/import_utils.py:1899: in __getattr__
    module = self._get_module(self._class_to_module[name])
src/transformers/utils/import_utils.py:1929: in _get_module
    raise e
src/transformers/utils/import_utils.py:1927: in _get_module
    return importlib.import_module("." + module_name, self.__name__)
src/transformers/integrations/integration_utils.py:43: in <module>
    from .. import PreTrainedModel, TrainingArguments
src/transformers/utils/import_utils.py:1899: in __getattr__
    module = self._get_module(self._class_to_module[name])
src/transformers/utils/import_utils.py:1929: in _get_module
    raise e
src/transformers/utils/import_utils.py:1927: in _get_module
    return importlib.import_module("." + module_name, self.__name__)
src/transformers/modeling_utils.py:60: in <module>
    from .integrations.eager_paged import eager_paged_attention_forward
src/transformers/integrations/eager_paged.py:6: in <module>
    from ..generation.continuous_batching.cache import PagedAttentionCache
src/transformers/generation/continuous_batching/__init__.py:16: in <module>
    from .continuous_api import ContinuousBatchingManager, ContinuousMixin
src/transformers/generation/continuous_batching/continuous_api.py:33: in <module>
    from ...integrations.hub_kernels import load_and_register_attn_kernel
src/transformers/integrations/hub_kernels.py:141: in <module>
    register_kernel_mapping(_KERNEL_MAPPING)
../miniconda3/envs/test/lib/python3.10/site-packages/kernels/layer.py:699: in register_kernel_mapping
    feature_repos = device_repo.setdefault(device.type, device.create_repo())
../miniconda3/envs/test/lib/python3.10/site-packages/kernels/layer.py:129: in create_repo
    raise ValueError(f"Unknown device type: {self.type}")
E   ValueError: Unknown device type: xpu

Not stable on the kernels side; let's fix this first.
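For context, the registration happens at import time in hub_kernels.py, so any ValueError raised by kernels aborts `import transformers` entirely. Below is a minimal sketch of how that call could be guarded, assuming `register_kernel_mapping` is imported from `kernels` as hub_kernels.py does; the placeholder mapping and the warning text are illustrative only, not the actual fix:

```python
# Hypothetical guard around the module-level registration in hub_kernels.py;
# _KERNEL_MAPPING is a placeholder for the real layer -> kernel-repo mapping.
import logging

from kernels import register_kernel_mapping

logger = logging.getLogger(__name__)

_KERNEL_MAPPING = {}  # placeholder; the real mapping is built in hub_kernels.py

try:
    register_kernel_mapping(_KERNEL_MAPPING)
except ValueError as e:
    # Older `kernels` releases raise ValueError("Unknown device type: ...")
    # for device types they do not know about (e.g. "xpu").
    logger.warning(
        "Could not register the hub kernel mapping (%s); "
        "upgrading the `kernels` package may fix this.",
        e,
    )
```

With a guard like this, an outdated kernels install would only lose the hub kernel mapping instead of making transformers unimportable.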

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@MekkCyber merged commit ae6f6cc into main on Oct 14, 2025 (26 checks passed).
@MekkCyber deleted the revert-41563-xpu-rmsnorm-kernel branch on October 14, 2025 at 13:49.
@kaixuanliu (Contributor) commented Oct 14, 2025

Hi @SunMarc @MekkCyber, the issue you mentioned above may be solved by this PR: huggingface/kernels#141, contributed by @YangKai0616 from our team. Can you double-check with the latest version of kernels? Many thanks!

@SunMarc (Member) commented Oct 14, 2025

Yeah, we need to pin the kernels version to the latest one; otherwise, users who upgrade transformers might face the same issue as me. Or we need some kind of check for the different devices that asks users to upgrade. Right now it is impacting all users, not only XPU users.

@MekkCyber (Contributor, Author)

I will add a general check that asks users to upgrade before using the mapping, to resolve the issue for now.
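A rough sketch of what such a check might look like, gating the mapping registration on the installed kernels version; the helper name `kernels_supports_hub_mapping` and the minimum version `0.10.0` are assumptions for illustration, not the actual change that landed:

```python
# Hypothetical version gate before registering the kernel mapping; the minimum
# version below is an assumption, not the value transformers actually uses.
import importlib.metadata
import logging

from packaging import version

logger = logging.getLogger(__name__)

# Assumed: the first kernels release that recognizes the "xpu" device type.
_MIN_KERNELS_VERSION = "0.10.0"


def kernels_supports_hub_mapping() -> bool:
    """Return True if the installed `kernels` package is recent enough to use the mapping."""
    try:
        installed = version.parse(importlib.metadata.version("kernels"))
    except importlib.metadata.PackageNotFoundError:
        return False
    return installed >= version.parse(_MIN_KERNELS_VERSION)


if not kernels_supports_hub_mapping():
    logger.warning(
        "Installed `kernels` is older than %s; skipping the hub kernel mapping. "
        "Upgrade with `pip install -U kernels`.",
        _MIN_KERNELS_VERSION,
    )
else:
    # Safe to register here, e.g. register_kernel_mapping(_KERNEL_MAPPING).
    pass
```

A version pin in transformers' dependencies would achieve the same thing at install time, but a runtime check like this also covers users who already have an older kernels installed.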

ngazagna-qc pushed a commit to ngazagna-qc/transformers that referenced this pull request on Oct 23, 2025:
Revert "add rmsnorm kernels support for Intel XPU (huggingface#41563)"

This reverts commit fd787c5.