How to Install xformers
xformers ships prebuilt wheels through the PyTorch package index, not PyPI. Like PyTorch itself, the correct binary depends on the CUDA version of the target machine (see Why Installing GPU Python Packages Is So Complicated for background). Each xformers release is pinned to a specific PyTorch version, so a version mismatch between the two is the most common source of installation failures.
Requirements
- Linux (manylinux_2_28, e.g. Ubuntu 20.04+) or Windows. macOS is not supported.
- NVIDIA GPU with compute capability 8.0+ (Ampere or newer). V100 support was dropped in xformers 0.0.30.
- NVIDIA driver compatible with the target CUDA version.
- Python 3.9 or later.
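The compute-capability requirement is easy to test in code. A minimal sketch, assuming the (major, minor) tuple format that torch.cuda.get_device_capability() returns (the meets_minimum helper is illustrative, not part of xformers):

```python
def meets_minimum(capability: tuple[int, int], minimum: tuple[int, int] = (8, 0)) -> bool:
    # Python compares tuples element-wise, so (8, 6) >= (8, 0) is True
    # and (7, 0) >= (8, 0) is False. On a GPU machine, pass
    # torch.cuda.get_device_capability() as `capability`.
    return capability >= minimum

print(meets_minimum((8, 6)))  # True  (Ampere-class GPU)
print(meets_minimum((7, 0)))  # False (V100, dropped in xformers 0.0.30)
```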
Version compatibility
Each xformers release requires a specific PyTorch version. Installing the wrong combination produces import errors or silent correctness bugs.
| xformers | PyTorch | CUDA indexes |
|---|---|---|
| 0.0.35 | >= 2.10 | cu126, cu128, cu130 |
| 0.0.34 | >= 2.10 | cu126, cu128, cu130 |
| 0.0.33.post2 | 2.9.1 | cu126, cu128 |
| 0.0.31 | 2.7.1 | cu124, cu126, cu128 |
| 0.0.29.post3 | >= 2.6.0 | cu124 |
When in doubt, install the latest xformers and let it pull a compatible PyTorch.
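For scripted environment checks, the table can be encoded directly. This sketch hard-codes the rows above; the COMPAT dict and cuda_indexes helper are illustrative, not part of any library:

```python
# Rows copied from the compatibility table above.
COMPAT = {
    "0.0.35": {"torch": ">= 2.10", "cuda": ["cu126", "cu128", "cu130"]},
    "0.0.34": {"torch": ">= 2.10", "cuda": ["cu126", "cu128", "cu130"]},
    "0.0.33.post2": {"torch": "2.9.1", "cuda": ["cu126", "cu128"]},
    "0.0.31": {"torch": "2.7.1", "cuda": ["cu124", "cu126", "cu128"]},
    "0.0.29.post3": {"torch": ">= 2.6.0", "cuda": ["cu124"]},
}

def cuda_indexes(xformers_version: str) -> list:
    """Return the CUDA index tags that carry wheels for a given release."""
    return COMPAT[xformers_version]["cuda"]

print(cuda_indexes("0.0.31"))  # ['cu124', 'cu126', 'cu128']
```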
Install from the PyTorch index
The PyTorch package index hosts prebuilt xformers wheels. Pick the URL that matches the CUDA version on the machine:
```shell
# CUDA 12.8
pip install -U xformers --index-url https://download.pytorch.org/whl/cu128

# CUDA 12.6
pip install -U xformers --index-url https://download.pytorch.org/whl/cu126
```

This also installs (or upgrades) a compatible PyTorch build from the same index. The --index-url flag replaces PyPI entirely, so all packages in the install command resolve from the PyTorch index.
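The index URLs follow a mechanical pattern: cu plus the CUDA version with the dot removed. A small helper illustrating that convention (the index_url function is mine, for illustration only):

```python
def index_url(cuda_version: str) -> str:
    # "12.8" -> "cu128" -> full PyTorch index URL
    tag = "cu" + cuda_version.replace(".", "")
    return f"https://download.pytorch.org/whl/{tag}"

print(index_url("12.8"))  # https://download.pytorch.org/whl/cu128
print(index_url("12.6"))  # https://download.pytorch.org/whl/cu126
```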
Warning
Do not install xformers from PyPI with a bare pip install xformers. PyPI carries only source distributions for old versions and a single wheel for the latest version. The PyTorch index is the intended distribution channel.
Add to a uv project
To add xformers to a project managed by uv, configure the PyTorch index in pyproject.toml and route both torch and xformers to it. This follows the same pattern described in How to Install PyTorch with uv:
```toml
[[tool.uv.index]]
name = "pytorch-cu128"
url = "https://download.pytorch.org/whl/cu128"
explicit = true

[tool.uv.sources]
torch = [
  { index = "pytorch-cu128", marker = "sys_platform == 'linux' or sys_platform == 'win32'" },
]
xformers = [
  { index = "pytorch-cu128", marker = "sys_platform == 'linux' or sys_platform == 'win32'" },
]
```

Then add the dependency and sync:
```shell
uv add xformers
uv sync
```

Both torch and xformers must be routed to the same index. If xformers is listed in [tool.uv.sources] but torch is not (or vice versa), the resolver may pull mismatched builds.
For a quick one-off install with uv pip:
```shell
uv pip install xformers --torch-backend=cu128
```

Install with conda-forge or pixi
xformers is available on conda-forge with CUDA variants managed through conda’s virtual package system.
With pixi:

```shell
pixi add xformers
```

With conda:

```shell
conda install -c conda-forge xformers
```

Conda resolves the CUDA version as a shared dependency across PyTorch and xformers, which avoids the index-URL coordination that pip requires. See uv vs. pixi vs. conda for scientific Python for help deciding between toolchains.
Verify the installation
After installing, confirm that xformers loads and can run on the GPU:
```python
import torch
import xformers.ops

q = torch.randn(1, 128, 8, 64, device="cuda")
k = torch.randn(1, 128, 8, 64, device="cuda")
v = torch.randn(1, 128, 8, 64, device="cuda")

out = xformers.ops.memory_efficient_attention(q, k, v)
print(out.shape)  # torch.Size([1, 128, 8, 64])
```

Troubleshooting
- ImportError: cannot import name '_C' or undefined symbol errors: xformers and PyTorch were compiled against different CUDA versions. Reinstall both from the same --index-url.
- RuntimeError: No available kernel at runtime: the GPU's compute capability is below the minimum for that xformers version. Check torch.cuda.get_device_capability() and verify it returns (8, 0) or higher.
- No matching distribution found for xformers: the --index-url does not carry wheels for the requested Python version or platform. xformers wheels are only published for Linux (x86_64) and Windows (AMD64).
- Source build fails with No module named 'torch': when building from source, install PyTorch first, then install xformers with --no-build-isolation so the build can find the existing torch installation.
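For the first failure mode, a quick way to spot a mismatch is to compare the local version tags of the two installed packages. This sketch assumes the +cuXYZ local-version convention used by wheels from the PyTorch index (wheels from other channels may carry no tag); the cuda_tag helper is illustrative:

```python
from typing import Optional

def cuda_tag(version: str) -> Optional[str]:
    """Extract the +cuXYZ local tag from a version string, if present."""
    _, _, local = version.partition("+")
    return local if local.startswith("cu") else None

# On a real machine, feed it importlib.metadata.version("torch") and
# importlib.metadata.version("xformers") and check that both tags match.
print(cuda_tag("2.7.1+cu128"))  # cu128
print(cuda_tag("0.0.31"))       # None
```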