
fix: replace all instances of deprecated torch._check_is_size with torch._check#1940

Merged
matthewdouglas merged 1 commit into bitsandbytes-foundation:main from
neil-the-nowledgeable:fix/check-is-size-deprecation
May 7, 2026

Conversation

@neil-the-nowledgeable (Contributor) commented May 7, 2026

Summary

Closes #1933.

PyTorch's torch._check_is_size is slated for removal in a future release per pytorch/pytorch#169400; the FutureWarning advises "Use _check(i >= 0) instead". This PR replaces every remaining instance with the recommended substitution.

Scope

23 occurrences across 6 files:

| File | Hits |
| --- | --- |
| bitsandbytes/_ops.py | 8 |
| bitsandbytes/backends/triton/ops.py | 5 |
| bitsandbytes/backends/default/ops.py | 4 |
| bitsandbytes/backends/cpu/ops.py | 3 |
| bitsandbytes/backends/cuda/ops.py | 2 |
| bitsandbytes/backends/hpu/ops.py | 1 |

Each call of the form torch._check_is_size(blocksize) is replaced with the upstream-recommended pattern:

torch._check(blocksize >= 0, lambda: f"Blocksize must be non-negative, got {blocksize}")

This matches the existing torch._check(...) style already used at adjacent lines in the same files.
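As a minimal sketch of the substitution (using a stand-in for torch._check so the snippet runs without torch installed; the real torch._check takes a boolean condition and an optional zero-argument message callable):

```python
# Minimal stand-in for torch._check: raises when the condition is false,
# invoking the optional zero-argument message callable only on failure.
def _check(cond, message=None):
    if not cond:
        raise RuntimeError(message() if message is not None else "Expected cond to be True")

blocksize = 64

# Old (deprecated) call shape:
#   torch._check_is_size(blocksize)
# New call shape used throughout this PR:
_check(blocksize >= 0, lambda: f"Blocksize must be non-negative, got {blocksize}")

# A negative value now fails with the descriptive message:
try:
    bad = -1
    _check(bad >= 0, lambda: f"Blocksize must be non-negative, got {bad}")
except RuntimeError as e:
    print(e)  # Blocksize must be non-negative, got -1
```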

Backwards compatibility

torch._check is available in all torch >= 2.0; bitsandbytes requires torch >= 2.2 per pyproject.toml. No backwards-compat shim needed.

Test plan

  • pre-commit run --files <changed> passes (ruff + ruff-format + hygiene hooks)
  • Replacement string verified at all 23 sites (count delta zero: 23 removed, 23 added)
  • Diff preserves indentation; no other code paths touched

Additional audits

Audit structure adopted from @rapsealk's #1931 (asserts → exceptions refactor) for similar mechanical-substitution PRs.

  • python -m py_compile on all 6 changed files: all pass
  • No leftover torch._check_is_size anywhere in bitsandbytes/ after the substitution (count went 23 → 0)
  • Argument variable is blocksize at all 23 sites — no slip in argument naming
  • Call-site context: every call sits inside either a @register_fake(...) (FakeTensor / dispatch hook) or a @register_kernel(...) (backend kernel impl) decorator
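The "count went 23 → 0" audit above can be reproduced with a small helper like the following (a hypothetical script, not part of the PR; the function name and defaults are illustrative):

```python
# Hypothetical audit helper: count remaining occurrences of a pattern
# across all Python files under a package directory.
from pathlib import Path

def count_call_sites(root: str, needle: str = "torch._check_is_size") -> int:
    """Sum raw substring occurrences of `needle` over every *.py file under `root`."""
    return sum(
        p.read_text(encoding="utf-8").count(needle)
        for p in Path(root).rglob("*.py")
    )
```

After the substitution, count_call_sites("bitsandbytes") would be expected to return 0.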

Semantic equivalence under SymInt (FakeTensor / torch.compile)

_check_is_size(i) does two things: (1) asserts i >= 0, and (2) informs PyTorch's dynamic-shape engine that i is a size (i.e., non-negative). The substitution _check(blocksize >= 0, ...) preserves both behaviors:

  • For plain Python int, both produce identical runtime checks.
  • For SymInt (which can flow through register_fake under torch.compile), blocksize >= 0 evaluates to SymBool and torch._check of a SymBool informs the dynamic-shape engine of the same >= 0 guard.

This matches the upstream migration explicitly recommended in pytorch/pytorch#169400 ("Use _check(i >= 0) instead").
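The plain-int equivalence can be illustrated with toy models of both APIs (torch is not required here; real SymInt behavior only arises inside a FakeTensor / torch.compile context, which this sketch does not model):

```python
# Toy models of the two APIs for plain Python ints.
def _check(cond, message=None):
    if not cond:
        raise RuntimeError(message() if message is not None else "check failed")

def _check_is_size(i):
    # For a plain int, the old API reduces to a non-negativity assertion.
    _check(i >= 0, lambda: f"Expected size, got {i}")

# Both checks accept and reject the same plain-int inputs:
for i in (0, 1, 64, 4096):
    _check_is_size(i)      # old
    _check(i >= 0)         # new

# Note: the lambda message is only evaluated on failure, so the f-string
# costs nothing on the common passing path.
```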

Out of scope

  • csrc/ (C++/CUDA) — _check_is_size is a Python-only API
  • tests/ and examples/ — no calls to _check_is_size outside bitsandbytes/ package code

PyTorch's _check_is_size is being removed in a future release per
pytorch/pytorch#169400 ("Use _check(i >= 0) instead"). This replaces
all 23 occurrences with the recommended torch._check pattern, matching
the existing torch._check style already used elsewhere in these files.

Affected files (23 occurrences):
- bitsandbytes/_ops.py (8)
- bitsandbytes/backends/triton/ops.py (5)
- bitsandbytes/backends/default/ops.py (4)
- bitsandbytes/backends/cpu/ops.py (3)
- bitsandbytes/backends/cuda/ops.py (2)
- bitsandbytes/backends/hpu/ops.py (1)

torch._check is available in all torch >= 2.0; bitsandbytes requires
torch >= 2.2 per pyproject.toml, so no backwards-compat shim is needed.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@neil-the-nowledgeable changed the title from "fix: replace all instances of deprecated torch._check_is_size with torch.check" to "fix: replace all instances of deprecated torch._check_is_size with torch._check" on May 7, 2026
github-actions bot commented May 7, 2026

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@matthewdouglas (Member) commented:

LGTM, thanks!

@matthewdouglas matthewdouglas added this to the v0.50.0 milestone May 7, 2026
@matthewdouglas matthewdouglas merged commit 4396187 into bitsandbytes-foundation:main May 7, 2026
94 of 95 checks passed
@neil-the-nowledgeable (Contributor, Author) commented:

You are most welcome; I'm very happy to be of service, and I appreciate the excellent community support you've provided.



Development

Successfully merging this pull request may close these issues.

Pytorch FutureWarning of _check_is_size
